NeurIPS 2020 Optimization Competition

77 points by mccourt, almost 5 years ago

4 comments

wenc, almost 5 years ago
Interesting. There have been decades of research on Derivative-Free Optimization (DFO) and stochastic/evolutionary algorithms (most of which are derivative-free). They're used in practical applications, but they have been hard to benchmark reliably for performance because solution paths depend so heavily on the initial guess and random chance.

This one focuses on maximizing sample efficiency. That's an interesting (and important) metric to benchmark, especially for functions that are computationally expensive to evaluate, like full-on simulations. It sounds like the algorithm would need to come up efficiently with an accurate surrogate model for the expensive function -- which is hard to do in the general case, but if something is known about the underlying function, some specialization is possible.
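A minimal sketch of the surrogate-model idea described above, in the style of Bayesian optimization: fit a cheap model to the evaluations seen so far, and spend each expensive evaluation on the candidate the model rates most promising. The objective, budget, and acquisition rule here are hypothetical stand-ins rather than the competition's setup, and scikit-learn's GaussianProcessRegressor is assumed to be available.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    # Hypothetical stand-in for an expensive black-box objective (e.g. a full simulation).
    def expensive_f(x):
        return float(np.sum((x - 0.3) ** 2))

    rng = np.random.default_rng(0)
    dim, budget = 2, 30

    # Small initial design of random evaluations.
    X = rng.uniform(0.0, 1.0, size=(5, dim))
    y = np.array([expensive_f(x) for x in X])

    while len(X) < budget:
        # Fit a cheap surrogate (Gaussian process) to everything observed so far.
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        # Score many cheap candidates with a lower-confidence-bound acquisition.
        cand = rng.uniform(0.0, 1.0, size=(1000, dim))
        mu, sigma = gp.predict(cand, return_std=True)
        x_next = cand[np.argmin(mu - sigma)]
        # Spend one real (expensive) evaluation on the most promising candidate.
        X = np.vstack([X, x_next])
        y = np.append(y, expensive_f(x_next))

    print("best value found:", y.min(), "at", X[np.argmin(y)])

The entire loop only calls expensive_f once per iteration; all other work happens on the cheap surrogate, which is what "sample efficiency" is buying.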
cs702, almost 5 years ago
NeurIPS, 2020: "We need more sample-efficient algorithms for finding better hyperparameters that specify how to train computationally expensive deep learning models."

Rich Sutton, 2019: "The biggest lesson that can be read from 70 years of AI research is that general methods that leverage computation are ultimately the most effective, and by a large margin." (https://news.ycombinator.com/item?id=23781400)

I wonder if, in the end, simply throwing more and more computation at the problem of finding good hyperparameters will work better as computation continues to get cheaper and cheaper.
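For contrast, the "just throw more compute at it" approach the comment alludes to can be as simple as random search over a fixed budget of training runs; sample efficiency only matters when that budget hurts. A toy sketch, with a hypothetical train_and_eval stand-in and made-up hyperparameter ranges:

    import random

    # Hypothetical stand-in for one expensive training run; returns a validation loss.
    # The formula is a toy: it ignores batch_size and simply rewards lr near 3e-4 and
    # small weight_decay, so the script runs without any real training.
    def train_and_eval(lr, batch_size, weight_decay):
        return (lr - 3e-4) ** 2 + weight_decay

    random.seed(0)
    budget = 64  # each unit of budget is one full (expensive) training run
    best = None

    for _ in range(budget):
        cfg = {
            "lr": 10 ** random.uniform(-5, -1),
            "batch_size": random.choice([32, 64, 128, 256]),
            "weight_decay": 10 ** random.uniform(-6, -2),
        }
        score = train_and_eval(**cfg)
        if best is None or score < best[0]:
            best = (score, cfg)

    print("best config under this budget:", best)

Whether this beats a sample-efficient method is purely a question of how cheap each call to train_and_eval becomes.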
mpfundstein, almost 5 years ago
If anyone wants to do this: I have a Threadripper build with two 2080 Tis. It would be cool to do a group project. Write a PM if you're interested. I'm located in Amsterdam, Europe.
reedwolf, almost 5 years ago
Search is the problem that solves all other problems.