
Black box optimization competition

32 points by silentvoice about 10 years ago

6 comments

darkmighty about 10 years ago
I really dislike the term "Black box optimization". There's no such thing. You *have* to make assumptions about your function, so in the end this is just rewarding people whose optimizers happen to match the chosen functions; but those functions are not made explicit whatsoever. That doesn't make any sense.

For example, if the output/input are floating point numbers then you can assume the domain/range is [-M,M]. Otherwise, even with the most clever algorithm you have no guarantee of ever approaching the optimum, even if the function is continuous. Now even with a limited range there are no guarantees if the function is not well behaved -- so you have to again assume the function is well behaved. And for any assumption you make there is a class of functions for which it is terrible. There is no best assumption, or best algorithm, then. You could, for instance, *assume* the function is adversarial (trying to make your life difficult), for which the best algorithm is perhaps just sampling the range randomly, which is really a terrible algorithm -- but that's of course just another assumption, and a terrible one.

I would much prefer 'typical function optimization', if you're optimizing unlabeled functions so frequently, or at least not trying to hide the inevitable assumptions.

TL;DR: The contest may be useful, but the concept of "black box optimization" is nonsense.
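As a side note on the assumptions argument above, here is a small Python sketch (not part of the original discussion) comparing a local hill-climber, which implicitly assumes smoothness, with pure random sampling on two toy functions: a smooth bowl and a needle-in-a-haystack. The hill-climber wins easily on the smooth function, while both are essentially useless on the needle. The function definitions, budget, and step size are arbitrary choices for illustration only.

```python
# Toy illustration: an optimizer's implicit assumptions decide which
# functions it does well on (functions, budget, and step size are
# illustrative assumptions, not the competition's benchmarks).
import numpy as np

def smooth_bowl(x):
    # Well-behaved function: a hill-climber's smoothness assumption pays off.
    return np.sum(x ** 2)

def needle(x):
    # Flat everywhere except a tiny basin: smoothness gives no useful signal.
    return 0.0 if np.all(np.abs(x - 3.0) < 0.05) else 1.0

def random_search(f, dim, budget, rng):
    # Evaluate the function at uniformly random points and keep the best value.
    X = rng.uniform(-5, 5, size=(budget, dim))
    return min(f(x) for x in X)

def hill_climb(f, dim, budget, rng, step=0.2):
    # Greedy local search: accept a random perturbation only if it improves.
    x = rng.uniform(-5, 5, size=dim)
    best = f(x)
    for _ in range(budget - 1):
        cand = np.clip(x + rng.normal(scale=step, size=dim), -5, 5)
        y = f(cand)
        if y < best:
            x, best = cand, y
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for f in (smooth_bowl, needle):
        print(f.__name__,
              "random:", random_search(f, 2, 500, rng),
              "hill_climb:", hill_climb(f, 2, 500, rng))
```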
murbard2 about 10 years ago
It's a little strange that they do not have a track that gives gradient information, given that it is often a real-world possibility. Also, this basically allows unlimited time between evaluations. So this becomes a contest about:

- coming up with a distribution over R^n -> R functions
- finding the optimal evaluation points to do a Bayesian update

I predict the winner will use a mixture of Gaussian processes with various kernels and stochastic control (with a limited look-ahead, otherwise it blows up) to pick the test points.
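As a rough illustration of the kind of approach murbard2 predicts, here is a minimal Bayesian-optimization sketch in Python, assuming scikit-learn's Gaussian process with a Matern kernel and an expected-improvement acquisition maximized over random candidates. The kernel, candidate count, and toy objective are illustrative assumptions, not anything specified by the competition.

```python
# Minimal Bayesian optimization sketch (assumptions: scikit-learn GP,
# Matern kernel, expected-improvement acquisition, random candidate search).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    # EI for minimization: expected amount by which a candidate beats y_best.
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = y_best - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(f, bounds, n_init=5, n_iter=30, n_cand=2000, rng=None):
    rng = np.random.default_rng(rng)
    dim = bounds.shape[0]
    # Initial random probes to seed the surrogate model.
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, dim))
    y = np.array([f(x) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)
        # Choose the next probe by maximizing EI over random candidates.
        cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_cand, dim))
        x_next = cand[np.argmax(expected_improvement(cand, gp, y.min()))]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))
    return X[np.argmin(y)], y.min()

# Toy usage: minimize a shifted quadratic on [-5, 5]^2.
if __name__ == "__main__":
    best_x, best_y = bayes_opt(lambda x: np.sum((x - 1.3) ** 2),
                               bounds=np.array([[-5.0, 5.0]] * 2), rng=0)
    print(best_x, best_y)
```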
obstinate about 10 years ago
Seems really interesting. Too mathy for my skillset.

If I may, I propose that the organizers remove the restriction on disassembling the client library or intercepting network connections. This restriction seems like it cannot benefit the organizers, unless the protocol is insecure. People are going to ignore this rule anyway, and you can't stop them or even detect them doing it. So why put it in there? It's only going to generate ill will.
darklajid about 10 years ago
Whoa. The servers for this competition are about 8km away. That's the most 'local' content I've ever seen on HN.

Unfortunately I have to agree with obstinate here. The pure math is too much for me, and reverse engineering (still daunting, but interesting/possible) is not acceptable. If any HN person wins this contest, I offer beers close to the black box :)
cshimmin about 10 years ago
Wish I had seen something about this sooner. The competition began in January and ends on the 30th of this month.
ramgorur about 10 years ago
1. You do not know what the function looks like, and there is no gradient information.

2. You have a fixed budget of M probes.

3. Out of M, you spend N probes to get the silhouette of the function (exploration).

4. Then, with the remaining (M - N) trials, you need to find the optimum (exploitation).

Sounds more like pseudo-science than a math problem to me.
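To make that budget structure concrete, here is a minimal explore-then-exploit sketch in Python: spend N of the M allowed probes on uniform random exploration, then spend the remaining M - N probes perturbing around the best point found. The box domain, Gaussian step size, and toy objective are illustrative assumptions rather than the competition's actual rules.

```python
# Naive explore-then-exploit under a fixed probe budget M
# (assumptions: box domain, Gaussian local steps, toy objective).
import numpy as np

def explore_then_exploit(f, bounds, M=100, N=60, step=0.1, rng=None):
    rng = np.random.default_rng(rng)
    dim = bounds.shape[0]
    # Exploration: N uniform random probes to sketch the function's silhouette.
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(N, dim))
    y = np.array([f(x) for x in X])
    best_x, best_y = X[np.argmin(y)], y.min()
    # Exploitation: spend the remaining M - N probes on local perturbations
    # around the best point seen so far.
    for _ in range(M - N):
        cand = np.clip(best_x + rng.normal(scale=step, size=dim),
                       bounds[:, 0], bounds[:, 1])
        y_cand = f(cand)
        if y_cand < best_y:
            best_x, best_y = cand, y_cand
    return best_x, best_y

# Toy usage: minimize a shifted sphere on [-5, 5]^3 with 100 total probes.
if __name__ == "__main__":
    print(explore_then_exploit(lambda x: np.sum((x - 2.0) ** 2),
                               bounds=np.array([[-5.0, 5.0]] * 3), rng=1))
```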