I really dislike the term "Black box optimization". There's no such thing. You <i>have</i> to make assumptions about your function, so in the end this just rewards people whose optimizers happen to match the chosen functions; but those functions are not made explicit whatsoever. That doesn't make any sense.<p>For example, if the inputs/outputs are floating point numbers, then you can assume the domain/range is [-M, M]. Otherwise, even with the most clever algorithm you have no guarantee of ever approaching the optimum, even if the function is continuous. And even with a bounded range there are no guarantees unless the function is well behaved -- so again you have to assume the function is well behaved. For any assumption you make, there is a class of functions on which it is terrible; there is no best assumption, and hence no best algorithm. You could, for instance, <i>assume</i> the function is adversarial (trying to make your life difficult), for which the best algorithm is perhaps just randomly sampling the range, which is a really terrible algorithm -- but that's of course just another assumption, and a terrible one.<p>I would much prefer 'typical function optimization', if you're optimizing unlabeled functions so frequently, or at least not trying to hide the inevitable assumptions.<p>TL;DR: The contest may be useful, but the concept of "Black box optimization" is nonsense.
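To make the adversarial case concrete: if the function can be arbitrary, no probe tells you anything about unprobed points, so uniform random sampling is about as good as any strategy. A throwaway sketch (all names and parameters are illustrative, nothing from the contest):

```python
import random

def random_search(f, bound, n_probes, seed=0):
    """Pure random search on [-bound, bound]: the 'best' strategy only under
    the (terrible) assumption that probes carry no information about
    unprobed points."""
    rng = random.Random(seed)
    xs = [rng.uniform(-bound, bound) for _ in range(n_probes)]
    return max(xs, key=f)  # return the probe with the highest value seen
```

Note that it only ever converges to the optimum in probability, and the rate depends entirely on the (assumed) bound and the function's behavior near the optimum.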
It's a little strange that they do not have a track that provides gradient information, given that it is often available in real-world problems. Also, this basically allows unlimited time between evaluations... So this becomes a contest about
- coming up with a prior distribution over R^n -> R functions
- finding the optimal evaluation points to do a Bayesian update<p>I predict the winner will use a mixture of Gaussian processes with various kernels and stochastic control (with limited lookahead, otherwise it blows up) to pick the test points.
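A minimal sketch of the strategy described above, for the 1-D case: fit a Gaussian process (RBF kernel with fixed hyperparameters, for brevity) and greedily probe where an upper-confidence-bound acquisition is highest. This is one-step-lookahead only; the kernel length scale, noise, and `beta` are illustrative assumptions, not anything from the contest.

```python
import numpy as np

def rbf(a, b, length=0.3):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard zero-mean GP regression: posterior mean and variance at Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    var = np.diag(rbf(Xs, Xs)) - np.sum(Ks * sol, axis=0)
    return mu, np.maximum(var, 0.0)

def maximize(f, budget, lo=-1.0, hi=1.0, beta=2.0, seed=0):
    rng = np.random.default_rng(seed)
    grid = np.linspace(lo, hi, 401)
    X = rng.uniform(lo, hi, size=3)          # a few random seed probes
    y = np.array([f(x) for x in X])
    for _ in range(budget - len(X)):
        mu, var = gp_posterior(X, y, grid)
        # UCB acquisition: exploit high mean, explore high variance.
        x_next = grid[np.argmax(mu + beta * np.sqrt(var))]
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    return X[np.argmax(y)], y.max()

x_best, y_best = maximize(lambda x: -(x - 0.4) ** 2, budget=20)
```

A serious entry would marginalize or tune the kernel hyperparameters and look more than one step ahead, which is where the stochastic control comes in.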
Seems really interesting. Too mathy for my skillset.<p>If I may, I propose that the organizers remove the restriction on disassembling the client library or intercepting network connections. This restriction seems like it cannot benefit the organizers, unless the protocol is insecure. People are going to ignore this rule anyway, and you can't stop them or even detect them doing it. So why put it in there? It's only going to generate ill will.
Whoa. The servers for this competition are about 8km away. That's the most 'local' content I've ever seen on HN.<p>Unfortunately I have to agree with obstinate here. The pure math is too much for me, and reverse engineering (still daunting, but interesting/possible) is not acceptable.
If any HN person wins this contest, I offer beers close to the black box :)
1. You do not know what the function looks like, and there is no gradient information.<p>2. You have a fixed number of probes, M.<p>3. Of those M, you spend N probes getting the silhouette of the function (exploration).<p>4. Then, with the remaining M - N probes, you need to find the optimum (exploitation).<p>Sounds more like a pseudo-science than a math problem to me.
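The four steps above can be sketched literally: spend N probes on uniform random exploration, then spend the remaining M - N probes perturbing the best point found so far with a shrinking radius. Everything here (the 0.9 decay, the initial radius) is an illustrative assumption, not part of the contest.

```python
import random

def optimize(f, M, N, lo, hi, seed=0):
    rng = random.Random(seed)
    # Phase 1 (exploration): N uniform probes to get a rough silhouette.
    best_x, best_y = None, float("-inf")
    for _ in range(N):
        x = rng.uniform(lo, hi)
        y = f(x)
        if y > best_y:
            best_x, best_y = x, y
    # Phase 2 (exploitation): M - N local probes around the incumbent,
    # with a step radius that shrinks geometrically.
    radius = (hi - lo) / 10
    for _ in range(M - N):
        x = min(hi, max(lo, best_x + rng.uniform(-radius, radius)))
        y = f(x)
        if y > best_y:
            best_x, best_y = x, y
        radius *= 0.9
    return best_x, best_y
```

Of course, a hard exploration/exploitation split like this is exactly the kind of implicit assumption the top comment complains about.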