It's unfortunate that the only other comments so far are about the fact that this article is a few years old. Clearly what's interesting here is the method itself, not its immediate impact on the StarCraft II metagame.<p>Real-time strategy games have stimulated a lot of great research recently [1] because they exhibit several interesting subproblems, such as resource allocation, strategy selection, and plan optimization (like here). They also have the nice side effect of making it easy to evaluate the quality of the resulting plan in a real-world setting and see how it fares when certain assumptions are relaxed (non-static opponents, imperfect strategy execution, etc.).<p>In this case, what they did is solve a multi-objective optimization problem [2] using a genetic algorithm (a popular way to do so). The goal is to optimize some metric (e.g. time) with respect to some objective (e.g. "build 8 units of this type") and some constraints (resource requirements, unit trees). Of course you could perform some form of search in the solution space, but this space can become pretty big pretty fast, and for complex games it's easy to get stuck in a local optimum because the net effect of some choices can be hard to assess (some can be bad in the short term but a nice set-up in the medium term, and two bad tactics can interact to form a good strategy). In contrast, a genetic algorithm can help find the global optimum.<p>While it's not as big a deal as the article makes it sound (it's not a "winning build" per se -- more like a method for finding the fastest way to achieve a known opening), if you combined plan-selection methods (e.g. [3]) with plan-optimization techniques like this one, you could come up with an AI able to produce both near-optimal tactics (how to achieve an objective) and strategies (which objectives to pursue).
Pretty interesting stuff overall, and applicable to more than just real-time strategy games.<p>[1] <a href="http://scholar.google.com/scholar?q=%22real-time+strategy+game%22%2Balgorithm" rel="nofollow">http://scholar.google.com/scholar?q=%22real-time+strategy+ga...</a><p>[2] <a href="http://en.wikipedia.org/wiki/Multi-objective_optimization" rel="nofollow">http://en.wikipedia.org/wiki/Multi-objective_optimization</a><p>[3] <a href="http://link.springer.com/chapter/10.1007/11536406_4" rel="nofollow">http://link.springer.com/chapter/10.1007/11536406_4</a>
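To make the approach concrete, here's a toy Python sketch of the idea: the genome is a build sequence, fitness is the simulated time to finish the objective, and new candidates come from crossover plus mutation. All the economy numbers (mining rate, costs, build times, starting workers) are invented for illustration -- they are not real StarCraft II values, and the real tool models a far richer economy.

```python
import random

# Toy economy (made-up numbers, not real SC2 values):
# each worker mines MINE_RATE minerals per second; one production queue.
MINE_RATE = 0.7
ACTIONS = {            # name: (mineral cost, build time in seconds)
    "worker": (50, 17),
    "unit":   (100, 25),
}
GOAL_UNITS = 8

def makespan(genome):
    """Simulate a build order sequentially; return the finish time of the
    GOAL_UNITS-th 'unit', or a large penalty if the goal is never met."""
    t, minerals, workers, units = 0.0, 50.0, 6, 0
    for action in genome:
        cost, build = ACTIONS[action]
        if minerals < cost:                      # wait for income
            t += (cost - minerals) / (workers * MINE_RATE)
            minerals = cost
        minerals -= cost
        minerals += workers * MINE_RATE * build  # income while building
        t += build
        if action == "worker":
            workers += 1
        else:
            units += 1
            if units == GOAL_UNITS:
                return t
    return 1e9                                   # goal not reached

def random_genome(n=14):
    return [random.choice(list(ACTIONS)) for _ in range(n)]

def crossover(a, b):
    cut = random.randrange(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]

def mutate(g, rate=0.1):
    return [random.choice(list(ACTIONS)) if random.random() < rate else x
            for x in g]

def evolve(pop_size=60, generations=120):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                   # best (fastest) first
        elite = pop[: pop_size // 4]             # truncation selection
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=makespan)

best = evolve()
print(round(makespan(best), 1), best)
```

The multi-objective part would replace the single `makespan` fitness with a vector of objectives (time, surviving workers, resources banked) and a Pareto-based selection instead of a plain sort.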
Ha. I wrote the blog post mentioned in this article that started all this. As pointed out, this is an old repost [1], but what's perhaps interesting (at least at a meta level) is that this is all actually HN's fault.<p>When I first saw this thread on TeamLiquid, I thought "Oh man, proggit and HN will love this", so I wrote it up. And sure enough, it was near the top of both for a day. Next came the random news articles (like this one). And here, several years later, is an HN repost of a blog article that only exists because HN and proggit upvoted the original blog post. You guys created the article that's being reposted :)<p>[1] <a href="https://news.ycombinator.com/item?id=1856390" rel="nofollow">https://news.ycombinator.com/item?id=1856390</a><p>From the last time around, the most common criticisms were:<p>1. Without any form of crossover, it's just hill-climbing/gradient descent, not really a "genetic algorithm".<p>2. BFS would work better (maybe. IIRC people tried BFS, and there are a bunch of subtle things that make the BFS search space huge (e.g. putting workers on and off gas at certain times). Of course, since we have a goal state, something like A* might win the day).
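To make criticism 1 concrete: with mutation as the only operator and a population of one, the "evolutionary" loop is literally stochastic hill-climbing. A toy Python sketch on OneMax (a stand-in fitness function, nothing SC2-specific):

```python
import random

def onemax(bits):
    """Toy fitness: count of 1-bits; global optimum is all ones."""
    return sum(bits)

def hill_climb(n=32, steps=2000):
    """Mutation-only 'GA' with population 1: keep a single candidate,
    flip one random bit, and accept the mutant only if it's no worse.
    That's plain stochastic hill-climbing -- no crossover, no population."""
    cur = [random.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        mutant = cur[:]
        mutant[random.randrange(n)] ^= 1     # point mutation
        if onemax(mutant) >= onemax(cur):
            cur = mutant
    return cur

best = hill_climb()
print(onemax(best))
```

Crossover is what distinguishes a real GA: it lets two partially good build orders exchange whole sub-sequences, which is exactly the "two bad tactics can combine into a good strategy" effect a single hill-climber can't exploit.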
I used that strategy a few times when it came out. I found it interesting to see how fast humans could respond and nullify the advantage. I think after 3-4 wins with the strategy, I started getting countered effectively.<p>So within a few days, all the players at my level knew about it and could handle it easily.<p>It's hard to beat a general-purpose strong AI in the long run!
I remember reading this and thinking "That would be so much simpler to do with a breadth-first search".<p>Sure, 'genetic' sounds cool, but when the search space is as limited as it is in a build-order optimization like this, you should use the right tool for the task.
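For a sense of what that looks like, here's a hedged Python sketch of BFS over a drastically simplified build model (one purchase per tick, instant builds, invented costs -- nothing like the real game's state space). Since every edge costs one tick, the first goal state BFS dequeues is time-optimal in this toy model:

```python
from collections import deque

# Discrete toy model (made-up numbers): each tick every worker mines one
# mineral, then we may buy at most one thing; builds finish instantly.
WORKER_COST, UNIT_COST, GOAL = 10, 20, 3

def fastest_build():
    """BFS over (minerals, workers, units) states; depth equals elapsed
    ticks, so the first state reaching GOAL units is time-optimal here."""
    start = (10, 2, 0)                       # starting minerals, workers, units
    seen = {start}
    frontier = deque([(start, [])])
    while frontier:
        (minerals, workers, units), plan = frontier.popleft()
        if units == GOAL:
            return len(plan), plan
        income = minerals + workers          # mine for one tick
        moves = [((income, workers, units), plan + ["wait"])]
        if income >= WORKER_COST:
            moves.append(((income - WORKER_COST, workers + 1, units),
                          plan + ["worker"]))
        if income >= UNIT_COST:
            moves.append(((income - UNIT_COST, workers, units + 1),
                          plan + ["unit"]))
        for state, p in moves:
            if state not in seen:            # dedup keeps the frontier small
                seen.add(state)
                frontier.append((state, p))

ticks, plan = fastest_build()
print(ticks, plan)
```

The criticism from the earlier thread still applies, though: once you add gas, supply, overlapping build times, and worker transfers, the per-tick branching factor explodes, which is where a heuristic (A*) or a GA starts to pay off.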
I enjoy genetic algorithms conceptually, and they were one of my biggest "lightbulb moments" during university. It's a little silly thinking back, but the order of classes was lined up in a way where I didn't really know much about algorithms yet and had just learned Java basics, so I naively assumed "computers are powerful... I can brute-force everything". Then I took a class called "soft computing" (IIRC) where we had to solve some problem; I implemented a naive brute force and let it run for a couple of days (lol!) until the next lecture, where GAs were introduced. I was totally floored that such a simple idea could speed things up so much.<p>That being said, there are usually better local/heuristic search methods (even though they are also often quite a bit harder to understand, unless you just want to use them).
I'm not a Java programmer, and currently Eclipse is telling me it's going to take 60 minutes to download. How does the program itself work? Is there some SC2 API they're using for this, or did they just write down all the values and are "simulating" an SC2 build?