A lot of people here seem to be underestimating the difficulty of this problem. There are several incorrect comments claiming that SC1 AIs have already been able to beat professionals; right now they are nowhere near that level.<p>Go is a discrete game where the game state is 100% known at all times. Starcraft is a continuous game, and the game state is never fully known at any given time.<p>This alone makes it a much harder problem than Go. Not to mention that the game itself is more complex: Go, despite being a very hard game for humans to master, is composed of a few very simple and well-defined rules. Starcraft is much more open-ended and has many more rules, and as a result it's much harder to build a representation of game state that is conducive to effective deep learning.<p>I do think that eventually we will get an AI that can beat humans, but it will be a non-trivial problem to solve, and it may take some time to get there. I think a big component is not really machine learning but how to represent state at any given time, which will necessarily involve a lot of human tweaking to distill down which factors really influence winning.
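To make the state-representation point concrete, here is a toy Python sketch of encoding a partially observed RTS state as stacked spatial feature planes. The map size, field names, and choice of planes are assumptions for illustration only, not DeepMind's actual representation.<p><pre><code>import numpy as np

# Toy example: encode only what the agent can currently see. Hidden
# cells carry no unit information, so the fog mask itself becomes part
# of the state; in Go, by contrast, the whole board is always known.
MAP_SIZE = 64  # made-up map resolution

def encode_observation(visible_units, fog_mask):
    unit_type = np.zeros((MAP_SIZE, MAP_SIZE), dtype=np.float32)
    unit_hp = np.zeros((MAP_SIZE, MAP_SIZE), dtype=np.float32)
    for u in visible_units:  # only units inside our own vision
        unit_type[u["y"], u["x"]] = u["type_id"]
        unit_hp[u["y"], u["x"]] = u["hp"]
    return np.stack([unit_type, unit_hp, fog_mask.astype(np.float32)])
</code></pre>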
<i>Related</i>: Today I learned that a group of AI researchers has released a paper called: <i>STARDATA: A StarCraft AI Research Dataset</i>. According to one of the authors: "We're releasing a dataset of 65k StarCraft: Brood War games, 1.5b frames, 500m actions, 400GB of data. Check it out!"<p>> Article: <a href="https://arxiv.org/abs/1708.02139" rel="nofollow">https://arxiv.org/abs/1708.02139</a><p>> Github: <a href="https://github.com/TorchCraft/StarData" rel="nofollow">https://github.com/TorchCraft/StarData</a>
The API Blizzard is exposing is really nice. Sadly most of the advantages AI had in SC1 were just due to the fact that an automated process could micro-manage the tasks the game didn't automate for you (a lot of boring, repetitive work). SC2 got rid of a lot of that while still allowing room for innovative and overpowered tactics to be discovered (MarineKing's insane marine micro, SlayerS killing everyone with blue flame hellions, some more recent stuff I'm sure from the newest expansions). Hopefully the API lets AIs converge on optimal resource management and get to exploring new and innovative timings, transitions, army makeups, etc.
This seems all in good fun but I wonder if it's come too late.<p>Starcraft 2 is in its twilight.<p>The biggest leagues of South Korea have disbanded. [1]
The prolific progamers who transitioned to Starcraft 2 have gone back to Broodwar. [2]<p>Blizzard itself has scrubbed all references to Starcraft 2 from the very home page of Starcraft. [3] Except for the twitter embed, it has only one "2" character... in the copyright statement.<p>My take is that the future of the Starcraft franchise will run through Remastered and the potential expansion packs that follow it.<p>Starcraft 2 had a good run but, with the entire RTS genre stagnating [4], I don't think Blizzard wants to bet on anything less than the top horse.<p>[1] <a href="https://www.kotaku.com.au/2016/10/the-end-of-an-era-for-starcraft-and-south-korea/" rel="nofollow">https://www.kotaku.com.au/2016/10/the-end-of-an-era-for-star...</a><p>[2] <a href="http://www.espn.com/esports/story/_/id/18935988/starcraft-brood-war-glory-days-jaedong-best-bisu-talk-starcraft" rel="nofollow">http://www.espn.com/esports/story/_/id/18935988/starcraft-br...</a><p>[3] <a href="http://starcraft.com" rel="nofollow">http://starcraft.com</a><p>[4] <a href="http://www.pcgamer.com/the-decline-evolution-and-future-of-the-rts/" rel="nofollow">http://www.pcgamer.com/the-decline-evolution-and-future-of-t...</a> (Aside from MOBAs)
It's a bit too bad they're having to move towards supervised learning and imitation learning.<p>I totally understand why they need to do that given the insane decision trees, but I was really hoping to see what the AI would learn to do without any human example, simply because it would be inhuman and interesting.<p>I'm particularly interested in whether an unsupervised AI would use very strange building placements and keep ungrouped units permanently moving.<p>One thing that struck me in the video was the genuinely weird mining technique in one clip, and then another clip where it blocked its own mineral line with 3 raised depots...
I also want to see the algorithm win on unorthodox maps. Perhaps a map it has never seen before, or one where the layout is the same as before but the resources have moved.<p>Don't tell the player or the algorithm, and see how both react and adapt. That would tell us a great deal about how resilient their abilities really are.
When Watson won at Jeopardy, one of its prime advantages was its faster reaction time at pushing the buzzer. The fairness of that has already been hashed out elsewhere, but...<p>We already know that computers can have superior micro and beat humans at Starcraft through that (1). Is DeepMind going to win by giving themselves a micro advantage that is beyond what reasonable humans can do?<p>(1) <a href="https://www.youtube.com/watch?v=IKVFZ28ybQs" rel="nofollow">https://www.youtube.com/watch?v=IKVFZ28ybQs</a> as one example
Are there any known arbitrary code injection exploits for Starcraft? Like how you can use a regular controller to reprogram Super Mario World to play Pong?<p><a href="https://www.reddit.com/r/programming/comments/1v5mqg/using_bugs_in_super_mario_world_to_inject_new/" rel="nofollow">https://www.reddit.com/r/programming/comments/1v5mqg/using_b...</a><p><a href="https://bulbapedia.bulbagarden.net/wiki/Arbitrary_code_execution" rel="nofollow">https://bulbapedia.bulbagarden.net/wiki/Arbitrary_code_execu...</a><p>Is this how we are going to accidentally let AGI loose into the world!? /s<p>On a more realistic note, I think this will degenerate into a game of who can fuzz-test for the best game-breaking glitch. Think of all the programming bugs that turned into game mechanics in BW that we haven't discovered for SC2 yet: <a href="http://www.codeofhonor.com/blog/the-starcraft-path-finding-hack" rel="nofollow">http://www.codeofhonor.com/blog/the-starcraft-path-finding-h...</a>
The StarCraft 1 BroodWar AI scene has been thriving for a few years now: <a href="https://sscaitournament.com/" rel="nofollow">https://sscaitournament.com/</a>
You can watch 24/7 live AI vs AI games on Twitch at: <a href="https://www.twitch.tv/sscait" rel="nofollow">https://www.twitch.tv/sscait</a>
Support for voting on who to play next and even a betting system are in place, too. For those who wish to get their feet wet with BW AI development, here are the Java / C++ tutorials: <a href="https://sscaitournament.com/index.php?action=tutorial" rel="nofollow">https://sscaitournament.com/index.php?action=tutorial</a>
The SSCAIT bots I've seen are more hardcoded tactics engines than machine learning models. They're still impressive, but their logic isn't really 'learned'; it's hand-coded, which is a crucial difference.
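To make that difference concrete with a toy sketch (both interfaces below are made up for the example, not any bot's real code): a hand-coded tactic is an explicit rule the author wrote down, while a learned policy's behaviour lives entirely in trained parameters.<p><pre><code># Hand-coded tactic: the behaviour is a rule a human wrote.
def hardcoded_tactic(obs):
    if obs["enemy_rush_detected"] and obs["minerals"] >= 100:
        return "build_bunker"
    return "train_worker"

# Learned policy: the behaviour lives in the model's weights,
# produced by training rather than written by hand.
def learned_policy(obs, model):
    return model.predict(obs)
</code></pre>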
I thought this was already happening. Right after AlphaGo beat Lee, I remember hearing about it. Did they give up on having their AI play SC2? I wondered whether that approach would even work: since AlphaGo seemed to take its turns in Go at the same speed as a normal player, I assumed it was computing the most likely winning move each turn along with the late-game implications of those moves. If it tried that in a fast-paced game, how would it deal with the speed? It would obviously need to develop a set of pre-baked strategies to win games. Would it play the same build every round, or would it realize that changing things up each match wins it more games?
There's something funny about a company that is actively developing bleeding-edge AI technology but can't design a webpage that works on mobile without crashing.
When I used to play a lot of StarCraft, and later Total Annihilation, I wished for the ability to customize the AI.<p>So then BWAPI came along ... and ... AI is hard. The best SCBW bots are still pretty pathetic compared to a human player, never mind an expert human player.
I'd be really interested in how data sets from different ladder tiers (ranks) would work as sources for teaching.<p>Is it possible that training on diamond players is less effective than training on, say, silver? Is that even an interesting thing to look at?
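If someone wanted to test this, a toy sketch of the experiment might look like the following (the replay metadata field is made up for illustration): bucket the replays by tier, train one model per bucket, and compare them on the same held-out evaluation games.<p><pre><code># Toy sketch: bucket a replay corpus by ladder tier before supervised
# training; "ladder_tier" is a hypothetical metadata field.
def split_by_tier(replays):
    buckets = {}
    for replay in replays:
        buckets.setdefault(replay["ladder_tier"], []).append(replay)
    return buckets

# e.g. train identical models on buckets["silver"] and buckets["diamond"],
# then evaluate both against the same held-out opponents.
</code></pre>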
> even strong baseline agents, such as A3C, cannot win a single game against even the easiest built-in AI.<p>Then why not release the code for the built-in AI and improve on it? Or is the built-in AI cheating?
Someone needs to link this to FB's ELF platform (An End-To-End, Lightweight and Flexible Platform for Game Research). That was specifically made for RTS games like SC.
Great that they opened it up. I'm sure reinforcement learning / deep learning will solve this. It has been a tough problem before, but honestly it doesn't seem that tough compared to all the harder AI problems.
"so agents must interact with the game within limits of human dexterity in terms of “Actions Per Minute”."<p>I am really glad they are limiting APM because otherwise things just get stupid.
It's not like this is going to create fantastic AI.<p>Keep in mind there's been an amateur AI project for Brood War for almost 7 years now. Even after such a long learning period, the games are very primitive, and the AIs still couldn't pose a threat to even a beginner human player. Sometimes the games take hours. Trying to build strategy and decision making into an AI is incredibly complicated. There have been teams working at the SSCAIT for many years now, and the product is still fairly primitive.<p>So what CA did was instead write a simpler AI that mimics strategy and decision making. We all know it's not great, but I'd be really skeptical that third parties would magically create an AI that can think strategically.
Novice here: I really want to try this Starcraft API but I don't know how to start. I believe this uses mostly reinforcement learning and agent-based models (which, honestly, I am not familiar with yet). What are good papers to get started with?
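If it helps, here is a minimal sketch of the reinforcement-learning loop that most of this work builds on; the env.reset()/env.step() interface and the reward convention are assumptions for illustration, not the actual StarCraft II API. The "agent" is just the policy function; the learning part is whatever method you use to improve it from the rewards (the A3C baseline mentioned elsewhere in the thread is one such method, and its paper is a reasonable starting point).<p><pre><code>def run_episode(env, policy, max_steps=1000):
    obs = env.reset()                  # start a new game
    total_reward = 0.0
    for _ in range(max_steps):
        action = policy(obs)           # the agent maps observation -> action
        obs, reward, done = env.step(action)
        total_reward += reward         # e.g. +1 for a win, -1 for a loss
        if done:
            break
    return total_reward
</code></pre>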