Google DeepMind Wants Its AI to Dominate StarCraft II

Back in March of this year, Google DeepMind had its AI system, AlphaGo, “sit down” with international Go champion Lee Sedol in a five-game Go match with a purse of a cool $1 million. Despite Go being far more difficult to program for than, say, chess (due in part to its enormous number of possible moves), AlphaGo destroyed Sedol 4-1. Now the same company that took down the best player in what many consider the most difficult board game in the world is turning its sights on StarCraft II.

The company made the announcement at BlizzCon 2016 in Anaheim, California, and in an accompanying press release said that it has established a “collaboration with Blizzard Entertainment to open up StarCraft II to AI and Machine Learning researchers around the world.”

For anybody paying attention to Google DeepMind, or to one of its central driving forces, Demis Hassabis, the leap to 3-D video games has been expected for some time. The company had already trained its AI on simpler video games such as Breakout and Space Invaders (prior to beating Sedol), and it’s quite stunning to watch how quickly the AI becomes superhuman at its task.

With StarCraft II, DeepMind’s AI faces a far more difficult challenge than Breakout or even Go. The system will have to act in real time alongside other players (rather than taking turns as in a board game), explore unknown territory (the map is only partially visible, so the AI has no perfect information), and carry out complex long-term planning (collecting and spending resources, building up an army, and so on). A sample of one layer of the AI’s API, the way the AI interacts with StarCraft II, is shown in the video above. Note how it deconstructs its environment on the left-hand side of the screen.
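If you’re curious what “interacting with StarCraft II through an API” looks like in practice, here is a minimal sketch in Python using the pysc2 package that eventually grew out of this Blizzard collaboration. It is only a rough picture of the observe-and-act loop, not a definitive recipe: the map name, races, difficulty, and screen dimensions below are example values, and exact parameter names can differ between pysc2 releases.

```python
# A minimal scripted pysc2 agent: it reads the feature layers every frame
# (the "deconstructed" view of the map) and always issues a no-op action.
# Map name, races, difficulty, and dimensions are illustrative choices.
from absl import app

from pysc2.agents import base_agent
from pysc2.env import sc2_env
from pysc2.lib import actions


class ObserveOnlyAgent(base_agent.BaseAgent):
    """Looks at the spatial feature layers each step and does nothing."""

    def step(self, obs):
        super(ObserveOnlyAgent, self).step(obs)
        # obs.observation["feature_screen"] holds per-pixel layers such as
        # unit_type and player_relative: the AI's view of the game state,
        # rather than the raw rendered pixels a human sees.
        return actions.FUNCTIONS.no_op()


def main(unused_argv):
    agent = ObserveOnlyAgent()
    with sc2_env.SC2Env(
            map_name="Simple64",  # example map
            players=[sc2_env.Agent(sc2_env.Race.terran),
                     sc2_env.Bot(sc2_env.Race.random,
                                 sc2_env.Difficulty.very_easy)],
            agent_interface_format=sc2_env.AgentInterfaceFormat(
                feature_dimensions=sc2_env.Dimensions(screen=84, minimap=64)),
            step_mul=8,        # game frames that pass between agent decisions
            visualize=True) as env:  # opens the feature-layer viewer
        timesteps = env.reset()
        agent.reset()
        while True:
            step_actions = [agent.step(timesteps[0])]
            if timesteps[0].last():
                break
            timesteps = env.step(step_actions)


if __name__ == "__main__":
    app.run(main)
```

Even this do-nothing agent has to cope with the realities the paragraph above describes: it only sees what its own units reveal, and the game keeps running whether or not it acts.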

And even though Google DeepMind’s press release says the company is “still a long way from being able to challenge a professional human [StarCraft II] player,” it’s hard to imagine it won’t eventually repeat its Go results. That becomes especially interesting in the context of Google DeepMind’s longer-term goals, spelled out in its press release:

“StarCraft is an interesting testing environment for current AI research because it provides a useful bridge to the messiness of the real-world. The skills required for an agent to progress through the environment and play StarCraft well could ultimately transfer to real-world tasks.”

Make no mistake about Google DeepMind’s AI: video games are just the training grounds, and solving real-world challenges is the ultimate goal. “DeepMind is on a scientific mission to push the boundaries of AI,” the company said, “developing programs that can learn to solve any complex problem without needing to be told how.” Fellow humanoids, this is the push toward strong AI, toward general AI. The real machine-learning, and machine-doing, is just getting started.

What do you think about Google DeepMind taking on StarCraft II? Let us know your thoughts in the comments below!