Looks like the Deepmind Starcraft AI relied on superhuman speed after all

Deepmind presented an AI that could beat human champions at Starcraft II. It claimed the AI was limited to what human players can physically do, putting its achievement in the realm of strategic analysis rather than finger twitchery. But there's a problem: observers tracked it clicking with superhuman speed and efficiency.

Aleksi Pietikäinen:

1. AlphaStar played with superhuman speed and precision.

2. Deepmind claimed to have restricted the AI from performing actions that would be physically impossible for a human. They have not succeeded in this and most likely are aware of it.

3. The reason why AlphaStar is performing at superhuman speeds is most likely its inability to unlearn the human players' tendency to spam click. I suspect Deepmind wanted to restrict it to a more human-like performance but they are simply not able to.

Pietikäinen suggests that because Deepmind depended on recorded human games to train the AI, it picked up a peculiar human behavior: idle or unnecessary "spam clicking". As a result, Deepmind would have been forced to lift the AI's clickspeed limits to escape this behavior, at which point it developed strategies that irreducibly depend on bursts of superhuman speed. In brief:

"It is deeply unsatisfying to have prominent members of this research project make claims of human-like mechanical limitations when the agent is very obviously breaking them and winning its games specifically because it is demonstrating superhuman execution."

It looks rather like Deepmind bungled an interesting AI's announcement by making claims about it that they didn't realize were wrong. It almost seems like they didn't know why their AI was so good at Starcraft II and were fooled by its relatively slow mean speed.
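That last point is worth making concrete. A cap on *average* actions per minute (APM) over a whole game says nothing about short bursts: an agent can idle for long stretches and then act far faster than any human during a decisive fight, while its mean stays modest. This sketch (illustrative only, with made-up numbers; it is not Deepmind's actual metering) contrasts the two measurements on a hypothetical click log:

```python
def mean_apm(click_times, duration_s):
    """Actions per minute averaged over the whole game."""
    return len(click_times) * 60.0 / duration_s

def peak_windowed_apm(click_times, window_s=5.0):
    """Highest APM observed in any window of `window_s` seconds."""
    best = 0
    for start in click_times:
        count = sum(1 for t in click_times if start <= t < start + window_s)
        best = max(best, count)
    return best * 60.0 / window_s

# Hypothetical log: one click every 2 s for 10 minutes (calm play),
# plus a 5-second burst of 50 clicks during a battle.
calm = [2.0 * i for i in range(300)]          # 300 clicks over 600 s
burst = [100.0 + 0.1 * j for j in range(50)]  # 50 clicks in 5 s
log = sorted(calm + burst)

print(mean_apm(log, 600))         # 35.0 APM -- looks human
print(peak_windowed_apm(log, 5))  # 636.0 APM -- superhuman burst
```

The mean-speed figure looks entirely human; only the windowed measurement exposes the burst, which is exactly the kind of execution Pietikäinen documented in AlphaStar's battles.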

Google spent all the money in the world creating a machine god to dream the perfect Starcraft II strategy, and woke it from its slumber, and it raised its ghostly data-hand, and it said "click faster."