An AI developed by graduate students at the Israel Institute of Technology is garnering attention after defeating an 'expert player' at Mortal Kombat.

Artificial intelligence in the modern era has reached such impressive heights that developers are searching for new opportunities to test their software's chops. The mechanical structure of games lends itself perfectly to this, particularly board games. But after Google DeepMind's AlphaGo challenged and defeated professional player Lee Sedol at Go, a game which many described as beyond any AI's reach, a question arose: where is AI's next challenge?

Video games were the quick and appropriate answer. After all, not only does the digital nature of video games complement AI, but video games already feature AI for any number of tasks. That foundation gives AI developers an initial understanding of what it will take to build an AI that can compete with expert human players in competitive games. Google has already announced that DeepMind will be using StarCraft II for research into AI development, but it is not the only group of AI developers exploring these ideas.

One new example comes from a group of graduate students studying at the Israel Institute of Technology. Without the funds and hardware of Google at their fingertips, StarCraft II might be a bit beyond their reach. Instead, the team has focused on a platform featuring much simpler games -- the Super Nintendo Entertainment System. The team developed what they call the Retro Learning Environment, which can be used to create AI to play any number of SNES games (as well as Atari 2600 games, which Google has used in the past). One game with which they have found particular success is Mortal Kombat.
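To give a sense of how such an environment is typically driven, here is a minimal random-agent loop in Python. The module path, ROM name, and method names below follow the Arcade Learning Environment convention that the Retro Learning Environment is modeled on, but they are assumptions for illustration, not a verified API.

```python
# Minimal random-agent loop against an ALE-style emulator interface.
# The import path, ROM filename, core identifier, and method names are
# assumptions modeled on the Arcade Learning Environment convention;
# consult the Retro Learning Environment itself for the real API.
import random

from rle_python_interface import RLEInterface  # assumed import path

rle = RLEInterface()
rle.loadROM('mortal_kombat.sfc', 'snes')       # assumed ROM name and core id

actions = rle.getMinimalActionSet()            # legal joypad inputs for this game
total_reward = 0

for episode in range(5):
    rle.reset_game()
    while not rle.game_over():
        action = random.choice(actions)        # no learning yet: pick a random input
        total_reward += rle.act(action)        # act() returns the per-step reward
    print('episode', episode, 'cumulative reward', total_reward)
```

A learning agent replaces the random choice with a policy that is updated from the rewards the emulator returns, which is exactly the loop the team's algorithms run on top of.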

A research paper showcasing the team's efforts to create "state of the art algorithms" capable of playing against humans revealed that Mortal Kombat was the only game in which the team's AI was competitive enough to "outperform expert human players." However, the paper notes that the AI does not face humans directly; instead, the two are compared using a scoring metric created specifically for the test.

Other games the graduate students attempted to train their AI on include Gradius III, where the AI struggled with power-up usage, as well as Wolfenstein, F-Zero, and Super Mario World. The difficulty in training the AI for these games revolved around what the developers call "rewards": tangible goals designed to encourage the AI to complete a level or attain a higher score. In F-Zero, for example, an AI told only to "complete a lap" must work out on its own everything that goes into simply driving forward, as the sketch below illustrates.
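To make the "reward" idea concrete, here is a hypothetical shaped reward for a racing game in the spirit of F-Zero. None of this comes from the paper; it only illustrates how a designer might translate "complete a lap" into the per-frame feedback an agent can actually learn from.

```python
def racing_reward(prev_progress: float, progress: float,
                  crashed: bool, lap_completed: bool) -> float:
    """Hypothetical per-frame reward for a racing game.

    progress values are the fraction of the track covered (0.0 to 1.0);
    the agent is rewarded for moving forward, penalized for crashing,
    and given a large bonus when it finishes a lap.
    """
    reward = 10.0 * (progress - prev_progress)  # dense reward for forward motion
    if crashed:
        reward -= 5.0                           # discourage driving into walls
    if lap_completed:
        reward += 100.0                         # sparse bonus for the actual goal
    return reward

# Example: a frame where the car advanced 0.1% of the track without crashing.
print(racing_reward(0.250, 0.251, crashed=False, lap_completed=False))
```

Without the dense forward-motion term, the agent would only ever see the lap bonus, and could spend an enormous amount of time driving in circles before stumbling onto it.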

Experiments with AI learning in video games continue to be a fascinating subject. When basic learning algorithms can compete with in-game AI given little direction, it's an encouraging sign -- both for science in general and for the future of video game AI.

Source: Vice