
Gamasutra was at the recent AIIDE Conference at Stanford to see tools company AiLive demonstrate its LiveMove tool for Wii Remote motion recognition, as used in Mortal Kombat: Armageddon for Wii - also revealing a demo of its new LiveCombat game AI tool, based on the same Context Learning technology.

June 21, 2007


By J. Fleming

Along with fascinating new gameplay possibilities, the Nintendo Wii Remote’s motion-sensing capabilities present new production challenges. Even the simple motion of drawing the number three in the air with the remote produces a complex data set that can vary widely from one user to another, and programming a recognizer that can accurately classify hand motions can be a time-consuming task.

At the Artificial Intelligence and Interactive Digital Entertainment (AIIDE) Conference, held at Stanford University June 6-8, 2007, John Funge and Wolff Daniel Dobson of the Palo Alto-based firm AiLive showed off LiveMove and LiveCombat, new software tools that aim to radically simplify Wii Remote motion recognition and multi-SKU combat AI, respectively. (Gamasutra has previously interviewed AiLive chairman and founder Dr. Wei Yen on the technology, which debuted for licensing in late 2006 and has been used in titles including Mortal Kombat: Armageddon for the Wii.)

Context Learning

Typically, the AI used for similar tasks such as speech and handwriting recognition has been slow and processor-intensive. However, AiLive claims to have developed an efficient machine learning-based AI called Context Learning that performs these tasks at speed and with a minimal footprint. “We’re not interested in batch learning at all. We’re totally focused on online, real-time learning so you can see immediate feedback on your work,” Funge said.

Applying Context Learning to the problem of motion recognition for the Nintendo Wii resulted in LiveMove, a software tool that greatly simplifies the job. On a basic level, LiveMove operates by being taught a variety of movement gestures that it is then able to recognize in play. Funge claimed of the solution: “We’ve had cases where people have been working for months on motion recognizers for the Wii Remote and they’ve been struggling with it, and they’ve used our LiveMove software and in an afternoon they’ve had better motion recognition than they’ve had in months trying to hand-code solutions.”

Showing LiveMove’s ease of use, Dobson fired up a Wii development kit (hidden inside a cardboard box) and quickly ran through a series of controller motions for a hypothetical cooking game. Frying, pounding, rolling, and shaking motions were pantomimed, and once the data for each motion was captured, Dobson assigned them to the LiveMove recognizer. As Dobson performed the motions again, LiveMove correctly identified each unique gesture without requiring it to be precisely replicated. “The concept of slack is very important, because as people get excited they begin to make very strange gestures and we still have to recognize them,” Dobson said.

Creating the basic motion recognizers for the Wii was accomplished in a matter of minutes and required no coding. “We’ve basically gone from absolutely nothing to a complete classifier in about a minute and a half. This is a really powerful thing, because it allows designers to sit down and make up their own motions and try a whole bunch of different ones during the course of a morning,” Dobson said.
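AiLive has not published how LiveMove works internally, but the workflow Dobson demonstrated (record a few examples of each gesture, then classify new motions within some tolerance) maps onto classic template matching. The C++ sketch below illustrates that general idea using dynamic time warping as a stand-in distance measure; every name in it (GestureRecognizer, addExample, classify, the slack parameter) is hypothetical, and this is not AiLive's actual algorithm or API.

```cpp
#include <algorithm>
#include <cmath>
#include <limits>
#include <string>
#include <vector>

// One accelerometer reading from the remote (x, y, z, in g's).
struct Sample { float x, y, z; };
using Motion = std::vector<Sample>;

static float dist(const Sample& a, const Sample& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Dynamic time warping: cost of the best alignment between two motions,
// so the same gesture performed faster or slower still matches.
static float dtw(const Motion& a, const Motion& b) {
    const size_t n = a.size(), m = b.size();
    std::vector<std::vector<float>> c(
        n + 1, std::vector<float>(m + 1, std::numeric_limits<float>::infinity()));
    c[0][0] = 0.0f;
    for (size_t i = 1; i <= n; ++i)
        for (size_t j = 1; j <= m; ++j)
            c[i][j] = dist(a[i - 1], b[j - 1]) +
                      std::min({c[i - 1][j], c[i][j - 1], c[i - 1][j - 1]});
    return c[n][m] / static_cast<float>(n + m);  // length-normalized cost
}

class GestureRecognizer {
public:
    // "Training" is simply storing labeled example motions.
    void addExample(const std::string& label, const Motion& motion) {
        examples_.push_back({label, motion});
    }

    // Classify by nearest template. 'slack' is the tolerance: sloppy but
    // recognizable motions still match; anything worse is rejected.
    std::string classify(const Motion& motion, float slack) const {
        std::string best = "unrecognized";
        float bestCost = slack;
        for (const auto& ex : examples_) {
            float cost = dtw(motion, ex.motion);
            if (cost < bestCost) { bestCost = cost; best = ex.label; }
        }
        return best;
    }

private:
    struct Example { std::string label; Motion motion; };
    std::vector<Example> examples_;
};
```

In the demo's terms, each pantomimed frying or rolling motion would become an addExample call, and a generous slack value is what absorbs the “very strange gestures” that excited players produce.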
“More Monkeys is better…”

Derived from the same Context Learning engine that drives LiveMove’s motion recognizers, LiveCombat is a game-dependent implementation focused on providing trainable AIs for hand-to-hand or squad-level combat in real time. It allows users to create Context Learning AI “brains” that can be assigned to game characters. The AI brains are then taught by example, in real time, by keying them to player actions.

As an example, Dobson booted up a simple, free-roaming hand-to-hand combat game called Tale of the Monkey King. Initially, the blank AI brain did nothing in response to its environment, but after a few quick lessons on how to fight, conveyed through controller inputs from Dobson, the AI brain learned to pursue, circle, and attack enemies. Because learning is continuous, old behaviors are gradually overwritten as the AI brain acquires new layers of instruction. “You’re allowing non-programmers to create AI content, which is pretty big,” Dobson noted.

For large-scale battles, Dobson showed LiveCombat’s player amplification feature, in which a player can lead a troop of AI comrades who follow the player’s example, adjusting their tactics to mirror the player’s actions in real time.

AiLive suggests a variety of gameplay possibilities for LiveCombat, including AI tournaments, celebrity-created AIs, and, most intriguingly, specially trained AIs that could be traded or sold from player to player. AiLive has also created a developer-oriented version of the software geared toward productivity and testing, allowing trained AIs to perform repetitive tasks. “Being able to capture your testers using LiveCombat is a huge time savings. Back when I worked at Sega, we would just get a videotape of errors. You would be watching a guy play and suddenly it would freeze. Now you would be able to actually see the behavior,” Dobson said.
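Context Learning itself is proprietary, but the behavior Dobson described (a blank brain taught by example in real time, with old behaviors gradually overwritten by new instruction) can be approximated as an online nearest-neighbor policy over recorded context/action pairs. The sketch below is a hypothetical illustration of that idea, not AiLive's implementation; the Brain class, its observe and decide methods, and the four-feature Context are all invented for this example.

```cpp
#include <cmath>
#include <deque>
#include <limits>

// A tiny feature vector describing the fight from the AI's point of view
// (distances, angles, enemy state: whatever the game exposes).
struct Context { float features[4]; };

enum class Action { Idle, Pursue, Circle, Attack };

static float sqDist(const Context& a, const Context& b) {
    float s = 0.0f;
    for (int i = 0; i < 4; ++i) {
        float d = a.features[i] - b.features[i];
        s += d * d;
    }
    return s;
}

// An AI "brain" taught by demonstration: while the player drives the
// character, each frame's (context, action) pair is recorded; when the
// brain takes over, it replays the action whose recorded context most
// resembles the current one.
class Brain {
public:
    explicit Brain(size_t capacity = 2048) : capacity_(capacity) {}

    // Called every frame while the player is demonstrating.
    void observe(const Context& ctx, Action playerAction) {
        if (lessons_.size() == capacity_)
            lessons_.pop_front();  // oldest lessons fade as new ones arrive
        lessons_.push_back({ctx, playerAction});
    }

    // Called every frame once the brain is in control of the character.
    Action decide(const Context& ctx) const {
        Action best = Action::Idle;
        float bestDist = std::numeric_limits<float>::max();
        for (const auto& l : lessons_) {
            float d = sqDist(ctx, l.ctx);
            if (d < bestDist) { bestDist = d; best = l.action; }
        }
        return best;
    }

private:
    struct Lesson { Context ctx; Action action; };
    std::deque<Lesson> lessons_;
    size_t capacity_;
};
```

The bounded buffer is what gives the gradual overwriting Dobson described: as new lessons arrive, the oldest fall out, so a brain retrained to fight defensively slowly forgets its aggressive past. Player amplification fits the same shape; in this sketch, squad members would simply keep feeding observe with the player's live context/action stream while they fight alongside.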
