
A new category of game development middleware has emerged for controlling the behavior of NPCs. This article begins a five-part survey of the category, starting with today's look at AI.implant from BioGraphic Technologies.

Eric Dybsand, Blogger

July 21, 2003

19 Min Read

 

We know them well: HAL 9000 of 2001: A Space Odyssey, "Data" the positronic android of Star Trek: The Next Generation fame, the "computer" of all the Star Trek series and movies, C-3PO and R2-D2 of the Star Wars movies, "Colossus" of the movie Colossus: The Forbin Project, and "David" the artificial boy from Spielberg's A.I. These artificially intelligent (AI) entities are examples of the level of AI that game developers would love to achieve in their computer games. Of course, what can easily be done for film and fantasy is much harder to achieve in computer games played by the millions of expert-level game players who are the customers of the computer game development industry.

So how does a developer increase the quality and believability of the AI behavior in games? A number of software companies think they have a solution to that problem: AI middleware.

AI middleware provides AI services to game engines. It has recently emerged as a serious product category that some developers are considering instead of "rolling their own" AI systems. For the most part, AI middleware sits outside the game engine and handles the process of producing the desired behavior of the agents, non-player characters (NPCs), or other decision-making objects found in a computer game.


AI.implant

Engenuity Technologies
4700 de la Savane, Suite 300
Montreal, QC H4P 1T7
Phone: 514.341.3874

Price: Contact for Pricing

 

One of the reasons game developers may choose to use AI middleware is that their staff may not possess the AI expertise to develop the desired algorithms and processes. Another reason might be that the project schedule is tight, leaving insufficient time to develop the desired level of AI for the game from scratch. Or perhaps an AI middleware product already contains the exact algorithms or processes needed to achieve the desired level of AI.

There are some factors that stand in the way of AI middleware adoption, however. First is the "not invented here" syndrome, and the fear of losing the complete control over all game processes that most game developers desire. Also of concern is the perceived performance hit of having to rely on the AI middleware "engine" or library routines for some processing, and the worry that the programmer might not be able to optimize this code to the desired performance level. And of course it's possible that the AI middleware may not do exactly what the developer wants, or that the learning curve for implementing and integrating it into the game might be too steep.



Figure 1. Typically, character and game state status flow from the Game Engine to the AI middleware, and then character control requests flow from the AI middleware to the Game Engine, and are acted out by the characters.

Over the course of this five-part series this week, which was introduced in the August 2003 issue of Game Developer magazine, we will survey a number of character-oriented AI middleware products. While many different forms of AI middleware exist, this series will focus only on the AI products that advertise themselves as solutions to computer game AI development.

AI middleware products come in many flavors, but this series will concentrate on four products that focus on character behavior. Throughout the entire review series we will try to answer three basic questions:

1. What does the AI middleware product do for the game developer?
2. What are the main features of this AI middleware product?
3. How is this AI middleware product implemented in a game?

So, without further introduction, we begin the series by looking at AI.implant, which could be described as an animation and character control AI toolkit.

AI.implant

AI.implant, from BioGraphic Technologies in Montreal, Canada, has a sophisticated animation control engine that introduces AI into the character development process for computer games and other video media. By focusing on animation control, AI.implant offers unique AI solutions to game developers with complex animation considerations.

What makes AI.implant unique is the inclusion of a plug-in interface to Maya and 3ds Max, the two most widely-used animation and modeling systems for game development. AI.implant also has a C++ SDK for calling AI.implant functions from within a game.

What does this product do for the game developer?

AI.implant contains a number of features, including productivity tools for managing crowds (either as agents or characters); an interface between the game engine and AI.implant; the ability to design and edit character-assigned AI algorithms, behaviors, character states and sensors in the developer's modeling system of choice; and a GUI for the AI designer to work within. The plug-in for Maya and 3ds Max lets you assign AI to cameras and locations, create paths with waypoints, and execute all of the character decision-making processes in real time. It also serves as a GUI-based waypoint editor, which is very handy during level design.

In short, AI.implant provides autonomous character control for the game engine and rendering system: animation and locomotion control, plus some behavioral decision making, for the autonomous characters (agents) in the game.



Figure 2. AI.implant Autonomous Character Engine accepts input from the game and provides action and animation control.

AI.implant does not play animation files; that's left to the game engine. Rather, AI.implant tells the developer's game engine which animation clip to play, when to play it, and what portion of it to play, based on the behaviors and attributes of the characters.
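To illustrate that division of labor, the engine-side handler for such a request might look something like the following sketch. This is purely illustrative: the AnimationRequest structure and OnAnimationRequest() callback are invented here to show the shape of the data AI.implant hands back (which clip, when, and what portion), not the SDK's actual interface.

// Hypothetical engine-side types, for illustration only; the real
// AI.implant SDK defines its own animation-request interface.
struct AnimationRequest {
    int   clipId;      // which animation clip to play
    float startTime;   // when to start playback (game time, in seconds)
    float clipBegin;   // portion of the clip to play: start offset...
    float clipEnd;     // ...and end offset, normalized to [0, 1]
};

// The game engine owns playback; the AI layer only requests it.
void OnAnimationRequest(const AnimationRequest& req) {
    // Look up the clip in the engine's own animation system and
    // schedule the requested sub-range at the requested time, e.g.:
    // engine->PlayClip(req.clipId, req.startTime, req.clipBegin, req.clipEnd);
}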

As mentioned, an AI.implant world consists of autonomous agents as well as non-autonomous characters and objects. The latter represent not agents that demonstrate behavior, but things such as terrain, surfaces, paths and obstacles that exist in the game world and interact with autonomous characters. AI.implant agents have attributes, which define performance characteristics and constraints on movement and action, and agents can be assigned behaviors to perform. Non-autonomous characters and objects can be used to represent walls, moving doors, tunnels, constrained areas and paths (formed by using waypoint objects). Agents can also employ decision-making processes to select the behaviors and actions to carry out.

In the product's SDK documentation, an example is presented in which a guard must patrol a fortified compound against intruders. The guard is defined as an autonomous character, and the buildings and the fence around the compound are defined as barriers (a type of non-autonomous object). Trees, crates and any static vehicles are also non-autonomous objects. The ground of the compound is defined as terrain, and autonomous characters (the guard in this case) are assigned to "hug the terrain," causing the guard to appear to walk on the ground as it moves. The behaviors "Avoid Obstacles" and "Avoid Barriers" are assigned to the guard to keep it from walking into trees, buildings or the fence. The guard is also assigned a "Seek to Via Network" behavior, enabling it to use a network of waypoints to vary its route through the compound based on its location, giving it a less predictable patrol path. Sensors are created for the guard, letting it perceive events in the compound, and binary decision trees are assigned so the guard can determine how to respond to the events it receives.
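In SDK terms (covered in detail below), the guard's setup might reduce to something like the following sketch. ACE_Character and the AddChild() calls appear later in this article; the specific behavior class names used here (ACE_AvoidObstaclesBehaviour and friends) are assumptions made for illustration, not the SDK's actual identifiers.

// A sketch of the patrolling-guard example, assuming hypothetical
// behavior class names; ACE_Character and AddChild() are SDK
// concepts described later in this article.
ACE_Character* guard = new ACE_Character;
aiSolver->AddChild(guard);  // register the guard with the AI Solver

guard->AddChild(new ACE_AvoidObstaclesBehaviour);    // steer around trees and crates
guard->AddChild(new ACE_AvoidBarriersBehaviour);     // respect buildings and the fence
guard->AddChild(new ACE_SeekToViaNetworkBehaviour);  // patrol via the waypoint network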

What are the main features of this AI middleware product?

Maya and 3ds Max plug-ins. The plug-ins are one of the best features of AI.implant, allowing character AI control to be developed during the creation of the game's modeling and animation assets. Once a character has been created in Maya or 3ds Max, the AI.implant plug-in can be used to set animation control for the character, add attributes, set default and initial state values, add sensors, assign behaviors, and create the decision trees the character will use to make decisions.



Figure 3. AI.implant Maya Plug-in menus used for defining an autonomous character.

 



Figure 4. AI.implant Maya Plug-in attributes dialog box for autonomous character.


Hierarchical pathfinding. Using the plug-ins, level editing can include the placement of waypoints that can be connected into waypoint networks, or AI.implant can be set to create a waypoint network automatically, which the game developer can then edit as needed. Various behaviors can be used to navigate the waypoint network. If a game developer plans to use waypoint navigation (a popular representation of passable terrain used for pathfinding and navigation), then using the AI.implant plug-in to place and maintain the waypoints in a level is extremely handy.



Figure 5. AI.implant Maya Plug-in autonomous character with attributes displayed, and part of a waypoint network (in blue).

Rule-based decisions. Agents can make decisions based on what they know about their environment. The decision-making process takes the form of binary decision trees (BDTs). An agent can maintain data (like a memory) and accept input from the environment via sensors, then apply both to a BDT to determine the action or behavior to execute. This process is outlined in the set of partial screen shots that follow, which show setting up an autonomous agent, assigning a sensor, and creating a BDT for the agent to use to decide what to do.



Figure 6. A Sensor for an autonomous agent is created using the AI.implant Plug-in for Maya, and attached to the agent.




Figure 7. A Binary Decision Tree is created using the AI.implant Plug-in for Maya, to decide if the Sensor saw a Sphere.



Figure 8. The BDT is edited using the AI.implant Maya plug-in, to execute an animation via the Panic Decision when the Sensor sees the Sphere.
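To make the structure concrete, here is a minimal, self-contained sketch of a binary decision tree of the kind shown in Figures 6 through 8. None of these types come from the AI.implant SDK; this is simply the underlying data structure in a few lines of C++:

#include <functional>
#include <memory>

// Minimal binary decision tree, independent of the AI.implant SDK.
// Internal nodes test a condition; leaf nodes name an action.
struct BDTNode {
    std::function<bool()> condition;   // empty at leaf nodes
    std::function<void()> action;      // set only at leaf nodes
    std::unique_ptr<BDTNode> yes, no;  // branches taken on true/false

    void Evaluate() const {
        if (!condition) { action(); return; }   // leaf: execute the action
        (condition() ? yes : no)->Evaluate();   // internal: descend
    }
};

In Figure 7's terms, the root's condition would ask whether the Sensor saw the Sphere, and the "yes" leaf's action would trigger the panic animation.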

The BDT can be used to create complex decisions of arbitrary depth. It is even possible to construct a finite state machine (FSM) by structuring a BDT appropriately, as sketched below. FSMs are widely used in game AI, so BDTs should be easy for game developers to understand and helpful to have around.
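One way to see this: let the tree's first test branch on the agent's current state (held in its memory), and let the leaves both act and overwrite that state. Here is a rough sketch of a two-state patrol/chase machine, continuing with the hypothetical BDTNode above (again, not SDK types; the boolean flag stands in for a real Sensor):

// A two-state FSM (Patrol <-> Chase) encoded as a BDT: the root tests
// the stored state, inner nodes test the sensor, and leaves act and
// transition. Uses the hypothetical BDTNode defined above.
enum class State { Patrol, Chase };

int main() {
    State state = State::Patrol;
    bool intruderSeen = false;   // would come from a Sensor in practice

    auto root = std::make_unique<BDTNode>();
    root->condition = [&]{ return state == State::Patrol; };

    // Patrolling: switch to Chase when an intruder is seen.
    root->yes = std::make_unique<BDTNode>();
    root->yes->condition = [&]{ return intruderSeen; };
    root->yes->yes = std::make_unique<BDTNode>();
    root->yes->yes->action = [&]{ state = State::Chase; /* alert anim */ };
    root->yes->no = std::make_unique<BDTNode>();
    root->yes->no->action = [&]{ /* keep patrolling */ };

    // Chasing: return to Patrol when the intruder is lost.
    root->no = std::make_unique<BDTNode>();
    root->no->condition = [&]{ return intruderSeen; };
    root->no->yes = std::make_unique<BDTNode>();
    root->no->yes->action = [&]{ /* keep chasing */ };
    root->no->no = std::make_unique<BDTNode>();
    root->no->no->action = [&]{ state = State::Patrol; /* resume route */ };

    root->Evaluate();   // run once per AI tick
}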

Crowd productivity tools. Agents can be grouped together and assigned common behaviors, aptly called "group behaviors". Used together, several of the crowd productivity tools can generate flocking behavior among members of the group.

Assignable AI. Agents can have default behaviors assigned that execute in the absence of behaviors determined by the BDT.
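A minimal sketch of these two ideas, with heavy caveats: ACE_Character and AddChild() are described later in this article, but the flocking behavior class and the default-behavior setter named below are assumptions invented for illustration, not the SDK's actual identifiers.

// Illustrative only: ACE_FlockBehaviour, ACE_WanderBehaviour and
// SetDefaultBehaviour() are hypothetical names, not SDK identifiers.
ACE_Character* bird = new ACE_Character;
bird->AddChild(new ACE_FlockBehaviour);              // a "group behavior"
bird->SetDefaultBehaviour(new ACE_WanderBehaviour);  // runs when the BDT selects nothing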

How does the game developer implement this AI middleware product in a game?

Maya and 3ds Max plug-ins
The plug-in support is certainly the major benefit of AI.implant, provided you already use one or both of these tools. A designer can use the plug-in to assign behaviors after the artists have provided the models that define the characters. The plug-in itself can be extended as needed, as can the behaviors of the characters (more on that shortly).

The AI.implant Software Development Kit (SDK)
There are a number of components to the AI.implant SDK that assist the game developer with the task of integrating AI.implant into the game:

  • The Autonomous Character Engine (ACE) calculates and updates the position of each character for each frame of the game, selects the appropriate set of animation clips and provides the correct "game logic" for the current situation of the game. A sub-component of the ACE is the AI Solver, which provides several services: it lets you create intelligent characters capable of navigating the game world on their own, it manages the autonomous characters, and it provides a container for the character and non-character objects of the game world.

  • The Autonomous Character Pipeline (ACP) is a set of libraries that may be useful in the complete game-creation process or the production pipeline. These libraries provide functionality that simplifies the transfer of data between the plug-ins and AI.implant.

  • A sample game engine (the RD_Viewer) written using the AI.implant SDK is provided to help the game developer learn to use AI.implant in her games.

  • A core set of support libraries (structures and math classes) is provided.

  • The binary distributions of AI.implant for various platforms are included. AI.implant supports the Windows, Xbox, PlayStation 2 and GameCube platforms.

  • Finally, well-written documentation is provided.

Putting AI.implant to work

The process of integrating the AI.implant SDK with the game code has a number of steps, some required and some optional, so a game developer can select the appropriate level of integration.

Step one is initializing the SDK programmatically. That involves initializing the core AI.implant module with a call to ACE_Core::InitializeModule(). Then, depending on whether the Behavior, Animation, Action Selection and Physics modules are used, each such module must be initialized with a similar call.
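A minimal initialization sketch follows. ACE_Core::InitializeModule() is named in the SDK documentation; the per-module calls shown as comments are assumed to follow the same pattern and are not taken from the SDK:

// Initialize the core AI.implant module (named in the SDK docs),
// then any optional modules the game uses. The commented-out calls
// are assumptions that mirror the core call, not SDK identifiers.
ACE_Core::InitializeModule();
// ACE_Behaviour::InitializeModule();        // if the Behavior module is used
// ACE_Animation::InitializeModule();        // if the Animation module is used
// ACE_ActionSelection::InitializeModule();  // if Action Selection is used
// ACE_Physics::InitializeModule();          // if the Physics module is used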

Next, an ACE_Solver object needs to be instantiated and configured based on the characteristics of the game world. Sub-solvers, like modules, are enabled optionally, based on the features of AI.implant the game developer will be using. The available sub-solvers are Action Selection, Behavior, Integration, Surface-hugging, Collision, Terrain-hugging and Animation. For example, to initialize the Behavior sub-solver, the game developer would include the call:

aiSolver->AddSubSolver(new ACE_BehaviourSolver);

in the initialization process after the ACE_Solver (aiSolver) has been created.

The game developer can now tell AI.implant about the characters used in the game. A new character object (ACE_Character) is instantiated, and the AI Solver object is informed about the character via a call:

aiSolver->AddChild(character);

Once characters have been created, behaviors can be created and applied to the character. AI.implant supports a number of preset behaviors, in these categories:

  • Basic Navigation

  • Group Behavior

  • Targeted Behavior

  • State Change Behavior

Once a behavior object has been instantiated, it can be added to a character simply by calling:

character->AddChild(behavior);

Behaviors are additive: multiple behaviors may be assigned to a character, and the ACE_Solver will calculate a final motivation based on each behavior's intensity and priority.
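For example, a character might carry both a navigation behavior and an obstacle-avoidance behavior at once. In this sketch, the AddChild() calls are the SDK usage shown above, while the behavior class names and the priority setter are illustrative assumptions:

// Two additive behaviors on one character; the ACE_Solver blends
// their motivations. Class names and SetPriority() are hypothetical.
ACE_Behaviour* seek  = new ACE_SeekToViaNetworkBehaviour;
ACE_Behaviour* avoid = new ACE_AvoidObstaclesBehaviour;
// avoid->SetPriority(2);  seek->SetPriority(1);  // avoidance wins conflicts
character->AddChild(seek);
character->AddChild(avoid);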

If a behavior is needed that the SDK does not initially support, it is possible to extend the SDK to add it. This process, while more involved than a typical animator or game designer might want to take on, is relatively straightforward for an experienced C++ programmer. A new behavior is added either by sub-classing an existing behavior that is close to the desired one, or by deriving a new behavior class from one of the AI.implant behavior interface classes and overriding the virtual motion-computation method of the inherited class. It is also necessary to implement some "standard" methods in the new behavior class so that it remains consistent with the interface standard set forth for AI.implant's core behaviors. Additionally, the Maya plug-in should be extended to support the new behavior; the SDK provides wrapper classes and documentation to help the game developer extend the plug-in.
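In outline, deriving a new behavior might look like the following. The base class ACE_Behaviour and the virtual ComputeMotion() hook are stand-ins for the SDK's actual behavior interface and motion-computation method, which the SDK documentation names:

// Sketch of extending the SDK with a custom behavior. ACE_Behaviour
// and ComputeMotion() are stand-ins for the SDK's real interface.
class OrbitBehaviour : public ACE_Behaviour {
public:
    // Override the inherited motion-computation method with the new
    // steering logic for this behavior.
    virtual void ComputeMotion(ACE_Character& agent, float dt)
    {
        // ...compute this behavior's contribution to the agent's motion...
    }

    // Plus the "standard" methods (naming, cloning, serialization and
    // so on) needed to stay consistent with the core behaviors.
};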

Finally, implementing AI.implant in a game engine's main loop is very straightforward. An EXT_UserInput object is created with the latest player input, which is used to update AI.implant's view of the game world via a call to UpdateGameWorldFromInput(); this updates an EXT_GameWorld object that contains the AI.implant representation of the developer's game world. That object is then passed to UpdateACEWorldFromGameWorld(). Next, the AI Solver object's Solve() is called to process this timeslice of AI.implant. Finally, the EXT_GameWorld object is updated from the AI Solver object.
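Put together, one AI tick of the main loop might look like this sketch. The functions UpdateGameWorldFromInput(), UpdateACEWorldFromGameWorld() and Solve() are named above; their exact signatures, the input-gathering helper, and the final write-back call are assumptions:

// One per-frame AI update, following the sequence described above.
// Signatures and the two helper names are assumed, not SDK-exact.
void RunAITick(EXT_GameWorld* gameWorld, ACE_Solver* aiSolver)
{
    EXT_UserInput input = CollectPlayerInput();        // engine-side, hypothetical
    UpdateGameWorldFromInput(input, gameWorld);        // push input into the AI's world view
    UpdateACEWorldFromGameWorld(gameWorld, aiSolver);  // sync the solver from the game world
    aiSolver->Solve();                                 // process this timeslice of AI
    UpdateGameWorldFromACEWorld(aiSolver, gameWorld);  // assumed name: pull results back out
}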

Of course, if the game developer does not want to treat AI.implant as a black box and needs more control, the updated version of EXT_GameWorld that the AI Solver object produces can be selectively ignored.

Wrap Up

The AI.implant AI middleware product fills a unique niche among its AI middleware competitors. It targets games with complex animation and character control needs, whose development teams are well-funded enough to use Maya or 3ds Max (assuming the team wants to take advantage of AI.implant's biggest strengths). The plug-in aspect of AI.implant means that users who already work in Maya or Max may find the tool intuitive. Augmenting that ease of use are AI.implant's documentation, tutorials and example programs. When a developer finds that a desired behavior is missing, the extensibility of the behavior code lets him write his own, and the extensibility of the plug-in lets him incorporate it into the user interface for character modeling. Developers using Maya or 3ds Max should look closely at AI.implant for character AI modeling.



About the Author(s)

Eric Dybsand

Blogger

Eric Dybsand ([email protected]) has consulted on an extensive list of computer games, including designing and developing the AI for Full Spectrum Command, a tactical command simulator used by the US Army. He has also designed strategic AI for MOO3 and AI for racing, baseball and wrestling games; he developed the AI opponents for the RTS game ENEMY NATIONS, for the FPS games REBEL MOON REVOLUTION and THE WAR IN HEAVEN, and for a number of turn-based wargames. Eric has been involved with computer game AI since 1987, doing game design, programming and testing, and is a contributing author on AI in the Game Programming Gems and AI Wisdom series.
