AI Middleware: Getting Into Character, Part 4: Simbionic
AI middleware is a new category of commercial tools for controlling the behavior of NPCs in games. This article, the fourth in a five-part survey, examines Simbionic from Stottler Henke.
July 24, 2003
12 Min Read
This installment is the fourth in a five-part product survey of AI middleware tools for game development, products that control the behavior of agents, NPCs and/or decision-making objects in games. In this article I'll look at SimBionic, an authoring tool and runtime engine for developing complex behaviors, developed by Stottler Henke of San Mateo.
Like many of its competitors, SimBionic is designed to control the behavior of NPCs and agents in a game world. What sets SimBionic apart is that it can also control objects that move or behave in reaction to in-game events or conditions - functionality that's typically considered game physics, not AI.
What does this AI Middleware product do for the game developer?
SimBionic provides a framework for defining the objects that display behavior within the game world. This framework is very state oriented, in that most control flow is influenced by the state or condition of some object or process (similar to a finite state machine). In addition, complex and hierarchical state systems can be built using SimBionic's visual editing tools.
What are the main features of this AI Middleware product?
Components of a SimBionic project
The SimBionic state systems have many components, which can be classified as descriptors and declarations. Descriptors are identifiers that create references in the SimBionic system to objects and behaviors that exist in the game world. SimBionic also describes attributes of objects as descriptors. These attributes can be organized into hierarchies that classify objects and represent different object states. For instance, an attribute could be the descriptor Weapons. Under Weapons could be the descriptors Hand Guns, Assault Rifles and Missile Launchers; under Assault Rifles could be the descriptors AK-47 and M-16. Such a hierarchy defines an M-16 as both an Assault Rifle and a Weapon, and that classification can be used to describe an object in the game world. Likewise, the attributes Jammed, Dirty and Clean could describe the state of the object that is an M-16, Assault Rifle, and Weapon.
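A descriptor hierarchy like the Weapons example can be pictured as a simple parent-child tree. The sketch below is purely illustrative - SimBionic's internal representation is not documented here, so the data structure and function names are invented for this example:

```cpp
#include <map>
#include <string>

// Hypothetical parent links for the Weapons hierarchy described above.
std::map<std::string, std::string> parentOf = {
    {"Hand Guns", "Weapons"},
    {"Assault Rifles", "Weapons"},
    {"Missile Launchers", "Weapons"},
    {"AK-47", "Assault Rifles"},
    {"M-16", "Assault Rifles"},
};

// True if `descriptor` is `category` or falls under it in the hierarchy.
bool isA(const std::string& descriptor, const std::string& category) {
    std::string current = descriptor;
    while (true) {
        if (current == category) return true;
        auto it = parentOf.find(current);
        if (it == parentOf.end()) return false;   // reached the root
        current = it->second;
    }
}
```

With this arrangement, a query such as `isA("M-16", "Weapons")` walks up the tree and succeeds, which is exactly how the hierarchy lets one descriptor classify an object at several levels at once.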
Declarations are symbolic associations used by the SimBionic project. These associations consist of actions, predicates, behaviors, global variables, constants and local variables that are used within a SimBionic project.
Entities are descriptors that define NPCs, agents and objects in the world for SimBionic to use. In SimBionic, any object that exhibits any form of behavior in the game world is considered to be an entity. Objects that can't move or don't exhibit behavior, such as a tree or a rock, are not considered entities by SimBionic.
Actions are declarations that define all the different behaviors an entity can perform and act as the fundamental behavior building blocks in SimBionic. In essence, actions act like small programs that perform functions that implement behaviors.
Behaviors are declarations that, like actions, resemble small programs; they dynamically determine the decisions made and actions performed by entities. Behaviors can call other behaviors, nesting behaviors within behaviors. A behavior invokes an action or another behavior as a result of evaluating the conditions associated with it, and those conditions in turn determine when each action or nested behavior is executed. Connectors control the order in which the conditions are evaluated. Behaviors can also be imported from previous projects into new ones.
Global and local variables are declarations of named memory areas that can store values used by behaviors. Local variables are only used and affected by the behavior in which they are declared; global variables are accessible from anywhere within the SimBionic project. The data types these variables support are: integer, float, string, vector (an x, y, z location), Boolean, entity (the numeric ID associated with an entity) and data (a catch-all type). Additionally, an "any" data type can refer to data of any of the above types, and an "invalid" data type can be used for testing validity.
Constants are declarations of symbolic names associated with globally accessible static values that can be referenced by other declarations.
Core predicates are built-in functions that provide access and evaluation services relative to entities, behaviors and messages on the blackboards (more about that in a moment).
User-defined (or custom) predicates are declarations that function as another mini-program for the purpose of gathering status and state information about the game world and the entities and objects within it. A custom predicate typically consists of custom code that performs the function of the predicate, a declaration of the predicate in the catalog that identifies any parameters to be used by the predicate, and the inclusion of the predicate in the canvas as part of a condition for execution. For example, a predicate could be a component that checks the health status of an entity or determines if an entity can "see" another entity.
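To make the "can see" example concrete, here is a minimal sketch of the kind of check such a predicate's custom code might perform. The function name and the game-state structure are invented for this illustration; in SimBionic the code would sit behind the DoPredicate() hook described later in this article, and a real visibility test would also account for occlusion and facing:

```cpp
#include <cmath>

// Invented game-state type for this example.
struct Position { float x, y, z; };

// True if `target` is within `range` units of `viewer` - a crude
// stand-in for the visibility test a real predicate would perform.
bool canSee(const Position& viewer, const Position& target, float range) {
    float dx = target.x - viewer.x;
    float dy = target.y - viewer.y;
    float dz = target.z - viewer.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) <= range;
}
```

The key point is that a predicate boils game-world state down to a true/false answer that a condition on the canvas can then test.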
Core actions are additional built-in functions that provide blackboard maintenance functions and (entity) group maintenance functions.
User-defined (or custom) actions are declarations that are mini-programs that perform some activity or calculation that can be used by an entity. A custom action typically consists of user-written code that makes the action work, a declaration of the action in the catalog that identifies any parameters to be used by the action, and the inclusion of the action in the canvas (which I'll explain in a moment) that tells SimBionic where the action should be triggered.
Custom actions and predicates can be imported from previous projects into new projects.
How does the game developer implement this AI Middleware product in a game?
All of these elements are combined visually on the SimBionic project canvas. The canvas is part of SimBionic's visual editor, where you program the SimBionic AI for your game. You can move, copy, delete and search for elements on the canvas, as well as add and edit elements from the canvas. SimBionic AI primarily evaluates conditions to determine what behavior or action to execute next, and then executes it.
The SimBionic Visual Editor
SimBionic comes with its own visual editor for creating the AI rules that manage the game world, and it is very easy to use. The visual editor consists of a number of toolbars, panes and windows where you create your game's AI by building and compiling a SimBionic project.
Figure 1. The SimBionic Visual Editor
In the upper left portion of the visual editor is the Project Window. This window contains two panes: the Catalog and the Descriptors panes. The Catalog pane is where the declarations are created, which will become the building blocks of the SimBionic AI program. The Descriptors pane is where the entities and attributes are defined.
Dominating the upper central and right portion of the visual editor is the Canvas Window. This window is where you spend the majority of your time creating the SimBionic AI program using the components from the Project Window. The process is basically to drag and drop components from one of the panes of the Project Window onto the canvas. This is typically done to create behaviors on the Canvas, where you can then create conditions and connectors that control how the state of those behaviors is evaluated.
Conditions give you the ability to test the state of aspects of the visual world and the entities populating it to determine if particular actions or behaviors should be triggered. Conditions contain expressions using variables or constants, and predicates providing access to data. The condition must evaluate to true or false.
Connectors control the order in which conditions are evaluated and in which actions and behaviors take place; a connector exists only between two Canvas elements.
Running along the bottom of the visual editor is the Output Window. The Output Window consists of three panes: Build, Debug and Find. The Build pane displays the results of compiling the current SimBionic project. The Debug pane displays debug output and data produced during execution of the compiled SimBionic AI program. The Find pane displays the results of searches for components that make up the SimBionic project.
Communication between entities
SimBionic provides a built-in messaging capability that lets entities communicate data and status information between them. This communication is available in two ways:
1. Group messaging. When an entity joins a named group of entities, the entity receives every message sent to that group. Each entity has its own unique message queue that stores messages until the entity retrieves a message, optionally reads it, and optionally discards it. This message queue is First-In-First-Out: messages are retrieved in the order in which they arrived.
2. Virtual blackboards. These are storage areas that are available to all entities for sharing information. An entity can store any information in a section of the blackboard, and then other entities can retrieve that information by referencing the blackboard and section names.
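Both channels can be sketched in a few lines of C++. The class names below are invented for illustration - SimBionic exposes its actual messaging and blackboard operations through the core actions and predicates mentioned earlier:

```cpp
#include <map>
#include <queue>
#include <string>

// Per-entity FIFO message queue: messages come out in arrival order.
class MessageQueue {
public:
    void deliver(const std::string& msg) { queue_.push(msg); }
    bool empty() const { return queue_.empty(); }
    std::string retrieve() {               // read and discard the oldest message
        std::string msg = queue_.front();
        queue_.pop();
        return msg;
    }
private:
    std::queue<std::string> queue_;
};

// Virtual blackboard: any entity posts to a named section; any other
// entity reads it back by section name.
class Blackboard {
public:
    void post(const std::string& section, const std::string& value) {
        sections_[section] = value;
    }
    std::string read(const std::string& section) const {
        auto it = sections_.find(section);
        return it == sections_.end() ? std::string() : it->second;
    }
private:
    std::map<std::string, std::string> sections_;
};
```

The difference in shape is the point: queued messages are consumed once by one recipient, while blackboard entries persist and can be read by any number of entities.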
Compiling A SimBionic Project
Compiling the project is fairly straightforward. Once all the descriptors and declarations (entities, actions, predicates, behaviors, variables and constants) have been defined and entered into the Catalog, and the actions, behaviors, conditions and connectors have been established on the Canvas, the project can be compiled by selecting the Build menu from within the visual editor. Any errors encountered during compilation are reported in the Build pane of the Output Window.
Debugging A SimBionic project
The interactive debugger in the visual editor lets SimBionic projects execute clock-tick by clock-tick, so you can see how behaviors perform. The debugger also provides access to the value of every descriptor in the project, as well as a view of the project's stack and messages, all displayed in the Debug pane of the Output Window. Like most debuggers, it supports breakpoints that fire either conditionally or simply upon being encountered.
Application Interface To The Game Engine
The SimBionic AI Program is executed by the run-time engine, which must be integrated into your game. The game engine can either link to the SimBionic run-time engine at the time the game is built, or the game engine can dynamically load the SimBionic run-time engine via a DLL. The SimBionic API is provided via seven header and C++ implementation files that must be included with the game project.
As mentioned earlier, additional custom code must be written by game developers to support the underlying code functionality of custom Actions and Predicates. This code would define and implement a derived version of a SimBionic interface class, and then override that class's virtual DoAction() and DoPredicate() member functions in order to achieve an appropriate interface.
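The dispatch pattern that results looks roughly like the sketch below. The base-class name and the exact DoAction()/DoPredicate() signatures here are assumptions for illustration only - the real declarations live in the SimBionic header files:

```cpp
// Hypothetical derived interface class; the real base class and member
// signatures come from SimBionic's headers, not this sketch.
class GameInterface /* : public <SimBionic interface class> */ {
public:
    virtual ~GameInterface() = default;

    // Called by the run-time engine when a custom action fires.
    virtual bool DoAction(int actionId, int entityId) {
        (void)entityId;
        switch (actionId) {
            case 0: /* e.g. play a reload animation for entityId */ return true;
            case 1: /* e.g. fire the entity's weapon */           return true;
            default: return false;   // unknown action
        }
    }

    // Called when a custom predicate needs evaluating.
    virtual bool DoPredicate(int predicateId, int entityId) {
        (void)entityId;
        switch (predicateId) {
            case 0: /* e.g. is the entity's weapon jammed? */ return false;
            default: return false;
        }
    }
};
```

The design keeps all game-specific code on the game's side of the boundary: SimBionic decides *when* an action or predicate runs, and the overridden member functions decide *what* it does.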
The API lets you:
Create, make and destroy entities
Obtain entity IDs
Get last error
Initialize the run-time engine
Set behavior for an entity
Set global variables
Set update frequency
Set update priority
Terminate the run-time engine
Update the run-time engine clock
Cause update to run for an entity
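Taken together, those capabilities suggest an integration flow along these lines. Every call name below is an invented stand-in - the real SimBionic API names will differ - and the stub bodies exist only to make the control flow concrete:

```cpp
#include <string>

namespace sim {                        // hypothetical wrapper, not the real API
    bool initialized = false;
    int  nextEntityId = 1;
    bool Initialize()                            { initialized = true; return true; }
    int  CreateEntity(const std::string& /*n*/)  { return nextEntityId++; }
    void SetEntityBehavior(int, const std::string&) {}
    void UpdateClock(float /*dtSeconds*/)        {}
    void Update(int /*entityId*/)                {}
    void Terminate()                             { initialized = false; }
}

// Typical flow: set up once, assign behaviors, then tick every frame.
int runOneFrame() {
    sim::Initialize();                            // start the run-time engine
    int guard = sim::CreateEntity("Guard");       // obtain an entity ID
    sim::SetEntityBehavior(guard, "PatrolPerimeter");
    sim::UpdateClock(0.016f);                     // advance one ~60 Hz frame
    sim::Update(guard);                           // run the guard's behavior
    sim::Terminate();                             // shut the engine down
    return guard;
}
```

In a real game the clock update and per-entity updates would sit inside the main loop, with the update frequency and priority calls used to keep AI cost within the frame budget.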
Summary of SimBionic
SimBionic provides a sophisticated framework for creating and debugging state systems. Since FSMs are so widely used by game developers, SimBionic is an alternative to the custom FSM development that goes on in game development today. In addition, SimBionic's Visual Editor makes the development of these state systems "designer" accessible - once the world, objects and behaviors have been described and declared, and any necessary code has been hooked in.
About the Author(s)
Eric Dybsand ([email protected]) has consulted on an extensive list of computer games, including designing and developing the AI for Full Spectrum Command, a tactical command simulator used by the US Army. He has also designed strategic AI for MOO3 and AI for racing, baseball and wrestling games; developed the AI opponents for the RTS game ENEMY NATIONS, the FPS games REBEL MOON REVOLUTION and THE WAR IN HEAVEN, and a number of turn-based wargames. Eric has been involved with computer game AI since 1987, doing game design, programming and testing, and is a contributing author on AI to the Game Programming Gems and AI Wisdom series.