For the next part in Gamasutra's 'Tooling Around' feature, which profiles and interviews middleware and tools developers about their products, today's interview is with Dr. Paul Kruszewski, CTO of Engenuity Technologies and creator of AI.implant.
The software, currently at version 5.0, is a multiplatform artificial intelligence solution used for populating virtual worlds with computer-controlled characters such as vehicles and people. AI.implant allows these characters to make context-specific decisions and move independently in a realistic fashion using its Dynamic Path Refinement system, which responds to player choices.
AI.implant has been integrated into Epic’s Unreal Engine 3, and is currently being used by Vivendi Games, Electronic Arts, Midway and BioWare, for a “major upcoming title”.
We spoke to Dr. Kruszewski recently, and asked about the company, AI.implant and its place in the industry.
When and why was Engenuity formed?
Engenuity was originally called Virtual Prototypes when it was founded back in 1985. Since then, it went public in 1999 and became Engenuity Technologies Inc. Engenuity is a company dedicated to developing tools for the design, visualization, and implementation of data in both real and simulated environments.
The products at Engenuity span from avionic design tools, to simulation software, to artificial intelligence tools and middleware. Engenuity acquired a number of companies over the years and one such company was BioGraphic Technologies, where AI.implant was invented.
What were the aims and goals of the company at this time?
When Engenuity acquired BGT, the goal was to bring an emerging game technology — AI.implant middleware — to the simulation world, while at the same time developing the games middleware business. Maintaining a significant stake in the games world continues to be important to Engenuity due to the faster development cycles, and the continuous demand for innovation – both of which have a strong positive influence on our development of simulation technology.
How did you realize the need for a product like AI.implant?
I envisioned AI.implant in 2000 after working as a CTO at an up-and-coming Montreal video game studio, A2M. At the time, it was clear the visual quality of games was beginning to flat-line, and one of the next big trends would be AI driving a higher quality of immersive gameplay through intelligent characters and dynamic pathfinding. 5,000-polygon characters, well lit and rendered, are of limited use if they walk into walls.
What was the development time on the product, and what challenges did you run into in preparing the product for industry use?
Five years and counting. No, really! AI.implant has to evolve as fast as or faster than the video game industry; clearly, no easy task. And because we sell to both film and game customers, we face the huge challenge of developing a system that can deliver cinematic motion quality to thousands of characters in films like Stargate Digital's Spartacus, while at the same time being flexible enough to scale to the 30-frames-per-second world of video games.
How has the product developed over the time you've been producing it?
We have always tried to stay one step ahead in our development process, such as becoming the first AI tool integrated in Unreal Engine 3, and that’s what our customers really appreciate.
We initially started out with a strong view to creating crowd scenes for high-end film and post projects. That was a big challenge for us, but eventually we became an integrated plug-in for both 3ds Max and Maya and our customers were able to easily implement crowds into their rendered scenes.
Our product is still used in high-end film projects; most recently, Fuel used us in their work on Charlotte's Web to power a multitude of baby spiders. Once we felt we had nailed the offline world, we were already pushing hard at being the AI tools and middleware for the games market. At first, the consoles just didn't have the CPU power to pull off a lot of what our tool did, but we had some early customer wins that looked hot, especially Midway's Psi-Ops: The Mindgate Conspiracy. And once the new generation of consoles came out, like the PS3 and the Xbox 360, we knew that AI was going to become the next big thing in games.
How have you acted on feedback to improve AI.implant?
In recent years the most notable changes made to AI.implant have been our integration with Epic's Unreal Engine 3 and our ability to manage dynamic physical worlds. AI.implant has always prided itself on tools for the production team, such as level designers, so with the popularity of UE3 within our client base we took on the challenge of integrating our production tools with the UE3 Editor.
The second trend was to build a system we call Dynamic Path Refinement to enable our AI characters to maneuver intelligently in physical worlds that change based on player actions.
How does the product work on a technical level?
Logically there are two parts to AI.implant: the authoring and debug tools, and the runtime-processing component.
Authoring in AI.implant has two main components. First, AI.implant is used to author the decision-making logic and locomotive characteristics of the characters themselves using a data-driven physical system.
Second, AI.implant allows creation of the ‘AI world’, which is to say, a special version of the environmental data used by the AI character for perception, and path planning even within a dynamically changing environment. Depending on the customer’s workflow, this environment can be created right in a level editor like UE3, or Instinct Studio, directly within AI.implant via AI.DE (AI Development Environment), or in an art tool like Maya or Max.
During run-time it’s AI.implant’s job to iterate through a “perceive, decide, act” loop in every frame, for every character. This loop is made up of the following decisions and actions:
Perceive – what is my environment telling me?
Decide – based on my intentions and the environment, where should I go?
Act – calculate my new location, orientation, and speed, pass that information back to the host application to reconcile with the physics and animation, and hand control back to AI.implant to iterate once more.
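The per-frame loop described above can be sketched roughly as follows. This is a minimal illustration, not the actual AI.implant API: all class and method names are hypothetical, and movement is one-dimensional to keep the example short.

```python
# Hypothetical sketch of a per-frame "perceive, decide, act" loop.
# Names are illustrative only, not the real AI.implant interface.

class World:
    def __init__(self, obstacles):
        self.obstacles = obstacles  # obstacle positions on a 1D line

class Character:
    def __init__(self, position, goal):
        self.position = position
        self.goal = goal

    def perceive(self, world):
        # Perceive: what is my environment telling me? (nearby obstacles)
        return [o for o in world.obstacles if abs(o - self.position) < 2.0]

    def decide(self, percepts):
        # Decide: based on my intentions and the environment, where should I go?
        step = 1.0 if self.goal > self.position else -1.0
        if any(abs((self.position + step) - o) < 0.5 for o in percepts):
            step = 0.0  # an obstacle blocks the next step: don't walk into it
        return step

    def act(self, step):
        # Act: compute the new location; a host engine would reconcile this
        # with physics and animation before the next iteration.
        self.position += step

def tick(world, characters):
    # One frame: run the loop for every character.
    for c in characters:
        c.act(c.decide(c.perceive(world)))

world = World(obstacles=[5.0])
hero = Character(position=0.0, goal=10.0)
for _ in range(6):
    tick(world, [hero])
print(hero.position)  # 4.0 — stops short of the obstacle at 5.0
```

The point of the sketch is the division of labor: perception and decision stay inside the AI layer, while the final movement is handed back to the host application each frame.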
How has AI.implant's integration into the Unreal Engine affected the development of the product?
As mentioned earlier, AI.implant has always ensured that both programmers and level designers were provided with tools to do their jobs. Therefore our integration with UE3 was intended to provide these same tools, but in the UE3 environment so the client’s workflow isn’t interrupted.
In the end, this meant building a UE3 plug-in that would expose AI.implant's decision tree GUI, debug visualization, and events in Kismet and UnrealScript, plus much, much more.
Do you feel your product works best when combined with other middleware?
Middleware is simply a software component that game studios have decided to outsource rather than build themselves. Therefore, it really makes no difference where the complementary technology originated. That being said, the quality of the other technical components, such as rendering, animation, and physics, which either feed or are fed by AI.implant, greatly affects the value of AI.implant in a game.
What are some of the more notable examples of the product's use?
Some of the higher-profile examples include Midway Entertainment's upcoming release of Stranglehold, demonstrating AI in massively destructible environments. On the simulation side, we are proud of the work Lockheed Martin STS is doing with their Virtual Combat Convoy Trainer, a fully enclosed visual simulation that allows multiple soldiers to drive through a city in a Humvee. All the combatants and civilians are driven by AI.implant. If Microsoft sold an Xbox 360 game for a million dollars, this would be it!
Who is currently using the product?
From the games industry, Midway (on Stranglehold and Area 51), Vivendi Games, BioWare Corp, SCEA, and Electronic Arts are using AI.implant.
What do you see as the next evolution of AI.implant?
Ambient Crowds. We've seen in military simulations how hundreds of entities drastically affect one's perception of the realism of their world. Today, many first-person shooters deal with only the red and blue teams, and completely ignore the fact that most environments have innocent bystanders, whose reactions to the player's actions create an enhanced, visceral effect on their experience.
AI.implant makes the creation of hundreds or even thousands of ambient characters a reality. Game or simulation designers can author character brains individually or using templates. When using group brain authoring, you can create crowd segments.
For example, in a crowd of 100 you might have 20 characters associated with an "outgoing" brain, 30 associated with a brain that has a propensity to use a cell phone while in a large group, and so on; it is very flexible. The fact that these brains are authored in a visual environment means that they can be tweaked by anyone in the development process, from designers to programmers.
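The segment-based authoring described here can be illustrated with a small sketch. The template names, fields, and `build_crowd` helper are all hypothetical, assumed for illustration; they are not the actual AI.implant authoring format.

```python
import random

# Hypothetical sketch of group brain authoring: brain "templates" are
# assigned to segments of a crowd. Names and fields are illustrative only.

BRAIN_TEMPLATES = {
    "outgoing":   {"walk_speed": 1.4, "greets_others": True},
    "phone_user": {"walk_speed": 1.0, "uses_phone_in_groups": True},
    "default":    {"walk_speed": 1.2},
}

def build_crowd(segments, seed=0):
    """Build a crowd from (brain_name, count) segments."""
    crowd = [dict(BRAIN_TEMPLATES[name], brain=name)
             for name, count in segments
             for _ in range(count)]
    random.Random(seed).shuffle(crowd)  # mix segments so the crowd isn't clumped
    return crowd

crowd = build_crowd([("outgoing", 20), ("phone_user", 30), ("default", 50)])
print(len(crowd))                                          # 100
print(sum(1 for c in crowd if c["brain"] == "outgoing"))   # 20
```

Because each character only references a template, a designer can retune one brain (say, the "outgoing" walk speed) and have the change apply across its whole crowd segment.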
In the case of simulations, this allows a subject matter expert – say someone who has experienced real crowds in a war zone—to tweak the brain settings for maximum believability. This helps prepare the trainee by giving her/him a more ‘real’ experience. And in games it just makes shooting a gun in a crowded space way more believable!