Make-Believe Tanks: a Report from the Trenches of the VizSim Industry

Holger Gruen describes his experience working in the VizSim industry developing tank simulators for the military, and compares the similarities and differences between the simulation industry and video games, in this fascinating comparative feature.

Holger Gruen, Blogger

January 31, 2006

21 Min Read

Introduction

As a former game developer who has worked in the Visual Simulation (VizSim) industry, I'd like to share my experiences. We have all read a lot about serious games companies that apply game technology to VizSim-like training and learning games. As you will see, there are a lot of similarities between the games industry and the VizSim industry that make VizSim companies attractive to game developers – especially as it seems to be a more respectable and serious job, and you'll probably get a thumbs up from your mom.

There are also things that make the traditional VizSim business different from what you will probably experience in the games business. These differences start with the terms used for technology components you've learned to understand. Prepare to no longer say ‘render engine', but ‘computer image generator' (CIG) instead! The focus of the world simulation is different in the VizSim industry, but most fundamental techniques are the same.

My impression is that games have superior technology in various contexts that are relevant for both industries. Still, games don't provide the same sort of simulation fidelity that simulators do – partially because there is no need to do so. I'll try to shed some light on more details while explaining the training simulator environment.

The following text is not intended to be a general text about the VizSim industry – I obviously can only tell you how it works where I worked, and how it compared to my time in the game industry.

How did it all happen?

A few years ago I was working for a games and game technology company as the head of development. Unfortunately, this company went under in the dot-com crash. Since I did not want to relocate my family again, I tried to find a nearby position. On an online job portal, I stumbled across a job ad mentioning ‘OpenGL' and ‘visualization.' These are among the things I am experienced in, and I applied for the job. Funnily enough, four of my former colleagues had also applied for other jobs at the same company, and we all ended up working there.

Simulators for ground vehicles

The company we worked for mainly builds tanks and armored vehicles. It also has a division that builds training simulators for everything that drives on the ground (e.g. cars, trucks, tanks, trains, metros and subways). In addition to driving simulators, they also do battle training simulators for all sorts of tanks. Since people who buy tanks are also interested in training equipment for these tanks, especially simulators, the simulation division profits a lot from its big ‘mother.' This text will mostly cover military simulators, since this is what I worked on.

The project I worked on was a networked battle tank simulator. It used 13 PCs per cabin to render 13 different views from inside the tank. The simulated and displayed world was up to 80 x 50 kilometers in size, and 110 tanks plus effects had to be rendered at a constant 30Hz with 4xAA at 1280x1024. Trees and houses in the database were supposed to be destructible. Dynamic craters from artillery impacts and dynamic track marks from all tanks had to be supported. Furthermore, dynamic shadows from all major scene parts due to a moving sun had to be rendered. Lighting with dynamic and moving light sources also needed to work. In addition to normal views, various sensor imaging techniques that simulate IR or light amplifier devices had to be implemented. Each of these in isolation was no problem, but given the size of the scene and a visibility range of over 5.5 kilometers, the task was demanding.

Simulator Hardware

The budget for a simulator varies a lot depending on the kind of simulator to be built. A complicated training simulator with multiple networked cabins has a budget comparable to that of a AAA title, or may even exceed it. In general, though, the budget for a civil simulator is well below that of a AAA title.

Hardware is an important part of every simulator. A big chunk of the budget for a training simulator is spent on hardware. This means there is less money available for software and asset creation than for an AAA title. For one project the hardware even included the construction of a building that contained all the training equipment.

In general the hardware is a cabin (see e.g. http://www.sdtb.kiev.ua/sdtb_train_en.htm) with an interior that is more or less a perfect copy of the cabin of the real vehicle. All switches, steering wheels, knobs and other equipment are present in the simulator cabin, and are functional as in the real vehicle.

For some simulators the cabin is mounted on a hydraulic motion system that provides motions inside the cabin that are perceived as acceleration, deceleration or other forces acting on the vehicle.


A turret training simulator

Turret training simulators feature a cabin that is actually a copy of the real turret of the simulated tank and can weigh up to several tons (see e.g. http://www.c-its.com/svensk/simulation/CATS_Turret_Leo.html). The turret sits on electrical motors that rotate the turret and its inhabitants based on the handles operated by the gunner. Turret trainer hardware also moves the gun up and down and even allows reloading of the gun to be trained. In addition to the gunner, the other tank crew positions are also supported and are provided with artificial visuals.

Visualization for simulators does not always happen on monitors, but very often on non-planar projection screens that show edge-blended images computed by several CIG (computer image generator) computers and cast by expensive projectors.

Each simulator features a bunch of computers doing vehicle simulation, physics simulation, sound simulation and visualization. So it's not at all like the hardware environment of a computer game where there is exactly one computer doing all the work. All this hardware needs to be controlled by software, of course, and this is where we begin to see more similarities to game development.

Simulator Software

I'll now try to give some insight into the different software components that run inside a simulator – most of which run on their own dedicated computer or even on more than one computer. Very often, simulation software is a distributed software system.

The relatively big number of computers in one simulator is quite astonishing for a game developer. I still ask myself from time to time – how can one need so many computers for tasks that run on one computer in a typical game? There are obvious reasons for some of the computers to be around, especially for visualization, as we will see later. One thing to consider, though, is OS boundaries – e.g. real-time 3D rendering happens on Windows PCs, whereas the rest of the simulation usually happens on Linux PCs. Most of the simulation software is written in Ada – a semi-dead language that actually is pretty nice if you take a closer look at it. The real-time 3D software, on the other hand, is written in C++. Although there are Ada compilers for Windows, the simulation people I worked with feel a lot of hate and disgust for Windows and simply don't trust it, thus limiting their solution space.

Vehicle simulation

At least one computer per simulator cabin, usually called the ‘Simulation-Host', polls all input switches and handles of the cabin hardware to simulate the inner state of the vehicle. The Simulation-Host also talks to a software component that provides a spatial index for collision detection and collision response for the physical simulation of the position and orientation of the simulated vehicle.
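
To make this concrete, here is a minimal sketch of what such a Simulation-Host loop might look like. The CabinIO, SpatialIndex and VehicleModel names are invented for illustration – they are not the actual in-house interfaces.

```cpp
// Minimal sketch of a Simulation-Host main loop. CabinIO, SpatialIndex and
// VehicleModel are invented names for illustration, not the real interfaces.
#include <chrono>
#include <thread>

struct CabinInputs { float throttle = 0, brake = 0, steering = 0; /* ...switches... */ };

struct CabinIO {               // reads the real cabin hardware (stubbed here)
    CabinInputs poll() { return CabinInputs{}; }
};

struct SpatialIndex {          // collision queries against the database
    bool collide(const float pos[3], float radius, float outNormal[3]) {
        (void)pos; (void)radius; outNormal[2] = 1.0f; return false;
    }
};

struct VehicleModel {          // inner vehicle state + dynamics
    float pos[3] = { 0, 0, 0 };
    void step(const CabinInputs& in, float dt) { pos[0] += in.throttle * dt; }
    void resolveCollision(const float normal[3]) { (void)normal; }
};

int main() {
    CabinIO io; SpatialIndex world; VehicleModel tank;
    const float dt = 1.0f / 60.0f;                     // fixed simulation rate
    for (int frame = 0; frame < 600; ++frame) {        // ten simulated seconds
        CabinInputs in = io.poll();                    // switches, handles, pedals
        tank.step(in, dt);                             // simulate the inner vehicle state
        float n[3] = { 0, 0, 0 };
        if (world.collide(tank.pos, 2.0f, n))          // ask the spatial index component
            tank.resolveCollision(n);                  // collision response
        std::this_thread::sleep_for(std::chrono::duration<float>(dt));
    }
    return 0;
}
```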

The Simulation-Host learns about the other entities in the simulation world only through TCP/UDP data sent via the simulation domain's standard protocols DIS (short for Distributed Interactive Simulation – see people.freebsd.org/~jha/aboutdis.html) or HLA (www.informs-sim.org/wsc98papers/227.PDF), using multicast or broadcast. Other entities are other simulators or AI vehicles, which are called SAFs (Semi Automated Forces) or CGFs (Computer Generated Forces) in the VizSim world.

The rules of DIS are simple: the Simulation-Host is only allowed to tell other simulation participants about its own state changes. It cannot, for example, decide that a shot from its main gun has destroyed another entity. What it is allowed to do is to tell everybody that it has caused a detonation at a specific location. Deciding whether other entities take damage from this detonation is up to the software simulating the states of those entities.
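
A small sketch of this rule, with simplified stand-in types – the real DIS PDUs defined by IEEE 1278 are packed binary wire formats, and broadcast() here is just a placeholder for the UDP multicast send:

```cpp
// Illustrative sketch of the DIS publishing rule. The real DIS PDUs
// (IEEE 1278) are packed binary wire formats; these types and broadcast()
// are simplified stand-ins for illustration only.
#include <cmath>
#include <cstdio>

struct DetonationPdu { unsigned firingEntityId; float location[3]; };

void broadcast(const DetonationPdu& pdu) {   // stand-in for the UDP multicast send
    std::printf("entity %u reports a detonation at (%.1f, %.1f, %.1f)\n",
                pdu.firingEntityId, pdu.location[0], pdu.location[1], pdu.location[2]);
}

// Sender: the Simulation-Host may announce that it caused a detonation...
void fireMainGun(unsigned myId, const float impact[3]) {
    broadcast(DetonationPdu{ myId, { impact[0], impact[1], impact[2] } });
    // ...but it may NOT decide that it destroyed any other entity.
}

// Receiver: every participant decides locally whether its own entity takes damage.
bool takesDamage(const DetonationPdu& pdu, const float myPos[3], float lethalRadius) {
    float dx = pdu.location[0] - myPos[0];
    float dy = pdu.location[1] - myPos[1];
    float dz = pdu.location[2] - myPos[2];
    return std::sqrt(dx * dx + dy * dy + dz * dz) < lethalRadius;
}

int main() {
    const float impact[3] = { 100.0f, 0.0f, 50.0f };
    fireMainGun(42, impact);
    const float myPos[3] = { 103.0f, 0.0f, 52.0f };
    std::printf("this simulator takes damage: %s\n",
                takesDamage(DetonationPdu{ 42, { 100.0f, 0.0f, 50.0f } }, myPos, 10.0f) ? "yes" : "no");
    return 0;
}
```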

The Simulation-Host also talks to the computers running the visual systems, or CIGs, that produce the real-time 3D views for all windows or other viewing devices inside the vehicle. I'll talk more about 3D rendering later. The Simulation-Host also talks to a sound system component to produce the sounds heard from inside the vehicle.

The Simulation-Host is the VizSim equivalent of what in the games domain is the player object or PC (player character) and its simulation. The SAFs are the VizSim world equivalent of NPCs (non-player characters).

Since simulating a tank – all its states, its optics and its other components – is pretty complex, a whole computer is allocated to this task. A modern tank also has an onboard computer that shows a tactical map of its surroundings and a way to transmit tactical symbols to friendly tanks in an encrypted data format; this needs to be simulated as well, along with radio messages between all tanks in a scenario.

Physics simulation

A few words on physics – the VizSim industry does not seem to use much physics middleware like Havok or MathEngine. They do have pretty advanced custom physics simulation software for tank physics, train physics and so on, but no general rigid body physics software. For a game physics developer, a VizSim company might be a good choice, because they are just beginning to explore things like full rigid body physics, soft-body or even cloth simulation. The physics solutions are not as refined in terms of the mathematics used – I did not see techniques like adaptive time steps, Runge-Kutta or other integration schemes (such as implicit integrators, Verlet integrators, etc.) at work.
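
For readers who haven't met these integrators, here is a generic textbook comparison of explicit Euler and velocity Verlet on a simple undamped spring – nothing project-specific, just an illustration of the kind of technique I mean:

```cpp
// Generic textbook comparison of explicit Euler vs. velocity Verlet on an
// undamped spring -- just to illustrate the kind of integrator meant above;
// nothing here is project-specific.
#include <cstdio>

struct State { double x; double v; };

double accel(double x) { return -10.0 * x; }            // simple spring force / mass

State eulerStep(State s, double dt) {
    double a = accel(s.x);
    s.x += s.v * dt;                                    // explicit Euler
    s.v += a * dt;
    return s;
}

State verletStep(State s, double dt) {
    double a0 = accel(s.x);
    s.x += s.v * dt + 0.5 * a0 * dt * dt;               // velocity Verlet
    double a1 = accel(s.x);
    s.v += 0.5 * (a0 + a1) * dt;
    return s;
}

int main() {
    State e{ 1.0, 0.0 }, w{ 1.0, 0.0 };
    for (int i = 0; i < 200; ++i) { e = eulerStep(e, 0.05); w = verletStep(w, 0.05); }
    // Euler visibly gains energy over time, Verlet stays bounded.
    std::printf("after 10s: euler x=%.3f  verlet x=%.3f\n", e.x, w.x);
    return 0;
}
```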

AI and World Simulation

All artificial vehicles (NPCs), aircraft and human characters (usually called DIs – dismounted infantry) are usually simulated by yet another computer. Let's call this computer the ‘Battlefield-Simulator.' It does all the motion planning, strategic planning and animation control for AI entities. The algorithms and data structures are the same as those used in the games industry, e.g. A*, influence maps or state machines of various flavors. For the project I worked on, integrating AI middleware was one way of coping with the immense size of some of the simulation worlds that were used, and it showed impressive results.
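
As an illustration of the ‘state machine of some flavor' a SAF might run, here is a deliberately tiny example made up for this article – it is not the actual Battlefield-Simulator code:

```cpp
// A made-up illustration of the kind of state machine a SAF might run;
// this is not the actual Battlefield-Simulator code.
#include <cstdio>

enum class SafState { Patrol, Engage, Withdraw };

struct SafContext { bool hasLineOfSight; bool underFire; float health; };

SafState update(SafState s, const SafContext& c) {
    switch (s) {
    case SafState::Patrol:   return c.hasLineOfSight ? SafState::Engage : SafState::Patrol;
    case SafState::Engage:   return (c.health < 0.25f) ? SafState::Withdraw : SafState::Engage;
    case SafState::Withdraw: return (!c.underFire && c.health > 0.5f) ? SafState::Patrol
                                                                      : SafState::Withdraw;
    }
    return s;
}

int main() {
    SafState s = SafState::Patrol;
    s = update(s, { true, false, 1.0f });   // enemy spotted -> Engage
    std::printf("SAF state: %d\n", static_cast<int>(s));
    return 0;
}
```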

The world simulation also sends positional and other updates via DIS for all simulation participants to inform them about these entities. Therefore, you could call it the MMOG server of a training simulator. There are examples of simulator scenarios that have featured thousands of participants that were all connected via DIS and were spread across several continents.

AI seems to be the area with the most similarities between games and VizSim. My impression was that games have the more advanced AI technology. One thing to consider, though, is that not every cool AI produces effects that are good for creating reproducible exercises – which is what a VizSim instructor wants. A game has to look cool and must be fun to play. Training simulations, by contrast, aim to mimic existing technology and real scenes as closely as possible in order to fulfill training objectives. The objective is not to make the simulator fun to play with, but to achieve a training goal. The requirements that need to be fulfilled to reach those goals may even include things that games just leave out because they are hard to achieve – leaving them out might not be possible if a VizSim customer insists on a specific requirement.

Instructor stations

Every training simulator has one or more instructor station computers. These computers and the software they run allow the instructor to create missions for the trainees to fulfill. This pretty much works like the mission preparation mode that some tactical games feature. The instructor has a 2D or 3D tactical map of the world on which to place AI enemies, determine their paths and define how they react to events in the world. One example would be to tell an AI to open fire once it has a line of sight to a simulator. Furthermore, the instructor may create obstacles like mine fields. Obviously civil simulators feature different mission goals.

Most instructor stations also have one or two computers visualizing 3D views as seen from inside the simulator. The instructor can either freely move around the world with a 3D mouse, attach to a vehicle, or even switch to a view mimicking the one seen from inside a vehicle.

3D Visualization

The software responsible for rendering 3D real-time views is called CIG (Computer Image Generator) or simply IG, but is not called a render engine.

Most VizSim CIGs use a scene graph API of some flavor (e.g. VegaPrime from MultiGen www.multigen.com, Performer from SGI, or open source alternatives, e.g. OSG http://www.openscenegraph.org/). For those of you who are not familiar with scene graphs: a scene graph is a description of the world to render, with the scene data kept in a tree-like data structure. In the VizSim world, SceneGraph APIs are usually prepared to use multiple processors or cores. A typical setup uses two threads, each using one processor or core. Thread one does the culling, network polling and simulation for the next frame, while thread two feeds OpenGL or D3D with the output from the culling thread for the previous frame.
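
A rough sketch of that two-thread pattern follows – the DrawList type and the cullAndSimulate() and submitToGraphicsApi() functions are placeholders, not the API of any particular scene graph:

```cpp
// Rough sketch of the two-thread pipeline: one thread culls/simulates frame N
// while the other feeds the graphics API with the cull output of frame N-1.
// DrawList, cullAndSimulate() and submitToGraphicsApi() are placeholders.
#include <condition_variable>
#include <mutex>
#include <thread>
#include <vector>

struct DrawList { std::vector<int> visibleNodes; };                     // output of culling

DrawList cullAndSimulate(int frame) { return DrawList{ { frame } }; }   // traverse graph, poll network
void submitToGraphicsApi(const DrawList&) { /* issue OpenGL/D3D calls */ }

std::mutex              m;
std::condition_variable cv;
DrawList                pending;
bool                    ready = false;

void cullThread() {
    for (int frame = 0;; ++frame) {
        DrawList dl = cullAndSimulate(frame);                  // frame N
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [] { return !ready; });                  // wait until N-1 was consumed
        pending = std::move(dl);
        ready = true;
        cv.notify_one();
    }
}

void drawThread() {
    for (;;) {
        DrawList dl;
        {
            std::unique_lock<std::mutex> lock(m);
            cv.wait(lock, [] { return ready; });
            dl = std::move(pending);                           // pick up frame N-1
            ready = false;
            cv.notify_one();
        }
        submitToGraphicsApi(dl);                               // draw while culling runs ahead
    }
}

int main() {
    std::thread cull(cullThread), draw(drawThread);            // one thread per processor/core
    cull.join();
    draw.join();                                               // runs until the simulator shuts down
    return 0;
}
```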

On the hardware side, the VizSim people used to use multi-processor SGI hardware and often proprietary graphics hardware (costing $500,000 or more). When SGI went down and market pressure increased for more affordable hardware, this changed. High-end dual processor PCs with the highest-end graphics cards are now used.

When I joined the visualization team, it was using two commercially available SceneGraph APIs. They had lots of problems with code that worked around flaws and bugs inside these APIs. I was assigned the task of producing a prototype for a project that had to replace old custom CIG hardware with a PC-based CIG. It soon became pretty clear that none of the APIs we used could give us enough performance, so we started to look for alternatives on the open source side.

In the VizSim world the most commonly used standard for 3D scenes/levels – or databases, as VizSim people call them – is the OpenFlight file format from MultiGen. None of the open source solutions we looked at back then had a loader that could load and display our reference databases without errors. In addition to that, their performance was not very encouraging.

We had implemented several render engines on top of OpenGL and D3D before. One of us started to implement an OpenFlight loader while all the people who could say no were on holiday. After having implemented a basic loader, we wrote code to optimize the resulting SceneGraph for throwing optimized geometry patches at OpenGL. It all worked out fine, and after changing our code to use two processors/threads we had twice the frame rate that the other APIs could produce.
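
The kind of post-load optimization I mean is, roughly, merging small geometry nodes that share render state into a few large batches, so that OpenGL receives a handful of big draw calls instead of thousands of tiny ones. The types below are invented for illustration:

```cpp
// Sketch of the post-load optimization: merge small OpenFlight geometry nodes
// that share render state into large batches. Types are invented for illustration.
#include <cstdio>
#include <map>
#include <vector>

struct Vertex  { float pos[3]; float uv[2]; };
struct GeoNode { int materialId; std::vector<Vertex> verts; };   // as produced by the loader
struct Batch   { int materialId; std::vector<Vertex> verts; };   // one vertex array per state

std::vector<Batch> buildBatches(const std::vector<GeoNode>& nodes) {
    std::map<int, Batch> byMaterial;
    for (const GeoNode& n : nodes) {
        Batch& b = byMaterial[n.materialId];
        b.materialId = n.materialId;
        b.verts.insert(b.verts.end(), n.verts.begin(), n.verts.end());  // append triangles
    }
    std::vector<Batch> out;
    for (auto& kv : byMaterial) out.push_back(std::move(kv.second));
    return out;
}

int main() {
    std::vector<GeoNode> nodes = { { 1, std::vector<Vertex>(3) },
                                   { 2, std::vector<Vertex>(6) },
                                   { 1, std::vector<Vertex>(3) } };
    for (const Batch& b : buildBatches(nodes))
        std::printf("material %d: %zu vertices in one patch\n", b.materialId, b.verts.size());
    return 0;
}
```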

The custom SceneGraph API is still in use, now in its third iteration and using Cg and the OpenGL Shading Language. Nobody has ever thought it was a bad idea to have a custom API – especially as we had to pay a license fee for every computer we had to deploy the commercial scene graph software on.

The render engine business for simulators has in general the same goals as in the games industry. The pictures produced should look as real as possible and as nice and cool as possible. There are some differences though:

  • We needed at least a resolution of 1280x1024 with at least 4xAA at a constant frame rate of 60 or 30 fps. Anti-aliasing is very important for the VizSim industry, and flickering artifacts due to aliasing are not acceptable.

  • To make life even less pleasant, we had to support a far clipping plane at generally 5.5 km or even more.

  • In addition to a very large viewing distance, we also had to render things near the camera, e.g. the hull of the simulated tank.

    Optical devices in the tanks provide views with very high magnification factors. Customers simply don't accept flickering artifacts that come from less-than-optimal z-buffer resolution at over 5000 meters. Therefore you have to find solutions to cope with these requirements. Just think about what these z-buffer issues do to techniques like stencil shadows or (post-perspective) shadow maps – a small numeric sketch of the problem follows after this list.

  • LOD is also one thing that is handled differently; it is in general not acceptable for LOD to spoil mutual lines of sight. If one opponent sees you, a view-dependent LOD algorithm is not allowed to prevent you from seeing that opponent, and vice versa. I leave it to you to work out how that affects LOD construction and selection.

  • Projection of views by projectors onto non-planar projection screens has to be supported. Since geometry-correcting projectors are extremely expensive, other solutions had to be found, so we had to deal with edge blending and geometry correction in the CIG software. In addition to that, the channels need to be in sync to provide one stable, big virtual image.
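
Here is the numeric sketch of the z-buffer problem promised above – plain textbook math for a conventional perspective depth mapping, nothing project-specific:

```cpp
// Back-of-the-envelope sketch of the z-buffer issue: with a near plane close
// enough to show the tank hull and a far clipping plane at 5.5 km, a standard
// 24-bit depth buffer becomes very coarse at long range.
#include <cmath>
#include <cstdio>
#include <initializer_list>

// World-space distance covered by one depth-buffer step at eye distance z,
// for a conventional perspective depth mapping d(z) = f*(z - n) / (z*(f - n)).
double depthStepAt(double z, double n, double f, int bits = 24) {
    double oneStep = 1.0 / (std::pow(2.0, bits) - 1.0);   // one depth increment
    double dz_dd   = z * z * (f - n) / (f * n);           // inverse slope of d(z)
    return oneStep * dz_dd;
}

int main() {
    const double n = 0.5, f = 5500.0;                     // hull at 0.5 m, far clip at 5.5 km
    for (double z : { 100.0, 1000.0, 5000.0 })
        std::printf("at %6.0f m, one depth step spans about %.2f m\n", z, depthStepAt(z, n, f));
    return 0;
}
```

With these numbers, one depth step already spans roughly three meters at five kilometers – exactly the kind of error a high-magnification optic turns into visible flicker.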

All the requirements listed so far make it a challenge to design good CIGs for simulators. There is pressure to provide the same quality and richness of effects in simulators that games provide, and we've already done our share to close that gap. We've even brought motion-captured, bone-animated and skinned characters to the VizSim world. We were, on the other hand, constantly fill-rate and/or memory limited, so every additional texture or pixel-shader/fragment-program instruction hurt. It doesn't matter much if the frame rate of a computer game fluctuates, and even occasional jerkiness is no killer for a game. Unfortunately, neither is acceptable in the VizSim industry.

Networking

I've already talked about DIS. DIS defines standard UDP datagrams that potentially allow plugging simulators from different vendors together into one simulation. Imagine being able to plug together game worlds from completely different game vendors to start a multi-player match. All types of vehicles and effects are covered by DIS; there are even datagrams to describe flocks of shrimp – obviously for submarine simulation.

The problem of keeping the positions of distributed entities in sync is solved in DIS via a mechanism that is also known in the games industry: so-called dead reckoning. The software managing an entity sends out position updates only if it knows that an extrapolation based on the previously sent position and speed generates an error bigger than an acceptable threshold. The jumps in position that happen when a new update arrives are usually smoothed away on the CIG computers, which also do the extrapolation.
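
A hedged sketch of that dead-reckoning rule: the owner keeps extrapolating the last state it sent and only transmits a new update when the extrapolation drifts past a threshold. The types and the threshold are invented for illustration, not the actual DIS entity-state handling.

```cpp
// Sketch of dead reckoning: send an update only when linear extrapolation of
// the last sent state drifts past a threshold. Invented types for illustration.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

Vec3 extrapolate(const Vec3& pos, const Vec3& vel, double dt) {
    return { pos.x + vel.x * dt, pos.y + vel.y * dt, pos.z + vel.z * dt };
}

double dist(const Vec3& a, const Vec3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y) + (a.z - b.z) * (a.z - b.z));
}

struct DeadReckoner {
    Vec3 sentPos{}, sentVel{};
    double sinceSent = 0.0;
    // Returns true when a fresh entity-state update has to go out on the wire.
    bool needsUpdate(const Vec3& truePos, const Vec3& trueVel, double dt, double threshold) {
        sinceSent += dt;
        Vec3 guess = extrapolate(sentPos, sentVel, sinceSent);   // what everyone else thinks
        if (dist(guess, truePos) <= threshold) return false;
        sentPos = truePos; sentVel = trueVel; sinceSent = 0.0;   // remember what we sent
        return true;
    }
};

int main() {
    DeadReckoner dr;
    Vec3 pos{ 0, 0, 0 }, vel{ 10, 0, 0 };                        // tank accelerating along x
    for (int i = 1; i <= 50; ++i) {
        vel.x += 0.5;                                            // real motion deviates from last update
        pos = extrapolate(pos, vel, 0.1);
        if (dr.needsUpdate(pos, vel, 0.1, /*threshold=*/1.0))
            std::printf("update sent at t=%.1fs\n", i * 0.1);
    }
    return 0;
}
```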

Besides DIS, custom RPC protocols are used that allow the simulation computers to talk to the CIG computers. For the project I worked on, I implemented a layer that accumulates RPC calls and sends them all out at once via multicast to the 13 CIG computers displaying different views from inside the tank. The protocol is optimized to minimize handshakes between the simulation and the visualization.
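
Conceptually the batching layer looked roughly like the following sketch – the serialization and the multicast send are simplified stand-ins, not the actual protocol used in the project:

```cpp
// Sketch of the accumulate-and-multicast idea: calls are appended to one
// buffer and the whole buffer is multicast once per frame to all CIG computers.
#include <cstddef>
#include <cstdint>
#include <vector>

class RpcBatch {
public:
    // Append one call: a numeric opcode followed by its raw argument bytes.
    void call(std::uint16_t opcode, const void* args, std::uint16_t size) {
        append(&opcode, sizeof opcode);
        append(&size, sizeof size);
        append(args, size);
    }
    // One multicast send per frame instead of one handshake per call.
    void flush() { multicastSend(buffer.data(), buffer.size()); buffer.clear(); }
private:
    void append(const void* p, std::size_t n) {
        const auto* bytes = static_cast<const std::uint8_t*>(p);
        buffer.insert(buffer.end(), bytes, bytes + n);
    }
    static void multicastSend(const std::uint8_t*, std::size_t) { /* UDP multicast to the CIGs */ }
    std::vector<std::uint8_t> buffer;
};

int main() {
    RpcBatch batch;
    float turretAzimuth = 1.57f;
    batch.call(/*opcode=*/7, &turretAzimuth, sizeof turretAzimuth);  // e.g. "set turret azimuth"
    batch.flush();                                                   // once at end of frame
    return 0;
}
```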

Asset Creation

As mentioned above, levels are called databases and are very big compared to most game worlds. In the VizSim industry the process of creating databases and models like tanks, houses, vehicles and even human characters is more an engineering process than the creation of art. Textures are always scanned in. Models are built to match the dimensions of the original. Database creation is extremely expensive, and the outcome can't compare to game assets. This is why most simulators don't look as good as games. You still see simulators with shoebox houses. The VizSim industry has so far not realized that it would need a paradigm shift – embracing database creation as a process of creating art – to achieve the quality of game assets. Game developers seem to be more personally involved with what they work on, whereas VizSim people just seem to do a job.

The tools used for database creation work on the OpenFlight™ file format and come from MultiGen. For newer and more detailed databases, a tool called TerraVista is used to transform height and feature information into a set of OpenFlight™ tiles.

We brought Maya on stage because we wanted properly animated characters, but it is only used for character animation purposes and not for building databases or other objects. Character animation is another technology that is not up to game standards in the VizSim world. There actually is VizSim middleware like DI-Guy (www.bdi.com) for character animation, but it has only recently discovered things like facial animation or properly skinned characters.

Another thing we've tackled is automatic replacement of scene graph nodes containing trees with SpeedTree™ trees. Unfortunately we could only use this for demos because the in-house spatial index software could not be easily adapted to perform line of sight calculations with these trees – sigh!

Software Processes

The games community has started to take a more formal look at game projects and to treat them as software projects using techniques and processes from software engineering.

Software process is something the VizSim industry – at least its military projects – can't be imagined without. It was actually the customers of the project I worked on that prescribed what kind of software process was needed in order to get the contract. It was a big issue to show that the in-house process could be mapped to their process and that we were compliant.

We really had requirements databases and documents describing in detail what each part of the simulator was supposed to do. Customers and their consultants did reviews of the design documents and the requirements. They also set up their own test procedures to finally test the system. These tests included full-load and 24-48 hour tests to see if the system could really cope with a full day of training sessions.

Since I was in charge of the CIG subsystems for military simulators I also worked as an analyst transforming top-level customer requirements into testable and readable requirements. This is something I've never experienced in the game companies I've worked for – what comes close to this might be a good game design document.

The processes used are pretty rigid and old-fashioned, but that's how things work in the military business. It sometimes even seems as if all documents are only produced to keep the customer happy, not to increase the quality of the software. Note: never say ‘rapid development,' ‘agile methods' or ‘postmortem' – these are evil terms.

Company Culture

Just a few words about the company culture I experienced. Many VizSim industry companies are pretty big and seem to have a big set of partially contradictory rules and regulations that make them shoot themselves in the foot from time to time. If you change from a small, self-motivated team to a bigger company, you have to prepare for a culture shock. Make sure to read The Career Programmer: Guerilla Tactics for an Imperfect World by Christopher Duncan before considering a move like that.

A remark that only applies to some VizSim industry companies: hardware manufacturers have not realized that what actually gives their products a competitive edge is increasingly realized in software. They have already, without knowing it, transformed into software companies or are on their way to becoming one. The thing is that they only know about building hardware, not about software engineering, and it sometimes shows.


About the Author(s)

Holger Gruen

Blogger

Holger ventured into 3D real-time graphics writing fast software rasterizers in 1993. Since then he has held research and also development positions in the games and the simulation industry. He got into developer relations pretty recently and now works for AMD supporting game developers to get the best out of AMD graphics hardware. Holger, his wife, and his four kids live close to Munich near the Alps.
