
Designing for Immersion: Recreating Physical Experiences in Games

There's long been a question of how much realism is too much, and how much is just right for getting players engaged in your experience -- and it's tackled here by developers from Retro, Guerrilla, DICE, and more.

Michael Thomsen, Blogger

January 7, 2010

15 Min Read

It's easy to think about games in terms of winning and losing. They are a series of increasingly complex challenges that you either pass or fail. Reaching a fail state leads the player to a bitter few seconds of rebuke, before he or she is presented with the same problem again. This is the oldest and most basic stricture of game design, from Jacks to Go.

The inherent power of video games is their ability to take these rudimentary principles of interaction into an authored space that affects a player's senses. I remember teaching my younger cousin how to play Super Mario Bros. when we were kids. At every jump she jerked the controller upwards, and she cried out in mock pain upon falling into a lava pool.

Immersion is a concept that appears with great frequency in game design, but the means to conjure it can be elusive.

For all the attention to detail put into art and animation, inconsistent AI or awkward lip-synching can make players roll their eyes. Likewise, a grainy Pokémon game on the Nintendo DS can hold young players rapt for hours.

How do games hold players' attentions without running aground on disbelief and incredulity? How can designers turn a feeling into lines of code, and then into an experience that moves a player to keep pushing through an interactive fantasy?

It Moved Me, and It Moves Me Still

Movement is the most basic element of 3D game design. You can create a world and a series of rules to govern the objects in that world, but until there is a cipher to move among those objects the game is lifeless. Movement is also the first and most persistent layer of interaction through which developers are able to communicate with players.

"Movement is the core of the game and because of that we focused heavily on that until we got it right," said Owen O'Brien, senior producer on DICE's Mirror's Edge. "The movement was the first thing we developed before Faith, before the story, before anything else."

Without a cutscene, dialog box, or instructional manual, character movement can communicate a lot of dramatic qualities. In Mirror's Edge, the emphasis on momentum and the slight left-to-right sway of the camera with each step subconsciously draw players toward its acrobatic gameplay.
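As a rough illustration -- a hypothetical sketch, not DICE's actual code -- that kind of sway can be as simple as a lateral camera offset that follows the stride and scales with speed:

```python
import math

def camera_sway_offset(stride_phase, speed, max_speed, max_sway=0.05):
    """Lateral camera offset in meters for the current step.

    stride_phase: 0..1 over one full stride; the camera drifts left, then right.
    speed / max_speed: faster movement exaggerates the sway.
    max_sway: peak offset at full sprint (an illustrative value, not DICE's).
    """
    intensity = max(0.0, min(speed / max_speed, 1.0))
    return max_sway * intensity * math.sin(stride_phase * 2.0 * math.pi)

# Mid-stride at half sprint speed: a few centimeters of drift.
print(camera_sway_offset(stride_phase=0.25, speed=4.0, max_speed=8.0))
```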

Killzone 2 and Gears of War give players a sense of constraint with lumbering movement animations that subconsciously encourage methodical play and careful attention to cover.

"We had a series of designer-constructed test rooms for every aspect of player movement so we could prototype everything in exhaustive detail," said Michael Kelbaugh, president and CEO of Retro Studios, discussing the development of Metroid Prime. "Only when we had that really mastered did we begin the bulk of world construction."

It can be tempting to rush past this stage of the prototyping process to get to the actual content creation, but an evocative movement system can be the key to evolving a staid genre. Need for Speed Shift took one of EA's annual franchises out of the oxidizing conventions of its predecessors by focusing on first-person racing.

After bottoming out with 2008's Need for Speed Undercover, Shift added twenty points to its Metacritic score on a wave of largely admiring reviews.

"The key aim with the cockpit view was to translate that raw intensity that you feel in a real-life race car to a player holding a control pad," said Andy Tudor of Slightly Mad Studios, describing Shift's design. "At high speed we do a combination of things; blurring the cockpit out to make you concentrate on the upcoming apex, camera shake, and even having the driver's hands shake and grip the wheel tighter as they try to control the car. All these combined give you the cues you need to get an exhilarating sensation of speed."

These ambient flourishes suggest consequence to the in-game action that has a real-life counterpart. It's one thing to see your in-game performance evaluated through abstract meters in a HUD, but it's much more frightening to think a mistimed input could send you through the terrible experience of a full speed car crash in first person.

It can be easy to think about what the player is supposed to accomplish, but the best games also focus on what they want their players to feel while they're busy achieving objectives.

"We wanted the player to feel as if they were actually inside Samus' helmet," said Kelbaugh. "Our first idea was that beads of water could appear on the faceplate when Samus moved into and out of water or steam. When this test worked so well, we began to look for more opportunities to use this function, like enemy goo, Samus' reflection, and so on."

With the recently released Dead Space Extraction, Visceral Games and Eurocom invested a lot of time in motion capture, facial animation, and creating a library of first person movements to create a more cinematic horror experience.

"We were pretty fortunate that Eurocom has a really fantastic motion capture studio right there at their offices," said Wright Bagwell, creative director at Visceral. "We basically had one of the actors carrying the camera around and they were just acting out these big scenes we had designed."

"We discovered early on that if you have a guy running around with a camera and you take that capture straight out of the studio, it can be pretty obnoxious."

At What Cost, Gameplay?

While focusing on immersive systems can do much to freshen up a traditional genre experience, it can also frustrate fan expectations. Completing simple objectives can suddenly feel like a chore with a wobbly camera, motion blur, and disorienting animations.

Deciding which immersive features to leave out, and subtly balancing the ones you do include, can be a tricky business.

"One thing we investigated was -- during cornering -- having the driver's head tilt and look in the direction of the apex as it would in real-life," said Tudor. "Whilst it was a cool feature, we found through testing that that focal point wasn't necessarily where the player was looking, and in some cases it made them feel motion sick."

In developing Mirror's Edge, DICE built a lot of movement systems that wound up not working in the flow of the game as they had hoped.

"We tried what sounded like an awesome system for realistic foot placement whilst sidestepping," said O'Brien. "It looked great and had a more natural feel, but as a player you lost a degree of accuracy. This proved to be too annoying so we abandoned it."

With Resident Evil: Darkside Chronicles, Cavia tried to find a balance between playing on the senses while not getting in the way of the basic point and shoot gameplay. "As a gun-shooter, we needed to restrict the speed, not only to evade enemies but also to give players more time to aim," Kentaro Noguchi, producer at Cavia, said.

"We didn't feel the need to implement gun recoiling into the system. This game requires extensive concentration from the players, so we wanted to make sure not to include anything that would distract them from concentrating on the game."


Resident Evil: Darkside Chronicles

But what about games that actually want to make things hard on the player? Killzone 2 won praise for its terrifically detailed visuals and tactical multiplayer mode. It was also stung by criticism for its weighted controls that struck many players as an unnecessarily realistic inconvenience.

"What we have seen through many playtesting sessions is that a vast majority of gamers had no problems with the controls," said Mathijs de Jonge, Guerrilla's Creative Director in Killzone 2. "However, the more hardcore (and vocal) part of our audience needed some time to adapt to it."

"We added the ability to jump following the same mantra: no unrealistic double jump or insane low-gravity-jumps. And climbing ladders isn't simply bumping into it and then scrolling upwards. Your player character has to climb the ladder step by step."

There is no quick fix for determining what will or won't work. Finding the right balance is, in many ways, a luxury of having enough time to spend on experimentation, prototyping, and playtesting.

"At the end of the day we are making a game, not a simulation, so the dividing line needs to be judged by playtesting," said O'Brien. "Then you can see the point where realism just becomes irritating."

I Heard You Calling

Synaesthesia has become a buzzword for some high-minded abstractionists in the games industry, but 3D games can co-opt the phenomenon for their own immersive purposes. Video games are particularly adept at connecting sound and visual motion in a way that can produce a real physical sensation in the player.

"We did a lot of real-time post processing with DSP filtering on sounds to convey distances and room-space perception. Each space the player is in has its own reverb type, to keep audiovisual context," said Guerrilla's Mario Lavin, sound director on Killzone 2.

"We also wanted to make the player feel the weight of their equipment, so all the character movements trigger randomized equipment rattles, belt clicks, clothing rustles, etc."

Sound can play a major part in building atmosphere and suggesting mood to the environment and story. In Halo 3: ODST, for instance, your character will wheeze in pain after just a couple of shots, emphasizing the new vulnerability of the troopers in comparison to Master Chief.

In Darkside Chronicles, sound effects are used to insinuate short bursts of narrative without forcing players to watch cutscenes or read text. In the opening chapter, set during the events of Resident Evil 2, they establish a mood of imminent danger and imply eerily recent disappearances.

"The police siren follows, and the players become aware that they are under desperate circumstances," said Noguchi. "A song being played at a nearby shop confirms that there was normal human life happening just moments before the zombie invasion occurred."

To conventional thinking, this is wasted effort and geometry; players aren't doing anything here, they're just moving through space. In reality, players create associations with, and assumptions about, the gameworld at all times. Even if you're not killing zombies or collecting ammo, you're still taking part in the experience, picking out audiovisual cues at every turn.

Sometimes a combination of visual and sound cues can be used to completely replace the HUD, letting players absorb all the critical information they need from the game world itself. "We spent a lot of time working on Faith's breathing, the sound of her clothing and footsteps, for instance. That can tell you so much about what speed you are travelling at, over what surface, or if your wall run is about to end," said O'Brien.

"In most games, footsteps are a pretty simple thing to add, but running and moving was so integral to Mirror's Edge that we had to create a huge library of footsteps and a system to manage them. We had them for different speeds, different surfaces, different landings; the list goes on and on. The breathing system was also key. During playtests we actually saw players starting to sync their breathing with Faith!"


Mirror's Edge

With different sounds competing for audio bandwidth, proper layering can be vital. "Our audio team spends a huge amount of time and iteration to make sure that not only are the individual sounds unique and reinforce the game atmosphere, but that they complement one another, so the player does not lose the individual sounds in a cacophony of sources," Retro's Kelbaugh told me.

In the movie industry, there's a saying that if you're having a problem with the third act of your screenplay, the real problem is in the first act. The analog for game development is in planning. "It might seem kind of obvious that the audio always comes last when you're building a game, and you tend to not have the time," said Visceral's Bagwell.

"It was really important to us to lock the game early enough to give the audio guys enough time to make the game really sing and make it sound amazing. If the level designers are still moving things around, or they're still moving the cameras around, you can't really set that up."

Having big budgets and long development cycles can definitely make it easier to focus on atmosphere and immersive flourishes, but immersion can still be accomplished with some rigorous upfront planning that identifies what qualities are most important to your game. Would you rather spend an extra two months prototyping or add an extra two levels to the end-game gauntlet?

A Little Je Ne Sais Quoi

In talking about Shift, Tudor described to me the decision-making process that went into the varying intensities of camera shake to emphasize speed and acceleration. "Initially, we set these effects to speed thresholds -- 100mph, 200mph, etc. But we found that the player wasn't getting the same level of intensity in the early vehicles as they were in the high-powered Zondas and Veyrons," he said.

"In the end, we changed it so it was relative to the individual vehicle's top speed; i.e. speed down a straight on the limit in a Ford Focus, and it's still an intense experience."

As I played the game I found this positive feedback, combined with the nasty consequences of crashing, became the foundation around which I wrote my own drama.

I'd get nervous coming into straightaways because I knew I'd have to accelerate to the point of barely being in control, a simultaneously thrilling and nauseating experience.

I began to resent the other drivers on the track who would aggressively cut me off. They weren't just blocking my chance of winning -- they were threatening me with the catastrophe of a crash. I didn't feel like I was losing; I felt like I was in physical danger.

All of that from a racing game.

In Dead Space Extraction, Visceral and Eurocom created an equally forceful experience near the end of the game. An alien tentacle pierces the player's arm and pins them to the surface of the spaceship. Suddenly trapped and vulnerable, the player has to chop their own arm off with a few terrible swings of the Nunchuk.

"One of the things we told [Eurocom] was we wanted to see moments where it felt like you had to make a decision and no matter what decision you made, the consequences were horrible," said Bagwell. "I think that's one of the key things about horror -- you're damned if you do and you're damned if you don't."


Dead Space Extraction

It's a brilliant manipulation, forcing players to do to themselves something they've spent most of the game doing to enemies in order to progress. It's the perfect kind of interactivity for a nightmarish horror game. And after it's over, the game forces players to wallow in the consequences as their character cries out in pain, rolling on the ground and clutching a bloody stump, complete with protruding bone shard.

The impression is of vivid, physical consequence, so audacious that I happily suspended my disbelief. It might not have been the most realistic scene but, just as Shift's visceral effects are more impressionistic than literal, the effect is remarkable for its emotional engagement. Winning is almost beside the point in this scene. It was a chance to pantomime a terrible dream for a few seconds.

The other day I was describing to a friend the demo play feature in New Super Mario Bros. Wii, in which Luigi will play a level for you. She hasn't played a video game in 10 years, but she immediately latched onto the zeitgeist of the backlash against that feature. "Don't you feel like you're cheating when you use it?" she asked.

For many, video games are a simple competition against a closed system, the net benefit being a slight puff of pride when you beat a level or clear a row of Tetris blocks. Games are played first and experienced second. With all of the new audiovisual tools and motion-based interfaces, a new possibility for gameplay as sensory experience is making a strong case for itself.

Feeling like you did something right in a game is one thing, but feeling like you did something that had an identifiable human consequence is the point at which games become more powerful than the mainstream art forms that have preceded them.

Fail states can become emotional experiences instead of reset points; mundane character movement can be made an act of voyeuristic inhabitation; coupling gameplay achievement with visual intensity can transform a video game into something of lasting human value.

Using technology to design sensory immersion is a long process, filled with false starts and failed experiments. But the payoff points toward one definitive principle: player feeling can be just as important as player achievement.

To make a game that actually connects the two, you'll need to plan for it, from the first day of brainstorming to the day your last release candidate has been certified. On these grounds a game designer and a poet can be said to have the same purpose, even while a mountain of technology and abstraction separates the two. Who'll be next to try and cross that mountain?


About the Author(s)

Michael Thomsen

Blogger

Michael is a freelance writer based in New York. He has covered video games for the ABC World News Webcast and the Q Show on CBC Radio. He has written for Nerve, the Brooklyn Paper, the New York Daily News, and IGN where he is a regular contributor and author of the Contrarian Corner series. You can follow Michael at his blog www.manoamondo.com.
