[Grapefruit Games' Eddie Cameron argues that striving to make video games more immersive is "a false design goal," in this #altdevblogaday-reprinted opinion piece.]
We seem rather fond of 'immersion', don't we? Good games are often equated with immersive games. See this 'Most Immersive Games of 2010' list, which might as well be titled 'Highest-Metascoring Games of 2010'. But what do we mean by immersion? And why do developers and the public lust after it?
Let's imagine that the game world used the same definition as everyone else: to be completely mentally engaged with something. Sounds good, right? If a game involves your mind so completely that you ignore your surroundings, and even time itself (dusk-to-dawn Civ sessions), it can't really be a terrible game. It might not be a good game, though: Farmville is immersive by this definition, and although exploitative, it is well made.
But we seem to have this idea that immersion is synonymous with suspension of disbelief, with a player feeling as though they are "in" the game. From the hoopla about the "Loading area…" in Oblivion to shaky cameras and blood spatter, we're still clinging to the unattainable 'Holodeck' ideal. If nothing else, realism is simply too expensive to pursue forever; at some point we're going to have to say "that'll do."
From a more theoretical perspective, immersion is a false design goal. Trying to make a game 'more immersive' often leads to things like minimised HUDs, transparent saves, and, worst of all, justification of game features through narrative (à la Assassin's Creed). It's as though, if players are reminded they're playing a game, their suspension of disbelief will be shattered and the game ruined.
The problem is that players already know it's a game, and manage to be fully involved anyway. Adding so-called 'immersive' features may look cool, but they don't function as intended, and may even detract from the experience if they interfere with play.
To pick on Assassin's Creed yet again, pulling players back to the future for the sake of an unnecessary plot device is simply cruel. It's as though the makers were so in love with the Holodeck idea that they thought a narrative one would be as good as a real one.
The core of the problem seems to be the idea that players 'become' their characters while in game. Many designers have the goal of minimising any interface between the player and their avatar. Kinect may be an opportunity for some unique control systems, but it isn't going to help anyone forget that they're controlling a virtual object. This is another symptom of the quest for 'immersion': pretending that all interactions outside of those prescribed to the player don't exist.
A game experience isn't created only while we press the right buttons, but also by how we think about it as a game. When we check info in the HUD, compensate for lag, navigate the menus, talk about it on forums, and even fiddle with graphics settings, we're adding to our personal experience of the game, and sometimes to others'. Superbrothers: Sword & Sworcery EP uses this well: its constant reminders to tweet don't distract you from the game world, and in fact help spread that world out into the real one.
Basically, don't be tempted by the popular and seemingly intuitive notion that an 'immersive' game has to make the player forget they're playing a game. Games can be abstract, games can have big-ass HUDs and obvious save features, and games can refer to the real world. Ignore people who claim that a health meter has ruined their game experience. It hasn't, and you know better.
Note: You can read more about the 'Immersive Fallacy' and other clever things in Rules of Play by clever clogs Katie Salen and Eric Zimmerman.
[This piece was reprinted from #AltDevBlogADay, a shared blog initiative started by @mike_acton devoted to giving game developers of all disciplines a place to motivate each other to write regularly about their personal game development passions.]