John Mawhorter, Blogger

May 14, 2009


The modern game is a schizophrenic beast, concerned at one moment with storytelling and at the next with gameplay. Much (far too much) has been written about the tension between the two, but I feel I have an interesting take on the issue.

The modern gaming industry produces, by and large, not games but multimedia products with large amounts of gameplay, with the cutscene as the secondary medium. I don't mean to blow a semantic issue out of proportion, and since gameplay is by and large what makes these products successful it is right to focus on it, but we should keep in mind how the current "video game" form differs from games as such.

Play is a discontinuous process, cut up by cutscenes, loading screens (though these are slowly being shifted around and shortened), and non-interactive in-game events. This is in contrast to board games, physical games, and some sports (commercial breaks, anyone?), which take the form of a single continuous play session.

Cutscenes are used because they work when used well, because they entertain the consumer, because we don't yet have technology advanced enough to tell the story through AI characters alone (though this is probably achievable on a limited scale), because Hollywood is the model for success, and because they give authorial control without the "boring" nature of text.

But as an aesthetic choice institutionalized only recently (I'm thinking of arcade and NES games, which have no real story to speak of beyond their setting), it must not be thought of as a permanent or indispensable feature of game-making. I can look slightly to the left of this blog-writing screen and see Cinematic Artist advertised as a job position; how long before the dream of media integration comes true and we get Hollywood directors slumming it by directing cutscenes?

Let it be noted that some solutions, like Valve's use of in-game dialogue and scripted action sequences, while more pleasing in my view, still don't count as gameplay as defined here.

While it may seem like a gimmicky idea, my dream game is one designed to be played in a single sitting, without saving and reloading (another pet peeve of mine) or stopping for a dose of story. Rewinding time (a gimmick in itself) was the first way that occurred to me to achieve this, but removing player death could also solve the problem. Failure occurring in ways other than death is another thing that isn't popular, but I see possibilities there.

This whole argument hinges on my belief that games are essentially not narrative during play, but rather after it. Games being the medium most closely linked to experience, one can ask: who self-consciously thinks about their own life as narrative in any way other than through memory?

Cutscenes, then, create an after-gameplay space which is linked to the remembered events and modifies them with its narrative to create a story. It is this distance between the content of the cutscene and the narrative (remembered) content of the game experience that makes cutscenes jarring for me and for others.

True integration of gameplay and story may be impossible, in the sense that instrumentalizing dialogue choices (for example) would make them ring false, and narrativizing gameplay can come off as absurd (the "you need one hundred Koopa shells to unlock the gate to Bowser's castle" line is tired).

Taking a look at real-life experience, however, may give us a model (Miyamoto's oft-quoted gardening comments are one example, but there are a million pleasing everyday interactions from which to take inspiration).

My mechanics-focused bias is fairly obvious here, and I'm not arguing that people should really change what they are doing (or that it's bad; audience preference is always a better judge), just that they should be less single-minded in their use of the conventions of the form.
