In the film industry, or more specifically Hollywood, convergence with game development has arrived. It has happened fast, and in a very big way. The next-generation landscape promises even more integration and spectacle in this direction. In this three-part feature we will take a look at how the three aspects of game audio - music, sound effects and dialogue - are affected by that arrival.
We begin part one with an overview of music, taking a look at the opening of the interactive world to Hollywood composers and the record industry, and how that content is becoming integrated into video games.
New Musical Structures: Communicating Interactive Structures to Traditionally Linear Film Composers
Migration from, not to, Hollywood
For a designer, producer or sound director, working with composers - not to mention big-name Hollywood composers - can be a challenge. Here we consider the inherent differences between content and structure in cinema and video game music.
It is often said that the games industry is perceived by composers as a stepping stone, where one can train, or at the very least get paid, until film or ‘real work' comes along. Over the last five years the stepping stone has transformed, and now offers far easier navigation in the opposite direction. Being a small-budget game composer has never really represented a clear path into the linear post-production of mainstream cinema, whose roles and employment hierarchy are rigidly defined after more than 100 years of industrial history. Instead, the most talented Hollywood film composers are migrating to games: larger audio budgets enable publishers to bypass ‘sample-based' scores and employ the best composers, arrangers and orchestras working in Hollywood. This gives the games industry unprecedented access to the highest quality of cinematic music.
Danny Elfman's recent work on Big Blue Box's Fable, and the more recent mention of Howard Shore's involvement in Webzen's SUN, instigate a trend for name composers that is equally becoming established for Hollywood voice talent, sound effects creation and screenwriting. Hollywood's finest actors, for example, are now lured to games by the fact that, among other incentives, rather than embarking on a year-long training and pre-production schedule and rigorous, tiring location shooting for a film, they can earn similar money for a few days of voice work in a comfortable sound studio.
There is a proven economic advantage to employing name actors and name composers on a video game: it gives public relations a hook to grab onto and justifies much larger PR budgets, which directly equates to increased revenue. Ask any producer how sound can sell more copies of a game and you will get the same answer: big-name voice talent. Now that the score is moving into that realm, it is time for the composers in our industry to integrate on a much larger scale. (1)
The incentives for the Hollywood composer are evident. Working on a game actually affords the composer a temporal luxury, in that the development time on a large game far outstrips the small amount of time available on a feature film. Traditionally a feature film commission requires that the entire score is written, arranged and recorded as soon as a temp edit of the film is created. There are exceptions to this: the film composer Gabriel Yared works exclusively on a film from day one of a project until it is completed; however, not many composers have this luxury. A final edit may result in a few changes to the timings and structure of the piece, but the period between the temp edit and the final edit is pretty much all the time the composer has to fully flesh out the score. So let's take a look at the videogame/film music landscape.
Bill Brown, composer for video games such as The Incredible Hulk: Ultimate Destruction and The Lord of the Rings: The Battle for Middle Earth, as well as motion pictures such as Michael Mann's Ali and Oliver Stone's Any Given Sunday, and recently the television series CSI: New York, suggests:
“First, I think something that is worth sharing is how qualitatively speaking, games, films, and TV music are merging. Over the past 10 years, we have been slowly bringing the consciousness of the value of live orchestra (that is taken for granted in films now) into games. […] Another thing that comes to mind is the ‘cinematic' approach to video games. This to me means more attention is being paid to how music is working to support the narrative of the game - music is now taking the next step in gaming to become a deeper part of the story-telling experience. Game developers are truly interested in the depth and dimension music brings to their product and are willing to invest more now than ever to take their project to that next level. Developers really understand that a 60-90 piece orchestra sounds better than orchestra samples and that makes a difference in the impact of their game. Triple-A titles and A-List films are enlisting some of the same players today. Howard Shore, one of my favorite composers, is included in that new cross-over group of artists. This concept of cross-over artists is becoming more and more the standard for our industry.” (2)
The fact that names like Howard Shore or Danny Elfman are mentioned with such excitement in game music circles reveals a great deal about music in games, especially as Elfman only wrote a main theme for Fable. Why aren't we talking about the other composers on Fable who adapted, fleshed out and integrated this ‘theme' into the core mechanics of the game? This is again representative of the way that games are marketed, in a similar way to films. There is probably little difference in quality between the work of the ‘non-name' composers on Fable and the work that Elfman did; however, Elfman's name is the currency. It is his name that is used as an index of quality in the public mind.
Garry Schyman, composer for the games Destroy All Humans and Voyeur, and the films Lost In Africa, Horse Player and The Last Hour, argues:
“When truly creative opportunities present themselves, composers, even Hollywood's most famous, will want to get on board. Games have evolved to a point where game music has become as important an element to games as it is to films, and the quality expected by game companies is very high now. I think game music is the place to be at the moment for any composer interested in plying his or her trade. What is likely is that composers will cross over back and forth between the two genres.” (3)
This idea of a crossover artist is something that both Bill Brown and Garry Schyman see clearly in the future of composers: a future where there will be no categorization of either ‘game' or ‘film', but simply ‘composers.'
There are some interesting reasons why the games industry would look to a composer of Elfman's caliber. It can be viewed as a sea change for game composition that breaks down some previous boundaries - in the eyes of gamers, critics and the composers themselves, games are becoming recognized as serious cultural artifacts. This is intensified by the huge sales the medium is generating, not to mention the maturing and stratification of the overall core demographic of gamers and game creators.
There are vital structural differences between the music required for a motion picture and the music required for a video game; nonetheless, the two remain aesthetically close.
“In film music you are writing to underscore and enhance the action or emotional experience the scene creates, or perhaps you are even finding a deeper meaning in the emotions the actors are portraying. But with films, or television for that matter, the scene and the music accompanying it, once locked, never change. Additionally, in a film score repetitive music can be an attribute, as the score will likely only be heard once. In games repetitive music can get turned off. Because in a game the players' choices determine the experience to a significant degree, what the player is seeing and experiencing is somewhat unique each time they play. This means that music is rarely accompanying the exact same visuals twice and can easily get boring if repetitive, or hard to listen to if abrasive. So I find that the approach is quite different, though there is the obvious similarity that you are using music to enhance the emotional experience of the viewer or player.” (3)
“Structurally, where film is static and games are dynamic, the two can share most other aspects [aesthetically] speaking. The score can follow an overall arc in both mediums, it can develop themes, underscore action, communicate exotic locations, and add dimension to the emotional landscape of either medium using similar tools.” (2)
From a technical and structural perspective, delivery formats in games and film are also moving closer together.
“With films and television the norm is to deliver a Pro Tools session with the music placed in time locked to picture. Music could be mixed as stereo tracks (still common in TV) or 5.1 mix (in nearly all films). Additionally, music could be delivered with separate stems (still Pro Tools session locked to picture) with various elements of the music separated out, giving the mixing stage the option of increasing or decreasing the volume of a particular musical element. With games it is common to deliver individual stereo WAV files that the audio lead will mix into the game. 5.1 mixes are beginning to be more common as well. Finally, it is common to deliver the music broken into separate musical stems so that different elements of the score can be brought in and out as gameplay dictates.” (3)
“Even the formats delivered to the developer and dubbing mixer can be the same. I deliver stems (separate instrument groups in tracks for each cue) to both my music editor for films and TV, as well as to some developers for use dynamically in games. This also gives them both an opportunity to mix in 5.1 where applicable.” (2)
Working with a solid system and a solid implementer is critical to the success of any interactive score. Garry Schyman continues:
“It is my experience that in games the music implementation has become a critical element in how you write the score. Film music implementation was settled 75 years ago and has changed little since. But I've found that with games the implementation creates challenges and can literally dictate the approach one takes. Music must be flexible enough to change with the player's experience, and yet it is not possible to write and implement dozens of hours' worth of music to fit every possible game scenario. The audio lead is often the first person you will play music for as you are writing the score. Because they have most likely been involved from the game's inception, they have a very good idea of the style and approach that the developer is looking for.” (3)
“In film, I work most closely with the director, film editor and my music editor. Ideally, the director has a clear vision that he or she can communicate to me as we collaborate on the project, and/or an open sense of creativity and collaboration that we use to the advantage of the film. This isn't always the case, so sometimes producers are involved creatively as well - hopefully, the conversation stays positive, creative and focused. My intention is to bring as much harmony to that process as I can for myself and others, because the more cooperation there is on all fronts, the better the end result will be - not to mention how much easier and more graceful the process will be for everyone.
In games, I most often work directly with the head sound producer and several of the people from the creative team on the project (designers, artists, writers, etc.), which helps me get a feel for the overall vibe of the game - similar to the creative process I have with a film editor and director.” (2)
Scott Morgan, the sound director at Radical Entertainment who worked with Bill Brown on The Incredible Hulk: Ultimate Destruction to implement his score within the game, concurs:
“As Sound Director and music implementer I really bookend the music production process. I cull as much information as possible about the game, its story, characters and structure from the design team. With this information I provide the composer with a framework within which to work. I also act as a bit of a filter for the composer, ensuring he/she is not inundated with too much information or information that may not be critical to his/her process. After this, I have little impact on the music other than providing both technical and aesthetic feedback. Once the music is written, I then begin the process of implementation, which is comparable to the role of music editor in film. I edit and arrange the music to fit within the dynamic of the game, mostly sticking to the agreed upon framework set up at the beginning of the process, but occasionally grabbing from other pieces or requesting additional elements from the composer to make it all work within the interactive nature of the game.” (4)
Roughing out the System
The following example system applies equally to seasoned game audio composers and to linear film composers unfamiliar with game work, and is intended as a suggested guide for working with composers of any caliber.
An often overlooked factor in quality interactive music is not the musical language itself - not the rests and semi-quavers, the serialist aesthetics or the electronic timbral breakthroughs - but a clearly defined and communicated stylistic and structural language based on the content and systems of the game. This is exactly how music for films constantly manages to be different, bold, and perhaps one of the most critically fertile grounds for contemporary music today. It is this stylistic and structural language which differentiates film music from other musical forms such as opera, the symphony, the pop song or the oratorio, all defined by structure rather than style. In video games, a sound lead will work with design, code and AI teams to realize and solidify this structure, and to communicate it back to the composer on a game-by-game basis.
This process is more complex than simply coming up with a structure, talking to the composer about it and then taking delivery of the music.
A Hollywood Production Model
The structure in question will require an initial design, but this does not mean that the design will remain intact for the length of the project's life. In fact, due to the metamorphosis of the game and its inherent gameplay systems, the structure used for music should be fully expected to change fundamentally two or three times during the project's lifetime.
Fundamental gameplay systems will be designed, tested, scrapped, re-designed, scrapped again, re-designed again and finally re-implemented and re-tuned. During the course of these changes the music is expected to be there and functioning. It may not be evident how the final music should really be structured at all until the final tuning stage of the game. As has been discussed in many game audio articles, having the composer come on board right at the last minute is not an acceptable solution - although it would be the correct time in terms of getting final implementation locked down.
So, why bring in a composer early onto a project if the system isn't designed, if it's all going to change and if it does feel like it's just going to be a nightmare of re-doing the same work over and over again?
There need to be three points of contact on a project for a composer. I have used four phases with filmic production analogies below (fig.1) to describe a project's lifetime. The Primary (Pre-Production) phase is where the stylistic parameters are evolved; this is also known to the composer as ‘getting the gig': a particular style of music is required for the game, driven by its content and the style of gameplay. You will notice there is no work mentioned for the Secondary (Pre-Production) period of the project, as this is where fundamental design and system changes make it frustrating - in fact, almost impossible - for the composer to get a clear idea of how to present their content in terms of structure. Structure is of equal importance to style; however, style is not at as much risk as structure in game development. It is unlikely that the entire notion of what the game is will change during the Secondary period - although, let's face it, it can and has happened to many developers - but it is less likely than the manifold structural ripples that will occur within a game's development life.
If a composer is involved in the project right from beginning to end on a daily basis, more often than not, their work will need to be redone and reworked so often that there would be a great danger of overexposure, a feeling of lack of creative control, and loss of job satisfaction, not to mention the frustration of working exclusively in one style for quite an extended period of time.
Notes on Interactive Microstructures
Music in cinema, despite its sophistication in terms of content, is rigidly defined primarily by the film's stylistic aesthetics, and secondarily by its visual editing structure; this is why so much frustration can be experienced by composers who have to rework their content to accommodate a new edit of a scene. However, once the edit and structure are locked and the emotional content of the scene has been communicated by the director, the composer's work is relatively straightforward.
Structural systems represent different concerns to those of musical language, and defining and communicating the structure to the composer beforehand is essential to success in game audio. With cinema, the editing language is more transparent. In games, it is not so easy for an outside composer to instantly understand the systems that might be in place under the hood of the game - it is through neglect of this that many scores fail to be truly interactive. Of course, scores fail in many other ways too - by being poorly technically executed, by being too literal, even by being too interactive.
In film music, the editing structure of a scene controls the length of the music required, the action even dictating the pace and tempo of the music and dividing the score into scenes and the phrases therein. There is no real difference in applying the structures of interactive music to a composer's commission - merely a few more structural units which require the composer's initial understanding.
Here are some examples of the elemental 'states' and 'units' of interactive music's microstructure that need to be communicated to, and understood by, composers.
A ‘narrative' state is a piece which plays straight from beginning to end without being interrupted by user input. A good example would be a cut-scene movie in a game; as these sections work in a predictable and linear fashion, the old rules of film composition can be applied: the exact timings and lengths of events, once locked, are predictable.
The second type of state, the Continual State, is basically a piece of music that needs to keep playing until a user input interrupts it. A good example would be a simple static theme on a menu awaiting input, or a particular section of a game in which the length of time for that section is an unknown parameter dependent upon many other factors.
The third, the Evolving State, is slightly more complex, and can consist of several stems of differing intensity. Here, building intensity across stacked layers rather than linear temporal movement is required. In linear narrative terms this state is an unknown quantity: the length of time this music is to play for could be anything from ten seconds to an hour - the same as for the simple continual state. However, there may be many game-side factors which influence this piece of music. A good example would be a combat situation within a game. Entering combat would trigger the evolving state related to it, but then variables reflecting how well or badly the player is doing in combat could be required to give audio feedback to the player and modify the music. If the fight is joined by two more enemies, the music may need to become more intense; if the player runs to the side of the arena away from the fight to recover for a few moments, the music could again be required to reflect this less intense period of activity.
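The stacked-layer idea behind an evolving state can be sketched in code. This is a minimal, hypothetical model - the stem names, thresholds and fade band are invented for illustration, not taken from any particular engine:

```python
class EvolvingState:
    """Looping music state whose layered stems respond to a gameplay
    intensity value between 0.0 (calm) and 1.0 (frantic)."""

    def __init__(self, stems):
        # stems: list of (name, threshold) pairs; a stem becomes
        # audible once intensity reaches its threshold.
        self.stems = stems
        self.intensity = 0.0

    def set_intensity(self, value):
        # Clamp so out-of-range gameplay values cannot break the mix.
        self.intensity = max(0.0, min(1.0, value))

    def stem_volumes(self):
        # Fade each stem in across the 0.25 band above its threshold,
        # so layers build smoothly instead of snapping on.
        volumes = {}
        for name, threshold in self.stems:
            level = (self.intensity - threshold) / 0.25
            volumes[name] = max(0.0, min(1.0, level))
        return volumes

# Hypothetical combat cue: base loop always present, percussion and
# brass stacked on top as the fight escalates.
combat = EvolvingState([("base_loop", 0.0), ("percussion", 0.4), ("brass", 0.7)])
combat.set_intensity(0.5)  # e.g. two more enemies have joined the fight
```

At intensity 0.5 the base loop plays at full volume, the percussion layer is partway through its fade-in, and the brass remains silent - exactly the kind of feedback the combat example above describes.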
A transition to a Narrative-Specific Unit or an Objective-Specific Unit may also occur within an evolving state. If an objective is accomplished, such as collecting a critical piece of a puzzle, a transition to a short, appropriate piece of music may be required to underscore the importance of that event.
The transitional unit is a short piece of music that bridges two differing evolving states. It may consist of a drum roll or a build-up of some kind which allows a smooth exit from any piece at any time and into the next unit.
Inaugural & Resolving Units
The Inaugural and Resolving Unit is a short transitional piece, usually played only once, which can be triggered at any time during the looping continual or evolving states and which signifies an end to the particular piece of music. If the player is successful in combat, the combat music will fade out while the triumphant ending of the piece is played over the top; if the player was unsuccessful, a more tragic piece could be played. What you hear, and when you hear it, is therefore totally dependent upon the way the user interacts with the system.
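Taken together, these states and units suggest a very simple state machine. The sketch below is purely illustrative: the cue names are invented, and the `log` list stands in for calls to a real audio engine.

```python
class MusicStateMachine:
    """Toy model of continual/evolving states joined by transitional
    units and ended by a resolving unit."""

    def __init__(self, initial_state):
        self.current = initial_state
        self.log = [initial_state]  # cues sent to the (imaginary) engine

    def transition_to(self, state, bridge=None):
        # A transitional unit (drum roll, riser) allows a smooth exit
        # from the current piece at any time before the next state.
        if bridge is not None:
            self.log.append(bridge)
        self.current = state
        self.log.append(state)

    def resolve(self, unit):
        # An inaugural/resolving unit plays once over the fading loop
        # and signifies the end of the current piece of music.
        self.log.append(unit)
        self.current = None

music = MusicStateMachine("exploration_loop")           # continual state
music.transition_to("combat_low", bridge="drum_roll")   # fight begins
music.transition_to("combat_high", bridge="riser")      # fight escalates
music.resolve("victory_stinger")                        # player wins
```

Because what plays next is decided at runtime, the same handful of cues can produce a different sequence on every playthrough - the core property that separates this from a locked film reel.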
This is not by any means a definitive system; it is intended as a simple structural guide for building a 'music map' of the game system. It should encourage communication of these structures to composers, who are in fact more familiar with such structures than one may think, albeit under a myriad of different names. The key here is communication of the music structure. This can only be done when that system is locked down, and that lock-down can only happen when the game itself has been locked down. There may be an initial communication of style, but there should be a communication of structure only when there is confidence that it will not change.
Once the system and design have been defined and laid out (during the ‘secondary' period of production), the scope of the score structure can be communicated to the composer in these or similar terms - this can even happen in the form of a template. The states and units mentioned here will of course evolve over time, yet they form the core units of any interactive piece of music or ambience. From this a map of the musical scope can begin to emerge. This is where the choice of composer will pay off: he/she should be able to fully realize the score according to the system. Although the ‘secondary' period will catch most of the design changes in a game, this is not always the case. Any further changes need to be communicated to the composer as soon as possible, just as with the changing edits of a film. If a major character were to be cut or changed, the composer should know, as this could potentially affect the entire score.
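One way to picture the ‘music map' template mentioned above is as a plain data structure listing, per game section, its state type and the units, transitions and resolutions it needs. All section and cue names here are invented for illustration; the flattening helper simply shows how such a map doubles as a delivery checklist for the composer:

```python
# Hypothetical music map: each game section declares its state type
# and the cues the composer must deliver for it.
MUSIC_MAP = {
    "intro_cutscene": {"state": "narrative", "units": ["intro_score"]},
    "main_menu":      {"state": "continual", "units": ["menu_loop"]},
    "arena_combat": {
        "state": "evolving",
        "units": ["combat_base", "combat_mid", "combat_high"],
        "transitions": ["drum_roll"],
        "resolutions": ["victory_stinger", "defeat_lament"],
    },
}

def cue_sheet(music_map):
    """Flatten the map into the full list of cues to be commissioned."""
    cues = []
    for section in music_map.values():
        cues.extend(section["units"])
        cues.extend(section.get("transitions", []))
        cues.extend(section.get("resolutions", []))
    return cues
```

A template like this makes the structural conversation concrete: if a design change removes the arena, one entry disappears and the affected cues are immediately obvious.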
Film composers have long had their music defined by a predetermined structure.
Every film has a different structure and a different musical approach and these differences are defined even more clearly and stylistically through genres. Video games function in much the same way.
If It Ain't Baroque, Don't Fix It
Once style and structure are understood, there is very little else the composer needs other than the talent to deliver. If a game requires something innovative and different in terms of its score, it will be evident. The project itself must make these decisions. Game structures will prove the primary evolutionary force in redefining music for games. The structure and styles will evolve with the gameplay and emotional content needs of the project, emerging as new genres emerge, as before with