(This article was originally published on Designing Music NOW on November 16, 2015)
For those of you who were not able to attend GameSoundCon in LA this year, I wanted to share my presentation on "The Technical Composer" with you. As it was a long talk, I am going to break it down into four parts and release them over the upcoming weeks, as follows: Part 1 - Introduction, Technical Composer and Adaptive Music Examples; Part 2 - Adaptive music with Elias; Part 3 - Adaptive music in WWISE and FMOD; Part 4 - Middleware Comparisons and Adaptive Music in VR. The main idea behind this talk was twofold - to present the importance of learning middleware for composers, and to conduct an experiment on the functionality and usability of three popular middleware solutions for creating adaptive music: Elias, FMOD and WWISE. The experiment showed how middleware like Elias, which is designed for adaptive music from the ground up, is much easier to use, easier to learn, and speeds up the workflow for composers. However, since Elias is not a sound effects middleware, nor does it handle advanced audio capabilities like busing and profiling, it is best used in conjunction with FMOD or WWISE, and I show how this can be done with little impact on the overall performance of the game.
Part 1a - Introduction and Technical Composer Definition
In this first part, I introduce the concept of a Technical Composer, drawing on my inspiration for it: the concept of a Technical Sound Designer, coined by Damian Kastbauer and Anton Woldhek of the Game Audio Podcast. In the same way that a Technical Sound Designer is a sound designer who also implements those sounds into the game using middleware, a Technical Composer is a composer who implements music into the game using middleware. The three types of middleware that I will be covering in this talk are Elias, WWISE and FMOD.
My goal was to compare the functionality and ease of use of the three middleware solutions for implementing adaptive music. I will get into much more detail about this in parts 2 and 3. My methodology was to create an adaptive score in Elias called "Blade Revisited" and then port or convert that same adaptive score into WWISE and FMOD. Then I compare the relative advantages and disadvantages of each.
Spoiler alert: My conclusions were profound and easily demonstrable. Once the music score itself had been composed, implementing a complex 38-layer score in Elias took less than an hour. In order to implement that same score into WWISE, I had to reduce it to a 15-layer score, and it still took about 3 days to get all the elements in place. Similarly, in FMOD, I had to reduce it to a 9-layer score, and it also took 3 days. On top of that, the overall sound and the transitions between layers were much better in Elias, and much easier to manage. Imagine if you had dozens of songs and hundreds of layers to implement - what could take months in WWISE and FMOD could be accomplished in days using Elias. The analogy that comes to mind is this - if implementing an adaptive music score were likened to travelling from Los Angeles to New York - there are many different ways to get there. You can walk, ride a bicycle, take a train, a bus or fly. WWISE and FMOD are like taking a bus, whereas Elias is like flying! Keep in mind this is just for the music implementation for your game, and in my experience, it is best to use FMOD or WWISE for the other audio features like sound effects, mixing, 3D sound and the many other advantages they give a game.
But don't take my word for it, you can see for yourself, and even try it for yourself - all three programs are free to download and test out...
Now let's start at the beginning...what is a Technical Composer and what are the main types of Adaptive Music systems:
Part 1b - Two Examples of Adaptive Music Scores in Elias
In this next video, I give an overview of two types of adaptive music scores in Elias - Exploration Mode and Object Mode. Exploration Mode allows you to create near-infinite variations of your music based on a small number of music stems. For example, if you had 6 tracks, each with 6 variations, you could generate 6^6 = 46,656 combinations. This is particularly useful for exploration scenes and helps to prevent a common issue with loop-based music - ear fatigue.
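To see where that number comes from, here is a minimal sketch of the combinatorics, assuming (as the example implies) that each track independently picks one of its variations:

```python
# Each track independently selects one of its variations, so the total
# number of distinct arrangements is variations ** tracks.
# The numbers here are the hypothetical ones from the example above.
tracks = 6
variations_per_track = 6

total = variations_per_track ** tracks
print(total)  # 46656
```

Even a modest number of stems multiplies out quickly, which is why this approach staves off ear fatigue so effectively compared to a single fixed loop.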
The second mode, Object Mode, is used for greater control over the intensity of the music and is best applied to situations in the game where the intensity needs to increase and decrease based on the actions in the game. In our example, Blade Revisited, you play the hero in search of the evil Gothmong. After landing on the planet, you walk through the techno slums in search of Gothmong in what I call the "Walk" scene. The music increases and decreases in intensity based on your proximity to Gothmong. When you finally spot him, the "Chase" scene begins. This is followed by the "Battle" scene. Each scene has its own music, with multiple layers of intensity within it.
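The proximity-to-intensity idea can be sketched in a few lines. This is a hypothetical illustration, not code from the actual Blade Revisited project: the distance range and number of levels are made-up values, and in practice the resulting level would be fed to the middleware as a game parameter.

```python
# Hypothetical sketch: map the player's distance to Gothmong onto a
# discrete intensity level that an adaptive-music system could consume.
# The max_distance and levels values are illustrative assumptions.
def intensity_level(distance: float, max_distance: float = 100.0, levels: int = 4) -> int:
    """Closer proximity -> higher intensity (0 = calm, levels - 1 = maximum)."""
    clamped = min(max(distance, 0.0), max_distance)
    proximity = 1.0 - clamped / max_distance  # 0.0 = far away, 1.0 = right on top of him
    return min(int(proximity * levels), levels - 1)

print(intensity_level(100.0))  # far away -> 0 (lowest-intensity layers)
print(intensity_level(0.0))    # next to Gothmong -> 3 (highest-intensity layers)
```

The game would recompute this each frame (or on movement events) and hand the level to the music system, which then fades layers in and out to match.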
Links and Further Reading
Middleware Downloads and Tutorials:
Articles about Elias on Designing Music NOW: