
Composing video game music for Virtual Reality: 3D versus 2D

The 2nd of a 4-part series. Video game composer Winifred Phillips shares ideas from her GDC 2018 talk, Music in Virtual Reality. Part 2: 3D versus 2D, with an exploration of the role spatial delivery can play in music implementation in a VR environment.

Winifred Phillips, Blogger

May 15, 2018




Welcome!  I'm video game composer Winifred Phillips, and this is the continuation of our four-part discussion of the role that music can play in virtual reality video games.  These articles are based on the presentation I gave at this year's Game Developers Conference in San Francisco, entitled Music in Virtual Reality (I've included the official description of my talk at the end of this article).  If you missed the first article exploring the history and significance of positional audio, please go check that article out first.

Are you back?  Great!  Let's continue!

During my GDC talk, I addressed three questions which are important to video game music composers working in VR:

  • Do we compose our music in 3D or 2D?

  • Do we structure our music to be Diegetic or Non-Diegetic?

  • Do we focus our music on enhancing player Comfort or Performance?

While investigating these topics, we looked at some examples from VR games that provide great demonstrations, including four of my own VR projects – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc.  In these articles, I'll be sharing the discussions and conclusions that formed the basis of my GDC talk, including the best examples from these four VR game projects.  So now let's turn our attention to the first of our three top questions:

Should our music be 3D or 2D?

We know that spatial delivery of sound design is critical, but does that extend to the music? Do most listeners care if the music is 3D?  It’s vital that we keep listener impact in mind – and some scholarly studies from expert researchers can help shed light on that subject.

A study conducted at the University of Hull in the UK measured galvanic skin responses to test emotional reactions to both spatially treated music and standard stereo music recordings. The researchers found that spatial treatment had no effect on the emotional impact and enjoyment of the music. So, using a traditional stereo mix for a VR game’s music isn’t necessarily a bad thing… but spatial positioning for music can be beneficial, and fun too.

For instance:

  • We can use 3D elements to help integrate a 2D musical score into the VR world.

  • We can use 3D music to grab the player’s attention.

  • We can have music transition from 2D to 3D for dramatic effect.

3D music elements can help the musical score feel better connected to the environment in VR. Let's take a look at an example from the Fail Factory VR game, which demonstrates how 3D music elements can share the stage with a conventional stereo music mix.

In this comedic video game, players go to work in a zany robot factory. Players build massive robots while keeping up with the ever-increasing complexity and speed of the assembly line. The result is often a series of hilarious failures, inspiring the game’s name – Fail Factory. When Armature Studio hired me to compose the music for their Fail Factory game for the Samsung Gear VR, they described a project in which music took center stage.

By necessity, Fail Factory is set on a gigantic factory floor – but what makes this factory uniquely awesome is the musical nature of the environment.  All the machinery in the factory moves rhythmically with the musical score. So, the dev team asked me to create a jazzy score for this music-driven gameplay. Apart from the score, all of the sound design of Fail Factory is also created specifically to be musical. The bleeps and bloops are pitched to integrate with the score, and the bangs and clangs are timed to emphasize the tempo. While much of the music is delivered to the player in traditional stereo, there are also lots of separate rhythmic and pitched elements that are spatially positioned on the game’s factory floor. The sound design team and I worked hard on getting the balance right between these 2D and 3D components.

For instance, in one minigame, heavy machinery slams down to a conveyor belt – this became the central downbeat for the music on this level. We tried just having that big metallic bang issue solely in 3D from its in-game position, but that didn’t work. As a spatialized sound that was rhythmically synced to the 2D music, the 3D metallic bang felt disconnected from the rest of the 2D score – plus, the bang just needed more oomph. The team and I went back and forth with iterations on this until we settled on both a spatialized impact sound and a simultaneous metallic clang integrated into the stereo music mix. Here’s how that sounded during gameplay:


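To make that 2D/3D balance concrete, here's a minimal sketch of the gain math involved – my own illustrative Python, not code from Fail Factory or any particular engine, and all function names are hypothetical. The spatialized bang gets inverse-distance attenuation and a constant-power pan derived from its azimuth relative to the listener, while the 2D stereo clang layer is mixed at a fixed level regardless of where the player is standing:

```python
import math

def spatial_gains(listener_pos, source_pos, ref_dist=1.0):
    """Left/right gains for a mono 3D source: inverse-distance
    attenuation plus a constant-power pan from the azimuth."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dz)
    atten = ref_dist / max(dist, ref_dist)   # clamped so gain never exceeds 1.0
    azimuth = math.atan2(dx, dz)             # 0 = straight ahead, +pi/2 = hard right
    pan = max(-1.0, min(1.0, azimuth / (math.pi / 2)))
    theta = (pan + 1.0) * math.pi / 4        # constant-power pan law
    return atten * math.cos(theta), atten * math.sin(theta)

def downbeat_mix(listener_pos, bang_pos, bed_gain=0.7):
    """Sum the spatialized 3D bang with a 2D stereo clang layer that is
    triggered on the same downbeat but mixed at a fixed level."""
    l3, r3 = spatial_gains(listener_pos, bang_pos)
    return l3 + bed_gain, r3 + bed_gain
```

The point of the second function is the compromise described above: because the 2D layer's gain is constant, the downbeat always lands with full weight in the stereo mix, while the 3D layer adds the positional cue that ties it to the machinery on the factory floor.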
So you can see that 3D music and audio in VR can be a complicated issue.  While the majority of the music in Fail Factory is mixed in stereo, there are percussive and tonal components (such as that big clang) that are spread out in 3D across the VR space. These elements in 3D allow us to have a nice stereo music mix that also integrates well into the three-dimensional soundscape.

Now let’s take a look at a different example that shows how music in VR can transition from 2D to 3D for dramatic effect.

The popular Dragon Front VR strategy game for Oculus Rift blends the famous tradition of high fantasy storytelling with a dieselpunk, World War II-inspired aesthetic. Each game session is a self-contained battle on a playing field loaded with monsters, missiles and the machinery of war. With all this in mind, the music of Dragon Front had to convey a suitably bold and dramatic style.  When High Voltage Software hired me to compose the music for Dragon Front, one of their biggest priorities was an epic main theme. So, I composed a big victorious anthem, with the stereo mix piped directly to the player’s headphones. The theme music was designed to continue into the hub, but bombastic music in the hub area could be distracting. So at that point the music moves from a direct channel to the player and takes up a position in the environment, as if it were issuing from in-game speakers. Here’s how that worked:


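A handoff like that is commonly implemented as an equal-power crossfade between the head-locked stereo feed and the in-world emitter. The sketch below is illustrative Python of my own, not the actual Dragon Front implementation: as the transition progresses, the squared gains of the two feeds always sum to one, so the perceived loudness stays steady while the music moves out of the player's head and into the environment.

```python
import math

def transition_gains(t, duration=2.0):
    """Equal-power crossfade from the head-locked 2D theme to the
    in-world 3D emitter over `duration` seconds.  Returns
    (gain_2d, gain_3d); gain_2d**2 + gain_3d**2 is always 1.0."""
    x = min(max(t / duration, 0.0), 1.0)  # normalized progress, clamped to [0, 1]
    return math.cos(x * math.pi / 2), math.sin(x * math.pi / 2)
```

At t = 0 the player hears only the direct stereo theme; by the end of the fade the 2D feed is silent and the entire mix is coming from the spatialized in-game speakers, with no dip in level at the midpoint the way a linear crossfade would produce.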
So we've now taken a closer look at the first of the three important questions for video game composers creating music for VR games:

  • Do we compose our music in 3D or 2D?

  • Do we structure our music to be Diegetic or Non-Diegetic?

  • Do we focus our music on enhancing player Comfort or Performance?

We've just explored what it means to compose music with both 2D and 3D considerations in mind.  The next article will focus on the second of the three questions: whether music in VR should be diegetic or non-diegetic.  Thanks for reading, and please feel free to leave your comments in the space below!


This lecture presented ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips shared tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips addressed several important problems facing composers in VR.

Topics included 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music was explored, including methods that blur the distinction between the two categories.

The discussion also included an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms. Phillips' talk offered techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.


Through examples from several VR games, Phillips provided an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk included concrete examples and practical advice that audience members can apply to their own games.

Intended Audience

This session provided composers and audio directors with strategies for designing music for VR. It included an overview of the history of positional sound and the VIMS problem (useful knowledge for designers).

The talk was intended to be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).



Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER'S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games.

Follow her on Twitter @winphillips.
