
Welcome! I’m videogame composer Winifred Phillips, and this is the continuation of our four-part discussion of the role that music can play in Virtual Reality video games. These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco, entitled Music in Virtual Reality (I’ve included the official description of my talk at the end of this article). If you missed the first article exploring the history and significance of positional audio, please go check that article out first.
Are you back? Great! Let’s continue!
During my GDC talk, I addressed three questions that are important to video game music composers working in VR:
- Do we compose our music in 3D or 2D?
- Do we structure our music to be Diegetic or Non-Diegetic?
- Do we focus our music on enhancing player Comfort or Performance?
While investigating these topics, we looked at some examples from VR games that provide great demonstrations, including four of my own VR projects – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc. In these articles, I’ll be sharing the discussions and conclusions that formed the basis of my GDC talk, including the best examples from these four VR game projects. So now let’s turn our attention to the first of our three top questions:
Should our music be 3D or 2D?
We know that spatial delivery of sound design is critical, but does that extend to the music? Do most listeners care if the music is 3D? It’s vital that we keep listener impact in mind – and scholarly research can help shed light on that subject. Along the way, we’ll see that:
- We can use 3D elements to help integrate a 2D musical score into the VR world.
- We can use 3D music to grab the player’s attention.
- We can have music transition from 2D to 3D for dramatic effect.
3D music elements can help the musical score feel better connected to the environment in VR. Let’s take a look at an example from the Fail Factory VR game, which demonstrates how 3D music elements can share the stage with a conventional stereo music mix.
By necessity, Fail Factory is set on a gigantic factory floor – but what makes this factory uniquely awesome is the musical nature of the environment. All the machinery in the factory moves rhythmically with the musical score, so the dev team asked me to create a jazzy score for this music-driven gameplay. Beyond the score itself, all of the sound design in Fail Factory is created specifically to be musical: the bleeps and bloops are pitched to integrate with the score, and the bangs and clangs are timed to emphasize the tempo. While much of the music is delivered to the player in traditional stereo, there are also lots of separate rhythmic and pitched elements that are spatially positioned on the game’s factory floor. The sound design team and I worked hard to get the balance right between these 2D and 3D components.
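For readers curious about the nuts and bolts, the "bangs and clangs timed to emphasize the tempo" idea can be sketched in a few lines of code. This is purely an illustrative sketch – the function name, the 120 BPM tempo, and the trigger times are my own assumptions, not anything from Fail Factory’s actual codebase – but it shows the basic trick: when a machine wants to make a noise, the engine delays that sound to the next beat boundary so it lands in rhythm with the score.

```python
import math

# Hypothetical example values - a real game would read these from its music system.
BPM = 120.0                    # assumed tempo of the score
SECONDS_PER_BEAT = 60.0 / BPM  # 0.5 seconds per beat at 120 BPM


def next_beat_time(request_time: float) -> float:
    """Quantize a requested SFX trigger time to the next beat boundary.

    A machine that wants to clang at an arbitrary moment instead fires
    at the next beat, keeping the sound design locked to the score.
    """
    beats_elapsed = request_time / SECONDS_PER_BEAT
    return math.ceil(beats_elapsed) * SECONDS_PER_BEAT


# A machine requests a clang 1.23 s into the cue; it actually fires at
# 1.5 s, the next beat, so the clang stays in time with the music.
print(next_beat_time(1.23))
```

A real middleware setup (Wwise, FMOD, or an engine’s own music system) would handle this with beat-sync callbacks rather than hand-rolled math, but the underlying quantization is the same idea.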
For instance, in one minigame, heavy machinery slams down onto a conveyor belt – this became the central downbeat for the music on that level. We tried having that big metallic bang emanate solely in 3D from its in-game position, but that didn’t work. As a spatialized sound rhythmically synced to the 2D music, the 3D metallic bang felt disconnected from the rest of the 2D score – plus, the bang just needed more oomph. The team and I went back and forth with iterations until we settled on both a spatialized impact sound and a simultaneous metallic clang integrated into the stereo music mix. Here’s how that sounded during gameplay:
So you can see that 3D music and audio in VR can be a complicated issue. While the majority of the music in Fail Factory is mixed in stereo, there are percussive and tonal components (such as that big clang) that are spread out in 3D across the VR space. These 3D elements allow us to keep a nice stereo music mix that also integrates well into the three-dimensional soundscape.
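To make the 2D/3D layering concrete, here’s a minimal sketch of how a spatialized clang can be summed on top of a stereo music bed. Everything here is my own simplified illustration, not the game’s actual mixer: I’m standing in for full VR spatialization (which would use HRTF processing) with a simple constant-power pan law driven by the sound’s azimuth relative to the listener, while the 2D bed passes through untouched.

```python
import math


def constant_power_gains(azimuth_deg: float) -> tuple[float, float]:
    """Map an azimuth (-90 = hard left, +90 = hard right) to L/R gains.

    Constant-power panning keeps gl^2 + gr^2 == 1, so the clang's
    perceived loudness stays steady as it moves around the listener.
    """
    pan = max(-1.0, min(1.0, azimuth_deg / 90.0))  # normalize to [-1, 1]
    angle = (pan + 1.0) * math.pi / 4.0            # sweep 0 .. pi/2
    return math.cos(angle), math.sin(angle)        # (left, right)


def mix_frame(bed_lr: tuple[float, float],
              clang_sample: float,
              azimuth_deg: float) -> tuple[float, float]:
    """Sum one stereo frame of the 2D music bed with the 3D clang."""
    gl, gr = constant_power_gains(azimuth_deg)
    return (bed_lr[0] + clang_sample * gl,
            bed_lr[1] + clang_sample * gr)
```

In practice the spatialized layer lives on its own bus in the audio middleware, positioned by the game object, while the stereo score plays on a non-positional music bus – the mix engineer’s job is balancing the two, exactly the back-and-forth described above.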
Now let’s take a look at a different example that shows how music in VR can transition from 2D to 3D for dramatic effect.
So we’ve now taken a closer look at the first of the three important questions for video game composers creating music for VR games:
- Do we compose our music in 3D or 2D?
- Do we structure our music to be Diegetic or Non-Diegetic?
- Do we focus our music on enhancing player Comfort or Performance?
We’ve just explored what it means to compose music with both 2D and 3D considerations in mind. The next article will focus on the second of the three questions: whether music in VR should be diegetic or non-diegetic. Thanks for reading, and please feel free to leave your comments in the space below!
Music in Virtual Reality
Topics included 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music was explored, including methods that blur the distinction between the two categories.
The discussion also included an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms. Phillips’ talk offered techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.
Takeaway
Through examples from several VR games, Phillips provided an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk included concrete examples and practical advice that audience members can apply to their own games.
Intended Audience
This session provided composers and audio directors with strategies for designing music for VR. It included an overview of the history of positional sound and the VIMS problem (useful knowledge for designers).
The talk was intended to be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).
