VR for the Game Music Composer – Artistry and Workflow

Since the game audio community is abuzz with excitement about the impending arrival of virtual reality systems, I’ve been periodically writing blogs that gather top news about developments in the field of audio and music for VR.  In this blog we’ll be looking at some resources that discuss issues relating to artistry and workflow in audio for VR:

  • We’ll explore an interesting post-mortem article about music for the VR game Land’s End.
  • We’ll take a closer look at the 3DCeption Spatial Workstation.
  • We’ll check out the Oculus Spatializer Plugin for DAWs.

Designing Sound for Virtual Reality

In these early days of VR, post-mortem articles about the highs and lows of development on virtual reality projects are especially welcome.  Freelance audio producer and composer Todd Baker has written a particularly interesting article about the audio development for the Land’s End video game, designed for the Samsung Gear VR system.

Here, you see me trying out the Samsung Gear VR, as it was demonstrated on the show floor at the Audio Engineering Society Convention in 2015.

Todd Baker is best known for his audio design work on the whimsical Tearaway games, and his work as a member of the music composition team for the awesome LittleBigPlanet series. His work on Land’s End for Ustwo Games affords him an insightful perspective on audio for virtual reality. “In VR, people are more attuned to what sounds and feels right in the environment, and therefore can be equally distracted by what doesn’t,” writes Baker.  In an effort to avoid distraction, Baker opted for subtlety in the game’s musical score. Each cue began with a gentle fade-in, attracting little notice at first so as to blend with the game’s overall soundscape in a natural way.

Going a step further, Baker enhanced this music/sound design blend by actively blurring the distinction between the two aural elements.  Sound effects were designed with an inherent sense of musicality, and the score for the entire game was built around key signatures sharing many common tones.  This allowed the “musical” sound effects to blend with the atmospheric score in a pleasing way.  According to Baker, this approach “blurs the line between what the player would recognise as music or sound, and helps them to instead accept that this is how the world sounds.”

Baker’s entire score for Land’s End is available for free download on SoundCloud.  Here’s the trailer video for the Land’s End game:


The 3DCeption Spatial Workstation


The 3DCeption Spatial Workstation is a new kind of audio production toolset designed for VR. Its philosophy is driven by the simplification of workflow – eliminating the need to render audio and import it back and forth from one application to another.  Using 3DCeption, the composer or audio designer can continue sound asset creation in existing DAWs such as Reaper, Nuendo, and the ubiquitous Pro Tools.  The 3DCeption package provides a host of plugins that are compatible with the user’s current DAW.  Taking into account the geometry of the environment, the plugins process the audio to include reflections, occlusions, and realistic spatial positioning that incorporates head tracking. To support head tracking, the package also includes the Intelligent 360 Video Player, which integrates directly with the virtual reality headset and allows the user to preview the audio mix in the VR environment.
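To give a rough sense of what “spatial positioning that incorporates head tracking” means in practice, here’s a deliberately simplified Python sketch of the general idea – a source is rotated into head-relative coordinates, attenuated with distance, and panned.  This is purely illustrative (the function and parameter names are my own invention) and is in no way a description of 3DCeption’s actual processing:

```python
import math

def spatialize(source_pos, listener_pos, listener_yaw, dry_sample):
    """Toy head-tracked positioning: rotate the source into head-relative
    coordinates, attenuate with distance, and pan.  Purely illustrative --
    this is NOT how 3DCeption (or any shipping spatializer) works internally."""
    # Vector from listener to source in the horizontal plane (world space)
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    distance = max(math.hypot(dx, dy), 1.0)   # clamp to avoid divide-by-zero

    # World-space angle to the source, minus the listener's head yaw:
    # this subtraction is where head tracking enters the calculation.
    azimuth = math.atan2(dy, dx) - listener_yaw

    gain = 1.0 / distance                     # simple inverse-distance attenuation

    # Equal-power stereo panning derived from the head-relative azimuth
    pan = math.sin(azimuth)                   # -1 = hard left, +1 = hard right
    angle = (pan + 1.0) * math.pi / 4.0
    left = dry_sample * gain * math.cos(angle)
    right = dry_sample * gain * math.sin(angle)
    return left, right

# Example: the same source sounds different once the listener turns their head.
print(spatialize((0.0, 2.0), (0.0, 0.0), listener_yaw=0.0, dry_sample=1.0))
print(spatialize((0.0, 2.0), (0.0, 0.0), listener_yaw=math.pi / 2, dry_sample=1.0))
```

A real spatializer would add reflections, occlusion, and HRTF filtering on top of this, but the basic relationship – source position, listener position, and head orientation combining into a head-relative direction – is the same.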

How the 3DCeption Plugin appears in the Pro Tools DAW.

One of the most interesting and unique aspects of this software is its ability to accommodate the ambisonic method of recording three-dimensional sound. Initially created in the 1970s, the first-order ambisonic recording method uses four channels: one captures the overall sound pressure, while the other three capture the directional components of the sound along the three spatial axes (X, Y and Z).

A visual representation of B-Format

This audio encoding is known as B-format, and one of its biggest advantages is that it doesn’t dictate a specific speaker configuration in the way that multichannel surround sound does. Instead, the B-format encoding process allows the audio to be subsequently decoded by the end user’s speaker system. This allows the audio in B-format to accommodate many different speaker arrays and configurations. Even more interesting for its application in VR, the B-format can be reconfigured into almost any playback format.
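To make the idea concrete, here’s a small Python sketch of first-order ambisonic encoding and a very basic decode toward a single loudspeaker direction.  The scaling shown (a FuMa-style W channel and a naive one-speaker decode) is only one of several conventions in use, so treat this as an illustration of the principle – direction only enters the picture at decode time – rather than a production-ready decoder:

```python
import math

SQRT2 = math.sqrt(2.0)

def encode_b_format(sample, azimuth, elevation):
    """Encode one mono sample into first-order B-format (FuMa-style W scaling,
    used here purely for illustration)."""
    w = sample / SQRT2                                     # omnidirectional pressure
    x = sample * math.cos(azimuth) * math.cos(elevation)   # front/back component
    y = sample * math.sin(azimuth) * math.cos(elevation)   # left/right component
    z = sample * math.sin(elevation)                       # up/down component
    return w, x, y, z

def decode_to_speaker(b_frame, speaker_azimuth, speaker_elevation):
    """Naive decode of a B-format frame toward a single loudspeaker direction.
    Real decoders weight these terms differently depending on the layout."""
    w, x, y, z = b_frame
    return 0.5 * (w * SQRT2
                  + x * math.cos(speaker_azimuth) * math.cos(speaker_elevation)
                  + y * math.sin(speaker_azimuth) * math.cos(speaker_elevation)
                  + z * math.sin(speaker_elevation))

# A sound encoded directly ahead (azimuth 0) comes out of a front speaker at
# full level and is absent from a speaker placed directly behind the listener.
b = encode_b_format(1.0, azimuth=0.0, elevation=0.0)
print(decode_to_speaker(b, 0.0, 0.0))        # -> 1.0
print(decode_to_speaker(b, math.pi, 0.0))    # -> 0.0
```

Because the speaker direction appears only in the decode step, the same four recorded channels can feed stereo, surround, or binaural playback without re-recording anything.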

For instance, composers may find this useful when recording an orchestra or other live ensemble for use in 3D sound. Using an ambisonic soundfield microphone (for example, the TetraMic from Core Sound), the composer can capture a three-dimensional recording that can then be reconfigured into many different playback formats (including the binaural/HRTF format currently favored by VR developers).  A music recording captured with the ambisonic method can then be dropped into 3DCeption using what its developers call an Ambi Array, which enables the music to function as a binaural soundfield that reacts faithfully to the orientation and head tracking of the player.  This has the potential to give the music a much more natural integration with the rest of the aural environment.
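This head-tracked behaviour falls naturally out of the B-format representation: because the four channels describe the entire soundfield, the listener’s head rotation can be applied as a simple rotation of the directional channels before decoding.  A minimal sketch (the sign convention here is arbitrary, and real implementations differ):

```python
import math

def rotate_b_format_yaw(b_frame, head_yaw):
    """Rotate one B-format frame (w, x, y, z) about the vertical axis so the
    decoded soundfield counter-rotates against the listener's head yaw.
    Sign convention is illustrative only."""
    w, x, y, z = b_frame
    xr = x * math.cos(head_yaw) + y * math.sin(head_yaw)
    yr = -x * math.sin(head_yaw) + y * math.cos(head_yaw)
    return w, xr, yr, z   # the W and Z channels are unaffected by a yaw rotation

# The rotated frame can then be decoded to binaural (via HRTFs) or to any
# speaker layout, exactly as an unrotated frame would be.
print(rotate_b_format_yaw((0.707, 1.0, 0.0, 0.0), math.pi / 2))
```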

I haven’t yet seen any demonstrations of three dimensional positioning for music using 3DCeption.  However, we can get a sense of the simple possibilities of the 3DCeption software in these two sound design demonstrations (use headphones to experience the three dimensional audio):


The Oculus Spatializer Plugin for DAWs


As we know, creating audio for the three dimensional space of a VR experience can involve a clumsy workflow as composers and sound designers jump from one complex audio application to another. Fortunately, software developers have been busily addressing this problem (as we learned in our discussion of the 3DCeption Spatial Workstation). Now Oculus (gearing up to release the Rift in the first quarter of 2016) wants to make things simpler for audio creators.

With the Oculus Spatializer Plugin for DAWs, users of Digital Audio Workstations (such as Pro Tools, Nuendo, Cubase, Reaper, etc.) can preview their music and sounds in the 3D spatialized environment of VR, using positioning automation that faithfully replicates the immersive aural environment of the Oculus Rift.

Here’s how the Oculus Spatializer Plugin window looks in a DAW

Let’s take a look at a video that demonstrates the Oculus Spatializer Plugin from within the Ableton Live application:


Winifred Phillips is an award-winning video game music composer whose most recent project is the triple-A first person shooter Homefront: The Revolution. Her credits include five of the most famous and popular franchises in video gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press. As a VR game music expert, she writes frequently on the future of music in virtual reality video games. Follow her on Twitter @winphillips.

12 responses to “VR for the Game Music Composer – Artistry and Workflow”

    • Hi Chris! Glad you enjoyed the blog, and thanks for the great question! I’m not sure how much mainstream penetration VR games will have, but I hope they do well. The artistic possibilities for storytelling in the VR space are enormous! I’ve heard that VR is making a big splash at the ongoing Consumer Electronics Show in Las Vegas, so that’s a good sign. There’s actually a great article at GamesIndustry.biz about the future of game development in 2016 and beyond, which includes some predictions about VR from analyst Patrick Walker of EEDAR: “VR will launch to critical success and strong consumer buzz, but pricing and other barriers will limit mass market adoption in 2016… I believe that VR is long play, and I believe that the companies launching platforms next year are well positioned to grow VR into a mainstream entertainment experience.” It’s a really cool article — you can read it here: http://www.gamesindustry.biz/articles/2015-12-23-what-to-expect-in-2016

  1. Fantastic article as always, Winifred. I was wondering if I could ask one question: you mention that “the B-format encoding process allows the audio to be subsequently decoded by the end user’s speaker system”. Does this mean that we composers can work on VR sound without having to purchase specialized equipment (specific headphones or speaker configurations)? In other words, since (I assume) most end-users will listen to the audio on a pair of headphones, does this mean we can also create the audio material by working with a pair of headphones?
    Many thanks in advance!

    • Hi Stellita! So glad you enjoyed the article! From what I’ve read on the Two Big Ears site (the creators of 3DCeption), it sounds like we composers would be able to monitor VR sound without special equipment using 3DCeption. Reading from their site (http://twobigears.com/labs/v1-0-and-a-preview/): “With many of our users developing 360/VR movies and video, we decided to include ambisonics b-format decoding within 3Dception. Just as you would setup a 3Dception Source, you can create a 3Dception Ambi Array, drop your ambisonic recordings and you should have a fully binaural soundfield responding to the orientation of the listener with all the normal playback functionality that is available within Unity.” I imagine we’d need to monitor using headphones to get the full effect of the binaural soundfield, but other than that, I don’t think we’d need special equipment. Varun Nair would be able to confirm that – he’s the cofounder at Two Big Ears (his Twitter: https://twitter.com/ntkeep)

      • Many many thanks for your thorough reply Winifred! This is excellent news and 3DCeption certainly does sound extremely promising. Thank you again for your excellent articles, please keep them coming :)))

  2. Thanks Winifred for this article. Music and sound FX creation and mixing will certainly be huge for VR video. I’m working on this, but we’re still missing a decoder when playing the video via YouTube and similar services – only the mobile app will decode the ambisonics. Do you know of any player that can play a 360 video and decode ambisonics while moving with the mouse?

  3. Pingback: Video Game Music Composer: Music and Sound in VR Headphones (Part One) | Composer Winifred Phillips

  4. Hi. Your video shows Ableton Live running the SpatialWorkstation plugin but I have not found that support mentioned anywhere. How are you actually using the plugin with Live, on Windows?

  5. Pingback: The Big Index 2024: Articles for Game Music Composers – Composer Winifred Phillips
