The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences. I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio. This year, the hot topic was virtual reality. In fact, the subject received its own dedicated sub-conference, the VRDC (Virtual Reality Developers Conference), which took place concurrently with the main GDC show. The VRDC didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject. In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks. Along the way, I’ll also share some of my related experience as the composer of the music for Dragon Front, a VR game for the Oculus Rift (pictured above).
Inside and outside
The talks we’ll be discussing in this article are entitled “Audio Adventures in VR Worlds” and “The Sound Design of Star Wars: Battlefront VR.” Here’s a common issue that popped up in both talks:
Where should video game music be in a VR game? Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player? Or should it feel like it exists somehow outside the VR environment, coasting on top of the experience and conveyed directly to the player? The former approach suggests a spacious and expansive musical soundscape, while the latter would feel much closer and more personal. Is one of these approaches more effective in VR than the other? Which choice is best?