Hello there! I’m video game music composer Winifred Phillips. Lately, I’ve been very busy in my production studio composing music for a lot of awesome virtual reality games, including Scraper: First Strike, an upcoming first-person VR shooter (pictured above) that’s coming out next Wednesday (November 21st) for the Oculus Rift, HTC Vive, and Windows Mixed Reality devices, and will be released on December 18th for the PlayStation VR. My work on this project has definitely stoked my interest in everything VR! Since the game will be released very soon, here’s a trailer video released by the developers Labrodex Studios, featuring some of the music I composed for the game:
The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences. I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio. This year, the hot topic was virtual reality. In fact, the subject received its own dedicated sub-conference that took place concurrently with the main GDC show. The VRDC (Virtual Reality Developers Conference) didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject. In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks. Along the way, I’ll also share some of my related experience as the composer of the music of the Dragon Front VR game for the Oculus Rift (pictured above).
Where should video game music be in a VR game? Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player? Or should it feel like it’s somehow outside of the VR environment and is instead coasting on top of the experience, being conveyed directly to the player? The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal. Is one of these approaches more effective in VR than the other? Which choice is best?
Welcome back to our three-part discussion of how video game composers (such as ourselves) can make strategy gamers smarter! In these articles, we’re looking at ways in which our music can enhance concentration and tactical decision-making for players engrossed in strategic gameplay. Along the way, I’ve been sharing my personal experiences as the composer for the Dragon Front strategy game for virtual reality. Over the course of these articles we’ll be covering three of the top concepts that pertain to the relationship between music and concentration. In part one, we discussed the concept of ‘music-message congruency,’ so if you haven’t read that article yet, please go check it out and then come back.
Are you back now? Good! Let’s move on to the second big technique for increasing the smarts of strategy gamers!
As video game composers, we create music in a wide variety of tempos designed to support the energy of play and the pacing of the game’s overall design. From leisurely tracks that accompany unstructured exploration to frenetic pieces that support the most high-stakes combat, our music is planned with expert precision to shape the excitement level of players and keep them motivated as they progress.
Can video game composers make you smarter? Well, video gaming can be a pretty cerebral activity, requiring astute problem-solving skills and disciplined concentration in order to excel. That’s especially true for any game built around strategic and/or tactical gameplay, such as real-time or turn-based strategy, tactical shooters, multiplayer online battle arenas (MOBAs), and online collectible card strategy games. To succeed in these types of games, players must assess the current situation and formulate a plan that accounts for future developments and variables. Without this type of tactical forward-thinking gameplay, a gamer has little chance to win. So, can music enable gamers to think tactically, stay focused and make smart decisions? Over the next three articles, I’ll try to answer that question, while exploring the role of music in enhancing the concentration of strategic/tactical gamers.
Along the way, we’ll be taking a look at some scholarly research on the subject, consulting the opinions of experts, and I’ll be sharing my experiences creating the music for the recently released Dragon Front strategy game from High Voltage Software. We’ll check out some music tracks I composed for the popular Dragon Front game (pictured at the top of this article), and we’ll discuss methods for supporting and enhancing concentration for strategic/tactical game players. But first, let’s take a closer look at the Dragon Front game.
Welcome to the third installment in our series on the fascinating possibilities created by virtual reality motion tracking, and how the immersive nature of VR may serve to inspire us as video game composers and afford us new and innovative tools for music creation. As modern composers, we work with a lot of technological tools, as I can attest from the studio equipment that I rely on daily (pictured left). Many of these tools communicate with each other by virtue of the Musical Instrument Digital Interface protocol, commonly known as MIDI – a technical standard that allows music devices and software to interact.
In order for a VR music application to control and manipulate external devices, the software must be able to communicate by way of the MIDI protocol – and that’s an exciting development in the field of music creation in VR!
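To give a sense of what MIDI communication actually looks like under the hood, here’s a minimal sketch in plain Python (no external MIDI library) that builds the three-byte “Note On” and “Note Off” messages a VR application’s gesture might ultimately translate into. The function names here are my own, purely for illustration; the byte layout follows the MIDI 1.0 standard:

```python
def note_on(note, velocity, channel=0):
    """Build a three-byte MIDI Note On message.

    Status byte 0x90 (Note On) is OR'd with the channel number (0-15),
    followed by the note number and velocity (each 0-127).
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    """Build a three-byte MIDI Note Off message (status byte 0x80)."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Middle C (MIDI note 60) struck at moderate velocity on channel 1:
msg = note_on(60, 64)
print(list(msg))  # the three bytes: status, note, velocity
```

Any device or program that speaks MIDI, whether a hardware synth, a DAW, or a VR instrument, is ultimately exchanging compact messages like these, which is why the protocol makes such a natural bridge between a virtual environment and a real-world production studio.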
This series of articles focuses on what VR means for music composers and performers. In previous installments, we’ve had some fun exploring new ways to play air guitar and air drums, and we’ve looked at top VR applications that provide standalone virtual instruments and music creation tools. Now we’ll be talking about the most potentially useful application of VR for video game music composers – the ability to control our existing music production tools from within a VR environment.
We’ll explore three applications that employ MIDI to connect music creation in VR to our existing music production tools. But first, let’s take a look at another, much older gesture-controlled instrument that is in some ways quite reminiscent of these motion-tracking music applications for VR:
Welcome to part two of our ongoing exploration of some interesting possibilities created by the motion tracking capabilities of VR, and how this might alter our creative process as video game composers.
In part one we discussed how motion tracking lets us be awesome air guitarists and drummers inside the virtual space. In this article, we’ll be taking a look at how the same technology will allow us to make interesting music using more serious tools that are incorporated directly inside the VR environment – musical instruments that exist entirely within the VR ‘machine.’
Our discussion to follow will concentrate on three software applications: Soundscape, Carillon, and Lyra. Later, in the third article of this ongoing series, we’ll take a look at applications that allow our VR user interfaces to harness the power of MIDI to control some of the top music devices and software that we use in our external production studios. But first, let’s look at the ways that VR apps can function as fully-featured musical instruments, all on their own!
Let’s start with something simple – a step sequencer with a sound bank and signal processing tools, built for the mobile virtual reality experience of the Samsung Gear VR.
I got a chance to demo the Samsung Gear VR during the Audio Engineering Society Convention in NYC last year, and while it doesn’t offer the best or most mind-blowing experience in VR (such as what we can experience from products like the famous Oculus Rift), it does achieve a satisfying level of immersion. Plus, it’s great fun! The Soundscape VR app was built for Samsung Gear VR by developer Sander Sneek of the Netherlands. It’s a simple app designed to let users create dance loops. It offers three instruments drawn from a built-in electro sound library, a pentatonic step sequencer for building rhythm and tone patterns within the loops, and a collection of audio signal processing effects that let the user warp and mold the sounds as the loops progress, adding variety to the performance.
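To see why a pentatonic step sequencer is such a friendly design for non-musicians, here’s a minimal sketch in Python (my own illustration, not Soundscape’s actual code): each column of the grid is a step in the loop, each row is a pitch drawn from a pentatonic scale, so any combination of toggled cells stays harmonically consonant.

```python
# Minor pentatonic scale starting at middle C (MIDI note 60):
# intervals of 0, 3, 5, 7, and 10 semitones, repeated each octave.
PENTATONIC = [60 + octave * 12 + interval
              for octave in range(2)
              for interval in (0, 3, 5, 7, 10)]

def sequence_notes(grid):
    """Turn a step-sequencer grid into the notes played at each step.

    grid[row][col] is True when that cell is toggled on; the row picks
    the pentatonic pitch, and the column is the position in the loop.
    """
    steps = len(grid[0])
    return [[PENTATONIC[row]
             for row in range(len(grid))
             if grid[row][col]]
            for col in range(steps)]

# A 2-row, 4-step pattern: root on beats 1 and 3, minor third on beat 2.
pattern = [
    [True, False, True, False],   # row 0 -> MIDI note 60
    [False, True, False, False],  # row 1 -> MIDI note 63
]
print(sequence_notes(pattern))  # [[60], [63], [60], []]
```

Because every row is constrained to the pentatonic scale, there’s no “wrong” cell to toggle, which is exactly what makes this kind of interface so approachable inside a casual VR experience.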
Since I’ve been working recently on music for a Virtual Reality project (more info in the coming months), I’ve been thinking a lot about VR technology and its effect on the creative process. Certainly, VR is going to be a great environment in which to be creative and perform tasks and skills with enhanced focus, according to this article from the VR site SingularityHub. I’ve written in this blog before about the role that music and sound will play in the Virtual Reality gaming experience. It’s clear that music will have an impact on the way in which we experience VR, not only during gaming experiences, but also when using the tools of VR to create and be productive. With that in mind, let’s consider whether the opposite may also be true – will VR impact the way in which we experience music, not only as listeners, but also as video game composers?
Simple VR technologies like the popular Google Cardboard headset can be a lot of fun – as I personally experienced recently (photo to the left). However, they offer only rudimentary visuals, omitting some of the most compelling elements of the VR experience. When motion tracking (beyond simple head movement) is added to the mix, the potential of VR explodes. Over the next three articles, we’ll be exploring some interesting possibilities created by the motion tracking capabilities of VR, and how this might alter our creative process. In the first article, we’ll have some fun exploring new ways to play air guitar and air drums in the VR environment. In the second article, we’ll take a look at ways to control virtual instruments and sound modules that are folded into the VR software. And finally, in the third article we’ll explore the ways in which VR motion tracking is allowing us to immersively control our existing real-world instruments using MIDI. But first, let’s take a look at the early days of VR musical technology!
Tension has always served a crucial role in music composition and performance. My next two blog articles will focus on how music works to shape tension and intensity in a dramatic presentation such as a video game.
During these blogs, we’ll be consulting with lots of top experts on the subject, and I’ll be sharing my experiences regarding the tension-filled music that I composed as a member of the music team of Homefront: The Revolution – an open world, triple-A first person shooter game that was just released by Deep Silver/Dambuster Studios. Along the way, we’ll check out some excerpts from music tracks I composed (in my music production studio, pictured right) for Homefront: The Revolution, and we’ll talk about multiple techniques to build tension in a piece of music, with the goal of inciting the most emotional intensity possible in our audience. With that in mind, let’s start things off with a great quote from philosopher Henry David Thoreau:
“The fibers of all things have their tension and are strained like the strings of an instrument.”
Thoreau not only saw the connection between music and tension, but also made a good point about the stresses and strains in our lives – we all possess our own inner emotional pressure. The more fervently we pursue our goals and struggles, the higher the tension grows. Taken to the extreme, it can feel as though our insides are wound up as taut as clockworks. As game composers, our job has always been to induce players to care about what’s happening in the game, and that includes inciting and escalating the nervous anxiety associated with an awesome investment of emotion and empathy. So let’s explore the best ways we can make players feel the tension!
In this blog, I thought we might take a quick look at the development of the three dimensional audio technologies that promise to be a vital part of music and sound for a virtual reality video game experience. Starting from its earliest incarnations, we’ll follow 3D audio through the fits and starts that it endured through its tumultuous history. We’ll trace its development to the current state of affairs, and we’ll even try to imagine what may be coming in the future! But first, let’s start at the beginning:
3D Audio of the Past
In the 1930s, English engineer and inventor Alan Blumlein invented a process of audio recording that involved a pair of microphones that were coincident (i.e. placed closely together to capture a sound source). Blumlein’s intent was to accurately reflect the directional position of the sounds being recorded, thus attaining a result that conveyed spatial relationships in a more faithful way. In reality, Blumlein had invented what we now call stereo, but the inventor himself referred to his technique as “binaural sound.” As we know, stereo has been an extremely successful format, but the fully realized concept of “binaural sound” would not come to fruition until much later.
Virtual Reality Sickness: the nightmare of VR developers everywhere. We all know the symptoms. Nausea. Headache. Sweating. Pallor. Disorientation. All together, these symptoms are a perfect recipe for disaster. No one wants their game to make players feel like they’ve been spinning on a demon-possessed merry-go-round. So, how do we keep this affliction from destroying the brand new, awesome VR industry before it even gets a chance to get off the ground?