Welcome to the third installment in our series on the fascinating possibilities created by virtual reality motion tracking, and how the immersive nature of VR may serve to inspire us as video game composers and afford us new and innovative tools for music creation. As modern composers, we work with a lot of technological tools, as I can attest from the studio equipment that I rely on daily (pictured left). Many of these tools communicate with each other by virtue of the Musical Instrument Digital Interface protocol, commonly known as MIDI – a technical standard that allows music devices and software to interact.
In order for a VR music application to control and manipulate external devices, the software must be able to communicate by way of the MIDI protocol – and that’s an exciting development in the field of music creation in VR!
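Under the hood, MIDI messages are compact byte sequences. As a rough illustration (not specific to any of the applications discussed in this article), here's how a note-on event, the message that tells a device to start sounding a note, can be assembled:

```python
# Minimal sketch of raw MIDI message construction.
# A note-on message is three bytes: a status byte (0x90 plus the
# channel number) followed by a note number and a velocity,
# each limited to 7 bits (0-127).

def note_on(note, velocity, channel=0):
    """Build a 3-byte MIDI note-on message."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    """Build the matching note-off message (release velocity 0)."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Middle C (note 60) at moderate velocity on channel 1:
print(note_on(60, 100).hex())  # prints "903c64"
```

Delivering bytes like these to a MIDI output port (through an OS driver or a MIDI library) is what lets any piece of software, VR or otherwise, drive external synths, samplers and DAWs.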
This series of articles focuses on what VR means for music composers and performers. In previous installments, we’ve had some fun exploring new ways to play air guitar and air drums, and we’ve looked at top VR applications that provide standalone virtual instruments and music creation tools. Now we’ll be talking about the most potentially useful application of VR for video game music composers – the ability to control our existing music production tools from within a VR environment.
We’ll explore three applications that employ MIDI to connect music creation in VR to our existing music production tools. But first, let’s take a look at another, much older gesture-controlled instrument that is in some ways quite reminiscent of these motion-tracking music applications for VR:
Last year during the Audio Engineering Society Convention in New York City, I had the chance to play a bit with one of the newest incarnations of that venerable motion-tracking instrument, the theremin.
Named for its Russian inventor, the theremin has been entertaining us with its weird and awesome electronic sounds since 1928. The instrument I encountered at AES, the Theremini from Moog, features a wider variety of sounds than the classic version and offers options for a more forgiving motion-control scheme. In every other respect, it's the instrument we know best from old sci-fi and horror movie soundtracks.
Here is skilled theremin player Clara Venice, demonstrating the motion-control mechanism of this unique musical instrument with a rendition of the Star-Spangled Banner:
It’s useful to keep the theremin's control scheme in mind, because it bears a striking resemblance to the ways in which we control and manipulate musical content within VR applications. In software such as Pensato, The Music Room, and AeroMIDI, we sometimes use one hand and sometimes two, always with an eye toward the careful spatial positioning of our hands relative to the musical content they control. So let’s now take a look at these three software applications:
Pensato

Since 2001, the famous Ableton Live application has enabled composers, songwriters and DJs to create music with a streamlined visual interface highly suitable for live performance. As a Digital Audio Workstation, Ableton Live provides all the recording, sequencing and post-production tools we’ve come to expect from these powerful music applications. However, those tools are all wrapped in a utilitarian GUI designed to get digital music creators out of the recording studio and in front of a live audience. The creation, alignment and triggering of loops is designed to be intuitive, making loop-based music in Ableton Live especially suitable for concert venues and clubs. So, how is this relevant to the world of VR?
Enter Pensato – the virtual reality application designed by Byron Mallett of the Victoria University of Wellington. Pensato allows a composer to control loops and sounds in Ableton Live by way of a virtual reality representation of those tools, rendered in detailed 3D on the Oculus Rift headset.
“In order to interact with Ableton Live, Pensato functions like a music hardware controller,” said Byron Mallett (pictured left), during an interview with the Ableton Live Blog. “This allows me to capture the entire Session View of Live, including all the tracks, clips and device parameters that can be controlled as part of a performance, and turn them into virtual controls.”
The composer controls the sounds in the VR space with hand motions, as pictured to the right. Special gloves worn by the performer deliver motion tracking data to a set of two sensors that relay this data to the Pensato application. Meanwhile, Pensato also sends the imagery from the VR headset to a set of three projectors for the audience to enjoy. The result is a futuristic presentation that lends a new level of sci-fi slickness to the performances of any electronic artist or DJ.
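As a hypothetical sketch of the kind of translation a controller like Pensato has to perform (the normalized coordinate and the controller number below are my assumptions for illustration, not details of Mallett's implementation), a tracked hand height might be quantized into a MIDI control-change message that Ableton Live can map to any device parameter:

```python
# Hypothetical sketch: turning a tracked hand coordinate into a MIDI
# control-change (CC) message. We assume the tracking layer reports a
# normalized 0.0-1.0 height; real sensors report raw units that would
# need calibration first.

def hand_height_to_cc(height, cc_number=1, channel=0):
    """Quantize a normalized hand height into a 3-byte MIDI CC message."""
    value = max(0, min(127, round(height * 127)))
    return bytes([0xB0 | (channel & 0x0F), cc_number & 0x7F, value])

# Hand at three-quarters height -> CC value 95 on controller 1:
msg = hand_height_to_cc(0.75)
```

A host DAW sees nothing unusual here: the gesture arrives as an ordinary CC stream, exactly as if it came from a hardware fader.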
“Most of Pensato has been designed with my own ‘what ifs’ in mind as the basis for how a VR musical interface can be designed,” Mallett observes. “I’m hoping some of these ideas might be useful or encourage others in creating their own VR music interfaces in the future.”
Here’s Byron Mallett himself, performing the composition “Fissure” using Ableton Live and Pensato:
The Music Room
Last month, the National Association of Music Merchants staged its annual Summer NAMM 2016 convention in the Nashville Music City Center. Over 1,500 product brands in the professional music equipment industry showcased their hottest products to a convention crowd of musicians, music retailers and audio industry experts. The show floor teemed with the usual plethora of booths devoted to microphones, DSP racks, guitars, keyboards, DAWs, and other pro audio gear.
Those looking for a more unusual experience, however, could visit the Chroma Coda booth and experience their virtual reality music-making application, The Music Room.
According to the product description, The Music Room “will make you a multi-instrumentalist” by providing an assortment of virtual reality instruments hosted inside its VR environment. The software comes bundled with a selection of virtual instruments, which the developers at Chroma Coda describe as “drums, laser harp, pedal steel guitar and our unique chord harp.” According to Chroma Coda, these virtual instruments “encourage you to explore different ways of songwriting that aren’t usually possible with electronic instruments. Strum chords or slide from note to note. Change drum kits on the fly with clear visual feedback.” Here’s the trailer for The Music Room that Chroma Coda produced for the Summer NAMM 2016 show:
In addition to these bundled instruments, The Music Room also comes with a simple, compact Digital Audio Workstation application called Bitwig 8-Track that enables basic audio recording and processing, along with access to additional electronic percussion and synths. What’s interesting about this functionality is that it allows the VR music-making interface to reach out beyond the virtual environment and interact with traditional music software: The Music Room can function as a MIDI controller that sends musical performance data out to external DAWs hosting their own virtual instruments.
The Music Room will be initially available later this month for the HTC Vive VR headset, and the developers plan to make their VR application compatible with the Oculus Rift with Touch and the PlayStation VR when those VR systems release later this year.
AeroMIDI

The AeroMIDI software predates the emergence of consumer VR. First introduced by Acoustica in 2013 as a way to use motion tracking to control MIDI devices, it made the jump into the world of virtual reality by way of the Oculus Rift during a showing (pictured left) at the 2014 NAMM convention. As a software package designed to communicate with any existing MIDI applications or external hardware, AeroMIDI allows the user to trigger notes and control musical performance data using hand gestures tracked by the Leap Motion device.
The makers of the software, Acoustica, Inc., describe AeroMIDI as “the virtual 3D glue between your music and your hands.” As seen in the picture to the right, the visual interface consists of an array of three-dimensional blocks and cubes that can represent both note values and expression changes (such as vibrato, pitch bend, low pass filter, volume, or whatever other parameters the user wishes to manipulate).
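As a hedged sketch of one of those expression parameters (the normalized gesture range here is my assumption for illustration, not AeroMIDI's actual code), a continuous hand gesture could drive MIDI pitch bend. Pitch bend is a 14-bit value, 0 to 16383 with 8192 meaning "no bend," split across two 7-bit data bytes:

```python
# Sketch: mapping a continuous gesture in the range -1.0..+1.0
# (an assumed normalization) onto a MIDI pitch-bend message.

def pitch_bend(amount, channel=0):
    """Build a 3-byte pitch-bend message from a -1.0..+1.0 gesture."""
    a = max(-1.0, min(1.0, amount))
    value = int(round(8192 + a * 8191))   # 0..16383, 8192 = centered
    lsb, msb = value & 0x7F, (value >> 7) & 0x7F
    return bytes([0xE0 | (channel & 0x0F), lsb, msb])

# A centered hand produces the "no bend" message:
assert pitch_bend(0.0) == bytes([0xE0, 0x00, 0x40])
```

The same pattern, gesture in, channel message out, covers the other parameters the interface exposes; vibrato, filter cutoff and volume would typically travel as control-change messages instead.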
Because the graphical user interface has always been a three-dimensional construct, AeroMIDI was well suited to make the transition to VR. While there aren’t yet any videos showing the VR experience offered by AeroMIDI, this video shows the software in action, allowing us to easily imagine what the experience must be like within a VR space:
So that concludes this article about the intersection between music composition and VR. I hope you’ve enjoyed reading, and I hope you’ll share your thoughts in the comments below!
Winifred Phillips is an award-winning video game music composer whose most recent project is the triple-A first person shooter Homefront: The Revolution. Her credits include five of the most famous and popular franchises in video gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press. As a VR game music expert, she writes frequently on the future of music in virtual reality video games. Follow her on Twitter @winphillips.