So happy you’ve joined us! I’m videogame composer Winifred Phillips. Welcome to my two-part article series on the process of composing music for timed challenges in video games! Since timed challenges are a popular gameplay mechanic that has featured prominently in my most recently released project (The Spyder DLC missions), I thought it might be interesting for us to take a closer look at what makes a timed challenge tick!
Welcome! I’m video game composer Winifred Phillips, and in this month’s article, I’d like to go into some depth about an interesting aspect of our work as game composers – creating music for timed challenges. In timed challenges, players must complete a set of tasks within a limited window of time. Over the years, I’ve created music for lots of timed challenges featured in highly divergent projects, from the darkly strategic space battles of Hades’ Star (pictured above), to the wacky assembly-line mayhem of the Fail Factory VR game, to the brand-new DLC release from one of my most recent projects — the Spyder video game. It was this most recent release that actually got me thinking a lot about how difficult timed challenges can be for game composers.
Glad you’re here! I’m video game music composer Winifred Phillips, and I’m the author of the book A Composer’s Guide to Game Music. Recently my publisher The MIT Press requested that I host a question and answer session on Reddit’s famous Ask Me Anything forum, to share my knowledge about game music and spread the word about my book on that topic. I’d be answering questions from a community consisting of thousands of gamers, developers and aspiring composers. It sounded like fun, so last Thursday and Friday I logged onto Reddit and answered as many questions as I possibly could.

It was an awesome experience! Over the course of those two days, my Reddit AMA went viral. It ascended to the Reddit front page, receiving 14.8 thousand upvotes and garnering Reddit’s gold and platinum awards. My AMA has now become one of the most engaged and popular Reddit gaming AMAs ever hosted on the Ask-Me-Anything subreddit. I’m so grateful to the Reddit community for their amazing support and enthusiasm!

During the course of those two days, the community posed some wonderful questions, and I thought it would be great to gather together some of those questions and answers that might interest us here. Below you’ll find a discussion focused on the art and craft of game music composition. The discussion covered the gamut of subjects, from elementary to expert, and I’ve arranged the discussion below under topic headings for the sake of convenience. I hope you enjoy this excerpted Q&A from my Reddit Ask-Me-Anything! If you’d like to read the entire AMA (which also includes lots of discussion of my past video game music projects), you’ll find the whole Reddit AMA here.
Interactive music is always a hot topic in the game audio community, and newcomers to game music composition can easily become confused by the structure and process of creating non-linear music for games. To address this issue, I produced four videos that introduce aspiring video game composers to some of the most popular tactics and procedures commonly used by game audio experts in the structuring of musical interactivity for games. Over the next four articles, I’ll be sharing these videos with you, and I’ll also be including some supplemental information and accompanying musical examples for easy reference. Hopefully these videos can answer some of the top questions about interactive music composition. Music interactivity can be awesome, but it can also seem very abstract and mysterious when we’re first learning about it. Let’s work together to make the process feel a bit more concrete and understandable!
Interactive music technologies have swept across the video game industry, changing the way that game music is composed, recorded, and implemented. Horizontal Resequencing and Vertical Layering have changed the way that recorded music files are integrated into games, while MIDI, MOD and generative models have changed the landscape of music data in games. With all these changes, how do the game composer, audio director, sound designer and audio engineer address these unique challenges? This talk will present an overview of today’s interactive music techniques, including numerous strategies for the deployment of successful interactive music structures in modern games. Included in the talk: Vertical Layering in additive and interchange systems, how resequencing methods benefit from the use of digital markers, and how traditionally linear music can be integrated into an interactive music system.
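To make the distinction between the two Vertical Layering approaches mentioned above a bit more concrete, here is a rough sketch of how an additive system (layers stack on top of one another as intensity rises) differs from an interchange system (exactly one layer of a group sounds at a time). All class and function names here are illustrative assumptions, not drawn from any particular engine or middleware:

```python
from dataclasses import dataclass

@dataclass
class Layer:
    """One synchronized stem of the interactive track."""
    name: str
    volume: float = 0.0  # 0.0 = silent, 1.0 = full

class AdditiveLayers:
    """Additive system: intensity level N enables layers 0 through N together."""
    def __init__(self, layers):
        self.layers = layers

    def set_intensity(self, level):
        for i, layer in enumerate(self.layers):
            layer.volume = 1.0 if i <= level else 0.0

class InterchangeLayers:
    """Interchange system: exactly one layer of the group plays at a time."""
    def __init__(self, layers):
        self.layers = layers

    def select(self, name):
        for layer in self.layers:
            layer.volume = 1.0 if layer.name == name else 0.0

# Usage: three synchronized stems for a hypothetical exploration track.
stems = [Layer("pads"), Layer("percussion"), Layer("melody")]
mix = AdditiveLayers(stems)
mix.set_intensity(1)  # pads + percussion audible, melody held in reserve
print([layer.volume for layer in stems])  # [1.0, 1.0, 0.0]
```

Because all stems run in sync and only their volumes change, either system can fade layers in and out without breaking musical continuity.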
Right after my Bootcamp presentation, all the Audio Bootcamp presenters and attendees will head off to the ever-popular Lunchtime Surgeries. No, the attendees won’t actually be able to crack open the minds of the presenters and see what’s going on in there, but as a metaphor, it does represent the core philosophy of this lively event. The Lunchtime Surgeries offer attendees a chance to sit with the presenters at large roundtables and ask lots of questions. It’s one of the most popular portions of the bootcamp, and I’ll be looking forward to it!
If you’ll be attending the GDC Audio Track, then I highly recommend the Audio Bootcamp on Tuesday, March 3rd. Hope to see you there!
I was tremendously honored to speak at the Audio Engineering Society’s convention last month, and I thought I’d share a video excerpt from my speech, which was entitled “Effective Interactive Music Systems: The Nuts and Bolts of Dynamic Musical Content.” Many thanks to Steve Martz and Bob Lee at the Audio Engineering Society for organizing an outstanding event!
More about the AES:
The Audio Engineering Society is the only professional society devoted exclusively to audio technology. Founded in the United States in 1948, the AES has grown to become an international organization that unites audio engineers, creative artists, scientists and students worldwide by promoting advances in audio and disseminating new knowledge and research. Currently, over 14,000 members are affiliated with more than 75 AES professional sections and more than 95 AES student sections around the world. Conventions, which include scientific presentations, student activities, workshops, and exhibitions, are held annually both in the US and Europe. Additional conferences and regional summits are held periodically throughout Latin America, Asia, Europe, and North America.
Effective Interactive Music Systems: The Nuts and Bolts of Dynamic Musical Content
Interactive methodologies have profoundly impacted the way that music is recorded, mixed and integrated in video games. From horizontal resequencing and vertical layering techniques for the interactive implementation of music recordings, to MIDI and generative systems for the manipulation of music data, the structure of game music poses serious challenges both for the composer and for the game audio engineer. This talk will examine the procedures for designing interactive music models and implementing them effectively into video games. The talk will include comparisons between additive and interchange systems in vertical layering, the lessons that can be learned from conventional stem mixing, the use of markers for switching between segments, and how to disassemble a traditionally composed piece of music for use within an interactive system.
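The "use of markers for switching between segments" mentioned in the abstract could be sketched roughly as follows: the current segment plays until it reaches its next exit marker, at which point a queued segment takes over, so that transitions always land on musical boundaries. Segment names, marker positions, and the tick-based timing model here are all illustrative assumptions:

```python
class Segment:
    """A piece of music with beat positions where a transition is allowed."""
    def __init__(self, name, exit_markers):
        self.name = name
        self.exit_markers = sorted(exit_markers)

    def next_exit(self, position):
        # First marker at or after the current playhead, or None past the last one.
        return next((m for m in self.exit_markers if m >= position), None)

class Resequencer:
    """Horizontal resequencing: defers queued transitions to the next marker."""
    def __init__(self, segment):
        self.current = segment
        self.queued = None

    def queue(self, segment):
        self.queued = segment

    def on_tick(self, position):
        """Called as playback advances; returns the segment that should sound."""
        if self.queued is not None:
            exit_point = self.current.next_exit(position)
            if exit_point is not None and position >= exit_point:
                self.current, self.queued = self.queued, None
        return self.current

# Usage: gameplay shifts from exploration to combat mid-phrase.
explore = Segment("explore", exit_markers=[4, 8, 12, 16])
combat = Segment("combat", exit_markers=[4, 8])
seq = Resequencer(explore)
seq.queue(combat)
print(seq.on_tick(3).name)  # "explore" -- no exit marker reached yet
print(seq.on_tick(4).name)  # "combat" -- switched cleanly at the marker
```

The same marker data also suggests where a traditionally composed linear piece could be cut apart: each marked boundary becomes a seam where segments may be reordered.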
Here’s another installment of a four-part series of videos I produced as a supplement to my book, A Composer’s Guide to Game Music. This video focuses on the Horizontal Resequencing model employed in the Speed Racer video game, providing some visual illustration for this interactive music composition technique. The video demonstrates concepts that are explored in depth in my book, beginning on page 188.