Hello there! I’m video game composer Winifred Phillips. Next week, I’ll be giving a lecture during the Game Developers Conference 2021 event. During my lecture, I’ll be talking about the music I composed for Sumo Digital for both the Sackboy: A Big Adventure and Spyder video games. My lecture is entitled, “From Spyder to Sackboy: A Big Adventure in Interactive Music,” and will take place on Friday July 23rd at 3:40pm PT. Although GDC is still an all-virtual affair, the event does provide lots of opportunities for experts within the game development community to share their knowledge, coupled with forums enabling game audio folks to network and learn from each other. In addition to my prepared lecture, I’ll also be participating in a live Speaker Q&A that will take place right after my presentation. It should be a lot of fun! Really looking forward to sharing my experience working with Sumo Digital simultaneously on these two fantastic games.
This was an incredibly rare and awesome opportunity for me to compose music for two projects simultaneously in development by the same company. Because of this, I found the comparisons between the two games fascinating.
My talk will delve into the mechanics of the dynamic music systems in both games, showing how a comparison between these two projects can shed light on the usefulness of some of the most popular interactive techniques and strategies. While comparing these interactive music techniques provided me with plenty of material for my GDC lecture, the two projects were similar in other ways as well. I thought I’d share some brief thoughts on one of the other common threads I found between these two Sumo Digital games.
As composers, we’re often asked to provide a general atmosphere that adds either character to gameplay or distinctive flavor to menus. If it’s a horror game, maybe we’re being asked to provide a crushingly heavy drone of doom during tense exploration, with soul-shuddering tone clusters bubbling up from the darkness and then sinking back down into the murky depths. For a whimsical game, we might be creating airy, open textures with little mischievous accents from the mallets or woodwind section… or maybe we’re creating a brightly whimsical melody for an opening menu or splash screen. If it’s a fantasy roleplaying game, we may be providing softly ambient tracks for exploration, with a pensive flute wandering gently through Gaelic figures. Or maybe we’re creating a thunderously epic main theme for an opening menu, designed to emphasize the world-shattering stakes of the adventure to come.
So happy you’ve joined us! I’m video game composer Winifred Phillips. Welcome to my two-part article series on the process of composing music for timed challenges in video games! Since timed challenges are a popular gameplay mechanic that featured prominently in my most recently released project (the Spyder DLC missions), I thought it might be interesting for us to take a closer look at what makes a timed challenge tick!
Welcome! I’m video game composer Winifred Phillips, and in this month’s article, I’d like to go into some depth about an interesting aspect of our work as game composers – creating music for timed challenges. In timed challenges, players must complete a set of tasks within a limited window of time. Over the years, I’ve created music for lots of timed challenges featured in highly divergent projects, from the darkly strategic space battles of Hades’ Star (pictured above), to the wacky assembly-line mayhem of the Fail Factory VR game, to the brand-new DLC release from one of my most recent projects — the Spyder video game. It was this most recent release that actually got me thinking a lot about how difficult timed challenges can be for game composers.
Hey everyone! I’m video game music composer Winifred Phillips. This past April, I gave a lecture on video game music composition techniques at the invitation of The Library of Congress in Washington DC. It was the first speech on game music composition given at The Library of Congress, and I was tremendously honored to be able to represent the field of video game music! My presentation was entitled “The Interface Between Music Composition and Game Design,” and was supported by a full house in the Whittall Pavilion of the Thomas Jefferson Building at the Library of Congress. In a previous article, I posted a partial transcript of the Q&A portion from my Library of Congress session, including some of the best questions from the Q&A. Since then, The Library of Congress has included a video of my entire presentation as a part of their permanent archival collection for future generations. I’m very pleased to be able to share the entire video with you!
Glad you’re here! I’m video game music composer Winifred Phillips, and I’m the author of the book A Composer’s Guide to Game Music. Recently my publisher The MIT Press requested that I host a question and answer session on Reddit’s famous Ask Me Anything forum, to share my knowledge about game music and spread the word about my book on that topic. I’d be answering questions from a community consisting of thousands of gamers, developers and aspiring composers. It sounded like fun, so last Thursday and Friday I logged onto Reddit and answered as many questions as I possibly could. It was an awesome experience!

Over the course of those two days, my Reddit AMA went viral. It ascended to the Reddit front page, receiving 14.8 thousand upvotes and garnering Reddit’s gold and platinum awards. My AMA has now become one of the most engaged and popular Reddit gaming AMAs ever hosted on the Ask-Me-Anything subreddit. I’m so grateful to the Reddit community for their amazing support and enthusiasm!

Along the way, the community posed some wonderful questions, and I thought it would be great to gather together some of those questions and answers that might interest us here. Below you’ll find a discussion focused on the art and craft of game music composition. The discussion covered the gamut of subjects, from elementary to expert, and I’ve arranged it under topic headings for the sake of convenience. I hope you enjoy this excerpted Q&A from my Reddit Ask-Me-Anything! If you’d like to read the entire AMA (which also includes lots of discussion of my past video game music projects), you’ll find the whole Reddit AMA here.
This week, I’m beginning a three-part blog series on the art of arrangement for dynamic music systems in games.
I’ll be exploring the techniques of arrangement as they relate to interactive game music by discussing examples from the music I composed for video games from the blockbuster LittleBigPlanet franchise.
Arrangement for interactivity is a complex subject, so I thought we should begin by developing a basic understanding of what arrangement is, and then move on to the reasons why it’s especially important in interactive music.
Side-by-side, these are the covers of the two editions of the book. In Japanese, A Composer’s Guide to Game Music is titled “Game sound production guide – composer techniques for interactive music,” by Winifred Phillips.
I’m very excited that the Japanese language edition of my book has already hit #1 on the “Most Wished For” list on Amazon Japan!
The “Most Wished For” list on Amazon.co.jp.
Coincidentally, the English-language version of A Composer’s Guide to Game Music is now #1 on the Kindle Top Rated list, too!
The Kindle “Top Rated” list on Amazon.com.
O’Reilly Japan is located in Tokyo, and is dedicated to translating books about technological innovation for Japanese readers. It is a division of O’Reilly Media, a California publishing company that acts as “a chronicler and catalyst of leading-edge development, homing in on the technology trends that really matter and galvanizing their adoption by amplifying ‘faint signals’ from the alpha geeks who are creating the future. O’Reilly publishes definitive books on computer technologies for developers, administrators, and users. Bestselling series include the legendary ‘animal books,’ Missing Manuals, Hacks, and Head First.”
I’m tremendously excited about the Japanese edition of my book, and my excitement comes in large part from the venerable tradition of outstanding music in Japanese games. From the most celebrated classic scores of such top game composers as Koji Kondo (Super Mario Bros.) and Nobuo Uematsu (Final Fantasy), to the excellent modern scores of such popular composers as Masato Kouda (Monster Hunter) and Yoko Shimomura (Kingdom Hearts), Japanese video game composers have set the creative bar very high. I’m incredibly honored that my book will be read by both established and aspiring game composers in Japan! I hope they’ll find some helpful information in my book, and I’m excited to contribute to the ongoing conversation about game music in the Japanese development community.
I’ve always loved Japanese game music. In 2008, I participated in a compilation album in which successful game composers created cover versions of celebrated video game songs from classic games. The album was called “Best of the Best: A Tribute to Game Music.” I chose the music by Koji Kondo from Super Mario Bros., and recorded an a cappella vocal version. It’s currently available for sale from the Sumthing Else Music Works record label, and can also be downloaded on iTunes. You can hear the track on YouTube here:
If you’d like to learn more about the rich legacy of game music composition in Japan, you can watch an awesome free documentary series produced by the Red Bull Music Academy, entitled “Diggin’ in the Carts: A Documentary Series About Japanese Video Game Music.” The series interviews famous game composers of Japan, which means that the interviews and narration are both in Japanese (with English subtitles). Here’s an episode that focuses on modern accomplishments by Japanese game composers:
Winifred Phillips is an award-winning video game music composer whose most recent project is the triple-A first person shooter Homefront: The Revolution. Her credits include five of the most famous and popular franchises in video gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press. As a VR game music expert, she writes frequently on the future of music in virtual reality video games. Follow her on Twitter @winphillips.
Welcome back to my blog series that offers tutorial resources exploring game music middleware for the game music composer. I initially planned to write two blog entries on the most popular audio middleware solutions (Wwise and FMOD), but since I started this blog series, I’ve been hearing buzz about other middleware solutions, and so I thought it best to expand the series to incorporate other interesting approaches to music implementation in games. This blog will focus on a brand-new middleware application called Elias, developed by Elias Software. While not as famous as Wwise or FMOD, this new application offers some intriguing new possibilities for the creation of interactive music in games.
If you’d like to read the first three blog entries in this series, you can find them here:
Elias stands for Elastic Lightweight Integrated Audio System. It is developed by Kristofer Eng and Philip Bennefall for Microsoft Windows, with a Unity plugin for consoles, mobile devices and browser-based games. What makes Elias interesting is the philosophy of its design. Instead of designing a general audio middleware tool with some music capabilities, Eng and Bennefall decided to bypass the sound design arena completely and create a middleware tool specifically outfitted for the game music composer. The middleware comes with an authoring tool called Elias Composer’s Studio that “helps the composer to structure and manage the various themes in the game and bridges the gap between the composer and level designer to ease the music integration process.”
Here’s the introductory video for Elias, produced by Elias Software:
The interactive music system of the Elias middleware application seems to favor a Vertical Layering (or vertical re-orchestration) approach with a potentially huge number of music layers able to play in lots of combinations. The system includes flexible options for layer triggering, including the ability to randomize the activation of the layers to keep the listening experience unpredictable during gameplay.
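To make that idea concrete, here’s a minimal sketch in Python of a vertical-layering system that randomizes which layers are active. This is purely a hypothetical illustration of the technique described above; it is not the actual Elias API, and the class and layer names are invented.

```python
import random

class VerticalLayerMixer:
    """Toy model of a vertical-layering music system: every layer plays
    in sync, and a randomized subset is audible at any given time.
    (A hypothetical sketch; not the actual Elias API.)"""

    def __init__(self, layer_names, min_active=1, max_active=None):
        self.layers = list(layer_names)
        self.min_active = min_active
        self.max_active = max_active if max_active is not None else len(self.layers)
        self.active = set()

    def randomize_active_layers(self, rng=random):
        """Activate a random combination of layers, keeping the mix
        unpredictable from one pass to the next."""
        count = rng.randint(self.min_active, self.max_active)
        self.active = set(rng.sample(self.layers, count))
        return sorted(self.active)

# Example: a four-layer cue in which at least two layers always play.
mixer = VerticalLayerMixer(["drums", "bass", "pads", "melody"], min_active=2)
print(mixer.randomize_active_layers())
```

Passing a seeded generator (e.g. `random.Random(42)`) makes the layer selection reproducible, which can be handy when testing how particular combinations sound.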
As a middleware application designed specifically to address the top needs of game music composers, Elias is certainly intriguing! The software has so far been used in only one published game – Gauntlet, which is the latest entry in the awesome video game franchise first developed by Atari Games for arcade cabinets in 1985. This newest entry in the franchise was developed by Arrowhead Game Studios for Windows PCs. We can hear the Elias middleware solution in action in this gameplay video from Gauntlet:
Elias Software recently demonstrated its Elias middleware application on the expo floor of the Nordic Game 2015 conference in Malmö, Sweden (May 20-22, 2015). Here’s a look at Elias’ booth from the expo:
Since Elias is a brand new application, I’ll be curious to see how widely it is accepted by the game audio community. A middleware solution that focuses solely on music is definitely a unique approach! If audio directors and audio programmers embrace Elias, then it may have the potential to give composers better tools and an easier workflow in the creation of interactive music for games.
LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes
I was honored to be selected by the Game Developers Conference Advisory Board to present two talks during this year’s GDC in San Francisco earlier this month. On Friday March 6th I presented a talk on the music system of the LittleBigPlanet franchise. Entitled “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” the talk explored the Vertical Layering music system that has been employed in all of the LittleBigPlanet games (the soundtrack for that game is available here). I’ve been on the LittleBigPlanet music composition team for six of their games so far, and my talk used many examples from musical compositions I created for all six of those projects.
After my talk, several audience members let me know that the section of my presentation covering the music system for the Pod menu of LittleBigPlanet 3 was particularly interesting – so I thought I’d share the concepts and examples from that part of my presentation in this blog.
That’s me, giving my GDC speech on the interactive music system of the LittleBigPlanet franchise. Here I’m just starting the section about the Pod menu music.
The audio team at Media Molecule conceived the dynamic music system for the LittleBigPlanet franchise. According to the franchise’s music design brief, all interactive tracks in LittleBigPlanet games must be arranged in a vertical layering system. I discussed this type of interactive music in a blog I published last year, but I’ll recap the system briefly here as well. In a vertical layering music system, the music is not captured in a single audio recording. Instead, several audio recordings play in sync with one another. Each layer of musical sound features unique content, and each represents a certain percentage of the entire musical composition. Played all together, we hear the full mix embodying the entire musical composition. Played separately, we hear submixes that are still satisfying and entertaining in their own right. The music system can play all the layers either together or separately, or can combine the layers into different sets that represent a portion of the whole mix.
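As a rough illustration of the structure described above, here’s a toy sketch in Python in which each “layer” is a synced buffer of samples, and any subset of layers can be summed into a submix or the full mix. The layer names and sample values are invented for illustration; a real engine would of course mix audio streams, not small lists of numbers.

```python
# Toy illustration of vertical layering: each "layer" is a synced buffer
# of samples, and any subset can be summed into a submix or the full mix.
# (Layer names and sample values are invented for illustration.)
def mix(layers, active):
    length = len(next(iter(layers.values())))
    return [sum(layers[name][i] for name in active) for i in range(length)]

layers = {
    "rhythm":  [1, 2, 1, 0],
    "harmony": [0, 1, 2, 1],
    "melody":  [3, 0, 1, 2],
}

full_mix = mix(layers, layers.keys())        # all layers: [4, 3, 4, 3]
submix = mix(layers, ["rhythm", "melody"])   # a subset:   [4, 2, 2, 2]
```

The key property is that because every buffer is the same length and always plays in sync, any combination of layers remains a coherent performance of the same composition.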
When implemented into gameplay, layers are often activated when the player moves into a new area. This helps the music to feel responsive to the player’s actions. The music seems to acknowledge the player’s progress throughout the game. It’s important to think about the way in which individual layers may be activated, and the functions that the layers may be called upon to serve during the course of the game.
In LittleBigPlanet 3, the initial menu system for the game is called “The Pod.” The music for the Pod is arranged in vertical layers that are activated and deactivated according to where the player is in the menu hierarchy. All the layers can be played simultaneously, and they play in multiple combinations… however, each of the individual layers is also associated with a specific portion of the menu system, and is activated when the player enters that particular part of the menu.
Let’s take a quick tour through the layers of the Pod menu music. I’ve embedded some short musical excerpts of each layer. You’ll find the SoundCloud players for each layer embedded below – just click the Play buttons to listen to each excerpt. The first layer of the Pod menu music is associated with the Main Menu, and it features some floaty, science-fiction-inspired textures and effects:
The next layer is associated with a menu labeled “My Levels,” and the music for that layer is very different. Now, woodwinds are accompanied by a gentle harp, combining to create a homey and down-to-earth mood:
Moving on to the music layer for the “Play” menu, we find that the instrumentation now features an ethereal choir and shimmering bells, expressing a much more celestial atmosphere:
Now let’s listen to the “Adventure” menu layer, in which plucked strings and bells combine to deliver a prominent melody line:
Finally, in the music layer associated with the “Community” and “Popit” menus, we hear a quirky mix of synths and effects that hearken back to menu music from previous games in the LittleBigPlanet franchise:
As the player navigates the Pod menu system, these various music layers are activated to correspond with the player’s location within the menu hierarchy. This sort of dynamic music triggering lies at the very heart of the Vertical Layering interactive music mechanism.
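That triggering logic can be sketched as a simple state machine. The following Python sketch is hypothetical; the menu and layer names are paraphrased from the descriptions above, and the actual LittleBigPlanet 3 implementation is not public.

```python
# Hypothetical mapping of Pod menu sections to music layers, paraphrased
# from the descriptions above (not the actual LittleBigPlanet 3 code).
MENU_LAYERS = {
    "main":      "sci_fi_textures",
    "my_levels": "woodwinds_and_harp",
    "play":      "choir_and_bells",
    "adventure": "plucked_strings_and_bells",
    "community": "quirky_synths",
    "popit":     "quirky_synths",   # Community and Popit share one layer
}

class PodMusic:
    """Tracks which layer should be audible as the player navigates."""

    def __init__(self):
        self.current_layer = None

    def on_menu_entered(self, menu):
        """Return True if a crossfade to a new layer is needed."""
        target = MENU_LAYERS[menu]
        if target == self.current_layer:
            return False  # same layer keeps playing; no transition
        # A real engine would start a timed volume crossfade here.
        self.current_layer = target
        return True
```

Note that under this mapping, navigating from Community to Popit would trigger no crossfade, since both menus share a layer; every other transition would fade the old layer out and the new one in.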
Every layer in a Vertical Layering composition can have a very distinct musical identity. When that layer is turned off, the entire mix changes in a noticeable way. The mix can be changed subtly…
… or it can be altered radically, with large-scale activations or deactivations of layers. Even with these kinds of dramatic changes, the musical composition retains its identity. The same piece of music continues to play, and the player is conscious of continuing to hear the same musical composition, even though it has just altered in reaction to the circumstances of gameplay and the player’s progress.
In the Pod menu music system, the layers would change in reaction to the player’s menu navigation, which could be either slow and leisurely or brisk and purposeful. Layer activations and deactivations would occur with smooth crossfade transitions as the player moved from one menu to another. Now let’s take a look at a video showing some navigation through the Pod menu system, so we can hear how these musical layers behaved during actual gameplay:
As you can see, triggering unique musical layers for different portions of the menu system helps to define them. I hope you found this explanation of the Pod music to be interesting! If you attended GDC but missed my talk on the interactive music of LittleBigPlanet, you’ll be able to find the entire presentation posted as a video in the GDC Vault in just a few weeks. In the meantime, please feel free to add any comments or questions below!
I’m happy to announce that I’ve been invited to participate in this year’s GDC Flash Forward!
This will be the fourth annual GDC Flash Forward event, which this year will kick off the main conference sessions taking place from Wednesday March 4th – Friday March 6th. Like a big “coming attractions” show, the Flash Forward allows attendees to get a first look at sessions that have been selected as especially interesting or noteworthy by the GDC Advisory Board. Out of the over 400 lectures, panels, tutorials and roundtables that take place during GDC Week, the GDC Advisory Board selects around 70 sessions to participate in the Flash Forward, so I’m very pleased to have been asked to participate this year!
During the Flash Forward event at 9:30am on Wednesday March 4th, each speaker will have 30 to 45 seconds to present an enticing preview of their presentation, along with a video clip offering a glimpse of the sights that await their session attendees. I’ll be presenting a preview of my talk, “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” which will take place on Friday March 6th at 10am in room 3006 West Hall.
Here’s a little more about the Flash Forward, from the official press release:
This year the hour-long session will be headlined by industry veterans Brenda Romero (Romero Games, UCSC) and Laura Fryer (Oculus VR), and they’ll be presenting their own informal take on the state of the industry before participating in what always proves to be a fun, fast-paced event that highlights some of the best GDC 2015 talks.
Flash Forward presenters are hand-picked by the GDC Advisory Board, ensuring that the session will feature an eclectic mix of speakers that represents the full breadth of the conference. Those selected will have the chance to grab attendees’ attention by taking the stage for a brief period of time — 30-45 seconds, tops — to present a rapid-fire overview of what their session is and why it’s worth checking out.
This year’s Flash Forward should be very exciting, and I’m honored to be a part of it! If you’re attending the Game Developers Conference this year, be sure to go to the Flash Forward! It’s sure to be a lot of fun!
Here’s more about the IASIG, from their official site:
The Interactive Audio Special Interest Group (IASIG) exists to allow developers of audio software, hardware, and content to freely exchange ideas about “interactive audio”. The goal of the group is to improve the performance of interactive applications by influencing hardware and software design, as well as leveraging the combined skills of the audio community to make better tools. The IASIG has been influential in the development of audio standards, features, and APIs for Microsoft Windows and other platforms, and has helped numerous hardware companies define their directions for the future.
I’m so honored that out of the 46 sessions in the GDC Audio Track, the Interactive Audio Special Interest Group selected my presentation as one of their 7 recommended talks! Here’s the whole list of IASIG Recommendations:
Making Full Use of Orchestral Colors in Interactive Music
Jim Fowler (SCE- World Wide Studios)
Creating an Interactive Musical Experience for Fantasia: Music Evolved
Jeff Allen (Harmonix Music Systems), Devon Newsom (Harmonix Music Systems)
BioShock Infinite: Scoring in the Sky, a Postmortem
Garry Schyman (Garry Schyman Productions)
Peggle Blast: Big Concepts, Small Project
RJ Mattingly (PopCap), Jaclyn Shumate (PopCap), Guy Whitmore (PopCap)
Inspiring Player Creativity in Disney Fantasia: Music Evolved
Jonathan Mintz (Harmonix Music Systems)
LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes
Winifred Phillips (Generations Productions LLC)
Where Does the Game End and the Instrument Begin?
Matt Boch (Harmonix Music Systems), Jon Moldover (Smule Inc.), Nick Bonardi (Ubisoft), David Young (Smule Inc.), Brian Schmidt (Brian Schmidt Studios)