I’ll be talking about effective music composition for mobile and portable gaming platforms during my talk, “From Total War to Assassin’s Creed: Music for Mobile Games,” which will take place on March 16th at the upcoming Game Developers Conference at the Moscone Center in San Francisco. With that in mind, I thought I’d use this blog entry to share some resources that explore current strategies and trends regarding sound and music for mobile – resources that could be useful to the video game composer and sound designer.
While my talk at GDC will focus specifically on music composition and implementation for handheld devices, the resources below address the more general technical issues that face audio pros creating sound assets for a mobile gaming environment. I’ve included links to the original articles, along with a summary of the points I found most interesting:
I’m very proud and excited to announce that I composed the music for the Call of Champions video game, developed by Spacetime Studios!
Call of Champions is an awesome action game in the popular Multiplayer Online Battle Arena (or MOBA) genre. The game pits players against each other in exciting timed matches within a futuristic fantasy-inspired setting. The game was created by the team at Spacetime Studios, an accomplished group of top industry veterans (including developers responsible for the famous Wing Commander series and Star Wars Galaxies).
The LittleBigPlanet franchise is seven years old today! On October 28th, 2008, the very first LittleBigPlanet game was published by Sony Computer Entertainment Europe. In the seven years since that auspicious day, players have explored the whimsical world of LittleBigPlanet in countless awesome adventures. I’m very proud to have been a part of the music team for this famous franchise. So, to celebrate the game franchise’s seventh birthday, let’s take a tour through the history of LittleBigPlanet!
MIDI seems to be making a comeback.
At least, that was my impression a couple of months ago when I attended the audio track of the Game Developers Conference. Setting a new record for attendance, GDC hosted over 24,000 game industry pros who flocked to San Francisco’s Moscone Center in March for a full week of presentations, tutorials, panels, awards shows, press conferences and a vibrant exposition floor filled with new tech and new ideas. As one of those 24,000 attendees, I enjoyed meeting up with lots of my fellow game audio folks, and I paid special attention to the presentations focusing on game audio. Amongst the tech talks and post-mortems, I noticed a lot of buzz about a subject that used to be labeled as very old-school: MIDI.
This was particularly emphasized by all the excitement surrounding the new MIDI capabilities in the Wwise middleware. In October of 2014, Audiokinetic released the most recent version of Wwise (2014.1), which introduced a number of enhanced features, including “MIDI support for interactive music and virtual instruments (Sampler and Synth).” Wwise now allows the incorporation of MIDI that triggers either a built-in sound library in Wwise or a user-created one. Since I talk about the future of MIDI game music in my book, A Composer’s Guide to Game Music, and since this has become a subject of such avid interest in our community, I thought I’d do some research on this newest version of Wwise and post a few resources that could come in handy for any of us interested in embarking on a MIDI game music project using Wwise 2014.1.
The first is a video produced by Damian Kastbauer, technical audio lead at PopCap Games and the producer and host of the now-famous Game Audio Podcast series. This video was released in April of 2014, and included a preview of the then-forthcoming MIDI and synthesizer features of the new Wwise middleware tool. In this video, Damian takes us through the newest version of the “Project Adventure” tutorial prepared by Audiokinetic, makers of Wwise. In the process, he gives us a great, user-friendly introduction to the MIDI capabilities of Wwise.
The next videos were produced by Berrak Nil Boya, a composer and contributing editor to the Designing Sound website. In these videos, Berrak takes us through some of the more advanced applications of the MIDI capabilities of Wwise, starting with the procedure for routing MIDI data directly into Wwise from more traditional MIDI sequencer software such as that found in a Digital Audio Workstation (DAW) application. This process allows a composer to work within more traditional music software and then route the MIDI output directly into Wwise. She walks us through the process in this two-part video tutorial:
Finally, Berrak Nil Boya has created a video tutorial on the integration of Wwise into Unity 5, using MIDI. Her explanation of the preparation of a soundbank and the association of MIDI note events with game events is very interesting, and provides a nicely practical application of the MIDI capability of Wwise.
The Game Developers Conference was a fantastic experience for me this year! I gave two presentations — “Advanced Composition Techniques for Adaptive Systems” at the GDC Audio Bootcamp on Tuesday, and “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes” during the main conference on Friday. I had a great time! Here are a few of my photos from GDC week. Just click on the first thumbnail image to open the full-sized gallery.
I was honored to be selected by the Game Developers Conference Advisory Board to present two talks during this year’s GDC in San Francisco earlier this month. On Friday, March 6th I presented a talk on the music system of the LittleBigPlanet franchise. Entitled “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” the talk explored the Vertical Layering music system that has been employed in all of the LittleBigPlanet games (the soundtrack for that game is available here). I’ve been on the LittleBigPlanet music composition team for six of its games so far, and my talk used many examples from the musical compositions I created for all six of those projects.
After my talk, several audience members let me know that the section of my presentation covering the music system for the Pod menu of LittleBigPlanet 3 was particularly interesting – so I thought I’d share the concepts and examples from that part of my presentation in this blog.
The audio team at Media Molecule conceived the dynamic music system for the LittleBigPlanet franchise. According to the franchise’s music design brief, all interactive tracks in LittleBigPlanet games must be arranged in a vertical layering system. I discussed this type of interactive music in a blog I published last year, but I’ll recap the system briefly here as well. In a vertical layering music system, the music is not captured in a single audio recording. Instead, several audio recordings play in sync with one another. Each layer features unique musical content and represents a certain percentage of the entire composition. Played all together, the layers form the full mix embodying the entire musical composition. Played separately, they form submixes that are still satisfying and entertaining in their own right. The music system can play all the layers either together or separately, or can combine them into different sets that represent a portion of the whole mix.
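For readers who like to think in code, here’s a rough sketch of that core idea in Python (the class and layer names are purely illustrative, not actual game engine code): all layers stay time-synchronized, and the music system simply chooses which subset of them is audible.

```python
# Purely illustrative model of a vertical layering system (not actual
# game engine code). Layers always play in sync; activation changes
# each layer's gain, never its playback position.
class VerticalLayerMix:
    def __init__(self, layer_names):
        self.gains = {name: 0.0 for name in layer_names}  # all muted

    def set_active(self, names):
        """Make exactly this subset of layers audible."""
        names = set(names)
        for name in self.gains:
            self.gains[name] = 1.0 if name in names else 0.0

    def audible_layers(self):
        return sorted(n for n, g in self.gains.items() if g > 0.0)

mix = VerticalLayerMix(["pads", "harp", "choir", "plucks", "synths"])
mix.set_active(["pads", "harp"])   # a submix that still stands on its own
submix = mix.audible_layers()      # ['harp', 'pads']
mix.set_active(mix.gains)          # every layer at once: the full composition
full_mix = mix.audible_layers()
```

Because activation changes only each layer’s gain rather than its playback position, any subset can be brought in or out without the layers drifting out of sync.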
When implemented into gameplay, layers are often activated when the player moves into a new area. This helps the music to feel responsive to the player’s actions. The music seems to acknowledge the player’s progress throughout the game. It’s important to think about the way in which individual layers may be activated, and the functions that the layers may be called upon to serve during the course of the game.
In LittleBigPlanet 3, the initial menu system for the game is called “The Pod.” The music for the Pod is arranged in vertical layers that are activated and deactivated according to where the player is in the menu hierarchy. All the layers can be played simultaneously, and they play in multiple combinations… however, each of the individual layers is also associated with a specific portion of the menu system, and is activated when the player enters that particular part of the menu.
Let’s take a quick tour through the layers of the Pod menu music. I’ve embedded some short musical excerpts of each layer. You’ll find the SoundCloud players for each layer embedded below – just click the Play buttons to listen to each excerpt. The first layer of the Pod menu music is associated with the Main Menu, and it features some floaty, science-fiction-inspired textures and effects:
The next layer is associated with a menu labeled “My Levels,” and the music for that layer is very different. Now, woodwinds are accompanied by a gentle harp, combining to create a homey and down-to-earth mood:
Moving on to the music layer for the “Play” menu, we find that the instrumentation now features an ethereal choir and shimmering bells, expressing a much more celestial atmosphere:
Now let’s listen to the “Adventure” menu layer, in which plucked strings and bells combine to deliver a prominent melody line:
Finally, in the music layer associated with the “Community” and “Popit” menus, we hear a quirky mix of synths and effects that hearken back to menu music from previous games in the LittleBigPlanet franchise:
As the player navigates the Pod menu system, these various music layers are activated to correspond with the player’s location within the menu hierarchy. This sort of dynamic music triggering lies at the very heart of the Vertical Layering interactive music mechanism.
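As a hypothetical sketch of that triggering logic (the menu and layer names below are my own shorthand, not the actual implementation), the mechanism boils down to a simple lookup from the player’s menu state to the layer that should be activated:

```python
# Hypothetical mapping of Pod menu screens to music layers (the key
# and layer names are my own shorthand for the description above).
MENU_LAYERS = {
    "main":      "sci_fi_textures",     # floaty science-fiction effects
    "my_levels": "woodwinds_and_harp",  # homey and down-to-earth
    "play":      "choir_and_bells",     # celestial atmosphere
    "adventure": "plucked_melody",      # prominent melody line
    "community": "quirky_synths",       # nods to earlier menu music
    "popit":     "quirky_synths",       # shares a layer with Community
}

def layer_for_menu(menu):
    """Return the layer to activate when the player enters this menu."""
    return MENU_LAYERS[menu]
```

When the player navigates to a new menu, the system activates the looked-up layer and deactivates the previous one, while every layer keeps playing silently in sync underneath.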
Every layer in a Vertical Layering composition can have a very distinct musical identity. When that layer is turned off, the entire mix changes in a noticeable way. The mix can be changed subtly…
… or it can be altered radically, with large scale activations or deactivations of layers. Even with these kinds of dramatic changes, the musical composition retains its identity. The same piece of music continues to play, and the player is conscious of continuing to hear the same musical composition, even though it has just altered in reaction to the circumstances of gameplay and the player’s progress.
In the Pod menu music system, the layers would change in reaction to the player’s menu navigation, which could be either slow and leisurely or brisk and purposeful. Layer activations and deactivations would occur with smooth crossfade transitions as the player moved from one menu to another. Now let’s take a look at a video showing some navigation through the Pod menu system, so we can hear how these musical layers behaved during actual gameplay:
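Smooth crossfades like these are commonly implemented with an equal-power curve, so the overall loudness doesn’t dip in the middle of the transition. Here’s a small illustrative sketch (my own example, not the actual LittleBigPlanet code):

```python
import math

def crossfade_gains(t):
    """Equal-power crossfade gains at progress t in [0, 1].

    Returns (fade_out_gain, fade_in_gain). Perceived loudness stays
    roughly constant because the squared gains always sum to 1.
    """
    t = min(max(t, 0.0), 1.0)
    return math.cos(t * math.pi / 2.0), math.sin(t * math.pi / 2.0)

# At the midpoint both layers play at roughly 0.707 gain, so the
# combined power (0.707**2 + 0.707**2) holds steady at 1.0 instead
# of dipping the way a pair of linear fades would.
out_gain, in_gain = crossfade_gains(0.5)
```

A linear crossfade (where the two gains sum to 1.0) produces an audible dip at the midpoint; the equal-power curve avoids that, which is why it suits music layers that must swap seamlessly.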
As you can see, triggering unique musical layers for different portions of the menu system helps to define them. I hope you found this explanation of the Pod music to be interesting! If you attended GDC but missed my talk on the interactive music of LittleBigPlanet, you’ll be able to find the entire presentation posted as a video in the GDC Vault in just a few weeks. In the meantime, please feel free to add any comments or questions below!
I’m very pleased that The MIT Press, publishers of my book A COMPOSER’S GUIDE TO GAME MUSIC, have arranged for me to sign copies of my book at the official GDC Bookstore during this year’s Game Developers Conference!
This year, A COMPOSER’S GUIDE TO GAME MUSIC has won the Global Music Award for an exceptional book in the field of music, and an Annual Game Music Award for Best Publication in the field of game music. I’m very pleased that my book will be featured at the GDC bookstore this year, and I’m looking forward to the signing event on March 6th!
BreakPoint Books is the official Game Developers Conference bookstore. You’ll find them on the street level in South Hall of the Moscone Center.
If you buy my book at any time during the conference, you can bring it back during the book signing on Friday so that I can sign it for you! Plus, I’d love to meet you!
Remember, the GDC Bookstore is located in the outer lobby of Moscone South Hall, so you don’t need a GDC pass to shop there. If you’re in the San Francisco area and would like to have a copy of my book signed, please feel free to stop by!
GDC Flash Forward
I’m happy to announce that I’ve been invited to participate in this year’s GDC Flash Forward!
This will be the fourth annual GDC Flash Forward event, which this year will kick off the main conference sessions taking place from Wednesday March 4th – Friday March 6th. Like a big “coming attractions” show, the Flash Forward allows attendees to get a first look at sessions that have been selected as especially interesting or noteworthy by the GDC Advisory Board. Out of the over 400 lectures, panels, tutorials and roundtables that take place during GDC Week, the GDC Advisory Board selects around 70 sessions to participate in the Flash Forward, so I’m very pleased to have been asked to participate this year!
During the Flash Forward event at 9:30am on Wednesday March 4th, each speaker will have 30 to 45 seconds to present an enticing preview of their presentation, along with a video clip showing some of the sights that await their session’s attendees. I’ll be presenting a preview of my talk, “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” which will take place on Friday March 6th at 10am in room 3006 West Hall.
Here’s a little more about the Flash Forward, from the official press release:
This year the hour-long session will be headlined by industry veterans Brenda Romero (Romero Games, UCSC) and Laura Fryer (Oculus VR), and they’ll be presenting their own informal take on the state of the industry before participating in what always proves to be a fun, fast-paced event that highlights some of the best GDC 2015 talks.
Flash Forward presenters are hand-picked by the GDC Advisory Board, ensuring that the session will feature an eclectic mix of speakers that represents the full breadth of the conference. Those selected will have the chance to grab attendees’ attention by taking the stage for a brief period of time — 30-45 seconds, tops — to present a rapid-fire overview of what their session is and why it’s worth checking out.
This year’s Flash Forward should be very exciting, and I’m honored to be a part of it! If you’re attending the Game Developers Conference this year, be sure to go to the Flash Forward! It’s sure to be a lot of fun!
IASIG Recommended Sessions
I’m also very pleased and proud that my session, “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” was selected by the Interactive Audio Special Interest Group (IASIG) as a Recommended Session for GDC 2015!
Here’s more about the IASIG, from their official site:
The Interactive Audio Special Interest Group (IASIG) exists to allow developers of audio software, hardware, and content to freely exchange ideas about “interactive audio”. The goal of the group is to improve the performance of interactive applications by influencing hardware and software design, as well as leveraging the combined skills of the audio community to make better tools. The IASIG has been influential in the development of audio standards, features, and APIs for Microsoft Windows and other platforms, and has helped numerous hardware companies define their directions for the future.
I’m so honored that out of the 46 sessions in the GDC Audio Track, the Interactive Audio Special Interest Group selected my presentation as one of their 7 recommended talks! Here’s the whole list of IASIG Recommendations:
- Making Full Use of Orchestral Colors in Interactive Music | Wednesday 11:00-12:00 | West 3002 | Jim Fowler (SCE Worldwide Studios)
- Creating an Interactive Musical Experience for Fantasia: Music Evolved | Wednesday 14:00-15:00 | West 3006 | Jeff Allen (Harmonix Music Systems), Devon Newsom (Harmonix Music Systems)
- BioShock Infinite: Scoring in the Sky, a Postmortem | Wednesday 17:00-18:00 | West 3002 | Garry Schyman (Garry Schyman Productions)
- Peggle Blast: Big Concepts, Small Project | Thursday 10:00-11:00 | West 3006 | RJ Mattingly (PopCap), Jaclyn Shumate (PopCap), Guy Whitmore (PopCap)
- Inspiring Player Creativity in Disney Fantasia: Music Evolved | Thursday 14:00-14:30 | West 3020 | Jonathan Mintz (Harmonix Music Systems)
- LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes | Friday 10:00-11:00 | West 3006 | Winifred Phillips (Generations Productions LLC)
- Where Does the Game End and the Instrument Begin? | Friday 13:30-14:30 | West 3006 | Matt Boch (Harmonix Music Systems), Jon Moldover (Smule Inc.), Nick Bonardi (Ubisoft), David Young (Smule Inc.), Brian Schmidt (Brian Schmidt Studios)
Yesterday I shared some info about my upcoming Audio Bootcamp presentation on Tuesday March 3rd at the Game Developers Conference in San Francisco — and today I’d like to share some information about the second presentation I’ll be giving during the main conference. On Friday, March 6th at 10am, I’ll be giving an Audio Track presentation, in which I’ll have the pleasure of talking about the interactive music system of the LittleBigPlanet franchise. Here is the official description of my conference session from the GDC 2015 Schedule:
“LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes” presents down-to-earth strategies for the design and utilization of a vertical layering music system. Composer Winifred Phillips’ credits include six LittleBigPlanet games (LittleBigPlanet 3, LittleBigPlanet 2, LittleBigPlanet Vita, LittleBigPlanet Cross Controller, LittleBigPlanet Karting, LittleBigPlanet Toy Story). Phillips will discuss her music from the LittleBigPlanet franchise — a series that features one of the most complex vertical layering systems in the field of game audio. Intense challenges often lead to inventive solutions. By virtue of the extreme example embodied by the LittleBigPlanet system, Phillips will share the simple approaches that solved some of the common problems associated with vertical construction. This discussion will be augmented by musical examples from a dozen interactive compositions that Phillips created for LittleBigPlanet games. Attendees will learn techniques to avoid problems in any vertical layering system, regardless of whether that system is simple or extreme.
Through detailed examples from the LittleBigPlanet franchise, Phillips will provide a step-by-step analysis of the process that resulted in a tightly-constructed, six-layer interactive music system. This discussion will provide attendees with practical knowledge that can be applied to their own projects.
This session is for anyone interested in game scoring, interactive music systems and game music implementation strategies. Simple approaches to vertical layering will be accessible to attendees at all levels, while more advanced attendees will appreciate the innovative solutions applied to the complex vertical music system of the LittleBigPlanet franchise.
So, if you’ll be attending GDC in San Francisco on March the 6th, I hope you’ll come to my session!
The Game Developers Conference is nearly here! It’ll be a fantastic week of learning and inspiration from March 2nd – March 6th. On Tuesday March 3rd from 10am – 6pm, the GDC Audio Track will be hosting the ever-popular GDC Audio Bootcamp, and I’m honored to be an Audio Bootcamp speaker this year!
This will be the 14th year for the GDC Audio Bootcamp, and I’m honored to join the 9 other speakers who will present this year:
- Michael Csurics, Voice Director/Writer, The Brightskull Entertainment Group
- Damian Kastbauer, Technical Audio Lead, PopCap Games
- Mark Kilborn, Audio Director, Raven Software
- Richard Ludlow, Audio Director, Hexany Audio
- Peter McConnell, Composer, Little Big Note Music
- Daniel Olsén, Audio, Independent
- Winifred Phillips, Composer, Generations Productions LLC
- Brian Schmidt, Founder, Brian Schmidt Studios
- Scott Selfon, Principal Software Engineering Lead, Microsoft
- Jay Weinland, Head of Audio, Bungie Studios
We’ll all be talking about creative, technical and logistical concerns as they pertain to game sound. My talk will be from 11:15am to 12:15pm, and I’ll be focusing on “Advanced Composition Techniques for Adaptive Systems.”
Here’s a description of my Audio Bootcamp talk:
Interactive music technologies have swept across the video game industry, changing the way that game music is composed, recorded, and implemented. Horizontal Resequencing and Vertical Layering have changed the way that music is integrated in the audio file format, while MIDI, MOD and generative models have changed the landscape of music data in games. With all these changes, how does the game composer, audio director, sound designer and audio engineer address these unique challenges? This talk will present an overview of today’s interactive music techniques, including numerous strategies for the deployment of successful interactive music structures in modern games. Included in the talk: Vertical Layering in additive and interchange systems, how resequencing methods benefit from the use of digital markers, and how traditionally linear music can be integrated into an interactive music system.
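As a tiny illustration of the digital-marker idea mentioned in that description (my own sketch, not material from the talk): in a horizontal resequencing system, markers flag musically sensible exit points, and the playback system waits for the next marker before jumping to a new segment, so transitions always land on a musical boundary rather than mid-phrase.

```python
# Exit markers flag musically sensible transition points, such as bar
# lines or phrase endings. Marker times and the playhead are in seconds.
def next_transition_point(exit_markers, playhead):
    """Return the first exit marker at or after the playhead, or None."""
    for marker in sorted(exit_markers):
        if marker >= playhead:
            return marker
    return None  # past the last marker: wait for the end of the segment

bar_lines = [0.0, 4.0, 8.0, 12.0]  # a hypothetical 16-second cue
jump_at = next_transition_point(bar_lines, 5.3)  # the next bar line: 8.0
```

The same lookup works whether the markers are authored by hand or exported automatically from a DAW’s tempo map; the key design point is that the jump is deferred to the marker instead of happening the instant gameplay requests a new cue.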
Right after my Bootcamp presentation, all the Audio Bootcamp presenters and attendees will head off to the ever-popular Lunchtime Surgeries. No, the attendees won’t actually be able to crack open the minds of the presenters and see what’s going on in there, but as a metaphor, it does represent the core philosophy of this lively event. The Lunchtime Surgeries offer attendees a chance to sit with the presenters at large roundtables and ask lots of questions. It’s one of the most popular portions of the bootcamp, and I’ll be looking forward to it!
If you’ll be attending the GDC Audio Track, then I highly recommend the Audio Bootcamp on Tuesday, March 3rd. Hope to see you there!