In a previous blog post, we took a look at a few tutorial resources for the latest version of the Wwise audio middleware. One of the newest innovations in the Wwise software package is a fairly robust MIDI system, which gives music creators and implementers access to the extensive adaptive possibilities of the MIDI format from within the Wwise application. Last month, during the Game Developers Conference at the Moscone Center in San Francisco, members of the PopCap audio team presented a thorough, step-by-step explanation of how this MIDI capability benefited one of their latest projects, Peggle Blast. Since my own talk during the Audio Bootcamp at GDC focused on interactive music and MIDI (with an eye on the role of MIDI in both the history and future of game audio development), I thought we could all benefit from a summation of the ideas discussed during the Peggle Blast talk, particularly as they relate to dynamic MIDI music in Wwise. In this blog, I’ve tried to convey the most important takeaways from that GDC presentation.
“Peggle Blast: Big Concepts, Small Project” was presented on Thursday, March 5th by three members of the PopCap audio team: technical sound designer RJ Mattingly, audio lead Jaclyn Shumate, and senior audio director Guy Whitmore. The presentation began with a quote from Igor Stravinsky:
The more constraints one imposes, the more one frees oneself, and the arbitrariness of the constraint serves only to maintain the precision of the execution.
This idea became a running theme throughout the presentation, as the three audio pros detailed the constraints under which they worked, including:
- A 5 MB memory limit for all audio assets
- Limited CPU
- A 2.5 MB memory allocation for the music elements
These constraints were a result of the mobile platforms (iOS and Android) for which Peggle Blast had been built. For this reason, the music team focused on sounds that could convey lots of emotion while maintaining a very small file size. Early experiments with tracks structured around a music-box instrument made it clear that the team still needed to replicate the musical experience of the full-fledged console versions of the game; a simple music-box score was too unsatisfying, particularly for players familiar with the music from previous installments in the franchise. With that in mind, the team concentrated on very short orchestral samples taken from the orchestral session recordings for Peggle 2. Let’s take a look at a video from those orchestral sessions:
Using those orchestral session recordings, the audio team created custom sample banks tailored specifically to the needs of Peggle Blast, built around many very short instrument articulations and performance techniques.
A few instruments (including a synth pad and some orchestral strings) were edited to loop so that extended note performances became possible, but the vast majority remained brief, punctuated sounds that did not loop. These short sounds were arranged into sample banks in which one or two note samples covered each octave of an instrument’s range, with note tracking transposing the samples to fill in the rest of the octave. Each sample bank consisted of a single layer of sound, which meant that the instruments did not change character with dynamics/velocity. To make the samples more musically pleasing, the team used the built-in digital signal processing capability of Wwise by way of a real-time reverb bus, which gave these short sounds longer, more natural-sounding decay times.
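To make the note-tracking idea concrete, here is a minimal sketch (illustrative names, not Wwise’s actual API) of how a handful of samples can cover a full instrument range: each target note picks the sample with the nearest root, then pitch-shifts it by a resampling ratio of 2^(semitones/12).

```python
def playback_ratio(sample_root: int, target_note: int) -> float:
    """Resampling ratio that transposes a sample to the target MIDI note."""
    semitones = target_note - sample_root
    return 2.0 ** (semitones / 12.0)

def nearest_sample(sample_roots: list[int], target_note: int) -> int:
    """Pick the sample whose root note is closest to the target,
    minimizing the amount of pitch-shifting (and its artifacts)."""
    return min(sample_roots, key=lambda root: abs(root - target_note))

# e.g. one sample per octave, with roots at C3 (48), C4 (60), C5 (72)
roots = [48, 60, 72]
note = 64                             # E4
root = nearest_sample(roots, note)    # chooses the C4 sample (60)
ratio = playback_ratio(root, note)    # 2**(4/12), i.e. up a major third
```

Keeping the shift within a few semitones of each root is what makes one or two samples per octave sound acceptable; larger transpositions would audibly distort the timbre.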
The audio team worked with a beta version of Wwise 2014 during development of Peggle Blast, which allowed them to implement their MIDI score in the Unity game engine. Composer Guy Whitmore wrote the music in a style of whimsically pleasant, non-melodic patterns structured into a series of chunks. These chunks could be triggered by the adaptive system in Peggle Blast, wherein the music moved through key changes (invariably following the circle of fifths) in reaction to the player’s progress. To see how this works, let’s watch an example of some gameplay from Peggle Blast:
As you can see, the game has very little in the way of a foreground melody. In place of a melody, foreground musical tones were emitted when the Peggle ball hit pegs during its descent from the top of the screen. These tones followed a predetermined scale, and the type of scale triggered (major, natural minor, harmonic minor, or Mixolydian) depended on the key in which the music was currently playing. Key information was embedded in the music using markers that indicated where key changes took place, so the Peggle ball would always trigger the correct scale type at any given moment. The MIDI system did not have to store unique MIDI data for the scales in every key; instead, it calculated the key transposition for each scale type based on the current key of the music.
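A minimal sketch of that idea (not PopCap’s actual code): each scale type is stored once as interval offsets, and the current key, advanced around the circle of fifths on each key change, transposes those offsets at runtime, so no per-key MIDI data is needed.

```python
# Each scale type stored once, as semitone offsets from the key's root.
SCALES = {
    "major":          [0, 2, 4, 5, 7, 9, 11],
    "natural_minor":  [0, 2, 3, 5, 7, 8, 10],
    "harmonic_minor": [0, 2, 3, 5, 7, 8, 11],
    "mixolydian":     [0, 2, 4, 5, 7, 9, 10],
}

def next_key(key_root: int) -> int:
    """Advance the key up a perfect fifth (7 semitones, wrapping at the octave)."""
    return (key_root + 7) % 12

def scale_notes(scale_type: str, key_root: int, octave_base: int = 60) -> list[int]:
    """One octave of the given scale, rooted on the current key (MIDI note numbers)."""
    return [octave_base + key_root + step for step in SCALES[scale_type]]

key = 0                                # start in C
key = next_key(key)                    # key change -> G (7)
print(scale_notes("mixolydian", key))  # [67, 69, 71, 72, 74, 76, 77]
```

The savings come from the lookup: four small interval tables replace what would otherwise be forty-eight stored scales (four types times twelve keys).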
The presentation ended with an emphasis on the memory savings and flexibility afforded by MIDI, and the advantages that MIDI presents to game composers and audio teams. It was a very interesting presentation! If you have access to the GDC Vault, you can watch a video of the entire presentation online. Otherwise, there are plenty of other resources on the music of Peggle Blast, and I’ve included a few below: