Game Music Middleware, Part 4: Elias

Welcome back to my blog series that offers tutorial resources exploring game music middleware for the game music composer. I initially planned to write two blog entries on the most popular audio middleware solutions (Wwise and FMOD), but since I started this series I’ve been hearing buzz about other middleware solutions, so I thought it best to expand the series to incorporate other interesting approaches to music implementation in games.  This blog will focus on a brand-new middleware application called Elias, developed by Elias Software.  While not as famous as Wwise or FMOD, this new application offers some intriguing new possibilities for the creation of interactive music in games.

If you’d like to read the first three blog entries in this series, you can find them here:

Game Music Middleware, Part 1: Wwise

Game Music Middleware, Part 2: FMOD

Game Music Middleware, Part 3: Fabric

Elias stands for Elastic Lightweight Integrated Audio System.  It is developed by Kristofer Eng and Philip Bennefall for Microsoft Windows, with a Unity plugin for consoles, mobile devices and browser-based games.  What makes Elias interesting is the philosophy of its design.  Instead of designing a general audio middleware tool with some music capabilities, Eng and Bennefall decided to bypass the sound design arena completely and create a middleware tool specifically outfitted for the game music composer. The middleware comes with an authoring tool called Elias Composer’s Studio that “helps the composer to structure and manage the various themes in the game and bridges the gap between the composer and level designer to ease the music integration process.”

Here’s the introductory video for Elias, produced by Elias Software:

The interactive music system of the Elias middleware application seems to favor a Vertical Layering (or vertical re-orchestration) approach, with a potentially huge number of music layers able to play in many combinations.  The system includes flexible options for layer triggering, including the ability to randomize the activation of the layers to keep the listening experience unpredictable during gameplay.
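To make that randomized-layering idea concrete, here’s a tiny sketch in Python (purely for illustration; this isn’t Elias code, and the layer names are hypothetical) of how a system might choose a random subset of layers each time the music loops:

```python
import random

class RandomizedLayerMix:
    """Toy model of randomized vertical-layer activation.

    Each time the music loops, a random subset of layers is chosen,
    so repeated listens produce different submixes of the same cue.
    """

    def __init__(self, layer_names, min_active=2):
        self.layer_names = list(layer_names)
        self.min_active = min_active

    def next_mix(self):
        # Pick how many layers will sound this pass, then which ones.
        count = random.randint(self.min_active, len(self.layer_names))
        return sorted(random.sample(self.layer_names, count))

# Hypothetical layer names, just to show the behavior.
mix = RandomizedLayerMix(["drums", "bass", "strings", "choir", "bells"])
for loop in range(3):
    print(f"Loop {loop + 1}: {mix.next_mix()}")
```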

Elias Software has produced a series of four tutorial videos for the Composer’s Studio authoring tool.  Here’s the first of the four tutorials:

There’s also a two-part series of tutorials about Elias produced by Dale Crowley, the founder of the game audio services company Gryphondale Studios.  Here’s the first of the two videos:

As a middleware application designed specifically to address the top needs of game music composers, Elias is certainly intriguing!  The software has so far been used in only one published game – Gauntlet, which is the latest entry in the awesome video game franchise first developed by Atari Games for arcade cabinets in 1985.  This newest entry in the franchise was developed by Arrowhead Game Studios for Windows PCs.  We can hear the Elias middleware solution in action in this gameplay video from Gauntlet:

The music of Gauntlet was composed by Erasmus Talbot.  More of his music from Gauntlet is available on his SoundCloud page.

Elias Software recently demonstrated its Elias middleware application on the expo floor of the Nordic Game 2015 conference in Malmö, Sweden (May 20-22, 2015).  Here’s a look at Elias’ booth from the expo:

Since Elias is a brand new application, I’ll be curious to see how widely it is accepted by the game audio community.  A middleware solution that focuses solely on music is definitely a unique approach!  If audio directors and audio programmers embrace Elias, then it may have the potential to give composers better tools and an easier workflow in the creation of interactive music for games.

Game Music Middleware, Part 3: Fabric

Welcome back to my series of blogs that collect some tutorial resources about game music middleware for the game music composer.  I had initially intended to publish two blog entries on this subject, focusing on the most popular audio middleware solutions: Wwise and FMOD.  However, since the Fabric audio middleware has been making such a splash in the game audio community, I thought I’d extend this series to include it.  If you’d like to read the first two blog entries in this series, you can find them here:

Game Music Middleware, Part 1: Wwise

Game Music Middleware, Part 2: FMOD

Fabric is developed by Tazman Audio for the Unity game engine (which enables game development for consoles, PCs, mobile devices such as iOS and Android, and games designed to run within a web browser).  Here’s a Unity game engine overview produced by Unity Technologies:

The Fabric middleware is designed to expand the audio capabilities of the Unity game engine.  The complete product manual for the Fabric middleware is available online.  The video tutorials that I’m featuring below were created by two game audio professionals who have very generously walked us through the use of the software.  If you’d like a more nuts-and-bolts overview of the software features of Fabric, you can find it here.

The first video was shot in 2013 during the Konsoll game development conference in Norway, and gives an overview of the general use of Fabric in game audio. The speaker, Jory Prum, is an accomplished game audio professional whose game credits include The Walking Dead, The Wolf Among Us, Broken Age, SimCity 4, Star Wars: Knights of the Old Republic, and many more.

Making a great sounding Unity game using Fabric

In the next two-part video tutorial, composer Anastasia Devana has expanded on her previous instructional videos about FMOD Studio, focusing now on recreating the same music implementation strategies and techniques using the Fabric middleware in Unity.  Anastasia Devana is an award-winning composer whose game credits include the recently released puzzle game Synergy and the upcoming roleplaying game Anima – Gate of Memories.

Fabric and Unity: Adaptive Music in Angry Bots – Part 1

Fabric and Unity: Adaptive Music in Angry Bots – Part 2

Interactive Game Music of LittleBigPlanet 3 (Concepts from my GDC Talk)

LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes

I was honored to be selected by the Game Developers Conference Advisory Board to present two talks during this year’s GDC in San Francisco earlier this month.  On Friday, March 6th I presented a talk on the music system of the LittleBigPlanet franchise.  Entitled “LittleBigPlanet 3 and Beyond: Taking Your Score to Vertical Extremes,” the talk explored the Vertical Layering music system that has been employed in all of the LittleBigPlanet games (the soundtrack for LittleBigPlanet 3 is available here).  I’ve been on the LittleBigPlanet music composition team for six of the franchise’s games so far, and my talk used many examples from the musical compositions I created for all six of those projects.

After my talk, several audience members let me know that the section of my presentation covering the music system for the Pod menu of LittleBigPlanet 3 was particularly interesting – so I thought I’d share the concepts and examples from that part of my presentation in this blog.

That’s me, giving my GDC speech on the interactive music system of the LittleBigPlanet franchise.  Here I’m just starting the section about the Pod menu music.

The audio team at Media Molecule conceived the dynamic music system for the LittleBigPlanet franchise.  According to the franchise’s music design brief, all interactive tracks in LittleBigPlanet games must be arranged in a vertical layering system.  I discussed this type of interactive music in a blog I published last year, but I’ll recap the system briefly here as well.  In a vertical layering music system, the music is not captured in a single audio recording.  Instead, several audio recordings play in sync with one another, and each layer features unique musical content representing a certain portion of the entire composition.  Played all together, the layers deliver the full mix of the complete musical composition.  Played separately, they yield submixes that are still satisfying and entertaining in their own right.  The music system can play all the layers either together or separately, or can combine the layers into different sets that represent a portion of the whole mix.

When implemented into gameplay, layers are often activated when the player moves into a new area.  This helps the music to feel responsive to the player’s actions.  The music seems to acknowledge the player’s progress throughout the game.  It’s important to think about the way in which individual layers may be activated, and the functions that the layers may be called upon to serve during the course of the game.
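To illustrate the idea, here’s a generic sketch (not the API of any particular middleware; the area names and layer names are hypothetical) of how a game might map each area to a set of active layers while every layer keeps playing in sync:

```python
# Hypothetical area-to-layer mapping; names are illustrative only.
AREA_LAYERS = {
    "village": {"strings", "harp"},
    "caves":   {"strings", "percussion"},
    "boss":    {"strings", "percussion", "brass", "choir"},
}

ALL_LAYERS = {"strings", "harp", "percussion", "brass", "choir"}

def layer_gains(area):
    """Return a per-layer gain: 1.0 for active layers, 0.0 for muted ones.

    All layers keep playing in sync; 'muting' a layer just drops its gain,
    which is what lets the full mix stay phase-aligned when layers return.
    """
    active = AREA_LAYERS.get(area, set())
    return {layer: (1.0 if layer in active else 0.0) for layer in ALL_LAYERS}

print(layer_gains("boss"))
```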

In LittleBigPlanet 3, the initial menu system for the game is called “The Pod.”  The music for the Pod is arranged in vertical layers that are activated and deactivated according to where the player is in the menu hierarchy.  All the layers can be played simultaneously, and they play in multiple combinations… however, each of the individual layers is also associated with a specific portion of the menu system, and is activated when the player enters that particular part of the menu.

Let’s take a quick tour through the layers of the Pod menu music.  I’ve embedded some short musical excerpts of each layer.  You’ll find the SoundCloud players for each layer embedded below – just click the Play buttons to listen to each excerpt.  The first layer of the Pod menu music is associated with the Main Menu, and it features some floaty, science-fiction-inspired textures and effects:

The next layer is associated with a menu labeled “My Levels,” and the music for that layer is very different.  Now, woodwinds are accompanied by a gentle harp, combining to create a homey and down-to-earth mood:

Moving on to the music layer for the “Play” menu, we find that the instrumentation now features an ethereal choir and shimmering bells, expressing a much more celestial atmosphere:

Now let’s listen to the “Adventure” menu layer, in which plucked strings and bells combine to deliver a prominent melody line:

Finally, in the music layer associated with the “Community” and “Popit” menus, we hear a quirky mix of synths and effects that hearken back to menu music from previous games in the LittleBigPlanet franchise:

As the player navigates the Pod menu system, these various music layers are activated to correspond with the player’s location within the menu hierarchy.  This sort of dynamic music triggering lies at the very heart of the Vertical Layering interactive music mechanism.
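Here’s a minimal sketch of that kind of menu-driven triggering (again in Python, purely as illustration; the menu names mirror the tour above, but the layer names and the crossfade step are hypothetical, not the actual game code):

```python
class PodMenuMusic:
    """Sketch of menu-driven layer switching, loosely modeled on the
    Pod menu behavior described above (names are illustrative).
    """

    MENU_LAYERS = {
        "main":      "sci_fi_textures",
        "my_levels": "woodwinds_harp",
        "play":      "choir_bells",
        "adventure": "plucked_strings",
        "community": "quirky_synths",
        "popit":     "quirky_synths",
    }

    def __init__(self):
        self.active_layer = None  # The first call fades in from silence.

    def on_menu_changed(self, menu):
        # Crossfade from the old layer to the one tied to the new menu.
        new_layer = self.MENU_LAYERS[menu]
        if new_layer != self.active_layer:
            print(f"crossfade: {self.active_layer} -> {new_layer}")
            self.active_layer = new_layer

music = PodMenuMusic()
for menu in ["main", "play", "adventure", "community"]:
    music.on_menu_changed(menu)
```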

Every layer in a Vertical Layering composition can have a very distinct musical identity.  When that layer is turned off, the entire mix changes in a noticeable way.  The mix can be changed subtly…

… or it can be altered radically, with large-scale activations or deactivations of layers.  Even with these kinds of dramatic changes, the musical composition retains its identity.  The same piece of music continues to play, and the player is conscious of continuing to hear the same musical composition, even though it has just altered in reaction to the circumstances of gameplay and the player’s progress.

In the Pod menu music system, the layers would change in reaction to the player’s menu navigation, which could be either slow and leisurely or brisk and purposeful.  Layer activations and deactivations would occur with smooth crossfade transitions as the player moved from one menu to another.  Now let’s take a look at a video showing some navigation through the Pod menu system, so we can hear how these musical layers behaved during actual gameplay:

As you can see, triggering unique musical layers for different portions of the menu system helps to define them.  I hope you found this explanation of the Pod music interesting!  If you attended GDC but missed my talk on the interactive music of LittleBigPlanet, you’ll be able to find the entire presentation posted as a video in the GDC Vault in just a few weeks.  In the meantime, please feel free to add any comments or questions below!
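One footnote for the technically curious: smooth crossfade transitions like the ones I mentioned above are often implemented as equal-power crossfades.  Here’s a small sketch of that idea (an assumption on my part for illustration; I’m not describing the actual curve used in LittleBigPlanet 3):

```python
import math

def crossfade_gains(t):
    """Equal-power crossfade at position t in [0, 1].

    Returns (outgoing_gain, incoming_gain).  Equal-power curves keep the
    perceived loudness roughly constant while one layer fades into another.
    """
    return math.cos(t * math.pi / 2), math.sin(t * math.pi / 2)

# Sample the curve at a few points across the transition.
for i in range(5):
    t = i / 4
    out_gain, in_gain = crossfade_gains(t)
    print(f"t={t:.2f}  out={out_gain:.3f}  in={in_gain:.3f}")
```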

GameSoundCon Industry Survey Results

As the GameSoundCon conference draws closer, I thought I’d talk a little bit about the Game Audio Industry Survey that was designed by GameSoundCon Executive Producer Brian Schmidt.  The survey was prepared in response to the broader “Annual Game Developer Salary Survey” offered by industry site Gamasutra.  Since the Gamasutra survey suffered from skewed results for game audio compared to other game industry sectors (owing to lower participation from the game audio community), Schmidt set out to obtain more reliable results by adopting a different approach.

Instead of focusing on the yearly salaries/earnings of audio professionals, the survey concentrated on the money generated by the music/sound of individual projects. Each respondent could fill out the survey repeatedly, entering data for each game project that the respondent had completed during the previous year.  The final results of the survey are meant to reflect how game audio is treated within different types of projects, and the results are quite enlightening, and at times surprising.

The financial results include both small-budget indie games from tiny teams and huge-budget games from behemoth publishers, so there is a broad range in those results.  Since this is the first year that the GameSoundCon Game Audio Industry Survey has been conducted, we don’t yet have data from a previous year with which to compare these results, and it would be very exciting to see how the data shifts if the survey is conducted again in 2015.

Some very intriguing data comes from the section of the survey that provides a picture of who game composers are and how they work.  According to the survey, the majority of game composers are freelancers, and 70% of game music is performed by the freelance composer alone.  56% of composers are also acting as one-stop shops for music and sound effects, likely providing a good audio solution for indie teams with little or no audio personnel of their own.

A surprising and valuable aspect of the survey is to be found in the audio middleware results, which show that the majority of games use either no audio middleware at all, or opt for custom audio tools designed by the game developer.  This information is quite new, and could be tremendously useful to composers working in the field.  While we should all make efforts to gain experience with audio middleware such as FMOD and Wwise, we might keep in mind that there may not be as many opportunities to practice those skills as had been previously anticipated.  Again, this data might be rendered even more meaningful by the results of the survey next year (if it is repeated), to see if commercial middleware is making inroads and becoming more popular over time.

Expanding upon this subject, the survey reveals that only 22% of composers are ever asked to do any kind of music integration (in which the composer assists the team in implementing music files into their game). It seems that for the time being, this task is still falling firmly within the domain of the programmers on most game development teams.

The survey was quite expansive and fascinating, and I’m very pleased that it included questions about both middleware and integration.  If GameSoundCon runs the survey again next year, I’d love to see the addition of some questions about what type of interactivity composers may be asked to introduce into their musical scores, how much of their music is composed in a traditionally linear fashion, and what the ratio of interactive/adaptive to linear music might be per project.  I wrote rather extensively on this subject in my book, and since I’ll also be giving my talk at GameSoundCon this year about composing music for adaptive systems, I’d be very interested in such survey results!

The GameSoundCon Game Audio Industry Survey is an invaluable resource, and is well worth reading in its entirety.  You’ll find it here.  I’ll be giving my talk on “Advanced Composition Techniques for Adaptive Systems” at GameSoundCon at the Millennium Biltmore Hotel in Los Angeles on Wednesday, October 8th.

Many thanks to Brian Schmidt / GameSoundCon for preparing this excellent survey!

A Composer’s Guide to Game Music – Horizontal Resequencing, Part 1

Here’s another installment of a four-part series of videos I produced as a supplement to my book, A Composer’s Guide to Game Music. This video focuses on the Horizontal Resequencing model employed in the Speed Racer video game, providing some visual illustration for this interactive music composition technique. The video demonstrates concepts that are explored in depth in my book, beginning on page 188.
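As a companion to the video, here’s a rough sketch of the Horizontal Resequencing concept (the segment names are hypothetical and aren’t taken from the Speed Racer score): the music is written as short segments, and at each segment boundary the system picks which segment plays next based on the state of the game:

```python
import random

# Hypothetical segment graph for a horizontally resequenced cue: each
# segment lists the segments that may legally follow it.
SEGMENT_GRAPH = {
    "intro":    ["cruise"],
    "cruise":   ["cruise", "overtake"],
    "overtake": ["cruise", "danger"],
    "danger":   ["cruise", "finale"],
    "finale":   [],
}

def resequence(start="intro", max_segments=8):
    """Walk the segment graph, building one possible playback order."""
    current = start
    playlist = [current]
    while SEGMENT_GRAPH[current] and len(playlist) < max_segments:
        # In a real game this choice would be driven by gameplay state;
        # here we simply pick at random to show the branching structure.
        current = random.choice(SEGMENT_GRAPH[current])
        playlist.append(current)
    return playlist

print(" -> ".join(resequence()))
```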