
Musical Intelligence
Baldur Baldursson
Baldursson began the presentation by explaining why an intelligent music system for games can be a necessity. “We basically want an intelligent music system because we can’t (or maybe shouldn’t really) precompose all of the elements,” Baldursson explains. He describes the conundrum of creating a musical score for a game whose story is still fluid and changeable, and then asserts, “I think we should find ways of making this better.”
Over the course of the talk, Baldursson argued that the problem of adapting a linear art form (music) to a nonlinear format (interactivity) forces game audio professionals to look for technological solutions to artistic problems. A dynamic music system can best address the issue when it retains the ability to “create the material according to the atmosphere of the game,” Baldursson says. “It should evolve according to the actual progression of the game in real time. It should be possible to control various parameters of music simultaneously.”
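As a purely illustrative sketch (not CALMUS itself), here’s roughly what “controlling various parameters of music simultaneously” can look like in code: the game’s current atmosphere is continuously mapped onto a set of musical controls that a generative engine would consult each time it composes the next phrase. All of the names and mappings below are hypothetical.

```cpp
// Hypothetical mapping from game atmosphere to musical controls; a generative
// engine would re-read these values in real time as it composes.
struct GameAtmosphere {
    float danger;    // 0.0 = safe ... 1.0 = imminent threat
    float emptiness; // how sparse and lonely the current scene feels
};

struct MusicParams {
    float tempoBPM;    // pulse of the generated material
    float noteDensity; // average notes per beat the generator may emit
    float dissonance;  // 0.0 = consonant ... 1.0 = tense, clustered harmony
};

// Re-evaluated continuously, so the score evolves with the game's progression.
MusicParams MapAtmosphere(const GameAtmosphere& a) {
    MusicParams p;
    p.tempoBPM    = 70.0f + 80.0f * a.danger;             // faster as threat rises
    p.noteDensity = 0.5f  + 2.5f  * (1.0f - a.emptiness); // thinner in empty scenes
    p.dissonance  = 0.8f  * a.danger;                     // harmony tenses with danger
    return p;
}
```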
In one demonstration, the output of these algorithms was instantaneously translated into sheet music that appeared on computer tablets for musicians performing live during a dance performance. The result was a reversal of the normal course of creative inception: instead of the dancers moving in accordance with the shapes and activity of the music, the musicians were playing in accordance with the shapes and activity of the dance. Here’s a video that shows this performance in action:
Uses of this generative music technology for video games are still at an experimental stage. During the GDC talk, Baldur Baldursson described how his audio team integrated the CALMUS system into prototypes of the EVE Online game, using the audio middleware Wwise from Audiokinetic. “CALMUS feeds the MIDI events into Wwise,” Baldursson explains, “which hosts the instruments. Currently the system runs outside Wwise, but ideally we’re going to have it as a plugin so that we can use it with other games we’re making.”
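The talk didn’t detail the integration code, but the Wwise SDK does expose a call for posting MIDI to an event’s target instrument, AK::SoundEngine::PostMIDIOnEvent. Here’s a minimal sketch of how a generative system might hand a single note to Wwise through it; the event and game object IDs are hypothetical placeholders, and the actual CALMUS hookup is surely more involved.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkMidiTypes.h>

// Sketch: forward one generated note-on to a Wwise-hosted instrument.
// The MIDI target (synth or sampler) is whatever the posted event points at.
void PostGeneratedNote(AkUniqueID midiEventID, AkGameObjectID musicObject,
                       AkUInt8 note, AkUInt8 velocity, AkUInt8 channel)
{
    AkMIDIPost midi;
    midi.byType = AK_MIDI_EVENT_TYPE_NOTE_ON; // standard note-on status
    midi.byChan = channel;                    // MIDI channel 0-15
    midi.NoteOnOff.byNote = note;             // e.g. 60 = middle C
    midi.NoteOnOff.byVelocity = velocity;
    midi.uOffset = 0;                         // sample offset; 0 = play now

    AK::SoundEngine::PostMIDIOnEvent(midiEventID, musicObject, &midi, 1);
}
```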
Here’s a video showing the prototype of the CALMUS system operating within EVE Online:
The “Intelligent Music for Games” talk was a fascinating exploration of a MIDI-based generative music system. The entire talk is available for viewing by subscribers to the GDC Vault.
Musical Precognition
In linear media (such as films), the narrative is written ahead of time. With the story fully conceived, the musical score can emotionally prepare the listener for events that have not yet occurred. But how do we achieve the same awesome results in nonlinear media, when the story is fluid, and there is no way to predict future events? In the talk, “Precognitive Interactive Music: Yes You Can!,” Microsoft’s senior technical audio director Robert Ridihalgh is joined by former Microsoft audio director Paul Lipson. Together, they explore ways to structure music so that it reacts so quickly to in-game changes that it seems to anticipate what’s to come.
The ultimate goal of the Hindemith system is to build drama in a satisfying way towards a predetermined “event” – which can be any in-game occurrence that the development team would like to carry extra emotional weight. To make this work, the Hindemith system focuses special attention on the moments occurring shortly before the player would encounter and activate the “event.”
In preparing music for the Hindemith system, the video game composer is asked to pore over the musical composition, looking for short, declarative segments that may include musical flourishes or escalations. The composer is not asked to create these short musical segments from scratch, but to isolate them from music that was composed in a traditionally linear way.
Copied and isolated from the larger composition, these short chunks of music are referred to in the system as “Hindebits.” To prepare these Hindebits for use in the interactive music matrix, the composer processes each chunk into many variations, each one representing a small change in tempo and/or pitch. The Hindemith system then strings Hindebits together to bridge the gap between the player’s current position and the position of the event trigger (towards which these Hindebits are emotionally building), calculating how many are required from their lengths and from an extrapolation of the remaining time before the player triggers the event.
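The talk didn’t share implementation code, so here’s a hypothetical sketch of the scheduling idea as described: given an estimated gap (in seconds) before the player reaches the event, select a chain of Hindebit variants whose lengths fill that gap while their intensity ramps toward the event. The Hindebit struct, the intensity tagging, and the greedy nearest-match selection are my own illustrative assumptions, not the Microsoft implementation.

```cpp
#include <cmath>
#include <vector>

struct Hindebit {
    float lengthSec; // duration of this pre-processed variant
    float intensity; // 0.0 = calm ... 1.0 = maximal escalation (assumed tagging)
};

// Fill the predicted gap with Hindebits whose intensity ramps from
// startIntensity to endIntensity as the player nears the trigger.
std::vector<Hindebit> BuildBridge(const std::vector<Hindebit>& pool,
                                  float gapSec, float startIntensity,
                                  float endIntensity)
{
    std::vector<Hindebit> chain;
    float remaining = gapSec;
    while (remaining > 0.0f) {
        // Where are we along the bridge, and what intensity do we want here?
        float t = 1.0f - remaining / gapSec; // 0.0 at start, 1.0 at the event
        float target = startIntensity + t * (endIntensity - startIntensity);

        // Pick the variant that fits the remaining time and best matches
        // the target intensity.
        const Hindebit* best = nullptr;
        for (const Hindebit& b : pool) {
            if (b.lengthSec > remaining) continue; // must fit the gap
            if (!best || std::fabs(b.intensity - target) <
                         std::fabs(best->intensity - target))
                best = &b;
        }
        if (!best) break; // nothing fits; hand off to the event itself
        chain.push_back(*best);
        remaining -= best->lengthSec;
    }
    return chain;
}
```

A real system would also re-run this whenever the time estimate changes, and would presumably randomize among near-equivalent variants to fight repetition, as Ridihalgh describes below.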
“It allows us to react to changes in the amount of time that we’re anticipating it’s going to take a player to get to a particular event,” says Ridihalgh. “And on top of that, because it’s dynamically changing and it’s built up of these bits, we’re able to fight repetition because it’s different each time.”
The system also allows for the creation and utilization of Hindebits that would be activated if the player were to retreat from the event, rather than triggering it. In this way, the musical drama can swell and recede in accordance with the player’s decisions, and without requiring the composer to create tons of short musical snippets from scratch. “Let the composers create their vision and then be able to take that vision and put it into a system like this,” says Ridihalgh. “This is really a holistic system we’ve come up with. There’s a guidance spec for composers, and a guidance spec for developers… it does work. It can really add to gameplay and how the music supports the entire story of the game.”
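Under the same assumptions as the BuildBridge sketch above, retreat behavior falls out naturally: run the same selection logic over the pool of retreat Hindebits, with the intensity target ramping back down instead of up. The names here are invented for illustration.

```cpp
// Hypothetical glue: when the player backs away from the event, bridge from
// the drama's current level back toward calm using the retreat variants.
std::vector<Hindebit> OnPlayerRetreat(const std::vector<Hindebit>& retreatBits,
                                      float currentIntensity,
                                      float estSecondsToCalm)
{
    return BuildBridge(retreatBits, estSecondsToCalm, currentIntensity, 0.0f);
}
```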
The “Precognitive Interactive Music: Yes You Can!” talk offered some interesting new ideas pertaining to musical interactivity for game audio implementation. The entire talk is available for viewing by subscribers to the GDC Vault.
An Example: Spore Hero
In the following video, you’ll see the Spore Hero main menu system in action. There are three basic musical components to this system — the Main Menu music, the Battle Menu music, and the Sporepedia Menu music. All three of these tracks present very different executions of the same melodic content for the Spore Hero Main Theme, with different instrumentation and atmosphere. For instance, the Main Menu is by far the most dramatic in its orchestral treatment, while the Sporepedia Menu is quietly ambient and the Battle Menu is jaunty and primitive. Transitions from one of these tracks to another are seamless because all three are playing simultaneously in a Vertical Layering system (which we’ve discussed in previous blogs); a rough sketch of the layering idea appears below. The shifts in emotion and intensity between the menus are directly attributable to the emotive swells and dips emanating from the interactive music system. Here’s a video that shows this system in action:
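For readers who haven’t seen the earlier posts on Vertical Layering, here’s a generic sketch of the idea (not Spore Hero’s actual audio code): every arrangement plays in sync at all times, and a menu change merely retargets each layer’s gain, which is why the transitions are seamless. The mixer structure and fade logic below are illustrative assumptions.

```cpp
#include <algorithm>
#include <array>

enum Menu { MainMenu = 0, BattleMenu = 1, SporepediaMenu = 2 };

// All three stems run in lockstep; only their volumes change.
struct LayerMixer {
    std::array<float, 3> gain{1.0f, 0.0f, 0.0f};   // current layer volumes
    std::array<float, 3> target{1.0f, 0.0f, 0.0f}; // volumes we're fading toward

    // Called when the player switches menus: solo the matching layer.
    void SetMenu(Menu m) {
        for (int i = 0; i < 3; ++i)
            target[i] = (i == m) ? 1.0f : 0.0f;
    }

    // Called once per audio frame; fadeRate sets the crossfade speed per second.
    void Update(float dtSec, float fadeRate = 0.8f) {
        for (int i = 0; i < 3; ++i) {
            float step = fadeRate * dtSec;
            gain[i] += std::clamp(target[i] - gain[i], -step, step);
            // gain[i] then scales layer i's samples in the final mix.
        }
    }
};
```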
I think this use of the interactive music I originally composed for Spore Hero, now heard in the movie trailer for Disney’s The Jungle Book, also serves to demonstrate that this type of interactive music can be written with the kind of emotional contours associated with linear composition for movie soundtracks:
Conclusion
While the systems described in the presentations from this year’s Game Developers Conference are prototypes, they offer an interesting glimpse into what ambitious game audio teams will be implementing for musical interactivity in the future. I hope you’ve enjoyed this blog that explored a few interactive music systems, and please let me know what you think in the comments section below!
