Game Music Middleware, Part 5: psai


This is a continuation of my blog series on the top audio middleware options for game music composers, this time focusing on the psai Interactive Music Engine for games, developed by Periscope Studio, an audio/music production house. Initially developed as a proprietary middleware solution for use by Periscope’s in-house musicians, the software is now being made available commercially for use by game composers.  In this blog I’ll take a quick look at psai and provide some tutorial resources that will further explore the utility of this audio middleware.  If you’d like to read the first four blog entries in this series on middleware for the game composer, you can find them here:

Game Music Middleware, Part 1: Wwise

Game Music Middleware, Part 2: FMOD

Game Music Middleware, Part 3: Fabric

Game Music Middleware, Part 4: Elias

What is psai?

The name “psai” is an acronym for “Periscope Studio Audio Intelligence,” and its lowercase appearance is intentional.  Like the Elias middleware (explored in a previous installment of this blog series), the psai application attempts to provide a specialized environment specifically tailored to best suit the needs of game composers.  The developers at Periscope Studio claim that psai’s “ease of use is unrivaled,” primarily because the middleware was “designed by videogame composers, who found that the approaches of conventional game audio middleware to interactive music were too complicated and not flexible enough.”  The psai music engine was originally released for PC games, with a version of the software for the popular Unity engine released in January 2015.

psai graphical user interface

Both Elias and psai offer intuitive graphical user interfaces designed to ease the workflow of a game composer. However, unlike Elias, which focuses exclusively on a vertical layering approach to musical interactivity, the psai middleware is structured entirely around horizontal re-sequencing, with no support for vertical layering.  As I described in my book, A Composer’s Guide to Game Music, “the fundamental idea behind horizontal re-sequencing is that when composed carefully and according to certain rules, the sequence of a musical composition can be rearranged.” (Chapter 11, page 188).
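To make the distinction concrete, here’s a minimal C# sketch of the two interactivity models. The names are my own invention, not the actual API of Elias or psai: vertical layering keeps synchronized stems playing and fades them in and out, while horizontal re-sequencing chooses which segment plays next at each segment boundary.

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch only -- not Elias's or psai's actual API.
public static class InteractivityModels
{
    // Vertical layering: all stems play in sync; interactivity comes from
    // fading individual layers in and out as the game's tension changes.
    public static void SetLayerVolumes(Dictionary<string, float> stemVolumes, float tension)
    {
        stemVolumes["pads"]  = 1.0f;                               // always audible
        stemVolumes["drums"] = tension;                            // louder as tension rises
        stemVolumes["brass"] = Math.Max(0f, tension - 0.5f) * 2f;  // enters above 0.5
    }

    // Horizontal re-sequencing: one segment plays at a time; interactivity
    // comes from choosing which segment to queue next at each boundary.
    public static string NextSegment(string currentSegment, float tension)
    {
        return tension > 0.5f ? "combat_segment" : "explore_segment";
    }
}
```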

Music for the psai middleware is composed in what Periscope describes as a “snippets” format, in which short chunks of music are arranged into groups that can then be triggered semi-randomly by the middleware.  The overall musical composition is called a “theme,” and the snippets represent short sections of that theme.  Each snippet is assigned a number representing its degree of emotional intensity (from most intense to most relaxed), and these intensity ratings help determine which snippet will be triggered at any given time.  Other properties designate a snippet as an introductory or ending segment, or bundle it into a “middle” group with a particular intensity level.  Periscope cautions, “The more Middle Segments you provide, the more diversified your Theme will be. The more Middle Segments you provide for a Theme, the less repetition will occur. For a highly dynamic soundtrack make sure to provide a proper number of Segments across different levels of intensity.”
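To illustrate the snippet model, here’s a short, hypothetical C# sketch of a theme choosing its next middle segment by intensity. The class and member names are my own invention for illustration, not psai’s actual API.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical model of psai's Theme/Snippet concepts -- illustrative only.
public enum SegmentRole { Intro, Middle, End }

public class Snippet
{
    public string ClipName;   // the short chunk of audio this snippet plays
    public SegmentRole Role;  // intro, middle, or ending segment
    public int Intensity;     // emotional intensity rating for this snippet
}

public class Theme
{
    private readonly List<Snippet> snippets = new List<Snippet>();
    private readonly Random rng = new Random();

    public void Add(Snippet s) => snippets.Add(s);

    // Choose the next middle segment: filter to "middle" snippets, find those
    // closest to the requested intensity, then pick randomly among them so
    // repetition is reduced when several candidates exist.
    public Snippet NextMiddle(int requestedIntensity)
    {
        var middles = snippets.Where(s => s.Role == SegmentRole.Middle).ToList();
        if (middles.Count == 0)
            throw new InvalidOperationException("Theme has no middle segments.");

        int closest = middles.Min(s => Math.Abs(s.Intensity - requestedIntensity));
        var candidates = middles
            .Where(s => Math.Abs(s.Intensity - requestedIntensity) == closest)
            .ToList();
        return candidates[rng.Next(candidates.Count)];
    }
}
```

Under a scheme like this, authoring more middle snippets at each intensity level enlarges the candidate pool at playback time, which is why Periscope’s advice above reduces audible repetition.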

Here’s an introductory tutorial video produced by Periscope for the psai Interactive Music Engine for videogames:

Because psai only supports horizontal re-sequencing, it’s not as flexible as better-known tools such as Wwise or FMOD, which can support projects that alternate between horizontal and vertical interactivity models.  However, psai’s ease of use may prove alluring for composers who have already planned to implement a horizontal re-sequencing structure for musical interactivity.  The utility of the psai middleware also seems to depend on snippets that are quite short, as demonstrated in the tutorial video above.  Such short snippets may hamper a composer’s ability to develop melodic content (a common limitation of the horizontal re-sequencing model).  It would be helpful if Periscope could demonstrate psai using longer snippets, which might give us a better sense of how musical ideas can be developed within the confines of their dynamic music system.  One can imagine an awesome potential for creativity with this system, if the structure can be adapted to allow for more development of musical ideas over time.

The psai middleware has been used successfully in a handful of game projects, including Black Mirror III, Lost Chronicles of Zerzura, Legends of Pegasus, Mount & Blade II – Bannerlord, and The Devil’s Men.  Here’s some gameplay video that demonstrates the music system of Legends of Pegasus:

And here is some gameplay video that demonstrates the music system of Mount & Blade II – Bannerlord:


Winifred Phillips is an award-winning video game music composer whose most recent project is the triple-A first-person shooter Homefront: The Revolution. Her credits include five of the most famous and popular franchises in video gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press. As a VR game music expert, she writes frequently on the future of music in virtual reality video games. Follow her on Twitter @winphillips.

Game Music Middleware, Part 3: Fabric


Welcome back to my series of blogs collecting tutorial resources on game music middleware for the game composer.  I had initially intended to publish two blog entries on this subject, focusing on the most popular audio middleware solutions: Wwise and FMOD.  However, since the Fabric audio middleware has been making such a splash in the game audio community, I thought I’d extend this series to include it.  If you’d like to read the first two blog entries in this series, you can find them here:

Game Music Middleware, Part 1: Wwise

Game Music Middleware, Part 2: FMOD

Fabric is developed by Tazman Audio for the Unity game engine, which enables game development for consoles, PCs, mobile devices (iOS and Android), and web browsers.  Here’s a Unity game engine overview produced by Unity Technologies:

The Fabric middleware is designed to expand the audio capabilities of the Unity game engine.  The complete product manual for the Fabric middleware is available online.  The video tutorials that I’m featuring below were created by two game audio professionals who have very generously walked us through the use of the software.  If you’d like a more nuts-and-bolts overview of the software features of Fabric, you can find it here.
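To give a sense of what Fabric integration looks like inside a Unity script, here’s a brief C# sketch of a component that posts a Fabric event when the player enters a trigger volume. The event name “Music/Combat” is hypothetical, and while the EventManager.PostEvent call follows the event-posting pattern described in Fabric’s manual, exact names and signatures may vary between Fabric versions, so check the documentation for your release.

```csharp
using UnityEngine;

// Sketch of triggering a Fabric audio event from Unity. Assumes an event
// named "Music/Combat" has been set up in the Fabric editor hierarchy.
public class CombatMusicTrigger : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Fabric resolves the event to whatever components (music
            // switching, fades, etc.) were configured in its hierarchy.
            Fabric.EventManager.Instance.PostEvent("Music/Combat", gameObject);
        }
    }
}
```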

The first video was shot in 2013 during the Konsoll game development conference in Norway, and gives an overview of the general use of Fabric in game audio. The speaker, Jory Prum, is an accomplished game audio professional whose game credits include The Walking Dead, The Wolf Among Us, Broken Age, SimCity 4, Star Wars: Knights of the Old Republic, and many more.

Making a great sounding Unity game using Fabric


In the next two-part video tutorial, composer Anastasia Devana expands on her previous instructional videos about FMOD Studio, recreating the same music implementation strategies and techniques using the Fabric middleware in Unity.  Anastasia Devana is an award-winning composer whose game credits include the recently released puzzle game Synergy and the upcoming roleplaying game Anima – Gate of Memories.

Fabric and Unity: Adaptive Music in Angry Bots – Part 1

Fabric and Unity: Adaptive Music in Angry Bots – Part 2