This is a continuation of my blog series on the top audio middleware options for game music composers, this time focusing on the psai Interactive Music Engine for games, developed by Periscope Studio, an audio/music production house. Initially developed as a proprietary middleware solution for use by Periscope’s in-house musicians, the software is now being made available commercially for use by game composers. In this blog I’ll take a quick look at psai and provide some tutorial resources that will further explore the utility of this audio middleware. If you’d like to read the first four blog entries in this series on middleware for the game composer, you can find them here:
Game Music Middleware, Part 1: Wwise
Game Music Middleware, Part 2: FMOD
Game Music Middleware, Part 3: Fabric
Game Music Middleware, Part 4: Elias
What is psai?
The name “psai” is an acronym for “Periscope Studio Audio Intelligence,” and its lowercase appearance is intentional. Like the Elias middleware (explored in a previous installment of this blog series), the psai application aims to provide an environment tailored specifically to the needs of game composers. The developers at Periscope Studio claim that psai’s “ease of use is unrivaled,” primarily because the middleware was “designed by videogame composers, who found that the approaches of conventional game audio middleware to interactive music were too complicated and not flexible enough.” The psai music engine was originally released for PC games, with a version of the software for the popular Unity engine released in January 2015.
psai graphical user interface
Both Elias and psai offer intuitive graphical user interfaces designed to ease the workflow of a game composer. However, unlike Elias, which focuses exclusively on a vertical layering approach to musical interactivity, the psai middleware is structured entirely around horizontal re-sequencing, with no support for vertical layering. As I described in my book, A Composer’s Guide to Game Music, “the fundamental idea behind horizontal re-sequencing is that when composed carefully and according to certain rules, the sequence of a musical composition can be rearranged.” (Chapter 11, page 188).
Music for the psai middleware is composed in what Periscope describes as a “snippets” format, in which short chunks of music are arranged into groups that can then be triggered semi-randomly by the middleware. The overall musical composition is called a “theme,” and the snippets represent short sections of that theme. The snippets are assigned numbers that best represent degrees of emotional intensity (from most intense to most relaxed), and these intensity numbers help determine which of the snippets will be triggered at any given time. Other property assignments include whether a snippet is designated as an introductory or ending segment, or whether the snippet is bundled into a “middle” group with a particular intensity designation. Periscope cautions, “The more Middle Segments you provide, the more diversified your Theme will be. The more Middle Segments you provide for a Theme, the less repetition will occur. For a highly dynamic soundtrack make sure to provide a proper number of Segments across different levels of intensity.”
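To make the theme/snippet model more concrete, here is a minimal sketch of how such a system could work in principle. This is purely illustrative pseudocode in Python, not psai’s actual API: the class, its fields, and the selection logic (intro and ending snippets, “middle” groups keyed by intensity level, and avoidance of immediate repeats) are all my own assumptions based on Periscope’s description above.

```python
import random

class Theme:
    """Hypothetical model of a psai-style theme (not the real psai API)."""

    def __init__(self, intro, middles, ending):
        self.intro = intro        # list of intro snippet names
        self.middles = middles    # dict: intensity level -> list of middle snippets
        self.ending = ending      # list of ending snippet names
        self.last_played = None   # remembered to reduce repetition

    def pick_middle(self, intensity):
        """Pick a middle snippet from the group whose intensity level is
        closest to the requested intensity, avoiding an immediate repeat
        whenever an alternative exists."""
        level = min(self.middles, key=lambda lv: abs(lv - intensity))
        candidates = [s for s in self.middles[level] if s != self.last_played]
        choice = random.choice(candidates or self.middles[level])
        self.last_played = choice
        return choice

# Example: a battle theme with relaxed and intense middle groups
battle = Theme(
    intro=["battle_intro"],
    middles={
        30: ["calm_a", "calm_b"],              # relaxed material
        70: ["clash_a", "clash_b", "clash_c"], # intense material
    },
    ending=["battle_outro"],
)

snippet = battle.pick_middle(intensity=65)  # selects from the level-70 group
```

As Periscope’s note suggests, the more middle snippets each intensity group contains, the less often the same snippet recurs back-to-back, which is why a larger pool of middle segments yields a more varied soundtrack.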
Here’s an introductory tutorial video produced by Periscope for the psai Interactive Music Engine for videogames:
Because psai only supports horizontal re-sequencing, it’s not as flexible as better-known tools such as Wwise or FMOD, which can support projects that alternate between horizontal and vertical interactivity models. However, psai’s ease of use may prove alluring for composers who have already planned to implement a horizontal re-sequencing structure for musical interactivity. The utility of the psai middleware also seems to depend on snippets that are quite short, as demonstrated in the above tutorial video produced by Periscope Studio. Such short building blocks may limit a composer’s ability to develop melodic content, a difficulty that sometimes arises in any horizontal re-sequencing model. It would be helpful if Periscope could demonstrate psai with longer snippets, which might give us a better sense of how musical ideas can be developed within the confines of their dynamic music system. One can imagine an awesome potential for creativity with this system, if the structure can be adapted to allow for more development of musical ideas over time.
The psai middleware has been used successfully in a handful of game projects, including Black Mirror III, Lost Chronicles of Zerzura, Legends of Pegasus, Mount & Blade II – Bannerlord, and The Devil’s Men. Here’s some gameplay video that demonstrates the music system of Legends of Pegasus:
And here is some gameplay video that demonstrates the music system of Mount & Blade II – Bannerlord:
Winifred Phillips is an award-winning video game music composer whose most recent project is the triple-A first person shooter Homefront: The Revolution. Her credits include five of the most famous and popular franchises in video gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the Massachusetts Institute of Technology Press. As a VR game music expert, she writes frequently on the future of music in virtual reality video games. Follow her on Twitter @winphillips.