Welcome back to our three-article series dedicated to collecting and exploring the ideas discussed in five GDC 2017 audio talks about interactive music! The five speakers shared approaches they developed while creating interactivity in the music of their own game projects. By examining these ideas side by side, we can cultivate a sense of the "bigger picture" of leading-edge thinking about music interactivity in games. In the first article, we looked at the basic nature of the five interactive music systems discussed in these presentations:
- Always Be Composing: The Flexible Music System of ‘Plants vs. Zombies Heroes’ – Speaker: Becky Allen, Audio Director
- Different Approaches to Game Music (Audio Bootcamp XVI) – Speaker: Leonard J. Paul, Educator
- Epic AND Interactive Music in ‘Final Fantasy XV’ – Speaker: Sho Iwamoto, Audio Programmer
- Interactive Music Approaches (Audio Bootcamp XVI) – Speaker: Steve Green, Sound Designer
- The Sound of ‘No Man’s Sky’ – Speaker: Paul Weir, Audio Director
If you haven’t read part one of this series yet, please go do that now and come back.
Okay, now let’s contemplate some simple but important questions: Why were those systems used? What made each interactive music strategy attractive, and what challenges were inherent in using it?