I’m pleased to announce that my book, A Composer’s Guide to Game Music, is now available in its new paperback edition! I’m excited that my book has done well enough to merit a paperback release, and I’m looking forward to getting to know a lot of new readers! The paperback is much lighter and more portable than the hardcover. Here’s a view of the front and back covers of the new paperback edition (click the image for a bigger version if you’d like to read the back cover):
As you might expect, many aspiring game composers read my book, and I’m honored that my book is a part of their hunt for the best resources to help them succeed in this very competitive business. When I’m not working in my music studio, I like to keep up with all the great new developments in the game audio field, and I share a lot of what I learn in these articles. Keeping in mind how many of my readers are aspiring composers, I’ve made a point of devoting an article once a year to gathering the top online guidance currently available for newcomers to the game music profession. In previous years I’ve focused solely on recommendations gleaned from the writings of game audio pros, but this time I’d like to expand that focus to include other types of resources that could be helpful. Along the way, we’ll be taking a look at some nuggets of wisdom that have appeared on these sites. So, let’s get started!
Welcome back to this three-article series that’s bringing together the ideas discussed in five different GDC 2017 audio talks about interactive music! These five speakers explored discoveries they’d made while creating interactivity in the music of their own game projects. We’re looking at these ideas side-by-side to broaden our viewpoint and gain a sense of the “bigger picture” when it comes to leading-edge thinking for music interactivity in games. We’ve been looking at the five interactive music systems discussed in these GDC 2017 presentations:
In the first article, we examined the basic nature of these interactive systems. In the second article, we contemplated why those systems were used, discussing the inherent pros and cons of each in turn. So now, let’s get into the nitty-gritty of tools and tips for working with such interactive music systems. If you haven’t read parts one and two of this series, please go do so now and then come back:
Welcome back to our three-article series dedicated to collecting and exploring the ideas discussed in five different GDC 2017 audio talks about interactive music! These five speakers shared ideas they’d developed in the process of creating interactivity in the music of their own game projects. We’re looking at these ideas side-by-side to cultivate a sense of the “bigger picture” when it comes to leading-edge thinking for music interactivity in games. In the first article, we looked at the basic nature of the five interactive music systems discussed in these GDC 2017 presentations:
Okay, so let’s now contemplate some simple but important questions: why were those systems used? What was attractive about each interactive music strategy, and what were the challenges inherent in using those systems?
By video game music composer Winifred Phillips
The 2017 Game Developers Conference could be described as a densely packed, deep-dive exploration of the state-of-the-art tools and methodologies used in modern game development. This description held especially true for the game audio track, wherein top experts in the field offered a plethora of viewpoints and advice on the awesome technical and artistic challenges of creating great sound for games. I’ve given GDC talks for the past three years now (see photo), and every year I’m amazed at the breadth and diversity of the problem-solving approaches discussed by my fellow GDC presenters. Often I’ll emerge from the conference with the impression that we game audio folks are all “doing it our own way,” using widely divergent strategies and tools.
This year, I thought I’d write three articles to collect and explore the ideas that were discussed in five different GDC audio talks. During their presentations, these five speakers all shared their thoughts on best practices and methods for instilling interactivity in modern game music. By absorbing these ideas side-by-side, I thought we might gain a sense of the “bigger picture” when it comes to the current leading-edge thinking for music interactivity in games. In the first article, we’ll look at the basic nature of these interactive systems. We’ll devote the second article to the pros and cons of each system, and in the third article we’ll look at tools and tips shared by these music interactivity experts. Along the way, I’ll also be sharing my thoughts on the subject, and we’ll take a look at musical examples from some of my own projects that demonstrate a few ideas explored in these GDC talks:
The North American Conference on Video Game Music begins this Saturday, and I’m definitely looking forward to giving the keynote speech there! It will be great to talk about some of the concepts from my book, A Composer’s Guide to Game Music, and I’m also very pleased that I’ll have the opportunity to meet such a wonderful collection of scholars in the field of game music study. Since not everyone will be able to travel to Fort Worth this weekend, I thought I’d share some of the stimulating ideas that will be enlivening the conference. Below you’ll find a collection of links to research papers, articles, essays, PowerPoint presentations and YouTube videos that some of the upcoming event’s speakers have previously created on the subject of video game music.
Dominic Arsenault is an assistant professor in the fields of video game design, history and musicology at the University of Montreal, Canada. This weekend he’ll be presenting a paper at the conference entitled “From Attunement to Interference: A Typology of Musical Intertextuality in Video Games.” Below you’ll find a link to his 2008 research paper from Loading… The Journal of the Canadian Game Studies Association. This article explores the mechanics of guitar playing in the music simulation video game Guitar Hero, comparing them to the musicianship of playing a real-world guitar.
William Gibbons is the organizing chair of the North American Conference on Video Game Music, and teaches musicology at Texas Christian University. This weekend he’ll be presenting the paper “Navigating the Musical Uncanny Valley: Red Dead Redemption, Ni no Kuni, and the Dangers of Cinematic Game Scores” at the upcoming conference in Fort Worth. Below you’ll find a link to his research paper on the music of the video game BioShock, as published in 2011 in Game Studies: The International Journal of Computer Game Research.
Julianne M. Grasso is currently a doctoral candidate at the University of Chicago, pursuing her degree in music theory. She’ll be presenting the talk “Intersections of Musical Performance and Play in Video Games” this weekend in Fort Worth. What follows is a link to a fascinating and entertaining essay she wrote in 2009 about her experience writing her undergraduate thesis on the music of Zelda and Final Fantasy for her music degree from Princeton University.
Professor Robert Hamilton teaches in the Department of Music at Stanford University, and is also a lecturer at the California College of the Arts on Experimental Game Development. His presentation this weekend will be “Designing Game-Centric Academic Curricula for Procedural Audio and Music.” Below, you can read his 2007 paper exploring a new interactive music composition system triggered by a gamer’s position and actions within an in-game virtual space. This paper was presented at the International Computer Music Conference in Copenhagen, Denmark.
Professor Christopher J. Hopkins researches chiptune music while teaching in the music department of Long Island University in New York. This weekend he’ll be presenting a paper entitled “Compositional Techniques of Chiptune Music.” Below, you can view an interesting PowerPoint presentation from a talk that Professor Hopkins gave about the discipline of video game sound and music at the 2013 Summer Teaching with Technology Institute.
There’s Always a Lighthouse: Commentary and Foreshadowing in the Diegetic Music of BioShock Infinite
Professor Enoch Jacobus’ fields of research include ludomusicology and music theory pedagogy. He teaches advanced musicianship and orchestration at Asbury University in Kentucky. At the upcoming Fort Worth conference he’ll be presenting a paper on BioShock Infinite entitled “Lighter Than Air: A Return to Columbia.” Happily, Professor Jacobus previously gave a talk on the music of BioShock Infinite at last year’s inaugural North American Conference on Video Game Music, and we can enjoy that presentation via the YouTube video below:
The Origins of Musical Style in Video Games: 1977–1983 (Chapter 12 of The Oxford Handbook of Film Music Studies)
by Neil Lerner (Email at Davidson College: nelerner at davidson dot edu)
Neil Lerner teaches a wide assortment of music courses as a professor in the music department of Davidson College in North Carolina. At the conference in Fort Worth this weekend he’ll be giving a presentation entitled “Teaching the Soundtrack in a Video Game Music Class.” He has been active with several scholarly journals in the field of musicology, having served on the editorial board of Music, Sound, and the Moving Image, and he is currently the secretary of the Society for American Music. He has also had the honor of serving as president of the American Musicological Society-Southeast Chapter. Below is a link to a chapter he contributed to The Oxford Handbook of Film Music Studies, as excerpted on Google Books.
Steven Reale is a music theorist, ludomusicology researcher, and associate professor at Youngstown State University in Ohio. At the Fort Worth conference this weekend he’ll be serving as the program chair. Here’s a 2011 research paper he wrote on the music of the video game Katamari Damacy for the journal ACT, published by The Research Institute for Music Theater Studies in Thurnau, Germany.