By Winifred Phillips | Contact | Follow
Delighted you’re here! I’m very pleased to share that over the next two months I’ll be speaking at two fantastic events focusing on music in video games! My two presentations will explore the unique structure and character of video game music, and how it helps to better envelop players in the worlds that game designers have created. I thought that this article might be a good opportunity to delve into some of the ideas that form the basis of my two upcoming talks. First, I’d like to share some details about the presentations I’ll be giving.
The Library of Congress has invited me to speak this April as a part of their “Augmented Realities” video game music festival. My presentation, “The Interface Between Music Composition and Game Design,” will take place at the Library of Congress in Washington DC. I’m very excited to participate in this event, which will be the first of its kind hosted by the “Concerts from the Library” series at the Library of Congress! The “Augmented Realities” video game music festival will also include panels on video game music history and preservation presented by distinguished curators and archivists at the Library of Congress, a special documentary screening that explores the ChipTunes movement, and a live “game creation lab.” My presentation will be the concluding lecture of the festival, and I’m honored to speak at such an illustrious event! If you find yourself in the Washington DC area on April 6th 2019, you’re very welcome to come to my lecture at the Library of Congress! Tickets are free (first come, first served), and they’re available now via EventBrite.
But before my lecture at the Library of Congress, I’ll be making a trip to San Francisco for the famous Game Developers Conference that takes place this month. For the past few years I’ve been excited and honored to be selected as a Game Developers Conference speaker in the Game Audio track, and I’m happy to share that I’ll be speaking again this month in San Francisco at GDC 2019! My talk this year is entitled “How Music Enhances Virtual Presence.”
As a composer of video game music, I find the concept of Virtual Presence an intriguing topic. I often find myself thinking about how my work as a video game composer can help to more fully envelop players in the awesome virtual worlds that they’re exploring. In my GDC 2019 talk, I’ll be discussing Virtual Presence in connection with seven of the Virtual Reality games that I’ve scored, all of which have either released within the past year or will be released in the coming months.
When I decided to pursue this idea as the basis of my GDC 2019 lecture, I had the pleasure of reading some fascinating research papers and expert opinions on the subject. While most of my research made it into my presentation, some interesting topics couldn’t fit within the time constraints of my GDC 2019 talk.
In light of these limitations, I thought that we could explore these extra ideas in this article – that way we’d have a basis of thought-provoking supplementary information in advance of my upcoming GDC lecture this month. So, let’s get started!
What is Virtual Presence?
In studies conducted at the University of Haifa in Israel, a team of researchers attempted to develop the optimal Virtual Reality user profile, with the aim of determining how to achieve ideal Virtual Presence. For the purposes of their experiments, the research team defined Virtual Presence as “the subjective experience in which the client/subject feels as if s/he is ‘in’ the situation even though it is not real. Presence is influenced by personality and technological factors, as well as by the interaction between the two.”
This is a useful definition of Virtual Presence, and it identifies some important prerequisites that must be met in order to attain it. According to these researchers, in order to experience Virtual Presence, we need to fully accept the artificial environment as authentic, and experience our presence within that environment as convincing and emotionally engaging. It’s no small task. In considering how to accomplish this, I found the viewpoint of Harvard professor and researcher Chris Dede to be very helpful. In an article about video games for the journal Science, Dede wrote, “The more a virtual immersive experience is based on design strategies that combine actional, symbolic, and sensory factors, the greater the participant’s suspension of disbelief that s/he is ‘inside’ a digitally enhanced setting.”
In other words, the more compelling stimuli we receive inside a game world, the more we’ll be willing to suspend our disbelief in favor of the virtual reality with which we’ve been presented… but it has to be believable, and engaging on both an intellectual and emotional level.
Within the body of research aimed at understanding how human beings achieve intellectual and emotional engagement with tasks, the theory of Flow stands out. In writing about Flow for Escapist Magazine, renowned RPG game designer Allen Varney described the sensation as an “intense focus, loss of self, distorted time sense, effortless action.” When we’re experiencing the Flow state, we’re engaging in an activity in which our skill level and the challenge of the task are perfectly balanced. The task isn’t causing anxiety because it’s dauntingly difficult, nor is it boring us because it’s so simple. To quote Goldilocks, the task is “just right.” Let’s flesh out this idea by watching a popular video about Flow produced by journalist Evan Puschak, best known as The Nerdwriter:
Okay, so now that we’ve explored the importance of emotional and intellectual engagement, let’s expand that to connect with Virtual Presence. Game designer Allen Varney describes “intense focus, loss of self, distorted time sense, effortless action” as factors that allow gamers to suspend their disbelief and become enveloped by a game. With that in mind, can these intense mental conditions cause us to consider ourselves “in” the game situation even though it is not real, as described by the University of Haifa researchers? In other words, can Flow lead to Virtual Presence in VR? And if so, what can we (as video game composers) do to help make this happen?
Music can make you smarter
In 2016 I wrote a three-article series about how game music has the capability to temporarily enhance cognitive ability in players, increasing their skill level, which in turn makes the challenges of gaming more enjoyable. While the articles were written specifically in connection with the rigors of strategy gaming, the concepts are applicable to other types of games as well (you can read Parts One, Two, and Three at these links). The idea that ‘listening to music elevates our intelligence’ is not new. Listening to music has been shown to increase cognitive ability, spatial ability, and even IQ. It’s definitely a temporary phenomenon, but it’s well documented. The effect depends on two factors: the music has to be energetic, and it has to be in a major key – that is, it has to be happy. Many of us will be familiar with this concept as the famous Mozart Effect, but subsequent research has shown that the effect depends more on our enjoyment of the music than on who composed it.
The experience of Flow is all about successfully completing tasks that feel satisfyingly challenging. Flow feels great! But in order to experience the full effect, we need both the prerequisite skills to succeed and the focus to respond capably to the demands of the game. We’ve talked about how music can help us perform more skillfully, but can it also focus our minds on the tasks at hand?
According to a study conducted by Dr. Larry Morton of the University of Windsor and published in the Journal of Music Therapy, music can decrease our tendency to be distracted and enhance our focus and memory during tasks. In Morton’s study, subjects were asked to remember a sequence of numerical digits after either listening to music for a while or sitting in silence. The results showed that exposure to music not only improved the subjects’ memory, but also enabled them to focus more effectively on the number sequence.
As Time Goes By
Since we’re exploring game designer Allen Varney’s prerequisites for an enveloping gaming experience, let’s take a look at what Varney calls “distorted time sense.” This entails the sensation that time has ceased behaving predictably and is now zipping along without our conscious awareness. For instance, launching a game on Sunday afternoon, playing for what feels like two or three hours, and then finally looking up from the screen to discover with horror that it is now 3am Monday morning – that is an excellent example of “distorted time sense.” It’s often experienced during the Flow state. Since we’re considering the possibility that this effect could support and enable Virtual Presence in VR, let’s now ask ourselves – how can game music composers distort the sensation of the passage of time?
There are many ways in which music can influence our perception of the passage of time. First, let’s check out this 3-minute video from the BrainCraft series that explores how sensory input (including music) can distort our perception of time:
Now, let’s think in more specific terms about what music can do to alter our perception of time. Dr. James Kellaris of the University of Cincinnati has conducted numerous research studies on this specific subject. Dr. Kellaris is best known for his research into music and the mind, particularly regarding what he calls the musical “earworm” that gets stuck in our heads and won’t let us be. But Kellaris has also extensively investigated the relationship between music and time perception across several studies.
In one of Kellaris’s studies, test subjects were exposed to either ‘happy’ or ‘sad’ music (i.e. music in either a major or a minor key), with the hypothesis that happy music would lead subjects to estimate that they’d been listening for a shorter period of time. It was a serious investigation of whether there’s any truth to the age-old saying that time flies when you’re having fun. The study, however, showed exactly the opposite: music in a happy major key caused listeners to think time was moving more slowly, while music in a minor key made time seem to zip by.
In another study, Kellaris’ research team exposed a group of test subjects to loud and soft music, and then tested their perception of the passage of time. They found that loud music had made time seem swifter, whereas soft music seemed to slow time down.
In a third study conducted by a team led by Kellaris, music complexity was tested against time perception. The study found that intricate music influenced listeners to perceive time as moving faster, and this effect became more pronounced if the intricacy was in the foreground melody.
Finally, Dr. Steve Oakes of the University of Liverpool (an associate of Kellaris) investigated whether musical tempo influenced time perception. In his study, Oakes found that fast tempos made people feel as though they’d been listening for longer than they actually had.
It seems that music can have some very specific effects on the passage of time. Would loud, fast, intricate, minor-mode music cause listeners to feel as though time were racing by? And would this contribute to the sensation of Virtual Presence by enhancing one of the defining characteristics of the Flow state? It’s an interesting possibility to consider.
Empathy: the loss of self
We’ve talked about how music can improve our sense of “intense focus.” We’ve discussed how music can improve our intelligence and skills, leading to “effortless action.” We’ve explored how music can interfere with our sensation of the passage of time, causing a “distorted time sense.” These are three of the four qualities that game designer Allen Varney described as characteristics of the Flow phenomenon in gaming. But what about the fourth? Can music lead to the sensation of a “loss of self?”
Here’s a study that approaches this question from a unique perspective. A University of California study published in the journal Music Perception sought to understand why music can elicit emotion from autistic people. Sad music could make them feel melancholy. Happy music could fill them with cheerfulness. Yet these same people had enormous trouble recognizing or responding to emotions in other human beings.
What was music doing that enabled these people to recognize and respond to happiness and sadness? The researchers, observing these emotional reactions to music from their autistic test group, drew several conclusions, among them a concept of musical empathy built around a “Persona” model (as conceived by music philosopher Jerrold Levinson).
According to this model, when we listen to music, in our subconscious we’re envisioning a persona – an individual who is communicating the emotions of the music to us in the way that people normally do using their faces and bodies.
Our instinctive response is to experience the emotions we’re observing in that other imaginary individual. This reaction is associated with our mirror neuron system. As humans, our brains are hardwired to reflect and imitate the behavior of others, and this includes emotions. Here’s a helpful video of a 7-minute TED talk that explains the mirror neuron system and its importance in human evolution and culture:
In the case of music, the mirror neuron system actually plays an integral part. According to Jerrold Levinson’s Persona model, while listening to music our subconscious minds create an imaginary persona. That is, in the backs of our minds we picture a person conveying the sentiments of the music. Because of our mirror neuron systems, we naturally tend to feel the same emotions that we perceive from others. So, as we subconsciously imagine that “persona” communicating the emotions of the music to us, our mirror neuron system prompts us to feel what this musical persona is feeling. We empathize with those emotions in the same way we would normally empathize with another human being whose emotional state was visibly apparent. Remarkably, this musical effect is so strong that it seems to bypass the usual barriers that prevent autistic people from understanding emotions in others. While autism blocks its sufferers from correctly interpreting emotion in the faces and bodies of other people, the autistic person can nevertheless understand emotion in music, and experience those same emotions, by virtue of the mirror neuron system and Jerrold Levinson’s Persona model working in tandem.
Let’s consider the possibility that emotional music can engender empathy in listeners and help to accentuate the “loss of self” that game designer Allen Varney described. This completes Varney’s four prerequisites for the Flow state, which in turn helps to enable Virtual Presence.
We’ve discussed how music can have a powerful influence over the cognition, perceptions, and emotions of players in Virtual Reality. As game composers, our goal is to assist the development team in creating an unforgettable experience for players. Virtual Reality has the potential to create mind-blowing experiences in which players feel completely transported to another world, and composers have a broad assortment of creative tools to help make that happen. I look forward to going into more detail during my GDC 2019 lecture regarding practical techniques and real-world examples of how music enables Virtual Presence in VR. Below, I’ve included the official GDC session description of my lecture, “How Music Enhances Virtual Presence.” If you’re coming to San Francisco for the Game Developers Conference, I hope you’ll attend my conference session. It would be great to meet you!
How Music Enhances Virtual Presence
(GDC Session Description)
Virtual Presence is defined as a state in which gamers fully accept the virtual world around them and their existence within it. This talk, “How Music Enhances Virtual Presence,” will explore how highly effective game music can enhance the sensation of Virtual Presence in VR gaming.
The talk will begin with an exploration of both the Flow theory of Mihaly Csikszentmihalyi and the research of Dr. Paul Cairns on psychological engagement in video gaming. By understanding how the mental activity of players interacts with the way a game is designed, composers can create music intended to induce psychological states conducive to the formation of Virtual Presence.
The talk will include a discussion of techniques aimed at drawing attention to mission objectives, facilitating effective concentration, enhancing emotional empathy and intensifying player focus. The discussion will also include an exploration of some inherent drawbacks to Virtual Presence, including its fragility when exposed to negative emotional states, and its possible susceptibility to inducing the “event boundary” phenomenon. Musical solutions to these problems will be explored.
Phillips’ talk will offer techniques for composers and audio directors who seek to employ music as a tool to enhance Virtual Presence for their players.
Using examples from several games, Phillips will explore how music can influence the mental states of players through specific effects documented in scientific research. Study data will be discussed with regard to the interaction between music and cognition. Phillips will offer strategies and tips for composers seeking to use their music to influence the player’s mental state, thus facilitating the formation of Virtual Presence.
This session is intended to inspire and stimulate composers seeking to employ their music towards enhancing player engagement and enjoyment, with a particular emphasis on VR games. Includes overview of Flow theory and the psychological components of Virtual Presence, which may be useful to other disciplines within game development. Talk will be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).
Popular music from composer Winifred Phillips’ award-winning Assassin’s Creed Liberation score will be performed live by a top 80-piece orchestra and choir as part of the Assassin’s Creed Symphony World Tour, which kicks off in 2019 with its Los Angeles premiere at the famous Dolby Theatre. As an accomplished video game composer, Phillips is best known for composing music for games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. Phillips’ other notable projects include the triple-A first person shooter Homefront: The Revolution, and numerous virtual reality games, including Scraper: First Strike, Dragon Front, and many more. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games. Phillips is a sought-after public speaker, and she has been invited to speak about her work as a game composer at the Library of Congress, the Game Developers Conference, the Audio Engineering Society, the Society of Composers and Lyricists, and many more. Follow her on Twitter @winphillips.