
By Winifred Phillips
Glad you’re here! I’m video game music composer Winifred Phillips. Today I’d like to share some news about one of my latest projects: the newest installment in an internationally acclaimed fantasy RPG franchise known as The Dark Eye. During our discussion, we’ll break down the structure of one of the most important pieces of music I composed for that game.
The latest entry in the award-winning Dark Eye video game franchise will be released this coming Spring 2020 under the title The Dark Eye: Book of Heroes. Before we begin discussing this project and one of the pieces of music I composed for it, let’s take a look at the announcement trailer that was recently released by the publisher Ulisses Games. The trailer prominently features a sizable portion of the main theme I composed for the game:
As you can see from the gameplay captured in the trailer, The Dark Eye: Book of Heroes is an isometric real-time roleplaying game. The developers have compared the gameplay of Book of Heroes to top RPGs from the classic era such as Baldur’s Gate and Neverwinter Nights. The game offers both solo missions and cooperative adventures designed for up to four players. Most importantly, the developers stress in an interview that their game will be faithful to the awesome fantasy world of the renowned RPG franchise – that it will be “the most Dark Eye game ever.”

Composing a main theme is a heavy responsibility, since main theme tracks tend to be regarded as especially important in a composer’s body of work. Just this week (Nov. 9th) I was interviewed on the Sound Of Gaming radio show on BBC Radio 3, and the main theme for The Dark Eye: Book of Heroes premiered on this broadcast, spotlighting my work as a game composer. The entire show is available to listen to at this link from now until Dec. 8th. A main theme is not only a prominent showcase of a composer’s abilities, but also serves a crucial function within the main score of the game. So let’s explore that idea further.




The Library of Congress has invited me to speak this April as a part of their “Augmented Realities” video game music festival. My presentation, “The Interface Between Music Composition and Game Design,” will take place at the Library of Congress in Washington DC. I’m very excited to participate in this event, which will be the first of its kind hosted by the “Concerts from the Library” series at the Library of Congress! The “Augmented Realities” video game music festival will also include panels on video game music history and preservation presented by distinguished curators and archivists at the Library of Congress, a special documentary screening that explores the ChipTunes movement, and a live “game creation lab.” My presentation will be the concluding lecture of the festival, and I’m honored to speak at such an illustrious event! If you find yourself in the Washington DC area on April 6th 2019, you’re very welcome to come to my lecture at the Library of Congress! Tickets are free (first come, first served).
But before my lecture at the Library of Congress, I’ll be making a trip to San Francisco for the famous Game Developers Conference that takes place this month. For the past few years I’ve been excited and honored to be selected as a Game Developers Conference speaker in the Game Audio track, and I’m happy to share that I’ll be speaking again this month in San Francisco at GDC 2019!

The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences. I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio. This year, the hot topic was virtual reality. In fact, the subject received its own dedicated sub-conference, the VRDC (Virtual Reality Developers Conference), which took place concurrently with the main GDC show. The VRDC didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject. In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks. Along the way, I’ll also share some of my related experience as the composer of the music of the Dragon Front VR game for the Oculus Rift (pictured above).
Where should video game music be in a VR game? Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player? Or should it feel like it’s somehow outside of the VR environment and is instead coasting on top of the experience, being conveyed directly to the player? The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal. Is one of these approaches more effective in VR than the other? Which choice is best?
In psychology, the term ‘affect’ refers to emotion, particularly the way in which emotional content is displayed. Whether by visual or aural means, an emotion cannot be shared without some kind of ‘affect’ that serves as its mode of communication from one person to another. When we’re happy, we smile. When we’re angry, we frown.