Untitled Goose Game is the most charming game of the year. But while everyone is enraptured with The Goose and his adorable waddle, the reactive and dynamic music is an underrated element tying the humour together. It also turned out to be an enormous technical challenge, because the developers never intended for the music to be dynamic in the first place, and it took iTunes to solve a key part of the problem.
Dan Golding was the composer on Goose Game, having worked with the House House developers on the music for Push Me Pull You. Initially, the Melbourne developers asked Golding if he could compose the music for the first Goose Game trailer — which was just called Goose then.
But the developers made a specific request: rather than giving Golding a blank slate and asking him to consult on what the music should be, they asked him to work with a particular prelude from Claude Debussy, the French composer whose work is now in the public domain. House House wanted the music to play through the radio that you find in the first section of Goose Game.
But there’s also another element in the initial trailer that became a huge, huge problem for Golding and House House. And no, it’s not the fact that The Goose’s honk actually comes from a duck.
Beyond drawing an enormously favourable response, the trailer was edited in a way that made the music look reactive to The Goose, smoothly transitioning the Debussy prelude from the radio into general background music.
As Golding explained during a talk at this year’s Game Connect Asia Pacific conference in Melbourne, the music actually wasn’t reactive. It was the natural byproduct of Mickey Mousing, a film technique where the music mimics the action on screen.
“We were like, oh, OK, cool,” Golding said, summarising the developers’ reaction to the fan response.
So, giving in to the fans’ demands, House House and Golding knew they had to figure out a way to make reactive music, something they knew absolutely nothing about. Initially, they thought about creating “themed” music, where the Debussy preludes were broken into bits that would play depending on the current state of the game.
It didn’t work as well as they’d planned, however, so they went with the next option. Working off another film theory, in which viewers’ brains create meaning between what they hear and what they see, as well as the basics underpinning the Kuleshov effect, where the meaning of a shot shifts depending on the shot it’s cut against, House House and Golding opted to split the music into different states and let players naturally draw connections between the music and the on-screen action.
To send the imagination into overdrive, Golding created high and low energy recordings of all the Debussy preludes used in-game. Those recordings were then split into what the developers called “phrases” and “stems”, which are basically short snippets of the music.
Golding’s take on the first Debussy prelude, he told the crowd, ran for 2 minutes and 28 seconds. At the end of the process, that one recording was split into 347 separate stems, each of which had high and low energy versions. House House then created three persistent states for the in-game action: one for when you were being actively chased, another for when The Goose was on the prowl or stealthing around, and a third for when there’s no action happening.
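To make that concrete, here’s a minimal sketch in Python of how a game loop might map what The Goose is doing onto those three states and pick which energy recording to pull from. It’s purely illustrative: House House’s actual logic isn’t public, the trigger conditions are invented, and which states lean on which energy level is my assumption.

```python
from enum import Enum

class MusicState(Enum):
    CHASE = "chase"   # The Goose is being actively chased
    PROWL = "prowl"   # The Goose is sneaking around
    IDLE = "idle"     # no action is happening

# Hypothetical mapping from game situation to music state.
def music_state(being_chased: bool, near_human: bool) -> MusicState:
    if being_chased:
        return MusicState.CHASE
    if near_human:
        return MusicState.PROWL
    return MusicState.IDLE

# Assumption: tense states draw on the high-energy stem
# recordings, while idle moments fall back to the low-energy ones.
def energy_for(state: MusicState) -> str:
    return "high" if state in (MusicState.CHASE, MusicState.PROWL) else "low"
```

The point isn’t the specific conditions, it’s that the game only ever has to know one coarse state at a time, and the stems do the rest.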
One 2 minute, 28 second prelude, split into 347 stems, each in two energy versions, shuffled across three game states: the number of possible permutations is enormous. “I don’t think it’s really possible for anybody who’s playing this game to get the same [musical] performance,” Golding said.
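The arithmetic backs him up. A quick back-of-envelope count (the 20-stem run length below is just an illustration, not something from the talk):

```python
# Raw material for the first prelude alone.
stems = 347      # stems cut from the 2m28s recording
versions = 2     # high- and low-energy takes of each stem
clips = stems * versions
print(clips)     # 694 separate audio files

# Tracking only the high/low choice, a run of 20 consecutive
# stems already allows over a million possible energy sequences,
# before the three game states have any say at all.
paths = versions ** 20
print(paths)     # 1048576
```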
But this presented a huge technical problem. How was the game supposed to know when one stem should finish playing, and when to play another one? If you just had a second stem play as soon as the first finished, you’d have sound that was lagging behind the action. It’d also sound hugely mechanical, because players would hear the cutoff between the stems.
That’s where iTunes came in.
To fix the problem with the sounds, Golding ended up exporting all of the stems from Logic Pro X with their own reverb tails. “It sounds like somebody is holding down the sustain pedal with their foot,” he said, which is exactly the kind of tone that suited a sleepy English village.
But while the individual stems sounded better when played back, the game still needed to know when to switch phrases. So he created a spreadsheet listing, in milliseconds, exactly when every stem in the playlist ended. The timings were important because the game needed to be flexible enough to play a new stem or phrase when it needed to, but it couldn’t play tracks over the top of each other.
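In code terms, that’s a lookup problem: given the moment the game wants to change, find the next stem boundary so nothing plays over the top of anything else. A sketch of the idea in Python, with made-up boundary times standing in for Golding’s spreadsheet:

```python
import bisect

# Hypothetical end times (milliseconds) of each stem in one phrase,
# the kind of data Golding tabulated in his spreadsheet.
stem_ends_ms = [0, 1420, 2980, 4510, 6100]

def next_switch_point(now_ms: int) -> int:
    """Return the earliest stem boundary at or after now_ms, so a
    new stem never starts while the current one is mid-phrase."""
    i = bisect.bisect_left(stem_ends_ms, now_ms)
    return stem_ends_ms[min(i, len(stem_ends_ms) - 1)]
```

So a state change at 1.5 seconds in would hold off until the boundary at 2.98 seconds, which is why the music never sounds chopped off.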
Getting the timings, however, was trickier than it seemed. Golding couldn’t find software that would export all the information he needed into a spreadsheet — until he gave up and put everything into iTunes.
“I exported all the files with that reverb audio tail, and then I exported them all again with the hard cutoffs, and then loaded them all into iTunes, the only software that would give me a spreadsheet export of the millisecond timing of every track in a playlist. iTunes is useful for something after all,” he said.
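If you’ve never seen one, an iTunes playlist export is a tab-separated text file. Here’s a rough sketch of turning that kind of export into the running end-times the game needs; the two-column layout and stem names below are invented for illustration, and the real export carries far more columns.

```python
import csv
import io

# Mocked-up fragment of a tab-separated playlist export:
# stem name plus its length in milliseconds (layout assumed).
export = (
    "Name\tTime\n"
    "prelude01_stem001_high\t1420\n"
    "prelude01_stem002_high\t1560\n"
)

def cumulative_ends_ms(tsv_text: str) -> list[int]:
    """Turn per-stem lengths into running end-times, i.e. the
    moments the game knows each stem finishes."""
    ends, total = [], 0
    for row in csv.DictReader(io.StringIO(tsv_text), delimiter="\t"):
        total += int(row["Time"])
        ends.append(total)
    return ends

print(cumulative_ends_ms(export))  # [1420, 2980]
```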
That ability to get all that data was a key part of Goose Game’s soundtrack flowing the way it does, allowing it to switch between phrases and feel truly reactive to the player. It’s one of those elements that’s infinitely more complicated than it first seems, particularly for House House and Golding, who knew nothing about reactive soundtracks when they started, even though everyone assumed they did.
The author’s accommodation for GCAP and PAX Australia 2019 was covered courtesy of Airbnb.