Featured Blog | This community-written post highlights the best of what the game industry has to offer. Read more like it on the Game Developer Blogs.

Composers Guy Jones and Kent Carter recently sat down for a video podcast interview with Darren Korb, Audio Director and composer at Supergiant Games, about his experience with audio middleware.

Kent Carter, Blogger

January 28, 2022

8 Min Read

A question, readers: how do you work with audio middleware, and why does it matter? Ask a handful of developers using these powerful software tools to sync their games with music and you're likely to get quite the mixed bag of responses, because there really is no one-size-fits-all solution.

The middleware you choose and how you work with it really boils down to preference, a point that became even more clear during my recent conversation with composer Darren Korb, Part 2 of a video podcast interview I conducted alongside longtime ALIBI composer Guy Jones.

If you haven’t heard his name before, there’s a good chance you’ve heard his music. Darren Korb is the Audio Director and composer for Supergiant Games, and his work is the BGM for Bastion, Transistor, Pyre and Hades. When Darren isn’t composing music for games, he actually plays in a very cool band called Control Group, which you can find on Spotify and all the usual places.

Knowing that Guy and I were getting the opportunity to interview him, we delved into the topic of audio middleware hoping to get his unique perspective on its importance to game developers in building a sonic world.

Darren began by talking about his early experience with middleware, the tools commonly used to set up how sounds play back, load into memory, connect with a game's engine, and ultimately behave during gameplay. He made his middleware foray while composing for "Bastion," where the team used XACT, a free Microsoft tool that was pretty rudimentary.

Supergiant Games' Bastion


By the time he was working on “Transistor,” his team had switched to FMOD, which is what he still uses today. About halfway through the project, they transitioned from FMOD Designer (somewhat similar in style to what he had been used to) to FMOD Studio, which is timeline based and made more sense to him as someone with a background in music production and recording.

“You can load all of your stuff into an actual timeline. You can have multichannel music cues very easily because you just put them all in a single cue under the tracks like you would in a DAW (digital audio workstation) or something,” Darren told us. “So that really opened my mind up when I could see the middleware like a DAW and use it that way.”
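The DAW-like idea Darren describes can be sketched in a few lines: one music event holds several stems on a shared timeline, each with its own volume, and rendering mixes them like tracks in a DAW. This is an illustrative model only; the class and method names are hypothetical, not FMOD Studio's actual API.

```python
# Hypothetical sketch of a middleware-style music event holding several
# synchronized stems on one timeline, mixed like tracks in a DAW.
# Names and structure are illustrative, not FMOD's actual API.

class MusicEvent:
    def __init__(self, name):
        self.name = name
        self.stems = {}  # stem name -> (samples, volume)

    def add_stem(self, stem_name, samples, volume=1.0):
        self.stems[stem_name] = (samples, volume)

    def set_volume(self, stem_name, volume):
        samples, _ = self.stems[stem_name]
        self.stems[stem_name] = (samples, volume)

    def render(self):
        """Mix all stems sample-by-sample, DAW-style."""
        length = max(len(s) for s, _ in self.stems.values())
        mix = [0.0] * length
        for samples, volume in self.stems.values():
            for i, sample in enumerate(samples):
                mix[i] += sample * volume
        return mix

cue = MusicEvent("battle_theme")
cue.add_stem("drums", [0.5, 0.5, 0.5, 0.5])
cue.add_stem("strings", [0.2, 0.4, 0.2, 0.4], volume=0.5)
print(cue.render())
```

Because every stem lives in the same event, a single volume change on one stem (say, ducking the strings) takes effect without touching the rest of the cue, which is exactly the mindset shift Darren describes.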

His experience with FMOD Studio during that project gave him a much better understanding of how the busing system works, the routing of everything and how you’re able to apply effects to certain sounds or a particular reverb state, for example.

“You can do some really, galaxy brain sort of dynamic mixing stuff that really just makes the job a lot easier. I mean, on XACT, I had to beg our engineers to ‘please write me a reverb, please. Pretty please,’” he added with a laugh.

Now, after having used FMOD for nearly a decade, he believes it’s given him a great way to dig in and go super deep on his projects, with each one allowing him to experiment a little bit more in terms of using markers, doing section transitions and dynamic work with stems. And while there are numerous middleware options out there, Darren acknowledged that FMOD is simply what he knows best, just like he and Guy both happen to prefer Logic over Pro Tools when it comes to DAWs.

“I do everything in Logic, kinda dump stuff into a project that they've set up on FMOD. I’m just kinda dumping it into what they've set up and then it's all kinda got the destination paths all sorted and it’s good to go. And that's worked really well for me,” Guy said. “I've looked at Elias and Wwise, and I think that it's good for game developers to understand what the different middleware programs are because they cater to different mindsets.”

Whereas FMOD, with its timeline approach, might be perfect for Guy and Darren, Wwise is more akin to a development tool: it even looks a bit like Unity when opened and takes a more code-oriented approach. Game developers considering audio middleware should first determine whether they will actually be the ones in charge of audio editing.

“If you are, maybe like something like Wwise might work for you. If you're not, and you want the audio guy (or woman) to work on it, then maybe FMOD or Elias is gonna be the way to go.”

Thankfully, all of these companies will let you use their software for free if you're on a low or no budget, I told them, which is incredibly useful when trying to get a feel for the functionality.

We then touched on the hesitancy of some developers to embrace audio middleware due to the perception that they could lose creative control or add to their costs. Yet these programs actually cater to the needs of editors, giving them great flexibility.

Since Darren began using FMOD on "Transistor," where an engineer already well-versed in the software taught him the ropes, he has moved from a very particular "states"-style setup for setting cue parameters to a process that lets him experiment and find new ways to implement ideas. He shared an example of his thought process.

Supergiant Games' Transistor


“Well, wouldn't it be simpler if we just did this, and I'll try and rejigger the system in a new way that's slightly more streamlined or elegant?” he posed. “I'll come up with like a weird edge case scenario where I want there to be a vocal track on this that I can bring in and out, but I want that track to be processed totally separately from all the rest of the music. And I want to be able to put a low pass filter on only that… then I have to route it to a bus and have the automation occur on the bus instead of the actual thing, and I can process this one way, and then that the other way.”
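The routing trick Darren describes can be modeled in miniature: the vocal stem is sent to its own bus, a filter runs on that bus only, and the result is summed back with the untouched music bus. The sketch below uses a simple one-pole low-pass as a stand-in; it is illustrative DSP under assumed names, not FMOD's actual API.

```python
# Hypothetical sketch: a vocal stem routed to its own bus, with a
# low-pass filter applied only on that bus before mixing back in.
# Illustrative only; not FMOD's actual routing or filter implementation.

def low_pass(samples, alpha=0.5):
    """One-pole low-pass: each output sample blends toward the previous one."""
    out, prev = [], 0.0
    for s in samples:
        prev = prev + alpha * (s - prev)
        out.append(prev)
    return out

def mix_buses(music_bus, vocal_bus, vocal_filtered=True):
    # Automation (here, the filter toggle) acts on the vocal BUS,
    # leaving the music bus untouched.
    vocals = low_pass(vocal_bus) if vocal_filtered else vocal_bus
    return [m + v for m, v in zip(music_bus, vocals)]

music = [0.5, 0.5, 0.5, 0.5]
vocals = [1.0, 0.0, 1.0, 0.0]
print(mix_buses(music, vocals))  # -> [1.0, 0.75, 1.125, 0.8125]
```

Because the filter lives on the bus rather than on the source track, the same automation can later drive any effect on that bus without rewiring the event itself.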

Darren believes it really comes down to problem solving: knowing how the system works and using the creative tools available which, over time, yields solutions that keep him from having to solve the same issues over and over again, regardless of which audio middleware or DAW he chooses.

DAWs range from Logic and Digital Performer (which I use) to Cubase, Pro Tools, Reaper and Studio One – a lot of choices for sure.

“At the end of the day, if you've got music and sound effects and dialogue in your game, and you release the game, and people are playing it, and they enjoy it, it doesn't matter what you use. And that goes if you want to code yourself or use any of the forms of middleware,” Guy said.

“And in terms of capabilities, all these are super powerful pieces of software. They just all sort of work a little differently, but they're all going to allow you to do the same kinds of things,” added Darren. “It's just kind of whatever your preference is. I think it really is a lot about like choosing a DAW, where it's just whatever suits your workflow the best. That's the way to go.”

When it comes to workflow, Darren told us that he's integrated into the team when working with a developer, but middleware allows him to build the system himself and have everything configured exactly how he wants it.

“And the way we do it is we keep all the parameter names the same. We have sort of a template for each piece, for a particular context,” he explained. “We set them all up with the same parameter names and the same everything else. And so, whenever a particular piece is playing, all that stuff just automatically works no matter what piece is playing because all the parameters have the same names and they're referred to in script in the same way, if that makes sense.”
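The template idea above is simple to sketch: if every music event exposes the same parameter names, one gameplay script can drive whichever piece is playing without knowing which one it is. The names below (`intensity`, `on_combat_escalates`, the theme names) are hypothetical examples, not taken from Supergiant's actual project.

```python
# Hypothetical sketch of the "same parameter names on every piece" template.
# Names are illustrative, not from Supergiant's real FMOD project or scripts.

SHARED_PARAMS = {"intensity": 0.0, "vocals_active": 0.0}

class MusicPiece:
    def __init__(self, name):
        self.name = name
        self.params = dict(SHARED_PARAMS)  # same template for every piece

    def set_parameter(self, name, value):
        if name not in self.params:
            raise KeyError(f"{self.name} has no parameter {name!r}")
        self.params[name] = value

def on_combat_escalates(current_piece):
    # Works for ANY piece, because the parameter name is guaranteed to exist.
    current_piece.set_parameter("intensity", 1.0)

hades_theme = MusicPiece("hades_theme")
tartarus_theme = MusicPiece("tartarus_theme")
for piece in (hades_theme, tartarus_theme):
    on_combat_escalates(piece)
```

The payoff is exactly what Darren describes: the script refers to parameters one way, and any piece built from the template responds correctly, no special cases per track.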

One of the aspects of audio middleware Darren likes the most is the ability to live sync when you’re running the game in development mode with FMOD open at the same time, a feature he finds particularly useful for tuning sound effects and playing with the sound size attached to a particular action, for example.

“On ‘Bastion,’ what would happen is, I'd make a theoretical change that I couldn't test, I'd have to build banks for 45 minutes or whatever, and then I could check it and see how it worked,” he recalled with a laugh. “So the ability to check my work live is really a huge, huge time saving.”

Ultimately, the ability for game developers to entrust such nuanced tweaking of the sound in their projects to audio designers or composers makes a huge difference because even the little things can impact how consumers experience a game when playing with headphones or speakers.

You can watch and listen to Part 2 of our interview with Darren Korb HERE.

And catch Part 1 HERE.
