At GDC 2022, Eidos-Montreal senior narrative coordinator Rayna Anderson breaks down the tech behind the studio's narrative tools.

Bryant Francis, Senior Editor

March 22, 2022

4 Min Read
The title characters from Marvel's Guardians of the Galaxy. And Cosmo, a good boy.

Good game stories are built on the backs of good game storytelling tools. While students and independent developers can tinker with the possibilities of Twine, larger studios need advanced (often proprietary) tools to translate written dialogue into in-game content.

At the 2022 Game Developers Conference, Marvel's Guardians of the Galaxy senior narrative coordinator Rayna Anderson popped the hood on her team's in-house tool, "Codex," highlighting the pipeline problems Eidos-Montreal needed to solve to support the game's banter-driven gameplay.

Here are a few highlights of the features Codex boasts, in case your tools designer is looking to solve the same problems.

Dialogue for a ragtag band of idiots

One of Guardians of the Galaxy's selling points is its charismatic cast of titular Guardians (their paperwork says they're the Gardeners of the Galaxy, but who's counting?).

This doesn't just manifest in dialogue trees: during almost every moment of gameplay, the Guardians are bantering with each other, with the player character, and with NPCs. It's a very talky game.

With that in mind, Anderson explained how the narrative team began thinking about tools for this pipeline early in the game's development. These are the top-level goals the team needed Codex to meet:

  • Easily readable string IDs

  • Game scripts that could be played quickly

  • Pain-free iteration

  • Ensemble recording

  • Fast audio importing

  • Built-in context and error-checking for localization

Codex has been Eidos-Montreal's dialogue tool for many projects now, so Anderson shared the specific improvements to the tool for Marvel's Guardians of the Galaxy.

First, Eidos-Montreal created a unified filename structure for each new line of dialogue, so that anyone looking at "C02_4140_OS_CNV_SEK_Tumble" could parse what they were looking at.
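
The talk doesn't break down what each segment of that ID encodes, so the field names in this minimal sketch are illustrative assumptions; the point is that a single naming pattern can both validate new IDs and split existing ones into readable parts:

```python
import re

# Hypothetical breakdown of a Codex-style line ID. The meaning of each segment
# is an assumption for illustration only; the talk shows the example string
# "C02_4140_OS_CNV_SEK_Tumble" without defining its parts.
LINE_ID_PATTERN = re.compile(
    r"^(?P<chapter>C\d{2})_"   # e.g. "C02" (assumed: chapter)
    r"(?P<number>\d{4})_"      # e.g. "4140" (assumed: line number)
    r"(?P<tag_a>[A-Z]{2})_"    # e.g. "OS"
    r"(?P<tag_b>[A-Z]{3})_"    # e.g. "CNV"
    r"(?P<tag_c>[A-Z]{3})_"    # e.g. "SEK"
    r"(?P<label>\w+)$"         # e.g. "Tumble" (assumed: human-readable label)
)

def parse_line_id(line_id: str) -> dict:
    """Validate a line ID against the naming convention and split it into segments."""
    match = LINE_ID_PATTERN.match(line_id)
    if match is None:
        raise ValueError(f"line ID does not follow the naming convention: {line_id}")
    return match.groupdict()

print(parse_line_id("C02_4140_OS_CNV_SEK_Tumble"))
```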

When writers entered lines into Codex, the tool automatically generated a text-to-speech audio file whose name was permanently affiliated with that line for the rest of the process. As the team moved beyond text-to-speech to scratch audio and then to voice actor audio, the audio team could count on that filename always being the same.
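
Anderson didn't name the TTS engine behind this step, so the sketch below stands in with the open-source pyttsx3 library; the detail that matters is that the output filename is derived from the line's string ID once and then never changes:

```python
import os
import pyttsx3  # stand-in TTS engine; Codex's actual backend isn't named in the talk

def generate_placeholder_audio(line_id: str, line_text: str, out_dir: str = "scratch_audio") -> str:
    """Render placeholder speech for a line, named by its permanent string ID."""
    os.makedirs(out_dir, exist_ok=True)
    filename = os.path.join(out_dir, f"{line_id}.wav")
    engine = pyttsx3.init()
    engine.save_to_file(line_text, filename)
    engine.runAndWait()
    return filename

# Scratch recordings and final VO later replace the file's contents,
# but downstream systems keep pointing at the same name.
generate_placeholder_audio("C02_4140_OS_CNV_SEK_Tumble", "Hold on to something!")  # placeholder text
```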

Those text-to-speech audio files could then be dropped into gameplay prototypes as fast as possible: first into storyboard animatics of sequences, then, once a level prototype was generated, into rough gameplay, so everyone could evaluate the timing and quality of the dialogue.

Dialogue needed to change fast and frequently. If a line wasn't working, or worked better in a different spot, writers needed to move it elsewhere in the scene. To make sure those danged writers (god, writers, am I right?) could make fast edits, Codex's designers ensured that lines of dialogue retained their string IDs, so moving them wouldn't break the build.
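
Anderson didn't go into Codex's internals here, but one common way to get that property is to store lines in a pool keyed by their string IDs and treat scene order as a separate list of references; a rough sketch under that assumption:

```python
from dataclasses import dataclass, field

@dataclass
class DialogueLine:
    line_id: str   # permanent string ID; never changes once assigned
    speaker: str
    text: str

@dataclass
class Scene:
    name: str
    line_ids: list[str] = field(default_factory=list)  # ordering only, not the lines themselves

def move_line(line_id: str, src: Scene, dst: Scene, position: int) -> None:
    """Reorder a line between scenes without touching the line record itself.

    Because audio files, the build, and localization all key off line_id,
    moving the reference doesn't invalidate anything downstream.
    """
    src.line_ids.remove(line_id)
    dst.line_ids.insert(position, line_id)
```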

Moving on to dialogue recording, Anderson showed off how Codex was designed to export dialogue in different formats for different individuals in the recording booth. Dialogue could be exported as conventional scripts for actors, or as .CSV files for the recording engineers.
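
The export code itself wasn't shown; as a rough illustration of one data source feeding those two audiences, here's a sketch using made-up field names and a placeholder line:

```python
import csv

# Placeholder record with assumed field names; the real Codex schema isn't public.
LINES = [
    {"line_id": "C02_4140_OS_CNV_SEK_Tumble", "character": "Star-Lord",
     "actor": "Placeholder Actor", "text": "Hold on to something!"},
]

def export_csv_for_engineers(rows: list[dict], path: str) -> None:
    """Engineer-facing export: one row per line, string IDs intact for file matching."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["line_id", "character", "actor", "text"])
        writer.writeheader()
        writer.writerows(rows)

def export_script_for_actors(rows: list[dict], path: str) -> None:
    """Actor-facing export: a conventional, readable script without technical IDs."""
    with open(path, "w", encoding="utf-8") as f:
        for row in rows:
            f.write(f"{row['character'].upper()}:\n    {row['text']}\n\n")
```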

Producers prepping for the recording sessions could also filter scripts by character or actor, and engineers could tag "keepers" in Codex so everyone with access to the tool could identify the best audio takes.
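
Again as a sketch rather than Codex's actual implementation, filtering a docket and tagging keepers can be a couple of helpers over the same kind of records (field names are assumptions):

```python
def filter_lines(rows: list[dict], character: str | None = None, actor: str | None = None) -> list[dict]:
    """Narrow a script to a single character or a single actor when prepping a session."""
    return [
        row for row in rows
        if (character is None or row["character"] == character)
        and (actor is None or row["actor"] == actor)
    ]

def mark_keeper(takes: list[dict], keeper_take: int) -> None:
    """Flag one take of a line as the keeper so everyone sees which file to use."""
    for take in takes:
        take["keeper"] = (take["take_number"] == keeper_take)
```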

Codex's engineers also created a system to let producers do meta-analysis of scenes when setting up a day's recording. They could identify how many actors were needed, count how many lines were on the recording docket, and combine those data points for very accurate predictions about how long a recording day could run.
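
Anderson didn't share the pacing numbers behind those predictions, so the lines-per-hour figure below is purely illustrative; the sketch just shows how actor and line counts combine into an estimate:

```python
LINES_PER_HOUR = 50  # illustrative pacing assumption, not a figure from the talk

def recording_day_summary(docket: list[dict]) -> dict:
    """Combine actor count and line count into a rough session-length prediction."""
    actors = sorted({row["actor"] for row in docket})
    line_count = len(docket)
    return {
        "actors": actors,
        "actor_count": len(actors),
        "line_count": line_count,
        "estimated_hours": round(line_count / LINES_PER_HOUR, 1),
    }
```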

"This let us record four actors as an ensemble for an eight-hour day," Rayna explained.

These functions also applied to recording different language performances for the game's localization. When working with recording producers from different parts of the globe, Anderson said that the most notable feedback she received was that the metadata writers could add to Codex was "the most helpful" element for foreign-language actors. 

The extra context in that metadata (explaining the scene direction, other ongoing events in gameplay, etc.) helped performance directors nail the tone of the performances.
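
The talk describes the kind of context rather than a schema, so the fields in this last sketch are assumptions about what such per-line metadata might carry for localization teams:

```python
from dataclasses import dataclass

@dataclass
class LocalizationContext:
    """Per-line metadata writers attach so foreign-language directors can match the tone.

    Field names and examples are illustrative assumptions, not Codex's actual schema.
    """
    line_id: str
    scene_direction: str   # e.g. "whispered while sneaking past a patrol"
    gameplay_context: str  # e.g. "plays during combat, has to read clearly over gunfire"
    emotional_intent: str  # e.g. "deflecting with humor, but genuinely worried"
```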

Anderson called out other improvements for the localization process, capping off her presentation with two major takeaways for any other studios working on such tools. "It's important to have production and technical allies on your team, and quality-of-life features can be a hard sell because return-on-investment isn't immediate," she said.

But those quality-of-life features shaved hours off of the dialogue process for Marvel's Guardians of the Galaxy, making it possible to fit so much dialogue into such a huge game.


About the Author(s)

Bryant Francis

Senior Editor, GameDeveloper.com

Bryant Francis is a writer, journalist, and narrative designer based in Boston, MA. He currently writes for Game Developer, a leading B2B publication for the video game industry. His credits include Proxy Studios' upcoming 4X strategy game Zephon and Amplitude Studios' 2017 game Endless Space 2.
