
Programming for storytelling in Figment

Music and storytelling are an important part of Figment. This blog post presents the challenges and questions we faced, and the solutions we found to let designers express their creativity with feasible, easy-to-use technology.

Dion Christensen, Blogger

March 10, 2017


This blog post was initially published on Bedtime Digital Games' devblog.

Greetings, my name is Dion. I am a programmer at Bedtime Digital Games and have been working on Figment since its birth. Figment is quite different from our previous title, Back to Bed. While Figment contains a surreal environment like Back to Bed, the two most significant elements in Figment are, doubtless, its story and music. In this post, we will share some of the steps we have taken to promote these elements through the systems we create.

As you will see, programming for Figment is a journey. Perhaps, in a sense, all programming is. We start off with the desire for a system to solve one specific challenge. While developing a solution, we gain a better understanding of the problem and we get ideas on how to expand the functionality of the system to solve new challenges. We will attempt to tell the chronological story of how the system for audio controlled objects came to be. To do that, we need to start with a slightly different story, which is the story of how the subtitle system came to be.

From the very beginning we knew that Figment would be a game which tells a story. To make that story as accessible as possible to everyone, across languages, we wanted to provide subtitles for the dialogues in the game. So, how do we actually do that? Every time a character in Figment speaks, it happens because we are playing the audio clip containing the voice. We want the corresponding line of text shown on the screen at the same time as it is being spoken - just like when watching a subtitled movie.

A solution for this would be to create a line of text for each audio clip and display this text on the screen for as long as we are playing the audio clip. This solution works well, as long as each audio clip contains no more voice than is appropriate to display on the screen at once. Unfortunately, we soon found out that this is not the case for the audio clips in Figment. The first time the player encounters the Plague Nightmare, the player is "greeted" by a song. As you might expect, the entire song is a single audio clip. For our approach to work, our audio guy would have to cut the song into pieces, so we could match each piece to a line of lyrics to be displayed on the screen. This creates a lot of extra work, which would also have to be redone whenever we decide to make changes to the music. We would also have to put the pieces back together within the game and make sure that each audio clip (piece) plays exactly after the previous one. So yeah, suffice to say, we agreed this is not a good approach.

We needed a solution which would allow us to link multiple lines of text to a single audio clip along with timing information telling us when to display each line of text on the screen. Fortunately, this is the same problem that needs to be solved when displaying subtitles for a movie. A popular format for storing subtitles is the SubRip Text format. If you have ever appropriated a movie from the internet, you may already be aware of this type of file - the filename usually ends with .srt. The image below shows an example of a SubRip Text file.

An example of a SubRip Text file.

The SubRip Text file contains the lyrics for the song that the Plague Nightmare sings when the player meets it for the first time. The file contains four entries (lines of text) to be displayed. The first line of the SubRip Text file (1) signifies the start of an entry and the order of that entry. The second line (00:00:02,611 --> 00:00:04,944) is timing information, stating that the entry is displayed from 2,611 seconds after the beginning of the audio clip until 4,944 seconds after the beginning of the audio clip (meaning it is displayed for a duration of 2,333 seconds). The third line (I'm the Plague, your fear of disease) is the text to display on the screen during the time of the entry. SubRip Text files can be edited with a simple text editor, or we can use one of the many existing free tools. These tools allow us to easily set and adjust timings while playing the audio clip along with the subtitles - like we do in SubtitleEdit, seen below.

The Plague song played in SubtitleEdit
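For the curious, reading such a file back into the game essentially boils down to splitting it into blocks of index, timing and text. The sketch below is not our actual implementation (Figment's version lives in our engine code, and the names here are made up), but it shows the idea:

```python
import re
from dataclasses import dataclass

# Matches a SubRip timing line such as "00:00:02,611 --> 00:00:04,944".
TIMING = re.compile(r"(\d+):(\d+):(\d+),(\d+)\s*-->\s*(\d+):(\d+):(\d+),(\d+)")

@dataclass
class SubtitleEntry:
    index: int    # order of the entry
    start: float  # seconds after the beginning of the audio clip
    end: float
    text: str     # line(s) to display - or, later, a name/identifier

def to_seconds(h, m, s, ms):
    return int(h) * 3600 + int(m) * 60 + int(s) + int(ms) / 1000.0

def parse_srt(path):
    entries = []
    with open(path, encoding="utf-8-sig") as f:
        blocks = f.read().strip().split("\n\n")
    for block in blocks:
        lines = block.strip().splitlines()
        if len(lines) < 3:
            continue
        match = TIMING.match(lines[1])
        start = to_seconds(*match.groups()[:4])
        end = to_seconds(*match.groups()[4:])
        entries.append(SubtitleEntry(int(lines[0]), start, end, "\n".join(lines[2:])))
    return entries
```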

Yay, we now have a system in place for displaying subtitles! We know when to display which lines of text on the screen, based on how far an audio clip has progressed. The result looks as you would expect to see in a movie - see the video below.

A video of the Plague singing
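Under the hood, picking which line to show is just a matter of finding the entry whose time window contains the audio clip's current playback position. Here is a minimal sketch, reusing the entries parsed above (how the playback time is obtained depends on the audio engine):

```python
def active_entry(entries, clip_time):
    """Return the subtitle entry covering the current playback time, if any."""
    for entry in entries:
        if entry.start <= clip_time <= entry.end:
            return entry
    return None  # nothing to display right now
```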

However, we are not quite done yet - as mentioned earlier, we want to provide subtitles in multiple languages. The simplest way of doing this is to create a SubRip Text file for each language and just change the text of each of those. However, we realized that if we did that and later wanted to adjust the timing, we would have to change the same values in as many files as we support languages - which is possibly a lot (our previous game Back to Bed supports 10 languages).

So, workflow-wise, not an excellent solution. Instead we decided to store the actual (translated) lines of text in a separate file along with a name/identifier that we can use to refer to the particular subtitle. Seen below is a localization file which contains lines of text for subtitles.

A snippet of our localization file

The fourth row (PLAGUESONG01_01) of the localization file is the first line in the song the Plague Nightmare sings when the player first encounters it. It has both an English and a Danish version. The name/identifier PLAGUESONG01_01 can now be used to refer to those lines of text.
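Conceptually, the localization file is just a table keyed by name/identifier and language. The sketch below assumes a CSV layout with hypothetical column names - not necessarily our actual format - but the lookup idea is the same:

```python
import csv

def load_localization(path):
    """Build {identifier: {language: text}} from a CSV-style localization file."""
    table = {}
    with open(path, encoding="utf-8") as f:
        for row in csv.DictReader(f):
            identifier = row.pop("Identifier")
            table[identifier] = row  # remaining columns: one per language
    return table

localization = load_localization("Localization.csv")  # hypothetical filename
print(localization["PLAGUESONG01_01"]["English"])     # "I'm the Plague, your fear of disease"
```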

Now, we just need to create a link between the entries in the SubRip Text files and the entries in the localization file. To do this, we replace the lines of text in the SubRip Text file with the names/identifiers, so that it looks like this:

Snippet of our updated Plague Song subtitles file

This is the same SubRip Text file as we saw before, only now the lines are names/identifiers which can be converted to the actual text depending on the language. To get an understanding of how this works, let us have a look at the diagram below.

Diagram showing how our subtitle system selects files

So, starting at the top, when the audio clip Song_W1L01_Plague_01_IntroSong.wav is playing, we examine the current position/time of that audio clip. Based on that, we select the corresponding entry from the SubRip Text file, which provides us with the name/identifier to use. We then look up the row in the localization file and select the line of text based on which subtitle language is currently selected.
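Put together, the chain in the diagram could be sketched roughly like this, assuming the helpers from the earlier sketches:

```python
def current_subtitle(entries, localization, clip_time, language):
    """Playback time -> SubRip entry -> name/identifier -> localized line of text."""
    entry = active_entry(entries, clip_time)
    if entry is None:
        return None
    return localization[entry.text].get(language)  # entry.text is e.g. "PLAGUESONG01_01"
```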

As Figment evolved, it became clear that the music was going to be a big part of the game - to the point that we wanted the gameplay to reflect that. One of the first things our audio guy wanted to do was to let teeth fall from the sky in tune with the music. So, we needed to find a way of doing that. As it turns out, the subtitle system previously described is also a very good basis for doing this sort of thing. We made a slight extension to the text of the SubRip Text entries by allowing a name/identifier in [brackets] with a special meaning, as seen in the example below.

Example of an audio-based event file

This SubRip Text file is not a replacement for the Plague Song displayed earlier. This SubRip Text file belongs to another audio clip - an audio clip which only contains music. The timing of each entry matches a significant beat in the music - we only care about the start time of the entry; the end time has been added to comply with the SubRip Text format. Entry texts in brackets are special. When encountering one of those, we do not try to match the name/identifier with an entry in the localization file - in fact, entries in brackets are not even going to be displayed on the screen. Instead these entries trigger an event. An event is a named occurrence which we can subscribe an action to. In this case, every time a SubRip Text entry with the line [ToothFall] is reached, we allow a tooth to fall from the sky. As we want the tooth to hit the ground at the exact beat of the music, we offset the timing slightly to compensate for the time required for the tooth to reach the ground. You can see the result below.

Example of teeth falling depending on the audio file
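A rough sketch of that extension might look like the snippet below. The names, the single-fire bookkeeping and the lead-time offset are our own illustration rather than Figment's actual code, but they capture the behaviour described above:

```python
subscribers = {}  # event name -> list of callbacks
fired = set()     # entry indices that have already triggered

def subscribe(event_name, callback):
    subscribers.setdefault(event_name, []).append(callback)

def update_audio_events(entries, clip_time, lead_time=0.0):
    """Fire each [bracketed] entry once its start time (minus an optional offset) is reached.
    lead_time compensates e.g. for the time a tooth needs to fall and hit the ground."""
    for entry in entries:
        text = entry.text.strip()
        if not (text.startswith("[") and text.endswith("]")):
            continue  # an ordinary subtitle entry, handled by the subtitle system
        if entry.index in fired or clip_time < entry.start - lead_time:
            continue
        fired.add(entry.index)
        for callback in subscribers.get(text[1:-1], []):
            callback()

# Every time a "[ToothFall]" entry is reached, allow a tooth to fall from the sky.
subscribe("ToothFall", lambda: print("spawn a falling tooth"))
```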

The event system is pretty flexible and so far, we have found uses for it in encounters, enemy behaviour and environment visuals - always with the purpose of making the gameplay or the world of Figment feel in tune with the music.

For a time, we felt confident that the event system would be able to do everything we wanted in relation to making elements within the game match the music. However, a new challenge presented itself. Freedom Isles contains several instruments as part of the environment. We wanted the instruments to appear as if they were playing the background music. To give this impression in a sort of cartoony way, we wanted to deform the models of the trumpets in tune with the music. This is where we hit a limitation with the event system. The event system is able to execute an event whenever a tone starts playing, but to facilitate the effect we want, we need to know the appropriate amount of deformation continuously. Basically, we need a curve which matches the length of the music audio clip, telling us how to deform the trumpet.

We created a tool in which audio guys and designers could view the waveforms of the music and create one or more deformation curves to match the audio clip. Below is the tool window as it looks in our game editor.

Audio wave and asset deformation in our tool

The red line indicates the waveform of the audio clip that has been loaded, in this case Music_FreedomIsles_01_Brass_Loop.wav. The yellow line is a custom deformation curve, created by our audio guy. As you can see, the deformation curve approximately matches the highs and lows in the music. This approach requires a bit of setup and custom handling of the curves but it also ensures that we can customize the curves to make them appear exactly as we want. The deformation curve is being used to control the trumpet in the video below.

Video showing the Trumpet deformation
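At runtime, the idea is simply to evaluate the curve at the audio clip's current playback position every frame and feed the result into the mesh deformation. Our in-editor curves are richer than this, but a minimal sketch with linear interpolation between (time, value) keyframes conveys the principle:

```python
import bisect

def sample_curve(keyframes, clip_time):
    """Linearly interpolate a deformation amount from sorted (time, value) keyframes."""
    times = [t for t, _ in keyframes]
    i = bisect.bisect_right(times, clip_time)
    if i == 0:
        return keyframes[0][1]   # before the first keyframe
    if i == len(keyframes):
        return keyframes[-1][1]  # after the last keyframe
    (t0, v0), (t1, v1) = keyframes[i - 1], keyframes[i]
    alpha = (clip_time - t0) / (t1 - t0)
    return v0 + alpha * (v1 - v0)

# Each frame: deformation = sample_curve(trumpet_curve, music_clip_time)
```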

The deformation system is used for other props in multiple locations in the world of Figment to create a similar effect. In a way, the background music appears to be the natural sounds of different objects in the environment. They all come together as an orchestra to create a musical ambience.

Whenever we develop something, we get new ideas and we discover what kind of additional features we want. When starting out, it is often difficult to predict what kind of features and ideas might occur. As an example, it was difficult for us to imagine that a system for showing subtitles would later be extended to drive gameplay events. With this in mind, it has been necessary to decide on how we want to prepare for changes. To us, it sometimes seems like a tough balance - spending a lot of time creating an intricate solution which can be extended and changed easily, or creating the "quick & dirty" solution which simply solves the particular problem we are facing. Spending too much time creating a perfect™, change-ready solution is not productive if we do not need to extend or make changes later on. On the other hand, doing something well from the start can often save us from regret later on. It seems that the best approach is to ask ourselves: "How aggravating will it be to change this later on?" If the answer is "very", then it is probably an indication that we will benefit from rethinking the solution.

In the end, we cannot predict every change and extension. We still try to plan for changes, but it seems a more important skill is the ability to accept changes. That is, to accept that sometimes requirements are going to change in ways we did not predict, and to be willing to consider that a natural part of development. It means deciding when to extend, when to rewrite and when not to rewrite, and knowing that the tiny, unimportant system we are creating to solve a simple problem now may become a core part of our game later.

As always, you can follow the development of Figment on:
Twitter: https://twitter.com/BedtimeDG
Facebook: https://www.facebook.com/figmentthegame/
Or subscribe to our mailing list (at the bottom of our website!).
 
Cheers!
