
Planckogenesis, Part I: Quantizing Events

First in a series about the design and development of Planck v.0, a "musical shooter" recently entered into the 12th Annual Independent Games Festival.

Matthew Burns, Blogger

November 10, 2009

[This entry was crossposted from the Shadegrown Games blog.]

One of the questions we’ve been getting about the Planck v.0 video is “looks neat, but what is really going on there?” It’s true that a video isn’t the ideal way to convey the mechanics, and while I have written up a short explanation of how the game translates play into music, I thought it would be better to describe not just how the game functions today but the path we took to get there over the last few months. Read on for the first of a multi-part series about how Planck v.0 works and how it came to be.

Planck began as an idea for extending the musical properties of a game like Rez, one of the titles people have correctly identified as an inspiration for us. Rez, like Mizuguchi’s other games, combines music and gameplay in a way that suggests an interesting path forward: using game mechanics as a conduit to musical experience.

The classical Western theory of music treats time as broken into discrete chunks that we subdivide into whole notes, half notes, quarter notes, and so on. Playing a note with the wrong timing (think of the sound of doing badly in Guitar Hero or another rhythm action game) means your button press came just a little too late or too early; in other words, it did not fall exactly on one of these grid-like chunks of time.

Thanks to computer-based music production techniques, we can correct notes played by hand and program rhythms that fall exactly on each beat every time; this is called “quantization.” Quantizing can be performed in real time by simply delaying notes until the next available time slot: a note arriving a little too early will actually play when it is supposed to, and a note arriving too late will sound on the next slot.
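
To make that concrete, here is a minimal sketch of real-time quantization in Python, assuming a fixed tempo of 120 BPM; the names are illustrative choices of mine, not taken from any particular engine:

    import math

    BPM = 120.0
    SIXTEENTH = 60.0 / BPM / 4.0  # one sixteenth note = 0.125 s at 120 BPM

    def quantize(event_time, grid=SIXTEENTH):
        """Delay an event to the next grid slot; an event already
        sitting exactly on a slot plays immediately."""
        return math.ceil(event_time / grid) * grid

    # A note arriving slightly early (0.49 s) plays on the intended slot
    # at 0.5 s; one arriving slightly late (0.51 s) sounds on the
    # following slot at 0.625 s.
    print(quantize(0.49))  # 0.5
    print(quantize(0.51))  # 0.625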

Quantization is a venerable, well-understood feature of music software, and it isn’t new to real-time applications such as games, either; Rez is one of the most prominent examples. Acquiring targets and firing weapons in Rez trigger musical sounds that are played over the top of the level’s track, on its beat.

While quantizing like this works great for sounds that can occur on every sixteenth note, such as hi-hats or claps, bigger sounds like a bass drum or cymbal crash simply should not occur that fast (except, possibly, during a fill). Rez gets around this by playing its large crash sound only when the player has queued up and fired eight successive shots, timed to eighth notes, meaning the fastest the crash could ever happen twice in rapid succession is every other whole note. Other games, such as the great Xbox Live Indie Games title Groov, place that type of sound on the player death event, which is also uncommon enough that it works well.

A shortcoming of this approach is that we cannot arrange the pattern of big sounds as musically as we might like. If we quantize too aggressively, such as delaying bass drum sounds to every whole note, the gap between the player’s action and the sound of the note becomes noticeably long, and the “gamey” portion of the experience gets watered down. The first part of the Planck idea, then, was to try to combine instant visual feedback of game events (such as destroying an enemy) with time-delayed reactions that make musical sense.
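
The arithmetic behind that tradeoff is simple. Assuming a tempo of 120 BPM (my number, not anything from Planck), quantizing to whole notes can hold a sound back by up to two full seconds, versus at most 125 milliseconds on a sixteenth-note grid:

    BPM = 120.0
    whole_note = 4 * 60.0 / BPM   # 2.0 s between whole notes at 120 BPM
    sixteenth = whole_note / 16   # 0.125 s between sixteenth notes
    # Worst-case lag between the player's action and the quantized sound:
    print(whole_note, sixteenth)  # up to 2 s vs. at most 125 ms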

Planck’s attempt to solve this problem centers on giving instant visual feedback that an enemy has been destroyed while keeping a “destroyed” version of the enemy in play until the next available note:

1. The player defeats an enemy by shooting at it.

2. The enemy turns black and emits “smoke” until the next allowable note arrives.

3. At the instant of the next allowable note, the enemy “explodes,” making its associated sound.
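
In code, the flow might look something like the sketch below. This is my own reconstruction of the three steps above; the class and names are hypothetical, not Planck’s actual implementation:

    class Enemy:
        """Hypothetical sketch of the smoke-then-explode flow."""

        def __init__(self, sound):
            self.state = "alive"
            self.sound = sound        # e.g. this enemy's bass drum sample
            self.explode_at = None

        def take_fatal_hit(self, next_note_time):
            # Steps 1-2: instant visual feedback (turn black, emit smoke)
            # while the explosion is scheduled for the next allowable note.
            self.state = "smoking"
            self.explode_at = next_note_time

        def update(self, now, play_sound):
            # Step 3: at the allowable note, explode and play the sound.
            if self.state == "smoking" and now >= self.explode_at:
                self.state = "exploded"
                play_sound(self.sound)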

This approach allows us to create “patterns” of allowable notes, meaning we are not restricted to simple grids of whole-, half-, eighth- or sixteenth-note durations but can use a specific trigger sequence like those found in a typical electronic beatbox. Using this method, the bass drum in a breakbeat, a synth lead solo, and many other musical sounds can potentially be associated with individual enemies.
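
A trigger pattern like that could be as simple as an array of allowable steps per bar, in the spirit of a step sequencer. Again, this is an illustrative sketch rather than Planck’s actual data format:

    # Hypothetical 16-step pattern: a 1 marks a step on which this
    # enemy's sound is allowed to fire (a simple breakbeat-style kick).
    KICK_PATTERN = [1, 0, 0, 0,  0, 0, 1, 0,  0, 0, 1, 0,  0, 0, 0, 0]

    def next_allowed_step(pattern, current_step):
        """Search forward, wrapping around the bar, for the next step
        on which the pattern permits a trigger."""
        for offset in range(1, len(pattern) + 1):
            step = (current_step + offset) % len(pattern)
            if pattern[step]:
                return step
        return None  # an all-zero pattern permits no trigger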

I wrote down a detailed specification of this feature and shared it with a friend of mine in Los Angeles, who said, “Funny you should mention that: we just had a summer intern who was really interested in music games and who was looking to work on something. Maybe the two of you should get in touch.” This is how I ended up working with Brenton Woodrow, which I’ll get into in the next installment of Planckogenesis.
