We recently caught up with BAFTA-nominated composer, musician, and producer Petri Alanko to learn more about his anomalous approach to audio on Control.

Chris Kerr, News Editor

October 4, 2019

If you've ever played a Remedy game, you've also heard the musical stylings of Petri Alanko. The Finnish composer has worked with the studio on its last three projects -- Alan Wake, Quantum Break, and its most recent leap into the surreal, Control -- and after a quick chat with the soundsmith it's easy to see why the two appear to be inseparable at this point. 

You see, much like Remedy, Alanko evidently isn't afraid to go tumbling down the rabbit hole. During his time on Control, he sought out new sounds using unconventional techniques such as dragging tools across a bare piano frame and hauling wood around his garage floor. On paper, that might sound like an exercise in futility, but those audio experiments paid off tenfold, helping Alanko create a soundscape that melts effortlessly into the eldritch surroundings of the Oldest House, the beautifully brutalist skyscraper that serves as the setting for Control.

Keen to hear more about Alanko's anomalous approach to audio, we sat down with the BAFTA-nominated composer, musician, and producer for what turned out to be a rather extensive Q&A.

Gamasutra: How do you generally size up a new project? Are there certain universal rituals you have, or does it vary on a case-by-case basis?

Petri Alanko: Every project begins with a laid-back planning phase, which takes about a week or two. During that time I just think about what would be cool, and what will ultimately benefit the project by helping it stand above the noise.

After the planning phase, I usually record a lot of material very vigorously; I literally disappear into the sounds with my microphone arsenal and recording devices. I try to stay away from computers during this phase, as using different standalone recorders (I own an old small Nagra, a Roland 8-track, and about a dozen tiny dictaphones from the 1970s to 2019, as well as some digital two-, four- and six-track machines) helps me concentrate on listening rather than seeing, which always happens when you work with computers. When listening to a mixed track, I always switch off the displays.

During that recording phase I sometimes pick up some stuff that needs to be developed a little further right away. That usually leads to the first demos, where I test the sounds I've collected just to find out whether they're a good building block. Some of those demos (and on occasion even all of them) will end up in the final product one way or another. There's something really crucial in that first bite.

How much do you experiment during those early stages, and what sort of instruments and tech did you play around with on Control?

Oh, I experiment a lot. There was a time when I didn't use any score sheets at all, but my process has changed a little. That said, I still pause for a week or two before really putting any writing down, as I tend to believe the chaff disappears from your mind during that pause. If it's worth writing down, it'll stay with you.

In the case of Control, I enjoyed using a piano frame, stripped of all its wooden parts, as a basis for sounds. I actually used a screwdriver and an Allen key to pick and bang the strings to open up some cool ideas, which produced some particularly fascinating results once I started doing a lot of tuning tricks to the strings. I also recorded a lot of wooden material -- the heavier, the better.

I dragged them across my garage floor just to provide some harmonic series screeching, and then tuned (or rather mistuned) the hell out of them with Melodyne. The human brain is wonderful: when something doesn't apply to a harmonic series, you try to fix it, and the results are strange. 

I read an interview the other day in which Remedy explained they wanted Control to be both mundane and otherworldly. How did you marry those two contrasting ideologies?

Well, releasing yourself from the chains of melodies and harmonies can be really relaxing, actually. The sounds and the music needed to complement the picture and the emotion. Now, the contrast itself isn't difficult to manifest. It's the subtlety that's hard to achieve; one should never ever underline and emphasize the action on screen. Everybody can pick that up, which means my job is to interpret the motive behind the obvious and tie that with the needed amount of perceivable, most required action.

But in practice, the otherworldliness was actually the hardest feature to fill in. It would've been easy to take the route of digital hassle and trickery, but that could have sounded dated. What struck me was finding a suitable means of getting a similar effect using something physical that could be recorded -- using real-world "plugins" right from the start.

After all, Control's world relies on the most basic, even mechanical and analogue, elements due to their nature of being "not hackable." That left me wondering: if there had been stutter or glitching in the 1960s and 1970s, what would it have sounded like, and how would it have been achieved?

To realize that effect, I ended up recording stuff through tissue paper rolls and buckets, before turning those raw sounds into collages or sketches of themselves. Of course, that needed some digital processing, but as a concept it was watertight, and it works brilliantly in the game.

Otherworldliness also appears in the distorted overtone series and tunings, and in the sounds' natural resonance, which was "tweaked" a little in Melodyne and some Kyma processes. My favorite was the Board sound, which comments and communicates through some translation system. I did a version of it with a plugin that emulates a CBM-64 computer's audio frame rate playback, and that turned out to be rather nice and a viable way to represent something that's just barely human.

In one piece, when time was running out schedule-wise, I created some background ambience in Logic using a temp voiceover: I removed all its sibilants, lowered it by about two octaves, added a sample-and-hold random LFO to the vocal transformer plugin to control the formant, and another S/H LFO to control the first LFO's rate. I then put the result through Zynaptiq Pitchmap and Wormhole, added a send with a random delay and another Pitchmap, and finally threw all of that into some drone diffusion oddball plugin.
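The nested sample-and-hold modulation Alanko describes -- one stepped random LFO whose stepping rate is itself driven by a second, slower stepped random LFO -- can be sketched outside any plugin. The following is a minimal conceptual illustration, not Logic's vocal transformer or Zynaptiq's processing; the sample rate, value ranges, and function names are all invented for the example.

```python
import random

def sh_lfo(rate_signal, sample_rate, rng):
    """Sample-and-hold LFO: holds a random value, stepping to a new one
    at intervals set by rate_signal (Hz), which may vary per sample."""
    out, held, remaining = [], rng.random(), 0.0
    for rate in rate_signal:
        if remaining <= 0.0:
            held = rng.random()                # step to a new held value
            remaining = 1.0 / max(rate, 1e-6)  # hold it for one period
        out.append(held)
        remaining -= 1.0 / sample_rate
    return out

sr = 100  # control-rate samples per second (an arbitrary choice)
rng = random.Random(42)

# Outer S/H LFO steps once per second; scale its output into a 2-10 Hz range.
outer = sh_lfo([1.0] * (sr * 4), sr, rng)
rates = [2.0 + 8.0 * v for v in outer]

# Inner S/H LFO: this would drive the formant control; its stepping rate
# follows the outer LFO, so the random steps speed up and slow down.
formant_control = sh_lfo(rates, sr, rng)
```

The result is a control signal whose randomness has randomness of its own, which is what gives the technique its "barely human," unpredictable quality.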

That felt like cheating since it was so easy. I think every sound designer and composer has their "visual plugin chain" in their head; when they get the idea, they know what to use to get a certain sound and effect. A little like "hearing" the sound of a Roland SH-101 just by looking at the front panel. You just need a few hours to really get to know your stuff, libraries, equipment, and plugins. I started when there were no MIDI interfaces in synths, so I've had my fair share of practice.

The dev team also said they wanted your score to mirror Control’s dynamic gameplay elements by evolving around the game’s moments of surrealism and fierce action. What went into achieving that on a technical level?

Well, there was another composer providing some material for the AI engine in Wwise, and quite a bit of mine went through it as well. What we basically did was provide the Wwise rule set with enough Lego bricks to give it some life and material to work with. Martin (Stig Andersen) actually described this really well in one of our making-ofs: a track was composed and then dissected into a myriad of pieces, which were fed into Wwise and put under some strict rules to maintain the dynamic form of the tracks, and to live and breathe according to the events triggered by the player.

For instance, during the action scenes, the meter of the playback changes from 11/8 to 13/8 to 3/8 or even less depending on the pace of the battle. The same process applies to the atmospheric material; the more intense the fight, the more prominent the 'gnarling and roaring' audio becomes. The in-game sounds never match exactly with the original piece, but that's fine because the original piece was there to provide, say, a picture for the Lego package sleeve -- or a serving suggestion on a meatball microwave dinner.
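The core idea -- game state driving both which music fragments play and how prominent the atmospheric layer is -- can be sketched as a tiny rule function. This is a hypothetical illustration, not Remedy's actual Wwise rule set; the intensity thresholds and the mapping to specific meters are invented for the example.

```python
def select_music_state(intensity):
    """Map combat intensity (0.0-1.0) to a (meter, ambience_gain) pair.

    The meters come from the interview (11/8, 13/8, 3/8); the thresholds
    pairing them with intensity levels are assumptions for illustration.
    """
    if intensity < 0.25:
        meter = (3, 8)     # sparse fragments when the battle is calm
    elif intensity < 0.6:
        meter = (11, 8)
    else:
        meter = (13, 8)    # denser odd meter at peak action
    ambience_gain = round(intensity, 2)  # the 'gnarling and roaring' grows with the fight
    return meter, ambience_gain
```

In a real middleware setup, a parameter like this intensity value would be posted to the audio engine each frame, and the rule set would handle fragment scheduling and crossfading on musically valid boundaries.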

Was there a specific track you found particularly difficult to nail down?

The latter pieces on the soundtrack were the heaviest I've written, both in sound and in process, despite their seemingly simple setup. I'm not too good at optimizing a Reaktor patch (not to mention Kyma sounds), so at some point my homebrew Reaktor instruments and ensembles were hogging processor cycles the way a Saturn V booster rocket ate mountains of fuel.

Nowadays I usually have three computers running, connected through VSL's Vienna Ensemble Pro 7, and one of my Mac Pros was dedicated to granular plugins only. I had made one in Reaktor which, with three instances, totally drained the trashcan Mac Pro. I wasn't able to commit anything to audio, as the cinematics were constantly evolving, and to save time I had to run it that way. Having my modular present in some cues didn't exactly help either, as interfacing it with the in-the-box setup could be a real challenge. I'm now planning to tone down the modular synth system. Sanity is a positive thing… time to simplify.

What did your studio setup look like on this project? What equipment and instruments were absolutely indispensable?

Well, there were the three Macs: an iMac Pro as the main machine, a Mac Pro providing the granulars and percussion, and a Mac Mini to run the occasional lighter soft synths -- Spire, Serum, the U-he stuff (which I love), and NI's FM8. I do use ES2 a lot, and Space Designer for the funny convolutions, but the remaining plugins differ from track to track. A lot of Kontakt and Reaktor stuff, and I probably have almost everything Spitfire Audio has ever made, although those weren't used that much in this project.

Phobos was around in some pads. On the hardware side, some of it has changed since Quantum Break, but the essentials are still the same: Prophet-6, Roland System-8 (I run its own engine or the JP-8 engine with the age parameter at max), V-Synth GT and XT (one is always doing a vocoder thing, the other is mostly a D-50), Moog One, Waldorf Quantum, JD-XA, Studio Electronics SE-1X, Roland JD-990 (which is used for two things only), and my Eurorack modular consisting of about 150 modules or so. Kyma (with Paca hardware) appears here and there, as does my lovely ARP 2600 remake, the TTSH. I cannot imagine parting with it; it's not subtle or polite, but what it does, it does so marvelously and with such perfect confidence that you just have to admire it. In the garage there are lots of piano leftovers and trashed pieces of plywood and other stuff I recorded for the project.

After Quantum Break, I sold a lot of gear, and haven't looked back. The MS-20 and ARP Odyssey had to go, as did the DX7 Centennial, and this and that; my Neve desk remains. I've always kept logs of gear usage, and if some piece hasn't gotten any use in six months, it goes on a warning list; sitting on the warning list for another six months means I'll sell that piece of gear. I've always been like that, and it seems to suit me really well. There are serious reasons why something gets no more studio hours -- just like there are serious reasons why someone becomes an ex, or an ex-friend, or ex-something.

Finally, looking more broadly at your entire career, is there any advice you'd pass on to other composers and musicians looking to venture into the world of game audio? 

Try getting a sound of your own, and hone your skills so you become confident in what you can and can't do. You need to be very persistent, and you need to take care of the business side (your agreements; not only fees, but everything connected to them, including how the works are presented in the media). A lot of stuff can be figured out by just thinking, "how can I explain to my parents what to do with this piece of music and how to relate to it?" If you can't explain it on those terms, you've probably gone wrong.

You also need to deliver. You need to keep to the budget and the schedule. You need to be both the suave lounge lizard and the just-barely-speaking studio nerd. You need to be able to put your music into words between the first and second floors in an elevator. The more crystal clear the concept (I'm not going to use the word "vision," as it doesn't really apply here), the easier your job is going to be, so work on your concept and communicate it to the audio director and the producers, and let them communicate their ideas to you. Also remember to ask for a story. A story for everything: a story for the protagonist and her teen years, a story for the good and the evil -- and why they're that way.

For me personally, the most important thing really is getting to know yourself, and learning to laugh at yourself. Try looking at everything you do through someone else's eyes, and you'll soon notice the "this happens because" issues. You need to be able to switch from a creative role to an analytical one in an instant, and that can't be done unless you're confident in yourself.

Oh, and also learn how to say 'no' to something that's not really your thing, because that will ultimately give you the chance to say 'yes' to other projects that are more suited to you.

About the Author(s)

Chris Kerr

News Editor, GameDeveloper.com

Game Developer news editor Chris Kerr is an award-winning journalist and reporter with over a decade of experience in the game industry. His byline has appeared in notable print and digital publications including Edge, Stuff, Wireframe, International Business Times, and PocketGamer.biz. Throughout his career, Chris has covered major industry events including GDC, PAX Australia, Gamescom, Paris Games Week, and Develop Brighton. He has featured on the judging panel at The Develop Star Awards on multiple occasions and appeared on BBC Radio 5 Live to discuss breaking news.
