
Alt.Ctrl.GDC Showcase: Student project Emotional Fugitive Detector

Emotions make people dangerous in the world of Emotional Fugitive Detector. A machine is designed to scan the player's face and look for them, failing players who are a little too expressive.

Joel Couture, Contributor

February 16, 2017


The 2017 Game Developers Conference will feature an exhibition called Alt.Ctrl.GDC, dedicated to games that use alternative control schemes and interactions. Gamasutra will be talking to the developers of each of the games that have been selected for the showcase. You can find all of the interviews here.

Emotions make people dangerous in the world of Emotional Fugitive Detector. As such, a machine has been designed to scan the player's face and look for any emotions, failing players should they be a little too expressive.

However, Emotional Fugitive Detector is a game for two, tasking players with sticking their heads inside the booth: one player tries to figure out which emotion the other is expressing, while the expressing player must avoid being so overt that the computer figures out which expression they're making.

Built to hold the player's face in place and examine it with face-tracking technology, Emotional Fugitive Detector was created by a group of students from the NYU Game Center, who pooled their diverse skills to form this interesting project where the only input is trying to offer no input at all.

Gamasutra spoke with the developers of this future Alt.Ctrl.GDC exhibit about the unexplored fun of non-input as input, and the way they turned a technical challenge into great design with the emotion-tracking game.

What’s your name, and what was your role on this project?

I'm Alexander King, and my co-creators are Sam Von Ehren and Noca Wu. The core design was a collaborative process between the three of us. Most of the technical work, both programming and hardware, was done by Sam. The digital visual assets and fabrication of the robot itself were done by Noca. Additionally, I did the audio, and I've also been handling the project logistics.

How do you describe your innovative controller to someone who’s completely unfamiliar with it?

Sometimes we say it's like a game of emotional charades. It's a two-player cooperative game where the players are trying to outwit a malevolent computer. One player is using only their face to get their partner to guess an emotion, but without the computer detecting that emotion before their partner does.

What's your background in making games?

The three of us are part of the MFA program at the NYU Game Center, and we all have very different backgrounds. I was an analytics consultant for a number of years. My real passion was always for games, so I decided to switch careers and put all that Excel experience to better use. Sam studied computer science in his undergrad and got his first exposure to game development working in QA. That's what convinced him he had to make games, and he's worked on small game projects for the past few years.

Noca was a graphic novelist before joining the program, and was drawn to games by interactive storytelling and being interested in narratives where the player makes their own choices and forms their own stories. So, we all bring different things to the table! It's one of the greatest things about studying at the Game Center.

What development tools did you use to build Emotional Fugitive Detector?

In terms of software, the game is built in JavaScript using Node.js. We used an open-source face-tracking library called clmtrackr, which uses a technique called constrained local models to track facial features and identify emotions. Lastly, we're using an Arduino Flora to process the button inputs from the second player and control NeoPixels to show how many lives the humans have left.
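To make that pipeline concrete, here is a minimal sketch of how clmtrackr's landmark tracking and its example emotion classifier can be wired together in a browser context. This is an illustration of the library's documented usage rather than the team's actual code; the video element ID, the detection threshold, and the loop structure are assumptions.

```javascript
// Minimal sketch, assuming a browser page with clmtrackr and its bundled
// example emotion classifier (emotionclassifier.js / emotionmodel.js) loaded.
const video = document.getElementById('webcam'); // hypothetical element ID

// Constrained-local-model tracker fitted to the webcam feed.
const ctrack = new clm.tracker();
ctrack.init();        // uses clmtrackr's default face model
ctrack.start(video);

// The example classifier maps the tracker's model parameters to emotion scores.
const classifier = new emotionClassifier();
classifier.init(emotionModel);

function scanFace() {
  const params = ctrack.getCurrentParameters();
  if (params) {
    const emotions = classifier.meanPredict(params);
    if (emotions) {
      // emotions is e.g. [{emotion: 'happy', value: 0.82}, ...];
      // flag the player if any score crosses a threshold (0.8 is an assumption).
      const caught = emotions.find(e => e.value > 0.8);
      if (caught) console.log('Detected:', caught.emotion);
    }
  }
  requestAnimationFrame(scanFace);
}
scanFace();
```

In a setup like this, the Node.js side would mainly coordinate game state with the Arduino Flora handling the second player's buttons and the NeoPixel life counter.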

What physical materials did you use to make it?

We used lasercut acrylic for some of the detail pieces, but the robot's case was primarily made from foamcore boards with a gloss paper exterior. It's actually twelve separate pieces currently. Noca used traditional joinery techniques in the design, so the whole robot slots together without tape or nails. That made it very portable and easy to assemble! We're making a new case for GDC though, something extra sturdy since there'll be so many people.

How much time have you spent working on the game?

It's hard to say exactly because we spent a lot of time iterating and playtesting the core idea. We worked intermittently over last summer, then more heavily in the fall. We were testing new implementations of the idea almost every week at Playtest Thursdays, trying different things and tweaking the design until we hit on the game loop we have now.

How did you come up with the concept?

While we experimented a great deal to reach the final design, we knew we wanted to make a game using face reading as a mechanic. We thought it hadn't been explored too much in games before. So much of our mental hardware is dedicated to reading faces and making expressions. Humans are wonderfully emotive, it's why we see faces in things so easily - it's what we're wired for. 

Our game plays exactly to that very human skillset. By needing to make an expression that's obvious enough for your partner to detect, but too subtle for the computer, it asks us to exercise our emotional reading and expressing abilities. Even though we use those skills all day every day without thinking much about it, we rarely get to "test" them, and when we do, it's usually framed as deception (like bluffing games).

Emotional Fugitive Detector seeks to have the players fool a computer, but not each other, with their facial expressions. Why did you want to explore this difficult middle ground?

We got there through experimentation, actually. Earlier versions of the game had players competing in various ways to be read correctly or incorrectly by the face-scanning computer. These prototypes tended to be either too difficult or too easy though, because no matter how much tuning we did, the algorithms aren't 100% accurate at reading people. Matt Parker, one of our faculty members who's done a lot of hardware hacking, suggested leaning into the fact that the computer can't always detect an emotion, and turning the bug into a feature! Trying out those ideas eventually brought us to the current design.

What challenges did you face in creating a game that recognized the nuances of human emotion through facial expression?

The implementation challenges were pretty immense. None of us had done a physical game like this before, so debugging hardware was a new experience; suddenly we had to figure out whether the code was wrong or a wire was misplaced. But optimizing the face-reading was the biggest challenge. Lighting affects it a lot, and it can break completely if the player turns their head. That's what led to the "evaluation aperture" design; it was inspired by boardwalk cutouts, and it's actually just there to keep the player from moving their head around!
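As one illustration of the kind of tuning involved (an assumption on my part, not a description of the team's actual code): clmtrackr exposes a fitting score for the current frame, and gating the emotion check on that score is one way to avoid spurious readings when the head turns or the lighting shifts.

```javascript
// Illustrative only: gate emotion detection on clmtrackr's model-fit score,
// so turned heads or bad lighting don't produce garbage detections.
const MIN_FIT_SCORE = 0.5; // assumed threshold; would need tuning per booth and lighting

function readEmotionIfStable(ctrack, classifier) {
  // getScore() reports how well the constrained local model fit the last frame;
  // low scores typically mean the face is turned away or poorly lit.
  if (ctrack.getScore() < MIN_FIT_SCORE) return null;

  const params = ctrack.getCurrentParameters();
  return params ? classifier.meanPredict(params) : null;
}
```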

Why create a controller when the goal is to offer no input? What thoughts go into creating a controller with that goal?

That's an area you can explore with digital-analog hybrid games. We're so used to computers reacting perfectly to our inputs; controllers exist to detect our movements and then react. Inverting that relationship opens up a very different design space. Here, it's really your face that's the controller: that's what you use to play the game. And because we're so expressive, not broadcasting with that controller is the hard (and fun) part.

How do you think standard interfaces and controllers will change over the next five or ten years?

I think the pendulum is swinging towards simpler interfaces. It was a revelation I had while playtesting another game, just how byzantine standard controllers are for most people. If you're not accustomed to it, the modern console controller is this baffling object bristling with buttons, pads, and sticks. Especially as game designers, we take for granted the many, many hours of acclimation it takes to fluently use such a complicated input device. But a complex input device doesn't, in and of itself, produce a complex game system. I think simpler controllers, be they touch controls or simplified gamepads like the Nintendo Switch's, are going to be the way forward.
