Games can evoke a strong emotional response in people, but what if games could have an emotional response to people? There are techniques today that allow video games to respond to human emotions. In this article, I’ll discuss how games can be modified to respond to human emotion!
To help people learn Unity, Unity Technologies offers video tutorials, documentation, and training days at its Unite conferences. When Affectiva decided to enter the gaming industry, I looked through the games taught in those tutorials to see which would be the best candidate to emotion-enable. I settled on “Nightmares”, a game about a person having a nightmare: they find themselves in a magnified version of their room, and their toys attack them. I worked through the training-day videos and built the original game.
After we finished an alpha version of Affectiva’s Unity plugin, I used it to build an emotion-enabled version of “Nightmares”, which I call “EmoSurvival”. In the original game, the toys (ZomBears, ZomBunnies, and Hellephants) always walk straight at the player and attack. In EmoSurvival, the toys react to the player’s emotions. Negative emotions cause the toys to attack. Positive emotions (shown via the green “zen” bar at the bottom of the screen) cause the ZomBears and ZomBunnies to run away. Fear causes the Hellephants to say “I smell fear” and attack. This kind of enemy interaction is common in stealth games, so I also swapped the original loud shotgun firing sound for a laser sound that better fits the stealth genre.
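The toy behavior above boils down to a small decision rule. Here is a minimal Python sketch of that logic; the function name, emotion ranges, and thresholds are all illustrative assumptions, not Affectiva’s actual plugin API (in Unity this would live in a C# enemy-AI script fed by the plugin’s per-frame emotion scores):

```python
def choose_toy_action(toy_type, valence, fear):
    """Pick a toy's reaction to the player's emotional state.

    valence: -100..100 (negative emotion .. positive "zen")
    fear: 0..100
    All names, ranges, and thresholds are illustrative, not the
    actual Affectiva Unity plugin API.
    """
    if toy_type == "Hellephant" and fear > 50:
        return "taunt_and_attack"          # "I smell fear", then attack
    if valence > 25 and toy_type in ("ZomBear", "ZomBunny"):
        return "flee"                      # positive emotion scares small toys off
    if valence < -25:
        return "attack"                    # negative emotion provokes an attack
    return "wander"                        # neutral: no reaction
```

In a real game you would smooth the emotion scores over a few frames before applying thresholds like these, so a single noisy detector reading doesn’t flip a toy’s behavior.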
Another improvement over the original game: if the player turns away from the screen, the game pauses. This is a great feature for when a player is distracted; without it, the player would either have to find and activate a pause command or return to the game only to discover they had lost.
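One practical wrinkle is that face tracking drops out for a frame or two even when the player is still watching, so it helps to wait a short grace period before pausing. A minimal sketch of that idea, with hypothetical names (in Unity this would run each frame against the detector’s face-found flag):

```python
class AttentionPauser:
    """Pause the game when no face is visible for a short grace period.

    The grace period avoids pausing on momentary tracking dropouts.
    All names are hypothetical; this is a sketch of the idea, not
    the plugin's API.
    """

    def __init__(self, grace_seconds=1.0):
        self.grace_seconds = grace_seconds
        self.time_without_face = 0.0
        self.paused = False

    def update(self, face_detected, delta_time):
        """Call once per frame; returns whether the game should be paused."""
        if face_detected:
            self.time_without_face = 0.0
            self.paused = False
        else:
            self.time_without_face += delta_time
            if self.time_without_face >= self.grace_seconds:
                self.paused = True
        return self.paused
```

Tuning `grace_seconds` is a trade-off: too short and the game pauses on tracker noise; too long and an attacking toy gets a free hit before the pause kicks in.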
The original game was also much more brightly lit. I changed the lighting to match the player’s level of fear, so the game adapts its “scariness” to the player automatically. Many games let the player manually set the level of goriness, difficulty, or lighting; with emotion detection, games can adapt these settings automatically, creating a rich, individualized experience.
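At its simplest, fear-adaptive lighting is a clamped linear map from the fear score to a light intensity. Whether the scene gets darker or brighter as fear rises is a design choice; the sketch below assumes, for illustration, that higher fear dims the lights (a scarier scene), and the names and ranges are mine, not the plugin’s:

```python
def fear_to_intensity(fear, min_intensity=0.2, max_intensity=1.0):
    """Map a 0-100 fear score to a scene light intensity.

    Illustrative assumption: higher fear -> dimmer, scarier scene.
    In Unity you would assign the result to a Light's intensity,
    ideally smoothed across frames to avoid flicker.
    """
    fear = max(0.0, min(100.0, fear))          # clamp to the valid range
    span = max_intensity - min_intensity
    return max_intensity - (fear / 100.0) * span
```

Smoothing matters here more than for the enemy AI: light intensity changes are visible every frame, so most implementations would interpolate toward the target value rather than jump to it.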
This is just a sample of what emotion detection can add to video games. There are many other possibilities: for example, players meeting in a multiplayer game could see each other’s actual facial expressions mirrored on their avatars. Games with in-app purchases could estimate when a player is in the mood to buy, and how much they might be willing to spend.
You can visit our documentation, which includes a 30-minute video tutorial on how to emotion-enable a game; it picks up right where the Unity training-day videos finish. If you’ve never used Unity, you can learn to write the game with a day’s worth of video training, then learn to emotion-enable it in just 30 minutes. You can play more examples, including Nevermind, at GDC 2016, booth 2405.