I am excited to announce the release of my latest game. TETBeams is an experimental target-shooting game that uses an eye-controlled interface. It was developed in Unity with C# and makes use of eye-tracking hardware by The Eye Tribe.
The basic concept behind TETBeams is that the player's eyes hold a massive amount of energy that can be released on command. Think of the Marvel Comics superhero Cyclops. As long as the player's eyes are shielded, their energy recharges. Once the eyes are uncovered, energy beams are released. The challenge for the player comes in the form of moving targets. If the player can unleash enough energy on a target, it explodes! The goal is to destroy as many targets as possible.
"Whatever he looks at, he destroys"
The eye-tracking hardware keeps an updated record of where the player is looking at any given time. During the game, the player's eyes are used to aim at moving targets. Traditionally, aiming might have been accomplished by moving a mouse cursor to wherever the player wanted to fire. Instead, the player simply looks at the desired location on the screen. Only a single button is used in the game (the left mouse button). Holding the button fires the player's energy at unsuspecting targets. Releasing the button allows the player's energy to recharge.
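The hold-to-fire, release-to-recharge loop described above can be sketched in Unity C#. This is a minimal illustration, not the actual TETBeams code: it assumes The Eye Tribe's C# SDK (`TETCSharpClient`), whose `GazeManager` singleton delivers `GazeData` to registered listeners, and all field names, rates, and the `TakeDamage` message are hypothetical.

```csharp
using UnityEngine;
using TETCSharpClient;
using TETCSharpClient.Data;

// Illustrative sketch only: energy drains while the left mouse button is held
// (firing a beam at the gaze point) and recharges while it is released.
public class BeamController : MonoBehaviour, IGazeListener
{
    public float rechargeRate = 20f;   // energy gained per second while shielded
    public float drainRate = 40f;      // energy spent per second while firing
    public float maxEnergy = 100f;

    private float energy;
    private Vector2 gazePoint;         // latest gaze position in screen coordinates

    void Start()
    {
        energy = maxEnergy;
        GazeManager.Instance.AddGazeListener(this);   // receive gaze updates
    }

    // Called by the SDK whenever fresh gaze data arrives.
    // Note: the tracker's screen coordinates use a top-left origin, while
    // Unity's screen space uses bottom-left, so Y may need flipping.
    public void OnGazeUpdate(GazeData gazeData)
    {
        gazePoint = new Vector2((float)gazeData.SmoothedCoordinates.X,
                                (float)gazeData.SmoothedCoordinates.Y);
    }

    void Update()
    {
        bool firing = Input.GetMouseButton(0) && energy > 0f;
        if (firing)
        {
            energy = Mathf.Max(0f, energy - drainRate * Time.deltaTime);
            FireBeamAt(gazePoint);
        }
        else
        {
            energy = Mathf.Min(maxEnergy, energy + rechargeRate * Time.deltaTime);
        }
    }

    void FireBeamAt(Vector2 screenPoint)
    {
        // Raycast from the camera through the gaze point; damage any target hit.
        Ray ray = Camera.main.ScreenPointToRay(screenPoint);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit))
            hit.collider.SendMessage("TakeDamage", drainRate * Time.deltaTime,
                                     SendMessageOptions.DontRequireReceiver);
    }
}
```

Treating gaze purely as a pointer and leaving the trigger on a physical button is a common eye-control design choice: it avoids the "Midas touch" problem of firing at everything the player merely glances at.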
In TETBeams, the player looks at a screen location to fire energy beams
The full source code, assets, Unity project, and version 1.0 Windows executable are available on GitHub. All are welcome to make use of this project as-is, with modification, or for redistribution. See the included license file for details.
Today, my excitement surrounding compact, affordable eye-tracking hardware is similar to the excitement touchscreens generated a few years back. Both introduced new, mass-market forms of human-computer interaction. Once again, new avenues for game design innovation have opened. TETBeams is just one demonstration of how interfaces can be simplified and engagement amplified through eye-controlled technology. There is much to learn about the design of eye-controlled interfaces for games, and an urgent need to explore this territory. Although using one's eyes may seem like an obscure way to control software today, it will likely be commonplace within a few years. Imagine a developer making a simple call to the eye-control features of a mobile device, much like we do with cameras, gesture recognition, and accelerometers today.