We commonly look to virtual reality as the next big technical frontier of video game development, but at least one company out there believes that current VR headsets don't go far enough.
Forget about tracking head movement -- as far as the folks at Fove are concerned, to make great VR games you've got to go right for the eyes.
Backed by the London branch of Microsoft Ventures, the startup was co-founded by a hardware engineer and a former Sony Computer Entertainment Japan producer to build a headset with built-in infrared sensors that track eye movement as game input.
There's a lot about the (now successfully Kickstarted) headset that's still up in the air as the team works out the realities of production, though prototypes have already been used to help physically challenged children play piano without using their hands. You can find a detailed breakdown of the technology over on the Fove website.
You'll also find some brief nods to the team's work building eye-tracking plugins for popular game engines like Unity and Unreal, a topic we sought to shed more light on in the following brief email conversation with Fove co-founder and former SCEJ producer Yuka Kojima (pictured, with Fove co-founder Lochlainn Wilson).
Let's talk briefly about Fove's origin. You started developing the hardware part-time while working at Sony Japan; what specific projects inspired you to dig deeper into eye-tracking tech?
Kojima: Lochlainn [Wilson] did all the hardware development, whilst I conceptualized the interaction ideas. I was working on an unreleased PS Vita project when the idea of using eye tracking in games occurred to me. Fove was born from a discussion between Lochlainn and me about how best to realize this kind of interaction in modern games.
Can you offer some examples of how eye-tracking tech changes the game design process? Should developers interested in this tech take into consideration things like eye fatigue, for example?
Fatigue is generally not a problem with eye motions. Basically, we are always looking around, whether intentionally or not; it is very difficult to truly fatigue the eyes in the sense of making them move too much.
Our first cyclops demo is a very unnatural experience designed to highlight the power of eye tracking in the clearest terms possible. A real VR experience would be more subtle, such as a character recognizing what you are looking at and responding, or adapting their story. You could do things ranging from having enemies that possess you if you look at them, to the more subtle detection of how alert or sleepy you are.
We believe that eye tracking opens the window to the player's soul in VR, and can make experiences more engaging and believable.
So what challenges did you face in making Fove play nicely with Unity, Unreal Engine, and other established game dev tech?
Fove is designed to play nicely with those game engines. There aren't many issues with that right now; we have plugins that just work.
Fair enough. You seem eager to encourage developers to make VR games for Fove (see pitch video embedded above), but there's already a bumper crop of VR platforms to choose from. Why should game makers spend time/money to bring their games to Fove?
We do not believe that the current generation of VR has really nailed it. We think that game developers will see what we have got and push for this technology to be in more headsets.
Do you have any plans to license Fove's eye-tracking technology out to other hardware companies?
We will of course consider licensing our technology as we want to make sure it's in as many headsets as possible. We strongly believe that eye tracking is a key technology for the future of VR.