At a GDC 2008 lecture attended by Gamasutra, Emotiv's Julian Wixson revealed the full SDK for the company's 'mind sensing' headset, from detection of blinking and facial movements to sensing of 'active intent' directly from the brain.

eric-jon waugh, Blogger

February 22, 2008

On Thursday at the 2008 Game Developers Conference, Julian Wixson and a small panel of associates described and demonstrated the Emotiv headset and SDK, suggesting how a developer might incorporate the technology into a new or even nearly finished production.

The svelte Emotiv headset uses an array of sixteen EEG sensors to detect electrical impulses at the scalp. These signals are then interpreted by a suite of tools, each with its own range of applications. The "Expressiv" application identifies and interprets facial expressions; one of Wixson's associates demonstrated winking, blinking, and an unnerving grin, each of which was replicated on a rough facial model. Another application, called "Affectiv", recognizes emotional states. The most substantial and interesting application is the most active one, "Cognitiv", which "classifies conscious active intent". That is to say, it interprets what the wearer wants to do, allowing a player to execute specific commands and actions through thought alone.

Origins

Emotiv began about four years ago in Sydney, Australia, as a medical research company focused on identifying emotional states in the brain. After some development, however, the company shifted its focus to consumer hardware. The team chose EEG over other available brain-monitoring methods because it is the cheapest option; it is by far the easiest to use, requiring only about a minute of setup; it is small and lightweight; and it is a completely passive, non-invasive monitor of the body's "free" signals.

Touchy Feely

Some of the "Expressiv" signals, like blinking, are detected as binary (either they're on or they're off); most others, such as mouth movement, are progressive, interpreting a wide range of expression. Suggested uses include MMORPGs and other avatar-based situations, to enhance communication.
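The binary/progressive split described above can be sketched as a simple event consumer. This is an invented illustration, not Emotiv's actual SDK: the event shape, channel names, and avatar structure are all hypothetical.

```python
# Hypothetical sketch of consuming Expressiv-style events; names and API
# shapes are invented for illustration, not taken from Emotiv's SDK.
from dataclasses import dataclass

@dataclass
class ExpressivEvent:
    channel: str      # e.g. "blink", "wink_left", "smile"
    is_binary: bool   # binary channels are on/off; progressive ones carry intensity
    value: float      # 0.0 or 1.0 for binary; anywhere in 0.0-1.0 for progressive

def apply_to_avatar(event: ExpressivEvent, avatar: dict) -> None:
    """Drive a (toy) avatar's expression channels from one headset event."""
    if event.is_binary:
        avatar[event.channel] = event.value >= 0.5   # simple on/off
    else:
        avatar[event.channel] = event.value          # full range of expression

avatar = {}
apply_to_avatar(ExpressivEvent("blink", True, 1.0), avatar)
apply_to_avatar(ExpressivEvent("smile", False, 0.6), avatar)
```

The point of the split is that a blink maps cleanly to a toggled blend shape, while a smile should scale smoothly with the detected intensity.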
NPCs in a standard RPG setting, however, could also react differently depending on the player's decorum (in place of overt dialog branching).

Among uses in development for "Affectiv" is a frustration detector, which could drive dynamic difficulty adjustment. The player's emotional state might also shape aspects of the game, such as the musical score or graphical flourishes. On the more practical end, this application would be useful in monitoring play testers, offering quantitative user data. A more deliberate use might involve mental challenges, such as remaining calm while fighting, or meditating to generate magic.

The "Fun Part"

"Cognitiv" is the most complex application, and also the most flexible. As detecting active intent is quite a different thing from reading passive and more or less universal brain functions, both the software and the player need a bit of training and adaptation to produce much of a constructive result. The training phase consists of the player being asked to imagine performing a particular action – such as pushing a cube into the screen – for a certain number of seconds. Depending on the player's focus, this might at first take more than one attempt. As the player becomes more adept, actions become easier and more natural, and soon the player can begin to juggle multiple thoughts at once. Some suggested uses include telekinesis, disintegration, and "social manipulation" ("I am the Master, and you will obey me!"). The Emotiv team stresses that it does not envision the headset as a replacement for traditional input so much as an additional layer on top of familiar control schemes.

Some Logistics

Any "Cognitiv" actions should probably be thought of as sophisticated, unconventional "super moves" rather than mere strings of button presses.
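The "super move" framing can be sketched as a small dispatcher: a trained thought, once detected with enough confidence, fires an abstract scripted action rather than a keypress. Everything here is hypothetical — the action names, confidence values, and threshold are invented for illustration, not part of Emotiv's SDK.

```python
# Hypothetical sketch: routing Cognitiv-style "active intent" detections
# to scripted game actions. All names are invented for illustration.
CONFIDENCE_THRESHOLD = 0.7  # below this, treat the detection as noise

# Each trained thought maps to an abstract scripted sequence, not a button string.
ACTION_SCRIPTS = {
    "push": lambda target: f"telekinesis: hurl {target}",
    "dissolve": lambda target: f"disintegrate {target}",
}

def dispatch_intent(action, confidence, target):
    """Fire the scripted 'super move' only for confident detections."""
    script = ACTION_SCRIPTS.get(action)
    if script is None or confidence < CONFIDENCE_THRESHOLD:
        return None
    return script(target)

print(dispatch_intent("push", 0.9, "crate"))  # confident detection fires the script
print(dispatch_intent("push", 0.4, "crate"))  # weak signal is ignored
```

Thresholding on confidence is one plausible way to reconcile the training period the panel described with gameplay: early on, a player's weak or inconsistent signals simply fail to trigger anything.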
That said, there are various ways of interpreting and integrating actions, from direct interaction (think the bat in Wii Sports) to various scripts (specific thoughts triggering abstract sequences of action). One curious function, "Emokey", allows direct mapping of thoughts to keystrokes. This can also apply to legacy applications; the team discussed their experiences playing BioShock with the headset.

It is up to developers to choose the depth of integration, from a surface-level key binding, through some compromises with certain default settings, all the way to a hardcore, full integration that incorporates training into the game design. Any training might be subtly, even subversively, integrated into the tutorial portion of a game. Since people change over time, it might also be wise to allow the player to retrain and "refocus" at will.

"We expect to be amazed," engineer James Wright said, referencing the imagination vacuum in some Wii and DS software. "I will be very disappointed if the only ideas [developers come up with] are ones we've talked about here in this session."
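The Emokey idea — binding detected thoughts directly to keystrokes so a legacy game needs no integration work at all — can be sketched roughly as follows. The event names, bindings, and translation function are placeholders invented for this sketch, not Emotiv's actual API.

```python
# Rough sketch of an Emokey-style layer: detected mental events are translated
# into ordinary keystrokes, so an unmodified game sees normal keyboard input.
# Event names and bindings are invented for illustration.
keymap = {
    "push": "e",       # e.g. a "push" thought bound to an ability key
    "lift": "space",
    "calm": "shift",
}

def keys_to_send(events, keymap):
    """Translate a stream of detected events into the keystrokes to emit."""
    return [keymap[e] for e in events if e in keymap]

# An unbound event ("blink") simply passes through with no keystroke.
print(keys_to_send(["push", "blink", "lift"], keymap))  # ['e', 'space']
```

This is the shallowest end of the integration spectrum the panel described: the game itself is unaware of the headset, which is what made experiments with an off-the-shelf title like BioShock possible.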
