Sony's EyeToy project started out as a prototype visual controller that used a USB camera to achieve controller-free interaction with the PlayStation 2; at launch, the camera was bundled with the game EyeToy: Play.
Sales to date have been strong: more than 10.5 million PS2 EyeToy cameras have been sold worldwide.
With the PlayStation 2 retaining a significant user base while the PS3 ramps up more slowly, Sony has strong incentives to continue supporting PS2 peripherals like the EyeToy.
At the recent GameCity event in Nottingham, Sony's Sandy Spangler gave a presentation on the history of the EyeToy, discussing how software development for the peripheral was originally aimed at putting the player in the game world rather than making use of physical props.
The Early Days
Interestingly, as first- and third-party Wii peripherals have gained acceptance, the current crop of EyeToy releases will make use of physical props such as pom poms and swords after all, as with the next EyeToy title, Pom Pom Party, to allow for a more first-person gameplay experience.
For earlier EyeToy development, the goal was to allow players to use their bodies to interact with both game interfaces and gameplay modes.
EyeToy's developers initially chose a live camera representation of the player primarily for accessibility: testing indicated that people immediately grasped the controls and interface when they could see themselves and receive immediate feedback from their on-screen actions.
The results were consistent across gender and age groups, and were generally positive for viewers of the interaction as well as for the participants. Sony would like to claim that, at the time, the EyeToy created a new genre of gameplay, coined "physical and social gaming."
The GameCity presentation also highlighted the work of Dr. Richard Marks of Sony Computer Entertainment America. He grew up selling video games at retail, and carried his interest in gaming through his doctoral research at Stanford, where he used video interfaces to control real-time interaction with robotics systems.
Dr. Marks met with Phil Harrison at the 1999 Game Developers Conference, and the two determined that the PlayStation 2 hardware spec was a good match to allow for real-time video game processing.
Spangler showed early demo video recordings of EyeToy prototypes, beginning with a very early video test called the "Spiders" demo. This showed Dr. Marks using his body to interact with asterisks projected onto a three-dimensional virtual representation of his head.
The test exposed some faults, which he soon incorporated as features: he was able to drag and move clumps of these asterisks from his face to his hands, an aggregate effect reminiscent of spiders clinging to his body.
Other tests showed Dr. Marks and his son interacting with full-screen water surface distortion effects.
These tests were followed up with the use of physical props and visual effects such as flame effects. Other demonstrations included color tracking with props, such as wands with magenta and green balls stuck on the ends.
The first showed flocking effects using virtual butterflies that would be attracted to a green ball, which Dr. Marks' son moved around the screen. The second showed a butterfly crawling around the back surface of a magenta ball, using the color as a depth mask for the computer-generated butterfly.
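Color tracking of this kind can be sketched very simply: find all pixels close to the prop's known color, and take their centroid as the prop's on-screen position. The function below is an illustrative approximation of the idea, not the actual EyeToy tracker; the target color and tolerance are assumptions.

```python
# Illustrative sketch of color-blob tracking (not Sony's actual code):
# collect pixels near a target color and return their centroid as the
# tracked prop's screen position.

def track_color(frame, width, target=(0, 255, 0), tol=60):
    """Return the (x, y) centroid of pixels near `target`, or None.

    frame: flat list of (r, g, b) tuples, row-major.
    width: frame width in pixels.
    """
    xs, ys = [], []
    for i, (r, g, b) in enumerate(frame):
        # A pixel "matches" if every channel is within `tol` of the target.
        if (abs(r - target[0]) <= tol and
                abs(g - target[1]) <= tol and
                abs(b - target[2]) <= tol):
            xs.append(i % width)
            ys.append(i // width)
    if not xs:
        return None  # prop not visible this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

In practice a tracker would work in a more lighting-tolerant color space than raw RGB, but the principle is the same: a distinctively colored prop gives the system one unambiguous point to follow.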
Working With A Novel Interface
Simply put, the EyeToy is a USB camera that feeds real-time video to the PS2 for both display and processing. The resolution of the live feed shown to the player is adjusted in real time according to the processor load required to calculate the interaction, and the system continuously compares changes in pixel information between frames to create a feedback mechanism.
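The frame-comparison step described above can be sketched as simple frame differencing: compare consecutive grayscale frames pixel by pixel, and flag grid cells where enough pixels changed. This is a minimal illustrative sketch of the general technique, not Sony's implementation; the cell size and thresholds are assumptions.

```python
# Illustrative frame-differencing sketch: detect "motion" by comparing two
# consecutive grayscale frames and flagging grid cells whose pixels changed.

def motion_cells(prev, curr, width, height, cell=4, threshold=40, min_changed=3):
    """Return the set of (cx, cy) grid cells where motion was detected.

    prev, curr: flat lists of grayscale pixel values (0-255), row-major.
    cell: side length of each square grid cell, in pixels.
    threshold: per-pixel change needed to count a pixel as "changed".
    min_changed: changed-pixel count needed to flag a whole cell.
    """
    active = set()
    for cy in range(height // cell):
        for cx in range(width // cell):
            changed = 0
            for dy in range(cell):
                for dx in range(cell):
                    i = (cy * cell + dy) * width + (cx * cell + dx)
                    if abs(curr[i] - prev[i]) > threshold:
                        changed += 1
            if changed >= min_changed:
                active.add((cx, cy))
    return active
```

A game built on this mechanism simply maps active cells onto on-screen hotspots: wave a hand over a button's cells, and the button registers a "hit."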
Very quickly, the Sony team learned that the best approach to developing games for this novel interface was to devise simple, practical physical mechanics rather than traditional game designs, and then to build mini-games around the most effective mechanics.
The physical interactions were put through rigorous development and testing to determine their robustness, repeatability and fun across the widest range of real-world environments and lighting conditions.
Some of the most stable mechanics translated into rather effective gameplay; examples include the "hit-," "rhythm-," and "wipe-based" interactions exemplified by the EyeToy: Play mini-games Kung Foo, Beat Freak, and Wishi Washi, respectively.
For EyeToy on the PS2, Sony plans to release two new titles for the UK holiday season: SCEE London Studios' EyeToy Play: Pom Pom Party and Zoë Mode's EyeToy: Hero. Both make use of physical props: a set of magenta and green pom poms, and bright green foam swords, respectively.
The use of these props allows for richer interactions. For example, EyeToy Play: Pom Pom Party supports position-based tracking of the accessories, which can sense whether a player's arms are crossed in relation to their body.
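With two distinctly colored props tracked independently, a check like arm-crossing reduces to comparing horizontal positions. The sketch below is hypothetical, not the game's actual logic; the function name, the hand assignment, and the assumption of a mirrored camera view are all illustrative.

```python
# Hypothetical sketch of an arm-crossing check built on two-color prop
# tracking: compare the horizontal screen positions of the two pom poms.

def arms_crossed(magenta_pos, green_pos, magenta_hand="right"):
    """Report whether the player's arms appear crossed.

    magenta_pos, green_pos: (x, y) tracked screen positions, or None if
    a prop is not currently visible.
    magenta_hand: which hand holds the magenta pom pom ("left" or "right").
    Assumes a mirrored camera view, so the right-hand prop normally sits
    on the right side of the screen; crossing swaps the two x positions.
    """
    if magenta_pos is None or green_pos is None:
        return False  # can't decide without both props in view
    mx, gx = magenta_pos[0], green_pos[0]
    if magenta_hand == "right":
        return mx < gx  # right-hand prop has moved left of the left-hand prop
    return gx < mx
```

The same two tracked points also give relative distance and height, which is enough signal to drive pom-pom choreography scoring.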
In EyeToy: Hero, prop tracking let the developer present virtual environments from a first-person perspective, something the purely body-tracking-based system did not permit.
With an 8-year release history and 10.5 million cameras worldwide, combined with the huge installed base of the PlayStation 2, Sony is further demonstrating its support of the platform with the upcoming release of these two titles.
As part of Sony's contribution to GameCity and Nottingham-Trent University's Videogame Archive project, the Sony team donated the very first EyeToy production prototype to the archive's permanent collection.