Valve's wearable computing ace discusses the challenges facing VR

"VR and AR aren't just a matter of putting a display an inch in front of each eye." Valve's Michael Abrash comments on the perceptual challenges facing wearable hardware like the Oculus Rift.

Kris Ligman, Blogger

May 16, 2013

Following up on his virtual reality talk at this year's Game Developers Conference, Michael Abrash has written a new blog post on some of the technical challenges in getting hardware like the Oculus Rift to feel "real" to the human brain.

There are three broad factors that affect how real – or unreal – virtual scenes seem to us, as I discussed in my GDC talk: tracking, latency, and the way in which the display interacts perceptually with the eye and the brain. Accurate tracking and low latency are required so that images can be drawn in the right place at the right time; I’ve previously talked about latency, and I’ll talk about tracking one of these days, but right now I’m going to treat latency and tracking as solved problems so we can peel the onion another layer and dive into the interaction of head mounted displays with the human visual system, and the perceptual effects thereof.
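To make the "right place at the right time" requirement concrete: a tracker samples head pose some milliseconds before the rendered frame's photons actually reach the eye, so during a head turn the image lands visibly behind the head's true orientation unless the renderer compensates. The sketch below is purely illustrative and is not from Abrash's post; all names are hypothetical. It shows the common pose-prediction idea of extrapolating the sampled pose forward by the expected motion-to-photon latency before rendering.

```cpp
// Illustrative sketch (not Abrash's code): hide tracking latency by
// rendering for where the head will be when the frame reaches the
// display, not where it was when the tracker sampled it.
// All names here are hypothetical.
#include <cstdio>

struct Pose {
    float yaw;      // radians
    float yawRate;  // radians per second, e.g. from the tracker's gyro
};

// Extrapolate the sampled pose forward by the expected motion-to-photon
// latency, so the rendered view matches head orientation at display time.
Pose predictPose(const Pose& sampled, float latencySeconds) {
    Pose predicted = sampled;
    predicted.yaw += sampled.yawRate * latencySeconds;
    return predicted;
}

int main() {
    // Head turning at ~100 deg/s (1.75 rad/s), 20 ms motion-to-photon latency.
    Pose sampled{0.0f, 1.75f};
    Pose predicted = predictPose(sampled, 0.020f);

    // Without prediction the image is drawn ~2 degrees behind the head's
    // true orientation -- easily perceptible as swimming or lag.
    std::printf("error without prediction: %.2f degrees\n",
                sampled.yawRate * 0.020f * 57.2958f);
    std::printf("predicted yaw: %.4f rad\n", predicted.yaw);
    return 0;
}
```

Real systems use richer motion models and sensor fusion rather than this linear extrapolation, but the arithmetic shows the scale of the problem: even a 20 ms pipeline puts the image roughly two degrees off during an ordinary head turn, an error the visual system readily notices.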

The article offers an in-depth discussion of the actual nuts and bolts of human visual perception and how head-mounted devices like the Oculus Rift need to address them. "More informally, you could think of this line of investigation as: 'Why VR and AR aren't just a matter of putting a display an inch in front of each eye and rendering images at the right time in the right place,'" Abrash writes. The post is the first in a series and a valuable read. Readers may also be interested in Abrash's 25-minute talk on the GDC Vault, or in the slides available from his blog.
