Kinect, Anthropometry, and You

Nick Darnell, Blogger

November 18, 2011

[In this reprinted #altdevblogaday piece, Activate3D's Nick Darnell discusses how Kinect developers can use anthropometry to make UI interaction with the device work for most users.]

Anthropometry (from the Greek anthropos (άνθρωπος, "man") and metron (μέτρον, "measure"), literally "measurement of man") refers to the measurement of the human individual.

Ask any Kinect developer what the hardest problem is when developing a game or application that uses the Kinect – or any other depth camera – and the answer you'll get most often is creating something that works well for 95 percent of target users. This is something you have to consider for all of your gestures, poses, and UI interactions:

  • Is this gesture too difficult for your average user?

  • Does this pose require too much flexibility?

  • Is this UI interaction comfortable and easy to perform?

One area that can benefit from anthropometry data is UI interaction. When you think about UI interaction with Kinect, you've got to picture it as a real-world space (box, sphere, cylinder, or other volume) located somewhere around the body, and you're mapping the hand position in that space to the 2D or 3D UI coordinate plane.

Where Will They Be Most Comfortable?

When determining the size and location of this real-world space, and how it maps user hand locations onto the UI coordinate system, the largest question you need to consider is where users will be most comfortable. Generally speaking, you want a space that minimizes upper arm movement, as that is much more strenuous than forearm movement. However, since the 3D position we're mapping into our 2D/3D UI coordinate system varies with the size of the user's skeleton, we can't choose a single set of real-world dimensions that will work for everyone. We'll have to make educated guesses about the size and location of our real-world UI coordinate frame based on the size and location of the user's bones. (A rough sketch of this mapping appears at the end of the post.)

So How Does Anthropometry Fit In?

Because the skeleton you get from Kinect – and other SDKs – can be unreliable in certain poses, you often find yourself heavily filtering any kind of data you're tracking about the user, especially things like the user's arm length, which can vary dramatically over a session. So one thing I prefer to do is use anthropometry tables to derive a size and location that stay consistent and don't fluctuate as much as the user's skeleton. Using anthropometry tables, we can estimate the user's arm length or hand size based on other bones in their body – bones that are more stable in your skeleton SDK of choice (Kinect, OpenNI, Iisu, Omek, etc.). You can also use anthropometry tables to estimate the size of body parts that the skeleton SDK you're using doesn't provide at all, such as the size of the user's hand. (A second sketch at the end of the post illustrates this estimation.)

But Where Do You Find That Kind Of Anthropometry Data?

Luckily, such a resource has already been painstakingly cataloged for us by the FAA: the Human Factors Design Guide. The HFDG was put together so that aircraft could be designed so that almost anyone would fit in the seat and be able to operate everything from it. The anthropometry data that's valuable to us starts in chapter 14, page 791. For example, the lovely tables on page 818 show the functional reach and the extended functional reach of men and women, broken down by population percentiles.

[This piece was reprinted from #AltDevBlogADay, a shared blog initiative started by @mike_acton devoted to giving game developers of all disciplines a place to motivate each other to write regularly about their personal game development passions.]
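To make the interaction-space mapping concrete, here's a minimal C++ sketch of projecting a hand position inside a shoulder-anchored, arm-length-scaled box onto a normalized 2D UI coordinate system. The Vec2/Vec3 types, the shoulder anchor, and the box proportions are illustrative assumptions of mine, not part of the Kinect SDK or any other skeleton API.

```cpp
#include <algorithm>  // std::clamp (C++17)

struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

// Clamp v into [lo, hi] and normalize the result to [0, 1].
static float Normalize(float v, float lo, float hi) {
    return std::clamp((v - lo) / (hi - lo), 0.0f, 1.0f);
}

// Map a hand position inside a box anchored at the shoulder to 2D UI
// coordinates. Sizing the box from an arm-length estimate makes it
// scale with the user instead of using fixed real-world dimensions.
Vec2 HandToUI(const Vec3& hand, const Vec3& shoulder, float armLength) {
    // Illustrative proportions: a box smaller than the user's full
    // reach keeps the cursor controllable with mostly forearm movement.
    const float halfWidth  = 0.45f * armLength;
    const float halfHeight = 0.35f * armLength;

    Vec2 ui;
    ui.x = Normalize(hand.x, shoulder.x - halfWidth, shoulder.x + halfWidth);
    // UI coordinates usually grow downward, so flip the vertical axis.
    ui.y = 1.0f - Normalize(hand.y, shoulder.y - halfHeight,
                                    shoulder.y + halfHeight);
    return ui;
}
```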
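And here's an equally rough sketch of the stabilization idea: smooth a relatively stable measurement (standing height, in this case) across frames, then derive arm and hand length from it with proportional constants. The ratios below are placeholders for illustration only; real values would be looked up, by percentile, in the HFDG tables.

```cpp
struct BodyEstimates {
    float armLength;   // shoulder to fingertip, in meters
    float handLength;  // wrist to fingertip, in meters
};

// Exponential moving average: keeps the driving measurement from
// jittering frame to frame. alpha trades responsiveness for stability.
inline float Smooth(float previous, float sample, float alpha = 0.05f) {
    return previous + alpha * (sample - previous);
}

// Hypothetical proportional constants, for illustration only; in
// practice, derive these from the HFDG anthropometry tables.
inline BodyEstimates EstimateFromHeight(float heightMeters) {
    return BodyEstimates{ 0.44f * heightMeters, 0.11f * heightMeters };
}
```

Feeding HandToUI from EstimateFromHeight, rather than from the raw per-frame arm bones, gives an interaction space whose size doesn't jump around with tracking noise.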
