An Actual Heuristic Analysis of Mass Effect 2

My previous blog post seems to have been interpreted as a heuristic analysis. In this one I perform an actual heuristic analysis and offer up my own version of the scanning system.

For those of you who haven't read my previous post on the Mass Effect 2 planet scanning system, you can find it here.

In this one I will give a short heuristic analysis of the interface and offer up a prototype I developed with a colleague in two days.

For this analysis I will use a video found here.

First off, the good:

  •  Visibility of system status:

Just about everything about the scanning system is presented to the user: they can see how many resources they have, the location of the scanner, the probes they've put down, and the current scanning strength of each element.
  • Error prevention: 

The user can only scan or launch probes, and cannot move the scanner off of the planet, so there is no way to break it.
  • Recognition rather than recall:

The area to the right, where the scanning strength is displayed, lines up nicely with the element names below, so you don't have to remember which spike represents which element.
  • Feedback:

One of the most brilliant things about this interface is the sound: as your scanner gets closer to a resource node, the sound associated with that element increases in frequency. This creates a beautiful mapping between the frequency of a sound and the amount you will get from probing, and when coupled with the visual feedback on the right-hand side of the screen it makes for a nice multi-modal experience.
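To make the idea concrete, here is a minimal sketch of how such proximity-to-pitch feedback could be implemented. All names and value ranges are illustrative assumptions, not taken from the game's actual code:

```python
def proximity_pitch(distance, max_distance=100.0,
                    min_hz=220.0, max_hz=880.0):
    """Map scanner-to-node distance to a feedback tone frequency (Hz).

    Closer distances produce higher frequencies, so the player can
    home in on a resource node by ear alone. The distance units and
    frequency range here are hypothetical.
    """
    # Normalize distance to [0, 1], clamping anything out of range.
    t = max(0.0, min(distance / max_distance, 1.0))
    # Linear interpolation: distance 0 -> max_hz, max_distance -> min_hz.
    return max_hz - t * (max_hz - min_hz)
```

A game loop would call this each frame with the distance to the nearest node and feed the result to the audio engine's oscillator pitch.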

And now, the not so good:

  •  User control and Freedom: 

More a complaint about the entire game than about the scanning system: BioWare does not allow you to fully configure the keyboard layout. This also violates the conventions of first-person shooters on the PC.
  • Aesthetic and minimalist design:

The UI is incredibly crowded. It looks like two different teams designed it: the first one did the planet, and the second the scanning feedback.

The overall problem I found with this UI is that there are two places for the player to focus: the planet and the location of the scanner (important for deciding where to put down probes), and the right-hand side with the scanning feedback (important for judging how much of an element a probe will yield). The result is that I (I can't generalize, because I haven't run user studies with eye-tracking data) spent most of my time focusing on the scanning feedback while simply trying to feel out the location on the planet. This could be because I was at a PC rather than ten feet away from my screen.

I decided to build my own prototype to illustrate the problem with the UI, and did so with the help of a colleague. We started by combining the planet and the scanning feedback into a single location, changed the input system in accordance with my last blog post, and flattened the planet to make it easier to use with a mouse. We're still working on adding sound, but we've got most of the other functionality working.
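One standard way to "flatten" a planet for 2D mouse input is an equirectangular projection, where the horizontal axis maps to longitude and the vertical axis to latitude. The sketch below shows that mapping; the prototype's actual projection and coordinate conventions may differ, so treat this as an assumption:

```python
def flatten_to_latlon(x, y, map_width, map_height):
    """Convert a cursor position on a flattened (equirectangular)
    planet map back to latitude/longitude on the sphere, in degrees.

    (0, 0) is the top-left corner of the map image.
    """
    lon = (x / map_width) * 360.0 - 180.0   # left edge -180, right edge +180
    lat = 90.0 - (y / map_height) * 180.0   # top edge +90, bottom edge -90
    return lat, lon
```

With this, a probe launched at the mouse position resolves directly to a point on the sphere, with no need to rotate the planet under a fixed scanner.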

The prototype can be found here.


I'd like to thank EA and BioWare for making an incredible game, and Natalie Funk for helping me with the prototype.
