Anna Marsh of Lady Shotgun Games had some 15 years of game industry experience behind her before going indie, and her studio recently released Buddha Finger for iOS -- an excellent martial arts-oriented experience using touch. At the Develop Conference in Brighton, she spoke about design for touch screens.
As touchscreens have come into their own as a form of game control, Marsh says they've proven they're not a "fad or a sideshow to the real gaming industry, but a very serious platform [with] a massive install userbase that gives us access to hundreds of millions of players, so they need serious consideration." But interest goes beyond the numbers -- what interests Marsh isn't narrative, story or cool weapons, but what it feels like to play.
"Unless I can imagine what a game is going to feel like, I can't work on it," she says. "Game design is like sex: It's right when it feels good."
Game feel, however, is less "sexy" to talk about than story or cool weapons, and people commonly think that game feel is arbitrary or accidental. "We use the same word in English for our emotions as we do for tactile things... what makes 'feel' happens at a subconscious level." Game feel involves not only high-level inputs, but subconscious sensory processing.
"The difficulty of that is that the subconscious is hard to talk about," she says. "It's happening at a level we're not consciously comprehending."
Controllers vs. touchscreens
A controller for a game is often a space between the player and the game, the interface for their connection with the experience. It's actually a learned convention -- for evidence of this, look no further than how a non-gamer attempts to interact with a controller when they're unfamiliar. Screens, on the other hand, are immediate to the brain.
"It's really important that this shortening of the circuit, there, is happening in our brains," Marsh says. "You can directly touch the objects in the game, so it's very easy hand-eye coordination... it radically reduces the learning curve players have to make when they pick up a game."
Those who are accustomed to designing for controllers find touchscreens limited -- yet the array of buttons on a controller all do really similar things. Aside from the link between a trigger button and a gun, most of the interaction comes from arbitrary meanings mapped onto the physical device. On a touchscreen, the operating system can track the beginning and end of a touch and whether it's stationary or moving, as well as gestures like tapping, pinching, swiping or rotating.
"You can combine those to make a really rich set of actions, where you have meaning in the actions themselves," Marsh says. "I feel that the perceived limitations come from imitating controllers... whereas if you start to use the touchscreen as a touchscreen, it becomes very versatile."
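The begin/move/end phases and gestures Marsh describes can be combined into simple gesture recognition. Below is a minimal sketch in plain Python rather than any platform API, classifying one- and two-finger touches from their begin and end positions; all thresholds and names are illustrative, not real platform values.

```python
import math

# Sketch: classify raw touch phases into the gestures Marsh lists.
# Touch positions are (x, y) points; thresholds are illustrative.

TAP_SLOP = 10.0  # max movement (points) still counted as a tap

def classify(begin, end):
    """Classify a one- or two-finger touch from begin/end positions.

    begin, end: dicts mapping finger_id -> (x, y).
    Returns 'tap', 'swipe', 'pinch', or 'rotate'.
    """
    if len(begin) == 1:
        (fid,) = begin
        bx, by = begin[fid]
        ex, ey = end[fid]
        moved = math.hypot(ex - bx, ey - by)
        return "tap" if moved <= TAP_SLOP else "swipe"

    # Two fingers: compare the vector between them at begin and end.
    a, b = sorted(begin)
    v0 = (begin[b][0] - begin[a][0], begin[b][1] - begin[a][1])
    v1 = (end[b][0] - end[a][0], end[b][1] - end[a][1])
    d0, d1 = math.hypot(*v0), math.hypot(*v1)
    angle = abs(math.atan2(v1[1], v1[0]) - math.atan2(v0[1], v0[0]))
    if abs(d1 - d0) > 0.2 * d0:
        return "pinch"   # finger distance changed: moved apart/together
    if angle > math.radians(15):
        return "rotate"  # distance similar but orientation changed
    return "swipe"       # both fingers translated together
```

The meaning lives in the motion itself -- spreading two fingers *is* a pinch -- which is Marsh's point about actions carrying their own meaning rather than borrowing arbitrary button mappings.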
People also fret about their hand being "in the way" of seeing a game, but Marsh feels this is less a problem than some think. "Our brains are very good at filling in," she emphasizes. A human being doesn't presume an object or person disappears if they briefly lose sight of it, so if it's not a continuous problem, it doesn't interrupt the experience.
For example, Marsh doesn't think that the elements of Year Walk that require the player to drag an object across several screens have good feel. Playing first-person shooters on a touch device is also challenging, since the player constantly obscures their own view with one hand while trying to reach touch buttons with the other. Tiny menu buttons should also be avoided in touch-oriented games, since most players don't immediately identify them.
"You don't have to put buttons in a place where the players obscure the screen, and given that we normally read left to right and top to bottom, it's better to put [UI elements] where the player's eye goes," she says.
One thing screens lack in a big way versus controllers is tactile feedback -- they can't inform the player about pressure or location. Players can hit controller buttons during quick-time events because they've memorized their locations on the pad, not because of visual memory. So how can touchscreen designers compensate for the absence of physical feedback and muscle memory?
Audiovisual feedback must pick up the slack, she suggests. In Buddha Finger, some 15 or 20 events fire when the player lands a touch: enemies' bodies flash white, their faces switch to a pained expression, shadows change color, a bold word appears, outside segments move, rings radiate out, the score changes, and sound feedback plays, among others. "We are really throwing shitloads of stuff at the player to tell them, 'yes, you have pressed that, well done.'"
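That burst of feedback can be sketched as one confirmed touch fanning out into many small audiovisual events. This is an illustration of the pattern, not Buddha Finger's actual code; every event name and the `game` event queue are hypothetical.

```python
# Sketch: "throwing shitloads of stuff at the player" -- a single
# confirmed touch fans out to many small audiovisual events.
# All event names here are hypothetical, for illustration only.

def on_enemy_touched(enemy, game):
    """Queue a burst of feedback events for one confirmed touch.

    game is any list-like event queue the rest of the engine drains.
    """
    events = [
        ("flash_white", enemy),    # enemy's body flashes white
        ("pain_face", enemy),      # face switches to a pain expression
        ("shadow_tint", enemy),    # shadow changes color
        ("show_word", "POW!"),     # a bold word appears
        ("radiate_rings", enemy),  # rings radiate out from the hit
        ("add_score", 100),        # the score changes
        ("play_sound", "hit_01"),  # sound feedback plays
    ]
    for event in events:
        game.append(event)
    return events
```

Each event on its own is tiny; it's the redundancy across channels -- color, shape, motion, text, sound -- that makes the confirmation unmissable to the subconscious.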
Response is everything
On a touchscreen, response is everything, she says. "The player's subconscious is like a clingy boy or girlfriend -- it needs constant reassurance that yes, you really are paying attention. The player is in your world with your rules -- you need to make them feel safe for them to start being more daring in the game and to explore it."
Feedback needs to be clearly delineated, 100 percent reliable and consistent. When players expect a touch to do something and it doesn't, it feels almost like a betrayal of the experience -- even if nothing happens only 10 percent of the time, and even when that lack of feedback is down to framerate dips or technical errors.
"Immediate response is everything, not just response," says Marsh. "It's very important to process all touch events as they occur... although you have to throw a shitload of stuff at [players], the subconscious is very good at picking up lagginess or delay. Anything that isn't just a simple tap -- and most things you do on a touchscreen are not -- it's very important to always, always do something when the touch begins. Don't wait until that action is completed to trigger what is going to happen."
For example, when a player touches a "link" in Buddha Finger, it lights up to let the player know they've activated it, even before the link completes. When an action requires multiple kinds of input, actions and animations should have "maximum interruptibility," to give the player a sense of maximum control.
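The acknowledge-on-touch-begin pattern, with the interruptibility Marsh asks for, might look like this sketch; the class, states, and feedback names are hypothetical stand-ins for whatever the engine actually uses.

```python
# Sketch: respond the instant the touch starts, never waiting for the
# gesture to complete, and allow clean cancellation at any point.
# State and feedback names are illustrative.

class Link:
    """A touchable link that acknowledges the touch immediately."""

    def __init__(self):
        self.state = "idle"
        self.feedback = []  # log of feedback shown to the player

    def touch_began(self):
        # Fire feedback on touch-down -- don't wait for completion.
        self.state = "active"
        self.feedback.append("light_up")

    def touch_ended(self, completed):
        if completed:
            self.state = "linked"
            self.feedback.append("link_complete")
        else:
            # Maximum interruptibility: cancel cleanly mid-gesture.
            self.state = "idle"
            self.feedback.append("dim")
```

The key property is that `light_up` happens in `touch_began`, so the player's subconscious gets its reassurance before the gesture resolves either way.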
Focus on visual cues, precision
Some visual cues are recognized much faster than others, and since touch designers should be compensating for the lack of tactile feedback, it's important to keep in mind that changes in brightness, speed, shape and color are picked up by the eye faster than other types of response. "The more your audiovisual feedback is communicated with things that are recognized very quickly, the better it's going to feel," Marsh says.
This is part of why physics games like Angry Birds thrive on touchscreens -- they make sense to players who are pulling, pushing and pinching because they respond in predictable ways. The more the physics of acceleration and deceleration behave as expected, the better the game will feel to the player. The game's response should be instantaneous and match the player's input exactly.
Precision is another factor: Fingers are less deft than a mouse or a stylus, so touch areas should be large enough that players aren't deceived into thinking they've pressed something they haven't. (The recommended minimum touch area on a standard-definition screen is 44 points square, she says. On a full 768x1024 screen, the diameter Marsh uses is about 96 points -- a significant chunk of the screen.)
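One common way to apply the 44-point guideline is to expand a small visual control's hit area to a minimum size, so the tappable region is bigger than what's drawn. A sketch, assuming a simple axis-aligned button; only the 44-point figure comes from the talk.

```python
# Sketch: the hit area is the visual button expanded to at least
# MIN_TARGET points on each axis, centered on the button.
# MIN_TARGET is the 44-point figure quoted in the talk.

MIN_TARGET = 44.0

def hit_test(touch_x, touch_y, bx, by, bw, bh):
    """True if the touch lands in the button's expanded hit area.

    (bx, by) is the button's top-left corner; (bw, bh) its visual size.
    """
    # Grow each dimension to at least MIN_TARGET points.
    w = max(bw, MIN_TARGET)
    h = max(bh, MIN_TARGET)
    # Test against the expanded rectangle, centered on the button.
    cx, cy = bx + bw / 2, by + bh / 2
    return abs(touch_x - cx) <= w / 2 and abs(touch_y - cy) <= h / 2
```

A touch that misses a 20-point icon by a few points still registers, which avoids exactly the "I pressed it and nothing happened" betrayal described above.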
How much touching are you asking the player to do? "The whole point of Hundreds is touching," she says. "Hundreds can ask you to be very precise with your touching, because all it's asking you to do is touch, and that's all your attention is on."
When it comes to Super Hexagon, the player's attention is less on the touching and more on the tiny triangle within the shape's rotating lines; the surrounding area is peripheral. Imagine looking for 44-point buttons in that game -- instead, the touch areas are massive, because that makes the most sense for that design.
"The way I think about touchscreens is the more non-touch stuff you have to pay attention to, the less attention you can pay to the stuff you do have to touch," she says.
Pace, position, direction
Finally, linking the touch action to the fiction is a strong design tactic. Movements the player uses are reminiscent of real life and familiar through muscle memory -- what you do in Angry Birds resembles using a real catapult. Lili asks players to pluck flowers in a way that feels a lot like picking real flowers, and is well-balanced, Marsh suggests. The Room has excellent tactility around its puzzle-box world: it's intuitive what to do with the manipulable objects, and the game provides realistic responses, complete with subtle, lifelike sound feedback, that meet players' expectations of how the objects would behave in the real world.
Less-literal links to real-life action are also viable -- what if there's no direct link between your fiction and possible touch? You want to translate the suggestion to the touchscreen action nonetheless. "With martial arts, there's a definite 'feel' to it, and you can hear, when you listen to a martial arts movie, they use a lot of sound effects to get you to hear the pace. The important thing in translating feel, then, is pace, position and direction," Marsh says. That kind of choppy speed would work for a game about martial arts -- but a game about clay pottery would have entirely different pace and direction, for example.
"There are a lot of techniques like yoga, Tai Chi and the Alexander Technique that use body shapes to affect mental states and emotions. I don't know why it's exhilarating to put your hands in the air," Marsh points out. "There's a great concentration of our tactile senses in the hands, and I think it's possible to suggest that the hands can create shapes in miniature of the entire body, and possibly that can have some effect on your emotions, or how you feel about the game."