

On May 2nd, Execution Labs hosted its first official playtesting session for all the teams in its first cohort. This blog explains how our team, Pixel Crucible, went about it and what feedback and experience we gathered along the way.

Remi Lavoie, Blogger

May 8, 2013

7 Min Read

[This blog was originally posted on our website (07/05/13):
http://pixelcrucible.com/2013/05/how-we-do-playtesting-and-so-can-you/]


On May 2nd, Execution Labs hosted its first official playtesting session for all the teams. Playtesting can offer a wealth of information about what is working well… and not so well in your current game implementation. We wanted to make sure we did things right, so we slaved all day over our hot computers preparing a shiny new build for testers to play with. 

There are a few things to keep in mind while organizing a test session:

  • Always playtest with a recent build (we don’t want to gather a bunch of feedback on issues that are already resolved).

  • Have a goal.

  • Get the most honest and objective feedback that you can.

  • Keep objective records of the feedback and the session itself if possible.

So basically… we NEEDED to have a plan!

So here, for your reading and viewing pleasure, are the details of our master plan and how we used it to make the most of our playtesting session.

 

The Goal:

It is very important to set a goal for the session. Do we want to test the game’s intuitiveness? The UI? That fresh new feature we just put in? These are all valid goals, and it is important that we gear our testing towards our specific goal.

Our goal for this session was to see how players first interact with our game without being given any instructions or background on what the game actually is. So the speech we gave to the testers as they prepared to play our game went a little something like this: “What you will be playing is a test level from a work in progress. We will not give you any instructions or answer any questions while you are playing. We want you to just start playing and figure things out on your own. If you want to narrate what you are doing, that would be great, but it’s not necessary and is entirely up to you. You will play for 5 minutes or until you complete the level, whichever comes first. After playing, we have a feedback form for you to fill out. Thank you!”

We also had a few sub-goals to verify for ourselves: we recently made significant changes to the input scheme for the game and added some new art assets, and this was a perfect opportunity to see what new players thought of them.

I can hear people saying: “So that’s all well and good, sir, but how do you orient your testing towards these goals, and how do you validate them?” That’s a very good question. First of all, there’s no need to call me sir, we’re all friends here. The answer to that question: it’s all in the way we ask our questions and collect our feedback, which conveniently enough is the subject of the very next paragraph! It’s as if I put all these sentences in some sort of order. I know, pretty crazy stuff, right? So read on!

Playtesting in session

The Feedback:

Collecting feedback from testers is without a doubt the most crucial part of playtesting. We want to, and in fact need to, plan out the questions we’ll be asking our testers. These questions need to reflect the goals that we have set for our session. Testing out the UI? Well, there should be questions like: “What did you think about the user interface? Did you understand what each button did? Was there an element that you didn’t understand?” and so on.
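
To make the goal-to-question link concrete, here is a toy sketch (in Python, purely illustrative; the goals and questions are made up, not our actual form) that tags each question with the goal it validates, so we can check coverage before printing the form:

    # Tag each form question with the session goal it is meant to validate.
    # Goals and questions are hypothetical examples, not our real form.
    FORM = [
        {"goal": "intuitiveness", "q": "Were you able to figure out what to do without help?"},
        {"goal": "ui", "q": "Did you understand what each button did?"},
        {"goal": "new_controls", "q": "How did the controls feel? Anything frustrating?"},
    ]

    SESSION_GOALS = {"intuitiveness", "ui", "new_controls", "new_art"}
    covered = {item["goal"] for item in FORM}
    print("Goals with no question yet:", SESSION_GOALS - covered)  # {'new_art'}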

What? You want to know how we did it? No problem my friend, that’s pretty much the point of this article!

It all started with Mathieu drafting up a pretty nifty feedback form for our testers to fill out after playing. We then reviewed it as a team, made sure we covered all the goals we had set, and reviewed all the wording so it was not biased in our favor. A written form is a great way to collect much more objective feedback than just asking questions ourselves and relying on our fuzzy and probably pretty biased memories. Why are written forms better? Many reasons; let’s present them in a convenient list form.

  • Layer of separation: Asking people directly, “So what did you think of my game?”, will result in extremely biased responses. People will not want to give negative feedback to our face, but are much more likely to leave honest feedback on an anonymous form. Ideally, the people collecting the feedback would not even be part of the development team, for even more honest feedback, but we did not have that luxury this time.

  • Objectivity: Forms are more objective in two ways. The questions are always exactly the same; we do not risk phrasing them differently for each user. The way the answers are collected is also objective: they are written down by the person answering, rather than us taking notes of what we think was important in the answer, or, even worse, not taking notes at all and just trying to remember the feedback later on. This brings us to the other reason forms are pretty sweet.

  • Archives: We can always refer back to the filled-out forms! They exist as full-on tangible objects! We scanned all of them and kept those on record, and we also compiled all the answers in a spreadsheet that is shared with the whole team. What’s so great about this? We can easily extract all the issues and compile basic statistics based on the age/gender/background of testers (a quick sketch of that follows below). We can then address the issues found, come back to the feedback, verify that all valid issues have been resolved, and write down what was done to resolve them. BAM! Instant history of the what, the why, and the how of solving the issues!
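
As promised, here is a minimal sketch (again Python; the CSV column names and file name are hypothetical, not our actual spreadsheet) of the kind of compilation we do: averaging a numeric answer per tester group.

    # Compile feedback-form answers from a CSV export into per-group stats.
    # Column names and file name are hypothetical examples.
    import csv
    from collections import defaultdict

    def average_rating_by_group(path, group_col="age_group", rating_col="fun_rating"):
        totals = defaultdict(lambda: [0.0, 0])  # group -> [sum of ratings, count]
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                t = totals[row[group_col]]
                t[0] += float(row[rating_col])
                t[1] += 1
        return {group: s / n for group, (s, n) in totals.items()}

    print(average_rating_by_group("playtest_01_answers.csv"))

The same spreadsheet can then answer questions like “did testers who play a lot of mobile games rate the controls differently?” without re-reading every form.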


Supervising an ongoing test while previous testers fill out the feedback forms.

 

The Records (Optional, but awesome!):

Another thing we can do to add even more depth to playtesting sessions is keeping records. By records, I mean either a detailed log of everything that happened or, even better, a video! That’s what we did. Alex brought in this sweet document-scanner camera that you can set up on a desk, and we asked testers to play the game while keeping the device in the camera’s view.

We recorded the sessions, and can now look back at the videos for really valuable information: What is the first thing the player did? How long did it take them to figure out the controls? How many times did they use a specific feature? Keeping track of all this manually while also watching what the player is doing is extremely difficult; filming and watching the video later makes it easy and, again, provides archives of what happened. It is also much less likely that we miss important stuff because we were busy writing down notes instead of watching the user play.
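
We pulled those answers out of the videos by hand, but if your build can also write a simple timestamped event log (the “detailed log” option mentioned above), the same questions can be answered automatically. A minimal sketch, assuming a made-up “seconds,event_name” log format:

    # Compute "time to first input" and "uses of feature X" from a
    # timestamped event log. The log format is a hypothetical example.
    def session_metrics(log_lines, feature="jump"):
        events = []
        for line in log_lines:
            ts, name = line.strip().split(",", 1)
            events.append((float(ts), name))
        first_input = next((t for t, n in events if n == "input"), None)
        feature_uses = sum(1 for _, n in events if n == feature)
        return {"time_to_first_input_s": first_input, feature + "_uses": feature_uses}

    log = ["0.0,level_start", "3.2,input", "4.1,jump", "9.8,jump", "60.0,level_end"]
    print(session_metrics(log))  # {'time_to_first_input_s': 3.2, 'jump_uses': 2}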

It can also be used as a comparison and validation tool for the next playtest. For example, if most players had trouble with our interface in the first playtest, we can directly validate (or invalidate) our changes with the next playtest’s videos by comparing specific statistics against the previous ones (ex: Clicks to get to main gameplay: 1st playtest = 5, 2nd playtest = 3. Validates changes made.)
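
In code form, that validation step is just comparing the same statistic across playtests. A tiny sketch using the example numbers above:

    # Did a statistic move in the direction we wanted between playtests?
    def compare_metric(name, before, after, lower_is_better=True):
        improved = (after < before) if lower_is_better else (after > before)
        verdict = "validates changes" if improved else "re-check changes"
        print(f"{name}: playtest 1 = {before}, playtest 2 = {after} -> {verdict}")

    compare_metric("Clicks to get to main gameplay", before=5, after=3)
    # Clicks to get to main gameplay: playtest 1 = 5, playtest 2 = 3 -> validates changes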


The setup: iPod, camera, timer, and laptop for recording.


The video recording in action.



So there you have it: all the details (and some pretty pictures) of how we proceeded with our first official playtest. We hope you enjoyed the post and that it was helpful for you.

Want more info?
Leave your questions, comments, suggestions, insults, dental records, in the comment section below.
