When developing indie games we throw out a disgusting amount of work. We all expect it. We work for weeks on a particular feature or asset only to decide that it doesn’t suit the game and we need to bin it. We take solace in the fact that it wasn’t a waste of time because at least we’ve “learned” during the process.
When making a new title we can identify the assumptions we’re making about the game. We make an educated guess about what art style an audience may like, or what mechanics they will enjoy. We create Minimum Viable Products (MVPs) to playtest these assumptions, but is there a way we can do it more efficiently?
There is a practice in Silicon Valley called “Lean Development” that outlines a way to create high-risk products with minimal waste. Lean Development is the brainchild of Eric Ries, and is the focus of his book “The Lean Startup”. It details a strategy for building anything under conditions of extreme uncertainty, which just happens to be exactly how game development works.
Lean Development shifts how we define the value we add to a game: from implementation to learning. It means we stop focusing on features, and start focusing on validating our assumptions. When we understand the product we are trying to make, we can make more informed decisions about what goes into the game, and what doesn’t. The important distinction between feature development and validated learning is that one is about trying to get the game to completion, and the other is about trying to make sure we’re working on the right game in the first place.
We tend to think of an MVP as the game as a whole, not something to confirm assumptions. When we think of an MVP as an isolated experiment to learn more about our game rather than a subset of our game, we can suddenly get much more creative (and cheaper) with exactly how we do it.
As an example, say we were making a match-three game and wanted to use a viking theme. We can assume that our audience likes both vikings and matching games. Rather than spending weeks creating a functional prototype, we could create a single piece of promotional art that links to a “Coming Soon” page, drop $100 on a targeted Facebook ad, and decide in advance the number of clicks we would need to prove that a viking match-three game is an idea worth pursuing in the first place.
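The important discipline in an experiment like this is deciding the pass/fail criteria *before* spending the money. As a minimal sketch of that step, the snippet below encodes a go/no-go rule for the ad test. Every number in it is a hypothetical assumption for illustration (the cost per click, the click threshold, the sign-up rate); you would replace them with your own estimates.

```python
# A sketch of the "decide the threshold up front" step for the $100 ad test.
# All figures here are hypothetical assumptions, not real campaign data.

BUDGET_USD = 100.0            # the experiment's total ad spend
ASSUMED_COST_PER_CLICK = 0.50 # guessed CPC for a targeted ad

# Under the assumed CPC, this budget should buy roughly this many clicks.
expected_clicks = BUDGET_USD / ASSUMED_COST_PER_CLICK  # 200.0


def is_validated(clicks, signups, min_clicks=150, min_signup_rate=0.10):
    """Return True only if the experiment meets BOTH pre-agreed bars:
    enough clicks on the ad, and enough of those clickers signing up
    on the "Coming Soon" page. Thresholds are illustrative defaults."""
    if clicks == 0:
        return False
    return clicks >= min_clicks and (signups / clicks) >= min_signup_rate
```

With a rule like this written down before the campaign runs, the result is objective: 200 clicks with 30 sign-ups passes, while 200 clicks with only 10 sign-ups fails, and there is no room to rationalise a weak result after the fact.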
When you start to build up your library of “validated learning”, you can make more informed and objective decisions about which avenues to pursue and which to abandon. By identifying our biggest “Leap of Faith” assumptions and testing them as early and cheaply as possible, we reduce the likelihood of throwing away weeks of work because the path we took was never viable in the first place.
Shifting our priorities from feature implementation to validated learning can be a difficult step to take, but in doing so we can refocus our efforts on what we know is important rather than what we think is important.