
GDC Next advisory board member and Robot Invader founder Chris Pruett talks future games: "Infrastructure is enabling new types of games...It's also scaring a lot of traditional game devs out of their wits."

Patrick Miller, Blogger

August 29, 2013

21 Min Read

[Ahead of November's GDC Next, GDC's Director of Online Community Patrick Miller reached out to many games industry luminaries to see where they think the future of video games is headed. This interview is the second installment of a multi-part series that will run up until shortly before the 'future of games' conference, which takes place in Los Angeles, CA from November 5-7, co-located with the App Developers Conference.]

GDC Next advisory board member Chris Pruett has seen the games industry from several different perspectives: as a console games programmer with Activision, a software engineer and game developer advocate with Google, and most recently as a co-founder of successful iOS/Android game dev studio Robot Invader (Wind-Up Knight, discussed in-depth at GDC 2013). He thinks that standardized hardware, free-to-play, and app store/platform discoverability are three trends that are key to following the future of the game industry; read on to find out why.

You've worked in several different segments of the game industry -- from console dev, to Google, to a small indie studio. I imagine that you must have a rather unique perspective on the industry compared to people who tend to focus on a particular segment or niche; any insights into trends (large or small) that you've seen affecting different areas of the biz?

I'm interested in the way game hardware is increasingly gravitating towards standardization. For most of the industry's history, proprietary hardware has been a way for platform holders to deliver high-end performance at cost and for game developers to stand out. But the next consoles are almost identical internally, and they are based on widely available chipsets rather than proprietary designs. More importantly, the rise of mobile devices has created a huge supply chain of increasingly high-end CPUs and GPUs from a variety of manufacturers that all conform to the same standards. iPads and Androids are all based on the same family of chipsets, and the explosion of smartphones has caused a huge amount of investment into the development of this family. We're already playing lots of games on these devices, and consoles like the Ouya suggest that a lot more standards-based game platforms may be on the way.

The end point of this trend line, I think, is the commoditization of game hardware. What if you could go to the store and buy any game device, just as you buy a DVD player, safe in the knowledge that it will play all games? I think that this shift, combined with digital distribution, is going to cause dramatic changes to the way that the industry works. Actually, it's been happening for years already, but I don't think most traditional developers have felt the full force of the change yet. The move away from proprietary hardware to standards-based, widely compatible commodity systems, be they phones or game consoles or whatever, changes the way games are made, how they are priced, and what they have to do to stand out. Add to this that game technology is also becoming a commodity (Unity, UDK, etc.), and suddenly the mechanics of the industry start to look a lot more like other major media industries out there. Though it's hard to guess what the industry landscape will look like when the dust settles, I suspect that technology will no longer be a major component of game marketing. There will still be a need for lots of technical innovation, but it won't be technology that differentiates one platform, or one game, from another.
Game companies will have to come up with new ways to stand out in the crowd, and platform holders will have to find new ways to present game content. These things are already happening; games on iTunes and Google Play live and die by the level of exposure they receive, and only rarely is the marketing message about raw tech. Valve's Greenlight system, while still very young, is an attempt to prepare for this post-commoditization world, I think. Whatever happens, I think that this core shift away from proprietary hardware to standards-based hardware will dramatically change the way that games are made and marketed.

Why do you think the industry has resisted this kind of standardization for so long? (I have a hard time imagining kids growing up without arguing about the relative merits of consoles on playgrounds…)

Well, in certain areas of the industry, standardized hardware has been the norm for years. All PC games are running on x86 CPUs and are drawing through standardized APIs like OpenGL and DirectX. But on consoles, there have been lots of reasons for platform holders to build proprietary devices. The business model up until now has been to build a device that delivers performance at low fabrication cost in a way that is not straightforward to harness. Because the hardware is weird and proprietary, it takes developers several iterations to figure out how to build high-performance games for it. This ensures that software quality increases over the lifetime of the console, even as the hardware remains fixed. And since the hardware doesn't change, fabrication methods can be optimized and improved so that the cost to build becomes very low at the end of the console cycle. Since the cycle itself takes a number of years, the next device should be able to deliver a jump in performance without relying on an excessive cost increase. At least, I think that's the theory.

The problem with that model now is that mobile chipsets are quickly encroaching on console chips. The year-over-year increase in mobile GPU power is insane; we've gone from something that was essentially a PS1 to something that can run some 360 games in the span of three years. This increase is a direct result of competition in the smartphone chipset market; you've got folks like Samsung, ARM, TI, PowerVR, Nvidia, and Qualcomm all investing huge amounts of money into mobile chipset R&D because the market for phones and tablets is exploding. This is way more investment than a single console ever benefits from. And since these chips are all designed around the same standards (ARM instruction sets and OpenGL ES-based GPUs), software written for one chip will run on all the others.

I think it's significant that the Xbox One and the PS4 are both eschewing proprietary chipset designs in favor of off-the-shelf components. It suggests to me that Sony and Microsoft believe that they cannot maintain a performance edge over the exploding mobile chipset market for another seven-year console cycle. If your phone can run Xbox One games three years from now, and it's cheaper than the Xbox One because your carrier subsidizes the initial purchase price, why would you buy an Xbox One? There are reasons, but they aren't performance- or graphics-based reasons any longer. Which is probably why Sony and Microsoft are spending so little time talking performance numbers for their new boxes.
The message has been about other things that raw chipset performance doesn't provide on its own (Kinect, cloud computing, exclusive software titles, large availability of indie titles, interface with TV, better controller, etc.). The Xbox One and PS4 need a reason to exist that isn't limited to looking the prettiest. And they need a way to do it without having to reinvent the wheel every time they want to release new hardware, because their competition releases new hardware every six months. My assumption is that the new consoles from MS and Sony are based on off-the-shelf parts so that they can release follow-up consoles a few years from now that are backwards compatible. Though mobile chipsets are not threatening to overtake high-end console performance yet, I think the design of the Xbox One and PS4 suggests that Sony and MS expect them to.

As far as I can tell, this is a very good thing for developers. It means a dramatic reduction in the cost of software development, as we've already seen on mobile platforms. A high-end, triple-A game is still going to cost a lot to make, but small developers will also be able to deploy to these systems with whatever is within their means to build. And as we've seen on mobile, small developers are already capable of some pretty amazing high-end games. If there's a downside, it's that the marketing message is about to get a lot more complicated, because you can't just claim that your box can write love letters and the others can't anymore. But long term, I think that's actually a positive thing as well, as the focus should be on the games and not on the boxes that host them. Furthermore, I think that the eventual commoditization of hardware will cause the price of home consoles (or whatever it is we play games on five years from now) to fall dramatically. That's a really good thing, because in order to grow as an industry we need to reach a much wider audience than is typically addressed by your PlayStations or Xboxes. Again, mobile devices are the canary in the coal mine here.

So, in summary, I think hardware will eventually become a standard, and games themselves will drive purchasing decisions. I think this will make a much larger array of games commercially viable, both from big and small developers, and I think it will lower the cost of game production significantly. I think that standardized hardware will also lower the cost of entry for players, and thus widen the audience that buys games. In short, this is a move that takes video games from being a large niche to being something that is available to the mainstream. And unlike today, it won't be limited to mobile platforms (or, to put it another way, all our platforms, including our TV consoles, may be mobile).

All that said, I do think there is still a place for branding and differentiation, so we don't have to worry about giving up our fanboy rage arguments. I mean, just look at the audiophile world; everything is basically a speaker and an amplifier and some cabling, all based on core standards and available in a huge variety of price points, and people will fight to the death to convince you that their setup sounds superior. Also, companies that win on usability and production quality can still carve out huge markets even with standard hardware. You could have bought any old MP3 player in 2005, but you bought an iPod because it was the best. I don't think the standardization of hardware changes that.
You've been around in mobile games long enough to see the hype explode (mobile games will make us all rich!) and then die down (mobile games are a terrible way to make money!). From your perspective, where are mobile games going to go from here?

I think that the mobile game market has reached a point of constraint that is holding it back: the app store interfaces. The problem is pretty simple: iTunes and Google Play haven't evolved fast enough to deal with the dramatic increase in app volume in the last few years. Compare iTunes, which hosts almost a million apps, to Amazon. The former is able to show small sets of hand-picked titles and statistically computed lists of a few hundred titles each (with many overlaps). The latter is able to serve huge volumes of products to all sorts of people in a way that is personalized, algorithmic, and easy to sift through.

The main problem with mobile games these days is getting exposure. Exposure, be it via word of mouth or featuring or paid marketing, is the only way to get a game to players, because without it apps simply vanish into app store black holes. Even with exposure, the competition for attention is so fierce that developers must do everything in their power to make their games as attractive as possible (hence the dramatic rise of free-to-play games). Companies with lots of money to throw at marketing can generate exposure outside of the stores on their own, which is why the App Store top lists are filled almost exclusively with games from large companies. Companies without such dramatic marketing budgets are having a pretty hard time right now.

To platform holders like Apple and Google, there is immense value in ensuring that their respective app stores surface good content to users. Good apps and games are what sell hardware, and what keep developers coming back to the platform with new content. But "highest quality" and "biggest marketing budget" are rarely equivalent variables (for proof of that, one needs to look no further than this summer's blockbuster movie lineup). Apple and Google should consider it a problem that companies with money are able to buy their way to the most visible regions of the app store. It is in the platform holder's interest to ensure that good content, regardless of where it comes from, gets shown to users. App stores aren't doing a good enough job of that yet. In fact, I think they're still stuck in the stone age of digital content discovery (though both Google Play and iTunes have gone through significant revisions in the last year). If things remain as they are, the wealthiest companies will quickly dominate the platforms, and smaller developers will have no choice but to pack up and move elsewhere. However, there's still time for them to level the playing field.

You've been vocal about your defense of free-to-play business models as not-inherently-evil-or-exploitative. How do you think F2P will continue to grow and change (if it will at all)?

This is a pretty big question, mostly because "players" now means "everybody in the world with a smartphone." I mean, seriously, the data says that games are the #1 most popular kind of app by a large margin, and something like a billion people now own smartphones. So when we talk about "player expectations," I think we're talking about a group of people way too large to draw generalizations about.
We could talk about the "traditional console gamer male in the US" market, or the "Japanese high school girl" market, or the "40+ housewives in Germany" market if we want to drill down, but we can't just talk about "gamers" anymore, because when it comes to mobile, that's everybody. This is part of the reason that free games are popular in the first place. Traditional console gamers may associate a certain type of game that takes a certain amount of time to complete with a certain price tag, but the rest of the world doesn't necessarily perceive the same value. What is the value of games to a person who's only ever played the free version of Hearts that came with his computer? This person might be willing to spend money on games, but maybe not up front, because hey, as far as he knows, games are free things.

For this reason I think free-to-play is here to stay. But to me, all that really means is that the game can be downloaded for free and provides some way of spending money within it. The designation of "free" is important because large swaths of the audience want to try before they buy, and because (as mentioned above) on some platforms developers are so starved for attention that any potential hurdle to trying a game must be eliminated. I don't think we've even scratched the surface when it comes to discovering the best free-to-play models. And when I say "best," I mean best both for the developer and the player, because players will not continue to play games that they do not enjoy, and I don't think that games can survive forever on the backs of a small percentage of wealthy users. Free-to-play design is uncharted territory, and while some developers have struck oil in one corner, we really have no idea what the boundaries of the space are.

I don't think paid games will go away in the future; it's very uncommon for a new trend to completely eradicate the old guard. There are some areas of the industry (e.g. Steam) where the audience seems more than willing to pay for games upfront, and I think that's great. There are also certain game genres that do not fit well with existing free-to-play models, and until somebody figures those out, I think fans of those genres will stick to paid games. Generally, I feel that the more options developers have, the better. In an ideal world, we'd pick a monetization model that fits best with a given game design rather than being forced down one road or another. That said, I think that the free-to-play market will continue to grow and will eventually relegate paid games to a niche, much in the way Pandora and Spotify are finishing off CDs after the blow iTunes dealt them.

What kind of systems or markets do you think devs should be looking to take inspiration from for designing and monetizing free games? What work (either in games, or outside of games) do you find most promising or inspiring in this regard?

I think that making a good free-to-play game means treating the monetization aspect as a core part of the game design. A lot of monetization systems are part of a meta-game, used to regulate progression while the actual gameplay itself is unaffected. But making a purchase is an emotional event, and I think we should use purchases the way we use other aspects of the game design that carry emotional weight. For example, the Dead Space series has an interesting system with its Power Node items. Power Nodes are rare, extremely expensive items that the player can use to upgrade his weapons.
There are not enough Power Nodes in the game to upgrade everything, forcing the player to make (sometimes hard) decisions about when and where to spend them. But in addition to upgrading items, Power Nodes can also be used to open specially locked doors. Encountering one of these doors is nerve-wracking, because one must choose between collecting some power-ups (the quantity and quality of which are unknown) and permanently removing a precious Power Node from the game. I agonized over every Power Node door I found in Dead Space 2, and that's exactly what the developers of that game wanted me to do.

When playing a free-to-play game, I have fun trying to min/max the purchases required to progress. In fact, I think that this min/maxing of the purchases is a core game system that makes well-implemented free-to-play games more fun. I'd like to see purchase decisions extended throughout designs in order to prompt the player to make difficult decisions. If, as in Dead Space, we want the player to agonize over a decision because that's the state in which the rest of our game is most emotionally effective, shouldn't we be able to achieve that by putting real-money purchases on the line? I feel that there is significant room for risk/reward play mechanics that are emotionally resonant because they are grounded in real money (even very small amounts).

I'm also fascinated with the idea of free-to-play games with significant story content. The Walking Dead is the only really successful example I can think of, and its free-to-play implementation is a simple one (which is not, in any way, a complaint; I'm very happy that it worked so well for that game). I feel that nobody has cracked the free-to-play-plus-narrative nut yet, and that's caused a lot of popular F2P games to be pretty thin, narrative-wise. I am pretty excited by that problem.

Sometimes it seems like the best ideas in tech and games simply didn't happen at the right time. What do you think will make a comeback when the time is right?

A recent example of tech that appeared to fail but will eventually become very important is game streaming. Though I was pretty skeptical of OnLive, the type of tech it developed is probably the way game distribution will work in the not-so-far future. Our Internet connections need to get a little better, our server farms need to get a lot faster, and our games need to be written in a way that is distributable across such a farm, but those seem like matter-of-time issues to me. Game streaming, when it improves to the point that we find it acceptable, will also have pretty huge ramifications for the game industry. No more piracy, no more compatibility issues; commodity hardware really becomes just a dumb box connected to your TV or phone. I do think that this future is still a little ways off, but only because our internet infrastructure needs to improve. The basic technology is here now, as OnLive was able to show.

[Infrastructure] is where I see the most disruption going on currently, and it is enabling new types of game designs that have never been viable before. It's also scaring a lot of traditional game developers out of their wits. Personally, I think that we're right in the middle of a rocky transition. Everything is in flux, lots of experiments are being performed, and even the success stories are fraught with problems. I think that will pass and the industry will settle down again. When it does, though, I think it will look pretty different than it does today.
My long-term hope is that the result of all this is an environment where the audience is so huge that even niche genres have large followings. The types of games I want to make are not commercially viable on consoles today, and many of them would be hard on mobile due to discoverability issues. If we had a platform where discoverability was easy but costs remained at mobile levels, and which was adopted by a very large audience, we'd have significantly more creative freedom. I think we're on the right trajectory to get there, but I am sure that it will be years before we actually arrive.

On the topic of discoverability: that certainly seems to be a pickle -- especially considering that the number of games available on mobile devices is only increasing. Where do you think the most promising leads are in solving the discoverability problem?

I think the future is personalized algorithmic curation. That means algorithmic selection of content to show to individual users based on whatever the store knows about them, plus a large injection of data from your social graph. This is basically a search quality problem, analogous to Google and Facebook and Twitter. Google assesses quality with algorithms, Facebook assumes that you share interests with your friends, and Twitter is a publicly maintained "playlist" of content that you want to call attention to. A better app store would use all three of these systems to present users with apps that they are likely to enjoy. Into this mix it would also randomly add apps that fall completely outside the regular selection criteria, as well as apps that haven't had much exposure yet (a rough sketch of what such a blended ranker might look like follows at the end of this piece). This means that all kinds of games would be surfaced to users, that the results would be personalized for specific tastes and social groups, and that even minor, crazy titles would get a chance without needing to bank everything on special promotion from the platform holder. It also wouldn't ruin the market for big companies, as it would still be effective to advertise games in order to drive users to search for them. But it would prevent advertising dollars from owning the player's view into the store.

All this technology already exists, of course. Amazon does much of it, but more importantly, this type of system is what drives online advertisements. Instead of showing the user a purchased advertisement, treat every app as an ad and use the same logic to select the best fits for the user. Then use historical information about what that player likes to tune and improve these suggestions over time. This isn't an earth-shattering prediction; it's already what Amazon and Netflix and many other online services are doing to some degree. That's why I say the app stores are still in the stone age: they're still missing much of this.

Registration is now open for GDC Next - which Pruett is on the advisory board for, helping to pick the content - and the co-located ADC. The first 500 attendees who sign up can save over 30% on ADC, GDC Next, or a combined VIP Pass. For all the latest news on GDC Next, subscribe for updates via Facebook, Twitter, or RSS. Also, check out the previous 'What's Next' interview with Raph Koster. Gamasutra and GDC are sibling organizations under parent UBM Tech.
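As promised above, here is a minimal sketch of the kind of blended, exploration-friendly ranking Pruett describes: one score per app built from an algorithmic quality signal, a social-graph signal, and a human "playlist" signal, with a couple of result slots reserved for low-exposure titles. Everything here (the field names, the weights, and the 1,000-install "low exposure" cutoff) is a hypothetical illustration of the idea, not any real store's algorithm.

```python
import random

def rank_apps(apps, user, slots=10, explore_slots=2):
    """Blend three discovery signals per app, then reserve a few
    slots for low-exposure titles so unknown games can surface."""
    def score(app):
        quality = app["quality"]                                   # algorithmic quality estimate (the "Google" signal)
        social = 1.0 if app["id"] in user["friend_apps"] else 0.0  # friends play it (the "Facebook" signal)
        curated = 1.0 if app["curated"] else 0.0                   # human playlist pick (the "Twitter" signal)
        return 0.5 * quality + 0.3 * social + 0.2 * curated        # weights chosen arbitrarily for illustration

    ranked = sorted(apps, key=score, reverse=True)
    picks = ranked[: slots - explore_slots]

    # Exploration: randomly surface titles that haven't had much exposure
    # yet, so the top of the store can't be owned by marketing budgets alone.
    unseen = [a for a in ranked[slots - explore_slots:] if a["installs"] < 1000]
    picks += random.sample(unseen, min(explore_slots, len(unseen)))
    return picks

# Tiny usage example with made-up apps and one user.
apps = [
    {"id": "rocket-farm",  "quality": 0.9, "curated": True,  "installs": 500_000},
    {"id": "match-blitz",  "quality": 0.6, "curated": False, "installs": 2_000_000},
    {"id": "tiny-dungeon", "quality": 0.7, "curated": False, "installs": 300},
]
user = {"friend_apps": {"match-blitz"}}
print([app["id"] for app in rank_apps(apps, user, slots=2, explore_slots=1)])
# -> ['rocket-farm', 'tiny-dungeon']: the unknown title gets a slot it
#    would never earn on raw score alone.
```

In a real store the quality signal would presumably come from engagement and retention data rather than a hand-set field, and the player's reaction to each surfaced app would feed back into future scores, which is the "tune and improve these suggestions over time" step Pruett mentions.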
