What is Nintendo really attempting to do with the Wii U? Game designer and researcher Ian Bogost, in the latest installment of Persuasive Games, looks for the answer.
For a century and a quarter, Nintendo has devoted itself to an unspoken mission: making games safe, stripping them of their risk and indecency. The company started as a hanafuda playing card manufacturer in the late nineteenth century. Like most gambling, hanafuda was closely tied to organized crime, and the term yakuza, the Japanese word for organized crime syndicates, finds its origin in that game. Nintendo set up shop just after hanafuda had been made legal in Japan, and the company seems to have remained embroiled in gambling and organized crime even as its products sanitized that practice for a newly enfranchised general public.
But even after 70 years in business, Nintendo still struggled to turn the proverbial tables on playing cards. Finally, in the late 1950s, a licensing deal with Disney allowed Nintendo to produce a series of family-oriented card decks and instructional books, changing its fortunes, and marking its second great taming of the medium of games.
After diversifying into electronic toys in the 1970s, the company imported video games to Japan -- then a distinctly American form of entertainment that had been commercialized by Magnavox and Atari. Nintendo's first video game products, the Color TV-Game 6 and Color TV-Game 15, were based on Odyssey technology licensed from Magnavox.
But by 1981, original handheld and coin-op games made their way out of Nintendo's factories -- the Game & Watch series and the Donkey Kong cabinet being the most notable of these.
A Bittersweet Savior
Nintendo's attempt to re-commercialize home console gaming in the West marks the company's third redemption of games. In the wake of the industry crash of 1983, Nintendo devised an ingenious response that would set the pace for the next three decades -- for better and for worse.
First, Nintendo returned video games to the toy marketplace. The Robotic Operating Buddy (R.O.B.) first bundled with the Nintendo Entertainment System (NES) helped sell this pitch to American toy retailers, most of whom had been badly burned by the '83 crash and had lost their taste for video games.
Second, having learned from its own experiences as a licensor, Nintendo introduced what we now know as the first-party licensing model. A "Seal of Quality" would ensure that retailers and consumers knew a product was worthy of their investment. Such a label would have to be licensed from Nintendo by publishers, which Nintendo itself would select and approve. Nintendo would also manufacture all the games at a mark-up commensurate with its influence among retail buyers. Everybody wins -- so long as everybody is Nintendo.
The result surely saved the video game retail market in the West, and for that gift anyone who makes a living or a pastime from video games owes Nintendo its gratitude. But this bailout came with a price. It also changed games, reducing them to a children's medium sold in toy departments and toy stores, rather than a burgeoning form capable of many different uses and experiences.
Popular opinion blames the crash of '83 on a flood of poor-quality games -- not just scapegoats like Pac-Man and E.T. for Atari 2600, but a whole mess of absurd and unplayable games hacked together by speculators attempting to cash in on the latest fad. But this is an unfair -- or at least an incomplete -- characterization. Terrible though many games might have been in the Atari/Intellivision era, they were also diverse and distinct, in a way we have only begun to recover in the last half-decade.
The earliest NES games represented familiar genres: mostly sports (10-Yard Fight, Excitebike, Golf) and fantasy adventure (Super Mario Bros., Clu Clu Land, Hogan's Alley), along with the curious puzzle games made to work with R.O.B. (Gyromite and Stack-Up).
By contrast, in the years leading up to the 1983 crash, players could find Atari games that took up the rodeo (Stampede), aeronautic acrobatics (Barnstorming), tax strategy (Tax Avoiders), masturbation (Beat 'Em & Eat 'Em), advertisement (Kool-Aid Man) -- even adaptations of raunchy, R-rated movies (Porky's). In the 1970s and early 1980s, games were made for adults as often as they were made for kids -- played in bars and bowling alleys as frequently as arcades and basements. Video games might have been new, but they weren't immature.
Kids Become Teenagers
Before Nintendo came around to rescue video games, the industry was well on its way to becoming just the sort of general-purpose mass medium today's developers and critics like to think they are inventing anew. Ironically, many of those creators are too young to know what came before, and thus see themselves as saviors contributing to a long-withheld maturity, not realizing that such effort is only necessary thanks to their childhood video game idol.
A similar self-contradiction can be found in Nintendo's own success. In the 25 years ending in 1985, Nintendo went from an obscure licensor to a major entertainment company with its own intellectual property. But those properties -- Mario, Zelda, Metroid, and so on -- remained yoked to toy culture. They are children's characters and children's games that have persisted long enough that the children who first bought them have become adults with their own children. Thus Nintendo's reputation: wholesome, yet juvenile. Profitable, but harmless. Pop culture, not art.
The Return to Family Play
In the 30 years since the NES, the rest of the video games industry has "matured," for certain values of maturity. By the mid-1990s, Nintendo had just released Donkey Kong Country for its purple-and-lavender-clad Super Nintendo. Meanwhile, thanks to the advent of the first-person shooter and the Sony PlayStation, video games became an adolescent's distraction more than a kiddie toy -- something like the statistical average of the 1970s bar and the 1980s basement.
In 2006 Nintendo finally offered a definitive response: the Wii, whose novel physical controls and simplified graphics and interaction models mimicked forgotten aspects of Atari's business thirty years earlier, and Nintendo's own business of a hundred years prior -- making games safe for families to play together.
The Wii wasn't so much a "revolution" in interaction design, to invoke the platform's famous code name, as it was a return to prior ideas: the television as hearth, accessible and appealing family or group play, quick game sessions, lower-cost hardware -- all ideas Atari had addressed in the late 1970s. For Nintendo, the Wii was to video games what Disney playing cards had been to hanafuda. But once again, the only reason the Wii had to take on such a role is because Nintendo had inadvertently poisoned adults to video games with the NES two decades earlier.
Press B to Block
Seen in a broader historical context, the new Wii U is a very different beast from the Wii, even though the former is created from the rib of the latter. While the Wii was an offensive move on Nintendo's part, the Wii U is clearly a defensive one, a hedge that responds to the many trends that have erupted in games, consumer electronics, and home entertainment since the Wii's 2006 release.
Those include: the launch of Sony Move and Microsoft Kinect; the iPhone, iPad, Android, Kindle and the entire app store economy; the Facebook platform and the whole social games sector; the launch of Steam and the diversification of Xbox Live Arcade and the PlayStation Network; the completion of the 2009 digital television transition; and a drop in HDTV prices by an order of magnitude.
The Wii U responds to each of these shifts in its own way. To its physical interface competitors, the Wii U re-entrenches, making no changes to its existing Wii remote controllers. So confident is Nintendo in its superior physical controls, it doesn't even include any in the box. Who doesn't already have some?
To Xbox, PlayStation, and the HDTV market, Nintendo finally caves and adds 1080p HD support and sufficient GPU power to make use of it. It's a move that silently acknowledges what everyone but Nintendo already knew: we like Peach and Samus, but we also like Marcus Fenix and Nathan Drake.
In fact, Gears of War has as much Nintendo lineage as it does Doom lineage; it's the shooter as exaggerated cartoon. So-called "core games" have probably Nintendoified more than they have "matured," so why not play them on a Nintendo?
To Steam et al., Nintendo finally adds a usable online shop with downloadable versions of retail releases and original titles, including independent releases, available on day one. And to Facebook, Twitter, and the rest, Nintendo adds its own private social network, Miiverse, which creates a sharing channel unique to each game, accessible from a single controller button.
The Tablet and the Television
But the Wii U's most obvious and important response to a current trend is its answer to smartphones and tablets. The Wii U GamePad forces players to confront one of the strangest features of the contemporary media ecosystem: the tension between the television and the handheld computer.
It's easy to forget, but home console video game systems were designed around the television more than they were designed around the video game. In the 1970s, long before the VCR, the Magnavox Odyssey and Atari Video Computer System had to teach their players about the very idea of connecting a box to their televisions in the first place. And to produce interactive images and sounds, those early consoles were engineered to couple directly with the cathode ray tube television. In the intervening years, we've forgotten how novel, weird, and difficult it was to make video games playable in our dens and living rooms in the first place.
The design of the Wii had already attempted to draw a new, explicit connection between the television and the video game console. The Wii remote was meant to be approachable and familiar thanks to its physical and operational similarity to a television remote. And the Wii menu was divided into "channels," borrowing its organizational logic from television and cable, paradigms everyone who had been alive at any point since the 1960s already understood.
With the Wii U, Nintendo tacitly admits that the Wii took this metaphor too far. Everyone knows how to point a remote in the general direction of the television, but using the Wii remote as a precision pointing device proved tricky and frustrating even for the most experienced and agile players. While the remote can still be used on the Wii U menu, the GamePad presents a more obvious interaction model on boot-up: a grid of channel buttons that can be touched to select and activate.
This feeling -- that of looking at a big, HD television display while holding a GamePad in your hands and not knowing where the real action is -- this is the central premise of the system. The Wii U is a home console connected to your big, high-resolution plasma display and your 250-watt home theater, which you ignore in favor of a low-res handheld device that can't even leave the room. Except when it's a substantial, fast-running handheld computer with a large LCD touch-screen display that you ignore in favor of your 50" flat-screen.
The sensation of being split between the television and the handheld computer feels strange and awkward. But isn't this precisely how all of us feel today, all the time? Torn between the lush absorption of newly cinematic television and the lo-fi repetition of streams of text and image on our mobile phones and tablets? If the Wii attached to television's past, the Wii U couples to its present: still seemingly unassailable, the most powerful mass medium around, delivering more and more immersion annually, yet substantially eroded by tiny devices delivering quips, quotes, and cat photos.
Entertainment industry pundits have coined the term "second screen experience" as an explanation for this crisis. Television provides a high-gloss, low-information experience, and now that tablets, phones, and laptops are nearly ubiquitous and literally in our hands already as we sit on the couch, TV viewers increasingly split their attention between the prepared, cinematic experience of HDTV and the data-rich reference function of the internet.
But "second screen experience" is far too neat-and-tidy a name for this phenomenon. For one thing, it subordinates smartphones and computers to televisions -- perhaps wishful thinking on the part of the studio executives who deploy the second-screen rhetoric. But more importantly, it describes a far more stable and comfortable situation than the one that actually exists in today's dens and family rooms.
Whether it's tweeting real-time reactions during a presidential debate, looking up a seemingly-familiar actor on IMDB, or just scrolling through Facebook while a mediocre sitcom or drama drones on around us, we are no longer watching TV or using our computers -- nor are we doing both. Perhaps we're neither watching television nor conversing on the internet, in fact, but rather interacting with the strange, uncomfortable space between the two. Just as a lap appears only when you sit down, this weird interstitial space only exists when we activate both sorts of devices. It's not a two-screen experience, but a no-screen experience.
Many will miss this innovation and dismiss the Wii U as just another incremental change sold as faux-revolution. Internet purists will scoff, wondering why Nintendo was too timid to integrate directly with Twitter and Facebook. Hardware snobs will mock the GamePad for being neither fish nor fowl -- not portable, high-resolution, or general-purpose enough to replace an iPad or a DS. They'd be right, but they'd also miss the point.
If earlier Nintendo systems made video games safe for homes and families, the Wii U turns the tables: it attempts to make the current trends in the internet and consumer electronics safe for video games. It's the first earnest, sustained, hardware-invested example of such an effort, and it's full of risk and danger.
The console's "missing manual" title, Nintendo Land, helps shepherd players through Nintendo's unexpected gambit with contemporary culture. Those expecting to find a light-hearted, group-play experience akin to Wii Sports will be disappointed, but won't be justified in their disappointment. The Wii U is not just an HD Wii -- not at all. It's a double agent for both the entertainment and technology industries, playing both sides against the middle. It's split-attention gaming.
Nintendo Land's Mario Chase offers the simplest introduction to this central principle of the Wii U. In this hide-and-seek game, one player pilots a Mario-capped Mii on the GamePad screen, while the others control Toad-hatted Miis on the television screen, via a split view, attempting to find Mario. It's a simple enough idea, and no description can make it sound compelling. But strangely, it is compelling.
The view on the GamePad is also divided. A top-down map covers most of the screen, and a zoomed-in 3D view shows only a small area around Mario's current location. The player being chased devotes most of his or her attention to the map, which also displays the locations of the other players in pursuit.
Occasional glances to the 3D view are required to delineate between different types of terrain and obstacles, and occasional glances at the television screen or the other players on the couch also offer fodder for tactical adjustment.
Likewise, the Wii remote players might be tempted to steal glances at the secret information on the GamePad screen, an interesting evolution of the private, sonic cues that were possible with a Wii remote.
These players can also benefit from collaborating through verbal interaction, which the Mario player can hear and respond to as well. If Wii Sports activated the physical space between the couch and the television, games like Mario Chase activate the conceptual space between the couch, the TV, and a third, private screen.
A similar feeling arises from New Super Mario Bros. U. On its surface, the title is just another Mario title, more or less identical in play experience to New Super Mario Bros. Wii. The earlier game had promised collaborative gameplay that would allow players of different skill levels to work together, but in practice three or four players mostly got in each other's way -- particularly if one of those players was considerably less adept at maneuvering a platform character than the others.
The Wii U rendition of Mario offers an out: one player can act as a kind of assistant, touching locations mirrored on the GamePad screen to create temporary platforms that the active players can use in a pinch. The result helps a younger, less experienced, or less interested player participate in the game in a more meaningful way, while offering true benefit to the rest of a group.
Nintendo has been experimenting with this second screen idea for some time, but it's never really worked out (remember the Tingle Tuner?). New Super Mario Bros. U finally makes good on the idea, and it does so at least partly because we're now more accustomed to splitting our attention between different devices in front of the television.
Nintendo Land's single-player games also re-orient the player's attention. In Captain Falcon's Twister Race, based on the F-Zero franchise, the player holds the GamePad in a vertical orientation and rotates it to steer the vehicle. The television provides the expected 3D view of the track, while the GamePad offers a top-down, 2D view of the play area.
Thanks to its vertical orientation, more of the track is visible on the GamePad. But due to its 2D, top-down rendering style, it's much more difficult to discern obstacles on the GamePad, so glances up to the television become advantageous. In some cases, they are required: tunnels sometimes obscure track boosts when viewed top-down on the GamePad, and the player must steer by the television view in order to maintain enough speed to reach the next checkpoint.
The Zombie Console
The experience of Twister Race is fun and cheery on its surface, but strangely alienating underneath. There you are, having spent $350 on a new Wii U with accelerated 3D HD graphics, having climbed behind your receiver to route and plug in yet another HDMI cable, and you're staring at a lousy 2D image of the track you're not looking at on your giant LCD television. What the hell is going on?
Ubisoft's Wii U launch title ZombiU helps answer the question. This is an M for Mature offering, a survival horror game with a permadeath feature meant to appeal to the core gamer audience Nintendo has supposedly ignored. During play, the GamePad displays a map of the player's immediate surroundings and an inventory. It's also used to perform certain in-game commands. Moving and rotating the GamePad allows the player to look around on the television screen, but this maneuver fixes the player's position and thus increases vulnerability.
A one-liner on ZombiU's box copy helpfully summarizes that title: "Feel the tension mount as you try to keep an eye on your TV and controller screen." This is more than just marketing copy for a single game: it's a thesis statement for the entire console. The Wii U is a system thrust into the uncomfortable gap between mobile devices and televisions. Just as zombies are neither living nor dead, the Wii U occupies a similar limbo: today, entertainment in general and video games in particular are neither a televisual medium nor a mobile medium. They are not both, but they are not neither, either. They are something else, something uncanny, unsettling, out of place.
Nintendo Land is Nintendo like Tomorrowland is the Future
It's no secret that a large part of Nintendo's appeal comes from its long-running properties: Mario, Zelda, Metroid, Animal Crossing, Pikmin, and so on. The availability of New Super Mario Bros. U at console launch satisfies some of that craving, but Nintendo fanpersons are an impatient and finicky bunch bound to flood the internet with demands for new versions of their favorite games.
Over the years, some of those titles have marked significant shifts in the genres they represent: 1996's Super Mario 64 set the standard conventions for the 3D action-adventure game, and 1986's The Legend of Zelda made an important advance in what would later be called "open world" games. But overall, Nintendo's most famous and successful titles don't offer innovation so much as repetition. In today's game design community, where innovation is often fetishized but infrequently defined, Nintendo gets a tacit pass. A new Mario game is a new Mario game. Who doesn't want to play it?
But Nintendo Land doesn't offer a new Legend of Zelda or Animal Crossing or F-Zero or Pikmin. It doesn't contain mini-games either, exactly, since many titles are longer and more complex than the name mini-game usually affords. Instead, Nintendo Land offers renditions of possible games that are neither expandable into legitimate titles nor contractible into smaller vignettes. They are not video games so much as they are representations of video games.
Weird as this characterization may sound, the average player won't notice it, because the entire game is housed within the fictional conceit of a theme park. Individual games can be selected by menu if desired, or the player can pilot a Mii around a circular park and choose a game by entering a bannered portal. Playing games earns coins, which the player can spend in a pachinko-like kiosk at the top of the park's central tower, yielding curious décor that fills out the park's empty surfaces.
Theme parks are venues for abstraction. When you ride Peter Pan's Flight at Disneyland, you get a quick narrative and physical experience of the story and the film, but you hardly feel immersed in the holodeck sense of the term. Theme park attractions don't have to persuade visitors that they are real, for those visitors have already agreed to suspend disbelief and to partake of one real, physical world as if it were another.
Likewise, the games in Nintendo Land are not really games, but abstractions of games, icons that stand in for games that are not really present. Just as Tomorrowland isn't really the future and Adventureland isn't really an adventure, so Nintendo Land isn't really a Nintendo game, so much as a game evocative of the sensation of Nintenditude.
The entire title is rendered in a felted or crocheted style reminiscent of LittleBigPlanet, further emphasizing its false yet deliberately crafted style. Just as riding a theme park attraction draws an uncomfortable yet pleasurable dissonance between a source work or idea and a vertiginous physical and audiovisual experience, so playing Nintendo Land offers a strange new view on Nintendo's catalog. It's a pretend Nintendo; it's Nintendo admitting to pretense.
In the West we often forget just how traditionally Japanese Nintendo really is. Nintendo Land's oblique, evocative approach might be seen as sloppy or arrogant in the United States -- a failure to make a coherent collection of titles that explains the purpose of the Wii U through methodical demonstration.
I take it as a gesture of humility. Nintendo is stepping back, acknowledging that things have changed. That it can no longer make assumptions about what games are or what they should be. And that its players shouldn't either. This gesture of humility is a serious and profound one, in that it also refuses to accept the game industry's standard assumptions about the present reality of games as mobile, social, and free-to-play. Instead, Nintendo presents a substantial, costly effort as its pack-in title, whose overall message amounts to, "we don't know either."
So serious is Nintendo about this act, it has launched its console with an independent, downloadable title that openly mocks the current state of video games. Little Inferno was created by Tomorrow Corporation, a new studio formed by Kyle Gabler (of World of Goo fame) and Allan Blomquist and Kyle Gray (Henry Hatsworth).
The game is both cute and morbid: in a fictional city bombarded by snow for as long as anyone can remember, a toy company (also called Tomorrow Corporation) creates the Little Inferno Entertainment Fireplace, in which children can burn their toys to keep warm. The operation of this virtual fireplace forms the entirety of the game: a brick hearth appears on screen, upon which the player can set different children's toys before igniting them.
Tomorrow Corporation provides catalogs of new toys, which can be purchased with coins earned by... burning more toys. Once purchased, toys take time to arrive in the fireplace's inbox, which visually mimics the iOS dock in a not-so-subtle jab at the Apple app economy.
Earning combos by burning objects together in response to a list of clues provides tickets that can be used to speed up delivery, which can take several minutes per item by the end of the game.
It doesn't take much squinting to find Little Inferno's tacit message: games have become pointless grinds, absurd hamster wheel exercises meant only to produce their own continuance, to offer just enough novelty to imbue players with curiosity sufficient to press on in the pointless art of clicking on (or burning) another object.
The simulation of a social game-style energy mechanic outside of the context of a free-to-play game with micropayments makes an adept point: sitting there, in front of the useless fireplace that is the television, waiting for progress bars to fill, yields a frosty chill. Is this what players and creators want, or what they have been settling for?
So self-aware is Little Inferno that it even mocks Nintendo as host. One of the game's catalogs of flammables, "1st Person Shopper," contains video game-themed objects (including references to some popular indie games). Among the items in this catalog is a "handheld fireplace," shaped more or less like a Wii U GamePad. Upon ordering this object to burn, the player -- who probably purchased the virtual object whilst staring down at the GamePad instead of pointing a Wii remote at the television before him or her -- can't help but shiver with postmodern unease.
Little Inferno is riddled with the same kind of ambiguity that characterizes the Wii U itself. It is both a silly toy and a real adult game, with real themes worth taking seriously. It's both a game and the representation of a game, both in turn created by a real and a fictional Tomorrow Corporation. The game itself is unabashedly and shamelessly "indie," thus fulfilling Nintendo's desire to have independent titles on the eShop at console launch, while also remaining available on other platforms with "no copy protection" by guys with "no office" -- a deft rhetorical turn for young dudes who have probably burned the piles of cash they made from their previous games on swank cribs with Herman Miller-bedazzled home offices.
It's a subtle and surprising game about the horror of today's games that keeps its message close to its chest -- until the very end, when it reveals that message in the most heavy-handed, blatant manner possible. In so doing, Little Inferno embraces both the triviality of games and the naiveté of players and developers, while somehow still affording its creators the freedom to advertise such callowness as a "zero waste" aesthetic full of "polish." It's as "perfeccct as possible," the creators boast on their website, which is just to say, typographic tongue in cheek, flawed by design.
Little Inferno is lovely and awful in the way that most indie games are lovely and awful. Pretty and weird, unexpected and ironic, self-referential and earnest, full of youthful scorn for didacticism and pretense, in a manner delivered with moralism and ostentation. Little Inferno is the Wii U saying that it supports indie games, while also saying that indie games are sort of terrible, actually, and wouldn't you rather just save princesses and kill zombies instead?
I Can't Go On, I'll Go On
It's almost impossible to understand the Wii U in the abstract, without playing it. And even then you won't be sure of it, because the Wii U isn't sure of itself, and that's its greatest virtue. In an age when showy CEOs shout hubristic, trite predictions about the inevitable future of games, the Wii U offers an understated bravado that's far more courageous. With it, Nintendo admits, "we don't know either." We don't know what video games are anymore, or what they will become. It's a huge risk, and it's probably the most daring move Nintendo has made in its 125-year history. Domestication through polite ferocity. Feral design.
Still, let's not get too romantic: Nintendo's risk is not daring because the Wii U is good, necessarily. Many will lament what they will perceive as a step back for Nintendo compared to the "innovation" found in the Wii. They might be right. But the Wii U is serious in a way that Nintendo has never attempted. Even Nintendo may not have fully realized what it has done. It has domesticated the wildness of the present moment in video games, consumer electronics, the internet, and home entertainment by caging them out in the open. It's lurid and beautiful and repugnant and real, like watching Mickey Mouse smoke a joint in the alley behind Space Mountain.
We've all been assuming that games "growing up" means growing up in theme, tackling adult issues, achieving the aesthetic feats of literature and painting and film -- even if by "film" we usually mean "summer tent-pole movies."
But there are other ways to grow up. One involves embracing the uncertainty of one's own form and responding deliberately. That's what real art does, after all. It admits that it doesn't know what art is in theory, but only in practice. It gives the finger to its critics because it doesn't care if they like the results. Some among us keep asking for the Citizen Kane of games. Maybe Nintendo delivered something better, something weirder and more surprising -- particularly for a consumer electronics device. Not craft but soul, for once. Even Apple hasn't succeeded at that.
Around the time Nintendo was gearing up to license Disney's properties and become a major player in the playing card business, the novelist Samuel Beckett published The Unnamable. The book has no concrete plot or setting, and it's unclear if the work's characters and events are real or figments of the narrator's imagination.
Much of the novel is self-referential, with frequent ponderings over the possibility that the narrator is simply a construction of the language that forms the novel itself. The writing is a mess, full of despair and incoherence, long sentences flowing into one another to the point of illegibility. It ends with this famous passage, which now, surprisingly, improbably, could as easily have been written by Satoru Iwata or Shigeru Miyamoto:
They're going to stop, I know that well: I can feel it. They're going to abandon me. It will be the silence, for a moment (a good few moments). Or it will be mine? The lasting one, that didn't last, that still lasts? It will be I? ...
Perhaps it's done already. Perhaps they have said me already. Perhaps they have carried me to the threshold of my story, before the door that opens on my story. (That would surprise me, if it opens.)
It will be I? It will be the silence, where I am? I don't know, I'll never know: in the silence you don't know.
You must go on.
I can't go on.
I'll go on.