
Featured Blog | This community-written post highlights the best of what the game industry has to offer. Read more like it on the Game Developer Blogs.

Why do people play video games?

A brief overview of two psychological theories of motivation and their application to game design theory

Tom Ryan, Blogger

February 23, 2015

15 Min Read

Have you ever seen someone staring slack-jawed at a TV screen, controller in hand, and come back six hours later to find him sitting in exactly the same spot, with the same empty expression on his face? It's hard not to wonder what's going on there. We can probably assume this behavior doesn't occur in the wild, at least until we find cave paintings of a guy with an Xbox controller. But every behavior has an explanation. Gamers will tell you they play for fun. That, however, doesn't answer the real question: why do people find games fun in the first place? Why is anything fun?

A Gaming Expurriment

If you had to design a video game for a cat, how would you do it? You'd probably start with the assumption that cats like to chase mice, and then design around that. Cats are hunters by nature, and would probably enjoy swatting at mouse-like objects that squeak when they get swatted. People are much more complicated than cats, but the process is fundamentally the same: to make a game that people want to play, you have to understand what motivates them.

Gaming and Evolutionary Psychology

Motivations exist because they provide evolutionary advantages. If humans didn't feel the need to hunt, build structures, or make friends, we wouldn't have lasted very long. There are many theories of motivation, and they have a great deal of overlap. One of the simplest is the Four-Drive theory of evolutionary psychology. It has the following components:

The Drive to Bond
The Drive to Defend
The Drive to Acquire
The Drive to Comprehend

Another popular model is Self-Determination Theory. SDT has a strong focus on intrinsic motivation: things that people find satisfying without any external reward. Presumably, these "default behaviors" represent ends that were valuable in and of themselves to people trying to survive in a harsh, pre-historic world. In the absence of motivating external rewards, people fall back on meeting their psychological needs with behaviors that increase their chances of survival. The three main needs in SDT are:

Competence
Autonomy
Relatedness

There are many explanations for why people enjoy video games, but I think the most compelling ones rest on the principle that people play video games for the same kinds of reasons they do anything else. People have been playing games for millennia, and video games are probably just the latest incarnation. Sports have long been regarded as a civilized substitute for war-like impulses, and modern war games echo that need. Children play games to develop skills and roles they will use later in life, and video games sometimes work to that end. Addiction - when one misguided drive overrides all others - is another explanation, though for most people it may be more on the level of coffee addiction than narcotics. In short, video games allow people to feel as if they are meeting important evolutionary needs.

With the obvious exception of sex.

Self-Determination Theory

SDT is a motivational theory that focuses on people's intrinsic motivations, and it has been applied to many aspects of consumer behavior and marketing. The book Glued to Games, by Scott Rigby and Richard Ryan, focuses specifically on SDT's applications to game design. The following synopsis is a brief overview of the main ideas in Glued to Games. The three main psychological needs in SDT offer an explanation of what drives people to play video games.


Competence

Competence is a sense of mastery, learning, and progression. People enjoy growth and the feeling that they are developing skills. Almost all games employ an element of competence: if there's no possibility of making bad choices, there is essentially no game. But a game can have many other focuses, and exploration, acquisition, and storylines can all eclipse the competence element at various points.

Earlier games tended to showcase competence elements, because the other two aspects of SDT hadn't really taken hold in game design. Platformers, the old-school 2D games in which characters maneuver across a series of platforms, have competence at their core. Players must jump at the right times, avoid hazards, and learn from their mistakes. In games that focus mainly on competence, precision is a major design element. Gameplay is less about choosing from an array of good and bad options, and more about doing exactly the right things at the right times. Progression is a goal in itself, unlike in games that focus on rewarding the character through power-ups and acquisition. That sense of progression is particularly key to motivating players, because competence is fueled by the perception of growth and improvement. More modern games capitalize on competence by challenging players to earn "achievements," special goals that are generally peripheral to central progression. These provide a time-sink for perfectionists, or a leak-valve for players whose main-quest progression is stymied.


Autonomy

Autonomy is the desire to be in control of your decisions, and to make choices that impact the world around you. It's an extremely important consideration for game developers, because games are essentially nothing but choices. Surprisingly, autonomy elements were downplayed for much of the history of video games (with some excellent exceptions), and in many traditional games as well. Early video games were largely reactive, giving players only a narrow range of choices and self-determination. You didn't get to choose where you wanted to be in Pong: you were either in the right place, or you didn't return the ball. Shooters and platformers followed similar patterns. You may have chosen to dodge left or right, but you still had only a few similar options, and only one or two were correct. A lot of board games and card games were designed in a similar vein. Players rolled the dice in Monopoly, and then bought almost everything they landed on. Card games give people a few plays to choose from, and sometimes none of them can lead to victory (or, just as badly, all of them are good enough to win). This style of game design includes an element of skill, but downplays the feeling of having an expressive range of choices. Times have changed.

It's become a mantra in game development that interesting choices are critical. Allowing players to make significant alterations in the game's course, as opposed to the linear progression of games like Super Mario Bros. and Pac-Man, has produced the most popular games of the era. Grand Theft Auto 5, which generated a billion dollars of revenue in 3 days, is the archetypal open-ended exploration game. A player doesn't HAVE to go anywhere at any specific time, or buy any particular weapon, or talk to any particular character. He has to do something, but it can be anything. The game happens at the player's pace. It's still very easy to make bad choices and suffer the consequences, but there are many paths to success. Game series like GTA and The Elder Scrolls (of Skyrim fame) give people an opportunity to plan and strategize their path through the game. Not only is this fun in itself, it gives people something to chew on even when they're not playing, adding more value to the product.

The move from linear to open-ended games has been a significant trend in the industry. Whether it's through character development, open-ended progression, or vanity items and skins, most games give players far more choice than they did 20 or 25 years ago. Nonetheless, linear games can still do well; not everyone feels the need to express their autonomy through video games. But sprinkling in some additional choices is still a good way for developers to ensure that a game doesn't feel meaningless.


Relatedness

Relatedness is the desire to connect with other people. People are social creatures, so it was inevitable that video games would take a heavily social turn. But as with autonomy, many game designers downplayed multiplayer elements, or ignored them entirely, for a long time. The internet hasn't always been around, of course, but quality multiplayer gaming was still possible. Some companies, particularly Nintendo, stayed alive by catering to groups of people who wanted to come together by blowing each other apart. Many of Nintendo's biggest brands were games that were single-player capable, but far more popular for their multiplayer elements. Mario Kart and Super Smash Bros. have both been going strong for 20 years. Rare's GoldenEye 007, based on the James Bond film, was one of the most popular games of its era due to its 4-player arena matches.

Of course, that was then and this is now. Widely available high-speed internet has made social gaming the norm, and game consoles practically require wireless capability. Games like The Sims and Second Life have proven that games don't even really need to be games in order to hook enormous numbers of players. Real-life sims involve autonomy and acquisition elements too, but relatedness is at the forefront of their design.

Many other genres have converted partially or completely from single-player to multiplayer. RPGs, formerly the quintessential single-player genre, have spawned the popular MMORPG genre, linking thousands of players into the same world. First-person shooters and racing games, which began as single-player experiences, are now driven largely by their multiplayer elements. Even single-player games often include modules that link to some kind of multiplayer element, just because. And games of all genres include forums for players to share their experiences.

The Four-Drive Model

One way of thinking about video games is as a replacement for the real world. When gaming was relatively new, imaginations ran wild with the possibility of "virtual reality." For four bucks, you could get someone at an arcade to strap a TV to your face and run around in a very tiny arena with terrible graphics, shooting at other people who were incapacitated in a similar fashion. It wasn't exactly the Holodeck. Maybe the lesson is that games don't need to be.

It's Real If I Think It's Real

The human frontal lobe has been described as one giant experience simulator. No other animal has even a fraction of the capacity to imagine, dream, hypothesize, and plan like people do. The human mind is capable of perceiving things not for what they are, but for what they might be. A fly is only interested in a lump of sap if it has sugar in it. A person can imagine using it to glue his thatch roof together.

A consequence of having the kind of mind that can react to things which don't exist is the potential to engage with unreal constructs. When you watch a movie, you can feel sad for some of the characters, and you can hate other ones. Neither of them are real, but they are real to you, at least to an extent. Video games can similarly engage natural impulses that would otherwise be directed at real-world things. On some level, games allow us to live out our fantasies. 

The Four Drive model of evolutionary psychology has long been used as a motivational framework in organizational management. It offers valuable insight into the ways that games can engage our natural impulses, and correlates with some of the biggest trends in the industry.

The Drive to Acquire

The human drive to acquire is most obvious with physical objects, but it also includes status, personal experiences, and anything with continuity. That flexibility explains why people might value things in a game that amount to nothing more than pixels, ones, and zeros. One of the most sweeping trends in gaming, across most genres, is the addition of objects and achievements which are permanently stored on a game file.

In the early days of gaming, most genres focused on temporary things: Mario would get a mushroom, get big, and then shrink back to normal size as soon as he touched an enemy. You would race cars, and gain short-term bonuses that increased speed or durability, but they would fade. Now, most games involve long-term acquisitive elements. You don't just race cars, you acquire cash and build your car, retaining its added features in the future. You don't just complete a level, you finish it and get an "achievement" badge that is permanently stored and viewable to you, and it adds to your total of "achievement points." Even non-gaming technology uses methods like this. The CBS video player awards your account with five "coins" for watching a show. God only knows what you can buy with them.

The Drive to Bond

This drive compels people to form meaningful relationships with others, much like the need for relatedness. In the early days of video games, you could play 2-player games at most. More advanced hardware came along, allowing 4-player games, but a very large portion of the market still focused on the single-player experience. Many game designers didn't feel that multiplayer interaction was a significant force. The internet changed that completely.

Genres which had formerly been strictly single-player changed radically. The RPG genre, almost exclusively one-player, became the MMORPG industry. Players who used to walk around by themselves looking for the next cool sword were suddenly in a world with thousands of other people. Blizzard released an internet-capable multiplayer edition of its old Warcraft 2 title, and the RTS genre skyrocketed onto the gaming scene. The powerful combination of games that allowed people to feel both productive (through their drive to acquire) and socially engaged produced a new type of addiction. The now-famous EverQuest Widows support group attested to the fact that video games could indeed engage people more fully than real life.

Of course, not all games contain a multiplayer element. But these days, it's rare for games not to have some sort of community hub, like a forum or subreddit. The ability to talk with other gamers helps drive the social aspect, even if people aren't actually playing together. A social element is not required to make a game work, but it's a competitive evolution that most developers choose not to ignore.

The Drive to Comprehend

This drive concerns the fundamental purpose of higher consciousness: to make order out of chaos. The brain is an information-processing machine, and the human ability to learn, predict, and control is fundamental to survival. It's impossible for a game not to engage this faculty. The second you start playing something new, you are already wondering "so what's this all about?" But some games take this much further than others. Entire genres are built around puzzle-solving, exploration, and information-gathering.

Games like Myst challenged people to solve riddles, while the King's Quest series incorporated a large exploration element. The adventure genre is mostly extinct, but the exploration element has been rolled into many other genres. Many games present a fundamentally unknown world that they challenge you to unravel. Pattern-matching, one of the basic components of comprehension, has also been parlayed very successfully into games like Candy Crush and Tetris.

The Drive to Defend

The drive to defend is reactive. It's interesting that people might actually prefer to be placed in situations where they have to defend "themselves," since the primary purpose of this drive is to avoid danger or the loss of resources. But gaming just wouldn't be the same if there wasn't some dude swinging an axe at your head. Like thrill junkies, many gamers enjoy "danger." If a game is too easy and you're never threatened, it's a bad game. This could be because of an endorphin reward. Or maybe people enjoy exercising their defensive skills for the sake of practice. Perhaps there's a tie-in with competence fulfillment, as players are forced to overcome dangerous situations. Regardless of the reason, the fight-or-flight response has been a gameplay hook ever since the first grue was encountered.

The Best Thing is Everything

All the elements of SDT and Four-Drive theory can provide gameplay hooks, but the most popular games tend to have many drivers of gameplay. People don't always feel like being social, but they're more likely to commit to a game that gives them the opportunity to talk to other players when they feel like it. Not everyone wants to be constantly challenged, but a game that includes no elements of competence development is going to have a much narrower market segment than one that allows people to feel accomplished. And not everyone chooses to express their autonomy through games, but it's certainly a nice option. Natural trade-offs result from trying to incorporate more features into a game, but modern games have a much stronger tendency to employ an array of gameplay drivers than old-school games. And not just because hardware has evolved, but because design theory itself has evolved. Even board games are undergoing a resurgence as they incorporate modern styles of design, resulting in mega-hits like Ticket to Ride and Pandemic.
