A think piece
This summer has seen multiple news stories of people, and especially children, being hurt financially and psychologically by dark design and behavioral monetization in AAA and mobile games. Talks about exploiting behavioral mechanisms for monetization circulate at conferences, and ethical concerns are dismissed by the companies that employ these methods. I am interested in the designers who come up with, and implement, these mechanics, and I want to explore the systemic problems within the games industry that entice people to work willingly on clearly amoral designs.
Stanford Prison Pachinko
In 1971, American psychologist Philip Zimbardo conducted a now-famous experiment. A selection of purposefully ordinary students were randomly assigned to two groups: prisoners and prison guards. It didn't take long before things went to hell in a handbasket, as guards started harassing and humiliating their fellow students physically and mentally, displaying in only six days the same behavior that, in other contexts and more serious situations, leads to murder, sexual abuse, and genocide. The experiment naturally had to be cancelled, but it wasn't Zimbardo (an expert in the very harmful processes he was studying) who realized after those six days that things could not continue like this. Rather, it took the untainted eyes of a completely uninvolved person - his future wife - to see that something was terribly wrong. Even Zimbardo himself, in his role as make-believe prison superintendent, had become a victim of the moral disengagement that was occurring.
I argue that the same is happening to us designers in the games industry. Yes, the setting is very different from a make-believe prison, and the players aren't prisoners whom we wake up in the middle of the night to perform humiliating actions. But still, we willingly (with more or less protest) design systems that we know will hurt people, mentally and financially. As I see it, very similar patterns exist in the games industry today as were seen in that university basement in 1971. And only if we know what is happening to us, and what we have to watch out for, can we act and create change. Make no mistake, these mechanisms do not only impact low-level designers at the bottom of the food chain; they work their way up every step of the corporate ladder, to the CEOs and even the shareholders. I don't think all of these cogs in the machine are bad people; rather, it is a systemic problem. Zimbardo's experiment can help explain what I think is happening here.
Cogs in the (Slot-)machine
Zimbardo names several independent mechanisms that cause immoral behavior and moral disengagement. Two of them are diffusion of responsibility and obedience to authority, which should come as no surprise to anyone familiar with the industry. The typical AAA studio has a top-down hierarchy - even if some would want you to think otherwise. The big design decisions about direction and scale are made further up the food chain, and it is left to the designers on the lower levels to figure out the details. With behavioral monetization, however, the devil is in the details: it requires extensive thought and knowledge about player psychology to do it right. Nothing would work without these low-level experts. Yet the system leaves plenty of room to place the responsibility for what is done on those at the top. How many game designers have quit their job when asked to create unethical designs? Not many, I presume, and I am not sure whether I myself would have the strength (and the social support structures) to simply get up and leave. Many more people argue at this point, hoping that they might be able to change something from within, but obedience to authority is still a mighty tool.
Social psychologist Stanley Milgram examined how susceptible we are to doing evil things when an authority figure tells us to. In his experiment, 65% of participants were willing to keep administering electric shocks to their partner even after his screaming had stopped - and that was in a setting where participants could directly hear the person they were harming. The big studios have their own ways of making employees comply with controversial assignments. Crunch is extremely common, and unlike designing predatory systems, it hurts the devs themselves. Every few months a new story emerges about game devs being coerced through "studio culture" into doing the bidding of their supervisors. Devs are bullied, or even fired, for unionizing, for trying to positively affect studio culture, or simply for speaking up. This creates a potent mix in which most people do as they are told. But there are further mechanisms that make us even less likely to "just speak up".
Endless chips on the table
A big factor that contributes to our moral disengagement is the anonymization of us developers. And boy, is this a common thing! Guards wore sunglasses and uniforms during the Stanford Prison Experiment, but the walls between game devs and players are even thicker. Anonymization is one of the main reasons why parts of the internet are as toxic as they are - and for game developers, anonymity is a prevalent luxury. In the public perception, companies have become personified: "Have you heard? Ubivision has created the new, super-greedy next-gen loot box, and Epicesda uses FoMO and pay-to-win to get their players to spend money!" It's the companies that are evil, not the developers who disappear behind the label. I am aware that developers' Twitter accounts are sometimes hit by community backlash directed at one big gaming firm or another, but having a publicly identifiable account that people can direct criticism to is a choice. It is much easier to do immoral things if no one is calling you out for it. Of all the guards in the Stanford experiment, only one started defying the dress code as things turned from bad to worse.
Much the same is true for players. While developers are anonymized by the barriers of the studio system, the players are deindividuated. We usually encounter the average player either as friends and family or as a faceless mass of internet users, and in the design process we usually cannot talk about them as individuals, but only as a combined entity. Even when we want to get more specific and talk about the different interests and types of people playing our games, we often fall back on player types. In some cases, we not only use deindividuating terminology but also words that can be seen as dehumanizing: developers of games that employ micro-purchases like to use terms such as "whales" or even "super-whales". Don't get me wrong, as a designer myself I find these terms as useful as the next person, but we have to keep in mind what they do to our perception of the people we once set out to create a good time for.
Roulettedown of morality
The system of game development seems almost designed to produce the outcomes we are seeing, but maybe that is just the nature of big, unregulated systems. I find it alarming to see it act upon otherwise good people, because I can see myself falling prey to it as well. We developers, and especially us designers, are vulnerable; we may find ourselves in a situation where we are pressured to act in a certain way, our resistance already worn thin because we don't see - have never met - the people we are harming. It is hard to feel empathy with labels, and we all just want to make a living, don't we?
I don't want to paint everyone who builds behavioral monetization as an immoral maniac. But I do want to open the eyes of those working on these systems right now, and of those of us who might be asked to in the future, to what is happening. Most game devs are great people, but under the right circumstances all of us can be directed to perpetuate horribly exploitative mechanisms. Selling gambling to children through loopholes in the law might not be the same as humiliating your peers in a cold, damp university basement. Exploiting the vulnerability of "whales" for financial gain might not be the same as shocking a friendly man with ever-increasing electric shocks until his screams stop. But there is one thing I have learned both from reading about moral disengagement and from following the news about behavioral monetization: unless we respond to social concerns and address the ethical implications of our choices, exploitation and systemic abuse can always get worse.
I want to thank Mata Haggis-Burridge for their advice prior to the publication of this piece.