
Creating a More Inclusive Gaming Environment – the Role of Content Moderation

The cost of toxic content and abuse is steep for players and the industry.

Farah Lalani, Blogger

November 16, 2023

11 Min Read

We all remember the simpler days of video games: offline titles powered by the excitement of in-person encounters. Today's landscape is dramatically different in how and where players interact, spanning online, multimodal, and on-the-go platforms. The gaming industry is expected to maintain its recent rapid growth and could be worth $321 billion by 2026.[1]

With this growth have come several challenges for the gaming industry, not the least of which is toxicity in games. The cost of toxic content and abuse is steep for players and the industry alike: “28% of online multiplayer gamers who experienced in-game harassment avoided certain games due to their reputations for hostile environments, and 22% stopped playing certain games altogether.”[2]

According to a study by the Anti-Defamation League (ADL), 74% of adults have experienced harassment while gaming online, with female players disproportionately targeted.[3] Reach3 Insights, a market research company, found that “77% of women gamers experience gender-specific discrimination when gaming, including name-calling, receiving inappropriate sexual messages, gatekeeping and dismissiveness.”[4]

The gender bias in gaming was highlighted by Maybelline NY, the global cosmetics brand, which ran an experiment to see how male gamers would be treated when their voices and profiles were modified to appear more feminine. From the start, the experience of the two regular gamers was dramatically different from what they were used to: some fellow gamers abandoned the game as soon as they heard feminine voices; others spewed inappropriate messages, including insults and demands to go back to “female chores”.

There is extensive literature on the role of gender in the industry. Historically, video games have been designed for and targeted at male audiences, playing up heterosexual stereotypes and shaping how women interact with games and fellow gamers. 75% of respondents agreed that women in video games are sexualized to some extent. Furthermore, 50% of women who play games don’t feel represented when they look at characters in video games, compared to only a quarter of men.[5]

While progress is being made, challenges remain, and ongoing efforts are needed to make online video games a more inclusive and welcoming space for women and individuals of all genders. Promoting diversity and advocating for the respect and fair treatment of all gamers is essential to the continued growth and evolution of the industry. Enhancing Trust & Safety in gaming is critical to this inclusive evolution.

The State of Content Moderation in Online Video Games  

Trust & Safety is not new to the video games world, but there is no standard practice across the industry, which leaves a lot of room for policy development and interpretation. A good example is the debate over proactive versus reactive moderation: organizations are weighing how to monitor and control user actions without undermining the natural autonomy of the game. At the same time, a lack of proactive policy can mean exposing gamers to inappropriate content while waiting for a report or escalation.

As content moderation practices develop in the industry, here are some of the opportunities our team identified: 

  • Comprehensive yet simple articulation of policies and enforcement actions: Where content moderation policies exist in the gaming universe, some lack coverage, definition, or clarity in areas such as dehumanizing speech, incitement to violence, and extremism, or are written in a way that is not easy to understand.[6] Stakeholders need specific, easily understandable reasons for any actions the company takes on their account or content so they can improve their behavior in the future. According to the ADL, 59% of adult gamers believe regulation is necessary to increase transparency around how companies address hate, harassment, and extremism.[7]

  • Looking beyond content policies: We see the need to move from a content focus to an environment focus that takes a broader view. In multimodal games, for example, we need to look at voice, account creation characteristics, and other signals to focus efforts on the user groups responsible for most of the toxic behavior in games. Focusing on these signals can immediately remove a lot of hateful content, given that this behavior is concentrated in a small minority of users: typically, less than 3% of users are responsible for 30% to 60% of toxic content across all types of platforms. Once that small minority is dealt with, lighter touches such as nudges, warnings, and other forms of group behavioral modification can be used with the rest of the player base to curb any remaining toxic behavior.[8]

  • Operationalization: When it comes to safety, much of the user experience can be attributed to the way video game policies are defined and operationalized. Moderation of video game content will improve if policies are available in all languages of the markets where the platform is active; if policies are applied consistently across all users; if user reports are promptly adjudicated; and if there are quick and effective mechanisms of redress for people who disagree with content moderation decisions. Platforms must be nimble and modify or augment their policies frequently, given the velocity of change the field has seen in such a short time span.

Opportunities for Solutions: Building a Safer Video Games Environment 

The good news is that there are opportunities for growth and improvement. The industry is growing at an exciting pace, and this is the time to scale our efforts toward building safer, more trustworthy platforms.

Partnering with the right vendors is a great way to accelerate the Trust & Safety journey. Our experience working with some of the largest global platforms in video games, gaming, social media, and e-commerce has shown us that close synergies between platforms and their moderation partners foster agile learning, high quality and accuracy, and distinctive innovation. As Trust & Safety experts, these are some of the key recommendations we share with our partners along the way.

One of our first recommendations to new operations starting their Trust & Safety journey with us is to build a diverse moderation team that reflects the gamer base. According to a study by the International Game Developers Association (IGDA), diverse teams are better equipped to understand and address the unique challenges faced by different groups of players.[9] “Unintended bias towards specific groups of people, topics or context may be due to representation deficiencies or to the lack of social diversity in the group of moderators”.[10]

Diversity in moderation teams has a quantifiable benefit for platforms and the user experience. It has been established that the absence or reduced representation of women on moderation teams has a direct impact on the experience of the end user, in this case, gamers.[11] Understanding the composition of the player base underscores the community’s need for diversity: according to Statista, by 2021 the gender distribution of US gamers showed close to parity, with women making up 45% of gamers.

As gaming platforms’ user bases broaden, addressing implicit bias in the application of policies through diverse moderation teams will benefit users, increase gamer loyalty to platforms, and further platforms’ growth. This is also a reminder that diversity must transcend content moderators and permeate all levels of the organization. Ultimately, it is the work of many that will reshape gender norms in the ecosystem. “Rule-setting is subjective and reflects the biases and worldviews of the rule-setters.”[12]

As content moderation operations evolve, the first step is to set clear and transparent community standards and codes of conduct, which all gamers must agree to before participating. The core principles are transparency and strong communication, which reinforce gamers’ awareness of the consequences of their actions.[13]

Mike Pappas, CEO and co-founder of Modulate, suggests any external-facing community standards or codes of conduct be crafted for a broad audience, saying, "Many kids or other players, especially in games that match folks from very different cultural backgrounds, may simply not know which behaviors are or aren't toxic. When we talk about toxicity, the image of a ‘troll’ comes to mind, but more than 50% of disruptive content actually comes from these kinds of honest mistakes. This means a clear, readable, non-legalese Code of Conduct can be an incredibly powerful tool for level-setting community behavior expectations and cutting down on toxicity."

This is also a crucial time to define reporting protocols, empowering gamers to report toxic behavior and fostering a culture of accountability. According to a survey by the Fair Play Alliance, 85% of players believe reporting systems are crucial to a positive online gaming experience.[14] Developing the right workflows for reporting violative content from the get-go gives new platforms a strong start and a pathway to sustained success.
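As a minimal sketch of what such a reporting workflow might look like, here is a hypothetical report intake and triage step; the category names, queue labels, and `triage` routing rule are illustrative assumptions, not a description of any specific platform's system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class ReportCategory(Enum):
    # Illustrative offense categories; a real taxonomy would mirror the code of conduct.
    HARASSMENT = "harassment"
    HATE_SPEECH = "hate_speech"
    CHEATING = "cheating"
    OTHER = "other"

@dataclass
class PlayerReport:
    reporter_id: str
    target_id: str
    category: ReportCategory
    evidence: str  # e.g. a chat excerpt or match identifier
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def triage(report: PlayerReport) -> str:
    """Route a report to a review queue; the severity split here is an assumption."""
    urgent = {ReportCategory.HATE_SPEECH, ReportCategory.HARASSMENT}
    return "priority_review" if report.category in urgent else "standard_review"
```

Capturing evidence and a timestamp at intake is what makes prompt, consistent adjudication possible later.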

In parallel, designing and implementing effective penalties for offenders, such as temporary or permanent bans, can act as a deterrent. As mentioned before, categorizing different types of offenses and applying measures accordingly will educate the community and create shared accountability. Consistent enforcement of these penalties, from nudges, warnings, and other forms of group behavioral modification up to suspensions or bans, sends a clear message that harassment will not be tolerated.[15]
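A graduated penalty ladder of this kind can be sketched as a simple escalation function; the specific rungs and their ordering below are assumptions for illustration, not a recommended policy:

```python
def next_penalty(prior_offenses: int) -> str:
    """Return the next sanction on an escalating ladder.

    The ladder (nudge -> warning -> temporary suspension -> permanent ban)
    is illustrative; real policies would also weigh offense severity.
    """
    ladder = ["nudge", "warning", "temporary_suspension", "permanent_ban"]
    # Repeat offenders climb the ladder; the last rung caps the escalation.
    return ladder[min(prior_offenses, len(ladder) - 1)]
```

Encoding the ladder in one place is one way to get the consistency the article calls for: every moderator and automated system applies the same escalation.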

Artificial intelligence (AI) and machine learning algorithms can assist in content moderation. Automated tools scan chat logs, voice communications, and in-game actions to identify and flag inappropriate behavior. AI tools may be integrated to screen out toxic content directly and route content for human intervention. However, these solutions should always work in tandem with human moderators to ensure context is considered accurately.[16]

Trained moderators still outperform large language models, especially for edge cases or where additional information is needed; for example, in areas requiring more linguistic or cultural context, such as hate and harassment, there is still over a 20% performance gap between AI and human moderators, according to one data set from OpenAI itself.[17] Some AI-based tools are designed to let humans and AI complement one another, with the AI helping prioritize where moderators should devote their attention while leaving the moderators the final say on what, if anything, should be done.
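One common way to implement this division of labor is two-threshold routing on a model's toxicity score: only the clearest violations are auto-actioned, ambiguous content goes to a human queue, and the rest passes through. The thresholds and label names below are assumptions for illustration; any real deployment would tune them per policy area and language:

```python
def route_content(toxicity_score: float,
                  auto_remove_at: float = 0.95,
                  human_review_at: float = 0.60) -> str:
    """Route content based on a model's toxicity score in [0, 1].

    Scores above `auto_remove_at` are actioned automatically; scores in the
    gray zone are queued for a human moderator, who keeps the final say.
    Threshold values here are illustrative placeholders.
    """
    if not 0.0 <= toxicity_score <= 1.0:
        raise ValueError("toxicity_score must be in [0, 1]")
    if toxicity_score >= auto_remove_at:
        return "auto_remove"
    if toxicity_score >= human_review_at:
        return "human_review"
    return "allow"
```

The gray zone is where the human moderators' linguistic and cultural context matters most, which is exactly where the article notes AI still lags.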

Trust & Safety at the Core of Online Video Game Growth  

As the gaming industry continues to grow, it's imperative that gaming companies invest in comprehensive content moderation strategies that prioritize player safety and well-being. By leveraging technology, community engagement, and diversity, the industry can work towards a future where gamers of all backgrounds can enjoy their favorite titles without fear of harassment or abuse. 

The responsibility to create a safer gaming environment lies with everyone involved, from players and developers to policymakers and moderators. Together, we can forge a path toward a video game community that celebrates diversity, fosters inclusivity, and puts an end to toxic behavior.

[1] PwC, 2023. Perspectives from the Global Entertainment & Media Outlook 2023–2027.

[2] ADL, 2020. Free to Play: Hate, harassment, and positive social experience in online games 2020.

[3] ADL, 2021. Hate is no game: Harassment and positive social experiences in online games 2021.

[4] Reach3, 2023. Women in Gaming: What changed in 2022?

[5] Reach3, 2023. Women in Gaming: What changed in 2022?

[6] ADL, 2023. Caught in a vicious cycle: Obstacles and opportunities for Trust & Safety teams in the games industry.

[7] ADL, 2021. Hate is no game: Harassment and positive social experiences in online games 2021.

[8] ADL, 2023. Caught in a vicious cycle: Obstacles and opportunities for Trust & Safety teams in the games industry.

[9] IGDA. (2020). Developer Satisfaction Survey 2020.

[10] Human Rights Law Review, 2020. Content Moderation Technologies: Applying human right standards to protect freedom of expression.

[11] Ziyu Deng, 2023. Run, Hide, Fight: How Female Gamers Understand and React to Sexism and Misogyny in Gaming.

[12] Research Gate, 2020. Content moderation: Social media’s sexist assemblages.

[13]  Newzoo, 2021. Global games market report. 

[14]  Fair Play Alliance, 2019. Player experience data & insights report.  

[15] Fair Play Alliance, 2019. Player experience data & insights report.  

[16] Newzoo, 2021. Global games market report. 

[17] OpenAI, 2023. Using GPT-4 for content moderation. 
