Microsoft has spent the first half of 2022 eliminating millions of automated Xbox accounts.

Justin Carter, Contributing Editor

November 14, 2022

Microsoft has released a Digital Transparency Report in an effort to show the steps being taken to create a safe community for Xbox players. And one of the key highlights from that report is that the Xbox team has taken punitive action against 4.78 million accounts between the period of January 1-June 30, 2022.

Player toxicity has become a larger focus amongst game developers and publishers this year. Developers have taken stronger public stances to condemn toxic members in their respective communities, and some have even gone so far as to implement gameplay measures to discourage disruptive behavior.

In its report, Microsoft defined its actions as "enforcements," which range from removing the offensive content or temporarily suspending an account to shutting that account down completely and removing said content.

Of the accounts enforced, it continued, 4.33 million were flagged for activities like cheating or creating inauthentic bot accounts. By comparison, enforcements for other activities said to "ultimately create an unlevel playing field for our players or detract from their experiences," such as adult sexual content (199,000) or fraud (87,000), were far lower.

During that six-month period, cheating and inauthentic accounts accounted for 57 percent of overall enforcements.

In total, Microsoft suspended 4.5 million accounts during that six-month stretch of 2022.

In terms of reactive measures (read: actions made in response to player reports) during that same period, the Xbox team took enforcement action against 2.53 million players. 46 percent of that number was made up of communications, such as a message or a post left on an activity feed.

The end of Xbox's report shows that nearly 33.1 million players were reported during the first half of 2022. Compared to the 52.05 million reports in the same six-month period of 2021, that's a 36 percent decline.

And looking further back, there were 45 percent fewer reported accounts compared to the number reported between July 1-December 31, 2020.

Microsoft's first transparency report provides some insight into how its community moderation process works for the social-heavy Xbox platform. As future reports arrive every six months, more statistics will emerge for those curious about how to foster a healthy community.

About the Author(s)

Justin Carter

Contributing Editor, GameDeveloper.com

A Kansas City, MO native, Justin Carter has written for numerous sites including IGN, Polygon, and SyFy Wire. In addition to Game Developer, his writing can be found at io9 over on Gizmodo. Don't ask him about how much gum he's had, because the answer will be more than he's willing to admit.
