This is a short version of my final thesis. You can find the full version here.
Introduction
Social VR games are among the most commonly used applications in VR. In these online social environments, players from different backgrounds can take part in casual conversations, content creation, or games. On the darker side, however, users can face behaviors that developers discourage or that would be unacceptable by real-world social norms. Social VR developers aim to build a healthier and more welcoming place for as many players as possible.
Even though social VR has many characteristics in common with other social media services or digital games, it fosters unique interactions that stem from its immersive quality. Managing the new behaviors that emerge from the freedom VR brings poses many challenges for developers, both technically and design-wise. There is no doubt that they need to take action against toxicity: it is not only good for their community and publicity, it also has an economic justification. Many players who have experienced toxic behavior feel anxious about going back to that game.
I tried to gain more insight into social VR and the types of toxic behaviors in these games, and to learn how various developers approach these challenges. I prepared an online survey and analyzed the data to answer some of my questions. But before moving on to the research, let’s go through some background on the topic.
Social VR
Social VR refers to online applications that focus on socializing in immersive virtual worlds. Besides socializing, it is possible to use these platforms to play games, create content, or role-play, among other things. VRChat and Rec Room are two of the most famous social VR games. They are both available for free on almost all VR and non-VR platforms. As you can see in the chart below, their concurrent player counts have been rising recently.
VRChat and Rec Room concurrent users, based on Steam Charts
VRChat was released in 2017 and is still in early access. According to Steam Spy, it currently has more than 5 million owners. It has some advanced moderation tools that I haven’t seen in any other VR game so far.
VRChat (2017)
Rec Room was released in 2016 and has around 1 million users. It is more activity-focused and has many official and community-made games and rooms. Unlike VRChat, which doesn’t allow children under 13 to use the game, Rec Room has regulations and settings that enable these users to play. It is therefore more popular among younger audiences.
Rec Room (2016)
Toxic Behaviors and Violations
Virtual reality is an immersive technology and a more personal experience. These qualities are believed to make VR a great platform for creating positive impact by giving a strong sense of presence. When wearing a VR headset, the user almost detaches from the real world. In this setting, any abusive or discomforting act is also received more intensely. Such behaviors can make the experience unbearable for many individuals and deprive them of it entirely, even though the virtual world has the potential to be an inclusive space for everyone.
Due to the emergent nature of these applications, players' interactions can be very unpredictable. This brings community management challenges that are exclusive to VR. Even common toxic behaviors are perceived differently in a VR setting. For instance, whispering a threat in an unsettling way, or accompanying it with a gesture, leaves a more intense effect on the victim. At the same time, the synchronous nature of social VR makes it harder to record these events or escape the hostile situation.
Based on Oculus research, toxic behaviors in social VR can be roughly divided into three categories:
- Verbal harassment includes excessive swearing, explicit sexual language, cyberbullying, inappropriate terms in usernames, threats to hack, hate speech, and violent speech.
- Physical harassment includes sexual harassment, such as touching someone in a sexual way or making sexual gestures, as well as stalking, blocking others' movements, entering others' personal space, or passing through others' avatars.
- Environmental harassment is carried out through virtual objects, offensive or violent content, or inappropriate avatars. In other words, these acts are usually intended to ruin others' experiences or exploit the system.
Other violations may not be considered harassment but are not acceptable in most social VR applications, such as children under 13 using social VR without any supervision, or various forms of privacy violations.
Regulation and Social Control Tools
Social platforms are ongoing products, and their existence depends on their community. When it comes to games, each developer is responsible for their own community. They also hold the most powerful tools to control their users and steer them toward better behaviors. Over time, more tools and options have been provided to change the outcome of a player’s unpleasant encounter; options like muting, kicking, reporting, or banning other users are well-known.
One more common tool in social VR is the personal space bubble, which either prevents users from getting too close to each other unintentionally or makes avatars invisible when they enter this range.
Some games also have a safe zone feature that allows players to quickly isolate themselves from their surroundings and mute all other avatars in the room. This puts users in a space where they can take further action without any disturbance.
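To make these two mechanics more concrete, here is a minimal sketch in Python of how a bubble and a safe zone might work on the local client. The class names, the one-meter radius, and the per-frame update are my own assumptions for illustration, not any game's actual code.

```python
from dataclasses import dataclass

# Assumed radius in meters; real games tune this and often expose it as a setting.
BUBBLE_RADIUS = 1.0


@dataclass
class Avatar:
    name: str
    position: tuple       # (x, y, z) in world coordinates
    visible: bool = True  # whether the local client renders this avatar
    muted: bool = False   # whether the local client plays this avatar's voice


def distance(a: tuple, b: tuple) -> float:
    """Euclidean distance between two points."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5


@dataclass
class LocalPlayer:
    avatar: Avatar
    safe_zone_active: bool = False

    def update_social_filters(self, others: list) -> None:
        """Run once per frame; only affects what this client sees and hears."""
        for other in others:
            if self.safe_zone_active:
                # Safe zone: hide and mute everyone until the player opts back in.
                other.visible = False
                other.muted = True
            else:
                # Personal space bubble: stop rendering avatars that intrude
                # into the radius, and restore them once they back off.
                too_close = distance(self.avatar.position, other.position) < BUBBLE_RADIUS
                other.visible = not too_close
                other.muted = False
```

Note that everything here happens on the victim's own client: the intruding player is unaffected and may not even know they have been hidden, which is part of what makes these tools low-friction.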
VRChat players can access some social control tools using this menu
One commonly used feature across all games is contacting the game’s support. Facebook plans to go further with its social VR platform, Horizon, by recording footage to investigate reports in more detail. Oculus has also launched a safety feature that lets users send video reports to its Safety Center. Some games offer support inside the virtual world, which means users can directly contact a virtual community manager avatar to report their case.
Facebook Horizon Safety Overview Video
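Neither Facebook nor Oculus has published how this footage capture works, but a common pattern for evidence features like this is a rolling buffer that keeps only the most recent seconds of footage and attaches them when a report is filed. The sketch below is a hypothetical illustration of that pattern; the EvidenceBuffer class and the 30-second window are my own assumptions.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class Frame:
    timestamp: float  # seconds since session start
    data: bytes       # one encoded video frame


class EvidenceBuffer:
    """Keeps only the most recent window of footage on the device."""

    def __init__(self, window_seconds: float = 30.0):
        self.window = window_seconds
        self.frames = deque()

    def push(self, frame: Frame) -> None:
        """Add the newest frame and drop anything older than the window."""
        self.frames.append(frame)
        cutoff = frame.timestamp - self.window
        while self.frames and self.frames[0].timestamp < cutoff:
            self.frames.popleft()

    def snapshot(self) -> list:
        # Called when the user files a report: the buffered clip is attached
        # as evidence, so nothing is kept unless a report is actually made.
        return list(self.frames)
```

A design like this matters for privacy: footage outside the short window is discarded continuously, and moderators only ever see the clip surrounding a reported incident.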
Another interesting tool is VRChat's trust system. The Trust and Safety system is a ranking system that determines players' trust level based on many variables. It is designed to shield users from annoying situations like loud sounds, visual noise, and other methods someone may use to ruin others' experiences. Ranks are also displayed on users' nameplates, and a rank called "Nuisance" is dedicated to users who have caused problems for others.
VRChat Trust System Menu
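VRChat has not disclosed how the trust level is computed, so the Python sketch below is purely illustrative: the input signals, weights, and thresholds are invented to show the general shape of such a system, where accumulated positive activity raises a player's rank and moderation flags can demote them to "Nuisance".

```python
# Hypothetical trust ranking -- VRChat's real formula is not public.
RANK_THRESHOLDS = [
    (0, "Visitor"),
    (20, "New User"),
    (50, "User"),
    (100, "Known User"),
    (200, "Trusted User"),
]


def trust_rank(hours_played: float, friends: int, uploads: int,
               upheld_reports: int) -> str:
    """Map assumed activity signals to a rank label."""
    # Players with enough upheld reports against them are flagged as a
    # nuisance regardless of their other activity.
    if upheld_reports >= 3:
        return "Nuisance"

    # Weighted sum of positive signals; the weights are invented.
    score = 0.5 * hours_played + 2 * friends + 5 * uploads

    rank = RANK_THRESHOLDS[0][1]
    for threshold, label in RANK_THRESHOLDS:
        if score >= threshold:
            rank = label
    return rank


print(trust_rank(hours_played=120, friends=30, uploads=2, upheld_reports=0))
# -> "Known User" (score 130 under these made-up weights)
```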
Each of these features works differently on each platform, and there is no standardized interaction for them in VR yet. This means users have to learn them anew in each game. Since using these tools interrupts immersion, developers try to implement them as intuitively as possible. For instance, Rec Room lets players mute or block a user by holding their palm up in front of them, instead of using a graphical user interface, as in the sketch below.
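Rec Room's actual implementation isn't public, but a gesture like this is typically detected by checking the palm's orientation against the direction to the target and requiring the pose to be held for a moment. The following sketch assumes tracked hand data, and the threshold values are invented.

```python
import math

HOLD_SECONDS = 1.5   # assumed dwell time before the gesture triggers
AIM_TOLERANCE = 0.8  # cosine similarity: the palm must roughly face the target


def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))


def _normalize(v):
    length = math.sqrt(_dot(v, v))
    return tuple(c / length for c in v)


class PalmMuteDetector:
    """Fires once the palm has faced another player for long enough."""

    def __init__(self):
        self.held_time = 0.0

    def update(self, palm_normal, hand_pos, target_pos, dt) -> bool:
        # Direction from the player's hand toward the target avatar.
        to_target = _normalize(tuple(t - h for t, h in zip(target_pos, hand_pos)))
        facing = _dot(_normalize(palm_normal), to_target)

        if facing > AIM_TOLERANCE:
            self.held_time += dt   # pose is holding; accumulate time
        else:
            self.held_time = 0.0   # gesture broken; restart the timer

        return self.held_time >= HOLD_SECONDS
```

The dwell time is what keeps the gesture from firing accidentally during normal hand movement, which is exactly the kind of tuning that makes diegetic moderation tools feel intuitive.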
Even though social VR and toxic online behaviors are complicated subjects, I think we have laid the groundwork for what comes next: the research.
The Research
My online survey on Google Forms
I shared the survey on Reddit, on my personal Twitter account, and on related Discord community servers. The survey received 106 responses. After removing incomplete or spam answers, I ended up with 96 responses.
The participants’ ages ranged from 12 to 49, with an average of about 21; many participants were 15-year-olds. Most responses came from male players and from European countries and the United States.