In this free GDC Vault talk, Victoria Tran (community director at Innersloth and Wholesome Games organizer) and Kat Lo (research affiliate at the Center for Responsible, Ethical, and Accessible Technology, University of California, Irvine, and content moderation lead at Meedan) lay out direct strategies for smooth, effective moderation during any kind of livestream. Drawing on years of experience in various digital spaces, Lo and Tran suggest practical tactics: pre-planning and strategic "heavy lifting" done ahead of time, so moderators have clear guidelines to work with.
“So the main things that you want to have ready... This is for any audience size, especially in a large corporation with a large following,” Tran says, after walking through a number of potential pitfalls and laying out the start of a solid moderation plan. “You want to be super prepared with a code of conduct and rules, bots [chat bots, a feature of most streaming services like Twitch], an emergency shutoff like an emote-only mode, a moderation guide, a run sheet, and moderators.”
Tran also emphasizes the crucial importance of a test run.
“Do make sure you do a test run!” she says. “You can have everything in place. But if you have mods and you have no idea how to access the tools or your toolbox, it is totally useless. Especially when you're in a moment of panic.”
The code of conduct is also an absolute must-have, even if you don’t expect most users to actually go in and read it.
“The code of conduct and rule set is really important…” she says. “But even if people don't read it, it sets the stage for the chat, the kind of vibe you're looking for, and can be easily referenced. It sets up more positive norms and, you know, lets you know which are the negative ones, which can be really important when you have a really large global group.”
In her section of the talk, Lo goes deep on the need for specificity in moderation, including the consideration of gray areas and edge cases, and the utility of an action matrix for specific behaviors.
"I think you should essentially create an action matrix of what actions to take in response to what violations," she says, noting that "action matrix" may not be the perfect term, but it works for the concept. "You can see that we have escalating levels of code of conduct violations with the action that we take against it. So there's levels zero through four," she adds, gesturing at her slide.
"I'll take you through a couple of examples, actually. So level zero is no action. Section two indicates what shouldn't be taken down. And I actually think this is a really essential part of a moderation guide, because it gives you a good sense of contrasting what you should take down versus what you should not: like, it's much more concrete."
“You get [a] much more concrete understanding, and I think it makes things a lot more explicit for discussion. So like, okay, 'profanity, or no profanity?'"
"Or criticism of the game or criticism of the streamer, as long as it's not harassing or something. Some people might actually have a different idea where they say, like, 'we don't really want a negative vibe, so we're actually going to take down some of this criticism,' which, you know, has its own implications. And then level one, what we typically have is an escalating timeout. So it's like, if you want to ban somebody for text: 10 seconds, 10 minutes, an hour, and so on."
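The action matrix Lo describes is essentially a lookup table from violation level to moderator response, with level one implemented as an escalating timeout ladder. A minimal Python sketch of the idea (the level labels beyond what the talk mentions, and the exact durations, are illustrative assumptions, not the actual matrix from the slide):

```python
# Hypothetical action matrix: escalating code-of-conduct violation levels
# mapped to moderator responses. Levels 0 and 1 follow the talk's examples;
# levels 2-4 are assumed placeholders for illustration only.
ACTION_MATRIX = {
    0: {"action": "no action",
        "examples": ["profanity", "non-harassing criticism of the game or streamer"]},
    1: {"action": "escalating timeout",
        "durations_seconds": [10, 600, 3600]},  # 10 s, 10 min, 1 hour
    2: {"action": "message removal"},   # assumed
    3: {"action": "channel ban"},       # assumed
    4: {"action": "report to platform"} # assumed
}

def next_timeout(prior_offenses: int) -> int:
    """Return the timeout (in seconds) for a level-1 violation,
    climbing the ladder with each repeat offense and capping at the top."""
    ladder = ACTION_MATRIX[1]["durations_seconds"]
    return ladder[min(prior_offenses, len(ladder) - 1)]
```

Writing the guide down as a concrete table like this is exactly the point Lo makes: it forces explicit discussion of what each level covers, instead of leaving calls to in-the-moment judgment.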
Watch the talk above for the full rundown!