Twitch wants to make its streaming platform a safer, more inclusive space with a new moderation tool, AutoMod.
Previously, Twitch's moderation tools only let mods remove comments after the fact, but the new system gives streamers the means to stop malicious messages from ever appearing in chat feeds.
These changes aim to make Twitch a more usable platform for developers and streamers who may have been reluctant to use the service in the past for fear of aggressive and harmful comments.
AutoMod is designed to identify and remove inappropriate content from Twitch chat, delivering a more positive experience for streamers and other users.
It works by sending any messages it flags as inappropriate to a publishing queue, where a moderator either approves or rejects them.
Broadcasters can tweak AutoMod's filtering levels based on their own personal preferences, while the tool can also be used to detect inappropriate emotes, characters, and symbols that might evade more conventional filters.
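The flag-and-review flow described above can be sketched as a simple hold queue. This is purely illustrative: the class, the blocklist scoring, and the `filtering_level` threshold are assumptions standing in for AutoMod's actual (unpublished) classifier, not Twitch's real API.

```python
# Hypothetical sketch of an AutoMod-style review queue; names and
# scoring here are illustrative assumptions, not Twitch's actual API.
from dataclasses import dataclass, field

BLOCKLIST = {"badword", "slur"}  # placeholder terms for the demo


@dataclass
class ReviewQueue:
    filtering_level: int = 1          # broadcaster-tunable strictness
    published: list = field(default_factory=list)
    held: list = field(default_factory=list)

    def score(self, message: str) -> int:
        # Crude severity score: count blocklisted words (a stand-in
        # for AutoMod's real content classifier).
        return sum(word.lower() in BLOCKLIST for word in message.split())

    def submit(self, message: str) -> None:
        # Messages scoring at or above the threshold are held
        # in the queue for a moderator; the rest publish directly.
        if self.score(message) >= self.filtering_level:
            self.held.append(message)
        else:
            self.published.append(message)

    def moderate(self, message: str, approve: bool) -> None:
        # A moderator approves (publishes) or rejects (drops) a held message.
        self.held.remove(message)
        if approve:
            self.published.append(message)


queue = ReviewQueue(filtering_level=1)
queue.submit("great stream!")      # publishes immediately
queue.submit("you badword")        # held for moderator review
queue.moderate("you badword", approve=False)  # moderator rejects it
```

Raising `filtering_level` in this sketch mirrors the broadcaster-adjustable strictness the article describes: a higher threshold lets more borderline messages through without review.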
According to Twitch programming manager and inclusivity group lead Anna Prosser Robinson, the streaming giant is keen to effect meaningful change.
"One of the best ways we can help bring about change is to provide tools and education that empower all types of voices to be heard," said Prosser Robinson.
"AutoMod is one of those tools, and we hope it will encourage our users to join us in our continued focus on fostering a positive environment on social media."
For the time being, the official AutoMod app is only available in English. However, the Beta version supports a whole host of languages including Arabic, Czech, French, German, Italian, Japanese, Korean, Polish, Portuguese, Russian, Spanish, and Turkish.
Streamers can get their hands on AutoMod right away via an opt-in toggle on their settings page.