Banning users on Twitch is becoming more efficient as the platform adds two new moderation tools: a feature that explains why a user was banned, and one that automatically blocks users who use preset terms and phrases in chat.

How the two moderation features work

Now, streamers can share their mod comments with another channel to explain why a specific individual was banned. For instance, if a user was banned for using a slur, the mods of the banning channel can share the context of that ban with another channel.

Additionally, Twitch is adding Shield Mode to Mod View. Once this feature is activated, streamers can add terms and phrases that automatically ban any user who uses them. Streamers can then review the bans and decide whether to unban the users, keep them banned, or report them to Twitch.

This new feature will be integrated into Mod View so streamers can moderate their channel from a single page. According to Twitch, this is an extra measure to keep communities safe, since users banned from one channel often pop up in others.

More blocking tools from Twitch

Aside from Shield Mode, Twitch also recently added a toggle to its blocking tools. When the toggle is on, streamers can block banned users from watching their streams in real time, not just from chatting. Overall, these moderation features seem like a step in the right direction. However, many Twitch streamers are asking for stronger tools, such as IP blocking, to prevent further harassment and bullying: despite being banned from a channel, users can still watch its streams by logging out of their accounts or creating new ones. Whether Twitch will roll out IP blocking, or is even considering it, is unknown at this time.