After the pro-Trump mob attacked the Capitol on January 6th, social media platforms began restricting Trump’s accounts, with several suspending him indefinitely. Twitch and YouTube followed suit in hopes of stopping the spread of misinformation.
YouTube started by removing Trump’s video in which he addressed his supporters on the day of the attack. In the video, Trump repeated misinformation about the election results, claiming he had won by a landslide. He also told his supporters that he loved them and that he thought of them as “special people.”
YouTube implemented new rules on January 7th, after the election results were confirmed, announcing a strike policy in a Twitter thread. The policy begins with a warning and then moves to strikes: if a channel receives three strikes for spreading misinformation within a 90-day period, it will be terminated.
YouTube took a broader approach by applying these restrictions to all YouTube channels. In the same Twitter thread, the company said, “we’ve removed thousands of videos that spread misinformation claiming widespread voter fraud,” and confirmed it had removed several videos from Trump’s channel.
An effort to prevent violence
Twitch suspended Trump’s account indefinitely, at least until the end of his term. In a statement given to TechCrunch, a Twitch spokesperson cited “the President’s incendiary rhetoric” and called the suspension “a necessary step to protect our community and prevent Twitch from being used to incite further violence.”
This was not the first time Twitch had suspended Trump’s account. His account was suspended temporarily in June for hateful conduct after he made multiple racist comments during a rally.
Platforms are fighting misinformation
These platforms are taking a stand against misinformation spreading within their spheres. Many of the events to come will play out on social media, where, hopefully, public discourse can continue without calls for violence.