Back in March, YouTube said that, because of the pandemic, it would rely less on its human content moderators and lean more heavily on machine learning-based systems. The result has been the most heavily moderated quarter in YouTube's history.

According to data provided by YouTube, the platform removed 11.4 million videos last quarter, almost twice as many as the previous quarter's roughly 6.1 million removals. YouTube did warn creators that they could see an increase in video moderation and removals. However, the company said it would not issue a channel strike for videos flagged by its automated systems. Channels would receive strikes only after a video had been put through human review and deemed to violate YouTube's Community Guidelines, and even then a video would lead to a strike only if YouTube had "high confidence" that it violated the guidelines. YouTube also said it would be more cautious about which content was promoted in search results, on the homepage, and in recommendations.
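As a rough illustration of the enforcement flow described above, the decision logic might look something like the following sketch. All names, fields, and the confidence threshold here are hypothetical assumptions for illustration, not YouTube's actual systems.

```python
from dataclasses import dataclass

# Hypothetical sketch of the enforcement flow described above; none of these
# names or thresholds correspond to YouTube's real systems.

@dataclass
class FlaggedVideo:
    video_id: str
    flagged_by_automation: bool   # flagged by the automated system
    human_reviewed: bool          # a human reviewer has looked at it
    violates_guidelines: bool     # current judgment on the video
    confidence: float             # confidence that the violation is real

HIGH_CONFIDENCE = 0.95            # assumed threshold; the real value is not public

def enforcement_decision(video: FlaggedVideo) -> dict:
    """Return removal/strike decisions for a flagged video."""
    remove = video.violates_guidelines

    # Automated flags can remove a video but do not, by themselves, trigger a strike.
    if video.flagged_by_automation and not video.human_reviewed:
        return {"remove": remove, "strike": False, "queue_for_review": True}

    # After human review, a strike is issued only with high confidence in the violation.
    strike = (
        video.human_reviewed
        and video.violates_guidelines
        and video.confidence >= HIGH_CONFIDENCE
    )
    return {"remove": remove, "strike": strike, "queue_for_review": False}
```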

The number of appeals increased as well

As you would expect, with more videos removed last quarter, more of those removals were appealed. Like removals, appeals nearly doubled, rising from 166,000 to 350,000. YouTube warned that the appeal process would take longer due to decreased moderation staffing. At the same time, the number of videos reinstated after appeal nearly quadrupled, from 41,000 to 161,000. That still leaves roughly 189,000 appeals that did not lead to the reinstatement of the video in question, but the share of successful appeals jumped from about a quarter to nearly half.
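Working through the rounded figures reported above, the appeal success rates come out roughly as follows (a quick arithmetic check, using only the numbers cited in this article):

```python
# Rough arithmetic on the rounded figures reported above.
prev_appeals, prev_reinstated = 166_000, 41_000
last_appeals, last_reinstated = 350_000, 161_000

prev_success_rate = prev_reinstated / prev_appeals   # ~25% of appeals succeeded
last_success_rate = last_reinstated / last_appeals   # ~46% of appeals succeeded

not_reinstated = last_appeals - last_reinstated      # ~189,000 appeals denied

print(f"Previous quarter: {prev_success_rate:.0%} of appeals led to reinstatement")
print(f"Last quarter: {last_success_rate:.0%} of appeals led to reinstatement")
print(f"Appeals not leading to reinstatement last quarter: {not_reinstated:,}")
```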

Balance between under-enforcement and over-enforcement

Due to the COVID-19 pandemic, YouTube was forced to cut back on its human moderation capacity. In the end, the company had to walk a fine line between under-enforcement and over-enforcement.

“When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement,” YouTube wrote. “Because responsibility is our top priority, we chose the latter — using technology to help with some of the work normally done by reviewers.”

In the end, the number of video removals nearly doubled, but the number of reinstated videos rose sharply as well. It's likely that YouTube's AI moderation systems were stricter in their enforcement of the platform's guidelines, given that a far larger share of appealed videos was reinstated than in previous quarters. As the pandemic continues, YouTube will probably keep relying on its AI systems to moderate content that is later put through human review.