YouTube has been highly public about its efforts to improve its algorithm. Yet despite updating that algorithm 30 times in just the past year, a new study suggests it is recommending videos viewers regret watching at an alarmingly high rate.

According to a crowdsourced Mozilla study, about 71 percent of videos that viewers regret watching were recommended by YouTube’s algorithm. Additionally, nine percent of the videos viewers regret watching violated YouTube’s community guidelines and copyright policies; YouTube eventually removed these videos for violating its guidelines.

Overall, respondents regretted watching the videos because they contained misinformation, hate speech and sexualized material.


Mozilla study looked at a year’s worth of YouTube videos

In total, the study surveyed 37,000 people. Each participant downloaded a Mozilla browser extension named RegretsReporter, available for both Chrome and Firefox, and used it to flag every video they regretted watching. The study ran from July 2020 to May 2021.

Some are questioning the study

Some are questioning the validity of the study. For instance, Platformer founder Casey Newton, writing in his newsletter, suggests the narratives formed from the study could be misleading. He notes the algorithm’s recommendations make up the majority of watch time on YouTube, so it’s natural they would account for the highest share of regretted videos.

When CNET asked, YouTube claimed the opposite — stating, according to its internal data, users are happy with its algorithm. YouTube also questioned the validity of the study’s data, pointing out the study’s definition of ‘regrettable’ isn’t clear.

It’s also possible the study doesn’t accurately reflect the changes YouTube has made to its algorithm over the past year. While the study may capture some of the participants’ recent feelings about the algorithm, data collected at the beginning of the study period could skew the overall results. YouTube has changed since 2020, and viewers’ feelings could be different now than they were a year ago.