YouTube is an incredible platform for video creators, enabling them to reach a far larger audience than they ever thought possible and to earn a profit by monetizing the content they publish. But creators need to stay aware of and up to date on YouTube's guidelines, which seem to change constantly.

2020 saw many changes in how people interact with one another, but for those using YouTube as their content platform, the policy changes made last year can affect both their creativity and their wallet. What were these changes, and more importantly, how do they affect the mindset of the content creator? What must be considered when making videos so as not to violate YouTube's guidelines? These changes are no minor matter: over 8,000 channels have been terminated for policy violations since September 2020, and it takes only three violations for an account to be permanently shut down. Knowing what to avoid is therefore of paramount importance.

It has long been known that certain subjects are not allowed, such as anything considered “hate speech” or potentially inciting violence. The prohibition also extends to specific topics, for example claims of fraud in the recent election, sexually suggestive material, and the promotion of drugs or of guns and gun-making. Topics that can be considered controversial, including those relating to war and political conflict, must be weighed as well.

Additionally, the content creator needs to decide whether the material is kid-friendly, meaning it contains nothing that could be considered adult content. And it is not just a matter of deciding this before posting new content; all of a creator's existing videos on YouTube must be designated as either directed at children (defined as ages up to 13) or intended for a mature audience. Since there can be “grey” areas in the mind of a content creator, such as video games with mature themes, decisions may need to be made in hard black-and-white fashion, erring on the side of caution.

A video's audio can also be a problem, as YouTube demonetizes videos that touch on sensitive subjects. There are words that cannot be said, and the algorithm that searches for them appears to be unforgiving. Strong profanity used repeatedly is a no-no, but if it is bleeped out, it is acceptable. Other words now deemed harmful can be discovered by researching the creators who have already been caught using them.

Here's an interesting one: the video thumbnail seen when scrolling through the YouTube page needs to be interesting and arresting in order to grab attention. But these “custom thumbnails” had better not be misleading, or trouble can ensue. Obviously, nothing pornographic or violent belongs in a thumbnail. Reviewing cases where creators produced problematic thumbnails, and were caught for it, is a good way to learn what is acceptable.

These new YouTube guidelines and restrictions have been put in place to protect viewers as well as advertisers, and are not due to any company bias, at least according to YouTube. A content creator who has been affected can protest through official channels, but there is no guarantee of success in the short or long term. 2021 is upon us, but the rules and restrictions now in place are not going away and will most likely only expand over time. So the reality is that a content creator 1) needs to review YouTube's policy guidelines as they apply to the type of content being created, and 2) should look at how similar content is being treated by YouTube in the “real world” in order to avoid stepping on landmines.