In early 2017, when the Wall Street Journal published an article alleging that PewDiePie, one of YouTube’s biggest stars, had sympathized with Nazis on his channel, advertisers took note. When it was then brought to their attention that their ads were running alongside hate content, including ISIS material that was getting through YouTube’s filters, they began pulling their ads en masse. YouTube knew it had to do something about this massive loss of income.
On April 6, 2017, YouTube instituted new rules and policies for creators to abide by in order to bring back YouTube’s advertising money. Even the creators who’ve obeyed these new rules have found themselves making less income than they did before. PewDiePie nicknamed these events “the Adpocalypse,” and rightly so. Many content creators have taken significant losses to their income, and their creativity has been put in a stranglehold. Many have even left YouTube altogether, as they feel that they can no longer make a sustainable living under the new format.
Let’s take a look at the situation:
The New Rules as of April 6, 2017
On the surface, a lot of these rules appear to be common sense and reasonable. However, when you look closely, a lot of legalese and ambiguity has been written into them so that Google can cover itself legally and simultaneously raise its advertising revenues, with little concern for how the rules affect the average creator.
Probably the most widely accepted new rule is that all channels must now go through a new Partner Program application process to be approved for AdSense. No channel will be accepted into the program until it has accumulated 10,000 total views across all of its videos; this is designed so that Google can weed out pirates and policy violators before payments are made. A YouTube spokesperson has assured YouTubers that even more safeguards are on the way. Other rules include:
No Language that YouTube’s Algorithms Determine to be Inappropriate
There is a sentence here that reads: “Video content that contains frequent uses of strong profanity or vulgarity throughout the video may not be eligible for advertising. Occasional use of profanity won’t necessarily result in your video being ineligible for advertising, but context matters.” Note the use of indistinct words such as “frequent,” “may,” “occasional” and “won’t necessarily.” These vague words will make it difficult for content creators to abide by the rules that YouTube has set.
No Sexually Suggestive Content
There are probably millions of websites out there where sexual content can be posted. YouTube was set up as a family-friendly place for videos to be posted and, as such, it bans sexually suggestive content to keep things clean. There is one disturbing clause in YouTube’s description of “sexual content,” and it reads, “(content) such as video content where the focal point is nudity…” If there weren’t so much more to cover in these mystifying rules, we could go into an argument here about why simple nudity should not always be considered “sexual” and whose interpretation of “sexual content” we’re going with, but we are moving on, as there is so much more to (ahem) cover.
No Violence
There is no place for promoting violence on YouTube or anywhere else, for that matter. However, one has to wonder where the line is drawn. What constitutes violence, anyway? Whose definition of violence do we go by? Apparently, we use YouTube’s bots’ definition. YouTube does explain that it is for video where the violence is “presented without additional context.” Still, that’s very vague and leaves a lot of room for interpretation.
The night before I wrote this, I saw a video of a man in Thailand taunting a bear with a bowl of rice. He thought that he was safe, since the bear was behind a cage. He was dead wrong. I think that is a very “violent” video, yet it had 8,888,887 views at the time, despite the “graphic content” warning. YouTube adds: “If you’re showing violent content in a news, educational, artistic, or documentary context, that additional context is important.” That only adds more ambiguity, and we have also been advised that this content would not be monetizable anyway. It would be impossible for YouTube, or anyone else, to define with 100 percent certainty what violence is and isn’t allowed.
Then we have Grand Theft Auto, Call of Duty, World of Warcraft, Rainbow Six Siege and many, many more T- and M-rated online games. A majority of the most popular YouTube content creators regularly play these games on their channels. Moreover, they have millions of subscribers who love to watch them play their first-person shooters and other violence-focused games. These are not the kind of games that you’ll find on Disney!
YouTube says, “Violence in the normal course of video gameplay is generally acceptable for advertising.” Looks like they’re okay! Just watch the language. According to the above-mentioned language rule, “Occasional use of profanity won’t necessarily result in your video being ineligible for advertising, but context matters.” Oh, they just had to throw in those last three words, didn’t they? Well, that should add plenty more obscurity — and anxiety — to your content creation conundrum.
No Hateful Content
Well, this is good. And it should be easy to follow, right? Not exactly.
In their new rules, YouTube explains that “content that promotes discrimination or disparages or humiliates an individual or group of people on the basis of the individual’s or group’s race, ethnicity or ethnic origin, nationality, religion, disability, age, veteran status, sexual orientation, gender identity, or other characteristic that is associated with systemic discrimination or marginalization is not eligible for advertising.” That’s fine, but then they go on to make it completely abstruse by adding, “Content that is satire or comedy may be exempt; however, simply stating your comedic intent is not sufficient and that content may still not be eligible for advertising.” This is probably in reference to PewDiePie, who defended himself against claims that he supports Nazi ideology by pointing out that those videos were never meant to be taken seriously.
No Harmful or Dangerous Acts
This is probably a good rule, but for the wrong reasons. Would taunting a 700-pound bear with a bowl of rice qualify as a “dangerous act”? It should. Then again, a video that shows just what happens when such bullying goes wrong could also qualify as educational material.
“Jackass” was a truly dangerous show, and not just to the stuntmen on screen. The problem with content like this, as well as professional wrestling, is that some people try to emulate what their heroes do. And in the case of “Jackass,” they were doing a great number of things that were very poorly thought out. At least professional wrestlers plan and practice their moves, although some have died and many have been maimed when their stunts went wrong.
While people should be smart enough to know when and when not to emulate what they see on YouTube, not everyone is, so YouTube makes arbitrary rules such as this. It’s damaging to creators and damaging to their responsible fans, but it helps Google’s lawyers sleep much better at night.
No Controversial Issues and Sensitive Events
It’s funny how the Adpocalypse arrived shortly after the most recent presidential election in the US, which occurred on the heels of Brexit, both of which were, and still are, quite controversial. According to the new rules, discussion of politics is verboten if you want to monetize. But if you go to almost any other area of the internet, that’s about all that’s being discussed.
While there are surely many things on YouTube that can be discussed without touching on the destruction of the environment, immigration and, well, everything else, it’s really difficult to avoid, considering how all of it affects our everyday lives. Shows such as The David Pakman Show, The Jimmy Dore Show and Secular Talk have taken to Patreon in order to pay the bills since ad revenue has dried up.
No Drugs and Dangerous Products or Substances
This is what YouTube has to say on this subject: “Video content that promotes or features the sale, use or abuse of illegal drugs, regulated drugs or substances, or other dangerous products is not eligible for advertising. Videos discussing drugs or dangerous substances for educational, documentary, and artistic purposes are generally eligible for advertising, so long as drug use or substance abuse is not graphic or glorified.”
This may sound pretty obvious on the surface, particularly the “dangerous” part, but it gets really tricky at “illegal.” Where I live, and in many other states, marijuana is legal. But in much of middle America and in many foreign countries (YouTube is an international platform, after all) it isn’t.
No Inappropriate Use of Family Entertainment Characters
YouTube explains this as, “Videos depicting family entertainment characters or content, whether animated or live action, engaged in violent, sexual, vile, or otherwise inappropriate behavior, even if done for comedic or satirical purposes, are not eligible for advertising.” Would this be Peter Griffin running around in his “Donald Duck outfit”? Or are we talking about more graphic content than that? There is more ambiguity here.
No Incendiary and Demeaning Content
YouTube states that, “Video content that is gratuitously incendiary, inflammatory, or demeaning may not be eligible for advertising.” Yes, there is more ambiguity here, but they go on to add a bit of context by explaining, “For example, video content that shames or insults an individual or group may not be eligible for advertising.” With that explanation, this really should fall under the earlier “hateful content” rule. It’s a good rule, but it’s also redundant.
Your First Amendment Rights
Because YouTube is wholly owned by Google and not a government agency, the First Amendment simply doesn’t apply. They can choose what content goes on their website and they can choose what they will — and will not — pay for. Until and unless a competitor appears that can challenge them — one that is willing to choose freedom of expression over advertising dollars — the alternatives aren’t very good.
Those are the new rules. You don’t necessarily have to abide by them, but you won’t be seeing a check from YouTube if you don’t.
The New York Times said it best when they wrote, “If YouTube wants to fulfill its promise of an online environment where independent creators can make interesting work, it will find a way to scrub ads from truly vile content without penalizing the merely controversial.”
YouTube promises that their AI is continually being upgraded and improved to make this a reality. They also tell us that many creators who experienced a huge dip in their income when the new rules came out are now back up near the levels that they were receiving before the new rules took effect — though not all creators would agree. For many, the Adpocalypse may leave a permanent scar on the platform’s reputation and relationship with creators.