An image of the YouTube homepage with recommended videos in dark mode

Content Moderation on YouTube and its Effects on Users

Elise Stegmann
11/01/2021

On February 14th, 2005, three former PayPal employees registered the domain name YouTube. In the ever-expanding world of social media platforms, YouTube stands out as a space where anyone can upload videos and share them with others. The website was originally meant to be a dating platform on which people uploaded videos of themselves. When that idea failed, the site turned out to be an ideal place to upload and share video content of any kind, and it became a general host for videos. The website grew in popularity and was bought by Google in October 2006. YouTube's motto used to be “Broadcast Yourself”, but as the culture and intent of the site changed, the slogan was retired.

After Google took ownership of the website, it introduced advertisements and, in 2007, the YouTube Partner Program. This meant that people who uploaded videos could now earn money from the advertisements shown around their videos. In 2013, a major change was made to YouTube's popularity and promotion algorithm. Where the algorithm had previously been based on clicks (how many times a video was clicked on), it was now based on watch time. The clicks-based algorithm had encouraged a flood of clickbait: videos with catchy titles and thumbnails that lacked the content they promised, yet were promoted because of the sheer number of clicks they attracted.

In February 2017, several major brands boycotted YouTube as an advertising platform because their ads were playing before videos that featured extremist views. Companies such as L’Oréal, the Guardian, Marks & Spencer, and Audi pulled their ads from YouTube. As The Times put it in one of the first articles published on the scandal, “Household names unwittingly pay extremists and pornographers.” One example was an old video by Felix Kjellberg, known as PewDiePie, that featured antisemitic content and jokes about Hitler. This incident and its repercussions were dubbed the YouTube 'Adpocalypse' (Weise, 2017). This article examines how the YouTube Adpocalypse affected content moderation on the platform.

Appeasing advertisers

In August 2016, YouTube shifted its focus to “family-friendly” content in an effort to make the platform a more advertising-friendly space. Content that does not follow the ‘advertiser-friendly content guidelines’ is ‘demonetised’, meaning the video will not generate ad revenue for creators in the YouTube Partner Program. Ads might still be run on the video, but the money generated by these ads does not go to the creator.

The ‘advertiser-friendly’ guidelines can be found on the Google support website and cover most of the content YouTube does not tolerate, as well as the consequences for videos that do not follow these guidelines. Content violating the guidelines includes inappropriate language, violence, shocking and adult content, and controversial issues and sensitive events. If any of these are flagged in a video, the video is automatically demonetised. The creator can then appeal to YouTube to have monetisation reinstated through human review. Google notes in the advertiser guidelines that its automated systems are not always correct, which is why the human review option was added.

YouTube’s content moderation

All platforms must moderate content to a certain degree to comply with (inter)national laws and protect users from one another and possible malicious intent. Moderating what can be posted on a platform is also done in an effort to appeal to potential new users as well as advertisers and investors (Gillespie, 2018).

The difficulties of content moderation centre on three issues. First, there is the question of what is and is not acceptable, and which laws and values govern that acceptability. Second, different stakeholders stand to benefit or suffer from how content is moderated. Third, there is the question of how moderation should be executed. Both technology and human labour are needed to moderate all the content on a platform; technological detection is not without its flaws, and people are needed both to teach the technology how to flag inappropriate content and to correct it when it errs (Gillespie, 2018).

In the case of YouTube, the advertiser-friendly guidelines are not the only way the platform moderates content. The Community Guidelines dictate what content is and is not allowed on the platform, as well as what type of user behaviour is tolerated. The intent of the Community Guidelines is stated at the top of the guidelines' webpage:

“Our Community Guidelines are designed to ensure our community stays protected. They set out what’s allowed and not allowed on YouTube, and apply to all types of content on our platform, including videos, comments, links, and thumbnails.”

These intentions follow the reasoning Gillespie (2018) sets out for content moderation. The Community Guidelines cover spam and deceptive content, sensitive content, regulated goods, and violent or dangerous content. This overlaps with the advertiser-friendly guidelines in all but deceptive content.

For example, the regulated goods section of the guidelines names firearms as content that is not allowed on YouTube. The rule states that videos selling firearms, or giving instructions on how to manufacture them, are not permitted. The advertiser-friendly guidelines, in turn, state that the sale of firearms and instructions for their assembly, among other goods, are not suitable for advertisers. According to the Community Guidelines, content that does not follow the rules will be removed and the channel that uploaded it will receive a strike as a warning. The advertiser-friendly guidelines only say that the content will be demonetised. It seems that when a video from a creator in the Partner Program is flagged for inappropriate content, it is demonetised, whereas a video that carries no advertisements is simply taken down.

This double set of rules was made to appease the companies advertising on YouTube. Advertising revenue is how both YouTube and its content creators make money. The rules around content moderation were not created to aid content creators on YouTube; instead they have caused confusion and indignation among users.

To a creator’s horror...

Following the implementation of the new guidelines in 2016, only channels that were considered ‘family-friendly’ still had ads run on their videos. In an attempt to remain monetised, or to regain monetisation, creators took down videos and changed thumbnails, titles and descriptions. YouTuber Glam&Gore, or Mykie, is a special effects (SFX) makeup artist whose content includes tutorials on how to create different ‘gore’ looks from horror movies and SFX spins on well-known characters. She has repeatedly stated that her content is instantly demonetised because YouTube considers her thumbnails to feature violent or shocking content. She has since started censoring her video thumbnails.

Censored thumbnail by Mykie from Glam&Gore

In a video titled ‘Why I Stopped Doing FX Makeup On My Channel As Often’, Mykie specifically addresses the situation surrounding demonetisation and SFX makeup tutorials. In the video she proceeds to create a demon look, while only ever talking about ‘making cupcakes’. She does not mention anything related to horror and gore, and censors anything that could get flagged as inappropriate. Unfortunately, the video was still demonetised.

In the first eight minutes of that video, she talks about the frustration she has felt surrounding her content and that of many creators like her. Mykie says she is terrified of her videos being demonetised because it happens so unpredictably. The videos are demonetised seemingly at random: just when she starts to see a pattern, YouTube changes the algorithm.

Apart from the lack of clarity on demonetisation, she also explains how people stopped being notified about her uploads. Numerous fans sent her tweets saying they had not realised she had uploaded because they never received a notification from YouTube. These fans showed that there was no apparent reason for them not to receive the notifications, yet it kept happening. YouTube was no help, and kept sending Mykie and her fans in circles, insisting there was nothing wrong with either the videos or the notification system (Glam&Gore, 2018).

Similarly, Anthony Padilla, founder and former owner of Smosh, has said that his videos were being demonetised on the basis of sensitive events and controversial issues. Padilla makes interview-style content that focuses on different groups of people, such as people with dissociative identity disorder (DID), cam girls, nudists and asexual people. During these interviews he mentions how his content frequently gets demonetised as soon as it is uploaded. Furthermore, he stated on Twitter that several people had reached out to him about the ads playing before his videos: gun and firearm ads were shown before a video that had itself been demonetised (Padilla, 2019). Firearms and weapons are grounds for demonetisation under the advertiser-friendly guidelines, as they fall under harmful or dangerous acts.

In Padilla’s video ‘I spent the Day with ASEXUALS’, he takes a moment to address the audience directly. He explains that content exploring sexual topics, even when educational or journalistic, almost always gets demonetised. Apart from the lack of revenue, Padilla says these videos are also suppressed and less likely to be recommended to people who are not already subscribed to his channel.

He expresses how the video about asexuality will likely be demonetised and suppressed simply for discussing topics related to adult content (as it is referred to in the guidelines) (Anthony Padilla, 2020a, 5:32-6:07). In a later video, 'I Spent the Day with PANSEXUALS', Padilla confirms that, contrary to his prediction, the video about asexuality remained monetised. He mentions that this raises hope that topics such as sexual orientation are now less likely to be suppressed and demonetised, or as he puts it, ‘monetarily punished’ by YouTube (Anthony Padilla, 2020b, 5:12-5:40).

Hank Green, one of YouTube’s oldest creators and a co-founder of VidCon, a major convention for online video creators, explains the YouTube Adpocalypse in the aptly titled video 'The Adpocalypse: What it Means'. He uses a metaphor of buckets, explaining that every video on YouTube is sorted into one of three. First, there is manually selected content, picked to be family-friendly and safe for advertisers; these channels and videos get advertisements from major brands. Then there is content that does not follow the advertiser guidelines and is automatically deemed not advertiser-friendly; these channels or videos have no ads run on them and generate no ad revenue. Lastly, there is a grey area between these two, where most content on YouTube resides. It is unclear how the algorithm and content moderation operate in this in-between.

For most social media platforms, content moderation is expected and thus familiar. On apps like Instagram, one knows not to post nude pictures, as they do not comply with the guidelines (Gillespie, 2018, p. 6). People usually learn to navigate such guidelines, yet with YouTube this has proven difficult. Both Padilla and Mykie have said that the inconsistency of demonetisation is what is most bothersome. Because content is demonetised or removed with no regularity, there is no way for a creator to learn to navigate the guidelines.

Further, one reason creators seem so thrown off by these rules is that they rely on YouTube as their main source of income. It is their job, and their income becomes uncertain when the monetisation algorithm is inconsistent. Creating videos is an emotionally taxing job that requires a great deal of planning, filming, editing and producing. Add to this the uncertainty of not knowing whether to censor oneself, and producing YouTube videos for a living becomes a stressful and draining job.

What This Means for YouTube

After the first YouTube Adpocalypse, many people predicted that these changes to the algorithm and to content moderation would become the downfall of the platform. In reality, despite many disgruntled content creators, YouTube has continued to adjust and apply these guidelines. Much of the vulgar language has disappeared from the platform for fear of demonetisation. YouTube also remains without a real equivalent among social media platforms, so in spite of the many complaints and issues, content creators continue to use it and will have to continue complying with the guidelines.

Content moderation on any platform is a delicate balance between setting rules, appeasing investors, and attracting users, as established by Gillespie (2018). YouTube seems to have upset this balance, and in doing so has turned its back on its users. Advertisers appear to be of greater importance to the company than the people who use it. The difficulties around content moderation also affect YouTube, but instead of acknowledging them and working to fix them, the platform keeps updating its guidelines without listening to content creators. Those who rely on YouTube as their primary source of income have been thrust into a world of uncertainty and instability.

More than three years have passed since the first YouTube Adpocalypse, and it is fair to say that there have been few consequences for YouTube as a company. Its appeasement of advertisers has not deterred people from using the website. The changes made to both the algorithm and the manner in which content is moderated did not go without resistance and outcry, but this was met with very little response. The response that did come was lacklustre and ignored the wishes of the creators affected. YouTube remains firm in its user-unfriendly behaviour.

References:

Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (Illustrated ed.). Yale University Press.