by Matthew Martinez
YouTube recently changed its guidelines for ad revenue and video monetization, and the change is having a significant impact on content creators. Specifically, creators are finding that their uploaded videos are flagged for inappropriate content and demonetized. This is part of a new policy YouTube created to make the community brand-friendly and attract more advertisers to the platform.
YouTube designed an algorithm to identify the following factors in videos and flag offending videos for demonetization:
- sexually suggestive content, including partial nudity and sexual humor;
- violence, including display of serious injury and events related to violent extremism;
- inappropriate language, including harassment, swearing and vulgar language;
- promotion of drugs and regulated substances, including selling, use and abuse of such items; and
- controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown.
These categories have drawn criticism for being overly broad and vague. Combined with the inaccuracy of bots still learning to enforce the guidelines, the result is unfair demonetization. Many of the demonetized videos deal with subject matter YouTube has marked “not suitable for advertisers.” However, many of these videos are in fact appropriate and do not deserve demonetization. For example, after the mass shooting at a Las Vegas concert, the popular YouTuber Casey Neistat created a video aimed at raising money for the victims of the tragedy, stating that all ad proceeds would be donated to the victims and their families. A few days after it was uploaded, the video was demonetized.
YouTube’s algorithm has recently become more widespread and aggressive in removing ads from videos with even the slightest possibility of being controversial. Because the algorithm is fairly new, it is over-inclusive and affects videos that deserve ad revenue. As a result, certain YouTubers can no longer sustain a career making videos and are being forced to stop uploading content. Even though YouTube is a private company not subject to the First Amendment constraints that bind other public forums of communication, removing ads is still a form of censorship. By flagging videos for demonetization, YouTube rewards a very specific kind of content while forcing controversial, suggestive, or tangentially related content off the platform. This significantly impacts LGBTQIA content creators, because many of their videos deal with sexuality, the coming-out process, and related topics that have been flagged as “sexually suggestive.”
The underlying impetus for the increased policing and over-inclusive demonetization of videos was right-wing political groups uploading content that verged on extremism and hate speech. After brands found their ads paired with videos from channels like InfoWars and other conservative content creators, they threatened to pull all support from YouTube. While YouTube’s intentions seem well placed, its execution has alienated creators across the political spectrum.
As YouTube attempts to make the platform brand-friendly and palatable, it is acting as a gatekeeper and actively censoring content it deems inappropriate. Through its current demonetization algorithm, YouTube favors certain speech over other speech, unnecessarily harming deserving creators and minority groups. The appeal of the platform is waning, and services like Patreon and Vid.me are appearing on the horizon as marketplace alternatives.
*Disclaimer: The Colorado Technology Law Journal Blog contains the personal opinions of its authors and hosts and does not necessarily reflect the official position of CTLJ.