When delving into the hate speech posted on YouTube, it doesn’t take long to stumble upon the larger topic of extremism in general and how the YouTube algorithm factors into the revenue of the people who post it. Initially, all content posted to YouTube is treated equally. While channels with more subscribers (people who have signed up to be notified when a channel updates) receive a bigger influx of initial watchers, which can boost them in the search results, YouTube’s “filtering system” doesn’t treat a video with a million views any differently than one with just fifty. This is in large part due to YouTube’s viewer-powered alert system.

When watching a video that seems to include objectionable content or material that violates YouTube’s guidelines, any viewer with concerns has the ability to “flag” the video. Flagging a video is done by finding the flag icon under the video, clicking it, and describing what you are reporting the video for. The amount of time it takes YouTube to get to a complaint varies, but after a certain number of strikes a channel can be suspended or even deleted.
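To make that flow concrete, here is a minimal sketch of a flag-and-strike system like the one described above. Everything in it, from the three-strike threshold to the function names, is a hypothetical illustration; YouTube’s actual moderation pipeline is not public.

```python
# Toy model of a viewer-powered flag/strike system (all values hypothetical).
# A flag is a viewer report; flags upheld by review become strikes, and
# enough strikes suspend the channel.

STRIKES_BEFORE_SUSPENSION = 3  # assumed threshold, not YouTube's real number

class Channel:
    def __init__(self, name):
        self.name = name
        self.strikes = 0
        self.suspended = False

    def receive_strike(self):
        self.strikes += 1
        if self.strikes >= STRIKES_BEFORE_SUSPENSION:
            self.suspended = True

def review_flag(channel, reason, violates_guidelines):
    """A reviewer checks a viewer's flag; only upheld flags become strikes."""
    if violates_guidelines:
        channel.receive_strike()
    print(f"{channel.name}: flag for {reason!r} "
          f"-> strikes={channel.strikes}, suspended={channel.suspended}")

channel = Channel("example_channel")
review_flag(channel, "hate speech", violates_guidelines=True)
review_flag(channel, "spam", violates_guidelines=False)        # flag not upheld
review_flag(channel, "harassment", violates_guidelines=True)
review_flag(channel, "hate speech", violates_guidelines=True)  # third strike
```

The weak point the sketch exposes is the one the post turns to next: if no viewer ever files a flag, nothing reaches review at all.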

This sounds like a good idea on paper, but what happens when the viewers who stumble across a video that breaks these guidelines don’t find the content objectionable? Incredibly dangerous misinformation gets spread, that’s what.

Since YouTube boosts a video based on views, it’s very common for videos to break the guidelines and for the YouTube algorithm to accidentally push them to the front page because they have a lot of views. Truly objectionable, guideline-breaking content often tends to be either extremely entertaining or scandalous to the populace at large, which means more people will watch. In the wake of a disaster or a piece of news, a content creator who likes to make this type of video might cover the news through the lens of their own bias and poorly researched facts, then get boosted above videos from more reliable sources when they finally post it, because it attracts a fanatical crowd or a group of bemused onlookers.
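As a rough model of why this happens (YouTube’s real ranking signals are proprietary, so this is purely an assumed simplification), consider a recommender that scores videos by view count alone. The sensational video wins on raw popularity no matter how poorly sourced it is:

```python
# Hypothetical, heavily simplified ranking: views are the only signal.
# The real algorithm mixes many private signals; this toy model only shows
# why raw popularity alone can push unreliable videos to the top.

videos = [
    {"title": "Breaking news, carefully sourced", "views": 12_000},
    {"title": "OUTRAGEOUS take on the same story", "views": 450_000},
    {"title": "Official statement, full video", "views": 8_500},
]

def popularity_score(video):
    # No reliability or quality term at all: views are the whole score.
    return video["views"]

front_page = sorted(videos, key=popularity_score, reverse=True)
for rank, video in enumerate(front_page, start=1):
    print(rank, video["title"], video["views"])
# The sensational video ranks first purely because more people clicked it.
```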

False information immediately starts spreading like wildfire. People already have a tendency to give a false impression or an inadequate summary of news they’ve heard. Add a source that isn’t reliable in the first place, and you have a riled audience regurgitating what they’ve heard to others. It doesn’t take long for the outrage to spread, and in the wake of many of these instances, new outraged communities have formed.

What’s worse is that the content creator who originally started the panic usually does not care. They’re either too deep in their own convictions to consider their content inaccurate, want panic and outrage to spread, or are doing everything for laughs. In all three of these cases, the content creator wants their message spread to as many people as possible. Having people repeat what they’ve heard to others is great. Having people form groups is better, especially for their bottom line.

YouTube videos carrying themes of extremism or hate content can be monetized just like any other kind of video on YouTube. The more views a video gets, the more the ads assigned to it run, and the bigger the slice of the pie the content creator gets. The more money they receive, the more videos they can pump out, putting the money toward equipment and living costs. These were the kinds of problems YouTube was trying to tackle when it updated its algorithm, but the effort fell through in more ways than roping in innocent creators and failing to take down extremists, who immediately began to work around the algorithm. In the process of trying to appear neutral toward sources few have been afraid to call out in the past, YouTube has mounted an ineffective rebuttal and created a toxic culture of neutrality.
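To see why views translate so directly into income, here is the back-of-the-envelope arithmetic; the RPM figure and revenue split below are illustrative assumptions, not YouTube’s actual rates.

```python
# Illustrative ad-revenue arithmetic (every number here is an assumption).
# RPM = revenue per 1,000 monetized views; the creator keeps a share of it
# after the platform takes its cut.

views = 1_000_000      # hypothetical monthly views on a video
rpm_dollars = 4.00     # assumed ad revenue per 1,000 views
creator_share = 0.55   # assumed creator's cut of that revenue

gross = views / 1000 * rpm_dollars
creator_earnings = gross * creator_share
print(f"Gross ad revenue: ${gross:,.2f}")             # $4,000.00
print(f"Creator's share:  ${creator_earnings:,.2f}")  # $2,200.00
# More views -> more ad runs -> a bigger slice of the pie, which funds
# the equipment and living costs behind the next video.
```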

3 Comments
Kevin
February 17, 2018 12:38 am

Hello Alexandria,

Thank you for the insight into YouTube’s algorithm. I just have a question about the term “extremist” you used. Do you mean literal terrorists like ISIS or white supremacy groups that are using YouTube as a way to recruit new members? Or do you mean channels and YouTubers who purposefully spread misinformation and incendiary commentary? Or perhaps both? Whatever it is, I think a major cause of this issue is the fact that many people use social media as a legitimate news source, which it is not. There is too much bias within social media platforms for them to provide objective news, and when people only look for stories that confirm their own beliefs, it creates dangerous echo chambers. So while YouTube does have a responsibility to create a safe platform, we as civilians must also learn to think for ourselves.

Jacqueline
February 16, 2018 4:26 am

Alexandria,
YouTube’s algorithm has a lot of problems. While the platform is still enjoyable, I agree that there’s content that slips through the cracks and that it isn’t a totally level playing field. Scandals with payment, censorship, and the lack thereof pop up all the time. This was interesting and I’m curious to see where you take this. This article (http://www.replayscience.com/blog/how-does-the-youtube-algorithm-work/) breaks down more of the algorithm. The whole thing is a contraption that, if a creator uses it just right, can carry them to fame.
-Jacqueline

Simon
February 15, 2018 5:59 pm

Hi, your post is really interesting because I had no clue about how YouTube ranked its videos and search results. This leads to fake news and bad information being spread. You go into how misinformed people can be after watching videos that spread poor ideas and misinformation. This is really interesting, and my question is: what do you propose to change it?
Thanks, Simon
