At the center of the YouTube algorithm controversy lies a concern about visibility. People dislike how the algorithm works because it can bury their content while promoting others'. Since revenue depends on the number of views that can be translated into ad clicks, this creates an obvious problem for creators trying to earn a living. In the case of demonetization, a creator can do all the work needed to get their videos pushed to the top of the algorithm only to have their opportunity to profit from that work revoked. It's all about being seen, which leads quickly into what currently makes popular YouTubers popular. If YouTube wants to make a real, visible change, it needs to examine what its algorithm has made popular, since people will follow that to boost their careers, creating a creator culture: the customs creators on YouTube follow to try to achieve popularity. Since this is the same culture toxic neutrality has cropped up in, it's reasonable to assume that certain features of this community helped create, and continue to perpetuate, toxic neutrality on YouTube.
To start, the best way to get views on YouTube is to follow what's popular. A common YouTuber practice is to check #PopularOnYouTube to see what's trending. This doesn't immediately lend itself to toxic neutrality, and it's fine when the YouTubers who follow this practice have to compete with other kinds of content, but they usually don't. Completely new content usually has to work much longer to be picked up by the YouTube algorithm, so viewers get used to a certain kind of content and format. Any video outside the usual content and format is considered niche, and people immediately regard it as something "different." Once again, being different in and of itself is not bad, but on YouTube it's a big deal: any YouTuber making "different" content is regarded as opinionated or biased. While this might sound reasonable in theory, a YouTuber only has to make one such video to be stuck with that label, even if the opinions they shared were relatively minor. #PopularOnYouTube videos are usually just crafts or challenges, with the producers focusing only on guiding the viewer through their process. They don't include opinions on any subject, so watching a video that contains opinions, however small, feels like a huge escalation. So when it comes time to call out a toxic YouTuber or behavior, doing so is regarded as going to an extreme. YouTube's social conventions are incredibly absolutist. It doesn't matter what you're pointing out: to the YouTube viewing community, if it breaks the opinionless nature of most YouTube content, it's bad.
The only exception is when a topic is trending harder than what's #PopularOnYouTube, which is exceedingly rare. As in most cases when people declare that they want to be neutral, they can only be moved once an opinion, stance, or group has gotten so extreme that it's impossible to ignore: an extreme to match an extreme. It's an ouroboric extreme, though. Because things are so obviously bad, people can safely fall back on the basic, unnuanced stances that led them to be neutral in the first place. Being neutral fits easily into classic social conventions like being polite or nice because you never put yourself in a position where you need to clarify your intent or consider how one interaction differs from another. It's extremely safe. Likewise, in a situation where someone is obviously in the wrong, it is safe to point it out because you're unlikely to meet opposition. And when you do meet that rare dissenting voice, disagreeing with them is just as safe, because it fits into another traditional social convention, speaking up, which takes far more nuanced thinking in situations where the person you're "standing up" against might have a different outlook or cultural experience than you. Compare the people who spoke up during the Jake Paul incident, when he went to Japan and mocked a dead body, with people arguing over the validity of social rights groups.
YouTube needs to address this culture of neutrality if it truly wants to make a change. While it is perfectly fine to make unopinionated content, allowing it to totally dominate the YouTube algorithm creates a culture of complacency and toxic neutrality that doesn't speak up against overt wrongs until it's too late. Newcomers follow these rules, and the cycle repeats over and over again. Viewers come to expect a certain kind of limiting behavior, which further promotes the very behavior that produced the accounts that led YouTube to change its algorithm in the first place. But if YouTube wants things to change for the better permanently, while avoiding hurting the bottom line of YouTubers caught in the fray, it needs to change the culture of its platform, not the algorithm.
Check out my Hypothes.is annotations: https://hypothes.is/users/at18258