The Confirmation Bubble

One of the things we have discussed in our Government class is the “filter bubble”: the personalization algorithms that search engines and social media outlets use, and how they can feed what’s been referred to as “confirmation bias,” the rejection of facts simply because they are incongruous with someone’s current beliefs about a subject. This “bubble” isn’t an actual bubble but an algorithm that sites like Google, Twitter, and Facebook have adopted to present us with things we may find more interesting or engaging. That may not sound so bad, but it surprisingly is, both for the people subject to it and for content creators. While the algorithm tailors your search results and recommendations, it considers nothing beyond what it thinks you really want to look at. The problem is that, while it shows you things it thinks you want to see, it doesn’t show you what you should actually be paying attention to. This can leave someone’s views so poorly informed that, when faced with an opposing view, they may either wonder why they have never heard of it or reject it completely.
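
To make that feedback loop concrete, here is a minimal, purely hypothetical sketch in Python. This is not how Google, Twitter, or Facebook actually rank content; real systems weigh thousands of signals. All of the names and numbers here are invented for illustration. It only shows how a rule as simple as “recommend more of what was engaged with before” can collapse a varied feed into a single viewpoint:

```python
import random

# Hypothetical illustration of a filter bubble. Real ranking systems are
# far more complex; this only demonstrates the self-reinforcing loop.

# Each article has a topic; the pool itself is evenly varied.
ARTICLES = [{"id": i, "topic": random.choice(["left", "right", "center"])}
            for i in range(1000)]

def recommend(click_history, n=10):
    """Rank articles by how often the user has already clicked that topic."""
    topic_counts = {"left": 1, "right": 1, "center": 1}  # small prior
    for topic in click_history:
        topic_counts[topic] += 1
    # Score each article by past engagement with its topic; break ties randomly.
    scored = sorted(ARTICLES,
                    key=lambda a: (topic_counts[a["topic"]], random.random()),
                    reverse=True)
    return scored[:n]

# Simulate a user who simply clicks whatever the feed puts on top.
history = ["left"]  # a single initial click
for day in range(30):
    feed = recommend(history)
    history.append(feed[0]["topic"])  # the top item gets the click

print(history)  # after the first click, the feed shows that topic forever
```

One early click tilts the scores, the tilted feed earns more clicks on the same topic, and within a few rounds the user never sees the other two topics again, even though the underlying pool is perfectly balanced.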

This confirmation bias, and the way the filter bubble feeds it, can produce a viewpoint far more unwavering than the subject warrants, because you are only being shown what you want to see, even though what you want to see isn’t always the right way to look at things. As Psychology Today puts it, “Once we have formed a view, we embrace information that confirms that view while ignoring, or rejecting, information that casts doubt on it.” The simple act of being shown more and more of what we want to see, and subsequently believe in, deepens our bias toward a subject and can eventually lead to total rejection of anything that challenges that belief.

The filter bubble also harms content creators whose viewpoints differ from the norm. YouTube recently changed its algorithms and policies toward certain kinds of content after advertisers began pulling out of the site. Their reasoning? They did not want their ads appearing next to hate speech or extremist views. While it’s understandable that giant companies like Starbucks would not want to be seen alongside extremism and hate speech, that does not justify the way Google has handled the issue. In addition to demonetizing extremist and hate-speech videos, which is in and of itself effectively censorship for creators who live off their YouTube ad revenue, Google has been demonetizing videos that don’t even resemble extremism or hate speech and simply address controversial topics. In doing so, Google has effectively created an inwardly spiked filter bubble around its own site, one that harms both the platform and everyone inside it. That bubble holds all the tiny balloons that make up the YouTube community, and YouTube’s own policies have been deflating or outright popping them. Because of this “adpocalypse,” many creators are having an extremely hard time keeping up their rate of content creation.