The Biased Internet

Born in a free country, everyone here in the USA gets the same opportunity as the person to their left or right, front or back, regardless of prejudices of caste, creed, class, color, sex, or race. But fewer and fewer people, mainly teenagers, are using that to their advantage. Many are losing interest in politics because the ideals of the political parties don't match their own; they simply believe whatever their parents say on the issues and take the same stance without giving it a thought of their own. The Internet plays a huge part in this: when teenagers do think about and try to debate the issues they feel aren't right, most of their points and arguments come from nowhere else but online, a place where anybody can say anything and it can become the new trend of the year. A staggering 88% of American teenagers between the ages of 13 and 17 have access to a mobile phone of some kind, 23% of teens now own a tablet too, and more than 90% of teenagers say they go online at least once a day. With this many users online, we run into a bias known as the "filter bubble." What's more, this number will keep climbing as new technologies are released. Since the Internet is surely the fastest and most efficient way to find any news in almost no time, it can just as easily steer our thinking away from the point and toward something else entirely.

The Internet is supposed to connect us to the world, improve democracy by helping people find others with a similar mindset, and let us share our views about something wholly new. Yet it is the very thing that blocks our way, serving up personalized feeds on our social media accounts, which at times is exactly what you want, everything within reach of a single keystroke, but at other times is exactly what we don't want to see. When the Internet curates the information that reaches our eyes, a "filter bubble" is created, an expression coined by TED speaker Eli Pariser. He describes this bubble as something we live inside, blocking our view of the outside world. Google, Facebook, Yahoo, YouTube, Twitter, and Netflix are a few of the curators of these "personalized" feeds, which give us information "it thinks we want to see but not necessarily what we need to see" (Eli Pariser). Companies like these use a combination of complex algorithms to customize the feed to the individual's interests, deciding what gets in and what gets edited out. The result is a condensed stream of only what the algorithm thinks we want, with everything else quietly stripped away. For example, if you support the Republicans and watch a lot of their videos on YouTube, there is a high chance YouTube will filter out the other side of the political spectrum.
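To make that curation step concrete, here is a minimal sketch in Python of how an interest-based feed filter could work in principle. It is not any company's real algorithm; the tags, weights, and function names are made up for illustration.

```python
# Hypothetical sketch of interest-based feed curation (illustrative only,
# not any platform's actual system). Each post carries topic tags; the
# user profile holds interest weights inferred from past viewing.

from typing import Dict, List


def score_post(post_tags: List[str], interests: Dict[str, float]) -> float:
    """Sum the user's interest weight for every tag on the post."""
    return sum(interests.get(tag, 0.0) for tag in post_tags)


def curate_feed(posts: List[dict], interests: Dict[str, float], keep: int = 3) -> List[dict]:
    """Rank posts by inferred interest and drop everything below the cutoff."""
    ranked = sorted(posts, key=lambda p: score_post(p["tags"], interests), reverse=True)
    return ranked[:keep]  # what gets "edited out" is simply never shown


if __name__ == "__main__":
    # Weights "learned" from a watch history that leans one way.
    interests = {"republican": 0.9, "sports": 0.4, "democrat": 0.1}
    posts = [
        {"title": "GOP rally recap",         "tags": ["republican"]},
        {"title": "Democratic bill release", "tags": ["democrat"]},
        {"title": "Local game highlights",   "tags": ["sports"]},
        {"title": "Debate from both sides",  "tags": ["republican", "democrat"]},
    ]
    for post in curate_feed(posts, interests):
        print(post["title"])  # the opposing-side post quietly falls off the feed
```

Even in a toy version like this, the post from the other side of the spectrum never reaches the viewer, not because it was judged wrong, but because it scored low against a profile the viewer never chose.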

Even though there will always be the occasional video from outside the "filter bubble," people tend not to choose it because they know it won't match their expectations, ideals, and thinking, so they resist the urge to leave their comfort zone. As Eli Pariser puts it, there is always a "struggle going on between our aspirational self and our more impulsive present" (Eli Pariser). We want to watch the new bill being released on the Democratic side, but we relate more to the Republican take on it. Hence, as a democratic country, we should press our citizens to follow both sides of the spectrum, in whatever category, to level the ground so that we form a fair and unbiased understanding of a topic instead of sitting on one end of the see-saw!

The Confirmation Bubble

One of the things that has been discussed in our Government class is the concept of the "filter bubble," the algorithm(s) that search engines and social media outlets implement, and how it can lead more and more to what's been referred to as "confirmation bias": the rejection of facts simply because they're incongruous with someone's current beliefs about a subject. This "bubble" isn't an actual bubble but rather an algorithm that sites like Google, Twitter, Facebook, and so on have adopted to present things to us that we may find more interesting or engaging. Many people reading that would think it doesn't sound that bad; surprisingly, it is, both for the people subject to it and for content creators alike. While this algorithm does tailor your search results and recommendations, it doesn't take into account anything other than what it thinks you really want to look at. The problem is that, while it shows you things it thinks you want to see, it doesn't show you what you should actually be paying attention to, which can leave views about a subject so rooted in ignorance that, when faced with an opposing view, someone may either wonder why they have never heard of what's being shown to them or reject it completely.

This confirmation bias, and the way the filter bubble feeds it, can lead to a more unwavering viewpoint than a subject actually warrants, because you're only being shown what you want to see, even though what you want to see isn't always the right way to look at it. "Once we have formed a view, we embrace information that confirms that view while ignoring, or rejecting, information that casts doubt on it" (Psychology Today). The simple act of showing us more and more of what we want to see, and subsequently believe in, deepens our bias toward a subject and can eventually lead to total rejection of anything that challenges that belief.

The filter bubble also harms content creators who focus on a viewpoint that differs from the norm. Recently, YouTube changed its general algorithms and policies toward certain kinds of content because advertisers were dropping off the site. Their reasoning? Their ads were appearing next to hate speech or extremist views. While it's understandable that giant companies like Starbucks would not want to be seen alongside extremism and hate speech, that does not justify the way Google has handled the issue. In addition to demonetizing those extremist and hate-speech videos, which in and of itself is effectively censorship for those who live off their YouTube ad revenue, Google has been demonetizing videos that don't even resemble extremism or hate speech and simply address controversial topics. Through this, Google has effectively created an inwardly spiked filter bubble around its own site, one that harms both the platform and those inside it. This filter bubble, holding all the tiny balloons that make up the YouTube community, has been deflating or outright popping those balloons through its own policies. Because of this "adpocalypse," many creators are having an extremely hard time maintaining their rate of content creation.

 

The Traps of Filter Bubbles


According to Eli Pariser's filter bubble theory, social media itself selects which categories of information on the Internet each viewer sees, creating an environment built around whatever the viewer is most interested in. However, Pariser does not agree with this kind of personalization, since the Internet only gives us what it thinks we want to see, not necessarily what we need to see. I agree with this point of view.
The Internet is a way for people to explore and find information from all over the world. This ability exposes people to different points of view from everywhere, which brings different contexts into the discussion. Through this process, people gather more and more details about an issue and can approach the news with more comprehensive opinions. An environment where opinions conflict helps develop insight into problems, which in turn produces more suggestions for solving, or at least balancing, the social problems we now face, such as conflicts in politics and differences over foreign policy. However, when the Internet closes down our options, we get only the information we already view the most, and we cannot even realize it, because we cannot tell what actually changed on the web. Although we are still looking at the same social media, what we are viewing has already changed, and this can cause a large problem. For example, imagine a viewer who sits in the middle between Democrats and Republicans and reads the news of both groups in a balanced way. One day he finds a really interesting Democratic article, spends a lot of time reading it, and opens several more pages about it. The Internet records that he spent more time on Democrats than on Republicans that day, so it shifts his feed toward more Democratic information. In the end, this viewer becomes a Democrat because of the change the Internet made in what he was viewing. This example shows that Internet-created filter bubbles do not serve people well, since many people who hold neutral opinions can be led to one side.
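The feedback loop in that example can be sketched in a few lines of Python. This is purely illustrative: the update rate, the reading-time signal, and the topic names are hypothetical, not how any real site tunes its recommendations.

```python
# Hypothetical sketch of the feedback loop described above: time spent on an
# article nudges the stored interest weights, which in turn skew what gets
# recommended next. All numbers are made up for illustration.

def update_interests(interests: dict, topic: str, seconds_read: float,
                     rate: float = 0.001) -> dict:
    """Increase the weight of a topic in proportion to reading time."""
    updated = dict(interests)
    updated[topic] = updated.get(topic, 0.0) + rate * seconds_read
    return updated


def recommend_share(interests: dict, topic: str) -> float:
    """Fraction of the feed devoted to a topic, proportional to its weight."""
    total = sum(interests.values()) or 1.0
    return interests[topic] / total


# A reader who starts out perfectly balanced...
interests = {"democrat": 1.0, "republican": 1.0}
print(recommend_share(interests, "democrat"))   # 0.5 of the feed

# ...spends one long afternoon on a single interesting Democratic article.
interests = update_interests(interests, "democrat", seconds_read=1800)

print(recommend_share(interests, "democrat"))   # now about 0.74 of the feed leans one way
```

One afternoon of reading is enough to tilt a once-balanced feed, and the reader never sees the moment the tilt happened.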
On the other hand, the filter bubble created by the Internet can influence people's lives by blocking them from whatever they do not agree with. Imagine a web page that contains only the opinions and comments the viewer already agrees with. The viewer cannot know what other opinions exist, and within an environment of nothing but agreement, the viewer grows closer and closer to his or her own opinion and becomes aggressive toward any opinion that does not follow the same route. This can be a huge problem for online discussion, which may turn into quarrels because people will not accept other opinions.
Finally, Pariser raises a great point that the Internet helps to build democracy. However, I do not think personalized websites should be a part of it; in fact, they are the opposite of democracy. People get "blocked" from the information they actually need and instead get what they are "interested" in, which in the end is nothing necessary. In this way, people cannot truly connect themselves to the world.
What people need is not a designed-for-one-only web page; they need access to all the information so they can compare, which makes it much easier to view the world and build their own opinions. Filter bubbles do not help people use the Internet, since people cannot get the information they need when they actually want to see things in detail. The Internet can never predict what a viewer really wants to view, because circumstances change, and yet people keep getting everything through the Internet's filter bubbles. People should build their own filters; a machine-made one will never keep up with the changing opinions of the people surfing the Internet.

Burst Your Bubble


On the second day of government class, we did an assignment on bias and filter bubbles (Parish Government). I had never thought about this topic or how it could relate to government. It was shocking to discover how what we search on our computers and phones narrows our later searches down to seemingly only what we want to see. Who would have thought that when the person sitting next to me in class googled the exact same thing that I did, at the exact same time, different search results would pop up for each of us? Some articles were higher up on the list in my search, and some articles that appeared in her search weren't even present in mine. Internet sites such as Yahoo, Google, and Facebook all control what we see based on the information they have "gathered" on us as individuals.


Sites such as Yahoo, Google, and Facebook edit what we see on the Internet based on previous searches. Google looks at everything you are doing, including where you are searching from, and uses that information to filter your searches down to what it believes you will want to see. Yahoo is a personalized site where people receive different information when searching for the same thing (TED). Facebook uses recent searches on your computer to customize the type of advertisements that pop up on the sides of your page. If you had recently searched for an SAT tutor and then logged onto your Facebook account, you would likely see an ad for SAT online tutoring or SAT prep books. The Internet is showing us what it thinks we want to see based on past searches instead of what we need to see. We don't have a choice about what gets edited in, and we don't see what gets edited out. Factors such as political beliefs play a major role in what the Internet filters for a specific person (TED). If someone constantly searches for a certain political party, the Internet will eventually detect those searches and start filtering results so that we only see the ones that mirror our political beliefs. We are cut off from others' innovative thoughts and find ourselves stuck in a box of our own beliefs, unwilling to accept other points of view. We become trapped and see no way other than our own, while so many different ideas are floating around that could be better than ours or could add to and improve our own thoughts. Being deprived of ideas does not improve an individual and his or her knowledge; it hinders it.
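A small sketch can show why two people typing the same query can see different result orderings. This is not Google's or Yahoo's real ranking; the profiles, topics, and scores below are invented for illustration.

```python
# Illustrative sketch (not any search engine's actual ranking): a base
# relevance score is re-weighted by each person's stored profile signals,
# so the same query returns a different ordering for different users.

def personalized_rank(results, profile):
    """Boost each result by the user's affinity for its topic, then sort."""
    def score(result):
        return result["relevance"] + profile.get(result["topic"], 0.0)
    return sorted(results, key=score, reverse=True)


results = [
    {"title": "Bill analysis: conservative view", "topic": "republican", "relevance": 0.6},
    {"title": "Bill analysis: progressive view",  "topic": "democrat",   "relevance": 0.6},
    {"title": "Neutral explainer on the bill",    "topic": "neutral",    "relevance": 0.5},
]

me        = {"democrat": 0.3}     # inferred from my past searches
classmate = {"republican": 0.3}   # inferred from hers

print(personalized_rank(results, me)[0]["title"])         # progressive view first for me
print(personalized_rank(results, classmate)[0]["title"])  # conservative view first for her
```

The underlying pages are identical; only the stored profile differs, and that is enough to hand each of us a different front page.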

We need to have some control over what gets through and what doesn't, because the limited information given to us can isolate us from the rest of the world and from important issues we should be cognizant of. This disconnection from the rest of the world can easily harm our society instead of helping it. Keeping someone unaware of certain issues and viewpoints just because he or she once searched for an opposing view traps that person in a bubble he or she cannot burst. I believe we must work to burst this filter bubble in which we are confined, in order to broaden our perspectives with opinions and ideas other than our own.