Filter Bubbles in the Media 

In 2011, Eli Pariser released a book titled “The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think,” which highlighted Google’s shift to customized search results. The book, which he later discussed in a TED Talk, attracted wide attention and struck a chord with nearly everyone who read it. In it he introduced the idea of the “filter bubble,” which he defines as “your own personal, unique universe of information that you live in online” (Pariser). Filter bubbles are not limited to Facebook and Google; they exist everywhere on the internet. Algorithms shape all sorts of digital media, from online music services to the stores where you buy your favorite clothes, and they have worked their way into almost every form of social media: Facebook, Twitter, Instagram, and Snapchat, to name a few. What you see on social media, or on the internet generally, depends on what you have looked at in the past, what kind of person you appear to be, and other such signals.

Pariser warned us about a world in which what we see is tailored to what we want to see; yet it is often what we do not want to see that we most need to see in order to be well-rounded people. Should we listen to his warning? A personalized internet has real upsides, such as its convenience and the fact that you are reading what you want to read, but the resulting disregard for “the other side” of important matters is a severe problem, and something needs to be done to address it. In his TED Talk, Pariser explains that what lands in your filter bubble depends on a long list of factors, ranging from your political interests to what kind of computer you are using. He first noticed this on Facebook: he was seeing only posts that leaned liberal, the side he stood on, even though he enjoyed reading the opposition, which Facebook had quietly edited out of his feed.

This prompted him to run an experiment with two of his friends. He asked Daniel and Scott to each search “Egypt” on Google and send him their results, and the results immediately supported his claim that the internet personalizes searches. The “hot topic” in Egypt at the time was the protests, which appeared prominently in Scott’s results, while Daniel’s results showed nothing about the protests until the second page. Pariser said, “And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see” (Pariser). He also cited Eric Schmidt, then the executive chairman of Google: “It will be very hard for people to watch or consume something that has not in some sense been tailored for them” (Pariser).

This quote from Google’s own chairman points to how thoroughly personalization runs through the internet. Pariser then discusses the shift from human “gatekeepers,” who “controlled the flow of information,” to algorithmic gatekeepers that lack the “embedded ethics that the editors did” (Pariser). In his article “The contextual integrity of the closet: Privacy, data mining and outing Facebook’s algorithmic logics,” Kenneth Werbin writes, “What these studies clearly reveal is how algorithms open vast potential for the creation, connection, collaboration and efficacy of targeting individuals and, at the same time, subjecting users to intense regimes of surveillance, where privacy is at best an afterthought in the pursuit of profits” (Werbin). This supports the idea that, in algorithmic filtering, profits come before ethics.
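
To make the idea of an algorithmic gatekeeper concrete, here is a minimal, purely illustrative Python sketch. It does not reflect Google’s actual ranking; the tags, the two interest profiles, and the scoring rule are all invented assumptions. It simply shows how the same query can produce different front pages once results are scored against each user’s past interests.

```python
# Hypothetical sketch of personalized ranking: the same query returns
# different orderings for two users because each result is scored against
# the user's interest profile. All data and weights are invented.

def rank_results(results, interest_profile):
    """Order results by how many of the user's interest tags they match."""
    def score(result):
        return sum(1 for tag in result["tags"] if tag in interest_profile)
    return sorted(results, key=score, reverse=True)

egypt_results = [
    {"title": "Egypt protests intensify", "tags": {"politics", "news"}},
    {"title": "Vacation guide to the pyramids", "tags": {"travel", "history"}},
    {"title": "Egyptian street-food tour", "tags": {"travel", "food"}},
]

scott = {"politics", "news"}   # follows current events
daniel = {"travel", "food"}    # mostly searches for trips

print([r["title"] for r in rank_results(egypt_results, scott)])
print([r["title"] for r in rank_results(egypt_results, daniel)])
# The identical query yields a different front page for each profile.
```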

After watching Pariser’s experiment in his TED Talk, I wanted to run my own smaller-scale experiment with two different people’s filters. I asked my brother Drew to search “Donald Trump” on Google and send me what appeared on his computer, and I asked my friend Eniang to do the same. Drew is more conservative, so I wondered whether that would have any impact on his search results. At the top of his page, Google listed three “top stories,” all of which highlighted something positive Donald Trump had done: one discussed his effect on the U.S. economy, and another discussed his role in creating new jobs. I then looked at what Eniang had sent me.

Eniang considers himself more of a liberal, so I was very interested to see whether the same thing would happen with him. His Google results showed three “top stories,” all highlighting something negative Trump had done: the first was about President Trump being rude to a reporter, and the second was about his demand for $5 billion to build a wall. It was striking how different political views shaped what each of them saw. My “liberal” friend saw only “top stories” that portrayed President Trump negatively; he never got to see the “good side” of Trump, just as my “conservative” brother never got to see the “bad side” of him. Although people may not agree with Google filtering their search results, many only want to hear what they want to hear, rather than what they need to hear. In a society where individuals read only what interests them, and never anything about the opposition, society is in a sense placing itself in its own filter bubble.

Personalization on social media began as an innocent means of “reconnecting with old high school friends, but now Facebook is a major driver of news. (A Pew study from last year found that 62 percent of Americans get news on social media)” (Hess). With more and more individuals relying on social media for their news, there is no room for personalization, because the outcomes can be serious. One key component of algorithmic filtering is “reach,” described as “how far a post cascades across a broadcast feed or set of networks, and algorithmic filtering can either promote or limit a post’s reach.” In other words, algorithmic control over the flow of media determines how far a post can travel on the internet. In pursuit of profit, which is the biggest goal of news networks and social media companies, “many social providers allow users to override algorithmic filtering and further the reach of a post by offering pay-to-promote services…”.
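
To illustrate “reach” and the pay-to-promote override, here is a toy Python sketch. It is not any platform’s real feed algorithm: the scoring terms, the field names, and the flat promotion bonus are assumptions made up for the example.

```python
# Toy model of algorithmic reach: relevance and engagement decide how far
# a post travels, and a paid promotion overrides the filter. All numbers
# and field names are invented for illustration.

def feed_score(post, user_interests):
    relevance = len(post["topics"] & user_interests)  # overlap with the user
    score = relevance * post["engagement"]            # organic, filtered reach
    if post.get("promoted"):                          # pay-to-promote override
        score += 100                                  # pushes past the filter
    return score

posts = [
    {"id": "friend_post", "topics": {"sports"}, "engagement": 5, "promoted": False},
    {"id": "shoe_ad",     "topics": {"shoes"},  "engagement": 1, "promoted": True},
]
user_interests = {"sports", "music"}

feed = sorted(posts, key=lambda p: feed_score(p, user_interests), reverse=True)
print([p["id"] for p in feed])  # the paid shoe ad outranks the relevant post
```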

Promoted posts like these make up the advertisements you see on social media. Such posts would normally be “filtered” out of your feed, but when a company pays, they can appear on nearly any corner of the internet (social media, online news, online shopping, and so on). In “The Digital Architectures of Social Media: Comparing Political Campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. Election,” Michael Bossetta discusses how communication on social media is mediated. Bossetta writes, “The present study argues that political communication on social media is mediated by a platform’s digital architecture—the technical protocols that enable, constrain, and shape user behavior in a virtual space” (Bossetta, 1). By “digital architecture” he means the algorithmic machinery that the internet, and social media in particular, uses to filter the web. The quote also hints at political propaganda on social media, which is “mediated” in a sense by our search history and other factors.

The article also documents a rise in political advertising in digital media in the United States. Bossetta writes, “Political advertising on digital media across local, state, and national elections rose from 1.7% of ad spending in the 2012 election to a 14.4% share in 2016” (Bossetta, 2). As digital media grows, more and more people create “filter bubbles” for themselves by reading, and showing interest in, only certain kinds of websites. For example, Fox News tends to lean conservative on certain matters; individuals who relied solely on Fox News for their news would never see the liberal side of certain issues, creating a block in the flow of information they receive. Social media and Google pick up on these signals and automatically filter your feeds before you even have a chance to take a stance on one side.

This is how “filter bubbles” are created: without your knowing it, an invisible block forms in the flow of information, built from your past history online. Filter bubbles are one example of the personalization of digital media, but they are invisible and can be hard to identify. There are many other examples, including personalization in music applications. Spotify, iHeartRadio, and Apple Music all offer some sort of “Daily Mix,” essentially a blend of songs from your playlists and songs the company thinks you will like based on your listening history. Spotify builds its Daily Mix using “clustering technology to identify distinct subgroupings within our users’ listening patterns, and then build recommendations around those, mixing in appropriate new suggestions along with the known favorites” (“How Your Daily Mix ‘Just Gets You’”). This technology is similar to the algorithmic techniques used to personalize your search results and social media feeds.
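
As a rough illustration of the clustering approach described above, here is a simplified Python sketch. It is not Spotify’s actual system; the listening data, the neighbor rule, and the mix size are invented for the example. It shows the shape of the technique: group listeners with similar histories, then blend known favorites with tracks popular inside the group.

```python
# Simplified cluster-style recommendation: users who share listening
# history form a group, and new suggestions come from within that group.
# All data and thresholds are invented for illustration.

from collections import Counter

listening_history = {
    "ana":  ["indie_a", "indie_b", "indie_c"],
    "ben":  ["indie_a", "indie_b", "indie_d"],
    "cara": ["metal_a", "metal_b"],
}

def same_cluster(user, other):
    """Crude grouping rule: two users who share two or more tracks."""
    shared = set(listening_history[user]) & set(listening_history[other])
    return len(shared) >= 2

def daily_mix(user, size=5):
    """Known favorites plus new tracks popular inside the user's cluster."""
    heard = set(listening_history[user])
    neighbors = [u for u in listening_history
                 if u != user and same_cluster(user, u)]
    suggestions = Counter(t for u in neighbors
                          for t in listening_history[u] if t not in heard)
    new_tracks = [t for t, _ in suggestions.most_common()]
    return (listening_history[user] + new_tracks)[:size]

print(daily_mix("ana"))  # ana's favorites plus "indie_d" from her cluster
```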

Another example of the personalization of digital media is targeted advertising. Targeted advertisements appear on social media and across the web as a whole, and they are the kind of personalization that turns the most heads. Have you ever searched online for a pair of shoes, only to log into Facebook and scroll past an advertisement for the very shoes you were looking at? Targeted advertisements “are a result of cookies and an IP address. Cookies are text files in your browser that track information you’ve searched. Your IP address is kind of like your house address and shows where you are located. The balance between both of them is what gives the information to advertisers” (Dangerfield). There are ways to work around this kind of data personalization: disabling your browser’s cookies, for instance, prevents websites from pulling information about your searches.
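
To make the cookie-and-IP mechanism Dangerfield describes more concrete, here is a hedged Python sketch. Real ad targeting is far more elaborate; the cookie contents, the IP-to-city table, and the ad template below are all invented for illustration.

```python
# Toy model of targeted advertising: a cookie remembers what you searched,
# the IP address suggests where you are, and the advertiser combines both.
# Every value here is a made-up example.

browser_cookies = {"recent_searches": ["running shoes", "trail shoes"]}
ip_to_city = {"203.0.113.7": "Denver, CO"}  # IP from the documentation range

def pick_ad(cookies, ip_address):
    """Choose an ad from search history (cookie) plus location (IP)."""
    interest = cookies["recent_searches"][-1]        # most recent search
    city = ip_to_city.get(ip_address, "your area")   # fall back if unknown
    return f"Ad: {interest} on sale near {city}!"

print(pick_ad(browser_cookies, "203.0.113.7"))
# -> Ad: trail shoes on sale near Denver, CO!
```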

Facebook also offers a setting that turns off advertisement personalization: in your settings, open “ads based on my use of websites and apps,” press the “choose setting” button, and select “off” (Dangerfield). Facebook will still track what you view, but the advertisements you see will no longer be personalized based on your use of other websites and applications. There is also private browsing, which most browsers provide, and which lets you search without that activity being saved to your browsing profile. It adds a measure of privacy, and it means that anything viewed during a private-browsing session should not resurface as an advertisement on another page.

Many people disagree with the use of personalization in digital media. Kalev Leetaru of Forbes wrote an article titled “If Social Media Algorithms Control Our Lives Why Can’t They Eliminate Hate Speech?” In it, he acknowledges the overwhelming use of social media in our world but then turns to its negativity: bullying, hate speech, and the like. Leetaru writes, “If algorithms really wield that much power over our subconscious minds and conscious decisions, why can’t social media platforms literally write hate speech out of existence with a few lines of code?” This is worth pausing on, because hate and negativity appear far too often in digital media. Leetaru suggests that the same algorithms that “deliver hyper-personalized ads to us in a fraction of a second, hundreds of thousands of times a day, (could) be retooled to deliver hyper-personalized interfaces that steer us away from our own negative beliefs.”

The day after Donald Trump won the election, New York Magazine declared that “The ‘Filter Bubble’ Explains Why Trump Won and You Didn’t See It Coming” (Hess). Targeted ads and targeted posts have serious consequences, as the 2016 presidential election showed: Facebook and other social media played a huge role in its outcome because of digital personalization. People might ask, “So what if social media targets me with advertisements and stories? It’s what I want to see.” But ignoring the opposition carries serious consequences. Not only is this algorithmic method an invasion of privacy, it also builds a bubble around you that blocks the flow of information, information you sometimes need in order to make well-educated decisions. In 2011, Eli Pariser warned us about “filter bubbles.” It is about time we listened.
