Filter bubble

A filter bubble is the narrowing of a person's intellectual perspective caused by web content delivered through personalized search technology. Many search engines and social media sites use programs to monitor the content an internet user reads or views most often. The program then tailors search results or a social media news feed to reflect the user's preferences. As a result, a user is provided mostly with content he or she agrees with and is often isolated from opposing points of view. Critics argue that such filter bubbles skew a person's view of reality and leave him or her open to media manipulation.

Background

The internet is a network of computers and support technologies that connects users from around the world. The World Wide Web is a system that allows these computers and networks to communicate with one another and exchange information. As web technology was becoming popular in the early 1990s, users had to deal with complicated procedures to find and share information. They often had to type in complex web addresses to allow them access to the correct site or file.

The first web directories and search engines were released by the middle of the 1990s, allowing users to find what they were searching for by simply typing in a few keywords. Early searches were limited in that they showed results in an order determined by the program or from a list compiled by a human editor. Search capability improved into the twenty-first century as companies made it easier to find information online. Search engines began using predefined sets of instructions for computers to follow when compiling results. These instructions, called computer algorithms, were able to rank search content by popularity and relevance to the user.
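The kind of ranking described above can be illustrated with a toy sketch. The scoring formula, the weights, and the sample pages below are invented for illustration; no real search engine works exactly this way.

```python
# Toy search ranking: score pages by keyword relevance and popularity.
# The formula and weights are illustrative only, not any real engine's algorithm.

def rank_pages(query, pages):
    """Return page URLs sorted by a combined relevance/popularity score."""
    terms = query.lower().split()
    scored = []
    for page in pages:
        text = page["text"].lower()
        # Relevance: how often the query terms appear in the page text.
        relevance = sum(text.count(term) for term in terms)
        # Popularity: here, a simple count of inbound links.
        popularity = page["links_in"]
        score = 10.0 * relevance + 0.01 * popularity
        scored.append((score, page["url"]))
    return [url for score, url in sorted(scored, reverse=True)]

pages = [
    {"url": "a.com", "text": "football scores and news", "links_in": 50},
    {"url": "b.com", "text": "gardening tips", "links_in": 900},
    {"url": "c.com", "text": "football football highlights", "links_in": 10},
]
print(rank_pages("football news", pages))  # ['a.com', 'c.com', 'b.com']
```

Even this toy version shows the key trade-off: a page matching the query outranks a far more popular page that does not, while popularity breaks ties among relevant pages.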

The introduction of social media sites in the first decade of the twenty-first century marked another change in the way the internet operated. These sites allowed users to connect with one another and share content such as photos, videos, and comments. At first, this information was displayed on a user's page in chronological order. Eventually, the social media sites began using algorithms to personalize the content a user saw on his or her page.

Overview

In 2017, the dominant internet search engine was Google, with more than 1.6 billion unique monthly visitors—about 1.2 billion ahead of its nearest rival. Because Google sorts through millions of web pages, its algorithms rank pages by several different factors. The site first analyzes word choice and language in an effort to provide the most useful and relevant information. To determine the top-ranked search results, Google's algorithms also factor in location and past search history. For example, a user in the United States who searches for "football" may see information about a National Football League (NFL) team. A person in England may see results about soccer, the sport known as football in Europe. A search for "football" may also bring up information about a user's favorite NFL team if that user has searched for that team in the past. The algorithm may provide different results for fans of different teams.
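The "football" example above can be sketched in miniature: the same query returns different orderings depending on the user's location and search history. The result entries, user profiles, and weights here are hypothetical stand-ins, not Google's actual signals.

```python
# Toy personalized search: one query, different results per user.
# All data, signals, and weights below are invented for illustration.

RESULTS = {
    "football": [
        {"title": "NFL week 5 recap", "region": "US", "topic": "nfl"},
        {"title": "Premier League table", "region": "UK", "topic": "soccer"},
        {"title": "Dallas Cowboys news", "region": "US", "topic": "cowboys"},
    ]
}

def personalized_search(query, user):
    def score(result):
        s = 0
        if result["region"] == user["location"]:
            s += 2  # boost results from the user's region
        if result["topic"] in user["history"]:
            s += 3  # boost topics the user has searched for before
        return s
    return sorted(RESULTS[query], key=score, reverse=True)

us_fan = {"location": "US", "history": ["cowboys"]}
uk_user = {"location": "UK", "history": []}
print(personalized_search("football", us_fan)[0]["title"])   # Dallas Cowboys news
print(personalized_search("football", uk_user)[0]["title"])  # Premier League table
```

The US fan who has searched for the Cowboys before sees that team first; the UK user sees soccer first. Neither user ever asked to be shown different content.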

The most popular social media site in 2017 was Facebook, with more than 2 billion monthly users. Facebook's algorithms tailor content to users in much the same way as Google's. Facebook ranks stories, photos, and posts by how likely it believes a user is to interact with the content. Facebook monitors a user's past activity on the site, keeping track of "likes," comments, and shares. It also takes into account the activity of the user's friends. The algorithms compile all this information to assign a relevancy score to the content. The higher the score, the more likely Facebook believes a user will be interested in the content. Unless a user changes Facebook preferences, this content is given priority on his or her news feed.
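A relevancy score of this general kind can be sketched as follows. The signals (past interactions with an author, friends' engagement) come from the description above, but the specific weights, field names, and data are hypothetical, not Facebook's actual formula.

```python
# Toy news-feed relevancy scoring: posts are ranked by the user's past
# interactions with the author and by friends' engagement with the post.
# The weights and data structures are invented for illustration.

def relevancy_score(post, user):
    score = 0.0
    # Affinity: how often the user has liked/commented on/shared this author.
    affinity = user["interactions"].get(post["author"], 0)
    score += 1.5 * affinity
    # Social signal: how many of the user's friends liked this post.
    friend_likes = sum(1 for f in post["liked_by"] if f in user["friends"])
    score += 2.0 * friend_likes
    return score

def rank_feed(posts, user):
    """Higher-scoring posts are shown first on the feed."""
    return sorted(posts, key=lambda p: relevancy_score(p, user), reverse=True)

user = {"friends": {"bea", "carl"}, "interactions": {"dana": 4}}
posts = [
    {"author": "ed", "liked_by": ["bea", "carl"]},  # two friends liked: 4.0
    {"author": "dana", "liked_by": []},             # strong affinity: 6.0
]
print(rank_feed(posts, user)[0]["author"])  # dana
```

Note the feedback loop this creates: interacting with an author raises that author's future scores, which surfaces more of their posts, which invites more interaction—the mechanism behind the filter bubble discussed below.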

Google, Facebook, and similar websites claim these algorithms provide users with the content they want. Some critics contend the practice creates an atmosphere in which people are only exposed to information that reinforces their worldview. Opposing viewpoints are given less priority and are often never seen by a user.

In 2011, author and internet activist Eli Pariser coined the term filter bubble to describe this effect. Pariser noted that two users who performed Google searches for BP—a reference to British Petroleum—received two different sets of results. One received information on investment opportunities with the company. The other got a list of news stories about a catastrophic oil spill in the Gulf of Mexico in 2010.

By 2017, researchers had found that a growing number of people were getting their news online, and many young people relied on social media sites as their primary source of news. As a result, news sorted by a social media algorithm may create an echo chamber in which one side of an issue is constantly repeated. Over time, a user may perceive his or her filter bubble as reality.

For example, during the 2016 US presidential election, many supporters of Democratic candidate Hillary Clinton were shocked and angry when Republican candidate Donald Trump was elected president. For months before Election Day, their news feeds and search results displayed mostly positive stories about Clinton and negative news about Trump. Many were unaware that Trump had gained significant support and had an enthusiastic online following.

According to critics, filter bubbles add to a growing divide among people with differing political opinions. Rather than seeing both sides of an issue, people remain locked in their ideological bubbles, reducing the possibility of constructive dialogue. Some critics believe this may leave users open to believing fake news—fabricated articles that appear to be real. Fake news stories often cater to a user's preexisting prejudices and only reinforce entrenched opinions.

Bibliography

El-Bermawy, Mostafa M. "Your Filter Bubble Is Destroying Democracy." Wired, 18 Nov. 2016, www.wired.com/2016/11/filter-bubble-destroying-democracy/. Accessed 18 Dec. 2017.

"How Search Algorithms Work." Google, www.google.com/search/howsearchworks/algorithms/. Accessed 18 Dec. 2017.

Kielburger, Craig, and Marc Kielburger. "How Internet Algorithms Are Dividing Us." WE.org, 11 Feb. 2017, www.we.org/we-schools/columns/global-voices/internet-algorithms-dividing-us/. Accessed 18 Dec. 2017.

Pariser, Eli. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Penguin, 2011.

Rouse, Margaret. "Filter Bubble." TechTarget, whatis.techtarget.com/definition/filter-bubble. Accessed 18 Dec. 2017.

"Short History of Early Search Engines." The History of SEO, www.thehistoryofseo.com/The-Industry/Short‗History‗of‗Early‗Search‗Engines.aspx. Accessed 18 Dec. 2017.

"What Is a Computer Algorithm?" HowStuffWorks.com, 5 Sept. 2001, computer.howstuffworks.com/question717.htm. Accessed 18 Dec. 2017.

"Your Social Media News Feed and the Algorithms That Drive It." Forbes, 15 May 2017, www.forbes.com/sites/quora/2017/05/15/your-social-media-news-feed-and-the-algorithms-that-drive-it/#5047865d4eb8. Accessed 18 Dec. 2017.