Created by SS15-62, 2015

Filter Bubbles and Personalisation


Through the development from a differentiated into an interconnected, globalised world, the Internet has brought about many far-reaching changes. Since the 1990s, technical advances and the growth of social media and networks have created many more opportunities for people to communicate. International companies also profit from this process through more coherent trade and improved interaction between the governments of individual countries. However, the development of the globalised world also leads to negative effects on social media networks such as Google or Facebook.

In 2011, Eli Pariser coined the term “filter bubble” in his book “The Filter Bubble: What the Internet Is Hiding from You”. Pariser is an Internet activist who became known for creating a website with an online petition against terrorism after September 11, 2001, and who later joined MoveOn.org to guide its policy campaigns. His book “The Filter Bubble” drew public attention to the sensitive issue of personalisation on the Internet (TED, 2011).

The term describes the phenomenon that websites use algorithms to predict which information might be relevant for each user. The signals these algorithms consider include, for instance, the type of computer, the location, the browser and the search history. The websites then show only the information the algorithms select for each user; information deemed irrelevant is excluded, and the user is isolated in his or her own “bubble” (Pariser, 2011). The best examples are “Google Personalized Search” results and “Facebook’s personalized news stream”. Facebook’s personalised news stream, specifically its filtering out of information, operates without informing the users. It observes how users interact and filters posts accordingly: if a user rarely clicks on certain information or on someone’s profile, such content disappears from his or her home page (Pariser, 2011). Google shapes every search to fit the person’s taste. The search process is now “personalised”, which means that different users do not get the same information when they search for the same subject with the same words (Halpern, 2011, p. 34). Even when users are logged out, 57 signals are considered to personalise the results (Pariser, 2011). The algorithms determine which information might be best for each user, so users are directed towards material that matches their interests, opinions and attitudes, and each user receives different information. Taken together, these filters give every person his or her own unique filter bubble of online information (Halpern, 2011, p. 34).
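To make this mechanism concrete, the following sketch (in Python) shows how such relevance filtering could work in principle. It is an illustration only: the topic signals, the scoring rule and the threshold are invented here and do not describe Google’s or Facebook’s actual, far more complex systems.

# Toy illustration of relevance filtering. The scoring rule and
# threshold are invented; real systems use far more signals.

def personalised_feed(items, click_history, threshold=0.3):
    """Keep only items whose topics match the user's past clicks.

    items         -- list of (title, set of topics) pairs
    click_history -- dict mapping topic -> number of past clicks
    threshold     -- minimum relevance score; items below it are hidden
    """
    total_clicks = sum(click_history.values()) or 1
    feed = []
    for title, topics in items:
        # Relevance = share of past clicks that fall on this item's topics.
        score = sum(click_history.get(t, 0) for t in topics) / total_clicks
        if score >= threshold:  # low-scoring items are silently dropped
            feed.append((title, round(score, 2)))
    return feed

items = [
    ("Local election results", {"politics"}),
    ("New smartphone review", {"technology"}),
    ("Climate policy debate", {"politics", "environment"}),
]

# A user who mostly clicks on technology stories no longer sees politics.
print(personalised_feed(items, {"technology": 8, "politics": 1}))

The sketch makes the essay’s point visible: the political items are not marked as hidden, they simply never appear, so the user has no way of noticing the filtering.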

The problem is that these filter bubbles depend on the person, yet the person cannot decide which information is stored. The personalisation is invisible, so users have to pay attention to what they search for on Google or click on at Facebook (Pariser, 2011). It only becomes visible when users compare their Google results with one another and watch their Facebook feeds closely. Personalisation may even make democracy vulnerable: users inform themselves only within their own ideas and do not encounter other opinions, new ideas or perspectives. Like the press, Google distributes only the information that is asked for and thereby endangers knowledge by limiting the variety of opinions (Halpern, 2011, p. 34). At the very least, the algorithms would have to make sense, be transparent and be controllable by the users (Pariser, 2011).

After publishing his book about how algorithms create filter bubbles on the Internet, Pariser turned a critical eye on Facebook and its algorithms, drawing on a study published by Facebook itself (Pariser, 2015). The study asks whether the filter bubble effect and its algorithms “will tend to amplify news that your political compadres favor [and how]” (ibid.). Bakshy and Messing, who published the study, chose self-described liberals and conservatives as examples in order to give the study a central theme and to illustrate their results clearly. When self-described liberals or conservatives click on pages and links on Facebook, this does not mean that they see everything they would expect to see. For instance, “there’s an 8% decrease in cross-cutting content from the algorithm vs. a 6% decrease from liberals’ own choices on what to click. For conservatives, the filter bubble effect is about 5%, and the click effect is about 17% — a pretty different picture” (ibid.).

This shows that the Facebook algorithms do have an effect, but in some cases it is the minor one: for conservatives, the 17% click effect clearly outweighs the 5% algorithmic effect. Another important point is that liberals and conservatives also differ in how much so-called “cross-cutting” content, i.e. content from the other political side, they are shown (ibid.).
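A rough back-of-the-envelope calculation makes these numbers easier to compare. The assumption that the two effects combine multiplicatively is ours and simplifies the study’s actual methodology; the percentages are the ones quoted above (Pariser, 2015).

# Rough comparison of the two effects quoted above (Pariser, 2015).
# Assumption (ours, not the study's method): the algorithmic effect
# and the users' own click effect combine multiplicatively.

effects = {
    "liberals": {"algorithm": 0.08, "own_clicks": 0.06},
    "conservatives": {"algorithm": 0.05, "own_clicks": 0.17},
}

for group, e in effects.items():
    remaining = (1 - e["algorithm"]) * (1 - e["own_clicks"])
    ratio = e["algorithm"] / e["own_clicks"]
    print(f"{group}: ~{remaining:.0%} of cross-cutting content remains; "
          f"algorithm effect is {ratio:.2f}x the click effect")

Under this simple reading, the two effects are of similar size for liberals, while conservatives’ own choices filter out more than three times as much cross-cutting content as the algorithm does.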

As the study’s key chart shows, if everyone saw random samples of every topic, liberals “would see 45% conservative content and conservatives would see about 40% liberal content” (ibid.). The “exposed” figures show what liberals and conservatives actually saw when clicking on news links, and this is less than Bakshy and Messing had expected. This is exactly what Facebook’s algorithms do: they influence the clicks and the content that is shown (ibid.). As the study illustrates, however, the effect of Facebook’s filter bubbles and algorithms on users is smaller than expected; even Pariser (2015) had assumed that the algorithms have a major influence. For the articles shown on Facebook, it is more relevant who one’s friends are and what they do. The most significant factor remains people’s individual choice of which pages they prefer to see; it matters more than the algorithms (ibid.).

However, it must be stated that algorithms mediate ever more of what we do on the Internet. They essentially determine which version of the world we see and continuously influence Facebook and its users. When Pariser’s filter bubble thesis of 2011 and the study by Bakshy and Messing are placed side by side, the connection between them is obvious: both criticise the growing impact of filter bubbles and algorithms on people in social media networks. Google and Facebook in particular illustrate the negative effects on the interconnectedness of people in a globalised world.

References

  • Halpern, S. (2011). Mind control & the Internet. The New York Review of Books, 58, 33-35.
  • Pariser, E. (Producer). (2011, March). Beware online “filter bubbles”. TED. Podcast retrieved June 30, 2015, from http://www.ted.com/
  • Pariser, E. (2015, May 7). Did Facebook’s Big New Study Kill My Filter Bubble Thesis? [Web log post]. Retrieved June 30, 2015, from https://medium.com/backchannel/facebook-published-a-big-new-study-on-the-filter-bubble-here-s-what-it-says-ef31a292da95
  • TED (2011). Eli Pariser. Retrieved June 30, 2015, from https://www.ted.com/speakers/eli_pariser