In the days, weeks, and months following the highly controversial outcome of the 2016 U.S. Presidential Election, the term “filter bubble” has been tossed around by liberals and conservatives alike—but what exactly does it entail? Is it a false theory perpetrated by those who are upset by the election, crafted simply to place blame on the Internet rather than on their own political parties? Is it a phenomenon caused by the decisions of social media users themselves and not at all the fault of Facebook’s algorithms? Is it a real issue that should be studied and eliminated? Each of these claims has been supported by vast numbers of people, to the point where the “filter bubble” has become a household phrase. According to journalist John Bohannon of Science Magazine, “political scientists have wondered [for years] whether the social network’s news feed selectively serves up ideologically charged news while filtering out content from opposite political camps”—which is the theory behind the filter bubble (Bohannon 1). Eli Pariser, the author of a 2011 book on filter bubbles, agrees that “it’s much easier than it’s ever been to live in an information environment that is several standard deviations from normal” (Brodeur 2). Perhaps social media filter bubbles have existed for longer than this generation has imagined, and the recent election merely confronted surprised voters with the realization of just how potent they might be.
The term “filter bubble” is defined as “the tendency of social networks…to lock users into personalized feedback loops, each with its own news sources, cultural touchstones, and political inclinations,” according to Amanda Hess of The New York Times (Hess 1). Many blame the networks themselves—usually Facebook and Twitter—for this phenomenon, but Michael Andor Brodeur of The Boston Globe disagrees, calling filter bubbles “self-styled personal ecosystems of information we burrow ourselves into” (Brodeur 1). Others believe that the bubbles are caused in part by users’ own behavior and choices but also by the algorithms these websites use to enhance the browsing experience. For example, “when you do a search on Google…the results you get back will differ depending on what the company knows about you” (Bohannon 2). No matter the origin, however, it is important that social media users become aware of their own filter bubbles and the threats they may pose.
A filter bubble creates an environment that is friendly to an individual’s political opinions and exposes them only to posts and stories with which they will agree. Perhaps this sounds harmless in theory, but it actually polarizes the user politically, allowing them to sink into the deep end of their ideology. Liberal or conservative, left or right, the more they click on news stories that comment positively on their side or negatively on the other, the less inclined they will be to analyze or even consider an opposing opinion. John Bohannon of Science Magazine believes that this division of media exposure could eventually lead to total political extremism in both major parties and, ultimately, national chaos. Views like this can seem a bit far-fetched, and many are skeptical, but it isn’t so hard to imagine the consequences of a society in which “liberals and conservatives…rarely learn about issues that concern the other side simply because those issues never make it into their news feeds” (Bohannon 2). Brodeur agrees that “these are not ideal conditions for a productive cultural conversation” (Brodeur 2). For these reasons, it is likely in the country’s best interest to attempt to defeat the filter bubble phenomenon.
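To make this feedback loop concrete, the short sketch below is a purely hypothetical illustration: it is not Facebook’s, Twitter’s, or Google’s actual ranking code, and every name and number in it is invented. It simply shows how a feed that ranks stories by similarity to a user’s click history can narrow, and gradually shift, what that user sees.

```python
# Hypothetical sketch of a personalization feedback loop. This is not any
# platform's real algorithm; it only illustrates the dynamic described above.
import random

# Each story gets a political "lean" from -1.0 (left) to +1.0 (right).
STORIES = [{"id": i, "lean": random.uniform(-1.0, 1.0)} for i in range(200)]

def rank_feed(stories, profile_lean, size=10):
    """Return the stories whose lean is closest to the user's profile."""
    return sorted(stories, key=lambda s: abs(s["lean"] - profile_lean))[:size]

def simulate(profile_lean=0.1, rounds=30, learning_rate=0.3):
    """Clicks on congenial stories nudge the profile, which narrows the next feed."""
    for _ in range(rounds):
        feed = rank_feed(STORIES, profile_lean)
        # Confirmation bias: the user clicks the story in the feed that
        # leans furthest toward their own side.
        side = 1.0 if profile_lean >= 0 else -1.0
        clicked = max(feed, key=lambda s: s["lean"] * side)
        profile_lean += learning_rate * (clicked["lean"] - profile_lean)
    return profile_lean, rank_feed(STORIES, profile_lean)

final_lean, final_feed = simulate()
print(f"profile lean after 30 rounds: {final_lean:+.2f}")
print("leans of the final feed:", [round(s["lean"], 2) for s in final_feed])
```

In this toy model the profile drifts toward whichever stories get clicked, and the final feed clusters tightly around that drifted profile, which is the narrowing dynamic described above.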
Brodeur claims that “it’s highly unlikely that Facebook will take steps to burst its own bubble,” but a variety of tools are available to individual users. The Google Chrome extension PolitEcho, which “crawls through [a user’s] Facebook network and visualizes its political bias,” is one option that is becoming increasingly popular (Hess 2). FlipFeed, a plug-in coded by researchers at M.I.T., allows users to experience the Twitter feed of a “random, anonymous user of a different political persuasion” (Hess 2). Read Across the Aisle, an iPhone application, alters its users’ news-reading experience—as they browse articles within the app, an onscreen meter fades from red to blue and back again “based on the particular site’s ideological bent” (Hess 2). Escape Your Bubble, another Google Chrome plug-in, deploys a different tactic: described as “aggressively positive,” it sprinkles stories and posts of opposing political views throughout a user’s Facebook feed and labels them with a pink heart icon, a smiley face, and the message “happily inserted by your EscapeYourBubble Chrome Extension” (Hess 2). The idea behind all of these plug-ins and applications is the benefit of “stripping opposing ideas of their negative emotional impact” (Hess 2). Heavy emotional influence is, after all, one of the leading motivators behind a person’s political ideology.
While not entirely conceding that Facebook’s filter bubbles are caused by its algorithms, CEO Mark Zuckerberg does agree that there is a “need to grow local news outlets…and present people with a range of perspectives,” which is just what these web extensions can accomplish (Hess 4). Zuckerberg’s net worth has plummeted by close to four billion dollars since the 2016 Presidential Election, and many believe this is due to the claims leveled against Facebook’s algorithms. Others see this as finger-pointing, however, and stand firm in their belief that social media played no role in the election’s outcome. A 2015 study found that “Facebook’s news feed algorithm does indeed create an echo chamber effect,” but concluded that it is nothing to fear—according to the research, “the algorithm made it only 1% less likely for users to be exposed to politically cross-cutting stories” (Bohannon 3). This study was performed, however, by Facebook’s own in-house social scientists and could be heavily biased. Even so, they did produce at least one statement of truth: “the power to expose oneself to perspectives from the other side in social media,” they stated, “lies first and foremost with individuals” (Bohannon 3). To have any hope of defeating the filter bubble, whether it is a product of computer code or of each user’s own internal bias, social media users need to become more aware of what they are reading and why, and work to expose themselves to new beliefs and opposing ideologies.