How to avoid filter bubbles and free your mind

For the thirtieth time today, I mindlessly open Facebook and see what’s in the newsfeed.

I see some articles about the latest political issues in both the United States and Australia.

The filter bubble is in operation, but I’m not really aware it’s happening. I keep scrolling.

An article now appears about climate change being man-made and renewable energy providing some hope.

I click, read the article for about 30 seconds and then share it with a few friends.

Again, my perspective is being limited by the filter bubble. But it’s not really clear to me this is happening. I feel like I’ve learned something and I get back to scrolling.

Then I see the latest news from the Australian Football League, which I engage with far more passionately.

I now feel like an informed person for seeing the latest that’s happening in politics and reading about how we can combat climate change. I’m also now able to converse with my family about what’s happening in the AFL.

The filter bubble is in full force, and it’s limiting my perspective of the world.

It was never a conscious decision for Facebook to become my primary news source. But it’s now how I stay informed. I’m quite typical. According to a recent survey by the Pew Research Center, 68% of American adults at least occasionally get their news from social media.

According to the same survey, 57% of people who get their news from social media expect the news to be largely inaccurate.

Fake news is a problem that has become well-known since the last US election, and Facebook is doing everything it can to fight it.

But there’s a potentially larger and more insidious problem than fake news.

It’s the filter bubble, and it’s keeping people trapped in very limited ways of thinking.

What is a filter bubble?

According to Wikipedia, a filter bubble is “a state of intellectual isolation that allegedly can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior and search history.”

Let’s break this down a little further.

When we use Google, Facebook, Twitter, and many other websites, algorithms determine what we are shown. These algorithms are based on how we have previously used these websites and our Internet browsing history.

This makes sense. Facebook, for example, can’t possibly show us everything that our friends are posting online. It makes a selection based on what we are most likely to engage with.

The problem is this:

We are more likely to engage with content that aligns with our current perspective. For example, if we lean liberal politically, we’ll be shown content from a liberal perspective. If we’re more conservative, Facebook will learn over time that we engage with content that leans further right.
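To make this feedback loop concrete, here’s a minimal sketch in Python of how an engagement-optimizing feed can narrow what we see over time. This is an invented toy model, not Facebook’s actual system: the feed ranks posts by a learned per-leaning engagement score, and every click pushes that score further toward the user’s existing lean.

```python
import random
from collections import defaultdict

# Toy model of an engagement-optimizing feed (invented, not Facebook's
# actual algorithm). The feed learns one engagement score per political
# leaning and ranks candidate posts by it.
random.seed(42)
posts = [{"id": i, "leaning": random.choice(["left", "right"])} for i in range(200)]
scores = defaultdict(lambda: 1.0)  # prior: every leaning starts equal

def build_feed(candidates, k=5):
    """Show the k posts with the highest learned engagement score."""
    return sorted(candidates, key=lambda p: scores[p["leaning"]], reverse=True)[:k]

def simulate_session(user_leaning):
    for post in build_feed(random.sample(posts, 20)):
        # We click on aligned content far more often than opposing content.
        click_prob = 0.8 if post["leaning"] == user_leaning else 0.2
        if random.random() < click_prob:
            scores[post["leaning"]] += 1.0  # each click reinforces the bias

for _ in range(50):
    simulate_session("left")
print(dict(scores))  # the user's own leaning now dominates the ranking
```

After a few dozen simulated sessions, the feed is almost entirely aligned with the user’s starting lean, even though nothing in the code “intends” that outcome.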

It’s not a conspiracy. Facebook hasn’t set out to keep people trapped in their current ways of thinking. Rather, Facebook is designed for people to continue logging in to the website every day. It wants us to find something we want to engage with.

If we were shown content in our newsfeed that challenged how we currently see the world, we would probably start to find other sources of news.

This may be better for us as individuals. But it’s not good for Facebook, which wants us to spend more time on the site every day so that we become more valuable to advertisers.

Other technology companies are driven by similar incentives.

In the early 2000s, US legal scholar Cass Sunstein issued a warning about this eventuality.

The Internet, he said, might allow us to establish a global commons, overcoming the geographical and social barriers between people by giving everyone access to a broader range of perspectives. But it was equally possible that we would simply erect new walls, as like-minded people siphon themselves into homogeneous groups that share the same viewpoints and gather their views from the same sources.

“Although millions of people are using the Internet to expand their horizons, many people are doing the opposite, creating a Daily Me that is specifically tailored to their own interests and prejudices,” he wrote.

It may not seem like such a big issue. But filter bubbles create echo chambers: situations in which our beliefs are amplified or reinforced by communication within a closed system.

You would think the Internet would be the technology that enables a diversity of ideas to spread around the world and unite people from different communities. Instead, filter bubbles have arisen and are contributing to polarization around the world.

How can we avoid filter bubbles?

Most recommendations for how to avoid filter bubbles will focus on eliminating the impact of personalization by technology platforms.

This is an aspect of avoiding filter bubbles, so let’s go through a few tips:

  • Use incognito browsers, regularly delete your search histories, and try to use the Internet as much as possible without being logged into social media accounts (see the sketch after this list).
  • Delete or block browsing cookies.
  • Actively read news websites that promote a diversity of perspectives.
  • Actively follow thought leaders on social media who share more balanced viewpoints.
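On the first tip: what incognito mode mostly buys you is a session with no stored cookies, so a site can’t link your visit to your history. A rough illustration in Python (the URL is just an example):

```python
import requests

# A fresh Session starts with an empty cookie jar -- roughly what an
# incognito window gives a browser: no stored identity for the site
# to personalize against.
with requests.Session() as session:
    response = session.get("https://example.com")
    print(response.status_code)
    print(session.cookies.get_dict())  # only cookies set during this session
```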

This is only the first step. As we’ll see below, the problem also comes from the way major media companies are incentivized to create increasingly polarized content to feed an audience that demands it.

Therefore, another key step we need to take is seeking out perspectives that are more nuanced and balanced.

I don’t think it’s effective to simply avoid mainstream media sites and replace them with smaller, more independent sites. This would probably help at an individual level, and I certainly encourage you to seek out alternative sources of news.

However, at a bigger picture level, the reality is that mainstream media sites will become increasingly important players on social media in the years ahead. This is because Google and Facebook are giving more “weight” to “reputable” news sources in their attempts to combat misinformation and fake news.

Therefore, I think that as consumers we need to increase the demand for more nuance and balance in the ways in which mainstream media outlets report on the news.

These nuanced and balanced perspectives are out there. They are just drowned out by the sensationalist and polarized content that gets the clicks and views.

Eli Pariser on how filter bubbles operate

The concept of filter bubbles was brought to the world’s attention in 2012 by Eli Pariser with the publication of The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Pariser, an internet activist, is also a co-founder of the viral news site Upworthy.

In his fascinating book, Pariser details how Google searches bring up dramatically different search results based on the browsing history of the user.

For example, searching for “BP” brings up news related to investing in the company for some people and news about its oil spills for others.

Pariser refers to your computer as “a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.”

This one-way mirror is the filter bubble in operation, creating a “personal ecosystem of information”.

The result:

We are insulated from cognitive dissonance; in simple terms, we are rarely shown information that differs from what we currently believe.

Every time we click on a link, watch a video, comment on something or share a news article, we are communicating information to the major platforms about what we are interested in.

As Pariser says:

“What was once an anonymous medium where anyone could be anyone — where, in the words of the famous New Yorker cartoon, nobody knows you’re a dog — is now a tool for soliciting and analyzing our personal data.”

Technology companies use our personal data to show us highly targeted advertisements.

“While Gmail and Facebook may be helpful, free tools, they are also extremely effective and voracious extraction engines into which we pour the most intimate details of our lives. As a business strategy, the Internet giants’ formula is simple: The more personally relevant their information offerings are, the more ads they can sell, and the more likely you are to buy the products they’re offering. And the formula works. Amazon sells billions of dollars worth of merchandise by predicting what each customer is interested in and putting it in the front of the virtual store.”

We feed the technology companies relevant information about ourselves in ways we probably don’t realize. Pariser explains some of the methods Amazon uses to get to know you better:

“When you read books on your Kindle, the data about which phrases you highlight, which pages you turn, and whether you read straight through or skip around are all fed back into Amazon’s servers and can be used to indicate what books you might like next. When you log in after a day reading Kindle e-books at the beach, Amazon can subtly customize its site to appeal to what you’ve read: If you’ve spent a lot of time with the latest James Patterson, but only glanced at that new diet guide, you might see more commercial thrillers and fewer health books.”

It works like this:

“Most personalized filters are based on a three-step model. First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media. There’s just one flaw in this logic: Media also shape identity. And as a result, these services may end up creating a good fit between you and your media by changing … you.”
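Here’s a toy illustration of that three-step loop and its flaw, using a single made-up number for political leaning (nothing here comes from Pariser’s book; it’s just the logic of the quote rendered in code):

```python
# Toy version of the three-step loop. Political leaning is a single
# made-up number from -1.0 (left) to +1.0 (right).
user_pref = 0.2      # the user's actual leaning: slightly right of center
platform_est = 0.0   # step 1: the platform's initial guess about the user

for _ in range(20):
    served = platform_est                        # step 2: serve content matching the guess
    engagement = 1.0 - abs(user_pref - served)   # closer content earns more engagement
    platform_est += 0.5 * (user_pref - platform_est) * engagement  # step 3: tune the fit
    user_pref += 0.1 * (served - user_pref)      # the flaw: media also shape identity

print(f"platform estimate: {platform_est:.2f}, user preference: {user_pref:.2f}")
```

The platform’s estimate and the user’s actual preference converge, but not only because the estimate got better: the user moved too.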

These algorithms continue to feed us a steady diet of relevant information (and show us highly relevant advertisements). But, according to Pariser, we need a more balanced diet that also includes information that is uncomfortable, challenging and important. He advocates for algorithms that encode a sense of civic responsibility and public discourse that allow us to have more balanced perspectives. This is how we can unlock the full potential of the Internet.

The Internet should be something that introduces us to new ideas, new people and differing perspectives. But filter bubbles stop this from happening.


Does social media keep us “trapped” inside filter bubbles?

Are we really “trapped” inside filter bubbles by technology companies, as Pariser implies?

According to Mark Zuckerberg, founder of Facebook, the answer is no.

In 2016, Mark Zuckerberg denied the operation of filter bubbles in Facebook’s news feed:

“So we have studied the effect that you’re talking about, and published the results of our research that show that Facebook is actually, and social media in general, are the most diverse forms of media that are out there. And basically what – the way to think about this is that, even if a lot of your friends come from the same kind of background or have the same political or religious beliefs, if you know a couple of hundred people, there’s a good chance that even maybe a small percent, maybe 5% or 10% or 15% of them will have different viewpoints, which means that their perspectives are now going to be shown in your News Feed.

“And if you compare that to traditional media where people will typically pick a newspaper or a TV station that they want to watch and just get 100% of the view from that, people are actually getting exposed to much more different kinds of content through social media than they would have otherwise or have been in the past. So it’s a good sounding theory, and I can get why people repeat it, but it’s not true. So I think that that’s something that if folks read the research that we put out there, then they’ll see that.”

We are probably not “trapped” in some kind of Orwellian nightmare where technology companies brainwash us and prevent us from seeing opposing perspectives.

Instead, we are the ones who keep ourselves trapped by filter bubbles. Facebook enables it to happen.

Let me explain.

Media consumption based on political identities

In December 2018, Matt Grossmann wrote a report for the Knight Foundation titled Partisan Media and Political Distrust (you can read the summary here).

“Media choice has become more of a vehicle of political self-expression than it once was,” Grossmann writes. “Partisans therefore tend to overestimate their use of partisan outlets, while most citizens tune out political news as best they can.”

Zuckerberg is probably right that social media shows us a greater variety of perspectives than if we only got our news from one news source.

But many of us still do get most of our news from Facebook. At the same time, the Knight Foundation research also suggests that we increasingly use our consumption of news media to signify our political identities.

Media consumption is increasingly becoming “an expressive political act.” For example, right-leaning Republicans believe they should stick to Fox News, while left-leaning Democrats believe they should be loyal to MSNBC.

Social media surfaces a variety of polarized media

Social media and search users are being exposed to more diverse perspectives. A Pew survey around the 2016 US Presidential Election found that the majority of people encounter a range of opinions in their social media feeds. A University of Ottawa study came to similar conclusions: surveying 2,000 British adults, the researchers found that most people regularly seek out perspectives different from their own, and most have a reasonable idea of what the “other side” thinks about a particular issue.

But exposure to alternative perspectives isn’t leading people to more moderate or nuanced views. In fact, those who were exposed to alternative perspectives ended up becoming more confident in their initial viewpoint.

Another study carried out a linguistic analysis of Republican voters who were exposed to liberal perspectives. Exposure increased the negative sentiment and emotive language they used when talking about liberal opinion leaders.

This suggests that our blinkered perspectives may not be directly caused by filter bubbles. Rather, we are unconsciously becoming more blinkered even when we are exposed to alternative perspectives.

The social psychologist Jonathan Haidt from New York University explains it as follows:

“If you look at any measures of what people think about people on the other side, [they] have become vastly more hostile.”

We want to consume content that fits with our political identities and what we already believe to be true.

The issue isn’t that social media sites and search engines are preventing us from seeing alternative perspectives. They do personalize what we see and show us content we are more likely to engage with. But they also do a good job of showing us a range of perspectives.

The issue is that the news that makes it into newsfeeds is becoming increasingly polarized.

Media sites are incentivized to create their own echo chambers

Filter bubbles shape the world we live in, but at the same time, we are also being exposed to perspectives from outside our own echo chambers.

How do we make sense of this?

We need to also understand the role that media companies and public intellectuals play in actively creating their own echo chambers.

They are incentivized to do this because of the operation of filter bubbles. To build an audience, they need to create and share content that will resonate with people who see the media they consume as an “expressive political act.”

At the same time, people are becoming more polarized and tribal in the news they consume, as asserted by the Knight Foundation report.

Media companies are therefore incentivized to create content that is polarized in order to reach consumers who see media consumption as an expressive political act.

The Reuters Institute recently released findings that support this point. Its Digital News Report, based on surveys of 70,000 people in 13 countries, focuses on modern-day news consumption and reveals rising polarization in the United States.

People are increasingly sticking to what they see as “reliable” news sources that fit their political identities. For example, two-thirds of conservatives watch Fox News, and 19% regularly visit the ultra-right website Breitbart.com.

While right-wing Americans favor Fox News, liberals gravitate to outlets such as BuzzFeed, CNN or The New York Times for their news.

[Image: Reuters Institute audience map for the United States]

This left-right spectrum isn’t unique to the United States. It appears in Australia and in most of Europe, where politics is characterized by a left-right divide. For example, see the results for the United Kingdom below.

[Image: Reuters Institute audience map for the United Kingdom]

It’s crucial for media companies to create content that reaches their audience. In the current climate, their audiences are becoming more polarized. This incentivizes media companies to create content that is more polarized in nature.

At the same time, social media is becoming very effective at showing us content we are more likely to engage with. It also shows us a diversity of perspectives, at least in comparison to when we got our news primarily from newspapers and magazines.

But as consumers, we are becoming more polarized and seeing alternative perspectives simply makes us surer of what we already believe. We quickly move past articles in our feed that may help us develop a more nuanced perspective of the world.

Social media isn’t directly keeping us trapped in limited ways of thinking. It’s simply an enabler, bringing us more of the polarized media that we increasingly desire.

We need social media to promote civic engagement and more balanced perspectives

Social media isn’t directly causing the increase in polarization we see around the world. But it’s a big part of the problem.

We are immersed in a constant stream of unbelievable outrages perpetrated by “the other side”. Social media makes it easier for us to see this and engage with others who are part of our own echo chambers.

It’s unlikely that we can increase trust and understanding between people while social media is our primary way of communicating about politics.

We have to recognize that we’re in a crisis and that the left-right divide is probably unbridgeable with our current methods of communication and news consumption.

This is a real shame, as we need politics to unite people around large-scale agendas that require transcending the current left-right divide.

Polarization is here to stay for many decades, and it’s probably going to get worse. So a key question we need to ask ourselves is: how do we adapt our democracy for life under intense polarization?

In the meantime, each of us needs to take responsibility for avoiding filter bubbles and echo chambers.

Building a search engine for nuanced and balanced perspectives

This is why at Ideapod we are creating a “filter bubble buster”. It’s a new kind of search engine for the news, surfacing nuanced and balanced perspectives and filtering out news media that contributes to a polarized body politic.

We are doing this by conducting a number of social network analyses, mapping out hundreds of “echo chambers”: knowledge-based communities that regularly share news items with one another. These echo chambers will be mapped according to the political identities they express. For example, we will isolate echo chambers along the left-right continuum, while also applying a cross-sectional analysis of groups of people who share similar perspectives on provocative issues such as gun control.
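As a rough sketch of how echo chambers can be detected (the graph below is invented, and our actual analysis is more involved than this), standard community-detection algorithms on a sharing network already reveal the clusters:

```python
import networkx as nx
from networkx.algorithms import community

# Hypothetical sharing network: an edge means two accounts regularly
# share the same news items. Names and structure are invented.
G = nx.Graph()
G.add_edges_from([
    ("a1", "a2"), ("a2", "a3"), ("a1", "a3"),  # one tightly-knit cluster
    ("b1", "b2"), ("b2", "b3"), ("b1", "b3"),  # another cluster
    ("a3", "b1"),                              # a single bridge between them
])

# Greedy modularity maximization splits the graph into densely
# connected groups -- candidate "echo chambers".
chambers = community.greedy_modularity_communities(G)
for i, chamber in enumerate(chambers):
    print(f"echo chamber {i}: {sorted(chamber)}")
```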

Most content shared online does very well within one particular echo chamber, but won’t surface within echo chambers where the participants share an opposing perspective.

But at times, content about provocative issues is shared within multiple echo chambers. These pieces of content therefore represent a perspective that brings people together in conversation.

For example, imagine an article about gun control in the United States that was shared within both a pro-Donald Trump echo chamber and a pro-Bernie Sanders echo chamber. The article may not attract a large number of clicks or represent much value to the mainstream media outlet that published it. But our search engine would surface it, because it contains a perspective that resonates with people from diverse echo chambers.
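A minimal sketch of that surfacing logic, with made-up share counts (the real ranking would be more sophisticated than this):

```python
# Made-up share counts per echo chamber for three hypothetical articles.
shares = {
    "article_viral_outrage": {"pro_trump": 9500, "pro_sanders": 40},
    "article_nuanced_guns":  {"pro_trump": 700,  "pro_sanders": 650},
    "article_partisan_take": {"pro_sanders": 8000},
}

def cross_chamber_score(counts, min_shares=100):
    """Score content by how many chambers it genuinely reaches,
    breaking ties by total reach. Single-chamber hits score zero."""
    chambers_reached = sum(1 for c in counts.values() if c >= min_shares)
    return (chambers_reached if chambers_reached > 1 else 0, sum(counts.values()))

ranked = sorted(shares, key=lambda a: cross_chamber_score(shares[a]), reverse=True)
print(ranked)  # the nuanced article ranks first despite far fewer total shares
```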

This is the content our filter bubble buster will surface.

We’re making our filter bubble buster available to journalists by the end of 2019, ahead of the 2020 US presidential election campaigns. Our hope is that it will help journalists use sources that are more nuanced and balanced in perspective.

If you’d like to be the first to use an early version of our prototype, you can sign up here.

Did you like my article? Like me on Facebook to see more articles like this in your feed.

