Your Internet Might Be Different From Mine

11 January 2018

When we get online, we may not see the same Internet as others. All of us live in what's referred to as the "filter bubble": a manufactured version of reality produced by algorithms tailoring content to our interests. The result is an echo chamber that reflects the beliefs social media and search engines think we hold. We may feel more connected and better informed, even when the opposite is true.

The online world has quietly evolved from one of openness and community to one divided by algorithms and artificial intelligence. Unfortunately, human nature makes challenging our beliefs uncomfortable, so we haven't noticed as our feeds have changed to reflect us. Our social media is comfortable but more divisive than we could have ever imagined. It's relatively rare for our feeds to show us opposing views, and our own behavior gives sites a disincentive to show them to us, because we're less likely to click on them.

The divisive nature of the filter bubble is most obvious when it comes to politics, especially with the 2016 election. A post titled "Why I'm Voting for Donald Trump" was shared over 1.5 million times on Facebook. Another, titled "There are five living U.S. presidents. None of them support Donald Trump," was shared 1.7 million times. Depending on which way Facebook thinks you lean politically, you likely saw only one or the other, and only saw content related to the one you saw. This is why the fact that Hillary Clinton won the popular vote by a substantial margin, and that Donald Trump became president, were each so shocking to the opposing side. It's also why some people continue to believe that Trump won the popular vote, though he didn't.

The degree to which sites change for different people varies. Facebook and other social media tend to be the most drastic, while search results on Google tend to be the least affected (though there are exceptions). Personalization is no secret; Facebook talks about it in their help center and Google offers settings for news and ad targeting. What we don't know is everything that's hidden, or why, as most sites keep their algorithms secret.

What makes the effect interesting is that it doesn't only affect us as individuals (and where it does, the effects may not yet be as major as we might think). Eli Pariser, who coined the term "filter bubble," points out that some of the people most reliant on social media are journalists, and their own filter bubbles may be influencing what they write about.

The fact is, we don't fully understand the effect that social media and the filter bubble have now, and we don't know how it will evolve in the future. Facebook, Google, and other sites could take steps to reduce algorithmic bias and to help us break out of our filter bubbles, but for now they don't have much reason to. Until then, our filter bubble is ours to break out of, because we know we're not always seeing both sides.

Care about what the web is doing to our minds? Check out my book, The Thought Trap, at book.thenaterhood.com.
