The online filter bubble is a natural extension of our preexisting biases, particularly evident in our media consumption. Those of us of a certain age, say above 30, once purchased (and maybe still do) our favorite print newspapers and glued ourselves to our favorite TV news channels. These sources mirrored, for the most part, our cultural and political preferences. The internet took this a step further by building a tightly wound, self-reinforcing feedback loop: we consume our favorite online media, which prompts recommendation algorithms to deliver more of the same. I’ve written about the filter bubble for years (here, here and here).
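That feedback loop can be sketched as a toy model. The click probabilities below are invented for illustration, not drawn from any platform's actual algorithm: the user clicks familiar content more often, and the recommender makes the next feed match the clicks.

```python
def next_share(share, p_click_match=0.8, p_click_other=0.3):
    """One step of the feedback loop.

    `share` is the fraction of the feed matching the user's existing
    preferences. The user clicks matching items more often than
    unfamiliar ones, and the algorithm sets the next feed's share to
    the fraction of clicks that matched. (The click probabilities are
    hypothetical, chosen only to illustrate the dynamic.)
    """
    clicks_match = share * p_click_match
    clicks_other = (1 - share) * p_click_other
    return clicks_match / (clicks_match + clicks_other)

# Start with a perfectly balanced feed and iterate the loop.
share = 0.5
history = [share]
for _ in range(10):
    share = next_share(share)
    history.append(share)

# After a handful of iterations the feed drifts toward
# all-matching content: the bubble seals itself.
print([round(s, 3) for s in history])
```

Even a modest click-through gap is enough: the share of familiar content rises monotonically toward 100%, with no deliberate censorship anywhere in the system.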
The online filter bubble in which each of us lives, at least those of us online, may seem no more dangerous than its offline predecessor. After all, the online version of the NYT delivers left-of-center news, just like its printed cousin. So what’s the big deal? Well, the pervasiveness of our technology has allowed these filters to creep insidiously into many aspects of our lives, from news consumption and entertainment programming to shopping and even dating. And since we now spend an ever-growing share of our time online, our serendipitous exposure to varied content outside this bubble, the kind the real, offline world still offers, is diminishing. Consequently, the online filter bubble plays a far more critical role in maintaining our tunnel vision.
However, that’s not all. Over the last few years we have been exposed to yet another dangerous phenomenon that has made the jump from the offline world to the online one: the echo chamber. The online echo chamber is enabled by our like-minded online communities and catalyzed by the tools of social media. It turns our filter bubble into a self-reinforcing, exclusionary community that is hostile to varied, reasoned opinion and healthy skepticism.
Those of us who reside on Facebook are likely to be part of a very homogeneous social circle, one that trusts, shares and reinforces information accepted by the group and discards information that does not match the group’s social norms. This makes the spread of misinformation, from fake stories and conspiracy theories to hoaxes and rumors, remarkably effective. Worse, this increasingly happens to the exclusion of all else, including real news and accepted scientific fact.
Why embrace objective journalism, trusted science and thoughtful political dialogue when you can get a juicy, emotive meme from a friend of a friend on Facebook? Why trust a story from Reuters or science from Scientific American when you can get your “news” via a friend’s link from Alex Jones and the Breitbart News Network?
And there’s no simple solution, which puts many of our once-trusted institutions in severe jeopardy. Those of us who care have a duty to keep these issues on the minds of our public officials and the guardians of our technology and media networks.
From Scientific American:
If you get your news from social media, as most Americans do, you are exposed to a daily dose of hoaxes, rumors, conspiracy theories and misleading news. When it’s all mixed in with reliable information from honest sources, the truth can be very hard to discern.
Many are asking whether this onslaught of digital misinformation affected the outcome of the 2016 U.S. election. The truth is we do not know, although there are reasons to believe it is entirely possible, based on past analysis and accounts from other countries. Each piece of misinformation contributes to the shaping of our opinions. Overall, the harm can be very real: If people can be conned into jeopardizing our children’s lives, as they do when they opt out of immunizations, why not our democracy?
As a researcher on the spread of misinformation through social media, I know that limiting news fakers’ ability to sell ads, as recently announced by Google and Facebook, is a step in the right direction. But it will not curb abuses driven by political motives.
Read the entire article here.
Image courtesy of Google Search.