I’ve been writing about the filter bubble for quite some time. The filter bubble refers to the tendency of online search tools, and now social media, to screen and deliver results that fit our online history and profile, thereby returning only results deemed relevant. Eli Pariser coined the term in his book The Filter Bubble, published in 2011.
The filter bubble presents us with a clear Faustian bargain: give up knowledge and serendipitous discovery of the wider world in exchange for narrow, personalized news and information that matches our immediate needs and agrees with our profile.
As the customer service systems of online retailers and media companies become ever more attuned to their shoppers’ and members’ preferences, the power of the filter bubble grows ever greater.
The filter bubble ensures that digital consumers see more content that matches their preferences and, by extension, continues to reinforce their opinions and beliefs. Conversely, consumers see less and less content that diverges from historical behavior and calculated preferences, often called “signals”.
And, that’s not a good thing.
What of diverse opinions and diverse views? Without a plurality of views and a rich spectrum of positions, creativity loses its battle with banality and conformity. So how can digital consumers break free of the systems that deliver custom recommendations, filter content and reduce serendipitous discovery?
Personalization technology that allows marketers and media organizations to customize their products and content specifically to you seems to be a win-win for all: businesses win by addressing the needs — perceived or real — of specific customers; you win by seeing or receiving only items in which you’re interested.
But this is a rather simplistic calculation, for it fails to address the consequences of narrow targeting and a cycle of blinkered self-reinforcement, resulting in tunnel vision. More recently this has become known as the filter bubble. The filter bubble eliminates serendipitous discovery and reduces creative connections by limiting our exposure to contrarian viewpoints and the unexpected. Or, to put it more bluntly, it helps maintain a closed mind. This is true while you sit on the couch surfing the internet and, increasingly, while you travel.
Last week Amazon purchased Goodreads, the online book review site. Since 2007 Goodreads has grown to become home to over 16 million members who share a passion for discovering and sharing great literature. Now, with Amazon’s acquisition, many are concerned that this represents another step towards a monolithic and monopolistic enterprise that controls vast swathes of the market. While Amazon’s innovation has upended the bricks-and-mortar worlds of publishing and retailing, its increasingly dominant market power raises serious concerns over access, distribution and choice. This is another worrying example of the so-called filter bubble — where increasingly edited selections and personalized recommendations act to limit and dumb down content.
A decade ago, in another place and era, during my days as director of technology research for a Fortune X company, I tinkered with a cool array of then-new personalization tools. The aim was simple: use some of these emerging technologies to deliver a more customized and personalized user experience for our customers and suppliers. What could be wrong with that? Surely, custom tools and more personalized data could do nothing but improve knowledge and enhance business relationships for all concerned. Our customers would benefit from seeing only the information they asked for, our suppliers would benefit from better analysis and filtered feedback, and we, the corporation in the middle, would benefit from making everyone in our supply chain more efficient and happy. Advertisers would be happier still, since more focused data would let them deliver messages that were ever more precise and relevant, based on personal context.
The online filter bubble is a natural extension of our preexisting biases, particularly evident in our media consumption. Those of us of a certain age — above 30 years — once purchased (and maybe still do) our favorite paper-based newspapers and glued ourselves to our favorite TV news channels. These sources mirrored, for the most part, our cultural and political preferences. The internet took this a step further by building a tightly wound, self-reinforcing feedback loop. We consume our favorite online media, which prompts algorithms to deliver more of the same. I’ve written about the filter bubble for years (here, here and here).
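That self-reinforcing feedback loop can be sketched as a toy simulation. This reflects no real recommender’s internals; it simply assumes a hypothetical system that always serves the topic a user has clicked most, and shows how a mild initial preference hardens into dominance:

```python
from collections import Counter

def recommend(history: Counter) -> str:
    """Serve the topic the user has engaged with most --
    the 'more of the same' logic that drives the loop."""
    return history.most_common(1)[0][0]

def simulate(rounds: int) -> list:
    # A mild starting bias: one extra click on politics.
    history = Counter({"politics": 2, "science": 1, "sports": 1, "arts": 1})
    shares = []
    for _ in range(rounds):
        topic = recommend(history)
        history[topic] += 1  # the user clicks what is served
        shares.append(history[topic] / sum(history.values()))
    return shares

shares = simulate(20)
print(f"share of dominant topic: {shares[0]:.2f} -> {shares[-1]:.2f}")
# prints: share of dominant topic: 0.50 -> 0.88
```

After twenty rounds the dominant topic’s share climbs from half to nearly nine-tenths of everything consumed — and the simulated user never changed their tastes; only the recommender narrowed what they saw.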
I’ve written about the online filter bubble for a while now. It’s an insidious and disturbing consequence of our online world. It refers to the phenomenon whereby our profile, personal preferences, history and connections pre-select and filter the type of content that reaches us, eliminating things we don’t need to see. The filter bubble reduces our exposure to the wider world of information and serendipitous discovery.
If this were not bad enough, the online world enables a much more dangerous threat: hidden bias through explicit manipulation. We’re all familiar with the pull and push exerted by the constant bombardment of overt advertising. We’re also familiar with the more subtle techniques of ambient and subliminal control, which aim to sway our minds without our conscious awareness — think mood music in your grocery store (it really does work).
You may be an anonymous data point online, but it does not follow that you are safe from personal discrimination. As the technology to gather and track your every move online steadily improves, so do the opportunities to misuse that information. Many of us are already unwitting participants in the growing internet filter bubble — a phenomenon that amplifies our personal tastes, opinions and shopping habits by pre-screening content and delivering only more of the same based on our online footprints. Many argue that this is benign, even beneficial — after all, isn’t it wonderful when Google’s ad network pops up product recommendations on “random” websites based on your previous searches? And isn’t it that much more effective when news organizations deliver only stories that match your previous browsing history, interests, affiliations or demographic?
What do you get when you take a social network, add a sprinkle of mobile telephony, and throw in a liberal dose of proximity sensing? You get the first “social accessory”: a device that creates a proximity network around you as you move about your daily life. Welcome to the world of yet another social networking technology startup, this one called magnetU. The company’s tagline is:
It was only a matter of time before your social desires became wearable!
magnetU markets a wearable device, about the size of a memory stick, that lets people wear and broadcast their social desires, allowing immediate social gratification anywhere and anytime. When a magnetU user comes into proximity with others who have similar social profiles, the system notifies the user of a match. A social match is signaled as “attractive”, “hot” or “red hot”. So, if you want to find a group of anonymous like minds (or bodies) for some seriously homogeneous partying, magnetU is for you.
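magnetU hasn’t published how its matching actually works, so as a purely hypothetical sketch, assume each profile is a set of declared interests and the tier comes from a simple overlap score (Jaccard similarity) with invented thresholds:

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two interest profiles, from 0.0 to 1.0."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def match_tier(a: set, b: set):
    """Map a similarity score to magnetU-style tiers.
    Thresholds here are made up for illustration."""
    score = jaccard(a, b)
    if score >= 0.75:
        return "red hot"
    if score >= 0.50:
        return "hot"
    if score >= 0.25:
        return "attractive"
    return None  # no notification

me = {"indie rock", "craft beer", "climbing"}
stranger = {"indie rock", "craft beer", "film"}
print(match_tier(me, stranger))  # 2 of 4 combined interests -> "hot"
```

Whatever the real scoring looks like, the bias this sketch encodes is the point: the device only ever notifies you of people who already resemble you — exactly the homogeneity the filter bubble produces.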
Google’s oft-quoted corporate mantra — don’t be evil — reminds us to remain vigilant, even if the company believes it does good and can do no wrong.
Google serves up countless search results to ease our never-ending thirst for knowledge, deals, news, quotes, jokes, user manuals, contacts, products and so on. This is clearly of tremendous benefit to us, to Google and to Google’s advertisers. Of course, in fulfilling our searches Google collects equally staggering amounts of information — about us. Increasingly, the company will know where we are, what we like and dislike, what we prefer, what we do, where we travel, with whom and why, who our friends are, what we read and what we buy.
Echo and Narcissus, John William Waterhouse [Public domain], via Wikimedia Commons
About 12 months ago I committed suicide — internet suicide, that is. I closed my personal Facebook account after recognizing several important issues. First, it was a colossal waste of time; time that I could and should be using more productively. Second, it became apparent that following, belonging and agreeing with others through trivial “wall” status-in-a-can postings and the now-pervasive “like” button was nothing other than a declaration of mindless group-think and a curious way to maintain social standing. So, my choice was clear: become part of a group with similar interests, like-minded activities, the same politics, parallel beliefs, and common likes and dislikes; or revert to my own weirdly independent path. I chose the latter, rejecting the road towards a homogeneity of ideas and a points-based system of instant self-esteem.
One enterprising person has taken his passion for recycling and reuse to extraordinary lengths. Matt Malone is a professional dumpster diver, making a profitable business from others’ trash. And there’s another great benefit to his growing business — keeping untold quantities of discarded goods, some of them hazardous materials, out of our landfills. Ours is a thoroughly wasteful society, and sadly our consumer culture still rewards businesses for this wastefulness.