We need to get out of our modern Plato’s Cave — algorithmic filter bubbles

Nicolas C
5 min read · May 4, 2019
Piranesi, Carceri, Plate XI © British Museum

At the dawn of the internet, the cyber-pioneers hoped it would be the starting point of an information revolution similar to Gutenberg’s invention of the printing press. Thirty years later, all we are left with is widespread pessimism over what the web has become: a vast unregulated marketplace, where you’re never sure whether you’re the customer or the product. Yet all the knowledge we dreamed could become accessible to anyone is actually here, and we don’t need to dig very far to discover rich and thought-provoking content. Here’s my idea on how to make the best of what we have.

One of the problems of the internet, identified a while ago, is that our attention has become a product, and that the whole digital environment is designed to capture our time, and with it our attention. To produce this effect, the platforms create (and, more importantly, show us) content that will arouse our interest.

Since it has become quite easy to know what we like, and thus who we are, platforms can now suggest content that we will probably appreciate. The more interesting content you find, the longer you stay on the platform, and the more you pay.

We even help them do so: when you register on Medium, you fill in your interests, such as ‘Culture’ or ‘Movies’. Since you’re reading this post, I can guess you’re interested in tech. As such, Medium isn’t the one to blame — I’d rather say we are.
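
To make the mechanism concrete, here is a deliberately naive sketch in Python of how a feed could rank content against a user’s declared interests. The articles, tags and scoring rule are invented for illustration; real platforms rely on far richer signals than a list of declared interests.

```python
# A deliberately naive sketch of interest-based recommendation.
# The articles and tags are invented; this is not any platform's actual logic.

declared_interests = {"culture", "movies", "tech"}

articles = [
    {"title": "The future of AI", "tags": {"tech", "science"}},
    {"title": "A history of Noh theatre", "tags": {"japan", "theatre"}},
    {"title": "Ten films to watch this spring", "tags": {"movies", "culture"}},
]

def score(article):
    # Score each article by how many of its tags match the declared interests:
    # the better the match, the higher it appears in the feed.
    return len(article["tags"] & declared_interests)

feed = sorted(articles, key=score, reverse=True)
for article in feed:
    print(score(article), article["title"])
```

Notice that the article about Noh theatre never reaches the top of the feed: that is the filter bubble in miniature.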

Although we should not conflate Medium’s recommendations with Twitter’s feed, for they are inherently different, the problem they pose is much the same. When reading your Twitter feed, you will only come across people you deem interesting, who are probably rather similar to you. That’s how you can end up with this:

Twitter users’ connections during the Ferguson riots. Oversimplified explanation: users in red tend to be white conservatives; in blue, black liberals. © Quartz

Above, Quartz mapped the most active Twitter users tweeting about the Ferguson riots: “In the image at the top, each point is one of the most talkative tweeters, and two points are connected if one mentions the other: in essence, the image depicts the social network of who talks to whom.” To sum up their very interesting article very quickly: the red group is mostly composed of white conservatives, the blue one of black liberals.
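
For the curious, here is a minimal sketch of how such a mention graph could be built, assuming you have already extracted (author, mentioned user) pairs from the tweets. The data and the use of networkx are my own illustration, not Quartz’s actual method.

```python
# A minimal sketch of the kind of mention graph Quartz describes: each user is
# a node, and two users are connected if one mentions the other. The pairs
# below are invented; in practice they would be extracted from the tweets.
import networkx as nx

mentions = [
    ("alice", "bob"),
    ("bob", "carol"),
    ("dave", "eve"),
    ("eve", "frank"),
]

G = nx.Graph()
G.add_edges_from(mentions)

# Clusters of users who mostly talk to each other (found here with a simple
# label propagation algorithm) would correspond to the red and blue groups
# in the visualisation above.
for community in nx.algorithms.community.label_propagation_communities(G):
    print(sorted(community))
```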

On filter bubbles and ‘technococoons’

There is nothing new here: as early as 2011, Eli Pariser coined the term “filter bubble” in his essay The Filter Bubble: What the Internet Is Hiding From You.

To be fair, I consider this personalization to be potentially a rather good thing. We can be exposed to a lot of interesting content matching our interests, and pretty much anyone can dig deeper into what they like than would ever have been possible with a good old public library.

Yet we are kept captive by our own interests. When I scroll through Medium, or Twitter, what is shown to me is what I already more or less know: literature, culture, technology. What about ancient Japanese theatre? Molecular gastronomy? Peruvian politics? There’s a whole part of the world that I will never hear about — and that’s always a shame.

The main problem with this hyper-personalization of the content accessible to us is, I reckon, that we get trapped in what French science-fiction writer Alain Damasio calls a ‘technococoon’. We are immersed in a non-hostile environment, reassuring and seemingly friendly, our own little cyber comfort zone. Following this path would leave us isolated from the Other, from others, from otherness — and Levinas reminds us how essential that is. It is because something is new, strange, difficult and uncomfortable that we are led to evolve, adapt and become different ourselves.

Let’s take an easy example: suppose I were a racist, trapped in my own Twitter filter bubble. I could spend all day barely ever seeing anti-racist posts, and only talking with people who seem to share my ideas.

Our digital environment has become surprisingly close to a modern-day Plato’s Cave (I won’t expand on that here; here’s a good video about it if you want a quick recap). I wouldn’t say the content we see is a mere illusion; rather, these are images that show only one part of a vaster reality. And as for those projecting the images we are allowed to see on the walls, it seems quite obvious to me that they are the algorithmic black boxes Frank Pasquale exposes in The Black Box Society (2015). It’s time to break the chains.

How to avoid filter bubbles

There has recently been some discussion over whether the digital environment is really to blame for this echo chamber effect. To be honest, we have always been more likely to read a book because its title referred to something that interested us, or to talk to someone in a pub because we shared their ideas. Although I do believe there is more serendipity in a library than on Twitter (a stray book or an unexpected conversation is more likely to happen than a tweet coming out of nowhere), I think, above all, that ‘the problem was the same ten years ago’ is no reason not to change today.

I have been reading a very thought-provoking essay, Obfuscation: A User’s Guide for Privacy and Protest, published by Finn Brunton and Helen Nissenbaum in 2016 (I wrote about it, in French, here). Their idea is that since we are tracked for our data, we should use obfuscation as a means of rebellion against big tech companies.

Obfuscation is a very old strategy, which consists in deliberately producing ambiguous or useless information in order to hide the one element that matters. For instance, when the Romans come looking for Spartacus, every slave claims to be Spartacus, making it very hard for the Romans to identify the one they are after.

Although the authors promote a modern use of obfuscation to resist the surveillance capitalism Shoshana Zuboff recently exposed in The Age of Surveillance Capitalism (2019), I do believe we could find a personal use for obfuscation.

Add different interests to your Medium profile, follow people you wouldn’t naturally follow on Twitter, listen to music you think you don’t like: you are going to be exposed to a great dose of difference. You will hate most of it. But even if you only discover one new topic of interest, or take the time to think about an idea that is not usually yours, I reckon you will have seized a serious opportunity to improve yourself.
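
For the technically inclined, here is a toy sketch of what this kind of personal obfuscation could look like: deliberately injecting topics you would never pick yourself into what a platform knows about you. The topic list and the follow() function are hypothetical stand-ins, not a real platform API.

```python
# A toy sketch of "personal obfuscation": deliberately feeding a platform
# topics you would never pick yourself. The topic list and follow() are
# hypothetical placeholders, not a real API.
import random

my_interests = {"literature", "culture", "tech"}

all_topics = {
    "literature", "culture", "tech", "ancient japanese theatre",
    "molecular gastronomy", "peruvian politics", "ornithology",
    "baroque opera", "cryptography", "urban beekeeping",
}

def follow(topic):
    # Placeholder for whatever "follow this topic" means on a given platform.
    print(f"Now following: {topic}")

# Pick a few topics deliberately outside your comfort zone.
unfamiliar = list(all_topics - my_interests)
for topic in random.sample(unfamiliar, k=3):
    follow(topic)
```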

If you enjoyed this article, you can support my writing by buying me a coffee!


Nicolas C

French journalist writing on literature, culture, tech and technocriticism. Personal website: https://www.curabooks.fr. Twitter: @NicolasCelnik