Thoughts · 4 min read

Internet Multiverses

Algorithms didn't divide us. They just made it easier to never leave the universe we already preferred. A case for holding your beliefs a little more loosely.

A friend and I were talking about Instagram the other day. The usual observation: the algorithm knows too much. It knows what I’m interested in before I’ve fully articulated it to myself. He said the same about his feed. And then the conversation turned to how different these feeds are for each of us. Not just different interests: different realities. Different news, different opinions, different versions of what’s happening in the world.

That’s when the thought clicked. These aren’t filter bubbles. They’re closer to parallel universes.

The multiverse is already here

Every person with a phone is living inside a version of the internet that was constructed specifically for them. Your feed, your recommendations, your search results, your suggested content: all of it is shaped by what you’ve clicked, liked, watched before. The internet you see is not the internet someone else sees. It’s a reflection of your past behaviour, served back to you in a loop.

And inside each of these loops, everything makes sense. The opinions feel reasonable. The outrage feels justified. Because mostly you’re seeing the version of reality that your past self already agreed with.

Multiply this by a few billion people and you get something strange: a planet full of humans who are technically connected to the same network, but experientially living in entirely different worlds. Worlds that don’t just differ in preference. They differ in fact.

The easy villain

The instinct is to blame the algorithm. And it’s a satisfying narrative: big tech companies built engagement machines that exploit our psychology, and profit from our division. There’s truth in that.

But I think it’s an incomplete story.

These algorithms are doing exactly what they were designed to do: show you more of what you engage with. That’s it. They’re optimizers. They don’t have an agenda beyond keeping you on the platform long enough to serve another ad. The algorithm doesn’t care if you’re watching cat videos or conspiracy theories. It only cares that you’re watching.
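That feedback loop can be sketched in a few lines of Python. This is a toy model, not any real platform's code: the catalog, the weighting scheme, and the simulated user are all illustrative assumptions. The point is only that "show more of what gets engaged with" concentrates a feed all by itself.

```python
import random
from collections import Counter

random.seed(0)  # deterministic run for illustration

def recommend(history, catalog, k=3):
    # Weight every topic by 1 + how often it was engaged with before.
    weights = [1 + history.count(topic) for topic in catalog]
    return random.choices(catalog, weights=weights, k=k)

catalog = ["politics", "cats", "sports", "news", "cooking"]
history = []
for _ in range(200):
    shown = recommend(history, catalog)
    # Simulate a user who clicks the most familiar thing on screen.
    clicked = max(shown, key=history.count)
    history.append(clicked)

counts = Counter(history)
print(counts.most_common())
```

Run it and one topic ends up dominating the feed, even though nothing in the code "wants" that outcome. The optimizer and the user's habit reinforce each other; narrowing is the equilibrium, not the goal.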

Social media platforms have no incentive to change this. The system works perfectly, just not for the goals we wish it served.

So if the algorithm is simply a mirror, and the platform has no reason to adjust the mirror, the question shifts somewhere uncomfortable: what about the person looking into it?

The harder question

We don’t like this framing. It’s much easier to be a victim of technology than a participant in our own narrowing. But the truth is, algorithms can only reinforce what we already do naturally.

We’ve always preferred newspapers that confirmed our politics, social circles that echoed our worldview. The algorithm didn’t invent this tendency. It just removed all the friction from it.

The problem with unshaken belief

I’ve come to think that absolute certainty about anything, any ideology, any religion, any political position, any worldview, is one of the most quietly dangerous things a person can carry. Not because the belief itself is necessarily wrong. But because certainty closes the door to revision. And revision is how we’ve gotten everything right that we’ve ever gotten right.

History is full of things we were collectively, confidently wrong about. The shape of the earth. The centre of the solar system. The nature of diseases. The rights of entire groups of people. In every case, the majority was certain. And in every case, certainty was the thing that delayed progress the longest.

The pattern is clear if you’re willing to see it: we don’t move from wrong to right in a single leap. We move from wrong to less wrong. From less true to more true. Slowly, painfully, and almost always against the resistance of people who were sure they already had the answer.

Absolute truth is extraordinarily rare in this world. Most of what we hold as “true” is just the best model we have so far. And that’s fine, as long as we remember the “so far” part.

What I think this means

I’m not arguing for nihilism, or for some detached “nothing is real” posture.

I’m arguing for something simpler: hold your beliefs like hypotheses, not identities. Engage with things that challenge you, not just things that validate you. When you catch yourself feeling absolutely certain about something, treat that as a signal to look harder, not a reason to stop looking.

The algorithm will keep doing what it does. The platforms will keep optimizing for engagement. That’s not going to change. What can change is how seriously you take the universe your feed has built for you, and whether you’re willing to occasionally step outside it.

The internet gave us access to all of human knowledge. It would be a waste to use it only to confirm what we already believe.

Tagged in

internet algorithms critical-thinking philosophy
