Bubbles
Or the texture of a Tuesday
I’ve been thinking a lot lately about how easily you and I confuse exposure for truth.
Put a person in Rome for a semester, and they come back with new tastes, a new gait, a new baseline for what a street is supposed to feel like. Put a person in a startup where everyone ships at breakneck velocity and they internalize speed as a moral virtue. Put someone in a structured corporate machine and they start to mistake bureaucracy for the laws of physics.
We like to pretend we arrive at our worldviews through rigorous, independent reasoning. But most of the time, we arrive at our views simply by breathing the air.
A detail from a famous 1970s study has always stuck with me: heroin addiction rates among servicemen dropped dramatically the moment they returned from Vietnam. When the environment changed, the behavior changed. It wasn’t because tens of thousands of people suddenly discovered a new reservoir of inner virtue. It was because the world around them stopped cueing the same compulsions. The battlefield had disappeared.
The takeaway: the self is vastly more permeable than we want to admit.
The things you think about determine the quality of your mind. Your soul takes on the color of your thoughts. — Marcus Aurelius, Meditations, V.16
Which brings me to the “AI bubble.” It’s a phrase that gets thrown around constantly, almost always as a critique of markets, valuations, and tech-world hype. But I think it’s actually about habitats.
Some of us are currently living inside an environment where AI is no longer a feature. It is infrastructure. It sits beside the cursor the way autocomplete sits beside spelling. It doesn’t feel like “using AI” anymore; it feels like wearing an exoskeleton. It acts as a second nervous system for thought and execution.
When you spend enough time in that specific climate, your priors begin to shift. A certain set of things just stops being impressive.
Writing a competent first draft stops being a hurdle. Turning a vague, sprawling thought into an outline becomes cheap. Staring at a blank page stops feeling like an act of creative courage; it starts feeling like an inefficiency.
But as those things fade, a different set of skills suddenly becomes important: Choosing the precise constraint. Architecting the right context. Asking the exact right question to collapse untold hours of otherwise fruitless iteration into ten minutes of curation. Knowing what not to do when absolutely everything digital is possible.
Living in this environment produces a very specific kind of confidence. It isn’t arrogance per se but something subtler. It’s the realization that the world is incredibly malleable and that a massive percentage of what we used to call “work” was closer to a set of inescapable frictions tied to a title.
Now, place that person in a conversation with someone who only occasionally interacts with these tools.
Someone for whom AI is a neat but occasional convenience. Maybe they use it to summarize a dense PDF, clean up an email, or generate a placeholder graphic. Their underlying environment hasn’t changed. The physics of their workday feel exactly the same as they did three years ago.
And then, put both of them in a room with someone who doesn’t use it at all.
Someone who experiences “AI” entirely through secondary sources: the breathless headlines, doom-laden op-eds, and the distant, annoying rumble of other people’s excitement. For them, AI isn’t a lived reality. It’s just a strange weather pattern on the horizon.
When these three people argue, it may sound like they are having a disagreement about software. They aren’t. They are arguing about what it feels like to be inside their day.
The exoskeleton-user says: “Everything is accelerating.”
The light-user says: “Not really, it’s just a helpful tool.”
The non-user says: “This is entirely overhyped,” or “This is terrifying,” depending on which fumes they’ve recently inhaled.
None of them are right or wrong. They are just accurately describing the physics of their local bubble.
And bubbles do something psychologically violent to us: they make our highly specific experience feel universal.
They make our local preferences feel like global principles.
They make our temporary constraints feel like permanent reality.
This is exactly why the current discourse around AI is so uselessly overheated. People keep screaming the same question across different atmospheres: Is this real?
In one environment, it’s undeniably real. In another, it’s obviously not. In a third, it’s just a ghost story.
The interesting question, for me, is this: what happens to an industry, or better yet a society, when we stop sharing the same defaults?
What happens when some people can treat thought as an externalized, iterative process (reframing and drafting on demand) while others are still paying the full, heavy internal cost for every single first sentence? What happens when one group lives inside a high-utility agentic loop with machines, and another lives inside a high-suspicion loop about them?
The political version of this is already painfully obvious to us: different media diets produce entirely different realities. But the work version is perhaps more insidious. Different tools produce different baselines for what effort even is. And effort isn’t just a resource; it’s a moral language.
We already know how this story goes when environments diverge this sharply. People stop empathizing because they literally cannot simulate the other person’s Tuesday. They stop arguing about solutions and start arguing about the nature of reality itself. They don’t just separate. They look across the void and assume the other side is hallucinating.
So maybe the right posture for this moment isn’t evangelism, and it isn’t skepticism. Maybe it’s environmental humility.
It’s the discipline to pause before forming a hot take and ask: What air am I breathing right now? What air are they breathing? And what would I find entirely obvious if I had lived their Tuesday for the last six months?
The most dangerous thing about bubbles is not that they are wrong. It’s that they feel exactly like the truth.


