Things Aren’t Changing That Fast
We are in a strange position where the present feels normal while the people building the future are warning us it won't be.
Every day there’s a new fundraise. Every day there’s a new model announcement. Every day someone on Twitter declares that software engineering is dead, or that we’ve achieved AGI, or that the entire economy will be restructured by Q3.
And yet.
I want to make a simple observation that feels almost transgressive to say out loud in 2026: for most people, not that much has actually changed.
The Selection Bias Problem
Here’s a question I keep asking myself: who is actually talking about AI?
Engineers at AI companies. VCs who fund AI companies. Tech journalists who cover AI companies. Researchers who study AI. In other words, the people whose daily experience of AI is genuinely transformative, because they’re the ones building and using it at the frontier.
This creates a strange epistemic bubble. The discourse is dominated by people for whom AI has changed everything. They’re not lying. For them, it’s true. But they’re maybe 0.1% of the workforce, talking to each other, pattern-matching everyone else’s experience onto their own.
There’s also a financial incentive to hype discontinuity. “This changes everything” raises money. “This is a useful incremental improvement” does not. I have a more cynical version of this argument that I may share in future posts: that the best way to raise money for something with relatively little near-term ROI is to position it as a war against another country. I’m not sure I believe this, but it isn’t crazy to write down.
Ask a dentist how AI has changed their job. Ask a plumber. Ask a high school teacher. Ask anyone, really, who doesn’t work in tech. The answer is usually: it hasn’t, much.
The Matt Levine Theory of Technological Change
Matt Levine has this recurring bit where he points out that most financial “innovations” are really just new ways of doing things that people have always done. Crypto is banking. SPACs are IPOs. NFTs were trading cards.
Something similar is happening with AI discourse. Every new capability gets framed as a discontinuity, a break from everything that came before. But then you look at how people actually use the technology, and it’s mostly: better autocomplete. Faster first drafts. A research assistant that doesn’t get tired.
These are genuine improvements! I use AI tools constantly. They make me faster at things I was already doing. But “faster at things I was already doing” is not the same as “everything is different now.”
A Question for the Room
Here’s a simple test. How has your actual daily workflow changed because of AI in the past twelve months?
If you’re a software engineer, the answer might be substantial. More on that in a moment.
But if you’re a lawyer, or a doctor, or a teacher, or a marketing manager, or a small business owner: has your job fundamentally changed since 2022? Are you doing things that were literally impossible before?
For most people, I suspect the honest answer is: not really.
The Engineering Exception
Software engineering is a genuine exception, and it’s worth being precise about why.
Code is the perfect medium for current AI capabilities. It’s text-based, heavily structured, has clear success criteria, and exists in massive training datasets. Engineers were always going to be the first group where AI tools became genuinely transformative.
Andrej Karpathy put it well in a recent thread: “This is easily the biggest change to my basic coding workflow in ~2 decades of programming and it happened over the course of a few weeks.” He went from 80% manual coding to 80% agent-assisted coding in about a month.
But here’s the kicker. Karpathy himself adds: “I’d expect something similar to be happening to well into double digit percent of engineers out there, while the awareness of it in the general population feels well into low single digit percent.”
That’s the gap I’m pointing to. A genuine transformation is happening for maybe 10-20% of engineers. But software engineers are perhaps 1% of the workforce. The discourse acts like everyone’s world has been upended. It hasn’t.
The Warnings From Inside
What’s notable is that, perhaps for the first time in history, the warnings about a new technology are coming from the CEOs of the companies actually producing it. Elon, Sam, and Dario are all warning about the implications of what they’re building. I don’t think that happened during the internet bubble. The people building the thing are the ones saying “be careful.” That’s new, and I’m not sure what to make of it.
Dario Amodei just published a 20,000-word essay called “The Adolescence of Technology.” He describes a vision of “powerful AI” as essentially “a country of geniuses in a datacenter” that could arrive in as little as one to two years.
This is not someone prone to hyperbole. Amodei is a physicist by training who has spent over a decade building these systems. He writes that watching progress from inside Anthropic, he can “feel the pace of progress, and the clock ticking down.”
So how do I reconcile this with my observation that things aren’t changing that fast?
Two Truths at Once
I think two things can be true simultaneously:
One: The underlying technology is advancing at a remarkable pace. The people building these systems are seeing things that genuinely concern (and excite) them.
Two: The diffusion of this technology into the broader economy is proceeding at a much more normal pace. Most people’s jobs haven’t changed. Most workflows haven’t been disrupted. The world feels pretty similar to how it felt a year ago for most people.
Karpathy captures this nicely: “The intelligence part suddenly feels quite a bit ahead of the rest of it...”
The intelligence part has gotten quite good. Everything else (integrations, workflows, organizational adaptation, broader diffusion) lags behind, because that’s how technology adoption has always worked. There’s nothing special about AI here. It’s subject to the same frictions as every other technology.
So:
Even if I’m right about the pace of change for most people right now, there is something deeply important happening.
We’re in a strange phase where the people closest to the machine are nervous, and everyone else is mostly indifferent.
That gap will close.
The only question is whether it closes gradually, or all at once.