AI Makes Your Best People Slower — And They Do Not Notice
Experienced developers using AI tools are measurably slower — and still convinced they are faster. A note on the skills that erode before anyone notices, and what that means for how teams are led.
I keep catching myself in the same loop. I ask AI to draft something — a brief, a structure, an analysis. The output looks decent. I start editing. Twenty minutes of prompting, fifteen minutes of tweaking. Then the thought creeps in: I could have just written this myself. Probably faster. Definitely more mine.
And yet, the next time, I do it again. Because it feels productive.
Turns out that feeling is well-documented. A controlled study by METR found that experienced developers took 19% longer on tasks where AI tools were allowed than on tasks where they were not. The kicker: even after experiencing the slowdown, they still believed AI had sped them up by 20%. A 39-percentage-point gap between perception and reality.
That gap was on my mind when I attended a lecture by Mark Coeckelbergh — philosopher of technology at the University of Vienna, former member of the EU's High-Level Expert Group on AI, and author of Why AI Undermines Democracy and What To Do About It — on the ethics of large language models. One image stuck: the human shifting from driver to passenger. We built the car. We got in. And somehow, we ended up in the back seat.
He was talking about society. But I could not stop thinking about my own work — and the people I work with. If even the people who design and build these systems do not notice how their judgment shifts, what chance does anyone else have?
The back seat of your own process
You have seen it too, if you lead a product team. Someone pastes an AI-generated analysis into a strategy doc. The room nods. Nobody asks how the output was constructed. Nobody can. The black box is not just in the model — it is in the boardroom now.
This is not a failure of intelligence. It is a failure of friction. AI removes the resistance that used to force us to think. And thinking, it turns out, needs resistance the way muscles need gravity.
Skills you do not use disappear
Coeckelbergh made a sharp comparison: a pilot who stops flying loses their license. Not as punishment — because the skill atrophies. You cannot land a plane on theory alone.
The same is happening in product teams right now. Writing, analysis, critical judgment — these are not just tasks. They are capabilities that require practice. When AI drafts your product brief, rewrites your research synthesis, and suggests your next experiment, what exactly are you still exercising?
The research is starting to catch up with this intuition. A 2025 MIT Media Lab study measured EEG activity during essay writing and found that participants using ChatGPT showed a 47% drop in brain connectivity compared to those writing unaided. It is a preprint with a small sample, but it points in the same direction as other work in the field. A more recent study in Technology, Mind, and Behavior followed nearly two thousand working adults using AI on workplace tasks. Those who leaned most heavily on AI, accepting its outputs with little or no modification, reported lower confidence in their own reasoning and a weaker sense of ownership over their ideas.
The dangerous part: it feels like productivity. It looks like efficiency. But underneath, your team is slowly becoming unable to catch the machine when it is wrong.
The manager who knows best is dead
In education, Coeckelbergh argues, the all-knowing lecturer is finished. The person at the front who transmits knowledge — that model does not survive AI.
The same applies to leadership. The CTO who has all the answers, the CPO who dictates the roadmap from intuition alone — that archetype is already obsolete. But the replacement is not "let AI decide." It is something harder: becoming the person who asks better questions. Who creates the conditions for critical thinking instead of delivering conclusions.
That is a design problem. And most organisations have not started solving it.
The muscle does not announce its absence. You notice only when you reach for something you used to do without thinking, and find that the reach itself has become unfamiliar. By then the work is already done — somewhere else, in a way you can no longer fully interrogate.
That is what atrophy looks like from the inside. Not failure. Not a missed deadline. Just a quiet distance growing between you and the work you still call your own.
I am interested in how this is playing out inside teams that are genuinely leaning on AI for the work that used to require thinking. If you are in one, and something here landed, find me on LinkedIn. I read everything.