Building in Public

What the AI Doesn't Understand Yet

A developer friend is still employed, still coding — but wondering whether it means what it did two years ago. On context, human judgment, and the thing AI can’t replace yet.

3 min
Mar 10, 2026
[Illustration: a hand-carved, minimalist spiral root network on off-white textured paper, symbolising the accumulated human judgment and lived context that AI cannot replicate.]

A developer friend reached out recently. Colleagues had been let go at his company. Not him. He was still there, still coding, still considered valuable. But something had shifted. He was doing the same work he’d done for years, and it didn’t feel the same. He asked, almost in passing: does coding still mean what it meant two years ago?

I didn’t have a reassuring answer.

What I had was a question back: what do you understand about the problem that the AI doesn’t?

The context that can’t be generated

I’ve spent a lot of time thinking about what AI is actually replacing in professional work, and what it isn’t.

The code, yes. Give a capable model a clear specification and it will produce working code faster than most developers. That’s not a projection. That’s just what’s happening now.

But the code was never the hard part. The hard part is knowing what to build. And more specifically: knowing which details matter for this particular client, in this particular industry, with this particular team dynamic. The AI doesn’t attend the meetings. It doesn’t pick up on the CFO’s unspoken hesitation. It doesn’t know that the last three projects at this company failed because of organizational friction, not technical limitations.

That accumulated, lived, difficult-to-articulate context is what I’d call the thing worth protecting.

What my own journey taught me

I came up through a world of functional specifications. Dense documents, logical conditions, flowcharts. The problem wasn’t that nobody read them. The problem was that text-heavy documentation could never convey the actual experience of using software. The user was invisible in the spec.

That realization sent me into UX. Then no-code. Then AI-assisted development. Each transition felt like a phase change, not just in tools, but in what I was actually responsible for.

The further I got from writing specifications, the more I understood that the value I was adding had never really been in the specifications themselves. It was in the questions I asked before I wrote them. It was in knowing which stakeholder’s concern to probe further. It was in recognizing when a technically correct solution would fail socially.

None of that ever made it into any document I produced. It lived in the conversations around the document.

The question underneath the question

When my developer friend asked whether coding still means what it did, he was really asking something else: am I still valuable?

The honest answer: it depends on what you think your value was.

If the value was in typing correct syntax, then yes, that’s being compressed. Fast. But if the value was in understanding a complex system well enough to ask the right questions, in holding the context of a problem across weeks of iteration, in knowing when something technically works but doesn’t actually solve the problem, then that value isn’t being replaced. It’s being amplified.

The developers who are struggling aren’t the ones who lack skill. They’re the ones who haven’t yet recognized that their skill was always more than code.

The thing worth protecting

I don’t know what this field looks like in two years. Anyone who tells you they do is speculating.

But I have a working hypothesis: the professionals who will matter are the ones who can hold context. Not just technical context. Human context. Organizational context. The kind built from years of watching how things actually go wrong, and why.

AI is extraordinarily good at pattern matching within defined parameters. What it doesn’t have, what it can’t have yet, is judgment built from care. The kind of judgment that asks: we could build this, but should we? Or: the spec says one thing, but the user actually needs something else.

That’s the thing worth protecting.

Not the syntax. The understanding.