Some Knowledge Only Comes From Making the Thing

I saw someone recently arguing that friction in software development was actually useful, that slowing down helped produce better outcomes. The framing is easy to dismiss as cope, the kind of thing you'd say if you were losing the automation race and wanted to feel better about it. But there's something underneath it that I've been thinking about.

To build good products, you have to build good products. That sounds like a tautology, but I mean something specific by it: for most people, it's only through the actual process of building that you discover what the thing should be. Iterating, exploring, failing, revising — working through the subdecisions and edge cases, hitting the moment when you realize users do something you never anticipated and the whole model you had in your head was slightly wrong. You can't plan your way to that. You can only build your way to it.

There's a Dunning-Kruger angle here.[1] You can't write a good specification for something you haven't built yet, because you don't yet know what needs specifying. The gaps in your understanding only become visible when you try to make the thing work. Building is what reveals what you didn't know you didn't know.

The prompt-to-product vision, one-shotting increasingly complex software from a description, assumes you can get the specification right upfront. But the quality of what you get is capped by the quality of your specification, and you can't specify what you don't yet know you need.


I think building can be an act of love.

I come at this from a background in physics and engineering, which probably shapes how I think about systems in general. A system isn't a static architecture diagram. It's something that behaves — it has dynamics, flows, feedback loops, state that evolves over time. Understanding a system means understanding how it moves: how perturbations propagate through it, where the stable and unstable equilibria are, what happens at the boundaries, how it responds to stress. You don't really know a system until you know its behavior, and you don't know its behavior until you've watched it move.

When you've spent enough time with a thing, you start to hold it in your mind this way. You can see how changes will ripple through, which parts are tightly coupled, where the load-bearing assumptions are. The system becomes something you can almost watch run. You develop a sense for its dynamics that goes beyond what you could write down in documentation.

And it goes both ways. The system takes up residence in you. You carry it around when you're not working. It surfaces in the shower, while you're falling asleep, on a walk. The problems you haven't solved yet keep running in the background. You dream about it sometimes. I don't think this is a sign of obsession or poor work-life balance — it's what happens when you're engaged with something complicated enough to be interesting. The system needs that much space to fit.

This is where the intuition comes from. You start to know where something is wrong before you can articulate why. The wrongness has a texture, a feel — like noticing a vibration that shouldn't be there, or a response that's slightly too slow, or a coupling that's going to cause problems three changes from now. You notice because you've spent enough time with the thing that you've developed the capacity to notice. Maybe that's what care actually is: enough sustained attention that you start to see the details.


I'm not making an argument against AI tools here, or against speed, or for some kind of artisanal hand-crafted software mystique. I use AI tools constantly. They're genuinely useful. If I don't have to hand-write boilerplate, I can spend that time on the parts I actually care about. That's a real gain.

What I'm trying to point at is the difference between producing a thing and understanding it. You can produce correct code without understanding the system it belongs to. You can generate solutions to well-specified problems. But understanding — the intuition that comes from having built, broken, fixed, revised, watched someone use it wrong, gone back and thought harder about what "right" even means — that's different. The system has to take up space in your head. You have to carry it around for a while. You have to watch it move.

There's a version of AI-assisted development where the tools handle the mechanical parts and free you to focus on the parts that require understanding. And there's another version where the tools handle everything, and you skip directly to "working software" without ever developing the understanding. You get the output but not the residue of having made it. And the residue is where the good stuff lives: the knowledge of why this decision and not that one, what will break if you change this, what the user actually needs versus what they said they needed.

I don't have a clean resolution to this. I'm not arguing we should slow down for the sake of slowing down, or that friction is inherently good. But the process of building — really building, where you care enough to develop a feel for the dynamics of the whole system — teaches you something you can't get any other way.

Maybe the right frame isn't "friction is good" but "some knowledge only comes from making."


When I'm deep in a system I care about, there's a feeling of it moving. All the pieces at once, the flows and feedbacks, in some way I can almost see. It's not a mystical claim — it's just what happens when you've internalized something well enough that it becomes part of how you think. You know how changes will propagate because you've watched them propagate. You can sense the wrongness before you can articulate it because you've developed a feel for what rightness looks like in this particular system.

I don't think you get that from one-shotting a specification. You might get correct software. You might get useful software. But you don't get the thing where the system lives in your mind and you can see it moving.

That might be fine for most software. Not everything needs to be an act of love. Most software does fine being mediocre: doing its job, serving as infrastructure nobody thinks about too hard.

But if you want to make something good — really good, the kind of thing that feels right to use, that handles the edge cases gracefully, that doesn't fall over in unexpected ways — I think you have to go through the building. You have to care enough to let the system take up space in your head, to develop a feel for its dynamics, to watch it move until you know it.

I don't know how to specify your way to that.


  1. The Dunning-Kruger effect as popularly understood is largely a statistical artifact — the pattern emerges from regression to the mean and the better-than-average effect, not from any special metacognitive deficit in low performers. See Gignac & Zajenkowski (2020) for a good breakdown. ↩︎