From Craft to Compliance.
The future of software engineering?
What happens when software is produced faster than humans can understand it?
We’re about to find out. AI is industrializing code production; the marginal cost of new features is heading toward zero. However, the scarcest resource hasn’t changed. Human understanding is still slow, sequential, and context-heavy. You can parallelize code generation, but you can’t parallelize coherence.
Our entire engineering system (agile, code review, team norms, etc.) was designed for humans writing code at human speed. It assumes that feedback loops move slowly enough for people to notice problems. It assumes that someone on the team actually understood each change.
At industrial throughput, those assumptions break. Instead, we get:
More parallel change than shared context can support
Larger diff surfaces and more accidental complexity
Artifacts that are “plausible” without being right
Teams relying on ritual signals (“tests passed”, “ship it”) instead of comprehension
The word that saves us is one that makes most engineers recoil in horror: compliance. Not the bureaucratic kind (change review boards, committees for everything), but the other kind: mechanized trust. Systems that produce evidence that the code is behaving within agreed constraints, because we can no longer rely on humans noticing everything.
This is a shift from craft to compliance. It doesn’t kill the soul of engineering; it just moves it.
Why our systems were shaped for humans
To understand what’s breaking, you need to see what we built and why.
Most of what we call “modern software development” is really a set of practices designed around human frailties: limited attention, partial context, miscommunication, shifting incentives, fatigue, and a million and one cognitive biases. We built workflows that assume people forget things, misunderstand each other, and change their minds, and then wrapped those workflows in feedback loops. It mostly works.
Agile fits perfectly into this world. It’s a set of compensating mechanisms for human frailty:
Small batches because humans struggle with big-bang integration
Frequent check-ins because humans diverge silently
Working software because humans lie to themselves about progress and about what they want
Retros because humans repeat failure modes unless they name them
We designed systems assuming that writing software costs time and attention. Teams could hold enough of the system in their heads to make good decisions. Mistakes got caught because feedback loops moved at human speed: build → test → deploy → observe.
This was the human-shaped era. The shape of our engineering system matched the shape of human work.
Craft has friction. Friction has value.
Humans craft software in a way that’s inherently self-limiting.
We can only hold so much in working memory, so we naturally constrain what we create. We leave traces of intent. We feel the weight of a change because we paid for it in effort.
That friction prevents runaway complexity. It’s a feature, not a bug.
AI manufactures software. It’s fast, abundant, and often convincing. It can generate a feature-shaped “thing” with little effort, which means we create more of it. We try more options, we up the ambition, we accumulate more moving parts.
When production is frictionless, complexity becomes frictionless too.
Compliance as mechanized trust
When people hear “compliance” they picture bureaucracy¹: ticket templates, committees, box-ticking. That’s the worst version (and unfortunately the most prevalent).
Think of compliance differently: mechanized trust, evidence that the system is behaving within agreed constraints.
In a human-shaped world, we had plenty of implicit compliance. It came from shared context and judgment. You trusted the team because the team could plausibly understand the change.
At industrial levels of change, we have to make the implicit explicit. Trust shifts from “someone on the team understood it” to the system demonstrating it.
That means policy, verification, and runtime control. (More words that sound bad, I know.)
But here’s the reframe: if change is cheap, assurance can’t be manual. Compliance isn’t about slowing down. It’s about keeping speed without losing coherence.
What this looks like in practice
Policy becomes executable
In the human-shaped era, many important rules lived as norms. You don’t log PII. You don’t roll your own crypto. You don’t couple this module to that module. Norms work when the team can hold the context.
At industrial throughput, norms need backup. They have to be reified as code:
Secrets scanning
Dependency allow/deny lists (including licenses and registries)
Vulnerability thresholds and exception handling
Infrastructure policies (encryption required, no public buckets, least privilege)
Architectural rules where you can encode them (layering constraints, for example)
If it’s important enough to care about, it must be automated.
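As a sketch of what “reified as code” can mean, here is a minimal dependency allow/deny check of the kind that could run in CI. The package names, the license list, and the policy itself are illustrative assumptions, not a real standard:

```python
# Hypothetical dependency policy check, suitable for running as a CI step.
# The allowlist, denylist, and manifest format are all illustrative.

ALLOWED_LICENSES = {"MIT", "Apache-2.0", "BSD-3-Clause"}
DENIED_PACKAGES = {"leftpad-clone"}  # hypothetical denylist entry


def check_dependency(name: str, license: str) -> list[str]:
    """Return a list of policy violations for one dependency."""
    violations = []
    if name in DENIED_PACKAGES:
        violations.append(f"{name}: package is on the denylist")
    if license not in ALLOWED_LICENSES:
        violations.append(f"{name}: license {license!r} is not allowed")
    return violations


def check_manifest(deps: dict[str, str]) -> list[str]:
    """Check a {package: license} manifest; an empty result means compliant."""
    problems = []
    for name, license in deps.items():
        problems.extend(check_dependency(name, license))
    return problems


if __name__ == "__main__":
    manifest = {"requests": "Apache-2.0", "leftpad-clone": "GPL-3.0"}
    for problem in check_manifest(manifest):
        print(problem)
```

The point isn’t this particular script; it’s that the rule lives somewhere a machine evaluates it on every change, rather than in a norm someone has to remember.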
Risk-based paths replace one-size-fits-all
The fastest way to make compliance awful is to apply maximum friction to everything. Good compliance is tiered.
A trivial change should be trivial. Changing the auth system or the data you’re storing should require more scrutiny. The bigger the risk, the more proof required.
Good architecture (such as separation of concerns) makes this possible.
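One way to make tiers concrete is to classify a change by the paths it touches, and let the riskiest file set the tier. The patterns and tier names below are assumptions for illustration; real rules would reflect your own architecture:

```python
# Sketch: assign a risk tier to a change based on the file paths it touches.
# Tier names and path patterns are hypothetical examples.
from fnmatch import fnmatch

# Ordered highest risk first; the first matching pattern wins per file.
TIER_RULES = [
    ("high",   ["src/auth/*", "src/billing/*", "migrations/*"]),
    ("normal", ["src/*"]),
    ("low",    ["docs/*", "*.md", "ui/styles/*"]),
]


def tier_for_path(path: str) -> str:
    for tier, patterns in TIER_RULES:
        if any(fnmatch(path, pattern) for pattern in patterns):
            return tier
    return "normal"  # unknown paths get the default tier, not the lowest


def tier_for_change(paths: list[str]) -> str:
    """A change's tier is the riskiest tier of any file it touches."""
    order = {"high": 0, "normal": 1, "low": 2}
    return min((tier_for_path(p) for p in paths),
               key=order.__getitem__, default="low")
```

A docs-only change classifies as low risk and flows fast; touch anything under a hypothetical `src/auth/` and the whole change is high risk, no matter what else it contains.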
Verification shifts earlier and later
Human-shaped systems often put too much weight in a single moment: “before merge.” In a compliance-shaped system, assurance spreads across the lifecycle:
Better test suites (including contract tests where relevant)
Static analysis tuned to your codebase
Dependency scanning integrated into the pipeline
Pre-deploy checks (configuration, policy, integrity)
Progressive delivery (staging, canaries, gradual rollout)
Fast rollback and kill switches
Observability that detects problems quickly
At industrial scales, you can’t foresee everything at the moment of change. You have to build a system that stays safe while changing.
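A small example of assurance living later in the lifecycle: a canary gate that compares the canary’s error rate against the baseline and decides whether to promote, roll back, or wait for more traffic. The thresholds here are illustrative assumptions, not recommended values:

```python
# Sketch of a canary gate: promote, roll back, or wait based on observed
# error rates. All thresholds below are illustrative assumptions.

def canary_decision(baseline_errors: int, baseline_requests: int,
                    canary_errors: int, canary_requests: int,
                    max_relative_increase: float = 1.5,
                    min_requests: int = 500) -> str:
    """Return 'promote', 'rollback', or 'wait' (not enough traffic yet)."""
    if canary_requests < min_requests:
        return "wait"  # don't judge on too little evidence
    baseline_rate = baseline_errors / max(baseline_requests, 1)
    canary_rate = canary_errors / max(canary_requests, 1)
    # Roll back if the canary's error rate is well above the baseline's
    # (with a small absolute floor so tiny rates don't trigger rollbacks).
    if canary_rate > baseline_rate * max_relative_increase and canary_rate > 0.001:
        return "rollback"
    return "promote"
```

This is the “stays safe while changing” idea in miniature: the decision is made from evidence gathered after the change shipped, not from a prediction made before it.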
“Green” stops meaning “safe”
We often pretend that if the tests pass, the application works. That’s a great aspiration, but it doesn’t scale.
At industrial throughput, green often means “nothing obvious broke.” It doesn’t mean “this is coherent with the system” or “this won’t create a long tail of complexity.”
Compliance doesn’t magically solve this. But it reframes the question:
What evidence do we have that this change is safe enough for the risk we’re taking?
Humans move up the stack
In the craft era, senior engineers spend a lot of time inspecting artifacts. They read code, review pull requests and build mental models of changes.
In the compliance era, that time is better spent designing the system that produces and constrains artifacts.
Humans shift toward:
Architecture and boundaries (where mistakes become expensive)
Creating tests that encode critical behaviours
Shaping policies that reflect real risk, not imagined risk
Building observability that makes production legible
Deciding what’s allowed to move fast
Humans move from crafting change to crafting the conditions under which change can be trusted.
Making the shift
If you want this without bureaucracy, build it deliberately, with automation:
Define risk tiers. Pick a handful that match your system: docs, UI-only, normal logic, auth/billing, data/migrations.
Attach evidence requirements. Low risk flows fast. High risk requires more proof: specific tests, rollout constraints, signoffs, stricter policy.
Convert repeated lessons into guardrails. Every “we always comment this in PRs” is a policy candidate. Turn anecdotes into constraints.
Steer humans toward intent. Teach reviewers to ask: What’s the impact? What’s the rollback plan? What complexity does this introduce? Does this cohere with the architecture?
Make production safer. Progressive delivery, feature flags with discipline, excellent observability, rapid rollback. Don’t rely on a single checkpoint.
Keep shipping fast. Stop betting on humans noticing everything.
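The “feature flags with discipline” step above can be sketched as a flag store with a kill switch, so risky changes ship dark and can be disabled at runtime without a redeploy. The class and flag names are hypothetical:

```python
# Sketch: a feature flag with a kill switch. In production this state would
# live in a shared store, not in-process; names here are illustrative.

class FeatureFlags:
    def __init__(self) -> None:
        self._flags: dict[str, bool] = {}
        self._killed: set[str] = set()

    def enable(self, name: str) -> None:
        self._flags[name] = True

    def kill(self, name: str) -> None:
        """Kill switch: overrides any enablement until explicitly revived."""
        self._killed.add(name)

    def is_enabled(self, name: str) -> bool:
        if name in self._killed:
            return False  # the kill switch always wins
        return self._flags.get(name, False)


flags = FeatureFlags()
flags.enable("new_checkout")   # ship the change dark, then enable it
flags.kill("new_checkout")     # incident: turn it off without a deploy
```

The discipline part is everything around the class: flags get owners, expiry dates, and removal tickets, or they become their own source of accidental complexity.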
Craft isn’t dying. It’s relocating.
The fear behind “compliance” is that it kills the soul of engineering.
It doesn’t have to. Craft doesn’t disappear; it moves up the stack. In the industrial era, the craft is shaping:
Architectures that resist failure
Guardrails that prevent predictable mistakes
Pipelines that produce evidence
Systems where change remains safe at scale
That’s still craft. It’s just applied to the system that produces code, not only the code itself.
¹ The Delicate Art of Bureaucracy (https://itrevolution.com/product/the-delicate-art-of-bureaucracy/) is a good book on understanding that bureaucracy doesn’t have to be bad.


