I use AI every day. I also built an entire infrastructure to make sure it doesn’t erase the person using it. Those two facts aren’t contradictory. They’re the same practice.

AI output drifts toward generic. Left unconstrained, the model produces clean, competent copy that belongs to nobody. Same cadence, same transitions, same way of building to a point. Ten AI-assisted “About” pages sound identical. The person disappears. What’s left is the tool’s default register. I call that slop: output that passes for quality without containing any.


The anti-slop thesis is structural. The problem isn’t that AI writes badly. The problem is that AI writes well enough to fool you into shipping work that sounds professional and says nothing specific. The fix isn’t better prompts. The fix is governance: infrastructure that catches the failure modes before they ship.

My infrastructure has three layers. A voice protocol derived from how I actually talk in unguarded conversations (not published writing, which is already a performance). A twelve-item verification checklist that catches specific AI writing patterns: em dashes, negation-affirmation (“Not X. Y.”), fortune-cookie closers, banned words, sentences that could appear on anyone’s site. A no-hallucination policy that prevents any skill from inventing details about the work. Every concrete claim traces to a verified source in a corpus of sixty thousand documents of ideation history. If the claim can’t be traced, it doesn’t ship.
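To make the pattern-catching layer concrete, here is a minimal sketch of what such a lint pass could look like. This is not the actual checklist: the banned-word list, the matching rules, and the function name are illustrative assumptions, and a real implementation would cover all twelve items.

```python
import re

# Illustrative stand-in for a real banned-word list.
BANNED_WORDS = {"delve", "tapestry", "leverage", "robust"}

def find_slop(text: str) -> list[str]:
    """Return human-readable flags for a few common AI writing patterns."""
    flags = []
    # Em dashes (written as an escape so the linter itself stays clean).
    if "\u2014" in text:
        flags.append("em dash")
    # Negation-affirmation: a short "Not X." sentence followed by a short "Y."
    if re.search(r"\bNot [^.?!]{1,40}\.\s+[A-Z][^.?!]{1,40}\.", text):
        flags.append("negation-affirmation")
    # Banned-word hits, normalized to lowercase with edge punctuation stripped.
    words = {w.lower().strip(".,;:\"'") for w in text.split()}
    hits = words & BANNED_WORDS
    if hits:
        flags.append("banned words: " + ", ".join(sorted(hits)))
    return flags
```

Running it on a sentence like `"We leverage a robust tapestry \u2014 of ideas. Not hype. Craft."` trips all three flags, which is the point: the patterns are mechanical enough that a machine can catch them before a reader does.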

A blind evaluator read this site and concluded it was “unequivocally” human-written. Then I told them the truth: every page was compiled from three years of my conversations by the system described on the site. The voice came through because the source material was mine and the governance prevented the tool from replacing it with its default register.


The punk angle matters: building tools to stay human inside systems that don’t care whether you do. The AI platforms don’t build anti-slop infrastructure, because slop serves engagement: more output, faster, at scale. The quality floor is “good enough to not get caught.” That’s the standard slop operates at.

I built the opposite. Voice protocols that enforce my actual register. Evaluation lenses extracted from real design practitioners that test whether the work carries identity or just competence. A compilation pipeline that mines my own thinking instead of generating new content. The tool holds what my working memory drops: food science, timing tables, architectural dependencies, the order of operations for a complex evaluation. My working memory holds what the tool can’t: which kid had a hard day, whether this page sounds like me, whether the opening creates grip for a stranger.


The distinction between AI as generator and AI as compiler is the core of the thesis. Generation starts from a prompt and produces something new. The AI originates the material. Compilation starts from existing source material and transforms it. The source is mine. The system mines it, evaluates it, and assembles it under rules I wrote. It invents nothing.

This is pro-human, not anti-AI. The tools are essential. The governance exists because the tools are essential. Without governance, the convenience of AI-assisted production erases the specific person who should be visible in every sentence. With governance, the tools amplify what was already there instead of replacing it with the average.

The infrastructure took longer to build than writing the site by hand would have. That’s the cost of fidelity. The result is a site where every sentence traces back to something I actually said or decided, compiled by a system designed to preserve the voice instead of overwriting it.