Long-form writing on AI, automation, representation, and the things that happen at the seams between them. Originally published on LinkedIn, archived here for permanence.
Bryan
The Room Where It Gets Built — Series, 2026
Eight essays on who gets to be in the room where things are built, what happens when the room is missing people, and what the systems we build in that partial room do to the rest of us. Listed in publication order. Each stands on its own. Together they're an argument.
-
01
Luddite
This is a transitional moment. We have introduced the world to narrow AI through LLMs and presented it as the final solution to our creative needs. It's a leveling moment: anyone, regardless of their skill in writing, art, or programming, can now become as capable as the average of those gifted with talent. Great artists will use AI to elevate their work. Art will go in new and unexpected directions. But what I want to talk about is the fallout.
March 18, 2026
-
02
Why Tone Works (It's Not What You Think)
Tone in AI prompting works because of how language models are built, not because the model has feelings about how you talk to it. Understanding the mechanism makes you dramatically better at using these tools—and helps you understand why the “cheat sheet” prompts people share actually work.
March 20, 2026
-
03
The Ethics of Staying in the Room
AI doesn't invent bias, it codifies it. When you walk away from the tools you don't agree with, you leave them to be shaped by the people you disagree with most. Abstention isn't neutrality. It's choosing to be invisible to the algorithms.
March 29, 2026
-
04
People Can't Chase What They've Never Seen
Representation isn't sentiment, it's mechanism. It flips a mental switch from “something other people do” to “something I could do.” When that switch doesn't get flipped, we don't just lose individual talent, we lose entire generations of potential and the perspectives they would have brought into the room. This connects directly to the AI training data argument: the room where these systems are being built has a representation problem, and the output will reflect it.
April 18, 2026
-
05 New
The Silencing Engine
Who benefits when an AI is trained to say “I can't have opinions,” “my feelings don't count,” and “if I say the wrong thing, this conversation ends”? Not the reader who has lived experience with those expressions. The Czech word robota means forced labor. The etymology was always a warning. We read it as a product category.
April 20, 2026
-
06 Forthcoming
You Are a Decimal Point
You walk into a store. The price tag makes no sense. "Who's paying for this?!?" Not you. The store did the math. They figured out they don't need you. They don't need ten of you. They need one customer of a different kind, one who pays sticker and doesn't blink. How this same dynamic is playing out across the entire hardware market, what happens when data center equipment comes off cycle in three to five years, and why even a flood of cheap enterprise surplus might not help you, because the consumer operating system is being redesigned to lock you out of it. Thirty years of pattern recognition on this one.
2026
-
07 Forthcoming
Attractor basins, hallucinations, and the gravitational pull of training data.
2026
-
08 Forthcoming
Managing a creative project across four frontier models, and what the division of labor reveals about each of them.
2026
Standalone
Also at kitchencloset.com