Proof of Concept in 2025
Issue 278: Reflecting on themes and issues from the year
When I write Proof of Concept, I don’t work from an editorial calendar. There’s no fixed schedule or predetermined list of topics. I write what’s top of mind—often experimental—and only in hindsight do the themes reveal themselves.
Looking back, it’s no surprise that much of this year circled AI. It was inescapable. But I’ve been intentional about not turning this into an AI newsletter. The work here has always been about something broader: entrepreneurship, creativity, and experimentation—and how shifts in technology reshape how we build, lead, and decide.
What happens when everyone can build anything?
In 2025, that question stopped being hypothetical. The barriers to building collapsed quickly. With a model, a prompt, and a thin UI layer, almost anyone could ship something that looked like software. At first, it felt like a creative surge. Then the sameness set in.
The clearest signal was the rise of one-shot prompt apps: entire products wrapped around a single clever prompt, exposed through chat, marketed as breakthroughs. Many worked once. Few held up over time. Outputs became repetitive, bloated, and increasingly sloppy. As more of these tools appeared, it was tempting to blame the models. But the real issue wasn’t capability—it was judgment.
When everyone can build, capability stops being the differentiator. What matters instead is intent: what a system is designed to do, when it should act, and when it should stay out of the way. This is where branding quietly became strategic again—not as marketing polish, but as a signal of taste and restraint. In a crowded field promising “more,” the tools that stood out were clear about what they chose not to do.
Abstraction followed the same pattern. Easier tools didn’t eliminate skill; they relocated it. Playing a DJ set doesn’t make the piano obsolete—it changes the relationship between intention and output. In software, abstraction shifted the work from executing primitives to orchestrating systems. Generating output was easy. Sequencing, editing, and knowing when to stop were not.
Chat interfaces intensified this tension. Natural language made AI accessible, but it also flattened interaction. Everything became conversational, even when precision was required. Small requests triggered large rewrites. Refinement produced variation instead of control. As chat became the default, the ecosystem filled with verbose, overconfident output—AI slop that felt helpful until exactness mattered. Differentiation began to emerge not from better prompts, but from better interfaces: places where language, controls, and constraints worked together.
By the end of the year, the answer came into focus. When everyone can build anything, building is no longer the advantage. Judgment is. Differentiation shows up in how systems are shaped, not how quickly they’re assembled. In a world of infinite output, the scarce skill is deciding what deserves to exist—and what doesn’t.
Writing from this year
Software development is changing
For professional builders, 2025 wasn’t about whether AI could help—it was about realizing that the underlying shape of software is shifting. The patterns many of us grew up with still work, but they’re no longer sufficient. I’ve become convinced that Model/View/Controller, as we’ve known it, is beginning to bend under the weight of intelligent systems.
In A new MVC is emerging, I explored how this shift isn’t about replacing architecture, but reassigning responsibility. Models are no longer static code and databases; they’re probabilistic systems that evolve. Views are less like destinations and more like interchangeable surfaces. Controllers—once simple translators of input—are becoming agentic, capable of planning and acting across systems. The boundaries still exist, but they’re softer and increasingly negotiated at runtime.
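To make the shift concrete, here is a minimal sketch of the difference between a classic controller and an agentic one. All names and functions here are hypothetical and purely illustrative; the "planner" is a stub standing in for a probabilistic model, not a real framework API.

```python
from typing import Callable

# Classic controller: a static route table, fixed at design time.
# One input maps to exactly one handler.
ROUTES: dict[str, Callable[[str], str]] = {
    "greet": lambda arg: f"hello, {arg}",
}

def classic_controller(command: str, arg: str) -> str:
    return ROUTES[command](arg)

# Agentic controller: given a goal, it produces a plan at runtime
# and executes it step by step across available tools.
TOOLS: dict[str, Callable[[str], str]] = {
    "fetch": lambda query: f"notes about {query}",
    "summarize": lambda text: text.upper(),
}

def plan(goal: str) -> list[str]:
    # Stand-in for a model-generated plan; a real system would
    # generate, revise, and re-sequence this list as conditions change.
    return ["fetch", "summarize"]

def agentic_controller(goal: str) -> str:
    state = goal
    for step in plan(goal):
        state = TOOLS[step](state)  # each step may act on a different system
    return state

print(classic_controller("greet", "world"))
print(agentic_controller("q3 roadmap"))
```

The point of the contrast: in the classic version the boundary between input and behavior is fixed in code, while in the agentic version the sequence of actions is decided at runtime—the "softer, negotiated boundaries" described above.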
This raised a deeper question: if the parts of the system can change without us explicitly rewriting them, what does it mean to maintain software over time? In The Platform of Theseus, I kept returning to the idea that modern platforms are never finished. Stability no longer comes from freezing change. It comes from designing for replacement.
As systems became more dynamic, the work shifted from implementing fixed behavior to defining constraints and guardrails. Interface design, orchestration, and feedback loops began to matter as much as correctness. In Real-time strategy games and AI interfaces, I looked to games as an analogy: environments where players don’t issue single commands, but shape strategy and respond to unfolding conditions. AI-native software increasingly works the same way.
Tools matter here, but not in the way the hype cycle suggests. In Using power tools, I argued that AI is less like magic and more like a table saw: immensely powerful, easy to misuse, and unforgiving if you don’t understand it. Professional builders aren’t defined by avoiding abstraction; they’re defined by knowing when precision matters and when speed is acceptable.
All of this led to a final realization: planning and execution are collapsing into the same surface. In The plan is the program, I wrote about how plans are no longer static artifacts handed off to be implemented later. In AI-native systems, plans are executable, revised in real time, and often carried out by the same agents that helped generate them.
Professional software development isn’t disappearing. It’s becoming more demanding. The work is less about writing code in isolation and more about shaping systems that can change without falling apart.
Strategy is compressing
For most of my career, strategy and execution lived at different altitudes. Strategy happened upstream—in decks and roadmaps. Execution followed downstream. That separation wasn’t philosophical; it was practical. Building took time. Changing direction was expensive.
That space collapsed in 2025.
As tools accelerated, iteration cycles compressed. Prototypes became cheap. Experiments became reversible. Decisions that once required alignment across functions could now be tested directly in the work. Strategy didn’t disappear—it moved closer to the surface.
In Strategy is compressing, I tried to name what many teams were feeling. When feedback loops shorten, the cost of being wrong drops. When the cost of being wrong drops, the value of over-deliberation collapses with it. Strategy no longer earns its keep by being exhaustive; it earns it by being directional.
This exposed a different bottleneck: decision-making itself. In Making decisions happen, I wrote about how many organizations weren’t slow because they lacked ideas, but because they lacked mechanisms to turn intent into action. As iteration sped up, indecision became more visible—and more costly.
Compression also changed how risk behaved. In Displacement variance, I explored how volatility didn’t disappear—it relocated. When execution becomes easier, uncertainty moves upstream. Strategy can no longer pretend to buffer ambiguity. It has to operate inside it.
This is why strategy increasingly looks like making. In Strategy-to-Pixels, I argued that influence now comes from the ability to translate intent directly into artifacts. A prototype can answer questions a memo never will. As iteration cycles collapse, the distance between deciding and doing shrinks with them.
What compressed wasn’t thinking—it was the gap between thought and consequence.
Strategy didn’t become less important. It became less abstract.
Looking to 2026
If 2025 was about compression, 2026 feels like it will be about clarity.
The novelty phase is wearing off. AI is no longer surprising, and that’s a good thing. The next year won’t be defined by bigger models or louder demos, but by quieter questions: Where should intelligence live? What deserves automation—and what doesn’t? How do we design systems that help people decide, not just produce?
The builders who will matter most in 2026 won’t be the fastest or the loudest. They’ll be the ones who can shape constraints, design feedback loops, and hold a point of view while everything around them stays fluid. Professional tools will continue to raise the ceiling. Democratized tools will continue to raise the floor. The opportunity—and responsibility—is in building the bridges between them.
I don’t expect 2026 to be calmer. But I do expect it to be more legible. The questions are getting better. The tools are getting sharper. And the people doing the work are learning, again, that leverage doesn’t come from doing more—it comes from deciding what matters.
That’s what I’ll be paying attention to next.
Hyperlinks + notes
The Silent Killer in Your CI/CD Pipeline: Why “It Works on My Machine” Isn’t Enough
Nicolas Cage scribbles up an impression in the first Madden trailer
Someone Stole $400,000 Worth of Live Lobsters en Route to Costco
Why Apple’s Foldable iPhone May Be Smaller Than Expected → Burgers should be wider, not phones!
18 Quotes that Defined 2025: Andrej Karpathy, Sam Altman & Pope Leo XIV on AI
I Exceeded My Creator Goals by 2x This Year: Here’s What Worked and What Didn’t by Peter Yang
A Call for New Aesthetics → This is an awesome initiative by Patrick Collison and Tyler Cowen