The risk of being incremental
Issue 249: What the failure of the Sega 32X taught us about tech adoption
Growing up, my brother and I were avid video game players. Though we started with the Nintendo Entertainment System (NES), we were a Sega family. I honestly can’t remember why. Maybe it was the marketing aimed at older kids, or the fact that Altered Beast—a game we obsessed over at our local Pizza Hut while redeeming Book It stars for personal pan pizzas—was exclusive to Sega. Or maybe it was simpler than that: Sega was 16-bit, and in my child brain, double the bits meant double the power.
Whatever the reason, we stayed loyal to the Sega ecosystem. When the Sega CD came out, we got it. It was wild. A Marky Mark video game, full-motion video, and CDs! It was clear even then: technology signaled progress.
So when the Sega 32X launched in 1994, we jumped in. Holy shit—32 bits? That’s four times more than what Super Mario Bros and Duck Hunt offered! But something felt different. The 32X returned to cartridges, and the game library was underwhelming. Then, just five months later, Sega released the Saturn, a completely different CD-based console. Even as a kid, I remember thinking: Why would they release two consoles in the same year? There was no way our parents were buying another system that soon.
It wasn’t until I was an adult that I learned what actually happened at Sega back then. 1994 was a transitional moment in technology. CDs had been around for years, but CD-based hardware was still expensive to build. Internally, Sega faced a fork in the road: double down on CDs and bet on the future, or stick with the reliable cartridge format. Hedging, they did both. They shipped the 32X—a stopgap, cartridge-based add-on—and followed it with the Saturn, a next-gen CD console.
It was the worst possible move.
By splitting their ecosystem, Sega fragmented their customer base, confused developers, and diluted the excitement. They tried to be incremental and futuristic at the same time—and failed at both. Just months later, Sony entered with the PlayStation, an unapologetically CD-based console that would go on to dominate the market. Sega never recovered.
Thinking in incremental steps during a transformational shift is like being the frog in slowly boiling water: by the time you realize how hot it is, it’s too late. Sega’s dual-console debacle in 1994 is a perfect case study of what happens when you try to cautiously straddle the old and the new.
Tech is having a Sega moment. We’re in the middle of another platform shift, and this time it’s AI. In uncertain times, customers want clarity. If you’ve ever worked in customer success, sales, or product strategy, you know that people don’t just buy what your product does today—they buy your vision. They want to know how you see the future. Not a roadmap, but a point of view. When the landscape is shifting, indecision sends a signal, and lack of conviction erodes trust.
AI adoption feels like the Blu-ray vs. HD DVD moment. There’s confusion, hype, and half-measures. We see tools layering AI into old workflows instead of rethinking the interface. Leaders hedge instead of committing. The instinct is to do both: protect the old while poking at the new.
But the winners in platform shifts aren’t the ones who play it safe. They’re the ones so convicted they’re willing to lose. Sega played to protect. Sony played to win.
Everyone senses a shift, but no one knows exactly which format—or in this case, which interface, workflow, or foundation—will win. When Sony launched the PlayStation 3, it shipped with a Blu-ray drive at a time when HD DVD was still a serious contender. It was a risky bet: Blu-ray players were expensive, and HD DVD had the backing of major studios. But Sony had conviction—not just in the technology, but in its role as a Trojan horse. By embedding Blu-ray in every PS3, they didn’t wait for the market to decide. They moved it. Over time, Blu-ray became the default medium for games and high-definition video.
Emerging AI technologies like Model Context Protocol (MCP), voice-native interfaces, agentic runtime environments, and contextual memory frameworks are today’s equivalents. They’re powerful, but unproven. Adoption won’t hinge solely on performance; it’ll also come down to distribution, developer ecosystems, and confidence. The winners in this era won’t be the ones who hedge. They’ll be the ones who, like Sony with Blu-ray, bet boldly and shift the market in the process.
Placing bets in high-displacement markets is a huge risk. But the winners in those markets are the ones with so much conviction that they’re willing to risk losing, and that’s how they win.
If you’re not re-imagining everything about what you’re building right now, you’re likely building the 32X.
What’s your move?
Hyperlinks + notes
First Builders and the New Tech Mafia by
→ Check out the new podcast by Amber and Rachel. Their first guest is the incredible Rohini Pandhi, Head of Expansion at Mercury.

The 6 Builders who will Thrive in the Era of AI by
I think Sony is a good example of both sides of this story. They’ve defined markets and driven adoption of new technology with the CD, Blu-ray, and Betacam, and then suffered from their proprietary tech with Betamax, MiniDisc, and UMD. Apple as well, though with a reductionist approach to driving adoption of existing technology.
But maybe the takeaway from their history is the importance of that boldness, even when you know it can fail.