Testing the future in production
Issue 263: Apple’s quiet strategy: prototype in plain sight, refine in production
This past week was Apple's "Awe Dropping" event, and much of the attention was on the iPhone Air, the new mobile device with a sleek and modern design. Setting aside the camera bump, it is the thinnest phone Apple has developed since the iPhone 6. For me, the iPhone Air is not about a new phone; it's a signal of Apple's direction in experience. The most important image in the keynote was the detailed view of the camera bump: fitting the entire chip system into that bump is a remarkable feat of industrial design. The device itself isn't the exciting part for me; it's what the technological hints suggest may come next.
The iPhone Air is a clear forerunner of the much-rumored iPhone Fold, but it also signals a future where software-enabled devices are as small as a camera bump—or even smaller. This has implications for AR headsets, smart glasses, future Vision Pro concepts, and compact home devices. In this post, we’ll look at examples of how Apple has tested concepts in production that later became real, and the lessons we can take from how they test user experience in the wild.
Apple's history of testing in production
Some of these moments have since been confirmed by Apple, but it's best to view them as retrospective examples of Apple testing UX in production. There are different aspects to test: experience, software, and materials. The launch of the Apple Watch shows how certain technologies and capabilities were shipped step by step.
One of my favorite devices was the 6th-generation iPod nano, which replaced the click wheel with a 240×240 pixel multi-touch interface inspired by the recently launched iPhone. One of my favorite accessories for the nano was a wristband that allowed you to wear it like a watch. Five years later, we saw the release of the Apple Watch with a much more advanced interface and technology, but one could argue that the experience had been tested years earlier.
During the summer of 2013 at WWDC, many people noticed something about the iOS 7 beta. Aside from the thin Helvetica Neue font and the introduction of slow-motion video, the Clock app icon displayed the actual time, its hands moving in real time. iOS 7 also introduced motion tracking, enabled by the M7 chip. The Health app then appeared in the iOS 8 beta.
A year later, in September 2014, Apple announced the Apple Watch built on those same interface elements. The Watch became a nexus point where multiple experiences came together, even though they had been shipped independently before.
The Apple Watch itself also became a testbed. Force Touch on the Watch was a predecessor to 3D Touch on the iPhone and Force Touch trackpads on the Mac, even though Force Touch was not originally intended for the Watch. It was also the first Apple device with an Always-On display, which later scaled to larger devices.
Apple is always hinting at where they are going. We just have to see it clearly. For the Apple Watch, the play was simple:
Use iPhone apps as a way to test unreleased software.
For hardware and new interactions, test on the small watch form factor before releasing to iPhone, iPad, and Mac.
Experience strategy lessons
Whether you're working on a business, a brand, or a piece of software, I expect all my designers to be platform and ecosystem designers. When you design for platforms, you have to think about biodiversity. The best way to test the implications of something new is to introduce it to the ecosystem slowly.
Identify the nexus point and work backwards
Before you can incrementally land pieces of what you're building towards, you need to know what the moment is. The nexus point is the big moment you build up to and ship. The Apple Watch was the nexus point for some of the examples mentioned above. The iPhone Air's nexus point might be the foldable iPhone, or something even bigger.
As Product Marketing and Product Management become more infused as crafts, there is a need to balance the narrative with what you ship. If you're too waterfall, you'll lose the moment. If you're too agile, it may not be clear to the customer what the endgame is. Defining a nexus point gives you a committed point to work backwards from.
Ship design patterns for experimental introduction
In ecology, experimental introduction refers to releasing a species in a controlled way to study how it adapts, survives, and interacts in a new environment.
A design system works the same way. Its purpose is to scale quality and consistency for the customer. But when a system exists only to preserve the status quo, it stops serving the customer and starts serving itself. To stay healthy, a system needs clear intentions to guide how and when it evolves.
My rule of thumb has always been to introduce a single component that signals the new world—often something as simple as a button, since it ships easily. In some cases, the experiment might be larger, even an entire application.
Knowing when to subvert the system to propel it forward is good design systems thinking.
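To make the idea concrete, here's a minimal sketch of what an experimental introduction can look like in code, assuming a React-style design system. The names (ExperimentalButton, cohort, the data attributes) are illustrative inventions for this example, not anything a specific design system ships.

```tsx
// Hypothetical sketch: introduce one clearly flagged "new world" component
// and limit its exposure, so you can watch how the ecosystem adapts
// before the pattern spreads.
import React from "react";

type ExperimentalButtonProps = {
  label: string;
  onPress: () => void;
  // Rollout cohort keeps the introduction controlled while you observe it.
  cohort?: "internal" | "beta" | "general";
};

export function ExperimentalButton({
  label,
  onPress,
  cohort = "internal",
}: ExperimentalButtonProps) {
  const handleClick = () => {
    // Record how the new pattern behaves in the wild before scaling it.
    console.info("experimental-button:pressed", { cohort });
    onPress();
  };

  return (
    <button
      data-experiment="new-world-button"
      data-cohort={cohort}
      onClick={handleClick}
    >
      {label}
    </button>
  );
}
```

The point isn't the button itself; it's that one clearly flagged component gives you a controlled way to see how the rest of the system reacts before you commit to the new direction.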
Test materials and capabilities
I’ve held onto my Apple Watch Series 3 because I love the ceramic casing. Apple eventually discontinued ceramic as a watch material, but the experiment wasn’t wasted. The lessons from working with zirconia-based ceramic on the Watch informed Apple’s approach to tougher, more resilient materials. In 2020, Apple introduced Ceramic Shield glass on the iPhone 12—a composite of glass infused with nano-ceramic crystals. While not the same as a full ceramic enclosure, it was clearly inspired by Apple’s earlier exploration, bringing the durability benefits of ceramic to millions of iPhones.
This idea applies directly to AI and LLMs. Think of an LLM as a new material being introduced. An experience designed with AI from the start feels fundamentally different from one where AI is bolted on later. Even something as simple as a button that calls an LLM can reshape expectations—users click without knowing exactly what happens on the other side, yet the material itself changes the entire interaction.
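As a rough illustration of that "button that calls an LLM," here's a minimal TypeScript sketch. The endpoint, payload shape, and element IDs are assumptions made for the example, not any particular provider's API.

```ts
// Hypothetical sketch: a button whose click hands text to a model.
// The user clicks without knowing exactly what runs on the other side,
// but the new "material" reshapes the whole interaction.
async function summarizeOnClick(noteText: string): Promise<string> {
  const response = await fetch("/api/llm/summarize", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt: `Summarize this note:\n${noteText}` }),
  });
  if (!response.ok) {
    // The model can fail or stall; the experience has to absorb that uncertainty.
    throw new Error(`LLM call failed: ${response.status}`);
  }
  const { text } = (await response.json()) as { text: string };
  return text;
}

document.querySelector("#summarize")?.addEventListener("click", async () => {
  const note = (document.querySelector("#note") as HTMLTextAreaElement).value;
  const summary = await summarizeOnClick(note);
  console.log(summary);
});
```

Even in a sketch this small, the material shows through: latency, failure, and non-deterministic output all become part of the interaction the moment the button is wired to a model.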
Nail the experience before mass adoption
Apple often takes a small, tightly scoped interaction and perfects it before scaling it to the broader ecosystem. A clear example is the Digital Crown on Apple Watch. When it launched in 2015, the Crown solved a very specific problem: how do you navigate, zoom, and scroll on a screen so small that your finger would block most of the content? By anchoring the interaction to a physical dial, Apple created a precise, tactile way to control the Watch without compromising visibility.
At first, it seemed like a watch-only solution. But over time, Apple scaled the concept. The AirPods Max adopted a crown-like dial for intuitive audio control, and the Vision Pro headset placed a Digital Crown on top to adjust immersion levels and switch between AR and VR modes. What started as a micro-interaction for a niche product became a core control mechanism across entirely new categories. Apple’s playbook here is deliberate: nail the experience in a constrained context, then expand it once it’s proven.
Recap
Apple has a reputation for not testing with customers or doing research. I don't know where people get that idea; perhaps it's a convenient way to lean on the Steve Jobs mythology. My point of view is that Apple does customer testing in production, in key markets.
A few things to remember:
The nexus point matters most to customers; identify the go-to-market moment that tells that story.
Testing new concepts in production is how you pace-layer change.
Experimental introduction is a powerful way to see how the ecosystem adapts to new UI interactions and patterns.
Small features can be prototypes in plain sight; what feels minor today may be groundwork for tomorrow's product.
Materials and interactions scale differently; test them first in constrained contexts before expanding to the broader ecosystem.
For you to be successful in designing with AI and for AI, it's important to be both a platform designer and an ecosystem designer. Identify the nexus point where the big moment comes together, then ship pieces of it as you go. You don't have full control over where things end up, but you can shape the next moves.
Hyperlinks + notes
Despite all the discourse about titles, I still am an interaction designer at heart
Designing for Apple Watch before it even ships → A fun look back at prototyping for the Apple Watch 11 years ago (!!!), when I didn't have the device
Enter Dynamic Island, a major hint at Apple's Extended Reality (XR) strategy
From idea to interface: A designer’s guide to AI-powered prototyping by CJ Gammon
Congrats, Replit, on launching Agent 3!
Have an amazing Webflow Conf in New York
Inside Claude Code: How an AI Native Team Actually Works | Cat Wu
Vibe Coding 101: 23 practical tips to build functional prototypes
Great post by Dan Romero about strong remote cultures