The Prototype That Passed Every Lab Test and Failed at the Coast
Making Electronics Survive the Physical World
There is a moment in every hardware engineer's career (if they're paying attention) when they realize that the circuit is the easy part.
Rushikesh Deshmukh's moment came early. Fresh out of his doctorate, working on mobile robotics systems, he watched a prototype navigation unit fail in the field. Not because the algorithm was wrong. Not because the processor was undersized. Because condensation formed on a connector during a temperature swing, caused a brief electrical short, and corrupted the communication bus between components. The robot lost its spatial awareness. The team spent three days debugging software before someone thought to open the housing.
"That was the wake-up call," Rushikesh says. "The algorithm was flawless. The circuit board was correct. The failure was in the interface between the electronics and the physical world. That interface, where the circuit meets reality, became my career."
It is a career that would span embedded systems for consumer electronics, industrial automation, and eventually the extreme operating environment of permanent outdoor venue infrastructure. But the through-line was always the same question: how do you make electronics survive conditions that actively try to destroy them?
Good Enough as a Design Philosophy
Before OZ, Rushikesh spent years in consumer electronics, a world where "good enough" isn't a compromise but a design philosophy.
"Consumer electronics operate within a very narrow environmental envelope," he explains. "Room temperature. Controlled humidity. A user who brings the device indoors when it rains. You design a phone to last two years. You design a laptop to last four. You design them knowing that the user will control the environment."
This paradigm (where the environment cooperates with the hardware) defines how most engineers think about building things. Weatherproofing is an afterthought. Cooling means "put a fan in it." The market rewards thin, light, and cheap. It doesn't reward a product that works identically on day one and day 1,826.
"I spent years building products with a two-year design life. Well-engineered products. Beautiful products. Products that would be landfill within 36 months. There is a kind of existential dissatisfaction in that, if you care about craft."
Why Build What Lasts?
The question haunted him: why build hardware that lasts five or more years when the industry rewards annual replacement?
"The honest answer is that most of the industry doesn't need hardware that lasts. A smartphone that lasts two years and costs 400 dollars serves its user well. The economics of planned obsolescence are rational for consumer markets."
But there are domains where planned obsolescence isn't an option. Infrastructure. A bridge lasts 75 years. A stadium lasts 50. The hardware deployed to those structures must operate on infrastructure timescales, not consumer timescales. When you mount a compute node on a stadium gantry 30 meters above the pitch, nobody is climbing up there every 18 months to swap it out. The hardware must survive. Not metaphorically. Physically.
"The industry was telling me that hardware is a commodity. That the value is in the software. That enclosures are ordered from a catalog and forgotten. I knew that was wrong for the domains I cared about, but I didn't yet have a company that agreed."
Most AI companies treat hardware as a commodity input. OZ treats it as the foundation of the entire business. When your product requires permanent outdoor compute nodes operating unattended for years, the hardware layer becomes the most defensible part of the stack, because the knowledge required to build it can only be earned through time and failure, never purchased.
Venue-Grade Infrastructure
The turning point was OZ. Not just a company that took hardware seriously, but a company where the entire product thesis depended on hardware surviving the physical world.
"When I first saw the deployment requirements (permanent outdoor installation, fully sealed enclosure, powerful AI compute running at sustained load, robotic camera gimbal, weatherproofing to survive any storm, operational across all four seasons, five-year minimum design life, zero scheduled maintenance) I knew this was the engineering problem I'd been looking for."
The challenge wasn't any single requirement. Any one of those constraints is solvable. The challenge was all of them simultaneously. A sealed enclosure means no fans, so you have to cool hundreds of watts of AI compute using only passive cooling. Sealing against weather means no air exchange, so you have to manage humidity buildup from temperature swings without vents. A robotic camera gimbal requires precision mechanics inside a sealed box across thousands of operating hours.
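The passive cooling constraint can be made concrete with a back-of-envelope calculation. The sketch below applies Newton's law of cooling to estimate how much radiating surface a sealed enclosure needs to shed its heat; the wattage, convection coefficient, and temperature limits are illustrative assumptions, not OZ's actual design figures.

```python
# Back-of-envelope passive cooling budget. All numbers are
# illustrative assumptions, not OZ's actual design parameters.

def required_heatsink_area(power_w: float,
                           h_conv: float,
                           ambient_c: float,
                           max_surface_c: float) -> float:
    """Area (m^2) needed to shed `power_w` by natural convection,
    from Newton's law of cooling: Q = h * A * dT."""
    delta_t = max_surface_c - ambient_c
    if delta_t <= 0:
        raise ValueError("surface must be hotter than ambient")
    return power_w / (h_conv * delta_t)

# 300 W of compute, free-convection coefficient ~6 W/(m^2*K),
# 55 C worst-case ambient, 85 C surface limit:
area = required_heatsink_area(300, 6.0, 55.0, 85.0)
print(f"{area:.2f} m^2 of effective fin area")  # → 1.67 m^2
```

The point the arithmetic makes: at worst-case ambient, the available temperature margin shrinks, so the enclosure's entire surface becomes part of the thermal design rather than an afterthought.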
"You can't A/B test a sealed enclosure. You can't deploy two versions and see which one gets more clicks. You design it, build it, test it against physics, and either it works or it doesn't. The feedback cycle is months, not minutes. That demands a different kind of engineering patience."
Every Failure Mode Discovered the Hard Way
The testing regimen for OZ edge hardware reads like an adversarial simulation of everything a stadium can inflict on an electronic system.
Thermal cycling: repeated transitions from -20 to +55 degrees Celsius, simulating winter nights to summer afternoons. UV exposure: thousands of hours to test how materials hold up over a five-year lifespan. Vibration endurance: simulating the mechanical stress of 50,000 fans jumping in unison. Moisture testing: pressurized water jets at every seal, every cable entry point, every mounting interface.
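A chamber schedule for the thermal cycling test described above might be generated like this; the ramp rate and dwell times are assumptions for illustration, not OZ's actual regimen.

```python
# Sketch of a thermal-cycling setpoint schedule: repeated -20 C to
# +55 C transitions with ramps and dwells. Ramp rate and dwell
# times are illustrative assumptions.

def thermal_cycle_profile(cycles: int,
                          low_c: float = -20.0,
                          high_c: float = 55.0,
                          ramp_c_per_min: float = 5.0,
                          dwell_min: int = 30):
    """Yield (minute, setpoint_c) pairs for a chamber controller."""
    t = 0
    temp = low_c
    for _ in range(cycles):
        for _ in range(dwell_min):          # soak at cold extreme
            yield t, low_c; t += 1
        while temp < high_c:                # ramp up
            temp = min(high_c, temp + ramp_c_per_min)
            yield t, temp; t += 1
        for _ in range(dwell_min):          # soak at hot extreme
            yield t, high_c; t += 1
        while temp > low_c:                 # ramp down
            temp = max(low_c, temp - ramp_c_per_min)
            yield t, temp; t += 1

profile = list(thermal_cycle_profile(cycles=1))
print(len(profile), "setpoints per cycle")  # → 90 setpoints per cycle
```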
"Each test represents a failure mode we discovered the hard way, through field deployments, through prototypes that passed lab testing but failed in real conditions. The thermal cycling test exists because an early prototype developed micro-cracks in a gasket after 200 cycles (invisible to inspection, large enough to admit moisture). The vibration test exists because a mounting bracket fatigued at a stadium where a supporters' section created sustained resonant vibration during matches."
Dr. Rushikesh Deshmukh
Head of Enclosure & Edge Systems
"Every failure in the field becomes a test case in the lab. Every test case becomes institutional knowledge. That knowledge is the moat, and it only accumulates with time."
The Prototype That Passed Every Lab Test
The defining moment (the one that changed how Rushikesh thinks about testing) was a prototype that passed every lab test but failed in the field.
"We had a second-generation enclosure. It passed thermal cycling. It passed waterproofing. It passed UV exposure. It passed vibration. Every lab test, every environmental chamber, every accelerated aging test. We were confident. We shipped it to a coastal venue for a pre-season deployment."
Within six weeks, the internal humidity sensor flagged unusual readings. Moisture was accumulating inside the sealed box. Not from rain; the seals held. Not from condensation; the thermal design prevented that. The water was getting in through a mechanism nobody had anticipated.
"Coastal air carries salt particles. The salt deposited on the enclosure surface, particularly around cable entry points. Over repeated temperature cycles (warm days, cool nights) the salt created a moisture-attracting film that drew water along the cable jacket, past the seal, and into the enclosure. The amount was minuscule per cycle, but it accumulated over weeks. No lab test had replicated this because no lab test used salty coastal air with real day-night temperature swings over a six-week period."
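Catching that kind of slow accumulation is a trend-detection problem, not a threshold problem: each cycle's ingress is individually negligible. Below is a minimal sketch of such a check, fitting a least-squares slope to recent humidity readings; the window size and slope limit are illustrative assumptions, not OZ's monitoring logic.

```python
# Flag slow, monotonic humidity accumulation that point thresholds
# miss. Window and slope limit are illustrative assumptions.

from statistics import mean

def humidity_trend_alert(readings_rh, window=336, max_rh_per_step=0.01):
    """Return True if the least-squares slope over the last `window`
    samples exceeds `max_rh_per_step` (%RH per sample)."""
    recent = readings_rh[-window:]
    n = len(recent)
    if n < 2:
        return False
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(recent)
    slope = (sum((x - x_bar) * (y - y_bar)
                 for x, y in zip(xs, recent))
             / sum((x - x_bar) ** 2 for x in xs))
    return slope > max_rh_per_step

# Flat 40 %RH: no alert. Creeping +0.05 %RH per sample: alert.
flat = [40.0] * 400
creeping = [40.0 + 0.05 * i for i in range(400)]
print(humidity_trend_alert(flat), humidity_trend_alert(creeping))
# → False True
```

A slope test over a long window is what distinguishes a weeks-long accumulation from normal day-night humidity swings, which average out to near-zero trend.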
The fix required redesigning the cable entry point with a secondary seal and a salt-resistant coating. That kind of design breakthrough (born from a specific real-world failure that no simulation predicted) is exactly what competitors would need to rediscover independently. There are no shortcuts. The physics doesn't read your pitch deck.
An Unreplicable Body of Knowledge
Every season, every venue, every climate zone adds to OZ's institutional hardware intelligence. Not theory, but empirical knowledge earned through things breaking and being fixed.
"People ask about competitive moats, and I always bring it back to this: how many seasons has your hardware survived outdoors? How many failure modes have you discovered and resolved? How many venues across how many climate zones have tested your designs? These aren't questions you can answer with capital. You answer them with time."
The thermal design breakthroughs came from a prototype that kept overheating during a summer deployment. The passive cooling didn't have enough surface area for the intense southern European sun. The power protection innovations came from a deployment where stadium floodlights switching on and off sent electrical spikes that crashed the AI compute. Each breakthrough required months of investigation, redesign, and revalidation.
"A well-funded competitor could hire great engineers and start building tomorrow. In two years, they'd have a product that works in a lab. In three years, they'd have a product that works at a few venues. But they wouldn't have five years of field data across climate zones. They wouldn't know about the salt corrosion problem, or the resonant vibration problem, or the dozen other failure modes we've discovered and solved. They'd encounter each one for the first time, just as we did. And each one takes months to diagnose and fix. You can't compress that timeline with money."
This is the hardware version of Waymo's autonomous driving hours argument: the accumulated field experience is itself the competitive advantage. A competitor starting today would need to deploy their own hardware, across their own venues, through their own seasons, discovering their own failure modes, a process that takes years regardless of funding. In Physical-AI infrastructure, time is the moat.
Why Atoms Are the New Defensibility
Rushikesh returns to first principles when he talks about competitive advantage.
"The hardware moat is real because it's physical. You can't fork a sealed enclosure on GitHub. You can't download a cooling system from a model zoo. You can't spin up a vibration-resistant mounting bracket in the cloud."
"Software moats erode quickly. A better algorithm, a larger dataset, a faster chip: any of these can neutralize a software advantage in a single product cycle. Hardware moats erode slowly, if at all. Every season that our enclosures survive in the field (every summer heatwave, every winter freeze, every storm) adds to the empirical evidence that this hardware works. That evidence compounds as reputation with venue operators. And reputation, in infrastructure, is the ultimate moat."
He leans forward.
"VCs keep asking me what our defensibility is. I tell them: go build this. Seriously. Take a hundred million dollars, hire the best engineers, start today. In five years, you'll be where we were three years ago. That's not arrogance. That's physics. The clock doesn't care about your funding round."
The Innovation That Never Stops
Eighteen years in. And Rushikesh Deshmukh is still iterating.
"The next generation focuses on three things. First, more computing power per watt of heat, because the AI models will always want more, and the heat budget is fixed by physics. Second, modular compute trays that let us swap in next-generation chips without replacing the entire enclosure, because the enclosure should outlast multiple generations of silicon. Third, built-in environmental sensors that feed hardware health data directly into the world model."
That third point is where Rushikesh's eyes light up. The enclosure itself becoming a sensor, reporting its own temperature, power quality, and mechanical stress as data that the world model ingests alongside player positions and ball trajectories. The hardware not just supporting the intelligence layer but contributing to it.
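One plausible shape for that self-reported health data, serialized alongside the tracking stream, is a simple telemetry record; the field names here are hypothetical, for illustration only.

```python
# Hypothetical shape of the self-reporting telemetry an enclosure
# could emit alongside tracking data. Field names are illustrative.

from dataclasses import dataclass, asdict
import json

@dataclass
class EnclosureHealth:
    node_id: str
    timestamp_s: float
    internal_temp_c: float
    internal_rh_pct: float     # relative humidity inside the seal
    supply_voltage_v: float    # power quality at the compute tray
    vibration_rms_g: float     # mechanical stress at the mount

    def to_json(self) -> str:
        return json.dumps(asdict(self))

sample = EnclosureHealth("gantry-07", 1700000000.0,
                         48.5, 31.2, 23.9, 0.04)
print(sample.to_json())
```

The design choice worth noting: once hardware health rides the same pipeline as player positions, the same anomaly-detection machinery can watch both.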
"People ask me what keeps me going after 18 years. It's this: every problem I solve reveals a deeper problem. Every failure mode I eliminate exposes a subtler one. The physics doesn't change, but our understanding of it deepens with every deployment, every season, every venue. That is what compounding means in hardware. Not doing the same thing faster. Understanding the same physics more completely."
He pauses.
"Eighteen years isn't 18 years of the same year repeated. Each year, the constraints change: new chips, new thermal profiles, new venues, new climates. But the physics doesn't change. Gravity. Thermodynamics. Material science. The constants are the foundation. The variables are the challenge. And the institutional knowledge we've built, that's the proof that we met the challenge. Every season. For years. And we're just getting started."