

Why Cloud AI Fails at the Physical Edge

Article · March 5, 2026


Cloud AI has won the batch processing war. For training models, running analytics, processing documents, and generating images, the cloud is the right answer. Elastic compute, pay-per-use pricing, and near-infinite horizontal scaling make it unbeatable for workloads that can tolerate latency.

Physical-AI infrastructure isn't one of those workloads.

The latency wall

A cloud round-trip adds 50–200ms of network latency. For a chatbot, that's imperceptible. For a robotic gimbal tracking a player at full sprint, it's the difference between a broadcast-quality close-up and a missed shot.

OZ's published SLO is p99 latency ≤120ms, measured from photon capture to structured spatial output. That budget includes detection, tracking, prediction, camera cueing, gimbal movement, and frame validation. There's no room for a network round-trip.

This isn't an optimization problem. No amount of edge caching, CDN placement, or network engineering eliminates the physics of light traveling through fiber. The inference must happen where the cameras are.
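To make the budget concrete, here is a sketch of the arithmetic. Only the 120ms p99 SLO and the 50–200ms round-trip range come from this article; the per-stage durations are illustrative assumptions, not OZ's published figures:

```python
# Illustrative latency budget check. Stage durations are assumptions;
# only the 120 ms SLO and the 50-200 ms round-trip range are from the article.
SLO_MS = 120

edge_pipeline_ms = {
    "detection": 25,
    "tracking": 15,
    "prediction": 10,
    "camera_cueing": 10,
    "gimbal_movement": 40,
    "frame_validation": 10,
}

CLOUD_ROUND_TRIP_MS = 50  # best case from the 50-200 ms range

def total_ms(stages, network_ms=0):
    """Total pipeline latency, optionally adding a network round-trip."""
    return sum(stages.values()) + network_ms

print(total_ms(edge_pipeline_ms))                       # 110 -> fits the SLO
print(total_ms(edge_pipeline_ms, CLOUD_ROUND_TRIP_MS))  # 160 -> blows it
```

Even with the most optimistic round-trip from the range, the cloud path exceeds the SLO before any inference has happened.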

The availability wall

Cloud AI depends on network connectivity. A venue with a momentary network interruption loses all AI capability if inference runs in the cloud.

OZ venues continue operating through network outages. The edge compute unit runs the full inference pipeline locally. A network outage means the cloud doesn't receive telemetry. It doesn't mean the venue stops producing. The broadcast continues. The spatial data continues. The control loops continue.
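The failover behavior described above can be sketched as a control loop that never depends on the uplink: inference runs locally either way, and telemetry is simply queued while the network is down. All names here are hypothetical illustrations, not OZ's implementation:

```python
# Hypothetical sketch: local inference is unconditional; the uplink is
# best-effort. Class and method names are illustrative.
from collections import deque

class EdgeNode:
    def __init__(self):
        self.network_up = True
        self.telemetry_backlog = deque()

    def infer(self, frame):
        # Local inference: runs identically with or without connectivity.
        return {"frame": frame, "tracks": ["player_7"]}

    def ship(self, result):
        pass  # placeholder for the cloud uplink

    def process(self, frame):
        result = self.infer(frame)  # broadcast/control path: always live
        if self.network_up:
            self.ship(result)       # cloud receives telemetry when it can
        else:
            self.telemetry_backlog.append(result)  # buffer and keep going
        return result

node = EdgeNode()
node.network_up = False
outputs = [node.process(f) for f in range(3)]  # venue keeps producing
```

The point of the structure is that connectivity only gates the `ship` branch, never the inference itself.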

For infrastructure that operates under published SLOs with financial consequences for downtime, cloud dependency is an unacceptable single point of failure.

The cost wall

Cloud inference costs approximately $0.50 per equivalent operation. Edge inference costs approximately $0.05, a 90% reduction.

For a venue processing six 4K60p streams continuously during every match, every training session, every event, the cost difference isn't marginal. An always-on workload that processes over 100 Gbps of raw sensor data per match can't economically run in the cloud.

The cost crossover between cloud and edge happened in 2024–2025. For continuous, high-throughput, latency-sensitive workloads, edge is now cheaper AND faster. The economic argument is settled.
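The 90% figure follows directly from the per-operation costs. A back-of-the-envelope model, using the article's $0.50 and $0.05 figures but made-up volume numbers, shows how the gap scales with an always-on workload:

```python
# Cost comparison using the article's per-operation figures.
# Operations-per-match and matches-per-month are hypothetical.
CLOUD_COST_PER_OP = 0.50
EDGE_COST_PER_OP = 0.05

def monthly_cost(cost_per_op, ops_per_match, matches_per_month):
    return cost_per_op * ops_per_match * matches_per_month

ops = 10_000   # hypothetical inference operations per match
matches = 20   # hypothetical matches per month

cloud = monthly_cost(CLOUD_COST_PER_OP, ops, matches)  # $100,000
edge = monthly_cost(EDGE_COST_PER_OP, ops, matches)    # $10,000
savings = 1 - edge / cloud                              # 0.9 -> the 90% reduction
```

Whatever the real volumes are, the ratio is fixed by the per-operation prices, so the saving stays at 90% at any scale.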

The sovereignty wall

The EU AI Act, DORA, and NIS2 create mandatory data residency and sovereignty requirements. Video data from European venues processed through non-EU cloud providers faces compliance risks that edge-sovereign architecture eliminates by design.

OZ processes all perception data at the venue edge. Raw video never leaves the venue. Only derived structured data (trajectories, events, spatial context) traverses the network, and only to destinations the data controller specifies.

This isn't a feature added for compliance. It's the architecture. Edge-sovereign means the data stays where the cameras are, processed by compute that the venue controls.
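The egress policy described above reduces to two checks: raw video never leaves, and derived data leaves only for controller-approved destinations. A minimal sketch, with illustrative field and destination names:

```python
# Sketch of the edge-sovereign data split. Record fields and the
# destination allowlist are illustrative assumptions.
ALLOWED_DESTINATIONS = {"league-analytics.eu"}  # set by the data controller

def may_egress(record, destination):
    # Raw video is rejected outright, regardless of destination.
    if record.get("type") == "raw_video":
        return False
    # Derived structured data (trajectories, events, spatial context)
    # may leave, but only to controller-specified destinations.
    return destination in ALLOWED_DESTINATIONS
```

Because the raw-video check comes first and is unconditional, no configuration of the allowlist can cause video to leave the venue.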

When cloud is right

OZ uses the cloud for everything that can wait: model training, fleet analytics, playbook optimization, reporting, and long-term storage. The split is simple: time-critical decisions run at the edge; everything else runs in the cloud.
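The split is simple enough to state as a placement rule: anything with a hard real-time deadline runs at the edge; anything that can wait runs in the cloud. A toy router, with hypothetical workload names and deadlines:

```python
# Toy placement rule for the edge/cloud split described above.
# Workload names and deadlines are illustrative assumptions.
EDGE_LATENCY_BUDGET_MS = 120

WORKLOADS = {
    "gimbal_control":    {"deadline_ms": 120},   # time-critical
    "player_tracking":   {"deadline_ms": 120},   # time-critical
    "model_training":    {"deadline_ms": None},  # can wait
    "fleet_analytics":   {"deadline_ms": None},  # can wait
    "long_term_storage": {"deadline_ms": None},  # can wait
}

def place(workload):
    """Return 'edge' for deadline-bound work, 'cloud' for everything else."""
    deadline = WORKLOADS[workload]["deadline_ms"]
    if deadline is not None and deadline <= EDGE_LATENCY_BUDGET_MS:
        return "edge"
    return "cloud"
```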

The mistake is assuming that because cloud AI works for most software, it works for physical systems. It doesn't. Physical systems need deterministic latency, local availability, cost-efficient continuous processing, and data sovereignty. The edge provides all four. The cloud provides none.