When AI Moves Physical Systems: The Spatial Graph as Real-Time Control Plane

Spatial Robotics

Article March 5, 2026


Most AI systems stop at detection. They identify objects, classify scenes, and produce annotations, but they don't act. The output is a dashboard, not a control signal.

The perception-action gap

In a venue environment, detection without action is incomplete. Identifying a player is not the same as framing them in a broadcast shot. Detecting an intrusion is not the same as directing a camera to track the intruder. The gap between seeing and acting is where operational value lives.

Graph-driven control

OZ's robotic capture systems do not run on raw detections. They run on Spatial Graph queries. When the graph updates with a new entity position, the control system evaluates:

  • Priority rules: which entity matters most right now
  • Zone policies: what capture behavior applies in this area
  • Mechanical constraints: what the physical actuator can reach
  • Transition cost: how disruptive the camera move would be

The result is a control decision, not a detection report.

Deterministic under real-time constraints

Robotic capture at venue scale demands deterministic timing. A PTZ camera that receives a cueing command 200ms late produces a missed shot, not a delayed shot. The control loop runs at the edge with fixed-budget computation. Every cycle completes within its time window or escalates.
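A fixed-budget cycle of this kind can be sketched as a deadline check around the control computation. The 5 ms budget and the escalation hook below are illustrative assumptions, not OZ's real numbers or interfaces.

```python
import time

CYCLE_BUDGET_S = 0.005  # assumed budget: every cycle must finish in this window

def run_cycle(compute_command, escalate):
    """Run one control cycle; escalate instead of emitting a late command."""
    deadline = time.monotonic() + CYCLE_BUDGET_S
    command = compute_command()
    if time.monotonic() > deadline:
        # A late cueing command produces a missed shot, not a delayed shot,
        # so an overrun is surfaced rather than acted on.
        escalate(command)
        return None
    return command

cmd = run_cycle(lambda: {"pan": 30.0}, escalate=lambda c: None)
print(cmd)  # completed within budget → {'pan': 30.0}
```

The key design choice is that an overrun is treated as a distinct outcome, never silently delivered as a slow command.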

From one camera to many#

The Spatial Graph makes multi-camera coordination possible. When one camera tracks wide, another can zoom tight, not because a human operator made the call, but because the graph encodes the spatial relationships that make the decision obvious.
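One way to picture that coordination is role assignment driven by shared spatial state: the camera whose coverage contains the target zone takes the tight shot, the rest go wide. The zone and camera names below are hypothetical, and the coverage map stands in for the relationships the graph would encode.

```python
def assign_roles(cameras: list[str], target_zone: str,
                 coverage: dict[str, set[str]]) -> dict[str, str]:
    """Assign 'tight' to one camera that covers the target zone, 'wide' to
    the rest, so no operator call is needed once coverage is known."""
    roles = {}
    tight_taken = False
    for cam in cameras:
        if not tight_taken and target_zone in coverage[cam]:
            roles[cam] = "tight"
            tight_taken = True
        else:
            roles[cam] = "wide"
    return roles

coverage = {"cam-1": {"penalty-box", "goal"}, "cam-2": {"midfield"}}
print(assign_roles(["cam-1", "cam-2"], "goal", coverage))
# → {'cam-1': 'tight', 'cam-2': 'wide'}
```

Because every camera reads the same graph state, each arrives at the same role assignment independently; the decision is obvious rather than negotiated.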

This is what spatial robotics means at OZ: the graph sees, the control system decides, and the hardware acts, all within the latency budget that the venue demands.