From Raw Perception to Structured Intelligence: The Data Asset Every Venue Generates
Raw perception data isn't a product. Cameras capture pixels, sensors emit signals, and AI models produce detections, but none of that is useful until it becomes structured, versioned, and queryable.
From detections to a graph
The Spatial Graph is OZ's structured output layer. Every entity detected at a venue (a person, a ball, a vehicle, a zone boundary) becomes a node in a temporal graph with typed relationships, spatial coordinates, and provenance metadata.
This isn't a database dump. It's a live, versioned data structure that carries:
- Entity identity: persistent tracking across cameras and time
- Spatial coordinates: real-world position, not pixel coordinates
- Temporal edges: relationships between entities across time
- Provenance: which sensor, which model, which confidence level
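The four properties above can be pictured as fields on a node and edge type. This is a minimal sketch in Python, not OZ's actual schema; all names (`GraphNode`, `TemporalEdge`, the field names) are illustrative assumptions:

```python
from dataclasses import dataclass

# Hypothetical sketch of what a Spatial Graph node might carry.
# Field names are illustrative, not OZ's published schema.
@dataclass
class GraphNode:
    entity_id: str        # persistent identity across cameras and time
    entity_type: str      # e.g. "person", "ball", "vehicle", "zone"
    position: tuple       # real-world coordinates (meters), not pixels
    timestamp: float      # observation time (epoch seconds)
    sensor_id: str        # provenance: which sensor produced this
    model_version: str    # provenance: which model made the detection
    confidence: float     # detection confidence in [0, 1]

@dataclass
class TemporalEdge:
    source_id: str
    target_id: str
    relation: str         # typed relationship, e.g. "entered_zone"
    start_time: float
    end_time: float

node = GraphNode("person-42", "person", (12.4, 3.1, 0.0),
                 1700000000.0, "cam-north-03", "detector-v2.1", 0.94)
```

The point of bundling provenance into every node, rather than keeping it in a side table, is that any downstream answer can be traced back to the sensor and model that produced it.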
Why a graph, not a table
Tabular data works for aggregation. Graphs work for relationships. When a broadcaster asks "show me the player nearest to the ball," that is a graph query. When a security operator asks "trace this person's path across three zones," that is a graph traversal.
The Spatial Graph makes these queries first-class operations, not post-hoc analytics.
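Both example questions from above can be sketched as small graph operations. This is a toy illustration with invented node and edge data, not the Spatial Graph's query interface:

```python
import math

# Toy graph: nodes keyed by id, with type and 2D position.
nodes = {
    "ball-1":   {"type": "ball",   "pos": (10.0, 5.0)},
    "player-7": {"type": "person", "pos": (11.0, 5.5)},
    "player-9": {"type": "person", "pos": (30.0, 2.0)},
}

def nearest(graph, to_id, entity_type):
    """'Show me the player nearest to the ball' as a graph query."""
    tx, ty = graph[to_id]["pos"]
    candidates = (
        (math.hypot(n["pos"][0] - tx, n["pos"][1] - ty), nid)
        for nid, n in graph.items()
        if n["type"] == entity_type and nid != to_id
    )
    return min(candidates)[1]

# Timestamped zone-entry edges for one tracked person.
edges = [("person-3", "zone-A", 100.0),
         ("person-3", "zone-B", 160.0),
         ("person-3", "zone-C", 240.0)]

def trace_path(edge_list, entity_id):
    """'Trace this person's path across three zones' as a traversal."""
    visits = sorted((t, zone) for src, zone, t in edge_list
                    if src == entity_id)
    return [zone for _, zone in visits]

print(nearest(nodes, "ball-1", "person"))  # player-7
print(trace_path(edges, "person-3"))       # ['zone-A', 'zone-B', 'zone-C']
```

In a table, both queries would need joins over positions and timestamps recomputed per question; in a graph, proximity and traversal are the native operations.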
Versioned schema, stable contracts
Every Spatial Graph output follows a versioned schema. Downstream consumers (OZ Studio, third-party analytics, broadcast integrations) build against a stable contract. When the schema evolves, it evolves with backwards compatibility guarantees.
This is what makes the Spatial API possible: a structured, versioned, queryable representation of physical reality that downstream teams can trust.
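A backwards-compatibility guarantee like this is often expressed with semver-style version checks. The sketch below is an assumption about how a consumer might pin against a schema version; the actual versioning scheme is not described in this post:

```python
# Hypothetical semver-style rule: minor bumps only add optional
# fields (backwards compatible); major bumps may break consumers.

def compatible(consumer_version: str, producer_version: str) -> bool:
    c_major, c_minor = (int(x) for x in consumer_version.split("."))
    p_major, p_minor = (int(x) for x in producer_version.split("."))
    # Same major, and the producer is at least as new as what the
    # consumer was built against: nothing it relies on has changed.
    return c_major == p_major and p_minor >= c_minor

payload = {"schema_version": "2.3", "entities": [{"id": "person-1"}]}

assert compatible("2.1", payload["schema_version"])      # additive: OK
assert not compatible("1.4", payload["schema_version"])  # major bump: breaking
```

The design choice this illustrates: consumers validate the contract at the boundary once, instead of defensively inspecting every field on every read.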
The compounding effect
Every venue running an OZ VI Venue deployment contributes to the Spatial Graph corpus. Patterns discovered at one venue transfer to the next. The graph doesn't just describe what happened. It builds institutional knowledge about how physical environments behave.