Autonomous Systems

Every autonomous system is, at its core, a question of state: what is known, when it was known, and how much to trust it. A robot navigating a warehouse, a UAV fleet maintaining formation, a self-driving vehicle making a lane-change decision — each requires a continuous, spatially precise, temporally accurate, confidence-scored store of what is actually happening.

Neural world models are the simulation tier — generative engines that anticipate how the world evolves under actions. ArcFlow is the persistence tier — the operational world model that stores what actually happened, at what confidence, from which sensor, queryable at any sequence checkpoint.

Neural world models simulate. ArcFlow records. Autonomous systems need both.

The infrastructure problem#

Most autonomous system architectures cobble together five separate systems:

  • A spatial data store for positions and geometry
  • A time-series database for sensor history
  • A graph or relational database for entity relationships
  • An ML model store for confidence scores
  • A message broker for real-time updates

Each boundary between these systems introduces latency, consistency risk, and operational complexity. A single query that crosses systems — "find all robots within 20 meters of a high-confidence obstacle, sorted by their last confirmed status" — requires a join across at least three of them.

ArcFlow collapses this stack into one in-process engine.
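To make the boundary cost concrete, here is the cross-system query above as a plain-TypeScript sketch over hypothetical in-memory records (the `Robot`/`Obstacle` shapes and field names are illustrative, not the ArcFlow API). It shows the single filter-and-sort pass that a unified engine can perform, which a split stack must assemble from three stores:

```typescript
// Hypothetical in-memory shapes mirroring the graph's node properties.
interface Obstacle { id: string; x: number; y: number; confidence: number }
interface Robot { id: string; x: number; y: number; lastStatusAt: number; status: string }

const dist = (a: { x: number; y: number }, b: { x: number; y: number }) =>
  Math.hypot(a.x - b.x, a.y - b.y)

// "Find all robots within 20 meters of a high-confidence obstacle,
//  sorted by their last confirmed status."
function robotsNearHighConfidenceObstacles(
  robots: Robot[], obstacles: Obstacle[], radiusM = 20, minConf = 0.85,
): Robot[] {
  const trusted = obstacles.filter(o => o.confidence >= minConf)
  return robots
    .filter(r => trusted.some(o => dist(r, o) <= radiusM))
    .sort((a, b) => b.lastStatusAt - a.lastStatusAt)
}
```

In the split stack, `trusted` comes from the ML store, positions from the spatial store, and `lastStatusAt` from the time-series store; each hop is a network round trip and a consistency window.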

The data model#

-- Physical entities with spatial position and observation class
CREATE (r1:Robot {
  id: 'ROBOT-01',
  x: 12.4, y: 8.7, z: 0.0,
  vx: 0.5, vy: 0.0, vz: 0.0,
  status: 'navigating',
  battery_pct: 87,
  _observation_class: 'observed',
  _confidence: 0.99
})
 
-- Obstacles — observed vs predicted
CREATE (o1:Obstacle {
  id: 'OBS-001',
  x: 18.0, y: 8.5, z: 0.0,
  type: 'static',
  _observation_class: 'observed',
  _confidence: 0.98
})
 
CREATE (o2:Obstacle {
  id: 'OBS-002',
  x: 25.0, y: 12.0, z: 0.0,
  type: 'dynamic',
  _observation_class: 'predicted',
  _confidence: 0.42
})
 
-- Detection edges carry sensor provenance
CREATE (r1)-[:DETECTS {
  sensor: 'lidar',
  range_m: 8.3,
  _confidence: 0.98,
  at: timestamp()
}]->(o1)
 
-- Fleet coordination
CREATE (f:Fleet {id: 'FLEET-A', formation: 'line'})
CREATE (r1)-[:MEMBER_OF {role: 'lead', position: 1}]->(f)
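The `_observation_class` / `_confidence` pair is the convention that carries through every example below. A minimal TypeScript sketch of that convention, assuming the three class values shown in this page ('observed', 'predicted', 'inferred'); the guard function and its threshold are illustrative, not part of the ArcFlow API:

```typescript
// Metadata convention used by the nodes and edges above.
type ObservationClass = 'observed' | 'predicted' | 'inferred'

interface WorldFact {
  _observation_class: ObservationClass
  _confidence: number // 0.0 to 1.0
}

// Only act on facts that were directly sensed with high confidence.
const actionable = (f: WorldFact, minConf = 0.85) =>
  f._observation_class === 'observed' && f._confidence >= minConf
```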

Spatial queries — what is around me?#

-- All entities within 15m of Robot-01 (ArcFlow Spatial Index backed)
CALL algo.nearestNodes(point({x: 12.4, y: 8.7}), 'Obstacle', 10)
  YIELD node AS obs, distance
WHERE distance < 15.0
RETURN obs.id, obs.type, obs._observation_class, obs._confidence, distance
ORDER BY distance
 
-- High-confidence obstacles only — filter out predictions
CALL algo.nearestNodes(point({x: 12.4, y: 8.7}), 'Obstacle', 10)
  YIELD node AS obs, distance
WHERE distance < 15.0
  AND obs._observation_class = 'observed'
  AND obs._confidence > 0.85
RETURN obs.id, distance
 
-- Line-of-sight check between two entities
CALL arcflow.scene.lineOfSight(robot_id, obstacle_id)
  YIELD has_los, note
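As a plain-TypeScript sketch of what the two nearest-neighbour queries above compute (illustrative only; the real call is answered by ArcFlow's spatial index, not a linear scan):

```typescript
interface Pt { x: number; y: number }
interface SpatialNode extends Pt { id: string; observed: boolean; confidence: number }

const dist = (a: Pt, b: Pt) => Math.hypot(a.x - b.x, a.y - b.y)

// k closest nodes to a point, with their distances.
function nearestNodes(origin: Pt, nodes: SpatialNode[], k: number) {
  return nodes
    .map(n => ({ node: n, distance: dist(origin, n) }))
    .sort((a, b) => a.distance - b.distance)
    .slice(0, k)
}

// Equivalent of the second query: within 15 m, observed, confidence > 0.85.
const highConfidenceNearby = (origin: Pt, nodes: SpatialNode[]) =>
  nearestNodes(origin, nodes, 10)
    .filter(({ node, distance }) =>
      distance < 15 && node.observed && node.confidence > 0.85)
```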

Temporal queries — what changed?#

-- Where was this robot at a recent checkpoint?
MATCH (r:Robot {id: 'ROBOT-01'}) AS OF seq 800
RETURN r.x, r.y, r.status
 
-- Snapshot of all robot positions at a given sequence checkpoint
MATCH (r:Robot) AS OF seq 1000
RETURN r.id, r.x, r.y, r.status
 
-- Detect if any robot has been stationary for more than 60 seconds
MATCH (r:Robot)
WHERE r.status = 'stopped'
  AND r._updated_at < (timestamp() - 60000)
RETURN r.id, r.x, r.y, r._updated_at
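The semantics behind AS OF seq N can be sketched in a few lines of TypeScript: keep each fact's history as (seq, value) entries and answer a read with the latest entry at or before the checkpoint. This is a toy model of the idea, not ArcFlow's storage format:

```typescript
// One versioned entry per write, ordered or not.
interface Versioned<T> { seq: number; value: T }

// Latest value at or before the given sequence checkpoint.
function asOf<T>(history: Versioned<T>[], seq: number): T | undefined {
  let latest: Versioned<T> | undefined
  for (const v of history) {
    if (v.seq <= seq && (!latest || v.seq > latest.seq)) latest = v
  }
  return latest?.value
}
```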

Confidence-filtered decision making#

-- Only navigate toward zones where all obstacles are high-confidence
MATCH (target:Zone {id: $target_zone_id})
MATCH (obs:Obstacle)
WHERE obs._observation_class IN ['observed', 'inferred']
  AND obs._confidence > 0.7
  AND distance(obs.position, target.position) < 20.0
RETURN count(obs) AS blocking_obstacles
 
-- Confidence-weighted PageRank to find the most "trusted" nodes in the sensor graph
CALL algo.confidencePageRank()
YIELD nodeId, score
RETURN nodeId, score ORDER BY score DESC LIMIT 10
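One way to picture a confidence-weighted PageRank is ordinary power iteration in which each edge's contribution is scaled by its `_confidence`. The sketch below illustrates that idea in plain TypeScript; the built-in procedure's exact weighting and convergence criteria are not specified here:

```typescript
interface Edge { from: string; to: string; confidence: number }

// Power iteration with edge weights proportional to confidence.
function confidencePageRank(
  nodes: string[], edges: Edge[], damping = 0.85, iters = 50,
): Map<string, number> {
  const n = nodes.length
  let rank = new Map(nodes.map(id => [id, 1 / n]))
  for (let i = 0; i < iters; i++) {
    const next = new Map(nodes.map(id => [id, (1 - damping) / n]))
    for (const node of nodes) {
      const out = edges.filter(e => e.from === node)
      const totalW = out.reduce((s, e) => s + e.confidence, 0)
      const r = rank.get(node)!
      if (totalW === 0) {
        // Dangling node: spread its rank uniformly.
        for (const id of nodes) next.set(id, next.get(id)! + damping * r / n)
      } else {
        for (const e of out)
          next.set(e.to, next.get(e.to)! + damping * r * (e.confidence / totalW))
      }
    }
    rank = next
  }
  return rank
}
```

A node reached mostly through low-confidence detections ends up with a low score even if it has many in-edges, which is the property that makes the ranking useful for trust.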

Live monitoring — always current#

import { open } from 'arcflow'
 
const db = open('./data/world-model')
 
// Alert when any robot enters a restricted zone
const zoneMonitor = db.subscribe(
  `MATCH (r:Robot)-[:IN_ZONE]->(z:Zone {restricted: true})
   WHERE r._observation_class = 'observed'
   RETURN r.id, z.id, r.x, r.y`,
  (event) => {
    for (const row of event.added) {
      triggerAlert(`Robot ${row.get('r.id')} entered restricted zone ${row.get('z.id')}`)
      db.mutate(`MATCH (r:Robot {id: $id}) SET r.status = 'halted'`, { id: row.get('r.id') })
    }
  }
)
 
// Live view: fleet health dashboard (auto-maintained, zero-cost reads)
db.mutate(`
  CREATE LIVE VIEW fleet_health AS
  MATCH (r:Robot)
  RETURN r.id, r.status, r.battery_pct, r._confidence
  ORDER BY r.battery_pct ASC
`)
 
const health = db.query("MATCH (row) FROM VIEW fleet_health RETURN row")
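Conceptually, a standing query like the one above re-evaluates against current state and emits the difference from the previous result set. A minimal sketch of that added/removed diff, as a plain-TypeScript model rather than the engine's incremental implementation:

```typescript
interface RobotRow { id: string; zone: string }

// Diff two result sets by row identity to produce subscription events.
function diffResults(prev: RobotRow[], curr: RobotRow[]) {
  const prevIds = new Set(prev.map(r => r.id))
  const currIds = new Set(curr.map(r => r.id))
  return {
    added: curr.filter(r => !prevIds.has(r.id)),
    removed: prev.filter(r => !currIds.has(r.id)),
  }
}
```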

Multi-robot coordination#

-- Fleet members sorted by their position in formation
MATCH (r:Robot)-[m:MEMBER_OF]->(f:Fleet {id: 'FLEET-A'})
RETURN r.id, r.x, r.y, m.position
ORDER BY m.position
 
-- Find robots that have lost contact (no detection events in last 10 seconds)
MATCH (r:Robot)
WHERE NOT EXISTS {
  MATCH (r)-[d:DETECTS]->()
  WHERE d.at > (timestamp() - 10000)
}
RETURN r.id, r.status
 
-- Shortest path between two robots through the navigable graph
MATCH p = shortestPath(
  (a:Robot {id: 'ROBOT-01'})-[:NAVIGABLE*]->(b:Robot {id: 'ROBOT-05'})
)
RETURN length(p), nodes(p)
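For an unweighted navigable graph, the shortest-path query above amounts to a breadth-first search. A self-contained TypeScript sketch of that semantics over an adjacency list (the waypoint names are made up for illustration):

```typescript
// BFS shortest path from start to goal; returns the node sequence or null.
function shortestPath(
  adj: Map<string, string[]>, start: string, goal: string,
): string[] | null {
  const prev = new Map<string, string>()
  const seen = new Set([start])
  const queue = [start]
  while (queue.length) {
    const node = queue.shift()!
    if (node === goal) {
      // Walk the predecessor chain back to the start.
      const path = [goal]
      while (path[0] !== start) path.unshift(prev.get(path[0])!)
      return path
    }
    for (const next of adj.get(node) ?? []) {
      if (!seen.has(next)) {
        seen.add(next)
        prev.set(next, node)
        queue.push(next)
      }
    }
  }
  return null
}
```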

OpenUSD / physics integration#

For robotic systems that use USD-based scene graphs (simulators, digital twin platforms):

-- Export the world model as USD scene description
CALL arcflow.scene.toUsda() YIELD usda
 
-- Resolve USD prim path to graph node
CALL arcflow.scene.primId('/World/Robots/ROBOT-01') YIELD prim_path, prim_id
 
-- Collision contacts from physics simulation
CALL arcflow.scene.collisions(robot_id) YIELD from_id, to_id, impulse, at_time
 
-- Neighborhood in robot's local coordinate space
CALL arcflow.scene.queryInLocalSpace(robot_id, 10.0)
  YIELD node_id, local_x, local_y, local_z
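The idea behind a local-space query is a world-to-local transform: translate by the robot's position, then rotate by its heading. A 2-D sketch under assumed conventions (yaw in radians, counter-clockwise positive); the actual procedure's frame conventions are not specified here:

```typescript
interface Pose { x: number; y: number; yaw: number }

// Express a world-space point in the robot's local frame.
function toLocal(p: { x: number; y: number }, robot: Pose) {
  const dx = p.x - robot.x
  const dy = p.y - robot.y
  const c = Math.cos(-robot.yaw)
  const s = Math.sin(-robot.yaw)
  return { x: dx * c - dy * s, y: dx * s + dy * c }
}
```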

Why one world model instead of five systems#

The operational world model layer is the piece most autonomous stacks are missing. Neural world models handle simulation. ArcFlow handles persistence — collapsing what used to be five separate systems into one in-process engine.

| Capability | Traditional stack | ArcFlow |
| --- | --- | --- |
| Neural model outputs | Application code, separate store | Store as _observation_class: 'predicted' edges |
| Spatial proximity | Separate spatial DB + query | CALL algo.nearestNodes(...) |
| Entity history | Time-series DB + join | AS OF seq N on same graph |
| Confidence scoring | Application logic | _confidence on every fact |
| Observation class | Not modeled | _observation_class built-in |
| Live alerts | Message broker + CDC | db.subscribe() in-process |
| Graph algorithms | External tool | CALL algo.pageRank() built-in |
| Fleet relationships | Relational DB + joins | First-class edges with properties |
| USD integration | Separate scene graph | arcflow.scene.* procedures |

See Also#

  • World Model — the two-layer framing: neural world models simulate, ArcFlow records
  • Building a World Model — step-by-step with spatial, temporal, and confidence
  • Grounded Neural Objects — lifting neural world model detections into persistent entities
  • Spatial Queries — ArcFlow Spatial Index, frustum, line-of-sight
  • Temporal Queries — AS OF seq N, replay, comparison
  • Live Queries — standing queries and live views
  • Programs — declare sensor fusion pipelines and perception programs as installable manifests with GPU hardware validation
  • Triggers — fire a skill automatically when a new sensor frame or detection node arrives
  • Digital Twins — live mirror of physical systems