# CLI
arcflow adds world-model intelligence to the command line. Just as grep searches text and find searches files, arcflow searches, transforms, and reasons over operational world models — at filesystem speed, with structured JSON output that pipes into any workflow.
```shell
# Create a graph, query it, get JSON — one line
echo 'MATCH (n) RETURN count(*)' | arcflow --data-dir ./graph --json

# Pipe results to jq
arcflow --data-dir ./graph --exec schema.cypher --json | jq '.rows[].name'

# Watch a directory — auto-execute queries as agents write them
arcflow --data-dir ./graph --watch queries/ --output-dir results/
```

For AI coding agents, arcflow is a tool in the same category as grep, find, ls, and jq: a command-line primitive that processes data and returns structured results. The agent doesn't need MCP, WebSocket, or any protocol. It calls a CLI command and reads the output. Zero overhead.
## Agent Tooling
### As Fast as grep
An agent calling arcflow has the same performance characteristics as calling grep or find:
```shell
# grep searches text
grep -r "class AuthService" src/

# arcflow searches a world model
arcflow --data-dir ./graph --exec "MATCH (c:Crate)-[:DEPENDS_ON]->(d) RETURN c.name, d.name" --json
```

Both are single-process CLI calls. Both return structured output. Both complete in milliseconds. No server startup. No connection handshake. No token overhead.
### Structured Output for Agents
Every command supports --json for machine-readable output:
```shell
arcflow --data-dir ./graph --exec "MATCH (n) RETURN count(*)" --json
```

```json
{"columns":["count(*)"],"rows":[{"count(*)":"10234"}],"count":1}
```

Deterministic exit codes:

- `0` — success
- `1` — query error (syntax, missing label)
- `2` — system error (file not found, permission denied)
- `3` — validation error (constraint violation)
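A script or agent can branch directly on these codes. A minimal sketch of the pattern; the arcflow call is simulated with `sh -c 'exit 1'` so the snippet runs even where arcflow isn't installed:

```shell
# Stand-in for: arcflow --data-dir ./graph --exec "MATCH (n) RETURN n" --json
sh -c 'exit 1'    # simulated query error (exit code 1)
case $? in
  0) verdict="success" ;;
  1) verdict="query error: fix the Cypher and retry" ;;
  2) verdict="system error: check paths and permissions" ;;
  3) verdict="validation error: a constraint was violated" ;;
esac
echo "$verdict"
```

Because the codes are deterministic, the same `case` works identically in CI scripts and in agent tool wrappers.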
Structured errors with recovery hints:
```json
{"ok":false,"code":"PARSE_FAILED","message":"Unknown keyword 'METCH'","recovery_suggestion":"Did you mean MATCH?"}
```

### Batch Execution
Execute an entire directory of queries in one call:
```shell
arcflow --data-dir ./graph --exec-dir queries/ --output-dir results/ --json
```

One process. N queries. N result files. The agent writes .cypher files, calls the CLI once, reads .json results.
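The round trip looks like the sketch below. The query file names are illustrative, and the arcflow call itself is commented out so the sketch runs without arcflow installed:

```shell
# Agent writes query files...
mkdir -p queries results
printf 'MATCH (n) RETURN count(*)\n'     > queries/count.cypher
printf 'MATCH (c:Crate) RETURN c.name\n' > queries/crates.cypher

# ...one CLI call executes all of them:
# arcflow --data-dir ./graph --exec-dir queries/ --output-dir results/ --json

# ...then the agent reads one .json result per .cypher query:
ls queries/
```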
### Watch Mode
For interactive agent sessions, the CLI watches a directory and auto-executes queries as they appear:
```shell
arcflow --data-dir ./graph --watch queries/ --output-dir results/
```

The agent writes a file → the result appears within milliseconds. No polling. No protocol.
## Interactive REPL
For human developers, arcflow starts an interactive REPL:
```shell
arcflow --data-dir ./my-graph
```

Commands:
- `:help` — command reference
- `:status` — node/relationship counts
- `:schema` — labels, types, indexes
- `:demo` — load sample graph
- `:bench` — run performance benchmark
- `:dump` — export graph as JSON
- `:quit` — exit
See REPL Commands for the full command reference.
## Server Modes
Add a flag to serve the same engine over the network:
```shell
arcflow --data-dir ./graph --pg 5432      # PostgreSQL wire protocol
arcflow --data-dir ./graph --serve 7687   # TCP
arcflow --data-dir ./graph --http 8080    # REST API
arcflow --data-dir ./graph --ws 7688      # WebSocket subscriptions
arcflow-mcp --data-dir ./graph            # MCP (cloud chat UIs)
```

Run all servers concurrently on the same graph store:

```shell
arcflow --pg 5432 --serve 7687 --http 8080 --data-dir ./graph
```

Same engine. Same queries. Same data. The CLI is the primary interface; server modes are for when you need network access. The --pg flag lets any PostgreSQL client (psql, pgAdmin, Grafana, psycopg2, node-postgres) connect and run WorldCypher queries transparently.
### Authentication
The HTTP and WebSocket servers support JWT authentication:
```shell
arcflow --http 8080 --auth jwt --jwt-secret my-secret --jwt-issuer https://auth.example.com
```

API key auth for simple cases:

```shell
arcflow --http 8080 --api-key my-api-key
```

### Multi-Workspace Tenancy
Serve isolated graph stores for multiple tenants in one process:
```shell
arcflow --http 8080 --multi-workspace 50
```

Each tenant gets their own graph store, WAL, and standing queries at /workspaces/{id}/. Zero data leakage between tenants. Create workspaces via the HTTP API or the --multi-workspace flag to pre-allocate slots.
## Validation
Dry-run query validation without executing mutations:
```shell
arcflow --validate "MATCH (n:Person) RETURN n.name"
# → {"ok": true, "plan": "NodeScan > Filter > Return"}

arcflow --validate "METCH (n) RETURN n"
# → {"ok": false, "code": "PARSE_FAILED", "suggestion": "Did you mean MATCH?"}
```

Useful in CI to validate query files before deployment.
## Flywheel Suite
Evidence-based scoring and gating for agent workflows:
```shell
# Execute queries and collect timing + evidence artifacts
arcflow run queries/pagerank.cypher queries/community.cypher

# Score results with percentile analysis
arcflow score queries/pagerank.cypher

# Gate with pass/fail verdict (use in CI)
arcflow gate queries/critical.cypher
# Exit code 0 = pass, 1 = fail

# Compare against a baseline
arcflow compare queries/pagerank.cypher --baseline results/baseline.json
```

The flywheel suite is designed for AI agents that need to verify their own outputs — run a query, score it against historical evidence, gate on quality thresholds.
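Since the gate signals its verdict through the exit code, wiring it into CI is a single conditional. A sketch with the gate call simulated so it runs without arcflow:

```shell
# Stand-in for: arcflow gate queries/critical.cypher
# (exits 0 on pass, 1 on fail; simulated pass here)
if sh -c 'exit 0'; then
  result="gate passed: safe to deploy"
else
  result="gate failed: blocking deploy"
fi
echo "$result"
```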
## GPU Commands
```shell
arcflow gpu status    # Show available GPU backends (CUDA, Metal)
arcflow gpu install   # Download GPU acceleration libraries
```

## Sync Commands
```shell
arcflow sync status     # Pending mutations and high-water mark
arcflow sync push       # Export pending mutations as JSON to stdout
arcflow sync pull       # Import mutations from stdin JSON
arcflow sync snapshot   # Export full graph snapshot to stdout
arcflow sync restore    # Import snapshot from stdin
arcflow sync health     # Mesh node status
arcflow sync cloud      # Push + pull via ArcFlow Cloud (requires login)
```

## Agent Context Introspection
Full engine state dump for AI agents:
```shell
arcflow agent-context synth --json
```

Returns all labels, relationship types, algorithm capabilities, observation classes, clock domains, and available procedures in a single structured JSON. Agents use this to understand the schema and available operations without running individual CALL db.* queries.
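An agent might cache that payload and pull out just what it needs with jq. The payload shape below is a trimmed guess for illustration; inspect the real `arcflow agent-context synth --json` output for the actual key names:

```shell
# Hypothetical trimmed synth payload; the real output has more fields.
ctx='{"labels":["Person","Crate"],"relationship_types":["DEPENDS_ON"]}'
labels=$(echo "$ctx" | jq -r '.labels[]')
echo "$labels"
```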
## Authentication
```shell
arcflow login                 # Authenticate with oz.com/world
arcflow login --token TOKEN   # Headless / CI login
arcflow logout                # Clear stored credentials
arcflow whoami                # Show current identity
arcflow status                # Account + engine status
```

## Self-Update
ArcFlow can update itself:
```shell
arcflow upgrade           # Check for updates and install
arcflow upgrade --check   # Check only, don't install
```

The REPL also checks for updates in the background on startup (cached for 24h). Opt out with ARCFLOW_NO_UPDATE_CHECK=1.
Updates use atomic binary replacement with automatic rollback on failure.
## Sections
- REPL Commands — interactive query session and meta-commands
- Snapshot & Restore — graph persistence and migration
## See Also
- Agent-Native Database — why CLI-first beats MCP for shell agents; filesystem workspace patterns
- ArcFlow for Coding Agents — structured errors, batch execution, checkpointing
- Installation — binary, npm, Docker, and WASM install paths