What It Actually Takes to Make a Stadium Intelligent

Article · March 12, 2026


Twenty metres above the pitch, the wind pulls at your harness. A structural beam sits exactly where the blueprint shows clear space. The power outlet marked on the plan was decommissioned three years ago. This is where Audur Erlingsdottir, OZ's COO, starts every conversation about AI infrastructure. Not in a conference room, but on the gantry, where the drawings never match the reality and mounting brackets have to hold for five seasons through rain, summer heat, and the vibration of 40,000 fans jumping in unison. What follows is her stage-by-stage account of what it actually takes to make a stadium intelligent: from the first site survey to the moment the system produces a live broadcast with no crew, no trucks, and six weatherproof enclosures that a visitor would never notice.

Before Anyone Touches the Venue

The installation begins weeks before anyone arrives on-site. And it begins with a question that has nothing to do with technology.

"The first question we ask is: where does the prime broadcaster want the cameras?" Audur says. "People assume we decide camera positions based on AI optimisation or computational modelling. We do use those tools, but the starting point is always the broadcaster's preference. They have been producing football for decades. They know what their viewers expect to see. We respect that knowledge."

The broadcaster (or the rights holder, depending on the market) typically specifies camera positions that follow a well-established logic. The two most important cameras sit at the centre of the pitch, on the main stand side. These are the cameras that produce the primary match feed: the wide tactical view and the tighter follow shot that tracks the ball.

"The centre cameras need a minimum elevation of six to eight metres above the pitch," Audur explains. "More is better. The higher the camera, the more of the pitch you can see without the players in the foreground blocking the view behind them. Ideally, these cameras sit in what broadcasters call the president's seat, the best centre seat in the house. That position gives you the most natural, balanced view of the game. It is where a spectator with the best ticket would sit, and that is exactly the perspective the viewer at home should have."

Two more cameras sit at the sixteen-metre lines, level with the edge of the penalty area on each side. These are the offside cameras. They provide the angle that referees and VAR systems need to judge offside decisions, and they give the broadcast director a tighter perspective on attacking play in and around the box.

The final two cameras are the goal cameras. They sit behind each goal, elevated, looking back down the pitch, or, depending on the venue's structure, at the opposite corner of the pitch. These cameras capture the moments that matter most: goals, saves, near-misses, the reactions of players and goalkeepers.

"Six cameras, six positions, each one chosen for a reason that goes back to how football has been broadcast for thirty years," Audur says. "We are not reinventing the grammar of sports broadcasting. We are making it possible to deliver that grammar without a crew of twenty."

Camera placement follows the broadcaster's established production language: two centre cameras at the president's seat (minimum 6–8m elevation), two at the sixteen-metre/offside lines, and two goal cameras behind or diagonal to each goal. The positions are familiar to any broadcast professional. The difference is that OZ delivers them with permanently installed, robotically controlled cameras instead of a temporary crew.

The Site Survey

With the camera positions agreed in principle, the site survey turns to the physical reality of the venue. And this is where Audur's operational instinct takes over.

"Every stadium is different," she says. "The drawings never match the reality. There is always a structural beam where the drawing shows clear space. There is always a power outlet that was decommissioned three years ago but never removed from the plan. The site survey exists to reconcile the design with the actual venue."

The survey team walks the full camera path: every mounting location, every cable route, every access point. They assess structural load capacity at each mounting position. Each camera enclosure weighs approximately thirteen kilograms, but the mounting bracket must support significantly more than that, since stadium gantries experience wind loading, vibration from crowd activity, and thermal expansion across seasons. The safety factor is two to three times the static weight.
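That safety factor translates into a simple sizing rule. A minimal sketch in Python, using the figures above (the resulting bracket rating is illustrative arithmetic, not a structural specification):

```python
# Bracket sizing check using the figures from the survey description:
# ~13 kg static enclosure mass, with a 2-3x safety factor to cover wind
# loading, crowd vibration, and thermal expansion.
G = 9.81  # gravitational acceleration, m/s^2

def required_bracket_rating(enclosure_kg: float = 13.0,
                            safety_factor: float = 3.0) -> float:
    """Minimum load rating (newtons) the mounting bracket must carry."""
    return enclosure_kg * G * safety_factor

rating_n = required_bracket_rating()
# 13 kg static is ~128 N; at a 3x factor the bracket should be rated
# for ~383 N, roughly a 39 kg equivalent.
print(f"minimum bracket rating: {rating_n:.0f} N (~{rating_n / G:.0f} kg)")
```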

They trace the cable routes from each camera position back to the head-end location, where the venue's compute infrastructure will live. Every camera connects to the central system via single-mode fibre optic cable and a power feed. The fibre carries the 4K video signal. The power feed runs the camera, the gimbal motors, and the onboard electronics.

"The cable routing is the part that looks boring on paper and causes the most problems in practice," Audur says. "You need protected pathways (conduit, trunking, cable trays) from each camera position back to the head-end. In a modern stadium with cable management infrastructure, that is straightforward. In an older ground that was built before anyone imagined running fibre, you are negotiating with the building itself."

The survey captures everything in OZ Designer, the scene planning tool that models the venue layout digitally before any hardware ships. Camera sight lines are calculated. Blind spots are identified. Coverage overlap between cameras is verified. The output is a deployment blueprint that is version-controlled, peer-reviewed, and locked before the first piece of hardware leaves the warehouse.
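What a blueprint entry might capture can be sketched in code. The structure below is hypothetical, not OZ Designer's actual schema; only the six camera roles and the elevation rule come from the article:

```python
# Hypothetical sketch of camera entries in a deployment blueprint of the
# kind OZ Designer produces. Field names and values are illustrative;
# only the camera roles and elevation rule come from the article.
from dataclasses import dataclass

@dataclass(frozen=True)
class CameraPosition:
    role: str            # "centre-wide", "offside-left", "goal-north", ...
    elevation_m: float   # height above pitch level
    fibre_route: str     # conduit/tray path back to the head-end
    power_circuit: str   # breaker feeding the position

BLUEPRINT = [
    CameraPosition("centre-wide", 8.0, "tray-A/patch-01", "CB-3"),
    CameraPosition("centre-follow", 8.0, "tray-A/patch-02", "CB-3"),
    CameraPosition("offside-left", 7.0, "tray-B/patch-03", "CB-4"),
    CameraPosition("offside-right", 7.0, "tray-C/patch-04", "CB-5"),
    CameraPosition("goal-north", 6.5, "tray-D/patch-05", "CB-6"),
    CameraPosition("goal-south", 6.5, "tray-E/patch-06", "CB-7"),
]

# One check a peer review of the blueprint might enforce: the article's
# six-metre minimum elevation for the centre cameras.
for cam in BLUEPRINT:
    if cam.role.startswith("centre"):
        assert cam.elevation_m >= 6.0, f"{cam.role} below minimum elevation"
```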

The Head-End: Where the Intelligence Lives

The head-end is the room (or sometimes the cabinet) where the OZ VI Venue compute unit lives. It is the brain of the installation, and its location matters more than most people expect.

"We look for a location that is convenient, secure, and thermally manageable," Audur says. "Convenient means accessible for maintenance without disrupting venue operations. Secure means locked, with controlled access. Thermally manageable means the ambient temperature stays between ten and twenty-five degrees, which rules out most outdoor locations and many unventilated utility rooms."

The compute unit is a rack-mounted server with GPU processing power, the same processing power that, in a traditional broadcast setup, would fill an outside broadcast truck. It runs all of OZ's AI models, the real-time production engine, graphics rendering, replay processing, and the contribution gateway that encodes and delivers the final programme feed.

"The ideal location is often where the broadcaster would traditionally park their OB truck," Audur says, and there is a practical elegance to this. "That location already has the infrastructure a broadcast operation needs: power distribution, network connectivity, cable access to the pitch-side positions. And if there is a major event where a full broadcast crew does arrive, they can plug directly into our venue servers and use our camera feeds alongside their own equipment. The OZ VI Venue doesn't replace the OB truck. It makes the OB truck optional for every match, and complementary for the big ones."

The head-end cabinet requires front and rear access, with one metre clearance on each side for airflow and maintenance. Peak power draw is approximately 2.2 kilowatts, with a brief in-rush at power-on. Two independent 230-volt feeds on separate circuit breakers provide redundancy. If one feed fails, the system continues operating on the other.
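Those figures imply a per-feed current budget: either feed must carry the full peak load alone. A minimal sketch (the 2.2-kilowatt and 230-volt figures are from the article; the breaker rating is an illustrative assumption):

```python
# Per-feed current budget for the head-end, using the article's figures:
# ~2.2 kW peak draw on two independent 230 V feeds. Either feed must
# carry the full load alone if the other fails.
PEAK_DRAW_W = 2200.0
FEED_VOLTAGE_V = 230.0

peak_current_a = PEAK_DRAW_W / FEED_VOLTAGE_V  # ~9.6 A per feed at peak

# Illustrative check: a 16 A breaker (an assumption, not a spec from the
# article) leaves headroom for the brief power-on in-rush.
BREAKER_RATING_A = 16.0
assert peak_current_a < BREAKER_RATING_A
print(f"peak draw per feed: {peak_current_a:.1f} A")
```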


The Installation Week

The physical installation takes three to five days on-site. The team arrives with hardware that has been pre-configured, tested, and firmware-updated before it leaves the warehouse. Nothing is assembled for the first time at the venue.

"Day one is the head-end," Audur says. "The compute unit goes into the rack. The network switch, the power distribution, the fibre patch panel. Everything mounts and connects according to the blueprint. By the end of day one, the brain of the system is powered on and talking to our Network Operations Centre."

Days two and three are the cameras. Each of the six camera enclosures is mounted at its surveyed position, aligned to the calculated angle, and connected via fibre and power back to the head-end. The enclosures are weatherproof, sealed units designed for permanent outdoor deployment across every season. No fans, no air intake, no maintenance openings. Each enclosure weighs about thirteen kilograms, fits within a 55 by 65 centimetre footprint, and draws roughly one amp at 230 volts.

"Mounting cameras on a stadium gantry is physical work," Audur says. "You are working at height, often in weather, with safety harnesses and lifting equipment. The installation team are not software engineers. They are riggers, electricians, and cable technicians who understand structural work. We work with local installation partners in each market, people who know the venue, know the local safety regulations, and have the certifications to work at height."

Days four and five are cabling and power verification. Every fibre connection is tested for signal quality. Every power connection is verified against the electrical specification. The network is tested end-to-end, from each camera through the switch to the compute unit and out through the contribution gateway to the delivery network.
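A verification pass of this kind is easy to picture as a checklist script. The sketch below is illustrative, with hypothetical hostnames and a plain ICMP probe standing in for whatever OZ's actual test procedure measures:

```python
# Hypothetical end-of-install verification pass: confirm every camera
# position answers on the venue network before sign-off. Hostnames and
# the plain ICMP probe are illustrative, not OZ's actual procedure.
import subprocess

CAMERAS = [f"cam-{i:02d}.venue.local" for i in range(1, 7)]  # six positions

def reachable(host: str, timeout_s: int = 2) -> bool:
    """Single ping probe; True if the host answers within the timeout."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), host],
        capture_output=True,
    )
    return result.returncode == 0

failures = [cam for cam in CAMERAS if not reachable(cam)]
print("verification FAILED for: " + ", ".join(failures) if failures
      else "all six camera links verified end-to-end")
```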

"By the end of day five, the system is physically complete," Audur says. "Every camera is mounted, every cable is run, every connection is tested. The venue looks almost unchanged: six small enclosures on the gantry, a cabinet in the head-end, and fibre running between them. No trucks. No temporary rigging. No cables across walkways. Permanent infrastructure, like the floodlights."

Commissioning: Where the System Comes Alive

The physical installation is the foundation. Commissioning is where the system becomes intelligent.

"Installation is hardware. Commissioning is knowledge," Audur says. "During commissioning, the system learns the venue, and we verify that it has learned correctly."

Camera calibration comes first. Each camera's exact position, angle, and lens characteristics are mapped into the system's spatial model. The AI needs to know precisely where each camera is and what it can see in order to stitch six separate views into a coherent understanding of the pitch. This calibration is performed by OZ's engineering team, not the installation partners. It requires AI expertise, not rigging skills.
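What "mapping position, angle, and lens characteristics into a spatial model" amounts to can be sketched with a standard pinhole-camera projection. All numbers below are illustrative; OZ's actual spatial model is not public:

```python
# Minimal pinhole-camera sketch of what calibration gives the system:
# with a camera's intrinsics (focal length, principal point) and
# extrinsics (rotation, translation) known, any pitch coordinate can be
# projected into that camera's image. All numbers are illustrative.
import numpy as np

K = np.array([[1800.0, 0.0, 1920.0],   # intrinsics for a 4K sensor:
              [0.0, 1800.0, 1080.0],   # focal lengths, principal point
              [0.0, 0.0, 1.0]])
R = np.eye(3)                          # camera rotation (identity, for brevity)
t = np.array([0.0, 8.0, 30.0])         # camera ~8 m up, 30 m from the spot

def project(point_world: np.ndarray) -> np.ndarray:
    """Project a 3D pitch coordinate (metres) to pixel coordinates."""
    p_cam = R @ point_world + t        # world frame -> camera frame
    p_img = K @ p_cam                  # camera frame -> image plane
    return p_img[:2] / p_img[2]        # perspective divide

print(project(np.array([0.0, 0.0, 0.0])))  # e.g. the centre spot
```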

Then comes tracking validation. The system's AI models are tested against known scenarios: players at marked positions on the pitch, moving at measured speeds, under the venue's specific lighting conditions. Detection accuracy is quantified. Not "it seems to work," but measured against acceptance criteria with specific thresholds.
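A quantified check of that kind might look like the sketch below, comparing detected positions against surveyed ground-truth marks. The thirty-centimetre tolerance and ninety-eight percent threshold are illustrative assumptions, not OZ's published figures:

```python
# Sketch of a quantified tracking check: compare detected player
# positions against surveyed ground-truth marks and fail the run if
# accuracy drops below a threshold. Tolerance and threshold are
# illustrative assumptions.
import math

def detection_accuracy(detections, ground_truth, tolerance_m=0.30):
    """Fraction of ground-truth marks matched within tolerance (metres)."""
    hits = 0
    for gt in ground_truth:
        if any(math.dist(gt, d) <= tolerance_m for d in detections):
            hits += 1
    return hits / len(ground_truth)

gt_marks = [(0.0, 0.0), (16.5, 0.0), (52.5, 34.0)]    # surveyed positions
detected = [(0.1, -0.05), (16.4, 0.2), (52.3, 33.9)]  # model output

accuracy = detection_accuracy(detected, gt_marks)
assert accuracy >= 0.98, f"below acceptance threshold: {accuracy:.1%}"
print(f"detection accuracy: {accuracy:.1%}")
```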

Framing validation follows. The robotic gimbals are tested for their ability to find, frame, and follow subjects smoothly across the full range of motion. The 25x optical zoom is verified at every focal length. The AI's shot selection logic is evaluated against broadcast standards: does it choose the right camera at the right moment? Does the close-up arrive before the action, not after?

"Then we progress through the safety modes," Audur explains. "Twin mode first: the AI runs alongside a reference, observing but not controlling. Shadow mode next: the AI makes decisions, but a human reviews every one before it executes. Live mode last: the AI controls the production, with human override available at any time. You can't skip a mode. The playbook enforces the progression."

The commissioning phase includes two or three acceptance matches: real, live football produced by the system under match conditions. The acceptance criteria are specific: usable broadcast minutes must exceed ninety-five percent of the match. No critical system failures. Fewer than five operator interventions per match. The contribution feed must be accepted by the customer's technical team.

"Pass means pass," Audur says, a phrase she has clearly said many times. "If the acceptance criteria are not met, the system does not go live. We don't negotiate with the criteria. We fix whatever failed and run the acceptance again."

Commissioning progresses through three safety modes (Twin, Shadow, Live) with no skipping. The system must prove itself at each level before advancing. Acceptance requires real matches with quantified criteria: ≥95% usable minutes, zero critical failures, fewer than five operator interventions. Pass means pass.
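The acceptance gate can be expressed as an explicit check. The thresholds below come from the article; the record structure is a hypothetical sketch:

```python
# The acceptance gate from the callout above, as an explicit check.
# Thresholds are from the article; the record structure is hypothetical.
from dataclasses import dataclass

@dataclass
class AcceptanceMatch:
    usable_minutes_pct: float
    critical_failures: int
    operator_interventions: int
    feed_accepted_by_customer: bool

def passes(match: AcceptanceMatch) -> bool:
    return (match.usable_minutes_pct >= 95.0
            and match.critical_failures == 0
            and match.operator_interventions < 5
            and match.feed_accepted_by_customer)

matches = [
    AcceptanceMatch(97.2, 0, 2, True),
    AcceptanceMatch(96.1, 0, 4, True),
]
# Pass means pass: every acceptance match must clear every criterion.
assert all(passes(m) for m in matches)
```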

Go-Live and the First Season

The moment the system passes acceptance, it transitions to operational status. The Network Operations Centre activates continuous monitoring. Every metric (camera health, processing load, thermal performance, network quality, AI accuracy) flows in real time to the NOC team.

"Go-live should be uneventful," Audur says. "If commissioning was thorough, the first live match under full operational status should feel exactly like the acceptance matches. No surprises. That is the standard we hold ourselves to."

The first thirty days are a heightened observation period. Performance baselines are captured: this venue's specific patterns of temperature variation, lighting changes, network behaviour, crowd vibration. These baselines become the reference against which the system monitors itself for the rest of its operational life.

"Every venue has a personality," Audur says, and there is genuine affection in the observation. "One stadium's floodlights create a particular shadow pattern at sunset. Another stadium's gantry vibrates at a specific frequency when the crowd jumps. A third stadium has a network that slows down slightly when the point-of-sale systems process half-time transactions. The system learns these patterns and distinguishes them from actual problems. After thirty days, the system knows its venue."

The first season is where the compounding begins. Every match adds data to the Venue Graph, the structured spatial record of everything that has happened at the venue. Every operational event (every temperature spike, every network fluctuation, every camera recalibration) feeds back into the playbook. The system that was commissioned in August is measurably better by December. Not because someone updated it. Because it learned.

[Image: Hardware preparation at a local inventory workshop ahead of venue deployment]

What Changes for the Venue

Audur pauses before answering the final question: what does a venue look like after an OZ installation?

"From the outside, almost nothing changes," she says. "Six small enclosures on the gantry. A cabinet in a service room. Fibre in the cable trays. A visitor to the stadium would not notice anything different. That is by design. We are infrastructure, not decoration."

"From the inside, from the perspective of the venue operator, the league, the broadcaster, everything changes. Every match is produced. Not just the showcase matches. Every single fixture, at broadcast quality, with structured data, with replays, with graphics. The venue has intelligence. It sees, it understands, it remembers. And every season, it gets better."

"People ask what the installation looks like. The honest answer is: it looks like a stadium. That is the whole point. The technology disappears into the venue. What remains is the capability, and the capability is permanent."


This interview is part of the OZ Interview Series, profiling the team building the world model for the physical world.
