Walk into the production compound of a major touring concert, a large-scale corporate conference, or a televised awards ceremony and you may find two video control rooms operating simultaneously, each with its own director, its own switcher, its own monitoring stack — and almost no direct communication between them except through the show’s overall technical director. This is not redundancy or organizational confusion; it is the considered operational response to a fundamental creative and technical truth: broadcast television and live IMAG (image magnification) are different art forms that serve different audiences and require different aesthetic philosophies.
The Aesthetic Philosophy Divide
An IMAG director serves the live audience in the room. Their cuts follow the energy of the performance in real time, providing wide establishing shots when the audience needs spatial context, tight close-ups when the performer’s facial expression is the story, and two-shots or reaction shots that mirror what the unaided eye would naturally seek in the moment. IMAG cutting is reactive, fast, and emotionally intuitive. It is often done without the musical knowledge and timing practice that broadcast directors develop over years — and that’s acceptable because the live audience is forgiving. They’re in the room. They’re experiencing the energy directly.
A broadcast director serves a television audience watching on screens of varying sizes, in rooms with ambient light competing for attention, potentially tuning in mid-show with no warm-up period. Broadcast cuts follow grammar rules developed over decades of narrative filmmaking — establishing shots before close-ups, matched eye-lines, cutaway timing that serves story rather than impulse. Broadcast cutting is deliberate, architecturally planned, and technically constrained by transmission delays, closed caption requirements, and broadcast standards that have no relevance to the in-room IMAG screen.
The Technical Case for Separation
Beyond the aesthetic argument, there is a compelling technical case for separating broadcast and IMAG operations. The signal chains are different: IMAG typically operates at 1080i or 1080p over SDI from cameras positioned for room sightlines, while broadcast may require 4K HDR capture, super-slow-motion cameras for replay, RF camera systems on the stage, and a full graphics engine with lower thirds, score bugs, and commercial break transitions. Managing both signal chains from a single production infrastructure creates complexity that compromises both.
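To make the divergence concrete, the two chains can be sketched as parallel specifications. The formats and device names below are illustrative examples, not a real show spec:

```python
# Illustrative sketch of two divergent signal chains for the same event.
# All device names and format choices here are hypothetical examples.

imag_chain = {
    "capture": "1080p over SDI",
    "cameras": ["FOH long lens", "pit handheld"],
    "graphics": None,  # IMAG screens typically carry camera cuts only
}

broadcast_chain = {
    "capture": "4K HDR",
    "cameras": ["FOH long lens", "pit handheld", "super-slo-mo", "RF handheld"],
    "graphics": ["lower thirds", "score bug", "break transitions"],
}

# The chains may share camera positions but little else:
shared_positions = set(imag_chain["cameras"]) & set(broadcast_chain["cameras"])
print(sorted(shared_positions))
```

Even this toy comparison shows why a single infrastructure struggles: the overlap is limited to camera positions, while capture format and downstream graphics diverge entirely.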
Productions at the scale of NFL Super Bowl halftime shows, Grammy Awards, or UEFA Champions League final ceremonies regularly deploy separate OB (outside broadcast) trucks for the television production while the live event’s IMAG is served by an entirely separate in-house video system. The television production company — Done + Dusted, Fulwell 73, or NEP Group — operates independently of the venue’s technical team, sharing access to camera positions and show call information but operating their own production infrastructure.
Camera Sharing Protocols
The point of interface between separate broadcast and IMAG teams is typically the camera sharing protocol — an agreement about which cameras each team has primary access to, and how conflicts are resolved when both teams want the same camera simultaneously. Common approaches include dedicated camera pools (broadcast cameras and IMAG cameras are entirely separate, with no sharing), signal distribution (camera outputs are split and fed to both production centers simultaneously, with each team using or ignoring any camera as needed), and negotiated primary/secondary access (each camera has a designated primary director, with the secondary team taking the feed on a courtesy basis).
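The negotiated primary/secondary model described above can be sketched as a small access-rights table. This is a minimal illustration of the agreement's logic, not any real production tool; the camera names and team labels are hypothetical:

```python
# Sketch of negotiated primary/secondary camera access: each camera has a
# designated primary director, and the secondary team takes the feed on a
# courtesy basis. Names are illustrative, not from any real show.
from dataclasses import dataclass
from enum import Enum


class Team(Enum):
    BROADCAST = "broadcast"
    IMAG = "imag"


@dataclass
class Camera:
    name: str
    primary: Team  # the team whose director may call framing and moves


def can_direct(camera: Camera, team: Team) -> bool:
    """Only the primary team directs the camera; the secondary team
    may take the feed as framed but issues no direction."""
    return team == camera.primary


cameras = [
    Camera("Cam 1 (FOH long lens)", primary=Team.BROADCAST),
    Camera("Cam 4 (pit handheld)", primary=Team.IMAG),
]

for cam in cameras:
    role = "directs" if can_direct(cam, Team.IMAG) else "courtesy feed only"
    print(f"IMAG on {cam.name}: {role}")
```

The point of the model is what it leaves out: there is no arbitration logic, because conflicts are resolved in advance by the designation, not in real time during the show.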
Signal distribution is the most common approach at scale, enabled by multi-destination DA systems or IP router matrices from companies like Imagine Communications, Ross Video, or Grass Valley. Each team receives all available camera signals and selects from them independently — the broadcast director cuts a slow-motion replay insert while the IMAG director stays on the live performance feed, with no coordination required.
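The fan-out behavior of the DA model can be sketched in a few lines. This is a conceptual illustration only, assuming hypothetical feed names; it is not tied to any real router control API:

```python
# Sketch of DA-style signal distribution: every camera output is split to
# every production center, and each team selects independently, with no
# shared state and no arbitration. Feed names are hypothetical.

camera_feeds = {
    "cam1": "wide master",
    "cam2": "tight vocalist",
    "cam3": "super-slo-mo pit",
}


def distribute(feeds: dict[str, str], destinations: list[str]) -> dict[str, dict[str, str]]:
    """Fan every source out to every destination; each destination
    receives its own complete, independent copy of the pool."""
    return {dest: dict(feeds) for dest in destinations}


routed = distribute(camera_feeds, ["broadcast_truck", "imag_control"])

# Simultaneous, uncoordinated selections from the same pool:
broadcast_pgm = routed["broadcast_truck"]["cam3"]  # replay insert
imag_pgm = routed["imag_control"]["cam2"]          # stays on the live performance
```

The design choice this mirrors is that coordination cost is paid once, in the distribution layer, rather than continuously at every cut.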
Communication Structures Between Teams
When separate teams share a show, the communication architecture must define how they interact without creating operational dependency that defeats the purpose of separation. The standard model is a technical director or show caller who sits outside both production centers and manages show flow — communicating segment timings, talent movements, and technical changes to both teams simultaneously via a shared comms channel or an IFB system that patches through to both directors.
Both teams share the show script and rundown but execute against it independently. The broadcast director may hold a wide shot through a segment transition to provide clean content for the network cut while the IMAG director cuts to a tight reaction shot at the same moment. These simultaneous but different editorial decisions are possible only because the teams are not trying to coordinate at the level of individual cuts — only at the level of show segments and major transitions.
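The comms architecture above amounts to a one-to-many broadcast of segment-level cues, with each room deciding independently how to act. A minimal sketch, assuming hypothetical room and cue names:

```python
# Sketch of the shared-cue model: a show caller publishes each segment cue
# to both control rooms at once; each room interprets it independently.
# There is deliberately no channel between the rooms themselves.
from typing import Callable

class ShowCaller:
    def __init__(self) -> None:
        self._rooms: list[Callable[[str], None]] = []

    def patch_in(self, room: Callable[[str], None]) -> None:
        """Add a control room to the shared comms channel."""
        self._rooms.append(room)

    def call(self, cue: str) -> None:
        """Send one cue to every patched-in room simultaneously."""
        for room in self._rooms:
            room(cue)

log: list[str] = []
caller = ShowCaller()
# The same cue produces different editorial responses in each room:
caller.patch_in(lambda cue: log.append(f"broadcast: hold wide through {cue}"))
caller.patch_in(lambda cue: log.append(f"imag: cut tight reaction on {cue}"))
caller.call("segment 4 transition")
```

The structure captures why separation works: the only shared dependency is the cue itself, so neither room can stall or constrain the other at the level of individual cuts.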
When Single-Team Operation Is the Right Choice
Separate broadcast and IMAG teams are not always the right answer. Corporate events, smaller concert productions, and institutional events with modest broadcast requirements are often best served by a single, experienced video director and technical director managing all camera outputs with a clearly designed switching matrix. The overhead of two production teams — two directors, two switcher operators, duplicated monitoring infrastructure — is justified only when the scale and complexity of both the live event and the broadcast output demand independent optimization. Understanding where that threshold is — and having the production management experience to recommend the right structure — is one of the marks of a mature AV production company.