New architecture aims to merge real-time media workflows with AI processing on a single, high-performance network
As the industry continues its steady march from SDI to IP, a new conversation is quickly taking shape across sports-media engineering circles: what comes after ST 2110?
For Cisco, the answer lies in what it calls AI-driven media fabrics — a next-generation infrastructure model that brings together traditional broadcast workflows and real-time AI processing on the same network. In a newly published blog post, the company outlines its vision for how live production environments must evolve to support the growing demands of modern content creation, personalization, and distribution.
MORE: Evolve IP Media to AI-Driven Media Fabrics: Future-Proof Broadcast with Cisco and NVIDIA
At the center of that vision is a familiar foundation: IP, though in a new role. No longer just a transport layer for video and audio signals, the network is becoming an active participant in production.
“High-performance infrastructure is key towards an AI-driven media infrastructure,” says Murali Gandluru, VP, Data Center Networking, Cisco. “What’s happening right now is those two worlds — media and AI — are converging.”
From SDI Replacement to Workflow Transformation
Cisco has been working in the media space for nearly a decade, launching its IP Fabric for Media (IPFM) platform in 2016 as broadcasters began transitioning away from SDI-based environments. Initially, the focus was straightforward: replace legacy routing and switching with commercially available IP infrastructure. But the opportunity quickly expanded beyond simple replacement.
“We didn’t just throw a switch at broadcast customers and say, ‘Go use it,’” says Sunil Gudurvalmiki, VP of Product Management for Nexus, Cisco. “We realized the workloads — the sensitivity, the quality, the jitter, latency — all of these are so important that we had to pay extra attention to make sure no packets get dropped and timing is synced properly.”
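The timing sync Gudurvalmiki refers to comes from PTP (IEEE 1588), which ST 2110 networks use via SMPTE ST 2059 to keep every device on a common clock. As a rough illustration of what the network has to get right, here is a minimal sketch of the standard two-step PTP exchange that lets a device estimate its clock offset from the grandmaster (timestamp names follow the standard; the values in the usage example are invented):

```python
def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Estimate clock offset and path delay from one PTP exchange.

    t1: master sends Sync          (master clock)
    t2: device receives Sync       (device clock)
    t3: device sends Delay_Req     (device clock)
    t4: master receives Delay_Req  (master clock)
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2  # device clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way delay, assumed symmetric
    return offset, delay

if __name__ == "__main__":
    # Invented timestamps in nanoseconds: device is 1 ms ahead of the
    # master, and the network path adds 2 ms each way.
    offset, delay = ptp_offset_and_delay(0, 3_000_000, 10_000_000, 11_000_000)
    print(offset, delay)  # 1000000.0 2000000.0
```

The symmetry assumption is why PTP-aware switches matter: any asymmetric queuing delay shows up directly as clock error, which is exactly the kind of jitter sensitivity the Cisco team describes.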
That attention has helped Cisco’s IPFM platform gain traction across major sports broadcasters and venues, underpinning everything from global events like the Olympics and FIFA competitions to large-scale venues like The Sphere in Las Vegas.
More importantly, the move to IP has unlocked new ways of thinking about production workflows.
“The kind of sharing that can happen with different master control rooms is phenomenal,” Gudurvalmiki says. “It’s transformational.”
Bringing AI Into the Fabric
Now, with AI rapidly entering the live-production environment, Cisco and NVIDIA are looking to build on that IP foundation by enabling media and AI workloads to coexist on the same infrastructure. A key piece of that effort is the Media Exchange Layer (MXL), an emerging framework designed to standardize how video, audio, and metadata are shared between traditional broadcast systems and AI-powered applications.
“AI is coming into live broadcasting, and like it or not, customers are going to embrace it,” Gudurvalmiki says. “On the same fabric, on the same network, you can run uncompressed ST 2110 workflows and also run AI applications.”
In practical terms, that means AI engines can access live media streams in real time — without separate processing pipelines — enabling new production capabilities to be layered directly into the workflow.
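MXL's actual interfaces are still emerging, but the general idea of letting an AI process read a live frame buffer in place, rather than receiving a copy through a separate pipeline, can be sketched with ordinary shared memory. The segment name, frame dimensions, and "analysis" below are all invented for illustration and are not the MXL API:

```python
# Hypothetical sketch: a producer (standing in for the broadcast chain)
# publishes an uncompressed frame into named shared memory, and an "AI"
# consumer attaches to the same buffer instead of using its own pipeline.
from multiprocessing import shared_memory

WIDTH, HEIGHT, CHANNELS = 64, 36, 3  # tiny made-up frame for the demo

def publish_frame(name: str, pixel: int) -> shared_memory.SharedMemory:
    """Producer: write one frame into a named shared-memory segment."""
    shm = shared_memory.SharedMemory(
        create=True, name=name, size=WIDTH * HEIGHT * CHANNELS
    )
    shm.buf[:] = bytes([pixel]) * len(shm.buf)  # stand-in for real pixels
    return shm

def ai_consumer(name: str) -> float:
    """Consumer: attach to the same segment and compute mean brightness."""
    shm = shared_memory.SharedMemory(name=name)
    mean = sum(shm.buf) / len(shm.buf)  # trivial stand-in for AI analysis
    shm.close()
    return mean

if __name__ == "__main__":
    seg = publish_frame("live_frame_demo", pixel=128)
    try:
        print(ai_consumer("live_frame_demo"))  # reads the producer's buffer in place
    finally:
        seg.close()
        seg.unlink()
```

The design point is the single copy of the media in memory: the consumer sees the producer's buffer directly, which is what makes layering AI tools into a live workflow feasible at uncompressed ST 2110 data rates.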
Examples range from automated graphics and replay enhancements to more advanced use cases like real-time language translation. In one demo, AI tools powered by NVIDIA’s Holoscan platform translated live audio into multiple languages while also adjusting video to match lip movements for each feed.
“You no longer feel that disconnect between audio and video,” Gudurvalmiki says. “It’s such a neatly done experience.”
Ready Today — and Still Evolving
While the vision is ambitious, Cisco says key elements of this architecture are already in place, particularly on the networking side. The bigger challenge lies at the application layer, where vendors are still developing AI tools that can fully take advantage of frameworks like MXL.
“From a networking standpoint, we are ready,” Gudurvalmiki says. “We have the capacities, the systems, and the software built for it. We’re just getting started on this journey. This is the first iteration — there will be multiple evolutions.”
That reflects the broader state of AI in live sports production: rapid innovation, growing interest, but still early in terms of scaled deployment.
“In the past year, the conversation around MXL has probably grown 100%,” Gudurvalmiki notes.
Perhaps unsurprisingly for broadcasters, both executives point to cultural change, not the technology itself, as the biggest obstacle in this story. For decades, broadcast engineers have operated in highly specialized environments. Moving to IP — and now AI-enabled systems — requires new workflows and new ways of thinking.
“You’ll be surprised — tech is not the biggest challenge,” Gudurvalmiki says. “The biggest obstacle is the mindset shift. We had to learn how to speak their terminology. It’s about bridging that gap.”
At the same time, changing audience expectations are helping accelerate that shift.
“The expectation that audiences have around events and how they engage while games are happening has already been transformed,” says Gandluru. “It’s going to be transformed even more.”
What It Means for Sports Production
For sports broadcasters, the implications are significant — even if the full impact is still taking shape.
In the near term, this architecture could accelerate trends already underway, including more scalable REMI workflows, an increase in alternate broadcasts, and deeper integration of real-time data into live coverage.
Longer term, the combination of IP infrastructure and AI processing could enable entirely new forms of storytelling — from dynamically generated content to personalized viewing experiences tailored to individual fans.
Still, Cisco’s message is clear: regardless of which applications ultimately emerge, they will all depend on a robust infrastructure foundation.
“It doesn’t matter which innovation wins,” a Cisco representative notes. “They will all rely on foundational infrastructure to make it come to life.”
With NAB Show 2026 approaching, Cisco and NVIDIA are expected to showcase early implementations of these concepts, offering a clearer look at how AI-driven media fabrics may begin to take shape in real-world production environments.
For now, the industry remains in what Gudurvalmiki describes as an “early adopter” phase — but one moving quickly.
“We are already seeing this come to life with customers,” he says. “Now we’re working closely with these customers and others to layer AI-driven workflows on top of that infrastructure.”