It’s Not About the Wall: Why NAB 2026’s Biggest VP Story Validates What We’ve Always Believed

The virtual production industry is finally admitting that LED volumes aren’t the only answer. Here’s why that matters — and what we think comes next.


For the past few years, the conversation around virtual production has been dominated by one thing: LED walls. Big ones. The bigger the better, apparently. If your stage didn’t have a curved LED volume the size of a small aircraft hangar, you weren’t doing “real” VP — or so the narrative went.

This week at NAB 2026 in Las Vegas, something shifted. And we think it’s worth talking about.

Projection is back — and it’s serious

CarbonBlack Technology and a coalition of broadcast partners — including Christie, Vizrt, Disguise, and WePlay Studios — unveiled what they’re calling the first multi-camera projection-based virtual production solution. That’s a mouthful, but the core idea is straightforward: use projection instead of LED panels to create in-camera backgrounds, and do it well enough to support multiple cameras simultaneously.

CarbonBlack’s secret is a nanotechnology-based screen surface called Hybrid RP, which delivers deep blacks and controlled light behaviour without the pixel structure you’d normally see from a projector at close range. Its headline breakthrough is ambient light absorption, tackling projection’s perennial Achilles heel: turn on a key light and the background washes out. CarbonBlack claims Hybrid RP solves that.
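To make the wash-out problem concrete, here’s a back-of-envelope contrast calculation. The luminance figures are our own illustrative assumptions, not CarbonBlack specs; the point is how quickly reflected ambient light flattens projected blacks, and how much a light-absorbing surface buys you.

```python
# Back-of-envelope: why ambient light kills projected blacks, and why a
# light-absorbing screen surface helps. All figures are illustrative
# assumptions, not CarbonBlack specs.

def on_screen_contrast(projector_white_nits: float,
                       projector_black_nits: float,
                       ambient_reflected_nits: float) -> float:
    """Perceived contrast is everything the camera sees off the screen:
    projected light plus reflected ambient light."""
    white = projector_white_nits + ambient_reflected_nits
    black = projector_black_nits + ambient_reflected_nits
    return white / black

# A standard matte screen might bounce back ~20 nits of a key light's
# spill; a strongly light-absorbing surface might return a tenth of that.
for label, reflected in [("matte white screen", 20.0),
                         ("light-absorbing screen", 2.0)]:
    ratio = on_screen_contrast(projector_white_nits=300.0,
                               projector_black_nits=0.3,
                               ambient_reflected_nits=reflected)
    print(f"{label}: {ratio:.0f}:1")

# matte white screen: 16:1    -> blacks read as grey on camera
# light-absorbing screen: 131:1 -> blacks survive the key light
```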

We haven’t seen it in person yet, so we’ll reserve full judgement. But the fact that a serious group of industry players is demonstrating a projection-based VP workflow at NAB — not as a concept, but as a working multi-camera setup — tells us something important about where the industry’s head is at.

The LED monoculture is cracking

Here’s what interests us most: this isn’t just about projection versus LED. It’s about the industry finally acknowledging that different productions need different tools.

An LED volume is brilliant for certain things. Full 360-degree environment wraps for wide establishing shots. Interactive lighting that reflects naturally off talent and surfaces. There’s a reason The Mandalorian looked the way it did — that production had the budget, the time, and the Unreal Engine expertise to make an LED volume sing.

But for a huge number of productions — the ones that don’t have ILM’s pipeline and a multi-million-pound budget — a volume introduces complexity and cost that can actually slow things down. You need LED processing, colour calibration across hundreds of panels, genlock synchronisation, moiré management, and a team of LED technicians alongside your VP operators. For a talking-head corporate shoot or a mid-budget drama, that’s often overkill.

At vedrí, we made a deliberate choice to build our pipeline around real-time green screen compositing with Unreal Engine rather than an LED volume. Our Mo-Sys StarTracker Max handles camera tracking, feeding positional data into Unreal so the virtual environment moves precisely with the physical camera. The composite happens in real time — the director sees a finished shot on the monitor, not a green void. We get the same creative feedback loop that makes VP powerful, without the overhead of an LED wall.
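For the curious, here’s roughly what that hand-off looks like. This is a minimal conceptual sketch in Python: the packet fields, class names, and sensor assumption are ours for illustration, not the actual StarTracker protocol or Unreal’s Live Link API.

```python
# Conceptual sketch of the tracking -> engine hand-off in a green screen
# VP pipeline. Field names and classes are illustrative only, not the
# Mo-Sys StarTracker protocol or the Unreal Live Link API.
import math
from dataclasses import dataclass

@dataclass
class TrackingPacket:
    timecode: str                # genlocked timecode, e.g. "14:32:07:12"
    position: tuple              # camera position in studio space (metres)
    rotation: tuple              # pan / tilt / roll (degrees)
    focal_length_mm: float       # lens data, so the virtual FOV matches the glass
    focus_distance_m: float

def focal_length_to_fov(focal_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal field of view from focal length (full-frame width assumed)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

class VirtualCamera:
    """Stand-in for the engine-side camera that the tracking data drives."""
    def apply(self, pkt: TrackingPacket) -> None:
        # In production this update happens inside Unreal every frame; the
        # point is that position, rotation, AND lens data all move together.
        self.position = pkt.position
        self.rotation = pkt.rotation
        self.fov_degrees = focal_length_to_fov(pkt.focal_length_mm)
        self.focus_m = pkt.focus_distance_m

# Per-frame loop: latest packet in, virtual camera updated, environment
# rendered from that exact perspective, then keyed talent composited on top.
```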

We’re not saying our approach is better in every scenario. We’re saying it’s the right approach for a lot more scenarios than the industry has been willing to admit.

The real magic is in the tracking and the engine

Here’s what CarbonBlack’s demo, our own green screen pipeline, and the best LED volume stages all have in common: the underlying technology that actually makes VP work has nothing to do with what’s behind the talent.

It’s camera tracking. It’s Unreal Engine. It’s the real-time render pipeline that generates perspective-correct imagery at 24fps (or 50, or 60) with sub-frame latency. Whether that render ends up on an LED wall, a projection screen, or composited over a keyed green screen, the creative result depends on the same fundamentals — accurate tracking data, well-built Unreal environments, and operators who understand the pipeline.
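To put a number on “sub-frame latency”: whatever the delivery frame rate, the frame period is the hard ceiling the whole chain has to fit inside.

```python
# "Sub-frame latency" in concrete terms: the frame period is the ceiling
# for the entire tracking -> render -> composite chain.
for fps in (24, 50, 60):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps: budget per frame = {frame_ms:.1f} ms")

# 24 fps: budget per frame = 41.7 ms
# 50 fps: budget per frame = 20.0 ms
# 60 fps: budget per frame = 16.7 ms
# Blow the budget and the virtual background visibly lags the camera move.
```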

Unreal Engine 5.7, which shipped late last year, reinforces this. The new Live Link Broadcast Component lets Unreal itself act as a source of animation data across your network, opening the door to multi-machine VP workflows where you can offload retargeting or secondary processing to another Editor session and broadcast the results back. For studios running lean — like us — that’s meaningful. It means you can distribute the compute load without needing a rack of Disguise servers.
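We haven’t built against the new component yet, so the sketch below shows only the underlying idea in generic Python: one machine multicasts per-frame data onto the network and any number of sessions subscribe. The group address, port, and payload shape are our own placeholders, not Unreal’s API.

```python
# The networking idea behind multi-machine VP workflows, reduced to its
# core: one machine multicasts per-frame animation data, others subscribe
# and do the heavy work. A generic UDP sketch of the concept only; not
# Unreal's Live Link Broadcast Component API.
import json
import socket

GROUP, PORT = "239.1.1.50", 5005  # placeholder multicast group and port

def broadcast_frame(sock: socket.socket, frame: dict) -> None:
    """Push one frame of animation data onto the multicast group."""
    sock.sendto(json.dumps(frame).encode(), (GROUP, PORT))

def make_listener() -> socket.socket:
    """Join the group; any machine on the network can run this."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = socket.inet_aton(GROUP) + socket.inet_aton("0.0.0.0")
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Sender (the session that owns the source data):
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
broadcast_frame(sender, {"subject": "MetaHuman_01", "frame": 1042,
                         "bones": {"head": [0.0, 1.62, 0.0]}})

# A second Editor session (or any other machine) would run make_listener(),
# consume the frames, do retargeting or secondary processing, and broadcast
# its results back the same way.
```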

The updated Composure toolset in 5.7 also improves shadow and reflection handling during real-time compositing, which directly benefits green screen VP workflows: better edge quality and more accurate light interaction between real and virtual elements. These aren’t headline features, but they’re the kind of incremental improvements that make a visible difference in the final output.
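To see where “edge quality” actually lives, here’s a deliberately simplified distance-based chroma key. It’s a toy illustration of what any keyer computes per pixel, not Composure’s implementation; the key colour and thresholds are arbitrary.

```python
# A simplified distance-based chroma key, to show where "edge quality"
# lives in the maths. Toy illustration only, not Composure's implementation.
import numpy as np

def chroma_key_alpha(rgb: np.ndarray,
                     key_color=(0.1, 0.8, 0.2),
                     tolerance: float = 0.25,
                     softness: float = 0.15) -> np.ndarray:
    """Alpha is 0 where a pixel matches the key colour, 1 where it clearly
    doesn't, with a soft ramp in between (the edge region)."""
    dist = np.linalg.norm(rgb - np.asarray(key_color), axis=-1)
    # Pixels within `tolerance` of the key colour are fully keyed out; the
    # `softness` band is where hair, motion blur, and semi-transparent
    # edges get graded -- exactly where keyer quality is won or lost.
    return np.clip((dist - tolerance) / softness, 0.0, 1.0)

# Composite per pixel: out = fg * alpha + bg * (1 - alpha)
frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in camera frame
alpha = chroma_key_alpha(frame)
```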

What this means for clients

If you’re a filmmaker, agency, or brand thinking about virtual production for an upcoming project, the takeaway from NAB 2026 is encouraging: you have more options than ever, and the technology is maturing across all of them.

The question isn’t “should we use an LED volume?” anymore. The question is: “what does this specific production need?” Sometimes that’s a volume. Sometimes it’s projection. Often — especially for projects that need flexibility, fast turnaround, and cost-effectiveness — it’s real-time green screen compositing.

At our studio in Gwynedd, we run through this decision with every client who walks through the door. We look at the shot list, the environments, the budget, the timeline, and we recommend the approach that actually serves the project. Sometimes we tell people they’d be better off on an LED stage. More often, they’re surprised at what we can achieve with our Blackmagic Pyxis 6K, StarTracker Max, and a well-lit green screen.

What’s next

We’re watching the CarbonBlack projection approach closely — if the ambient light handling is genuinely as good as they claim, it opens up interesting possibilities for hybrid setups where you combine projected backgrounds with tracked green screen elements. We’re also keen to see how Disguise’s new AI plugin framework develops, particularly the LTX integration that generates 4K content from text prompts. That could dramatically speed up environment pre-visualisation, even for studios that don’t use Disguise servers in their final pipeline.

The VP industry is growing up. It’s moving past the “one tool for everything” phase into something more nuanced, more practical, and — honestly — more interesting. We’re glad to be part of that conversation.

If you’re planning a production and want to figure out which VP approach makes sense, book a studio visit — we’ll walk you through the options and show you what our pipeline can do in person.


This Fortnight in VP

CarbonBlack and partners unveil multi-camera projection-based VP at NAB 2026 — The first serious projection-based alternative to LED volumes for broadcast VP. CarbonBlack’s nanotechnology screen surface is claimed to eliminate the ambient light problem that’s held projection back. Worth watching closely.

Disguise previews Sony XYN and LTX AI plugins at NAB 2026 — Sony’s XYN plugin converts real spaces into 3D CG for LED backgrounds, while LTX’s AI model generates synchronised 4K/60fps content from text prompts. AI-assisted content creation is arriving in the VP pipeline faster than most expected.

Blackmagic announces DaVinci Resolve 21 and URSA Cine 12K LF 100G — Blackmagic’s NAB lineup is enormous this year, but the 100G Ethernet push across cameras, switchers, and recorders signals a real shift toward IP-based live production infrastructure. Also notable: the URSA Cine Immersive 100G for Apple Vision Pro live production.

VP Gathering 2026 wraps in Breda with “Leaders of the Craft” theme — This year’s gathering focused on mastery and creative leadership in VP, with live stage demos and talks covering Gaussian splatting, AI integration, and motion capture workflows. The event continues to be one of the best VP-focused gatherings in Europe.

Unreal Engine 5.7’s Live Link Broadcast Component enables multi-machine VP workflows — UE can now act as a source of animation data across your network, letting studios distribute compute across multiple Editor sessions. The updated Composure tool also improves real-time compositing quality for green screen pipelines.
