An RTS panel predicts that virtual production will have a profound impact on how TV shows are made.
Technologies have always been central to the evolution of the programme-making process, but the advent of virtual production has put a revolution on our hands.
Virtual production augments TV and film sets with walls of LED displays that show a virtual location, whether that’s a faraway planet (as in The Mandalorian) or the tennis courts of Wimbledon 2021 (apparently overlooked by the BBC’s virtual studio).
However, virtual production is vastly more than an upgrade of the familiar green or blue screens that have long been used to capture live action that can later be combined with location footage or computer-generated imagery. The screen imagery can match the camera’s viewpoint in real time and also provide suitable lighting for the real-world set.
The all-encompassing experience of the LED surround screens has a big impact on many aspects of the production process, as a panel of industry-leading practitioners discussed at a recent RTS event.
“We’ve all seen things that claimed to be a revolution but, in fact, they just made one department’s life a lot better,” said Ian Milham, the virtual production supervisor at Industrial Light & Magic. The company helped to make Disney+’s The Mandalorian, a pioneering programme in this field.
“This is one of those [developments] where, if it is used correctly, it’s a win for everyone,” he said. “For anyone who has done green-screen or blue-screen work, you don’t realise how much work you’ve been doing on what I’ve been calling the ‘imagination tax’. In many ways, having everything really be present is just a tremendous relief for everyone.” Milham’s “imagination tax” becomes evident when the cameras roll in a studio set surrounded by a wall of LED screens as opposed to one hung with an inert fabric green screen.
Now, actors can see, moment by moment, what they are interacting with, and lighting directors do not have to guess exactly where the light sources in a scene should be.
Milham added: “We’re currently doing a show with Ewan McGregor, who is reprising [the role of] Obi-Wan Kenobi, whom he played on a big, blue-screen set nearly two decades ago. Now he’s playing him again on one of these [virtual] sets.
“You can see his joy in the comparison: he has already talked about how much better [the virtual set] is for him. For special effects, you can actually add fog and smoke. Lighters can add real light. With costumes, [in the past] maybe they’d want to avoid something reflective. Now: go for it. If anything, it makes the whole thing better.”
The reflective armour of The Mandalorian’s central character naturally picked up real reflections from the images on the surrounding video wall. In a green-screen studio, by contrast, that armour would have picked up green patches, requiring laborious post-production work to stop the background leaking through.
Milham described the collaboration needed from all departments as a “sea change”, because, “rather than input virtual effects at the end of the process, it’s [done] right at the start, and each department can influence the look and feel of the production”.
The technology began life in the computer games industry, where development and game-play platforms such as Unreal Engine and Unity matured until they were ready to be repurposed for the film and television industry. Now, stepping into a “volume” – the programmable space in which filming takes place – is akin to stepping out on location, but with a fraction of the effort.
Neil Graham, head of virtual production at Sky Studios, said the new technology supported a number of the company’s strategic goals: financially de-risking projects through advance planning, reducing productions’ carbon footprint, improving efficiency on set, and allowing previsualisation to happen in real time and collaboratively. “So it gets [us] away from [the traditional] linear process, and that drives innovation,” he said.
Previsualisation allows the various departments to understand a director’s intent well before principal photography starts. But being able to stand inside a variety of virtual locations or “sets” before any construction takes place now allows the director and department heads to discuss the implications of each other’s creative ideas at a very early stage.
There is a cost-saving aspect as well. “Part of our metric for success is that we need to be able to bring the location to the stage cheaper than to take the crew to the location,” said Milham. “There’s [less] physical construction, less moving in. We’ve seen the same crew get up to 30% faster on a volume stage than on a traditional stage.”
Steve Jelley, co-founder of Dimension, a cutting-edge virtual production studio, gave an example of how it works in practice. It recently filmed Fireworks, a new indie short from two-time Oscar-winning VFX supervisor Paul Franklin, who won for Inception (2010) and Interstellar (2014).
“You can use its benefits for drama at any budget. It is still expensive to rent a large volume, or certainly to build one, but, when you start to combine it with other virtual production techniques, it works pretty well,” he said. “For example, I’m looking at doing the big wide shots on blue screen using SimulCam [which superimposes real-world actors and objects on to a virtual world].
“Then I’m flying in LED panels for the medium close-up work. So you can use this stuff in combination. Well planned, it can be faster and better, and you can make it stack up.”
The elephant in the Zoom room was addressed: does this mean crew are at risk of losing their jobs?
“I think the fear is the disruption and change, which is undeniable. It doesn’t really change the core thing we’re doing,” said Milham. “If you’re a gaffer or someone who is used to working on a set, [you might think,] ‘Here comes this guy with his computers who thinks he’s going to take over’. Well, we are still dealing with the physics, emotion and craft of lighting something.
“You might be lighting with an iPad instead of a physical instrument, but we also still use physical instruments. There are all kinds of exciting possibilities, so someone who can navigate the tricky waters of that change can find a version of their job that they like more.”
Kate Gray, the head of product management at NTT Data (“part of NTT, which is most likely the largest tech company that no one’s ever heard of”), explained that, with all tech developments, human adoption is at the heart of its success. “Everyone that goes on that set, and everyone that I know that’s ever worked on a digital transformation, is doing so for the benefit of their customers, the business, or that particular piece of beautiful creativity. So, yeah, we’re humans and make things messy. But at the same time, humans can come together and make great things.”
Looking ahead, Sky Studios is focusing the application of virtual production on its drama slate, before moving on to comedy, entertainment and the arts.
Its potential will grow as the technology continues to develop and the range of already-created volumes increases, which is likely to make rental prices more affordable. “It feels to me like virtual production is still in its infancy. There’s been amazing work over the past couple of years but the tools are getting better all the time,” Graham said. “So it does feel like that impact will grow exponentially.”
The best news is that, as far as tech goes, it is easy to dip a toe in and explore the possibilities. “You can literally replicate the brain bar [the informal term for the virtual production team on a project] by using Unreal Engine, a PC and some TVs,” said Jelley. “It’s very accessible, much more so than most film-making equipment… There’s really no barrier to entry.”
Report by Shilpa Ganatra. The RTS event ‘TV’s production revolution: The rise and rise of virtual production’ was held on 28 June. It was chaired by journalist Kate Russell.