As lockdown restrictions start to ease around the world, one of the key consequences of the COVID-19 pandemic is coming to light: its role as an accelerant. Certain trends within the industry were already well under way, such as the growth of remote, collaborative workflows in post production and remote contribution for live events. But COVID-19 has very rapidly moved these from being a nice option to an absolute necessity, and next on the list is virtual production.
While post production has been able to continue almost without pause during lockdown, production itself is only now starting to ramp up again. The rules and regulations surrounding it vary from country to country, changing all the time in response to wider events, with concerns about a possible second wave never far from most people's thoughts. As a result, virtual production is very much a technology whose time has come.
Some of the technology driving it is complex, but the concept is fairly simple: Virtual production allows filmmakers to plan, imagine or complete any or all parts of their production in the digital arena. Whether that's blocking out scenes, pre-visualizing shoots, capturing motion, using LED screens for live compositing work, or using realtime rendering to show a director and DP how real and CG elements will interact, it's all virtual production. Some productions have effectively gone all in on the virtual production methodology, with the traditional film or episodic shoot now playing only a small part in their workflows, while others use only selected elements of it.
This is where COVID-19's role as an accelerant comes in. The driving forces behind the uptake of virtual production were always the powerful motors of flexibility and efficiency: effectively, you can do more and do it cheaper, quicker and, indeed, better once you have committed to a virtual or semi-virtual production pipeline. COVID-19 has added further impetus as productions suddenly realize they need to keep their footprint low, reduce location work wherever they can, and ensure that stages run safely while remaining cost-effective.
What is even more exciting is that virtual production is getting better all the time. Epic Games has already been demonstrating the realtime power of next year's Unreal Engine 5 release, which will add greater realtime capabilities and even better image quality to the mix, bringing the real and the virtual worlds ever closer to indistinguishable. Already we can work much quicker than before. Using Ncam, directors can see exactly how their actors are positioned in the virtual space and quickly iterate and adjust scenes.
We've been doing this for a while. On Solo: A Star Wars Story (2018), Nviz (then Nvizage) used Ncam to visualize VFX shots through the lens, showing actors where graphic elements would appear, which sped up both production and post and helped the animation team reposition objects on the fly as director Ron Howard iterated scenes. On The Nutcracker and the Four Realms (2018), Ncam was used to place markers on the set, guiding where in a complex VFX landscape an actor was pointing.
"Without the Ncam system, we would have had to guesstimate where Philip needed to point and then retrofit our world to match," explains VFX supervisor Max Wood. "Not only could we see the shots in realtime, but we saved a large amount of post production time in not having to adjust our world for one shot."
This is just the start, though. Take the LED screen technique (used at its apogee by The Mandalorian, which filmed on a set enclosed by specially constructed 21-foot by 75-foot LED walls and a ceiling, displaying backdrops created in Unreal Engine), which affords a huge degree of flexibility. We are no longer looking through the lens or at a realtime composite on a monitor; we are shooting live actors against the CG backdrop. Very few have the Disney-level budget to scale things up like that, but more conventional, smaller LED setups can still deliver the same level of quality on smaller stages. That allows productions to eliminate a good percentage (in some cases, probably all) of their location work, bringing a host of benefits: saving transport costs, minimizing environmental impact, speeding up rig/derig and shooting time, and even simply making things feasible during the pandemic.
The next generation of software will make that process even more instinctive and intuitive, and will be good enough that footage captured in-camera will not need to be rerendered in post.
Of course, it's a technique that doesn't have to be reserved for genre programming either. While it's always tempting to think CG = space or fantasy, the current crop of game engines such as Unreal can create convincing sets of pretty much anything, indoor or outdoor. Using photogrammetry, real-life locations can be captured and recreated in exquisite detail, even down to their lighting, which spills from the LED screens to partially light the physical set when shooting starts.
From an apartment set for a sitcom to an Edwardian London street, this is a production technique that is going to dramatically change the economics of producing for film and television over the coming years.
In the live space, virtual sets are slowly starting to dominate live television, as the economic argument they make has been matched by the quality of the end product. We first showed Unreal Engine working with our data at NAB 2016, and now we see it used on a daily basis by tier one broadcasters to create virtual sets around the world. We have one customer in Buenos Aires producing eight different programs a day, all with different sets, from the same physical space. Switchover takes place in the advert break at the top of the hour.
In the UK, Sky Sports uses our equipment to ensure that it has the same virtual studio environment at over 90 different venues for its Premier League coverage. Whatever the size and shape of the physical space at the venue, the virtual studio is always consistent, and the whole thing can be set up in under an hour.
Our technology was also used by Sony Innovation Studios to create a virtual version of the Shark Tank set for Sony Pictures Television when pressure at the lot in Culver City meant there wasn't enough space for the show's usual two stages. The virtual set was used for filming around 100 exit interviews and was indistinguishable from the real-world one, to the extent that crew members in the production truck often forgot they weren't monitoring a real set.
The same arguments that have driven the adoption of virtual sets — flexibility, cost savings, creativity — are now being urgently examined for film and television production. They are compelling enough that the shift was always likely to happen; as with remote post production workflows, COVID-19 has simply brought the necessity forward by a couple of years.
Nic Hatch is the CEO of Ncam Technologies (www.ncam-tech.com), a London-based developer of realtime visual effects (RVFX) solutions. Ncam has a US office in Santa Monica, CA.