How camera tracking & in-camera VFX fit into VP workflows
Nic Hatch
Issue: January/February 2022

Ever since COVID-19 burst onto the scene, virtual production (VP) has quickly become the industry’s buzzword. Its value had been quietly debated for years, when early VP demos involved little more than replacing a chroma key background with a primitive virtual set; many in the industry were not convinced it would ever take off, and only a handful believed in its true potential. Today, those early visionaries have been proven right.



What is virtual production? 

One challenge we face when discussing ‘virtual production’ is that there is no single definition of what, exactly, VP entails. At this point, it serves as a catch-all phrase for many different techniques and technologies, including complex bespoke workflows, used either before or during production.

When it comes to in-camera virtual production, workflows are starting to take shape in two ways: in-camera visualization, where teams can view previs (proxy or temp VFX) through the lens, and in-camera VFX (ICVFX), where the final shot is captured in-camera in realtime, often with VFX content driving an LED wall as the background. 

Only one thing is true for everyone: Whether you’re visualizing a location through virtual scouting, replacing a greenscreen with a CG environment or shooting in-camera VFX with LED walls, virtual production is the future of creative storytelling.

The pros & cons of ICVFX

Visualizing VFX in-camera enables filmmakers and stakeholders to get on the same page when it comes to VFX-based shots. For realtime visualization, it’s the next level up from previs, enabling actors to be composited onto proxy VFX backgrounds, foreground CG creatures to be composited onto physical backgrounds, and everything in between. From dancing toys to ripping the world apart, the possibilities are endless.



For in-camera VFX, the shot is finished in realtime. Both of these techniques enable more experimentation with lens choice, framing and compositional freedom while mixing the physical and virtual worlds in realtime. Being able to blend the two worlds in-camera, during production, brings a host of benefits — and not just for those witnessing the results in realtime. It also aids dailies, editorial and post in decision making.  

The obvious downside is having to generate the content prior to production, which means a longer, earlier planning stage and obtaining sign-off earlier in the process. And, of course, there is no getting away from the need to bring additional technology onto the set. VP requires major shifts in the typical production sequence, which may take some getting used to. But ultimately, the benefits and efficiencies enabled by these shifts will far outweigh the challenges.

How does camera tracking fit into virtual production?

For any type of on-set virtual production involving in-camera visualization or in-camera VFX (whether on green-/bluescreen, using LED walls, indoors or outdoors), the ability to accurately and robustly track the camera is a prerequisite. That means obtaining the position and rotation of the camera in realtime, known as its six degrees of freedom (6DOF). It is also paramount to understand the lens characteristics, meaning the optics and distortion parameters on any given frame, along with the focus, iris and zoom (FIZ) readings. And this all needs to happen in realtime, meaning at the same speed as the camera, whether 24fps or 60fps. Ncam is synonymous with delivering this requirement day-in, day-out, around the globe.
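To make that concrete, here is a minimal sketch in Python of what a single per-frame tracking sample has to carry, and how little time there is to deliver it at camera speed. The field names and structure are purely illustrative assumptions, not any Ncam data format:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TrackingSample:
    """One per-frame tracking sample; every field name here is illustrative."""
    timecode: str                          # SMPTE timecode, to sync with the camera
    position: Tuple[float, float, float]   # x, y, z: the three translational DOF
    rotation: Tuple[float, float, float]   # pan, tilt, roll: the three rotational DOF
    focus: float                           # FIZ: focus distance
    iris: float                            # FIZ: aperture (f-stop)
    zoom: float                            # FIZ: focal length
    distortion: Tuple[float, ...]          # lens distortion coefficients for this frame

def frame_budget_ms(fps: float) -> float:
    """Time available to deliver and apply each sample at camera speed."""
    return 1000.0 / fps

print(round(frame_budget_ms(24), 1))  # ~41.7 ms per frame at 24fps
print(round(frame_budget_ms(60), 1))  # ~16.7 ms per frame at 60fps
```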



Currently, many teams use motion-capture systems for camera tracking, but run into myriad problems because those systems aren’t built for this function. Mocap systems are specifically designed to capture complex motion, predominantly of human beings and objects. Most use a number of cameras placed around a room or studio, aimed at the center of the room, in order to create a motion-capture volume. And while many mocap solutions are capable of tracking a virtual or physical camera, doing so requires setting up multiple cameras to track a single film or TV camera, a complex, time-consuming and expensive endeavor. In contrast, a system like Ncam is specifically designed for ICVFX and can track in any environment, on any camera, with any lens or rig, thereby resolving the issues around flexibility and scalability.

Best practices for camera tracking

To use camera tracking effectively, users first need to make sure the production is fully on board with the VP techniques, technologies and workflows that will be utilized. Set expectations early by educating all stakeholders on why and how virtual production will be used, clearly explaining the benefits. In many ways, VP is a “see it to believe it” solution; once you are on set and everyone can physically see the results, ensuring buy-in on future projects will become easier.

A system like Ncam Reality is designed specifically for realtime camera tracking, including lens distortion computation. A multi-sensor device attached to the camera feeds information to the software. The training required is minimal, as is the day-to-day operation, and the software takes new users through various setup wizards. For VFX studios, the system complements any motion-capture solutions they already use, and learning camera tracking quickly becomes second nature.
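As a rough illustration of where that data goes (again a hypothetical sketch, not the Ncam Reality API; VirtualCamera is an invented stand-in, and the sample type comes from the sketch earlier in this piece), each incoming tracking sample is applied to the virtual camera before the matching frame is rendered:

```python
# Hypothetical continuation of the earlier sketch (this is not the Ncam
# Reality API): a stand-in for how a render engine might apply each incoming
# TrackingSample to its virtual camera before rendering the matching frame.

class VirtualCamera:
    """Invented stand-in for a render engine's camera object."""

    def apply(self, sample: TrackingSample) -> None:
        # Pose first: the virtual camera must move exactly as the physical one.
        self.position = sample.position
        self.rotation = sample.rotation
        # Then the lens: matching focal length, focus and distortion is what
        # makes the CG background sit convincingly behind the live action.
        self.focal_length = sample.zoom
        self.focus_distance = sample.focus
        self.distortion = sample.distortion

camera = VirtualCamera()
camera.apply(TrackingSample(
    timecode="01:00:00:00",
    position=(0.0, 1.6, -3.0),   # metres (illustrative values)
    rotation=(0.0, 5.0, 0.0),    # degrees (illustrative values)
    focus=3.2, iris=2.8, zoom=35.0,
    distortion=(0.01, -0.002),
))
```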



What does the future look like? 

There is no question that 2022 will be the year of virtual production. But harnessing the power of all these technologies and delivering consistently excellent results requires a particular skill set that is still relatively rare, so the emerging roles and responsibilities are a hot topic right now. Many new roles are evolving, and not all of them will turn out to be relevant over the next few years; some may be jettisoned, while other, totally different roles may need to be created. We won’t know the answers for a while, but what is certain is that defining responsibilities and streamlining workflows is the key to success.

Over the next few years, VP technologies will mature, workflows and skill sets will improve, and educational opportunities will progress. Perhaps the most important shift, though, will come when people let go of fear, embrace change and make a concerted effort to learn and improve. There is an inevitability around virtual production, but 2022 is going to be a vintage year! 

Nic Hatch co-founded Ncam in 2012 to develop innovative virtual production technology and solutions for film, TV and broadcast. Prior to founding Ncam, he formed Nviz, the London-based visualization company. Hatch is also a creative, having worked as a CG artist for Mill Film and the Moving Picture Company on a variety of projects, including Troy, Tomb Raider and Harry Potter and the Philosopher’s Stone.