Virtual production is a rapidly evolving field, and motion control plays an important supporting role in its development. We are excited about the future of these technologies and the new creative possibilities they represent for filmmakers. Across many productions worldwide, a growing number of MRMC's motion control camera robots are being used in combination with virtual production by some of the biggest names in the industry.
The increasing use of camera robotics in virtual production enables improved accuracy and more stable camera-position data, which means better, more stable backgrounds and, in turn, the use of multiple passes for improved lighting and additional effects. Pre-programmed moves taken straight from previs also improve background rendering by allowing time for pre-rendering, streamlining the production workflow with fewer takes. The Unreal Engine LiveLink plug-in automatically takes the data from the Bolt, Milo, Titan or any other MRMC moco rig and converts it to Unreal's coordinate space. This minimizes any potential rendering lag, essentially tethering the virtual camera in Unreal to the robotic camera. The LiveLink plug-in can also be used for previsualization: MRMC's Flair motion-control software can be connected directly to Unreal to generate real-time feedback for camera moves before a set has even been built.
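To illustrate the kind of conversion a bridge like LiveLink performs, here is a minimal sketch of mapping a rig-space camera position into Unreal's coordinate space. Unreal Engine uses a left-handed, Z-up coordinate system with units in centimetres, while many motion-control systems report positions in a right-handed, Z-up frame in metres. The frame conventions assumed here (and the `rig_to_unreal` function itself) are illustrative assumptions, not MRMC's actual implementation:

```python
def rig_to_unreal(x_m, y_m, z_m):
    """Convert a right-handed, Z-up position in metres to
    Unreal's left-handed, Z-up space in centimetres.

    This is a simplified illustration: a real pipeline would also
    convert rotations and account for the rig's calibrated origin.
    """
    CM_PER_M = 100.0
    # Negating the Y axis switches handedness; scaling converts units.
    return (x_m * CM_PER_M, -y_m * CM_PER_M, z_m * CM_PER_M)

# Example: a camera 2 m forward, 1.5 m left, 1.2 m up in rig space
print(rig_to_unreal(2.0, 1.5, 1.2))  # → (200.0, -150.0, 120.0)
```

Streaming positions through a transform like this every frame is what keeps the virtual camera locked to the physical one, which is why accurate, stable position data from the rig matters so much for background stability.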
Hand in hand with the rise of virtual production is the increasing number of LED volumes in circulation in studios in the UK and beyond, making the technology more widespread and accessible. The obvious benefit of LED walls is that they create immersive virtual environments that can be visualized by the talent without the need for a greenscreen, and changed on the fly if required. This allows filmmakers to shoot scenes in any location, real or imagined, without leaving the studio, which also helps reduce the carbon footprint of productions. Our motion-control rigs can create complex camera movements that interact with the LED walls, creating truly cinematic visuals. A famous example is the Disney+ series The Mandalorian, on which the Bolt Cinebot was used extensively to create complex camera movements that interacted with the show's LED walls, producing immersive intergalactic virtual environments.
The future of virtual production is bright, and the technology works to solve many of the challenges facing filmmakers today. As it continues to improve, virtual production will become even more accessible, affordable and widely used, opening up new creative possibilities for filmmakers of all levels, with motion control playing a central role in that development. MRMC's motion-control products and software solutions are helping to shape the future of virtual-production workflows, and we are committed to developing new products and solutions that help filmmakers create the best possible stories.
Dan Brooks is the Head of Marketing & PR for MRMC (www.mrmoco.com), a manufacturer and supplier of solutions for motion control, automation, broadcast robotics, volumetric and remote image capture.