Efficient pipelines are the foundation of successful digital-creation projects, but we live in a world where everything has to happen faster, contributors are spread across different locations, and files from multiple sources can quickly add up to terabytes of data. Managing all those teams and systems becomes difficult and time-consuming. Worse, people may be working on the wrong version of an asset, or missing an asset entirely, which can slow down schedules, lead to duplicated work and waste resources on incorrect renders.
For an organization like ours, pipelines and workflows can be either a hindrance or a chance to shine. We have worked on massive, high-profile projects such as The Acolyte, Stranger Things, Fast X and The Little Mermaid, with teams, including freelancers, based in the US and UK, and we often need to share our work with clients. That is why we have spent extensive time and effort developing best practices for managing project pipelines, from onboarding new contributors to managing virtual and freelance resources, with some help from carefully selected technology tools.
The story starts when we decided to take advantage of Unreal Engine, which was just starting to be adopted by media & entertainment studios at the time. Unreal is a highly flexible platform for real-time video rendering that originated in the game-development market.
Most of our artists use Maya, and we needed a way for them to connect efficiently with Unreal. So, we decided to use this opportunity to create a new pipeline that syncs sequences across both Maya and Unreal, making it as simple as possible for artists. This way, artists retain responsibility for their shots and can focus on the creative aspects.
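To make that concrete, here is a minimal, hypothetical sketch of one leg of such a sync: exporting a shot from Maya as FBX and importing it into Unreal through each application's built-in Python interface. The helper names (export_shot, import_shot), paths and content locations are illustrative assumptions, not Proof's actual tools, and each function only runs inside its own application's Python interpreter.

# Hypothetical sketch of a Maya-to-Unreal shot hand-off, not Proof's actual pipeline.
# export_shot() runs inside Maya's Python interpreter; import_shot() inside Unreal's.

def export_shot(fbx_path):
    """Export the currently selected shot from Maya as an FBX file."""
    import maya.cmds as cmds
    cmds.loadPlugin("fbxmaya", quiet=True)  # make sure the FBX plug-in is loaded
    cmds.file(fbx_path, force=True, options="v=0;", type="FBX export",
              preserveReferences=True, exportSelected=True)

def import_shot(fbx_path, destination="/Game/Shots/SEQ010"):
    """Import an FBX delivered from Maya into Unreal's content tree."""
    import unreal
    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = destination  # placeholder content path
    task.automated = True                # suppress interactive import dialogs
    task.save = True                     # save the imported assets immediately
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

In a workflow like this, the artist would only ever see a single "sync shot" action; the tooling decides which side of the hand-off to run.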
In addition, we wanted to ensure that all content is easily accessible, shared and tracked, so that artists can find what they need regardless of where they are based or what application they are using, all while adhering to strict security protocols. And since we work closely with our clients, the pipeline also had to support remote delivery of content to them.
Purpose-built pipeline
The result is an in-house pipeline built primarily with Python, designed to make it simple for people to create their own workspaces, which in turn reduces the amount of technical support they need. Perforce's version-control system is an integral part of that pipeline, managing large amounts of data and creating common ground between artists, developers and vendors at every stage of our development process. This makes it easy to see what has changed, and who created what, where and why. Perforce integrates with Maya and Unreal, and it has a Python API that we use to create our own bespoke tools and automations to submit and sync files to the central version-control system. It also allows us to set granular permissions, so users can only see and interact with what they have been permitted to access, which is crucial since we handle multiple projects, each with its own IP and NDAs.
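As a rough illustration of what one of those bespoke tools might look like, the sketch below uses P4Python, the Perforce Python API, to sync the latest revisions, reconcile local changes and submit them with a description. The server address, user, workspace and the sync_and_submit helper itself are placeholder assumptions, not Proof's actual setup.

# Minimal P4Python sketch of a sync-and-submit helper; all names are placeholders.
from P4 import P4, P4Exception

def sync_and_submit(workspace_root, description):
    """Sync latest revisions, reconcile local edits, and submit in one step."""
    p4 = P4()
    p4.port = "ssl:perforce.example.com:1666"  # placeholder server address
    p4.user = "artist01"                       # placeholder user
    p4.client = "artist01_ws"                  # placeholder workspace
    try:
        p4.connect()
        # Pull down only what this user's protections table allows them to see.
        p4.run_sync(workspace_root + "/...")
        # Open added, edited and deleted files against the current depot state.
        p4.run_reconcile(workspace_root + "/...")
        # Submit the default changelist with a human-readable description.
        change = p4.fetch_change()
        change["Description"] = description
        p4.run_submit(change)
    except P4Exception:
        for err in p4.errors:
            print(err)
    finally:
        if p4.connected():
            p4.disconnect()

The per-project permissions the article mentions would live in the Perforce protections table rather than in a tool like this; the tool simply inherits whatever visibility the user has been granted.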
Perforce P4 also serves as common ground when we work with multiple vendors. The main studio usually creates the Perforce server for a project and then gives each vendor access, so we can all share assets and materials easily and efficiently rather than waiting for periodic deliveries and ingest processes. Of course, we all still have our own pipelines, but this process ensures that the connection points between our studios are unified and that we have a consistent source of truth, with no more questions about whether an asset was delivered or not.
The language around the use of AI for film has been evolving, and Proof is constantly looking for ways to use AI responsibly to increase efficiency. For example, when processing 3D Gaussian splats, the training happens on-premises within the software session, and the training data is destroyed when the session is closed, so no data is transferred to the cloud or used in external training data sets.
This approach gives Proof the best of all worlds: a uniform way of working with contributors while giving them control over their workspaces, plus an efficient way to work with clients and other vendors involved in a project. The end result is a modern pipeline that can handle vast amounts of data from multiple sources while maintaining the highest degree of security for some of the industry’s most high-profile content.
Steven Hughes is Creative Technology Supervisor at Proof Inc. (www.proof-inc.com). Prior to his current role, he worked as a previs/techvis artist at the company, and he has also held various positions in animation at companies such as Blue-Zoo Productions and Dramatico.