Most cinema pros make their careers in a single niche of the film pipeline. For shooters, editors or VFX designers, the zone of artist responsibility rarely bridges the entire workflow spectrum.
Indie filmmakers aside, only a few people — directors and producers — stay with a project throughout the science and art jungle of a full cinema pipeline.
The journey runs from script through design, construction, camera capture, editorial and VFX into DI, where results are recorded back to film or conformed for digital delivery. While the story end of the chain is mission critical, a project can face challenges from first to last. See chart.
Advances in film production tools have brought major transformation to a process that accelerated from the mid-1990s and continues to gather pace.
A practical workflow case in point for a full cine pipeline is where I work, Filmworks/FX — a 25,000-square-foot production campus that’s home to several strategically allied companies. Filmworks/FX, New Deal Studios, CosFX, Stranger Comics and Filmworks Finishing Partners (a film finishing fund) work together in both service and content creation scenarios. Put another way, the Filmworks/FX campus services and originates intellectual property through to production of A-list and independent films. The Filmworks/FX network is home to a film and digital camera capture department, shooting stages with services, back-lot, high-end CGI, a DI color correction theater (with 10-bit digital and 35mm projection), telecine, design production (set and miniatures construction) and scanning-recording.
Filmworks/FX maintains a full digital lab with software tools from a range of providers. The most relied on for 2D work is Adobe Creative Suite 5. Key among these tools is After Effects with its Photoshop foundation. And plug-in tools from Imagineer, GenArts, Red Giant and Cineform have become nearly as critical to pipelines as their host apps.
But as important as software has become, hardware is at the cusp of the new cinema watershed. At the forefront, Canon with its 5D Mark II and 7D is the trend-setting poster child for an indie capture revolution that is finding its way into studio work. More on this later…
Advances in processing power also represent a major benefit. Even so, many post facilities don’t immediately upgrade to the latest hardware/software packages. Reason: mid-project is a bad place to switch your pipeline. Early in the year, Filmworks/FX maintained Windows XP 32 on most VFX systems. A 64-bit Windows 7 lab was adopted recently (between projects) for a major production boost across the board.
An instance of this is the indie film Phase One, produced by Filmworks/FX. We upgraded Maya licenses to 2011 between projects. An Nvidia FX5800-accelerated HP Z800 with 32GB of RAM took 20 minutes to render a complex Autodesk Maya 3D scene that had taken up to eight hours on 30 32-bit processors.
Phase One required a scene with high-resolution 3D models of cockroaches depicted as hundreds of particles. The HP/Nvidia hardware ran through the Maya simulation and rendering with no more apparent stress than opening a Word document. Everything from 4K workloads to 3D renderings to composites showed similar gains.
After Effects CS5 ran nearly 10 times faster on the HP Z800 systems at 64-bit vs. the old 32-bit platforms. Even pulling 4K frames off the server was surprisingly fast. 2K workflows performed like standard definition footage, and RAM previews on multilayer 2K files with keying and masks took seconds to load on the Z800.
CULTURE & EVOLUTION
HDSLR cameras have empowered new workflows for indie filmmaking via rapid image acquisition and VFX element capture. Cameras like the 5D and 7D can quickly capture an element for a VFX shot that would typically have been built in CG, or that would have required substantial budget and production resources on a film shoot to capture at the needed resolution.
Motion pictures shot on 35mm film still tend to work from a scanned digital format at 2K (2048x1556) with sequenced DPX files. The offline editorial tends to be QuickTime-based files at HD resolution (1920x1080) out of an Avid system or Apple Final Cut Studio (more on this in future installments of the workflow series). If a film is cropping at 1.85 or 2.35, then the HD resolution acquired on a camera like the 5D is only about six percent shy of being full frame.
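To make the resolution arithmetic concrete, here is a quick sketch (plain Python, using the pixel dimensions quoted above) of why 1920-wide HD lands only about six percent shy of a 2048-wide 2K scan once theatrical crops make width the limiting dimension:

```python
# A 2K full-aperture film scan is 2048 pixels wide; HDSLR/HD video is 1920.
SCAN_WIDTH_2K = 2048
HD_WIDTH = 1920

shortfall = 1 - HD_WIDTH / SCAN_WIDTH_2K
print(f"HD is {shortfall:.1%} shy of full 2K width")  # 6.2%

# Theatrical crops discard vertical resolution, so width is what matters:
for ratio in (1.85, 2.35):
    print(f"{ratio}:1 crop height at 2K: {round(SCAN_WIDTH_2K / ratio)} px")
```

That width shortfall is the "about six percent" figure: the cropped 2K frame is shorter than 1080 lines at either theatrical ratio, so only the missing 128 pixels of width separate HD capture from full frame.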
Filmworks/FX often shoots elements with the Canon 5D instead of creating the conventional labor-intensive CG elements that would have been the only choice in the past. Thus, indie-based technology has been a considerable factor in increased speed through the production process. That’s not to say there are no shortcomings to indie-based technology, but with proficient shooting and image adaptation such elements can be mixed seamlessly into any shot without apparent image loss or compression. In the end, it comes down to digital ones and zeros.
Recent technology shifts have had a tremendous cultural impact on the film business, including shocks that have made things more difficult in some regards.
As an example, QuickTime technology has become a vital tool at the production office end of editorial and review for VFX post (usually out of Final Cut Studio or Avids). But with current film production pipelines, visual effects companies spend countless hours color matching finalized 2K VFX shots for return to HD editorial. That color-matching work can actually take longer than the VFX work on the actual 2K client shot.
Why is production more particular about this than before? Because edits are now done at HD and often screened via projection. This means executives will be watching screenings more regularly in a quality, film-like format. If color changes shot to shot, the inconsistency will pull decision makers out of the picture.
In this case the technology has changed the culture to an expectation of having in-progress work resemble the final product. A few years ago, production people watched temp SD imagery and made do with it. This is no longer acceptable to most productions. Now the film production culture is not trained to see past things in progress. People get nervous when things look half finished, even if they are only half finished.
So the biggest change in the pipeline is the quality of QuickTime imagery given back to editorial. The old standard definition visual model has given way to a QuickTime-based HD model, one that takes more space and more processing power. As a plus, imperfections in temp work are much easier to correct at the HD stage. For service providers, a non-stop tech makeover can be a double-edged sword. Independent filmmakers have benefited, however. Fresh technology lowers the price of entry significantly, and the image quality is closer than ever to A-list cinema quality.
One revealing trend under the new paradigm is that visual effects companies are becoming production companies and production partners. This is simply because many visual effects people are filmmakers working under profit margins so low, and competition so high, that the choice to create content is about survival.
VFX PIPELINE — DATE NIGHT
CosFX founder Paul Bolger, VFX supervisor for shots on Fox’s Date Night, used Canon DSLR cameras to capture thousands of reference frames of New York. Much of the photography was filmed in Los Angeles but had to look as though it took place in New York.
If it had been done completely in CG, it would have taken months to complete and would have been extremely expensive. Using planar tracking techniques in Imagineer’s Mocha and Mokey, we were able to create finished composites rapidly, in a more photogrammetry-like style. The entire Imagineer bundle for tracking, rotoscoping and background creation was recently consolidated into a single product, Mocha Pro. Alongside the Adobe CS5 production tools suite, Mocha Pro made for a long-awaited upgrade to an already indispensable VFX core. On that score, the substantial 64-bit upgrade path to Adobe CS5 was key for boosting rendering power and speed across the board.
As an example, for the Date Night scene, raw footage was hand rotoscoped using Adobe After Effects. The street the police officers were on, as well as the police cars, was kept. The footage was tracked using Mocha. All transform data, such as scaling, position and rotation, was copied to the clipboard and then pasted onto a solid layer in After Effects.
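Conceptually, what gets pasted is a per-frame 2D transform. Here is a minimal sketch of how such data repositions a layer point, using hypothetical values for illustration rather than Mocha's actual export format:

```python
import math

def apply_transform(point, position, rotation_deg, scale):
    """Apply scale, then rotation, then translation to a 2D layer point,
    the same conceptual order a tracked solid layer is transformed."""
    x, y = point
    # scale about the origin
    x *= scale
    y *= scale
    # rotate about the origin
    r = math.radians(rotation_deg)
    xr = x * math.cos(r) - y * math.sin(r)
    yr = x * math.sin(r) + y * math.cos(r)
    # translate to the tracked position
    return (xr + position[0], yr + position[1])

# One frame of hypothetical track data: a 90-degree rotation sends
# (100, 0) to (0, 100), then the layer is offset to the tracked position.
print(apply_transform((100, 0), position=(960, 540), rotation_deg=90, scale=1.0))
```

Mocha solves these values per frame from the planar track; pasting them onto a solid simply keys that solid's transform channels frame by frame.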
While the roto was in progress, on-set photography was separated into layers in Photoshop so tracking data could be applied to each layer separately, recreating parallax shifts. The process involved radical manipulation, repainting missing background sections obscured by foreground objects. Once the roto was finished, the background plate was color matched to the foreground. Then all tracking data was linked to the layered background, and thus LA became New York. One little tidbit: right next to a police officer is a large black box that is actually a crash camera. Rather than taking out the crash camera, an NYPD logo was tracked onto it.
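The layered parallax idea can be sketched in a few lines: for a given camera move, layers meant to sit nearer the lens are offset more than distant ones. The depth values here are invented purely for illustration:

```python
def parallax_offsets(camera_shift_px, layer_depths):
    """Offset each layer inversely to its depth: near layers slide
    farther across frame than far ones, which is what sells depth
    in a flat, layered composite."""
    return [camera_shift_px / depth for depth in layer_depths]

# Hypothetical depths (arbitrary units): signage up close, buildings far back.
offsets = parallax_offsets(camera_shift_px=120, layer_depths=[1.0, 4.0, 12.0])
print(offsets)  # near layer moves 120 px, far layer only 10
```

Separating the plate into Photoshop layers is what makes these per-layer offsets possible in the first place; a single flattened image can only move as one.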
Five years ago this shot would have taken considerable effort at far greater cost. Now such work can be done on a weekend at reduced rates.
DI SCREENING & OUTPUT
The DI theater at Filmworks/FX doubles as a VFX screening room. Globalstor and BlueStor RAID units power Assimilate Scratch, which is used for conform, color and playback. Reviewing VFX shots on a computer monitor, or even a 72-inch flat screen, doesn’t serve as a good quality control (QC) reference for imagery that will be projected. We screen off a 10-bit NEC 2K DLP projector on a 20-foot screen for review. Even if we are working in 4K, we can quickly render cache the files in Scratch to play at 2K without lengthy rendering of the files at 2K.
This type of high-end playback would have been too costly for many post firms to acquire 10 years ago. Current high-quality 2K projectors, including the NEC projector in use at Filmworks/FX, can be had for a relatively low outlay. You won’t have one in your house unless you are wealthy, but being able to review VFX work and conforms has saved Filmworks/FX countless hours in redoing shots, as well as enabling timely approvals from clients.
Final Cut Studio and AJA Kona 3-driven Apple G5 hardware are also used in the DI theater for visual effects versioning, where an entire film is typically watched in HD straight from the Final Cut timeline, projected via SDI through the 10-bit NEC/Texas Instruments DLP projector for continuity and secondary quality control, a QC process that may be repeated hundreds of times during production.
As an instance of QC at 2K 10-bit resolution on the DI screen, filmmakers on a recent show were shocked to find something new when their film was projected at the DI theater. One scene showed a central character of the film with a prop knife sticking out of his chest. During editing for the small screen, editors, producers and the director had not noticed the prop knife sticking out of the actor, who was in casual conversation with other players.
This was a quick fix for the visual effects department, but it could not have happened without high-end projection, where troubleshooting comes down to literally being able to see the problems. And isolating and solving issues is what it’s about across the production pipeline.