VFX: 'Flight' makes use of the cloud
Issue: November 1, 2012

SAN FRANCISCO — Paramount Pictures’ Flight, which is already garnering some Oscar buzz, is a story about a hero who also proves to be human. 

This Robert Zemeckis-directed film contains more than 400 visual effects shots that will largely go unnoticed. They were also generated with a cost-effective workflow that used cloud rendering instead of an in-house renderfarm.

Every shot was produced under the watchful eye of VFX supervisor Kevin Baillie at San Francisco’s Atomic Fiction (www.atomicfiction.com), the visual effects house he founded two years ago with Ryan Tudhope. Baillie’s resume goes back to when he was 18 years old, working on previsualization for George Lucas on Star Wars: Episode I — The Phantom Menace. He then went to ImageMovers Digital in 2009 and worked with Zemeckis and Flight producer Steve Starkey on A Christmas Carol and Mars Needs Moms.

The director heard that Baillie and Tudhope were starting up the new visual effects studio. “He had liked working with us before,” Baillie recalls, “and said, ‘I’d love you guys to do the picture.’”

Atomic Fiction began work on Flight in August 2011 with script breakdowns, budgeting and location scouting. Principal photography started in Atlanta in October 2011 and went through to mid-December.

The movie, says Baillie, is essentially about “hero” pilot Whip Whitaker (Denzel Washington), his fight to vindicate himself, and how that fight affects everyone around him.
 
Zemeckis, explains Baillie, was very clear that the visual effects could not take the viewer out of the film. “[He emphasized,] they are there to support the story, not to steal the show,” explains Baillie. “He had a general visual direction where he wanted things to go, which was sort of gritty and realistic but not distracting. That was a really big guiding principle for us on this show. Director of photography Don Burgess really helped us support that, and our art director Chris Stoski pushed to hit that at every point.”

PREVIS, POST-VIS

The production originally planned for roughly 130 VFX shots. However, after Zemeckis “had done his editorial process and came up with a lot of really amazing ideas, it ended up being closer to 400 shots.”

The production was helped along not just with previsualization, but post-visualization as well. “A previs shot,” describes Baillie, “is really the beginning of defining a shot. Creatively, the sky is the limit. You don’t have any cameras that you’re locked into yet. You generally haven’t shot any footage yet, so you’re really making up shots from scratch. Post-vis, on the other hand, is when you’ve already shot a particular scene and you’re trying to figure out how to work a significant computer-generated element into that shot.” Some of the previs for the crash scene was done by LA’s Third Floor and in-house at Atomic. 

“A great example of a scene that benefitted from post-vis,” says Baillie, “is the now-famous shot from the trailer, where the plane comes flying overhead, upside down.” 

When they were shooting that background plate on location at an Atlanta apartment complex, camera operator Robert Presley didn’t know exactly where the plane would be in the sky or how fast it would be moving. He made his best guess, executing a camera move as if he were trying to follow a plane.

Atomic Fiction, in the post-vis phase, took that camera move, put a computer-generated airplane into it and then “started work... with Zemeckis saying things like, ‘That’s too high, too low, it looks like it’s moving too slow, let’s speed it up.’ We would actually start adjusting the camera footage in post to match this plane.”

INVISIBLE EFFECTS

The director’s goal with the visual effects in Flight was to make most of them invisible. “The audience doesn’t notice them,” states Baillie. “To them it was just like it was filmed in-camera.”

For example, the airplane cockpit was shot against a greenscreen. Everything seen through the windows was composited from multiple elements: helicopter sky shots, digital clouds, matte paintings, plus subtle touches like little fingerprints and smudges on the windows and glints of light. “Just the things that can dirty up the frames so that at the end of the day, the audience has no idea that it was a visual effects shot,” he explains. 
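
The layered build-up Baillie describes maps onto the standard premultiplied “over” operation used throughout compositing. As a minimal sketch in Python/NumPy (the layer names and sizes are invented for illustration, not Atomic Fiction’s actual elements):

```python
import numpy as np

def over(fg, bg):
    """Premultiplied 'over': the foreground occludes the background by its alpha."""
    alpha = fg[..., 3:4]              # foreground coverage
    return fg + bg * (1.0 - alpha)

# Hypothetical cockpit-window layers, back to front: sky plate, digital
# clouds, fingerprints/smudges, glints. All purely illustrative.
h, w = 540, 960
sky     = np.zeros((h, w, 4)); sky[..., :3] = 0.5; sky[..., 3] = 1.0
clouds  = np.zeros((h, w, 4))     # mostly transparent element
smudges = np.zeros((h, w, 4))
glints  = np.zeros((h, w, 4))

comp = sky
for layer in (clouds, smudges, glints):   # front elements go over the stack
    comp = over(layer, comp)
```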

Another transparent effect happens during the crash scene toward the beginning of the film, where the airplane has to roll and fly upside down in order to stay in the air. Michael Lantieri (Jurassic Park), the physical special effects supervisor on Flight, had the challenge of taking a McDonnell Douglas MD-88 airplane fuselage and rolling it upside down while it was filled with people.

It turned out that, practically, special effects was only able to roll half of the fuselage at a time, so it had to be shot as two separate elements. In one element, the camera is inside the fuselage for the foreground; for the second element, they had to back the camera up 40 feet and shoot through the open end of the plane to capture the distant part of the fuselage rolling around. Atomic Fiction then married the two elements together.

Bucking the trend of jobbing out visual effects to multiple vendors, Atomic completed all of the shots for the film. “It made a lot of sense for us to handle all of the work on Flight because we’d really developed great shorthand and a level of trust with Zemeckis,” reports Baillie. 

RENDERING

“We have a very scalable infrastructure at Atomic Fiction because of how we’re set up to use the cloud. We were able to do 400 shots in around four months with a team of about 35 people,” explains Baillie. “The fantastic folks at The Paint Collective and Bot VFX acted as an extended part of our team, helping with roto/paint and camera matchmoves. As the show expanded from 130 shots to 400 shots over the course of a couple months, the cloud enabled us to actually grow with the show without incurring insane amounts of cost in doing so.”

Virtually the entire movie was rendered in a cloud-based system developed by Boston-based Zync. “They have a really great system for both managing how data gets into and out of the cloud, as well as bringing up virtual computers in the cloud and managing them as they churn away on your data and then shutting down once they’re done.”
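
The article doesn’t expose Zync’s actual API, but the lifecycle Baillie describes (push data up, bring up machines, render, pull results back, tear everything down) can be sketched with a stand-in worker pool. Here the pool itself plays the role of the cloud, and render_frame is a toy placeholder for a real V-Ray invocation:

```python
from concurrent.futures import ProcessPoolExecutor
import time

def render_frame(frame):
    """Toy stand-in for a render; in production this would launch V-Ray on a frame."""
    time.sleep(0.01)
    return f"frame_{frame:04d}.exr"

def render_job(frames, workers):
    # Workers exist only for the duration of the job, then are torn down,
    # mirroring "shutting down once they're done."
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_frame, frames))

if __name__ == "__main__":
    outputs = render_job(range(240), workers=16)  # one 10-second shot at 24 fps
    print(f"rendered {len(outputs)} frames")
```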

Zync’s cloud solution allowed Atomic to treat computing like a gas or electric utility at your house: you pay only for what you use. They were able to scale up and down not just with the needs of a show, but with the needs of the crew on an hour-by-hour basis.

That made their costs very predictable because every shot had its own rendering cost associated with it. “Because you’re paying for what you use on an hourly basis, it actually costs the same to render on 100 computers for an hour as it would to render on 10 computers for 10 hours,” explains Baillie.
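
That equivalence is just per-machine-hour arithmetic, easy to sanity-check (the $1 hourly rate below is made up for illustration):

```python
RATE = 1.00  # dollars per machine-hour (hypothetical)

def render_cost(machines, hours, rate=RATE):
    # Per-hour billing: cost depends only on total machine-hours,
    # not on how they are spread across the pool.
    return machines * hours * rate

assert render_cost(100, 1) == render_cost(10, 10)  # same cost, ten times faster
```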

Artist time is expensive, so if a studio can render shots quickly as artists are building them, that’s a huge cost savings. “They’re going to be able to react and keep that creative momentum going,” he says.

“It would have been impossible for us to actually keep up with the growth of this show any other way. I can go from having 50 computers running in the cloud to having 400 computers running in the cloud within a couple of minutes.”

With an in-house renderfarm, “there would be no way I could turn on a dime and add another 100 computers within a week even, or two weeks, or three weeks. Building out data centers takes a long time, and a lot of money. Whereas when you’re using the cloud you can react to changes like that instantly. That is a hugely liberating concept both as a business owner and as an artist.”
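
The scaling decision itself is simple to express: size the pool to the queue rather than to fixed hardware. A minimal sketch, with the 50-to-400 range taken from Baillie’s numbers and the one-frame-per-node-per-hour throughput invented for illustration:

```python
MIN_NODES, MAX_NODES = 50, 400   # range Baillie cites for this show

def target_pool_size(queued_frames, frames_per_node_hour=1):
    """Nodes needed to clear the current queue in roughly an hour."""
    wanted = queued_frames // frames_per_node_hour
    return max(MIN_NODES, min(MAX_NODES, wanted))

print(target_pool_size(80))    # 80  -> a quiet afternoon
print(target_pool_size(2000))  # 400 -> crunch, capped at the ceiling
```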

Baillie hopes that both the movie studios and the artists creating the software are going to get onboard with this cloud idea and embrace it. “When they see that a movie that is a top-caliber film like Flight can be successfully made with the cloud, and made even better as a result of the cloud, I hope it’s going to spur these people on to stop being scared of the concept and to start embracing it.”

For 3D, animation, lighting and rendering, they used Maya, with V-Ray as the primary renderer. They also used 3DS Max for matte painting work and some effects work. “3DS Max has some great tools, some great plug-ins like FumeFX and Krakatoa available to it. For compositing, we’re entirely Nuke.”
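
Nuke comps are normally assembled in its UI, but Nuke also ships with a built-in Python API, so a greenscreen-style node tree can be scripted. A minimal sketch (this runs only inside Nuke’s own interpreter, and the file paths and node choices are placeholders, not Atomic Fiction’s setup):

```python
import nuke  # available inside Nuke's script editor

plate = nuke.nodes.Read(file="/shots/plate.%04d.exr")     # greenscreen plate
bg    = nuke.nodes.Read(file="/shots/sky_comp.%04d.exr")  # sky/cloud background

key = nuke.nodes.Keyer()       # pull a matte from the green
key.setInput(0, plate)

merge = nuke.nodes.Merge2(operation="over")
merge.setInput(0, bg)    # input 0 is B (background)
merge.setInput(1, key)   # input 1 is A (keyed foreground)

out = nuke.nodes.Write(file="/shots/out.%04d.exr")
out.setInput(0, merge)
```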

“We’re also big fans of Shotgun,” notes Baillie. “We use Shotgun to do all our production management. In keeping with having a small, nimble, fast-moving team on the show, we basically had a production team of five people and that includes editorial managing all 400 shots over the course of four months. That was only possible due to Shotgun and our internal asset management system called Fidget.”
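
Shotgun’s public Python API, shotgun_api3, gives a sense of how a five-person team can stay on top of 400 shots. A hedged sketch of a typical status query; the server URL, credentials and project id are placeholders:

```python
import shotgun_api3  # pip install shotgun-api3

# Placeholder credentials, not a real studio account.
sg = shotgun_api3.Shotgun(
    "https://studio.shotgunstudio.com",
    script_name="pipeline",
    api_key="XXXX",
)

# Every in-progress shot on the show, by shot code.
shots = sg.find(
    "Shot",
    filters=[["project", "is", {"type": "Project", "id": 1}],
             ["sg_status_list", "is", "ip"]],
    fields=["code", "sg_status_list"],
)
print(f"{len(shots)} shots in progress")
```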

They use mostly Macs running OS X. Their core server infrastructure runs on Linux machines built by Lightbeam. The matte painters and some of the VFX artists work in 3DS Max on Windows.

THE SHOOT, DAILIES

Backing it up a little, let’s talk about production. Flight was shot with Red Epics, and Light Iron Los Angeles handled the entire on-set data workflow and digital dailies production with their Outpost systems. Light Iron also did the final DI and DCP master.

“The Red Epic was actually a really great camera for us to use in a lot of ways because of its size and weight,” declares Baillie.

When they were shooting the sky background helicopter plates, they had to cover themselves for any situation. Burgess and Baillie devised a custom-made rig for the front of the helicopter on which they mounted three Red Epics, each with a 14mm Ultra Prime lens. The cameras were fanned out so that, between the three of them, they could capture a 240-degree stitchable panorama of everything the helicopter saw.
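
The 240-degree figure is consistent with simple lens geometry. Assuming roughly a 27.7mm-wide 5K Epic frame (the exact active width depends on the recording mode), each 14mm lens covers close to 90 degrees horizontally, so three fanned cameras overlap comfortably into a 240-degree stitch:

```python
import math

SENSOR_WIDTH_MM = 27.7  # assumed 5K Red Epic frame width; approximate
FOCAL_MM = 14.0         # 14mm Ultra Prime, per the article

hfov = 2 * math.degrees(math.atan(SENSOR_WIDTH_MM / (2 * FOCAL_MM)))
print(f"one camera:    {hfov:.0f} degrees")      # ~89
print(f"three, no fan: {3 * hfov:.0f} degrees")  # ~268 before overlap

# Fanning the cameras about 75 degrees apart leaves ~14 degrees of
# overlap at each seam for stitching and still spans the quoted 240.
print(f"stitched span: {2 * 75 + hfov:.0f} degrees")  # ~239
```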

Instead of doing the same action three times with the camera pointed three different ways, they were able to get any action all in one pass. “We ended up getting enough helicopter footage to fulfill the needs of 80 percent of the shots that we had to achieve.” 

The visual effects for Flight were done at 3K, with final deliverables rendered at 2K. “We actually found that little bit of extra resolution helped make a lot of our greenscreen extractions a lot easier,” reports Baillie.
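
The trade-off is easy to quantify: 3K is 1.5x the linear resolution of 2K, so 2.25x the pixels to render, and downsampling a key pulled at 3K to 2K effectively supersamples the matte edges. The exact frame sizes below are illustrative, since they depend on aspect ratio and delivery spec:

```python
k3 = (3072, 1620)  # assumed 3K working resolution
k2 = (2048, 1080)  # assumed 2K delivery resolution

print(f"linear scale: {k3[0] / k2[0]:.2f}x")                      # 1.50x
print(f"pixel count:  {(k3[0] * k3[1]) / (k2[0] * k2[1]):.2f}x")  # 2.25x
```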

Baillie says one challenging aspect of the film was “having worked with Robert Zemeckis in the motion capture all-CG world and then taking the language that we developed there and applying it back to live-action filmmaking. A lot of live-action filmmaking has more restrictions to it than there would be in a completely digital world. A lot of the ideas that Zemeckis had were still in that fantastic realm. Trying to figure out how to achieve these amazing, big ideas in the timeframe that we had was probably one of the biggest creative challenges.”