HOLLYWOOD — Twenty-eight years after Disney Studios’ pioneering visual effects sci-fi film Tron became an indelible part of the cyber-cultural consciousness, Tron: Legacy — a sequel using the latest and greatest technology and design — pays homage to the original... in glorious Disney Digital 3D.
How did this happen? In 2007 Joseph Kosinski, a young, up-and-coming commercial director, walked into producer Sean Bailey’s office and pitched his vision of doing a remake of Tron in a post-Matrix world.
“What I said to Sean,” explains Kosinski, “was that I wanted this movie to look like Tron. We live in an age in which, with the computer, you can make anything look like anything. I wanted this movie to be instantly recognizable as Tron from the first image on the screen. That meant staying true to the design of the original Tron, and making it feel absolutely real, [as if] you took a motion picture camera into the world of Tron and shot it from the inside.”
That meeting convinced Bailey to set up a pitch session at Disney. Kosinski persuaded the execs to let him make a three-minute VFX proof-of-concept that would lay out, in Kosinski’s words, “the look, the tone, maybe a hint of the narrative of a movie that didn’t exist yet.”
Kosinski took the production of the test to Eric Barba (Benjamin Button), Digital Domain’s Oscar-winning visual effects supervisor, with whom he had worked on the commercial for the Xbox game Gears of War. “Right before I started in earnest on Button, the test came in for Tron. Joe came to me, we planned it all out and finished it in stereo,” recalls Barba.
The Kosinski/Digital Domain demo impressed the Disney execs, but the turning point came when they showed the VFX proof-of-concept as a teaser at Comic-Con in 2008. The totally unexpected sneak peek at what Tron: Legacy could be was met with an enthusiastic response from Comic-Con fans. The Internet buzz over the next few months convinced Disney to greenlight the project.
Like Avatar, Tron: Legacy was shot in dual-camera 3D using Pace Fusion rigs developed by Vince Pace and James Cameron. However, the Tron: Legacy gear was a step up from Avatar’s: Sony made F35 cameras available, as opposed to the F950s used on Avatar.
“The benefit of this camera,” recalls Kosinski, “is that it has a full 35mm sensor which gives you that beautiful cinematic shallow depth of field. We needed the suits to be the brightest things on set, which meant the sets were illuminated very dimly. We made up for that by shooting the whole thing with master prime lenses, wide open at 1.3, which makes the 3D fall off in a very different way that gives it that distinct look.”
During production they output the data from each camera to Codex Digital portable and studio recorders, capturing either uncompressed 1920x1080, 10-bit 4:4:4 DPX frames in S-Gamut or compressed 3:1 wavelet, depending on the scene.
In a very real sense, much of Tron: Legacy was made even before principal photography began. “There is no such thing as post on Tron: Legacy,” says Kosinski, “we started the visual effects on day one.”
“Wyatt Jones had been editing previs and he stayed on through most of the project,” says Tron: Legacy editor James Haygood (Fight Club, Panic Room). “When I came in, I started working on previs with him on some of the big set pieces — sections that relied more on previs. That was a big part of some of the early production, which was already happening when I arrived.”
“Working with stereo left and right eye is fairly new for visual effects companies,” says Barba. “Avatar came out six or seven months after we finished principal photography and no one knew what that was going to look like.
“We had to build a 3D pipeline, and even a 3D previsualization pipeline, for Joe Kosinski to plan out the look of the film in stereo so he could see how everything was going to appear on the big screen,” says Barba.
“From the very, very beginnings of this movie, we had a whole previs department that Joe was working with directly,” explains Steve Preeg (Benjamin Button), Digital Domain’s Oscar-winning animation director. “Sitting with him, lining up cameras, helping him pick out his shots. A lot of the big action sequences were cut before they even started principal photography.
“All of those assets, all of those scene files,” continues Preeg, “all of that camerawork was already built into our pipeline. So there were tons of things we were able to leverage out of our previs team.”
The previs team in fact became the layout team, and Digital Domain got to keep the wealth of knowledge that accumulated from the start of the film and use it throughout the rest of the process, “which you won’t get if an outside facility does the previs,” notes Preeg. “It was very efficient and valuable, and I would not want to do it any other way.”
Haygood, who used Avid Nitris DX to edit, reports that one of the most challenging aspects of editorial was that most of the time you were staring at people in greenscreen or previs shots. “When you are shooting a big special effects film, a lot of times 90 percent of your image is not there. You are shooting a couple characters on bluescreen with whatever amount of set you have, which on this movie varied from none to elaborate. So you are going through an editorial process just like any other film — a lot of it is left to your imagination.”
For much of it “you have a couple characters against previs,” continues Haygood. “So you have people in a kind of cartoon environment, but you have just enough to suggest something. So someone not really familiar with it can look at it and go, ‘Oh, they are in this street’ or, ‘They are in this room,’ or whatever it happens to be. So you are cutting the way you normally would be cutting for performances and story and pacing, and the normal things that you do. In a lot of [ways] we ignored the fact that it was a big visual effects movie or that it was a 3D movie. The other stuff gets filled in and you get to the end and you say, ‘Oh my god, this is a fully immersive world,’ because you get used to looking at it just as simple imagery with performance and then you get to the end and you see the full scope of the thing, and it’s impressive.”
DE-AGING JEFF BRIDGES
One of the key conceits of Tron: Legacy is the realization of a believable 30-year-old version of the 61-year-old Jeff Bridges.
“Some of the hardest technical challenges were the Clu 2.0 character in stereo,” explains Barba. “From our Benjamin Button experience, we knew that tracking a human head onto a live-action plate of another actor was incredibly challenging. We had to come up with an even more robust system because in stereo there’s no forgiveness — half a pixel off and it doesn’t sit in the same space on a 3D screen. We had to come up with a system based on what we did for Button and take it to the next level.”
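Barba’s “half a pixel” point can be made concrete with the standard pinhole stereo relationship, in which perceived depth is inversely proportional to on-screen disparity. The sketch below is illustrative only — the focal length, baseline, and disparity values are hypothetical, chosen to show the sensitivity, not taken from the Tron: Legacy rigs:

```python
def perceived_depth(disparity_px, focal_px=2000.0, baseline_mm=63.0):
    """Pinhole stereo model: apparent depth falls off as 1/disparity.
    focal_px and baseline_mm are made-up stand-ins, not values
    from the actual Pace Fusion setup."""
    return focal_px * baseline_mm / disparity_px

d = 20.0  # hypothetical disparity of a tracked head feature, in pixels
# A half-pixel tracking error shifts the feature's apparent depth:
depth_error_mm = perceived_depth(d) - perceived_depth(d + 0.5)
```

With these toy numbers, a half-pixel error moves the feature roughly 15 cm in depth — which is why a tracked CG head that looks fine composited in 2D can visibly float off the actor’s body on a stereo screen.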
“We did a cast of Bridges’ head at his current age,” describes Preeg, “and in the computer digitally de-aged him into a 3D version of his head.” They accumulated “tons of reference data” and pictures of Bridges in different poses. They also made a cast of his teeth and took pictures of where his teeth fit in his mouth. “Obviously there’s lots of footage of Jeff from that time period because he was in a number of films. We pulled tons of frames from those films and made this amalgam of the idealized Jeff Bridges around that early 1980s time period,” says Preeg.
On Button, principal photography was done a couple months before they ever got Brad Pitt onto a soundstage to record his head replacements. Jeff Bridges, however, wanted to interact with people on set and be there and be in the moment, so Digital Domain needed to develop a way for them to capture his information on set rather than in controlled conditions later.
“We used helmet-mounted cameras all lit with infrared lights so we wouldn’t interfere with any set lighting while we were there,” says Preeg. “We recorded data on set with audio. We had to track that information and get 3D data out of those points. We went with four cameras so we had at least two cameras seeing every point on the face at all times so we could triangulate those points and get real positional 3D information.”
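The triangulation Preeg describes is, at its core, the standard direct linear transform (DLT): each calibrated camera that sees a facial marker contributes two linear constraints on its 3D position. A minimal two-camera sketch with NumPy (the projection matrices and image points here are hypothetical examples, not Digital Domain’s actual calibration data):

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one tracked marker seen by two
    calibrated cameras. P1 and P2 are 3x4 projection matrices;
    pt1 and pt2 are the marker's (u, v) image coordinates."""
    u1, v1 = pt1
    u2, v2 = pt2
    # Each view contributes two rows of the homogeneous system A @ X = 0.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest
    # singular value, i.e. the (approximate) null space of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize to a 3D point
```

With four helmet cameras and every point covered by at least two of them, the same system simply gains extra rows per extra view, and the least-squares solution becomes more robust to tracking noise.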
For the live-action sequences of the younger Bridges, they used body double John Reardon. “He’s a bigger, muscular guy who spent a lot of time studying how Jeff Bridges moved when he was younger,” says Preeg. “So they would do a take with Jeff Bridges wearing the helmet rig, acting with live people and Reardon would study those takes. We would re-shoot the scenes with the body double mimicking how Jeff had just done that performance and then we would take the helmet-cam data from Jeff’s performance and put it on the body take version that he had mimicked.
“You are going to see things that are CG that you will think are live action and things that are live action that you will think are CG,” adds Preeg.
THE COLOR GRADING
At press time, Tron: Legacy was being conformed and graded at LA’s LaserPacific on Autodesk Lustre. “What we got were 1920x1080 DPX files, as well as all the files and archives from the dailies,” says colorist David Cole. Even though Cole only started the digital intermediate in September, he had been working on all the promo material — teasers, trailers, Comic-Con releases — for two years, so the look development and intent had already been set.
Cole is working closely with Kosinski and cinematographer Claudio Miranda on the grading and doing a 3D “convergence pass” using the Christie CP2000 XB and RealD CrystalEyes 5, creating multiple digital and film versions of the movie and a 3D grade that is separate and distinct from the 2D grade. “When you are doing 3D you are dealing a lot with luminance levels; when you get lower luminance, the saturation of color becomes less apparent, so we were compensating for that.”
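Cole’s luminance point reflects a well-known perceptual effect: 3D glasses and stereo projection cut screen brightness, and colors appear less saturated at lower luminance, so the stereo grade pushes saturation back up relative to the 2D grade. A toy sketch of the idea in HSV space — the boost factor is a made-up illustration, not a value from the actual grade:

```python
import colorsys

def stereo_grade(rgb, sat_boost=1.25):
    """Illustrative only: nudge saturation up in the 3D pass to offset
    the apparent desaturation at dimmer stereo light levels.
    sat_boost is a hypothetical factor, not from the real DI."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, min(1.0, s * sat_boost), v)
```

A real convergence pass works shot by shot under calibrated projection, and as Cole describes, the 2D and 3D versions are delivered as separate grades rather than derived from one another by a single global adjustment like this.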
During the convergence pass, Cole can actually “push the image in and out of the screen as required.” He says that part of his job is to “ease the load on the audience’s eyes — not just in terms of color and texture but also in the placement of the z-space.”
Visit this link for a making-of video: http://digitaldomainvip.com.