Cover Story: 'The Dark Knight Rises'
Issue: August 1, 2012

LONDON — Warner Bros. and Chris Nolan’s The Dark Knight Rises boasts over an hour of stunning IMAX footage, an accomplishment that raised the bar for the film’s visual effects supervisor, Paul Franklin, and his team at Double Negative UK (www.dneg.com), which delivered more than 450 VFX shots at 4K resolution and higher, more than double the 2K resolution of the effects produced for most movies today.

Franklin and Double Negative (Dneg) had worked on The Dark Knight’s visual effects four years earlier, but set the quality bar even higher for this movie. The key for his team this time around, says Franklin, “is that we wanted to make the work seamless. We don’t want anybody to notice that we’re doing visual effects.”

Franklin started working on preproduction for The Dark Knight Rises in January of 2011 with director Nolan, cinematographer Wally Pfister, special effects supervisor Chris Corbould, production designer Nathan Crowley, VFX producer Mike Chambers and senior coordinator Katie Stetson. LA Center Studios was the prepro base of operations for The Dark Knight Rises, as it had been for Nolan’s Inception and The Prestige.
 
Preproduction, says Franklin, “is an important phase of the film because while we’re having discussions with Chris about how the visual effects might be used to tell the story, we’re working out how we’re actually going to make the film and the visual effects we’re delivering.”

With Nolan, it’s pretty well known that, when possible, he likes to do his effects practically. Remember the rotating-room fight scene, or the locomotive-skinned semi truck barreling down a Los Angeles street in Inception?

THE PROCESS

“We’re always trying to work out the process,” says Franklin. “‘How can we actually get this in-camera?’ That is very much the world of Chris’ filmmaking. We understand that if we do something in CGI, we’re going to have it at the absolute highest possible standard. It needs to be undetectable. People need to look at it and think, ‘Oh, they went out and shot that.’ The great thing is that I’ve got a really good understanding of how Chris’ process works. I know what he likes, what he doesn’t like. I’ve got a good idea of where he’s going to want to use visual effects, and I’ve got a good idea of where he’s not going to use them.”

(Warning: Spoiler alert! If you haven’t seen the film, don’t read any further.)

“For instance,” describes Franklin, “the football stadium — we destroyed Heinz Field in Pittsburgh, or rather we didn’t because obviously we can’t really blow up Heinz Field. We did a lot of very spectacular in-camera pyrotechnics on the day, which sort of set the tone for the piece.” 

Franklin also had real Pittsburgh Steelers players running across a raised platform with holes dug into it. Stunt performers fell into the holes, making it look as if they were being swallowed up by the collapsing field.

By contrast, for the Gotham City bridge destruction scenes, Franklin went up in a helicopter and shot as much material as possible to give the team real plates to work with. However, New York City wouldn’t allow them to set off any practical pyrotechnics, so when you see the bridges blow up, the effects are entirely digital.

The signature vehicle from the film is The Bat, a multi-role combat helicopter that Batman flies around Gotham City. According to Franklin, “Chris Corbould, our special effects supervisor, built this amazing functioning prop that was mounted on a vehicle with a hydraulic lift on it so we could drive this up and down the street pretty fast. The car it was supported by would travel 70-80 miles an hour quite easily. But it was limited in that it could only go up as high as the arm would take it, and it had difficulty cornering at speed.

“So we did do a lot of digital work to extend the range of the flying vehicle. When you see it doing really complicated aerial maneuvers, it’s a digital version of The Bat. But it had to be an absolutely seamless match for the live action, and I think it’s going to be interesting to see if people can really spot when we’re using the practical Bat and when we’re using the digital one.”

Franklin notes that one early reviewer who had seen the prologue thought the aerial heist plane break-up was real, but it isn’t. What that said to him was that a new standard had been achieved, thanks to a combination of factors: the extraordinary technologies developed over the last 10 years and the increasing levels of skill and artistry of the crew.

“It’s also the way that Chris uses the work in the film. He’s audacious about how he places his visual effects in his films, using things that you wouldn’t necessarily think were a visual effect. He always puts them in amongst reality, and the reality is convincing, so when you see the visual effect, it rolls past without any question. You don’t say, ‘Hey, that must be bogus.’”

IMAX CHALLENGES

Back in 2008, Franklin and his team had already worked out a lot of the technical issues of working with IMAX footage on The Dark Knight, yet there were new challenges and innovations for The Dark Knight Rises. “A lot of this film takes place in the daytime, which is unusual for a Batman film, and most of the IMAX work we did on The Dark Knight was nighttime material. So, we’re really having to push the believability of what’s going on.”

All the IMAX 65mm VFX scans, he says, were done at the DKP 70mm IMAX facility in Santa Monica at 8K, with the IMAX shots worked on at 5.6K x 4K. The anamorphic 35mm footage was scanned at Warner Bros. Motion Picture Imaging (MPI) in Burbank on a Filmlight Northlight at 4K, and all the scope shots were worked on at 4K. The VistaVision material was also scanned at MPI, at 6K; it was worked on at 6K and reformatted to IMAX.

Franklin notes that while a 2K scan is 10MB of data, an 8K scan can be up to 220MB. Suddenly you’ve got a couple of gigabytes of data for just a second of film, which drives up both render times and storage requirements. “I think in the end we ended up with half a petabyte (500TB) of online storage that we could actually use to manipulate the data. In terms of the total amount of data, we must have created easily a thousand terabytes of data on this film.”
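Those figures are easy to sanity-check. Here’s a back-of-envelope sketch in Python, using the per-frame sizes quoted above and assuming the standard 24fps film rate:

```python
# The article's per-frame scan sizes, at the standard 24fps film rate.
FPS = 24
frame_mb = {"2K scan": 10, "8K IMAX scan": 220}  # MB per frame

for name, mb in frame_mb.items():
    gb_per_second = mb * FPS / 1024
    print(f"{name}: {mb}MB/frame -> {gb_per_second:.1f}GB per second of film")
```

At roughly 5GB per second of 8K footage, before any render layers or intermediate versions are counted, a 500TB working store fills up quickly.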

Another challenge is that there are no digital systems, monitors or projectors that allow you to see a full-resolution IMAX frame. “You can only see little bits of the picture,” explains Franklin. “So you have to develop a feel for it. You have to say, ‘Okay, I’m looking at this low-resolution proxy’ (Dneg considers 2K a low-res proxy of the final image), ‘and while looking at it, I’m going to have a good feel for what it is going to look like when it goes out to IMAX and we see it projected.’”

What they did for dailies and VFX comps was down-res the images, print them out onto standard 35mm and then project them. “And every so often we’d make selects and see those at full-resolution IMAX in whatever IMAX theatre we could get access to, whether it was the AMC Lincoln Center in New York, the AMC CityWalk in Los Angeles, or the BFI IMAX in London.”
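The down-res step itself is conceptually simple. Here’s a minimal sketch assuming OpenCV; the article doesn’t say which tools Dneg actually used, and the file names are invented:

```python
# A hypothetical down-res of a full-resolution scan to a 2K working proxy.
import cv2

full = cv2.imread("imax_frame_full.tif", cv2.IMREAD_UNCHANGED)

# Scale the frame so its width is 2048 pixels (a "2K" proxy),
# preserving the aspect ratio.
scale = 2048 / full.shape[1]
proxy = cv2.resize(full, None, fx=scale, fy=scale,
                   interpolation=cv2.INTER_AREA)  # INTER_AREA suits downscaling

cv2.imwrite("imax_frame_2k_proxy.tif", proxy)
```

The hard part, as Franklin says, isn’t producing the proxy; it’s judging full-resolution IMAX quality from it.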

TECH INNOVATIONS

One of Dneg’s technical innovations was a physically based raytracing system within Pixar’s RenderMan. “Basically, what it does is it allows us to accurately calculate the exchange of energy between surfaces. This produced an absolutely extraordinary level of consistency in the 3D renders this time.”
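Dneg’s system is proprietary, so the following is only an illustration of the principle Franklin describes: light bouncing between surfaces until the energy exchange settles. A toy two-patch radiosity iteration in Python, with every number made up:

```python
import math

emit = [100.0, 0.0]  # W/m^2 emitted: a light panel and an unlit wall patch
rho = [0.5, 0.8]     # diffuse reflectance of each patch

# Form factor between two small patches facing each other head-on:
# F = (cos_a * cos_b * area) / (pi * r^2), and both cosines here are 1.
area, r = 1.0, 2.0
F = area / (math.pi * r * r)  # fraction of energy reaching the other patch

# Iterate the radiosity equation B_i = E_i + rho_i * F * B_j until it settles.
B = emit[:]
for _ in range(20):  # converges quickly because rho * F < 1
    B = [emit[i] + rho[i] * F * B[1 - i] for i in range(2)]

print(f"settled radiosity: light {B[0]:.2f} W/m^2, wall {B[1]:.2f} W/m^2")
```

A production raytracer solves this exchange across millions of surfaces, but the consistency Franklin mentions comes from the same idea: the energy is computed, not eyeballed.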

In the past, Franklin had to carefully scrutinize a CG image, place it against the live-action photographic reference and tell the team, “The shadow in the wheel arch of the Batmobile needs to be a point darker. The highlights on the windscreen need to be two points lighter. That hasn’t got quite enough contrast.” They addressed individual surfaces on every object and graded them by hand to get them to look right.

This time around, he says, the system was so spot-on that the instructions were simply: make it a bit darker; make it a bit brighter. “Once they got this thing set up, it was so fast to work with that we could show Chris an animatic, a grayscale animation of The Bat flying down the street. He’d say, ‘That’s great. Let’s get that rendered,’ and then I’d be able to show a lit, rendered, composited version of the shot 24 hours later.”

Another innovation was a new on-set high dynamic range (HDR) capture system. Basically, they would take photographs on set across the whole range of f-stops in order to characterize the lighting environment and capture the range of exposure they would eventually see on film.
 
In the past this was done by taking a chrome sphere out onto the set and shooting it with a camera. The sphere would give a 360-degree image of everything around it, but it would be low resolution and only as good as the sphere itself, which can pick up dings and fingerprints.

The procedure later evolved to use fisheye lenses: a camera with a fisheye lens would be pivoted on a nodal mount around the lens’ nodal point, shooting a series of exposures and panning 90 degrees between them. For this film, they built a little robot housing four cameras with fisheye lenses in a casing mounted on a tripod, driven by a proprietary computer system called DN Snapper.

After each take they’d run out to the middle of the set, plant the robot down, and the computer would run through all the exposures. The process takes about a minute; in the past it would take up to 30 minutes to capture the lighting environment. “That gave us very accurate lighting maps, which then fed into the new physical rendering system, and it made a tremendous difference to the way all the CGI looked in the film,” recalls Franklin.
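DN Snapper itself is proprietary, but the core step (merging bracketed exposures into one high-dynamic-range lighting map) is a standard technique. A minimal sketch using OpenCV’s Debevec method, with invented file names and exposure times:

```python
import cv2
import numpy as np

files = ["bracket_0.jpg", "bracket_1.jpg", "bracket_2.jpg", "bracket_3.jpg"]
times = np.array([1/1000, 1/250, 1/60, 1/15], dtype=np.float32)  # seconds

imgs = [cv2.imread(f) for f in files]

# Recover the camera's response curve, then merge the brackets into a
# floating-point radiance map.
response = cv2.createCalibrateDebevec().process(imgs, times)
hdr = cv2.createMergeDebevec().process(imgs, times, response)

cv2.imwrite("set_lighting_map.hdr", hdr)  # Radiance .hdr, usable as a light map
```

A map like this, captured at the camera position right after a take, is what lets the renderer light CG elements with the set’s actual illumination.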

Dneg uses Autodesk Maya as their main 3D package. They also use Side Effects Houdini. “That’s a very important piece of software for doing all of our dynamic simulations. You know, blowing up buildings and crashing things and stuff like that.”

They have their own in-house fluid dynamics system called Squirt, used to create the big digital pyrotechnics, as well as a lot of their own proprietary tool sets. They also use Nuke as their main compositing package. “But again it’s our own flavor of Nuke that has a lot of our own color management systems and things going in there.”

Franklin says there are approximately 20,000 CPU cores in their renderfarm. The main operating system is Linux and most of the workstations are regular PCs running 3.46GHz dual 6-core CPUs with 48GB of memory and Nvidia graphics cards.