By Daniel Restuccio
Issue: June 1, 2004

THE DAY AFTER TOMORROW - PART 2

With Inferno running on an eight-processor SGI Onyx, Emmerich was able to view 2K effects shots interactively, adjusting the look and timing of the effects to match the beats of the temp audio track. "It was the only way to slam through these shots," says Colin Strause.

As New York City succumbs to water, Los Angeles is devastated by multiple tornadoes. Twisters had been done before by Industrial Light & Magic, says Butler. "We really wanted to hit people over the head and do better than what was done 10 years ago." Digital Domain used its in-house software Voxel B to create and manipulate more scientifically accurate representations of the twisters as a real-world phenomenon, then output them as fully volumetric renderings.

"We studied twisters and emulated a physical scenario that represented that behavior, rotational speeds, and used that model to drive the destruction around it," he says.

Butler calls what he does emulation rather than simulation. A simulation, he explains, models behavior as it actually occurs in the physical world; an emulation only has to appear to behave that way. One uses physics to replicate exactly what is scientifically correct. The other uses physics too, but the eye is the judge of what works and what doesn't. You want the behavior to appear natural, but nature doesn't perform to cues. Emulation inherits the visual behavior without the full constraints of a true simulation, which makes it a way of augmenting a simulation for dramatic effect, he notes.
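One way to picture the distinction is the toy sketch below (a hypothetical Python illustration, not Digital Domain's code): both functions advect a debris particle through the same idealized vortex wind field, but the "emulation" exposes an exaggeration factor and a cue-driven lift so an artist can bend the physically motivated motion toward a dramatic beat.

```python
import math

def swirl_velocity(x, y, core_x=0.0, core_y=0.0, strength=40.0):
    """Idealized tangential wind around a vortex core, falling off as 1/r."""
    dx, dy = x - core_x, y - core_y
    r = math.hypot(dx, dy) + 1e-6
    speed = strength / r
    return -dy / r * speed, dx / r * speed

def step_simulation(px, py, dt=0.04):
    """'Simulation': advect a debris particle strictly by the physical wind field."""
    vx, vy = swirl_velocity(px, py)
    return px + vx * dt, py + vy * dt

def step_emulation(px, py, dt=0.04, drama=1.6, lift_cue=0.0):
    """'Emulation': the same physics, plus an eye-tuned exaggeration factor and a
    cue-driven lift so the motion can hit a dramatic beat on demand."""
    vx, vy = swirl_velocity(px, py)
    return px + vx * dt * drama, py + vy * dt * drama + lift_cue

if __name__ == "__main__":
    sim = emu = (5.0, 0.0)
    for frame in range(24):
        sim = step_simulation(*sim)
        emu = step_emulation(*emu, lift_cue=0.05 if frame > 12 else 0.0)
    print("simulation ends at", sim)
    print("emulation ends at ", emu)
```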

The Digital Domain tornadoes would drive through a building, and that building would have knowledge of how far away the twister was, feel the velocity of the wind and know when it should break apart. "It looks more believable and scary that way," Butler says proudly.
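A rough sketch of that idea, using hypothetical names and thresholds rather than Digital Domain's actual pipeline: each building samples the twister's distance and the local wind speed every frame, and flags itself for destruction once the wind exceeds its own strength.

```python
import math
from dataclasses import dataclass

@dataclass
class Building:
    x: float
    y: float
    strength: float      # wind speed (m/s) above which the building starts shedding pieces
    intact: bool = True

def wind_speed_at(bx, by, tx, ty, peak=120.0, core_radius=30.0):
    """Hypothetical wind profile: peak speed inside the core, falling off with distance."""
    d = math.hypot(bx - tx, by - ty)
    return peak if d < core_radius else peak * core_radius / d

def update_building(b: Building, tx, ty):
    """Give the building 'knowledge' of the twister: sample its distance and local wind,
    and trigger the break-apart event once the wind exceeds the building's strength."""
    speed = wind_speed_at(b.x, b.y, tx, ty)
    if b.intact and speed > b.strength:
        b.intact = False
        print(f"building at ({b.x:.0f}, {b.y:.0f}) breaks apart in {speed:.0f} m/s wind")

if __name__ == "__main__":
    street = [Building(0, 0, 80), Building(200, 50, 60), Building(500, -40, 100)]
    # March the twister down the street and let each building react on its own.
    for frame in range(60):
        twister_x = -300 + frame * 15.0
        for b in street:
            update_building(b, twister_x, 0.0)
```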

Digital Domain uses a combination of proprietary and off-the-shelf software, including Maya and Houdini by Side Effects (www.sidefx.com). They like Houdini, Butler says, because of its open architecture: their programmers can get under the hood and make hooks into their proprietary software. Houdini combines effects, editing and compositing tools. "We were able to build an elaborate setup in Houdini that hooked into Voxel B for rendering and hooked into another piece of software to break apart buildings," he describes.

In the end, they took all the shots and Nuke'd them, meaning they composited the shots in their homegrown compositing software. Having built Nuke in-house, they can tailor it to their specific needs. The hardest thing, he says, is getting the CG elements to look as if they fit into the scene. Normally you'd render the element as a lit pass, comp it into another lit scene and try to match the two. With Nuke they rendered the twister with three different lighting passes as a multi-layered volumetric and let the compositor dial in the lighting.
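The multi-pass trick can be illustrated with a simple weighted blend (a hypothetical sketch with made-up pass names; Nuke itself is Digital Domain's proprietary tool): the lighting passes are rendered once, and the compositor re-lights the element by mixing them with per-pass gains instead of sending it back for a re-render.

```python
import numpy as np

def dial_in_lighting(key_pass, fill_pass, rim_pass, key=1.0, fill=0.6, rim=0.3):
    """Blend pre-rendered lighting passes of the same volumetric element.
    Changing the gains re-lights the twister in the composite with no re-render."""
    return np.clip(key * key_pass + fill * fill_pass + rim * rim_pass, 0.0, 1.0)

if __name__ == "__main__":
    h, w = 4, 6  # stand-in resolution; the real passes would be full 2K frames
    rng = np.random.default_rng(0)
    key_p, fill_p, rim_p = (rng.random((h, w, 3)) for _ in range(3))
    warm_dusk = dial_in_lighting(key_p, fill_p, rim_p, key=1.2, fill=0.4, rim=0.8)
    flat_overcast = dial_in_lighting(key_p, fill_p, rim_p, key=0.7, fill=0.9, rim=0.1)
    print(warm_dusk.shape, float(flat_overcast.mean()))
```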

Meanwhile, Sam Hall and his friends are trapped in the upper floors of the submerged New York Public Library. The temperature drops precipitously, freezing the water, and to make matters worse, a blizzard starts. Laura, Sam's love interest played by Emmy Rossum, falls dangerously ill when a cut she suffered escaping the flooding becomes infected. To save her from plunging into potentially fatal septic shock, Sam must brave the extreme blizzard to find medicine aboard the Russian tanker frozen in place outside the library. Hungry wolves, courtesy of Industrial Light & Magic's character animation department, pick up their scent and hunt them down.

"That had been the one surprise during shooting," says Goulekas. "We tried to use real wolves and real wolves weren't going to cut it. They were kind of docile and timid."

"They tried for quite a while to use real wolves but they looked like scared animals," says ILM CG artist Lanker. "They would turn their ears back to listen to the trainer and it was really noticeable and they didn't look ferocious and scary."

"Roland decided they needed to be replaced and approached ILM to see if entirely CG generated wolves could be used," he recalls. Wisely, when the original scenes with the wolves were filmed on set, duplicate back plates without the wolves were shot as well anticipating the digital re-casting.

Even with the mo-cap data from the German shepherds, ear motion, turning the ears forward or backward, and tail motion had to be hand-animated. "For the tail we had procedural animation in place; we would simulate the tail motion and it would come together automatically," Lakner describes.
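Procedural tail motion of that kind is commonly built as a follow-through chain; the sketch below is a generic, hypothetical version (not ILM's rig) in which each tail joint chases the joint ahead of it while keeping its spacing, so animating only the tail base produces lagging, settling motion automatically.

```python
import math

def follow_through(base_positions, n_joints=5, spacing=0.3, stiffness=0.35):
    """Chain follow-through: every joint chases the joint ahead of it while keeping
    its spacing, so only the tail base needs to be driven by the body animation."""
    x0, y0 = base_positions[0]
    joints = [(x0 - i * spacing, y0) for i in range(n_joints)]
    frames = []
    for bx, by in base_positions:
        joints[0] = (bx, by)                      # base joint follows the animated body
        for i in range(1, n_joints):
            px, py = joints[i - 1]
            jx, jy = joints[i]
            dx, dy = jx - px, jy - py
            d = math.hypot(dx, dy) or 1e-6
            # Target sits `spacing` away from the parent; ease toward it for lag.
            tx, ty = px + dx / d * spacing, py + dy / d * spacing
            joints[i] = (jx + (tx - jx) * stiffness, jy + (ty - jy) * stiffness)
        frames.append(list(joints))
    return frames

if __name__ == "__main__":
    # Wag the tail base side to side, as the body animation would.
    base = [(0.0, math.sin(f * 0.3) * 0.5) for f in range(30)]
    motion = follow_through(base)
    tip_x, tip_y = motion[-1][-1]
    print(f"tail tip on the last frame: ({tip_x:.2f}, {tip_y:.2f})")
```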

Lakner continues, "The main challenge was to make the fur look realistic. During prep time we brought in three timber wolves. We measured them, photographed them, figured out the direction of the hair and how they move. What is their build structure, their muscles and skeleton? We tried to do everything as correctly as possible."

"We had to provide four different color wolves: yellow, black, gray and almost black. How long the hair is on the real animals became important in relation to dynamics and how it reacts to light. On a real wolf the fur is up to four inches long and when they run, that stuff moves all over the place. Yellow or blond hair reacts different than black hair. It has different reflection and refraction properties. These are subtle barometers that when you don?t have it figured out, things don't look right. You're not sure what's wrong, but it just doesn't look right."

ILM uses a combination of off-the-shelf and proprietary software. "We use proprietary software to model and do the setups. The rigging and animation is done with Maya. We used RenderMan and built custom shaders," says Lakner.

Just when things look like they couldn't get worse, the eerie supercell forms over Manhattan. First seen from the orbiting space station, the swirling clouds over the globe were created by Hydraulx, using subsurface scattering to achieve the photorealistic look. The ultra-freezing effect of the supercell was seen earlier when the helicopters crashed, so when the audience hears sharp crackling sounds and sees rime ice - the snow-feather structures that form when fog freezes onto the outer surface of objects - appear on the tip of the Empire State Building, they implicitly understand it's not a good thing for humans. The freeze swiftly spreads down the skyscrapers like a rabid virus, blowing out windows and racing to ground level. Instantly grasping that he and his friends are in danger, Sam Hall rushes them back as the wraith-like freeze chases them through the halls of the library.

"I went into this film thinking the twisters and the water would be the hardest stuff," says Goulekas. "The irony is with twisters, water, ice shelf, I could find all kinds of stock footage [for] reference. Everyone knows what water is supposed to look like. But when you see the big freeze and the Empire State Building gets covered in ice by The Orphanage, that was one of the hardest things to pull off because no one?s ever seen this before. So how do you get the audience to believe this is real?"

Goulekas took a pane of glass, put black velvet under it, sprayed the glass with a chemical that makes icicles, hit it with a blow dryer and filmed the ice crystals forming. That saved them, she says, because three companies, The Orphanage, Dreamscape Imagery and Hydraulx, were doing ice flowers, and the only thing that held their work together was those textures.

"The reveals and the animation and the crystals are CG layers," Goulekas says, "but the base ice flowers came from the plates. That was a good thing, because if everyone tried to write the same shader that would have been tricky."

While the intent of many of the effects in the film is to be spectacular, some of the illusions in the movie are distinctive because they're imperceptible. These invisible visual effects include the many "breath" shots, the actors' frosty exhalations that signal a cold climate, which were added during compositing.

Zoic (www.zoicstudios.com) in Culver City, CA, did 46 breath shots under visual effects supervisor Rocco Passionino. "The shots were tricky," says Passionino, "because the camera is moving around a lot and the actors are moving around a lot. Breath gets left behind; it doesn't travel with you. We had to understand how fast it comes out, how fast it dissipates, what the shape of it is."

For research, the Zoic crew stuck their heads into freezers and observed people exhaling cigarette smoke. Ultimately they built the breaths in 3D using particle systems in Maya. They did their temp composites in After Effects and their final work in Combustion running on a PC and in Discreet's Flame on an SGI Octane.
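The behavior Passionino describes, breath hanging where it was exhaled while the actor keeps moving, falls out naturally if particles are spawned in world space and then only expand and fade; here is a small hypothetical sketch of that idea (not Zoic's actual Maya setup):

```python
import random
from dataclasses import dataclass, field

@dataclass
class BreathPuff:
    x: float
    y: float
    radius: float = 0.05
    opacity: float = 1.0

@dataclass
class BreathEmitter:
    puffs: list = field(default_factory=list)

    def exhale(self, mouth_x, mouth_y, count=20):
        """Spawn vapor particles at the mouth's current world position. Because they
        keep that position, the breath is left behind as the actor keeps moving."""
        for _ in range(count):
            self.puffs.append(BreathPuff(mouth_x + random.uniform(-0.02, 0.02),
                                         mouth_y + random.uniform(-0.02, 0.02)))

    def step(self, dt=1 / 24, growth=0.4, fade=1.5):
        """Each frame the vapor expands and dissipates; fully faded puffs are culled."""
        for p in self.puffs:
            p.radius += growth * dt
            p.opacity -= fade * dt
        self.puffs = [p for p in self.puffs if p.opacity > 0.0]

if __name__ == "__main__":
    emitter = BreathEmitter()
    for frame in range(48):
        actor_x = frame * 0.1            # the actor walks forward...
        if frame % 24 == 0:              # ...and exhales once a second (24 fps)
            emitter.exhale(actor_x, 1.7)
        emitter.step()
    print(len(emitter.puffs), "puffs still visible on the last frame")
```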

West Hollywood, CA-based VFX house Ring of Fire (www.ringoffire) built and composited another 36 of the frosty breaths and took a slightly different approach. "We shot a bunch of practical elements," says visual effects producer Casey Conroy, meaning they experimented with shooting actual foggy breath against black and compositing the real breath into the plates. "However, the look constantly evolved. Roland and Karen would like one part of the breath, but not another. So having control was crucial." Some of the breaths, he says, were done using the 2D particle effect in Combustion, and a fair number were done using the real breath elements as particle emitters within Combustion.

Dreamscape Imagery (www.dreamscapeimagery.com), the visual effects division of Los Angeles-based Uncharted Territory, was responsible for 18 VFX shots, including the Mexico border and refugee camp scenes, as well as some of the ice frost effects in the freezing New York Public Library hallway.

"We used 3DS Max with Particle Flow for extensions on the refugee camp," describes lead CG artist Brandon Davis. Particle Flow, which Davis helped design, is an event-driven particle system with a scripting component that builds realistic fog, snow, water and other natural effects in 3DS Max.

"The camp was only 300 meters wide and needed to be extended for miles. We added vehicles, people, huts, to the scenes replicating things that were in the original shot and repeating them out to the horizon in organized manner."

However, he says, the old days of tediously modeling and placing hundreds of objects by hand are gone. "Using Particle Flow we created a collection of objects that could be distributed procedurally into the shot and tweaked quickly."

Davis also developed a Particle Flow network that could place the objects in the scene and procedurally adjust various parameters on the fly to change the composition and layout of the objects. "The idea," he says, "is that each particle was given a region to be placed, the type of object it instanced, i.e. a vehicle, a person, or a building, and a behavior assigned to that object - parked neatly, parked chaotically, or driving away."
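That kind of procedural scattering can be sketched in a few lines of generic code; the example below uses hypothetical object types and behaviors (it is not Davis's actual Particle Flow network) to show how each particle gets a region, an instanced object type and a behavior, so the whole layout can be regenerated from a seed.

```python
import random
from dataclasses import dataclass

OBJECT_TYPES = ["vehicle", "person", "hut"]
VEHICLE_BEHAVIORS = ["parked neatly", "parked chaotically", "driving away"]

@dataclass
class ScatterItem:
    x: float
    y: float
    kind: str
    behavior: str
    rotation: float      # degrees; neat parking gets near-zero rotation, chaos gets any

def scatter_camp(region, count, seed=1):
    """Fill a rectangular region with instanced camp objects. Changing the seed or
    the count regenerates the whole layout in seconds instead of days of hand placement."""
    x0, y0, x1, y1 = region
    rng = random.Random(seed)
    items = []
    for _ in range(count):
        kind = rng.choice(OBJECT_TYPES)
        behavior = rng.choice(VEHICLE_BEHAVIORS) if kind == "vehicle" else "standing"
        rot = rng.uniform(-5, 5) if behavior == "parked neatly" else rng.uniform(0, 360)
        items.append(ScatterItem(rng.uniform(x0, x1), rng.uniform(y0, y1),
                                 kind, behavior, rot))
    return items

if __name__ == "__main__":
    # Extend the 300-meter practical set out toward the horizon.
    near_camp = scatter_camp((0, 0, 300, 300), count=200)
    far_camp = scatter_camp((0, 300, 300, 3000), count=2000, seed=7)
    print(len(near_camp) + len(far_camp), "instances placed procedurally")
```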

For the deep-freeze sequence Davis had to devise a way of creating the boundaries of the ice and dragging procedural textures through the geometry of the hallways. "We had to make it look like it was growing, but not coming out of nowhere," Davis explains. Visually, he says, they had to look closely at the scene, determine where cold air would seep into the hallway, set that as the source of the frost tentacles and match the growth to the movement in the shot.

"We couldn't just unwrap the geometry and texture map it. It was almost like a simulation in the way that it grew," he recalls. "Tharyn Valavanis wrote scripts to intelligently unwrap the hallway geometry so that we could grow procedural textures from exposed areas. In the view port you could see the general patterns and it was easy to crank out new versions."

"Tracking was the hardest thing we did," Davis remembers. "They shot the hallways scenes with a 10mm rectilinear lens with wicked distortion on the edges. We had to tweak the geometry on a frame-by-frame basis to get it to look right." Final compositing of all the shots was done in Adobe After Effects, he says.

Visual effects producer Petra Holtorf at Los Angeles-based Yu+Co (www.yuco.com) wrangled the creation of 65 visual effects for The Day After Tomorrow, including compositing several sequences such as the mall scene, sky replacements, and the final composite of Jack and Sam driving to the airport, which had been temp composited earlier by Peter Elliot.

In the story, Jack Hall is undeterred by the dense blizzard overtaking Washington. He and his colleagues, Jason Evans and Frank Harris (played by Dash Mihok and Jay Sanders), set out to rescue Hall's son by trekking across snow-covered Pennsylvania toward New York City. Inadvertently they travel across the glass roof of a shopping mall, which, like thin ice on a lake, cracks and breaks under Harris's weight. Hanging precariously from a single safety rope hooked on a ceiling beam, Harris, against the pleas of Hall, sacrifices himself by cutting the rope and plunging to his death.

That sequence involved compositing the bluescreen shots of the actors with miniatures of the mall. "The opening shot was a little difficult. The camera tilts down into the mall and we had to match the real beams with the actors to the miniature beams," says Holtorf. "Roland also thought the windows in the mall stores looked empty, so we took pictures of real store signs and composited those into the shot. We enhanced the scenes with practical snow elements and particle snow using Maya, and used Apple's Shake on a Macintosh system to composite the shots."

The interior car shots were shot static on a bluescreen stage without rain, she says. Later it was decided to add rain, so Yu+Co had to replace the car windows, adding rain to the composited scene.

"What's amazing to me," concludes Goulekas, "is that all the hard stuff on this show got done in four or five months." Even, she says, with the never-been-done-before required level of photorealism. "Photorealism and human flesh are about the two hardest things to do in CG. In this film it had to be photoreal. That's what it was all about. There was this incredible pride in the movie. Everyone seemed to have the attitude, 'We're going to make this the best damn thing anyone's ever seen.' And they all had that energy and enthusiasm. Roland and I were pushing to get that last 10 percent out of everybody. And they were all great. I have nothing but good to say about all the companies that came on. We were very lucky."