Issue: Animation - May 30, 2003

EFFECTS COMPANIES JOURNEY TO CENTER OF THE EARTH

LOS ANGELES - Imagine this... as the spinning center of the Earth slows, the planet's rotation is affected and the equilibrium of the environment is thrown out of whack. If it continues it will mean the destruction of the entire planet. The only hope rests with a group of daredevils traveling to the center of the Earth to give it a jumpstart, sort of like the Earth's personal AAA service.

This is the premise for the Warner Bros. feature "The Core," which gave a variety of visual artists the chance to create a completely new "under" world for audiences. With very little existing scientific data to go on, the film's creators, including director Jon Amiel and visual effects supervisor Greg McMurray, were free to create their own science fiction.

"The hardest thing was creating the overall look for this underworld," notes Chris Bond, president/visual effects supervisor at Winnipeg, Manitoba's Frantic Films, which did much of the film's previs work and played an integral part in creating the look of this world. "When you set out to make a space movie, there's been hundreds of space movies and everyone knows what space looks like. But no one's ever been to the center of the Earth and no one knows what it looks like. So how do you make it look believable?"

TRANSPARENT ROCK?

To make "The Core" visually appealing, instead of mimicking what little is known about the center of the Earth - hard rock and darkness - the film's creative team designed a fluid, transparent and luminous world.

"What Jon and Greg both wanted was something stylized in the sense that as we get progressively deeper into the Earth, they wanted to have more freedom with the camera and to see greater distances," explains Bryan Hirota, visual effects supervisor at CIS Hollywood (www.cishollywood.com). "From a storytelling point-of-view that would be more interesting than playing it completely realistic, where you wouldn't be able to see a thing."

Using Alias|Wavefront Maya, CIS Hollywood created the underground vehicle called Virgil, which takes our heroes through the Earth, and provided cross-sections of the craft so the production team could construct a full-scale rendition of the ship for shooting live-action sequences. CIS also helmed the effects of Virgil burrowing through the different layers of earth. To create the world of molten lava, CIS developed techniques involving fluid dynamics and volumetric rendering to show the flow of the semitransparent liquid and the wake that Virgil leaves as it travels. As the ship goes deeper, this volumetric rendering gave the liquid more transparency, allowing the director greater freedom to pull the camera back.

CIS used Maya for previs and lockdown of the animation and worked with a company called Flow Analysis (www.flowanalysis.com) to achieve fluid simulations for different shots. The fluid simulations were rendered in Jig, the volumetric/particle renderer made by Steamboat Software. The more traditional, hard surfaces, like the thin outer crust of solid rock, were rendered in Entropy. A majority of the compositing was done with Apple's Shake and some with Discreet's Inferno.
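The article doesn't detail the math CIS used, but a common way to model a medium that grows clearer with depth is Beer-Lambert absorption with a depth-dependent extinction coefficient. A minimal sketch, with every constant invented purely for illustration:

```python
import math

def transmittance(path_length_km, extinction):
    """Beer-Lambert law: fraction of light surviving a path through
    a homogeneous absorbing medium."""
    return math.exp(-extinction * path_length_km)

def extinction_at_depth(depth_km, surface_extinction=0.5, falloff_km=1000.0):
    """Hypothetical depth schedule: the medium's extinction drops as the
    ship descends, so the camera can see farther -- the stylized choice
    described in the article, not real geophysics."""
    return surface_extinction * math.exp(-depth_km / falloff_km)

# The same 10 km sight line passes far more light at greater depth.
shallow = transmittance(10.0, extinction_at_depth(0.0))
deep = transmittance(10.0, extinction_at_depth(3000.0))
```

With this schedule, a camera near the surface sees almost nothing past a few kilometers, while the same shot deep in the mantle reads clearly, which is the storytelling freedom Hirota describes.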

Additionally, to create the blinding appearance of the outer layer that resembles nebulae and galaxies surrounding the Earth's core, CIS worked with an independent software developer named Doc Bailey, whose proprietary renderer Spore generated the ephemeral images that were later composited around the outer core. CIS used a 25-node Linux Networx renderfarm for the majority of rendering, and the studio is powered by Linux-based Boxx Technologies workstations.

GEODES & LAVA

Frantic Films (www.franticfilms.com) designed many of the action sequences in the film, including the geode sequence, where the ship travels into a giant cavernous sphere laced with millions of giant crystals. Virgil flies into the geode, crashes through the crystals and shoots a laser in an attempt to break up the larger ones. The laser beams bounce off the crystals, multiply and refract to other crystals, illuminating each crystal for a moment.

"To hand animate that would be incredibly tedious, so we wrote a bunch of procedural stuff within 3DS Max to do all that," explains Bond. "We also built a rendering pipeline to render all those gigantic crystals."
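Bond doesn't describe the procedural system itself; one plausible shape for it is a breadth-first bounce that lights each crystal it hits and spawns a few reflected/refracted child rays, recording the bounce depth so the flashes can be sequenced over time. A toy sketch, in which the crystal "geometry" and branching factor are stand-ins rather than anything from Frantic's actual pipeline:

```python
import random

def trace_laser(start, crystals, max_depth=3, branch=2, seed=1):
    """Toy procedural bounce: each hit illuminates a crystal and spawns
    a few child rays toward other crystals, breadth-first, recording the
    earliest bounce depth so illumination can be keyed in sequence."""
    rng = random.Random(seed)
    lit = {}  # crystal index -> earliest bounce depth at which it lights up
    frontier = [(start, 0)]
    while frontier:
        idx, depth = frontier.pop(0)
        if idx not in lit:
            lit[idx] = depth
        if depth >= max_depth:
            continue
        # pick a few other crystals as reflection/refraction targets
        candidates = [i for i in range(len(crystals)) if i != idx]
        targets = rng.sample(candidates, k=min(branch, len(candidates)))
        frontier.extend((t, depth + 1) for t in targets)
    return lit

crystals = list(range(12))  # stand-in for actual crystal positions
lit = trace_laser(0, crystals)
```

The per-crystal bounce depth is exactly the kind of data a procedural setup would hand off to shading, so each crystal flashes at the right moment without any hand animation.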

At the end of the sequence the ship punches a hole in the geode causing an enormous eruption of lava, which begins to fill up the geode and puts our heroes in peril. "That became a huge beast and we had to write more code for Max to handle it."

In addition to 3DS Max, Frantic used Maya for some of the modeling and animation, Eyeon's Digital Fusion for compositing, 2d3's Boujou for tracking, Realviz's ReTimer to slow down a few shots, and Exluna's Entropy and SplutterFish's Brazil for rendering.

DEEP BLUE SEA

Before digging into the Earth's crust, visual effects house Creo Films had to get Virgil launched from an oil pumping station in the middle of the ocean, down through the depths of the water to the ocean floor.

For the launch site, the production shot in Vancouver and Creo extended the set and added CG water and rain to put the launch platform in the middle of the ocean. During the long journey through the water, the crew encounters a group of whales, who flee when an underwater earthquake causes a whirlpool that sucks the ship downward. Whale animation was done with Maya, compositing in Inferno and all rendering was done in NewTek LightWave. As the ship travels down, billions of tiny air bubbles emanate from it. Creo used Houdini to generate the bubbles and then used Arnold software for rendering. Arnold works with 3DS Max but not with Houdini or LightWave, so Creo created a custom pipeline to make the two systems work together.

BRIDGE OVER TROUBLED WATER

As the Earth's core begins to slow, the environment goes haywire. In one sequence, a beam of sunlight passes, unfiltered, to the Earth's surface over the San Francisco Bay at the base of the Golden Gate Bridge. The intense beam of light causes the water to boil and the bridge to break apart, killing hundreds of unfortunate commuters. Computer Café (www.computercafe.com) in Santa Maria, CA, worked with New Deal Studios (formerly Hunter/Gratzner Industries), which built a miniature of the bridge. "You still can't beat reality, and the miniature gave us a better look to add to for the close-ups," says digital effects supervisor Jeff Goldman.

Goldman and visual effects supervisor Scott Gordon used LightWave to build the bridge and to map plates into a 3D environment, and Digital Fusion to composite the CG model, miniature, live action and effects sequences together. Boujou was used for tracking. As the sunbeam, created with LightWave, hits the water, steam and bubbles start emanating before the water reaches a rolling boil. 3DS Max was used for the steam and small bubbles and Maya for the larger churning of the water. Computer Café is based around IBM Intellistations and Boxx workstations with a renderfarm made up of custom solutions and render nodes from Boxx.

THE SPACE SHUTTLE

Rising Sun Pictures (www.rsp.com), with offices in Sydney and Adelaide, also provided a significant amount of work on the film, including the entire sequence of the space shuttle flying, re-entering the Earth's atmosphere and landing in water.

The production team in the US filmed various plates of the baseball stadium and parts of the LA River. These plates were used for the sequence where the shuttle nears the ground. All the elements for the space and upper atmosphere parts of the sequence were created in post.

Rising Sun built the shuttle and the streams of ionized gasses around the shuttle during its re-entry into the atmosphere. Meanwhile, another group at Rising Sun was concentrating on doing the R&D for all the water interaction that would be needed once the shuttle touched down.

"When it came to adding all the wakes etc. to the LA river shuttle landing we had a blank slate … almost literally," says Tim Crosbie, visual effects supervisor at Rising Sun. "The plates that were shot were basically dry except for a few puddles on either side of the recessed water in the center. So pretty much all of the water you see in the sequence was generated by us."

The process started with tracking all the moving plates in 3D using a combination of Boujou and Maya Live, then creating a basic model of the river in Softimage|XSI using measurements and other data sent over from production in the US. Next up, they created a low rez shuttle and animated this with feedback from McMurray to get the choreography right. Once a shot was approved for animation they lit the shuttle and then sent final renders to Sydney for compositing. Senior 3D artist Ben Paschke built and textured the shuttle and animated all of the shots with fellow animator Timothy Kings-Lynne. The final renders were global illumination renders through XSI and Mental Ray that were rendered out and then "baked" back on the model for a motion blur pass. This process achieved the realism of global illumination without the problems of trying to render motion blur at the same time.
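The payoff of baking is that the expensive global-illumination evaluation runs exactly once per vertex, while the motion-blur pass only looks the stored result up at each shutter sample. A schematic sketch of that division of labor, in which the GI function, geometry and motion are all placeholders rather than Rising Sun's actual setup:

```python
calls = {"gi": 0}  # count how often the expensive shader actually runs

def expensive_gi(vertex):
    """Stand-in for a costly global-illumination evaluation."""
    calls["gi"] += 1
    x, y, z = vertex
    return 0.5 + (x + y + z) % 1.0  # fake radiance value

def bake(vertices):
    """Evaluate GI once per vertex and store the result ('baking')."""
    return [expensive_gi(v) for v in vertices]

def motion_blur_render(vertices, baked, shutter_samples=8):
    """Cheap pass: at every time sample the geometry moves, but shading
    is a lookup into the baked values -- GI is never re-evaluated."""
    frames = []
    for s in range(shutter_samples):
        t = s / shutter_samples
        moved = [(x + t, y, z) for (x, y, z) in vertices]  # sliding in x
        frames.append(list(zip(moved, baked)))
    return frames

verts = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5)]
baked = bake(verts)
frames = motion_blur_render(verts, baked)
```

However many shutter samples the blur needs, the GI call count stays fixed at one per vertex, which is the problem the baking step solved.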

While Paschke and Kings-Lynne were concentrating on locking down the animation, Jason Madigan took on the task of building all of the space views of Earth, using a considerable amount of photographic footage to get the desired look.

Madigan built the Earth in 3D and then applied hand-built textures at resolutions up to 10K. The final touches involved putting a volumetric cloud layer in place, which required building a fractal noise shader for generating cloud maps and a technique for pushing the resolution of images by building fractal noise into the subsampling that occurs when you zoom in.
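The article doesn't give the shader itself, but the standard construction for this kind of map is fractal Brownian motion: octaves of lattice noise summed at doubling frequency and halving amplitude, so zooming in keeps revealing plausible detail past the painted resolution. A self-contained sketch (the hash constants and octave settings are generic choices, not Rising Sun's):

```python
import math

def value_noise(x, y, seed=0):
    """Cheap lattice value noise: hash the four integer corners of the
    cell and bilinearly interpolate between them."""
    def h(ix, iy):
        n = ix * 374761393 + iy * 668265263 + seed * 144269
        n = (n ^ (n >> 13)) * 1274126177
        return ((n ^ (n >> 16)) & 0xFFFFFFFF) / 0xFFFFFFFF
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    # smoothstep fade avoids visible creases at cell edges
    ux, uy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    top = h(ix, iy) * (1 - ux) + h(ix + 1, iy) * ux
    bot = h(ix, iy + 1) * (1 - ux) + h(ix + 1, iy + 1) * ux
    return top * (1 - uy) + bot * uy

def fbm(x, y, octaves=5, lacunarity=2.0, gain=0.5):
    """Fractal Brownian motion: each octave doubles the frequency and
    halves the amplitude, producing detail at every scale."""
    amp, freq, total, norm = 1.0, 1.0, 0.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq, y * freq)
        norm += amp
        amp *= gain
        freq *= lacunarity
    return total / norm  # normalized into [0, 1]
```

Sampling `fbm` at finer and finer coordinates as the camera zooms is what lets a cloud map "push" beyond the resolution of its source texture: the high-frequency octaves fill in where painted pixels run out.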

The 3D pipeline was split into three streams for the actual water generation: wakes; water (including reflection, refraction and debris); and visual effects elements such as individual splashes, plumes of vapor and misty bits that just hang there and interact with the vortices trailing behind the shuttle.

All the elements were kept as simple as possible so that once they were composited together in Shake any simple changes didn't necessarily mean another 3D render. Because of the care taken with the lighting, they usually found that a simple 2D tweak was all it took to get the water to "sit" correctly.

One thing that really enabled the 3D and the 2D to gel so easily was the use of CineSpace, a piece of monitor calibration software that was written by Rising Sun a few years ago and is now sold through sister company Rising Sun Research. It's designed to allow the user to see (as close as is possible) a representation of what one would see on film on a CRT monitor. All of the monitors in both facilities were calibrated with this to enable the 3D and the 2D artists to see the final product regardless of whether they were working in linear or logarithmic space. "I have a feeling that without this the whole filmout process might have been a lot more ‘interesting,'" says Crosbie.
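The article doesn't name the log encoding, but film scans of the era conventionally used the Kodak Cineon 10-bit log format, with reference black at code 95 and reference white at code 685. As an illustration of the linear/logarithmic split the 3D and 2D artists had to bridge, here is a sketch of the conventional Cineon-to-linear conversion (treat the constants as the commonly published defaults, not anything specific to CineSpace):

```python
BLACK, WHITE = 95, 685          # Cineon reference black/white code values
DENS_PER_CODE = 0.002           # printing density per code value
NEG_GAMMA = 0.6                 # nominal negative gamma

def cineon_to_linear(code):
    """Convert a 10-bit Cineon log code value to scene-linear light,
    normalized so code 685 maps to 1.0 and code 95 maps to 0.0."""
    gain = lambda c: 10 ** ((c - WHITE) * DENS_PER_CODE / NEG_GAMMA)
    offset = gain(BLACK)        # linear value of reference black
    return (gain(code) - offset) / (1 - offset)
```

A 3D artist rendering in linear light and a 2D artist viewing log-encoded scans are looking at very different numbers for the same image; calibrating every monitor through one film-emulation transform is what let both groups judge the final filmed result directly.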