By Daniel Restuccio
Issue: November 1, 2004

POLAR EXPRESS


Jerome Chen's first impression of the project was that this "was going to be an amazing opportunity to create new techniques and technology."
CULVER CITY, CA - There are those who will say that you have not truly seen The Polar Express until you've seen it in IMAX 3D. The Warner Bros./Sony Pictures Imageworks movie, based on Chris Van Allsburg's children's book, has been adapted for the big screen by Oscar-winners Tom Hanks (Best Actor) and Bob Zemeckis (Best Director).

On their journey to realizing the story of a young boy's renewed faith in Christmas, Hanks, Zemeckis, visual effects supervisors Ken Ralston and Jerome Chen, and the team at Imageworks have created an astonishing cinematic hybrid that blends the imaginative beauty of animation with the subtle feel of a live-action film. The startling realism of the movie in 2D achieves an uncanny presence in IMAX 3D. The film opened on November 10.


Imageworks used Alias Maya, MotionBuilder and Maxon's Cinema 4D for animation work.
Hanks and Zemeckis felt a particular affinity for this beloved story, which they had read to their own children at Christmas. No stranger to films with complex effects and animation, director Zemeckis envisioned the project as a live-action movie with the look and feel of the original oil pastel illustrations. Zemeckis contacted his Back to the Future colleague Ralston at Sony Imageworks.

The group spent months brainstorming and testing different approaches, including a short-lived, live-action version with a paint filter applied and a version that combined live-action and CG. Eventually they leaned toward a completely computer-animated film, but one driven by motion capture.

"My first impression," says Chen, "was that this was going to be an amazing opportunity to create new techniques and technology and not be a traditionally-animated film."


MOTION CAPTURE
The technique of motion capture has evolved over the years as a cost-effective means of applying realistic motion to a CG model, versus the labor-intensive approach of keyframing. In basic terms, the actor performs a specific action in a defined space, wearing a special skin-tight suit fitted with sensors that are read by infrared cameras, which capture the positional X, Y and Z information of each sensor within that volume. However, in order to achieve the level of realism Zemeckis wanted, they would have to take that technology to a whole new level.
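To make the idea concrete, here is a minimal sketch, in Python, of how per-frame marker data of this kind might be represented and examined. The marker names, frame rate and values are illustrative assumptions only; this is not Vicon's actual data format or API.

```python
# Illustrative sketch of per-frame motion-capture data: each frame maps a
# sensor/marker name to its triangulated X, Y, Z position in the capture volume.
# Names, units and the 120 fps rate are assumptions, not Vicon's real format.
from math import dist

Frame = dict[str, tuple[float, float, float]]

def marker_speeds(prev: Frame, curr: Frame, fps: float = 120.0) -> dict[str, float]:
    """Rough per-marker speed between two consecutive frames, in units per second."""
    return {name: dist(prev[name], pos) * fps
            for name, pos in curr.items() if name in prev}

# Example: a single wrist marker moving 1cm along X between two frames.
f0 = {"left_wrist": (0.00, 1.20, 0.45)}
f1 = {"left_wrist": (0.01, 1.20, 0.45)}
print(marker_speeds(f0, f1))   # -> roughly {'left_wrist': 1.2}
```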

Enter Demian Gordon, seven-year motion capture veteran and mocap supervisor for The Matrix movies. In The Matrix: Reloaded and The Matrix: Revolutions, Gordon pushed the mocap envelope by presenting truly plausible digital doubles throughout the films. As mocap supervisor for The Polar Express he dropped the jaws of system provider Vicon when he conceptualized a mocap system with 64 cameras, almost three times the 24 he used on The Matrix.




Gordon built three mocap stages but mostly used the high-resolution space, a 10 x 10-foot cube that could accurately capture the complex motion of up to four interacting characters. What makes this distinct from The Matrix system, says Gordon, is that they could capture both body motion and facial expression in 360 degrees at the same time.

Zemeckis was so impressed with the first tests, done with Hanks performing one of the characters, that the idea was hatched for the versatile actor to play multiple roles, including the lead role of the eight-year-old boy. Hanks would ultimately play five roles: the boy, the conductor, the boy's father, the hobo and Santa. In scenes where Hanks played the child-size hero, oversize props and set pieces were used.


Mocap supervisor Demian Gordon requested Vicon's performance capture system feature 64 cameras - three times what he used while working on the Matrix films.
So began the logistical adventure of shooting the entire feature in a 10-foot cube. As the mocap cameras faithfully recorded the actual movement of the actors, 12 reference video cameras simultaneously recorded all scenes on the stage. Most of those cameras were set as wide-shot lock-offs; others were operated to get tight close-ups of the actors' faces. Co-cinematographer Don Burgess worked with Zemeckis on the basic blocking of the scenes; however, none of these videos were used to determine the actual shot angles and framing for the movie.

THE INTEGRATORS

The Polar Express did start like a traditional animated movie, with a Leica reel, an animated storyboard cut by Avid Film Composer editors Jeremiah O'Driscoll and R. Orlando Duenas to the voiceover dialogue of the actual cast members back in May of 2002. However, unlike a traditional CG movie, this animatic was not refined into the final CG shots, but was used more to budget the picture and as a writing tool for Zemeckis, who penned the script with William Broyles, Jr.

Once the script emerged, scenes were staged on the mocap stage. The editors took the video reference footage and built a "performance assembly," cutting together the actors' best takes into a complete version of the scene. For example, only when Hanks' performance as the conductor was cut together with his performance as the boy did the scene actually exist. Yet once assembled, it existed in a 360-degree virtual world. After Zemeckis approved the performance assemblies, they were turned over to "the integrators."


The integrators' job was to find the corresponding mocap data for each performance in every shot used in the scene, and then build a low-rez CG version of the scene, which was dubbed "Michelin Men" by Zemeckis. These motion-accurate, featureless, low-rez stick-figure models acted out scenes with inset pictures of their detached faces, taken from the reference video, floating above them. This is the point where the actual shooting of dailies began.
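A loose sketch of that matching step, in Python, might look like the following; the take names, frame ranges and file paths are hypothetical and stand in for Imageworks' actual pipeline.

```python
# Hypothetical bookkeeping for the integrators: pair each take used in the
# editors' performance assembly with its captured motion data, so a low-rez
# "Michelin Man" version of the scene can be assembled.
assembly = [
    {"character": "conductor", "take": "sc12_tk03", "cut_in": 101, "cut_out": 187},
    {"character": "hero_boy",  "take": "sc12_tk07", "cut_in": 101, "cut_out": 187},
]

mocap_library = {                        # take id -> captured body/face data
    "sc12_tk03": "/mocap/sc12/take03.c3d",
    "sc12_tk07": "/mocap/sc12/take07.c3d",
}

def integrate(assembly, library):
    """Attach the matching mocap file to every clip in the performance assembly."""
    scene = []
    for clip in assembly:
        if clip["take"] not in library:
            raise KeyError(f"no mocap data found for take {clip['take']}")
        scene.append({**clip, "mocap_file": library[clip["take"]]})
    return scene

for clip in integrate(assembly, mocap_library):
    print(clip["character"], "->", clip["mocap_file"])
```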

To give the cinematography the look and feel of live action, Imageworks built "the wheels room." Inside the wheels room, two operators, dubbed the digital grips, would quickly set rough keyframe camera moves along a spline in the CG world. Each of their computer systems was connected to a custom interface, a black box with large aluminum wheels, based on a remote Libra head created by Nick Phillips. Co-cinematographer Robert Presley would then re-shoot the move, using the wheels interface, imbuing it with the particular shooting style he knew Zemeckis wanted. "Since Polar Express was shot like a live-action movie, cut points were generally based on the video performance rather than the actual wheels room shot," says co-editor Duenas, adding that because the CG characters didn't get lipsynced faces right away, the reference video was the only way to tell what the actors were doing.
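As a rough illustration of what "keyframe camera moves along a spline" amounts to, here is a minimal sketch in Python that interpolates a camera path through a handful of position keys. The Catmull-Rom math and the keyframe values are assumptions for illustration, not the digital grips' actual tools.

```python
# Minimal sketch: smooth a rough camera move by running a Catmull-Rom spline
# through keyframed (x, y, z) positions.  Keyframe values are illustrative only.

def catmull_rom(p0, p1, p2, p3, t):
    """Interpolate one coordinate between p1 and p2 (0 <= t <= 1)."""
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

def sample_path(keys, steps_per_segment=24):
    """Return a smooth camera path through a list of (x, y, z) keyframes."""
    pts = [keys[0]] + list(keys) + [keys[-1]]   # pad ends so curve hits first/last keys
    path = []
    for i in range(1, len(pts) - 2):
        for s in range(steps_per_segment):
            t = s / steps_per_segment
            path.append(tuple(
                catmull_rom(pts[i - 1][a], pts[i][a], pts[i + 1][a], pts[i + 2][a], t)
                for a in range(3)))
    return path

# Example: a simple dolly move, sampled at 24 in-between positions per segment.
keys = [(0.0, 1.5, 10.0), (2.0, 1.6, 6.0), (4.0, 1.8, 3.0)]
path = sample_path(keys)
print(len(path), path[0])   # 48 sampled positions, starting at the first keyframe
```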

Once the sequence was close to being locked, it entered "the turnover stage." This is when the final Michelin Man shots were "turned over" to Imageworks for detailed CG work. Many meetings occurred to deconstruct each scene and determine the complexity and budget of every sequence, shot by shot.


Sony Pictures Imageworks used its own Birps software for lighting and Bonsai for compositing.
After shots were "turned over," fine integration was done: shots went back to integration, and all the high-rez mocap data was applied to the character models. In a regular animated film, all the props and costumes would be created from scratch. The Polar Express instead had all the traditional departments of a live-action feature; all the costumes, set designs and props were meticulously scanned into the digital domain, adjusted and set into the virtual scene as if dressing a real set.

Putting The Polar Express together over a two-and-a-half-year production cycle took the talents of 319 artists and production people.

THE TOOLS

Imageworks animators used Alias Maya and Maxon's Cinema 4D to build Michelin Man scene files and turnovers, and Alias MotionBuilder to blend mocap data in performance assemblies. Pixar's RenderMan was used for rendering, Imageworks' in-house software Bonsai was used for compositing, and its in-house software Birps was used for lighting. They also created a new, ultra-fast smoke-rendering software called Splat.

Bill Villarreal, Sony Pictures Imageworks VP of technical operations, says that in order to produce The Polar Express in both 2D and 3D, they brought online a renderfarm of over 1,200 processors, made up of Dell and IBM workstations, to crunch through the nearly 70TB of data from both versions. New to this production, he says, was access to two high-resolution workrooms that played back 1280 x 1024 images in realtime, with audio, on either a high-resolution monitor or a projector. In these rooms, dubbed sweatboxes, animators regularly reviewed their work with Ralston and Chen. An additional room was set up with two projectors equipped with polarizing lenses to simulate the 3D effect for the separate team that did the 3D version of The Polar Express. The studio employed Network Solutions for digital asset management.
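For a sense of what realtime playback at that resolution implies, here is a back-of-the-envelope calculation in Python; the 8-bit RGB depth and 24fps rate are assumptions for illustration, not Imageworks' actual playback spec.

```python
# Rough bandwidth estimate for the "sweatbox" playback rooms: 1280 x 1024
# frames played back in realtime.  Bit depth and frame rate are assumed.
width, height = 1280, 1024
bytes_per_pixel = 3          # 8-bit RGB, assumed
fps = 24                     # film rate, assumed

frame_bytes = width * height * bytes_per_pixel
stream_mb_per_s = frame_bytes * fps / 1e6
print(f"{frame_bytes / 1e6:.1f} MB per frame, about {stream_mb_per_s:.0f} MB/s sustained")
# -> roughly 3.9 MB per frame and ~94 MB/s sustained for uncompressed playback
```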

IMAX 3D

Imageworks producer John Clinton and digital effects supervisor Rob Engle oversaw a totally separate team of 70 artists, including camera artists, technical directors and effects artists, to transform The Polar Express into the first full-length IMAX 3D CG feature.

Clinton and Engle's team would re-enter the finished Maya scene files and attach two additional cameras, one to either side of the actual camera used to shoot the scene. The spacing between the cameras becomes a parameter they can animate.
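A minimal sketch of that idea in Python might look like the following: derive a left and right camera by offsetting from the original camera, with the separation animatable per frame. The offset math, parameter names and values are illustrative assumptions, not Imageworks' actual stereo rig.

```python
# Sketch of a stereo camera pair derived from the original ("hero") camera,
# with the eye separation animatable per frame.  All values are illustrative.

def stereo_pair(cam_pos, right_axis, interaxial):
    """Return (left, right) camera positions offset along the camera's right axis."""
    half = interaxial / 2.0
    left = tuple(p - half * r for p, r in zip(cam_pos, right_axis))
    right = tuple(p + half * r for p, r in zip(cam_pos, right_axis))
    return left, right

# Animate the separation per frame: wider for deep vistas, narrower for close-ups.
interaxial_by_frame = {1: 0.065, 2: 0.060, 3: 0.050}   # metres, assumed
hero_pos = (0.0, 1.5, 4.0)
right_axis = (1.0, 0.0, 0.0)                            # camera's local +X, assumed

for frame, sep in interaxial_by_frame.items():
    left, right = stereo_pair(hero_pos, right_axis, sep)
    print(frame, left, right)
```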

"The way you see things in 3D in a theater is driven by that separation between the right and left eye cameras, " says Hugh Murray, IMAX VP for technical production. One of the reasons the CG version works so well, he says, is that that separation is precisely adjustable for every frame of a shot for the optimal 3D effect.

Clinton says that even though they were simultaneously building two additional movies, the process went very smoothly. They had their own separate team, their own renderfarm and a lot of storage. "There were very few changes to address the IMAX screen," echoes Engle.

"We reduced the depth of field on most of the shots," says Engle, "to deepen the focus and allow the eye to wander and be immersed. There were a number of shots that needed to be slightly recomposed to account for the larger, more immersive size of the IMAX screen. For example, when the boy is reading his note from Santa, some elements crowded the edge of frame."

IMAX took the 290,000 2K frame files produced by Imageworks and put them through their proprietary DMR (digital re-mastering) system. DMR has two component processes: one takes out the film grain and the other converts the image to 4K resolution. The Polar Express had no film grain, so they simply up-rezed the 2K files to 4K and applied the custom sharpening tool to get the detail back. They then recorded the 4K files to 65mm negative on a Management Graphics Solitaire film recorder and struck all their 15/70mm prints from that negative.
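A rough analogue of that up-rez-and-sharpen step, sketched in Python with Pillow, is shown below; Lanczos resampling and an unsharp mask stand in for IMAX's proprietary DMR processing, and the file names are hypothetical.

```python
# Rough stand-in for the two steps described above: scale a 2K frame up to 4K,
# then sharpen to recover detail.  This is NOT IMAX's DMR algorithm.
from PIL import Image, ImageFilter

def up_rez_frame(src_path: str, dst_path: str) -> None:
    frame = Image.open(src_path)                       # e.g. a 2048 x 1556 "2K" frame
    w, h = frame.size
    big = frame.resize((w * 2, h * 2), Image.LANCZOS)  # double the resolution to ~4K
    sharp = big.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))
    sharp.save(dst_path)

# Example (hypothetical file names):
# up_rez_frame("polar_0001_2k.tif", "polar_0001_4k.tif")
```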

Sound playback in an IMAX theater is experienced over a 12,000-watt, uncompressed, six-channel surround system. Each theater has up to 44 proportional point-source speakers that can accurately move sound left to right, backward and forward, and up and down. At Skywalker Sound, Murray spent a week supervising the remix of The Polar Express soundtrack to further enhance the 3D feel. The remixers were Gary Summers and Dennis Leonard.