Issue: July 1, 2008


James Mason, Arlene Dahl, Pat Boone. Pat Boone? Fifty years ago Henry Levin filmed the stars attempting to sail a flimsy raft on a mysterious subterranean sea somewhere deep within the Earth's murky bowels.

It's a nifty premise, but it has taken five decades of filmmaking and technological advances to overtake Levin's CinemaScope Journey to the Center of the Earth and bring Jules Verne's classic tale back to theaters. Today we have the affable Brendan Fraser taking over for the chilly Mason, as well as feats of visual effects that would have been unimaginable in 1959.

Recently, however, Eric Brevig, visual effects supervisor on a long list of action and fantasy films over 16 years at ILM and, before that, at DreamQuest Images, could not perform that key role. He's been too busy directing his first feature film: Journey to the Center of the Earth 3D. On Journey 3D Brevig turned to a VFX supervisor he knew well: Chris Townsend, who was his associate supe on many films and who maintains "good shorthand communications" with the director.

Because Brevig was both designing sequences for the new film and planning how to shoot them, he naturally planned them to be effects-friendly. "It kills me to see things done inefficiently," he says. "The effort should show up on the screen and not be someone rotoscoping someone's hair for three weeks." A director without Brevig's experience (The Island, The Day After Tomorrow, Signs, Wild Wild West, Men in Black, The Abyss) might call for a shot that, "if it were six inches to the right, would cost half as much money."


Chuck Shuman, Brevig's bluescreen DP for 20 years, shot the new Journey in stereo using two Sony HD F-950s bound together just so. Shuman is an "amazingly talented lighting cameraman" who, with Brevig, worked with James Cameron on The Abyss.

After Titanic, Cameron built underwater dual-camera rigs with his engineering partner, Vince Pace. "Those were the most advanced stereo camera rigs" when Brevig started prepping Journey two and a half years ago. Brevig told Cameron, "I'd love to use your rig, but I'm going to need a smaller version because I'm going to be in tight places." The Pace/Cameron team obliged by quickly finishing a smaller rig just in time for the start of shooting in Montreal.

On set, the compact, twin F-950s had their lenses and optical block (on which the image is actually formed) out-rigged via a fiber-optic umbilical, separated by as much as 400 feet from all the camera electronics and recording mechanisms.


The director's station had two HD monitors. Brevig and Shuman shot both an A and a B twin-camera rig on set — so that's four images. "I knew what the 3D looked like," Brevig says, "I just needed to see what the cameraman was framing." The director's flat-screen HD monitors were "wonderful" because "I could judge focus and see what we were getting very clearly." Video engineer Fred Meyers recorded a direct master of each individual camera at 4:4:4 and made a simultaneous Sony SR-W version at 4:2:2.

Inter-ocular distance, the spacing between the two cameras' lenses, is adjustable, and Brevig would go for distances ranging from three inches down to a mere half-inch. "If you're shooting miniatures, you want to reduce the distance between the lenses because that reinforces the scale of what you are shooting as being big. This is used in a scene that would otherwise cause eye strain if you have objects close to camera and very distant from camera." Reducing the inter-ocular distance can make the image onscreen, including a close-up of Fraser, more comfortable to look at.
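To see why a tighter rig helps, consider the approximate geometry of a parallel twin-camera rig: on-sensor disparity grows with the lens spacing and shrinks with subject distance. Here is a back-of-the-envelope sketch in Python (all values hypothetical, not the production math):

```python
# Illustrative geometry only (not the production math): for a parallel
# twin-camera rig, horizontal disparity on the sensor is roughly
# focal_length * interocular / subject_distance. Halving the inter-ocular
# halves the disparity, which is why a tighter rig suits close subjects.

def disparity_mm(focal_length_mm, interocular_mm, subject_distance_mm):
    """Approximate horizontal on-sensor disparity for a parallel rig."""
    return focal_length_mm * interocular_mm / subject_distance_mm

# A close-up at 1 m with a hypothetical 35 mm lens:
wide = disparity_mm(35.0, 76.2, 1000.0)   # ~3-inch (76.2 mm) inter-ocular
tight = disparity_mm(35.0, 12.7, 1000.0)  # ~half-inch (12.7 mm) inter-ocular
print(wide, tight)                        # tight rig yields 1/6 the disparity
```

The ratio of the two results is exactly the ratio of the inter-ocular distances, which is why narrowing the rig is such a direct lever on viewer comfort.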


Brevig is happy that Journey 3D will be brought to theaters by RealD. The company "has the largest market share in the US," he says. There are currently at least 2,000 RealD theaters set up in the US, and a recent deal with the Regal Entertainment Group will up the total to 3,500 in the near future.

Matthew Cowan is chief scientific officer at Beverly Hills-based RealD, and he cites brightness level as a prime reason for RealD's acceptance: "The XL system doubles the brightness of the projected image, allowing use of lower lamp power or larger screens — or both — for images that meet the studio-required brightness." RealD uses a high-gain silver screen to help double the brightness of projected images and employs the full DCI-compliant color gamut. RealD also promises audiences will see "ghost free" pictures using inexpensive, single-use 3D eyewear. [For more on the impact of RealD on the stereo market, see news story on page 12]

Competing stereo display systems from Dolby and XpanD will also present Journey 3D on their own 3D platforms in theaters, mainly in Europe, while RealD has 3D screens in theaters in 24 countries.
Brevig stresses that stereo films are also typically released in 2D — just the right eye — and he was careful while directing and editing to make sure that Journey held up as a 2D movie as well.


Brevig worked with an editing team comprising Steven Rosenblum, Dirk Westervelt and veteran Star Wars Avid editor Paul Martin Smith. Smith "came well armed because of his experience in visual effects and working in HD," says Brevig. Of the editing process, he adds, "We looked at all the dailies every day in stereo. Then we cut it in 2D and, when it was all done, we conformed and viewed it in stereo just to see how it all played. Basically we had very few changes to make once we saw it in 3D."


"Stereo complicates all visual effects issues because VFX fool the eye," Brevig says.  "In stereo you can see in depth any cheats that you might be trying to get away with. If you're making a synthetic ocean surface, it's made up of from five to 10 water simulation layers. Each of those all have to be exactly in the right Z space." Of Frantic Films' VFX work, Brevig says, "the images they created were wonderful but layering them so that they lined up with the live action and all the other elements of the composite, I think, proved to be a real challenge to everybody who works in stereo."

Chris Harvey, Randal Shore and Michael Shand work for Frantic Films VFX, Harvey in the company's Vancouver shop and Shore and Shand in the Winnipeg facility. Shore is a VFX producer and Harvey is VFX supervisor and head of 3D. Along with Shand, Frantic's Winnipeg-based VFX supervisor, the three were instrumental in bringing Journey's 133 raft-at-sea shots to the screen in stereo. A total of about 60 Frantic artisans and technicians worked on the film at the two locations.

The Frantic crew was not on set for Journey's principal photography; they came on a little later in the process. The VFX are all-Canadian (and enjoy Canadian tax breaks). Montreal shops Hybride and Meteor Studios first got the VFX nod from Brevig and, once the filmmakers realized how crucial CG water and waves would prove in the film's 4.5-minute storm-at-sea sequence, they sought out Frantic. The company also develops and markets Frantic Films Software, including titles meant for watery particle work, such as Flood and Krakatoa; others, like Awake, helpful in stereo work; and Deadline, an administration and rendering toolkit for Windows-based renderfarms. (Their CG animation software tool of choice is Autodesk's 3ds Max.)

Frantic's storm-at-sea work includes: creating highly detailed seas (all CG) that surge around the practical, storm-tossed raft that bears the live actors; match-moving the waves to the motion of the raft; creating the creatures (all CG) that come crashing out of the water (including one that flops on deck); the stormy sky; and the painstaking addition and replacement of rain (in this case both practical and CG) that strikes the waves and soaks the subterranean travelers. The storm's innumerable raindrops had to be made to behave properly in stereo space or they could commit visual violations like suddenly disappearing behind an actor's head. Throw aboard the raft's CG sail and ropes and you've got a lot of digits.
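The depth bookkeeping behind well-behaved rain can be pictured as a per-pixel holdout: a CG raindrop survives the composite only where it is nearer the camera than the live-action plate. A toy numpy sketch (an assumed illustration, not Frantic's actual pipeline):

```python
import numpy as np

# Toy per-pixel holdout (assumed for illustration, not Frantic's pipeline):
# a raindrop is kept only where its depth is nearer than the plate's depth,
# so a drop can never be visible where an actor's head should occlude it.

def composite_rain(plate, rain_rgb, rain_alpha, rain_depth, plate_depth):
    visible = (rain_depth < plate_depth)[..., None]   # True where rain is nearer
    a = rain_alpha[..., None] * visible               # holdout kills hidden drops
    return rain_rgb * a + plate * (1.0 - a)

plate = np.zeros((2, 2, 3))                 # dark live-action plate
plate_depth = np.array([[100.0, 100.0],     # top row: distant sea
                        [2.0, 2.0]])        # bottom row: actor close to camera
rain_rgb = np.ones((2, 2, 3))               # white rain streaks
rain_alpha = np.ones((2, 2))
rain_depth = np.full((2, 2), 10.0)          # rain 10 units from camera

out = composite_rain(plate, rain_rgb, rain_alpha, rain_depth, plate_depth)
# Rain shows over the distant sea but is held out behind the actor.
```

Run once per eye with that eye's depth, and a drop can no longer vanish in one eye while lingering in the other.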

The sea was considered a character unto itself, interactively bucking the raft around in stereo space, but an unexpected practical problem added to the degree of digital difficulty: back on the set, the motion control gimbal used for rocking the raft broke down, so crew members hid behind and beneath the raft and jostled it physically as best they could. You can't use a stock sea for something like this. Journey's raging sea was completely designed and executed using Frantic's simulation R&D, fluid dynamics, particle spray and more at the Winnipeg facility, Shore says. The CG waves were made to "perform" and look real via "art directable" simulation tools that made the waters seemingly drag the raft to and fro.
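An "art directable" sea can be thought of as a surface whose parameters are artist keys rather than pure physics. A minimal sketch (an assumed form, not Frantic's actual simulation): sum a few artist-tuned sine waves and read the raft's rock straight off the local surface slope.

```python
import math

# A toy "art directable" wave field (assumed form, not Frantic's R&D):
# the sea surface is a sum of sine waves whose amplitudes, wavelengths
# and speeds are artist-keyed parameters, and the raft's pitch is read
# directly off the surface slope beneath it.

def sea_height(x, t, waves):
    """waves: list of (amplitude, wavelength, speed, phase) tuples."""
    return sum(a * math.sin(2.0 * math.pi * (x / wl + s * t) + ph)
               for a, wl, s, ph in waves)

def raft_pitch(x, t, waves, dx=0.01):
    """Approximate raft pitch (radians) from the local surface slope."""
    slope = (sea_height(x + dx, t, waves) - sea_height(x - dx, t, waves)) / (2.0 * dx)
    return math.atan(slope)

storm = [(1.5, 40.0, 0.8, 0.0),   # hypothetical swell key
         (0.4, 7.0, 2.0, 1.3)]    # hypothetical chop key
print(raft_pitch(0.0, 2.0, storm))
```

Because each tuple is just a keyable control, an artist can exaggerate a swell or retime the chop without re-running a physical solve — the "directable" part of the idea.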

One of the storm's travails is a school of 150 or more prehistoric razorfish that crashes through the waves threatening our sailors. One particularly toothy individual even flops aboard and Fraser's character has to wrestle the fiendish thing over the side. In the height of live actor/CG creature interaction, director Brevig had Fraser grapple with various props to achieve various camera angles, including a foam-rubber football and a stick with a weight on the end (to represent holding the fish by its tail). The razorfish were designed with gnashing, deep-sea teeth and a bioluminescent glow, and Frantic's 3D team brought that to life via 3DS Max, Autodesk Mudbox and Pixologic ZBrush. With lots of proprietary scripting: "We have our own muscle tools, skin-simulation software," Harvey says, adding "the pipeline is all custom-scripted." The 3D team, led by Chad Wiebe, featured lead modeler and texture artist Jelmar Boskma on both the fish and the plesiosaurs. All 3D artists and other contributors to the creature pipeline worked under Harvey's supervision. 

The razorfish, we soon see, are actually fleeing a voracious pack of giant plesiosaurs (as many as seven long-necked, sea-going, carnivorous reptiles in some shots) bounding through the waves after the fish. Frantic's work on the shots with the razorfish and plesiosaurs (which they were asked to re-design so they appeared more fearsome) was so pleasing that the team was awarded a last-minute additional creature: a prehistoric trilobite, designed by Juliana Kolakis, appears in scientist Fraser's dream early in the film.

Frantic's two locations (and there's a third Frantic in LA) communicate as freely as if they were across the hall from each other using CineSync. And Frantic's proprietary system, Project Flow, is highly important as a dailies and tasking management system. "We are able to assign tasks, submit dailies and review dailies all through this system," Shore says, "which also synchronizes the digital files to each office so the supervisor, wherever they are, can review the artists' work wherever they are. We use this in conjunction with CineSync for actual conference call reviews." Along with Frantic's proprietary software tools the team also uses both SynthEyes and Boujou for camera tracking.

For compositing, Frantic's longtime workhorse has been Eyeon's Digital Fusion. Sean Konrad was 2D technical lead on Journey 3D and has since been made Frantic's pipeline designer. "One of the great points of Fusion on this show was the way we were able to use it in ways that weren't intended to smooth the process of comping in stereo — both actively on projectors and passively using anaglyphs and other stereo viewing methods," he says. "It's a testament to the company's existing tools, the interface design and the low-level SDK accessibility/script automation that we were able to jam a more-than-capable stereo toolset into an application whose primary function was conceived well before the resurgence of stereoscopic cinema."

For stereo production, Frantic's Michael Shand is credited with conjuring up the Fusion-based "stacked images" approach rather than the more common one-eye-at-a-time mode. "Stacking allows us to work in stereo at every stage in the process while simultaneously limiting the stress and management of dealing with two separate eyes," Shand says. "By keeping the eyes together in the workflow, it allows the compositor to have constant access to how the eyes interact with each other, which is very important. Comparing eyes for differences in color, roto and inter-ocular distance is a necessity and keeping the eyes paired all along the way makes this very simple."
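The stacking idea can be illustrated in a few lines (a numpy toy, assumed for illustration; Frantic's real implementation lives inside Fusion): both eyes travel through the comp as one tall frame, so an operation applied once cannot drift between eyes.

```python
import numpy as np

# A numpy toy of the "stacked images" approach (illustrative only):
# the left and right eyes travel through the pipeline as one tall frame,
# so any per-pixel operation applied once affects both eyes identically.

def stack_eyes(left, right):
    return np.vstack([left, right])        # one frame, eyes kept together

def split_eyes(stacked):
    h = stacked.shape[0] // 2
    return stacked[:h], stacked[h:]

left = np.full((4, 6, 3), 0.5)             # hypothetical left-eye plate
right = np.full((4, 6, 3), 0.5)            # hypothetical right-eye plate
graded = stack_eyes(left, right) * 0.8     # a grade applied once hits both eyes
left_out, right_out = split_eyes(graded)   # eyes cannot drift apart in color
```

Splitting back out only happens for final delivery or for side-by-side eye comparison, which is exactly when Shand says the differences between eyes need checking.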


Frantic made its own rain, and Brevig and company made rain on the set the old-fashioned way. The question was how would the two match up? "You'd have to key out what you could and replace a lot of rain," Harvey says, and a good bit of R&D time went into solving this. "It was amazing how much the rain added to the stereo aspect of the show. It was so cool because you could see this volume of rain extending back to the horizon and raindrops right in front of your face."

Frantic created a "nonlinear creature pipeline" that was a big help when creature design changes were made very late in the game to achieve scarier dinosaurs. "They revamped both the style of the fish motion and the plesiosaurs' look," Harvey says. "Because of the system we'd devised, you could animate, rig, model and texture all independent of each other at the same time and they would all flow back into the final render. We redesigned the dinosaur, pushed ahead on the animation and, within a week, we had a new concept approved, a new model done, and it was in shots. We didn't lose animation we'd already done; it just fed back into itself."

"This is a stereo movie," Harvey says. "They shot it in stereo and stereo is how it was intended to be seen. It's great — I like the aspect of being immersed in the film!"