Preparing for Stereo 3D
Issue: November 1, 2010

Although stereo 3D is gathering momentum, it appears to be “a bit like the Wild West” to those getting into it for the first time, according to Seth Henrikson, co-founder of Chicago’s Odd Machine, a hybrid production and post studio. “There don’t seem to be any hard-and-fast rules,” he notes. “Anyway, things are moving so quickly that what you did on your last job has probably been updated already.”


The fourth film in the franchise and the first in stereo, Resident Evil: Afterlife 3D was released in September, although it began production before Avatar hit the screens and 3D’s stock turned distinctly bullish. 

“It was a bit of a gamble shooting Afterlife 3D,” admits film editor Niven Howie. “But I think 3D may have gained the film some new fans.”

The film was shot in Toronto with Sony F35 CineAlta cameras in Pace stereo 3D rigs; on set, files were converted to Avid’s DNxHD codec, and the material was viewed side-by-side for 3D offline editorial. An Avid Unity MediaNetwork was used and shared among three Avid Media Composers — one for the assistant editor, one for Howie and one for the VFX editor.

Resident Evil: Afterlife 3D was not only Howie’s first 3D feature but also his first HD digital movie. “I relied on my assistants to deal with data management, but I was very pleased that Avid had delivered Media Composer 4 software so I could edit exactly the same way I’m used to editing,” he notes. “It was as if I was working with just one picture, yet the system was invisibly carrying the left and right eye together. At any time I could put on the glasses and watch the cut in 3D. That was the selling point for me.”

Despite the familiarity of the Media Composer systems, there was a steep learning curve to 3D, he says. “It’s getting your head around how 3D is delivered to your eyes and to your brain. Once you understand that, you can play with the levels of depth and amount of 3D, which impacts on editorial. The film’s shots were blocked out for longer coverage, and I didn’t do the fast cutting the MTV generation is accustomed to: There’s so much to see within one image that if you cut too quickly it becomes nonsense.”

Howie found himself part of “a very collaborative team” on the picture. “I liked being more involved, sharing ideas and concepts with the rest of the crew — we’re all still learning 3D,” although DP Glen MacPherson, ASC, “knew the rigs and the process,” Howie says.

“We had a short turnaround. I was editing and fine-cutting shots — 300 to 400 of them — to turn over to the VFX company (Mr. X) while we were still shooting,” he explains. “Because no lab had handled 3D in Toronto, the camera company, Pace HD, supplied an on-set lab in a trailer equipped with a Quantel Sid, Media Composer and large-screen 3D projection. Camera material went to the on-set lab; upon wrap we viewed the 3D projected dailies, then they were digitized practically around the clock so they were in my Avid the next day.” Howie “loved” having the lab on set. “It was probably slightly more expensive, but it was very useful and a real time saver to be able to see the dailies without traveling.” Deluxe Toronto conformed the original camera material, performed the DI color grade and mixed the sound.

Even though the technology and workflow are likely to have evolved and changed by the time he cuts his next stereo 3D feature, Howie believes he will still be able to apply his newly acquired knowledge. “The next one may use different cameras and transfer in a different fashion — and it probably won’t have an on-set lab,” he says. “But I’ve had the experience of working with a team that designed the workflow, that had to think on its feet. So I feel well equipped for the next 3D project.”


Jerry Steele, creative director/co-founder of Culver City’s Steele Studios, has found that those exploring stereo 3D “are worried they’ll do everything wrong. It seems that there is a lot of second-guessing.” His advice: “Hire a good stereographer and simplify the post process.”

Steele tries to assuage the apprehensions of newcomers to stereo 3D by explaining that “3D post can be a very straightforward process if you have the right stereo machine on the back end. We have a Quantel Pablo. It has wider bandwidth, can handle up to two streams of 4K at the same time, and offers all the tools you need for regular online, compositing and finishing. So it makes 3D post really easy.”

Using a Sony HDCAM SR-5800, which records left and right eyes simultaneously to a single tape, makes post “even more simplified,” he notes. “Eventually, stereo 3D will be so straightforward that no one will look at it as a separate type of production or post. Dual-camera acquisition will disappear with the emergence of bi-ocular cameras, which are already in the prosumer market. I expect to see a lot more professional-quality bi-ocular acquisition at the next NAB.”

The “gray areas” right now seem to be whether to shoot parallel or converged and how to manipulate the convergence on the back end. “The big technological requirement for 3D is the whole area of convergence,” says Steele. “People who shoot converged need a stereographer on set who can supervise convergence, and a good camera tech who knows how to make modifications and adjustments shot to shot. Some people shoot both: They like to use converged footage for close-ups and parallel footage for wider and more dramatic shots, especially with motion. Shooting parallel allows you to re-converge dynamically in post without worrying about keystoning.”
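The re-convergence Steele describes is typically done by horizontal image translation (HIT): sliding the two eyes horizontally in opposite directions to move the convergence plane, with no rotation and therefore no keystoning. Below is a minimal NumPy sketch of the idea, purely illustrative — it is not how the Quantel Pablo or any particular product implements the operation, and the function name and toy frames are assumptions for the example.

```python
import numpy as np

def reconverge(left, right, shift_px):
    """Re-converge a parallel-shot stereo pair by horizontal image
    translation (HIT): slide each eye toward or away from the other
    by half the requested parallax change, padding the exposed edge
    with black. (Illustrative sketch, not a production implementation.)
    """
    half = shift_px // 2

    def translate(img, dx):
        # Shift an image dx pixels horizontally, zero-filling the gap.
        out = np.zeros_like(img)
        if dx > 0:
            out[:, dx:] = img[:, :-dx]
        elif dx < 0:
            out[:, :dx] = img[:, -dx:]
        else:
            out[:] = img
        return out

    # Opposite-direction shifts change on-screen parallax by shift_px
    # without any rotation, so no keystone distortion is introduced.
    return translate(left, half), translate(right, -half)

# Toy single-channel "frames": one bright feature at column 10 in both eyes
left = np.zeros((4, 20))
left[:, 10] = 1.0
right = left.copy()

new_left, new_right = reconverge(left, right, 4)
# The feature now sits at column 12 in the left eye and column 8 in the
# right eye — 4 pixels of parallax, pushing it behind the screen plane.
```

Shooting converged bakes this geometry in optically on set, which is why it demands a stereographer watching every setup; shooting parallel defers the decision to a purely translational adjustment like the one above.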

A Quantel house for the last 20 years, Steele Studios invested in its Pablo primarily to meet the needs of a diverse range of clients producing music videos, commercials and feature films. “We wanted a machine that could do everything: compositing, color correction, titling, rotoscoping, offline/online. That’s what Pablo had going for it — it covers the spectrum,” says Steele. When the stereo 3D option for Pablo became available, Steele was an early adopter. “It’s a real game-changer,” he declares.

Steele notes that sometimes clients considering stereo 3D production want to cover their bases with a 2D version as well. But he warns that creating a 2D version “is not as simple as just picking an eye” and editing left- or right-eye footage, especially if composites are involved. “If you’re doing multi-level compositing from the back to the foreground, especially if there’s a close-up focus on the action, you can end up being off one eye or the other. So you have to do more work in post.”

Over the summer, Steele Studios performed 3D post on Shakira’s Waka Waka music video for the official FIFA World Cup song. Sony Music presented Waka Waka in its 3D pavilion at the World Cup games in South Africa and used the clip at point of sale worldwide for its Bravia 3D TVs. The studio also provided post on the Lady Gaga 3D music video, Alejandro.

“We’re having a lot of fun with stereo 3D,” says Steele. “You get to learn all sorts of new things. Especially as a compositor, you’re constantly modifying elements in 3D space now — you’re working on real dioramas, not just a canvas. That’s a new and exciting experience.”


Odd Machine’s (www.theoddmachine.com) Seth Henrikson served as DP on a recent shoot for the Origin Entertainment documentary with the working title, God and Physics. Exploring astrophysics, intelligent design and the origins of the universe, it features interviews with Nobel laureates, NASA scientists and other big thinkers, and is slated for release on Blu-ray disc.

Initially Henrikson was skeptical about shooting the interview segments in 3D, but he “quickly became a believer. When I looked at the dailies, it felt like the people were in your living room with you.” He used a P+S Technik Freestyle stereo 3D camera rig designed for Steadicam use but configured it on a tripod and outfitted it with a pair of Silicon Imaging SI-2K Minis.

“We were the first crew to shoot with the rig, and it was my first time with the cameras,” he reports. Well-known stereographer Keith Collea was on set to advise him; Origin Entertainment executive producer James Volk headed up the production and post production team in Los Angeles.

Editor Warren Lam at Origin says, “there were a lot of variables” in determining the post workflow for God and Physics. “Post changes fast, and now we have 3D on top of it! This was Origin’s second 3D production, but the first handling everything through post production, and we plan on doing more here. Origin wanted to make the best use of its resources and worked things through to arrive in a good space.” 

Experienced with both Avid and Final Cut Pro, Lam ultimately cut the documentary in FCP with CineForm Neo3D, a realtime 3D editing workflow software package compatible with most nonlinear editing systems, including Final Cut. “With CineForm, we can take in material in any format, 2D or 3D, including stock footage from NASA, and convert it to the CineForm codec,” he explains. “CineForm’s active metadata mixes the left eye and right eye footage, so we can look at the monitor and edit in 3D in realtime. To save time we worked in 2D, then watched all the pieces come together in 3D on our 60-inch Panasonic monitor. It was incredible to see the difference watching the footage in the 3D space.”

Lam had to get accustomed to the creative differences of cutting in stereo 3D. “I couldn’t do MTV cuts on the talking heads any more,” he notes. “I had to find ways to make the footage engaging without the normal tricks of cutting away in two seconds; that’s where we’ll be using graphic elements,” such as backgrounds and lower-thirds.

He also discovered that 3D is so immersive that it held his attention when jump cutting an interview from a medium to a wide shot. “With 3D you’re supposed to give people time to stay in that environment,” he points out. “But in this case, the jump cut worked. It just goes to show there are no rules.”

Lam is close to locking God and Physics; he expects to provide the files required for a RealD output to Blu-ray disc.


Oasis Imagery, which launched in August in Hollywood, has good timing. It’s one of the first facilities built from the ground up to support studios and independent filmmakers with stereo 3D and file-based production and post services.

At Hollywood’s first 3D Film Festival (3DFF) at the end of September, guests got to see a proof of concept of the new 3D technology Oasis Imagery has developed, along with Instant Effects, for previsualization, production and broadcast. Dubbed 3x3D by the company’s chief visionary officer, Scot Barbour, the process allows several layers of 3D to be composited in realtime.

For previsualization, filmmakers can “shoot a subject in 3D on our greenscreen stage and composite it into a 3D background in realtime,” Barbour explains. “With 3D, you really need to see what you’re recording ahead of time.

“Imagine going on location. You know you’ll be extending the set and wonder how well the plates will play in 3D,” he continues. “With 3x3D you can put the background in, put the subject in, set the stereo parameters, then go out and shoot everything else. You know what you need to do before you even get on location.”

Broadcasters can also shoot subjects on the greenscreen stage, “add a composited background, put the branding of the network or company on in realtime, dump it to our 5800-SR deck and your show’s in the can, live,” he adds.

It’s the company’s belief that “you can’t do 3D post well unless you do 3D production well,” Barbour emphasizes. “We can shoot your 3D show, edit it, online it and do the DI in our theater that supports RealD, Dolby 3D and XpanD projection.”

Oasis Imagery and Instant Effects are still in development on 3x3D with the goal of creating a product that works well both on a greenscreen stage and as a turnkey system filmmakers can take on location. “Working in 3D without this system is almost like shooting film and having to wait for processing to see the result. To see stereo 3D live is paramount in any situation because you can make big mistakes really quickly in 3D, some that can’t be corrected on a reasonable budget,” Barbour points out.

“We showed three simple layers at the film festival. The next presentation will show real interactive effects in a very practical situation. Already a lot of people in the previz world are extremely interested in what we are doing. Several 3D shows coming up want to shoot live. With 3x3D they shoot live, comp their elements in realtime, output to our theater for viewing and output to SR simultaneously.”