Issue: May 1, 2010


Anybody poke a spear at you lately? How about a tusk? A gun muzzle? Chances are, as you view the ever-increasing product coming to theater screens in stereo 3D, you’re seeing a lot less jabbing and a lot more “spatial filmmaking” — developing that magical Z-axis as a new aspect of storytelling and environment creation rather than as a brief gimmicky thrill. Of course, few among us are above enjoying some good, brief, gimmicky thrills, but movies run 90 minutes to two hours and more, and the goal these days is to immerse viewers in a milieu and really take them someplace.
This is especially true of today’s CG movies. DreamWorks’ chief Jeffrey Katzenberg laid down the law after 2003’s traditional-cel/CG combo, Sinbad — future animated films would henceforth be produced as “3D animation.” But, spatially, what we called “3D” then was really 2D.
Today virtually all CG releases are the new kind of 3D that requires glasses, as are a growing number of live-action movies — even some indie films. Currently, there are maybe 3,500 stereo screens in US theaters. That figure will only grow as exhibitors experience increasing demand for stereo 3D screens to display this new form of storytelling.


The job title “stereoscopic supervisor” has been around since, oh, 2007. But when you put yourself out there as “Captain 3D,” you are really making a statement about how important stereoscopy is in your career — and the film industry. Phil “Captain 3D” McNally is a case in point. He was stereoscopic supervisor on Disney’s Meet the Robinsons and holds the title today at DreamWorks Animation (www.dreamworksanimation.com), where his credits run through the current How to Train Your Dragon and on to Shrek Forever After, now in its final stages, and this fall’s Megamind.
The paucity of stereoscopic screens was one big drawback to 3D’s proliferation, but McNally notes that even in the early days the number of 3D screens was roughly doubling with each major 3D release. “We’re now getting the momentum behind the content,” he says, “and there’s more content than there are screens, as we’re seeing with our own How to Train Your Dragon. The pressure is now on the screen side. It’s like we’ve pushed the cart over the hill and we’re now starting to roll down.” That cart is full of new product, including next year’s Kung Fu Panda II — into which stereo opportunities have been worked from the very beginning of production.
At this point, 3D should look like it’s part of the filmmaking and storytelling, McNally says, “instead of being bolted on after the fact.” McNally likens a good stereo experience to a memorable dream — three-dimensionality is an integral part of it. Filmdom’s first century of moviegoers actually had to train themselves to see and understand a story in only two dimensions, he asserts.
With a number of DreamWorks Animation films moving through the production pipeline at once, McNally spends his days looking at stereo imagery — “pretty much everything at every level.” As with any CG film, initial work starts with a small group sketching storyboards in 2D. “But the minute that we go from very early storyboards — into even previs — as soon as it goes into a computer graphic environment, in our case it’s typically Maya, then we have the capability of setting stereo.” The idea is to give CG artists — and directors, too — the ability to recognize opportunities to exploit 3D space. They also get feedback on whether a cut works as well in 3D space as it does in 2D.
By the time the animation department completes its final pass, McNally says, “the last thing we do is dial in the stereo in a way that can be animated within the shot. We can have multiple stereo rigs so that a foreground character might have less depth to be less distracting while we’re looking at the full depth of the background. The bottom line for previs is that we want good stereo that is fast and doesn’t hurt — we don’t want eye strain.” 
As for the high intensity achieved by the quick-cut style used in action films, McNally believes that 3D films can generate the same levels of excitement at a slightly slower pace — a stereo film’s sheer amount of data is intense in and of itself. This makes previs all the more important as far as judging how to set a scene’s depth.
For Monsters vs. Aliens (2009), McNally’s first movie as stereo super, he set up a variety of depth tests — simple spheres and grids that ran through a range of depths. One thing DreamWorks determined was how much depth you need to make a sphere look round in the Z dimension. “It’s very important for your characters not to look as if they’ve been squashed or stretched in space. That volume is set by the separation between the two cameras — the inter-axial, or inter-ocular, distance.”
A second lesson was how much an object can be moved behind or in front of the screen without causing viewers discomfort.
“We’re designing for a 40-foot screen and our images are 1920 pixels wide. On a 40-foot screen, 10 pixels of separation receding beyond the screen is the equivalent of infinity in real life. That’s not very much depth to work with. So we tested how far beyond that we could go and still have an acceptably comfortable depth, and we found that 20-24 pixels of depth is still very watchable for almost everyone in the audience. This is about double eye-width, and what we’ve found is that the space behind the screen is more limited because people don’t like to diverge their eyes.”
So DreamWorks does not go beyond the 20-24-pixel depth range — a.k.a. just over one percent of the screen’s width. However, just going “back” is not enough to make, say, background trees appear far enough away. “So you have to use the space in front of the screen to give yourself a depth budget to work with — to provide good volume on characters as well as comfortable depth in the far background.” McNally and company found that you can use slightly more pixel depth in front of the screen — 1.5 or maybe two percent. Combined with the 3D space created behind the screen, “now you’re up to 2.5 or three percent as a very comfortable amount of depth from the nearest point all the way back to the farthest point with the screen falling somewhere between the two.”
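McNally’s depth budget is simple arithmetic on a 1920-pixel-wide frame, and his figures can be sanity-checked in a few lines. The sketch below is ours; the numbers come from the interview, but the helper function and its names are illustrative, not a DreamWorks tool.

```python
# Depth-budget arithmetic from McNally's figures: left/right-eye separation
# expressed in pixels of a 1920-pixel-wide frame, and as a percentage of
# screen width. (Function and constant names are ours, for illustration.)

FRAME_WIDTH_PX = 1920

def parallax_percent(pixels, frame_width=FRAME_WIDTH_PX):
    """Convert pixels of left/right-eye separation to % of frame width."""
    return 100.0 * pixels / frame_width

# ~10 px of behind-screen parallax reads as infinity on a 40-foot screen
print(f"10 px -> {parallax_percent(10):.2f}% of width")   # ≈ 0.52%
# Tested comfort limit behind the screen: 20-24 px, "just over one percent"
print(f"24 px -> {parallax_percent(24):.2f}% of width")   # 1.25%
# In front of the screen you can spend a little more: 1.5 to 2 percent
front_budget_px = 0.02 * FRAME_WIDTH_PX
print(f"2% in front -> {front_budget_px:.0f} px")
# Total comfortable budget, nearest point to farthest: 2.5 to 3 percent
total_budget_px = 0.03 * FRAME_WIDTH_PX
print(f"3% total    -> {total_budget_px:.0f} px")
```

On a 40-foot screen, 10 pixels of a 1920-pixel frame works out to about 2.5 inches, roughly the spacing of human eyes, which is why separation beyond that forces divergence.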
McNally views How to Train Your Dragon as “spatial filmmaking” rather than 3D filmmaking. “With 3D you tend to think of the gimmicks,” he says. Audiences appreciate the flying sequences in Dragon because “they are hugely spatial by their nature. What could be better than having a technique which increases the ability to sense the space?”
McNally recalls the first time the film’s directors, Dean DeBlois and Chris Sanders, saw the flying sequence, in a 3D rough-layout review, where the titular dragon and his young master begin to fall to earth from a great height. “They were like, ‘Wow! This is so cool!’ The level of intensity had gone up beyond what they’d been seeing up to that point.”


Stefan Sonnenfeld and Company 3, based in Santa Monica and New York, got involved early in 3D motion-picture mastering. Sonnenfeld uses a Blackmagic DaVinci Resolve system to grade stereo films in Co3’s Dolby/RealD-equipped theater. Tim Burton was a recent inhabitant of the theater with his Alice in Wonderland. 
Devin Sterling, Co3’s executive producer for feature film, manages the DI grading theater in Santa Monica (Co3 is an Ascent Media company), where the focus is shifting to 3D films. In the past year Co3 did G-Force and Alice here, as well as the films’ numerous marketing campaigns, including trailers and special press pieces.
“We have what looks like a very big 2011, with lots of 3D releases,” Sterling says, adding that by next year this theater may be exclusively 3D. “There are plenty of films that are gearing to do [3D] production in the very near future.”
One path to 3D starts with an existing 2D film and the decision to convert it. Ascent does not currently offer 3D conversion, but the company does have a “soup-to-nuts finishing process for 3D films,” Sterling says.
Alice was shot single-camera, using 35mm for the traditional-looking sequences above ground and Genesis HD for the greenscreen fantasy that takes place underground. Burton was among the directors who found the current double-camera rig too restrictive for acquisition; rather, Sterling says, Burton and company planned early on to do subsequent 3D conversion. “I think the way they used it was quite creative. Three-D is about immersion, not gags or gimmicks. Alice is a very comfortable movie to watch.”
Almost every shot was greenscreen; Sony Pictures Imageworks was the lead house, with Ken Ralston in charge of VFX. Co3 scanned Burton’s film elements at 2K on an Arriscan. Alice’s 3D DI was done at 1920x1080 resolution, from 10-bit log DPX source files.
IMAX versions were also called for. IMAX mostly uses film deliverables, but even its large format is beginning to rely on digital display. “There’s a lot going on,” Sterling says. “We’re on the cusp of something new.”
Director Tim Burton balances his wildly creative eye for design with a decisive nature and, sometimes, a refreshing disregard for “the small stuff,” Sterling says. And Burton had Stefan Sonnenfeld at the controls of the Resolve.
“Detail-wise, you need to see a little more into the 3D than the 2D [standard color grade],” Sonnenfeld says. “We felt we needed different contrast ratios in the 3D because you’re seeing deeper into the three-dimensional image, although the base of the 3D correction [is the initial] 2D correction.”
That said, Sonnenfeld suggests that Alice, with all its CG backgrounds, actually gave him somewhat less color work to perform. But the work involved in matching his color grade with the CG work done at Sony began very early on. “Obviously [CG images] have to have a style and color and contrast and balance, but the translation between Sony and Company 3 was very tight so, when Tim would go from Sony to Company 3, it would look pretty much the same.”
Sonnenfeld and Burton, along with editor Chris Lebenzon, focused on the challenge of maintaining color uniformity between the movie’s four different displays — 2D, 3D, IMAX and IMAX digital. (This required numerous visits to the IMAX facility.) Simply put, Sonnenfeld says, their color management effort was “to make sure what we do in digital gets emulated as accurately as possible to film. In that process you have these 3D look-up tables where you manage the color for whatever deliverable is required — consistent to the original intent.”
There were occasions when Sonnenfeld had to “majorly color” certain scenes when Burton wanted to achieve a moody, de-saturated look. In one memorable scene, Johnny Depp’s Hatter walks through a forest lit in moody, contrasting darks and lights, with Alice (still small at that point) on his shoulder. In this case, Imageworks did not have time to impart a unique color look and feel, and Burton tapped Sonnenfeld for his expertise. “We used many multiple [Resolve] windows and shapes, and tried to give the feel. Our system has infinite windows capability — with all the different twists and shapes we were able to do, we finessed it with Tim and it all came out looking great.”
Burton takes the color red seriously and Co3’s use of 3D LUTs helped. “When you get into a three-dimensional look-up table, that allows you to mix the colors,” says Mike Chiado, Co3’s director of engineering. “So a red color can actually translate in the red-green-blue color space to any other space, not just along the red vector.”
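Chiado’s point, that a 3D LUT stores a full RGB output at every lattice point and so can steer red anywhere in color space rather than just along the red vector, can be sketched with a toy lattice and trilinear interpolation. The LUT values and function below are hypothetical; production DI LUTs use much denser lattices (typically 17x17x17 points or more).

```python
# Toy illustration of a 3D LUT: each lattice corner stores a full RGB
# output triple, so pure red can be remapped off the red axis. A 2x2x2
# identity lattice with one corner nudged; real LUTs are far denser.

def lerp(a, b, t):
    return a + (b - a) * t

def apply_3d_lut(rgb, lut):
    """Trilinear interpolation through a 2x2x2 LUT.
    lut[r][g][b] -> (R, G, B) with corner indices in {0, 1};
    input components are floats in [0, 1]."""
    r, g, b = rgb
    out = []
    for c in range(3):  # interpolate each output channel independently
        c00 = lerp(lut[0][0][0][c], lut[1][0][0][c], r)
        c10 = lerp(lut[0][1][0][c], lut[1][1][0][c], r)
        c01 = lerp(lut[0][0][1][c], lut[1][0][1][c], r)
        c11 = lerp(lut[0][1][1][c], lut[1][1][1][c], r)
        c0 = lerp(c00, c10, g)
        c1 = lerp(c01, c11, g)
        out.append(lerp(c0, c1, b))
    return tuple(out)

# Identity LUT: each corner maps to itself...
lut = [[[(r, g, b) for b in (0.0, 1.0)]
        for g in (0.0, 1.0)] for r in (0.0, 1.0)]
# ...except pure red, which we steer toward orange (a green component
# appears in the output, impossible with independent 1D curves).
lut[1][0][0] = (1.0, 0.35, 0.0)

print(apply_3d_lut((1.0, 0.0, 0.0), lut))  # (1.0, 0.35, 0.0)
print(apply_3d_lut((0.5, 0.5, 0.5), lut))  # mid-grey shifts only slightly
```

Three independent 1D curves could only brighten or darken the red channel; the cross-channel remap above is exactly what requires the three-dimensional table.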
Alice’s croquet sequence required concentrated color work, as did scenes in the Red Queen’s castle, including shading her oversized face and keeping its paleness consistent. Sometimes Depp’s remarkable makeup needed tweaking at Co3, again for a consistent look.
Since so much of the movie is greenscreen, Sonnenfeld says, “the CG people, the matte painters, the supervisors, in that sense are just as important as the real set designers, makeup people and lighting people — it’s a whole new environment. It’s teamwork at its best.”


Country music star Kenny Chesney’s hit songs deal with fun subjects like “Summertime,” “Beer in Mexico” and a “Keg in the Closet,” to name a few. His next project also deals with the hot season, but this time in stereo 3D. Kenny Chesney: Summer in 3D was directed by Joe Thomas and edited by Skip Masters and Mark Pruett. Milton Adamou, a longtime Quantel iQ user, was the film’s Pablo 3D artist.
Compiled from six different open-air stadium shows, the concert film was shot last summer by 3ality Digital using as many as eight stereo rigs per show with Sony 1500 camera heads. (Up to 14 2D cameras captured concert action as well).
While Chesney and his band mates may toy with viewers’ perception of the Z-axis occasionally on stage, the real action revolves around Chesney’s showmanship. “The coolest stuff is moving,” says Masters, who served as unit production manager as well as editor. “Kenny is in almost constant motion and there aren’t many static shots in the show. We used tracks with Chapmans, pneumatic pedestals, a Super-Techno, Steadicam, and even a few cameras on conventional sticks.” Masters is a partner in production company HDReady, LLC (www.hdready.com), and 3ality Digital served as production partner on Kenny.
3ality Digital’s quest is to provide end-users with a stereo viewing experience sans glasses, but Masters and crew monitored their progress through a 3ality SIP (stereo image processor) displayed on a JVC 3D LCD monitor using polarized glasses. Masters is a veteran of multicam live concerts and edited Summer in 3D, his first stereo show, on Avid DS Version 8.4, which was not yet 3D-ready. (It is now.) The show was cut in 2D (one eye) with constant reference to the 3D material on the JVC monitor via a Sony 5800 VTR. “There were really very few shots that had to be changed because of something we couldn’t see in the other eye,” Masters says.
Chesney the performer is kinetic and his music is energetic, so the concert film needs to play that way, too, even if 3D is known for its slower pace, Masters says. “In the case of Summer in 3D, we started out at the languid pace that you’ll see in a lot of live-action 3D, and Kenny really felt it was too slow. So we kept ratcheting it up until he was comfortable and some of this movie is cut quite quickly — which is appropriate to the artist and his music, and his fan base.” The outdoor arena stage incorporates a projecting T-shaped gangway that Chesney uses to great effect. “We spent a lot of time in post making the pace comfortable from the perspective of the audience with regard to eye strain, etc. And it worked out great, I think. We also had some songs that were very interesting to play with in the Pablo with regard to visual layering, dissolves, and with depth balance.”
Milton Adamou, who began using Pablo in 2007, is currently working on M. Night Shyamalan’s upcoming fantasy epic, The Last Airbender, which is a 2D-to-3D conversion. He now uses Pablo’s Neo control surface, which emulates the tactile controls colorists favor, such as trackballs. “Pablo isn’t just a color corrector,” he points out. “It’s a compositing system, an editor, a conforming system, a stereoscopic toolbox, a Paintbox.”
On the 3D release of Kenny Chesney, Adamou’s role was editorial online conform and geometry corrections rather than color correction, which was done in DI sessions at Colorworks on the Sony lot in Culver City using Baselight, which output the DCP (digital cinema package) for screening. Adamou did perform color duties on the 2D Blu-ray release.
“It was a stereoscopic online job,” he says. “The Pablo was the hub for the project where we were assembling everything.” Besides geometry corrections, Adamou was also “doing depth grading, adjusting the convergence and rebuilding the opening sequence.” The opening was shot on 35mm and involved seven layers of CG graphics and text for both the left and right eye, but some of the elements were jumping forward or backward out of place, and Adamou succeeded in smoothing things out to work in 3D. “This is where the Pablo really works because you are multilayer compositing through two eyes. We had to combine it, view it all in 3D, then adjust it all in 3D as we were watching it through all the independent layers.”
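The depth grading Adamou describes, moving a layer forward or backward relative to the screen plane, comes down to changing that layer’s left/right parallax. The toy model below is our illustration, not Pablo’s actual implementation: shifting one eye’s image horizontally before compositing changes where the layer appears to sit in depth.

```python
# Toy depth-grading model: a layer's perceived depth is nudged by shifting
# one eye's image horizontally before compositing. Shifting the right-eye
# image right increases positive parallax, pushing the layer behind the
# screen plane; shifting it left pulls the layer toward the viewer.
# (Illustrative only; not how Pablo implements convergence adjustment.)

def shift_row(row, px, fill=0):
    """Shift a row of pixels right (px > 0) or left (px < 0), padding with fill."""
    if px > 0:
        return [fill] * px + row[:-px]
    if px < 0:
        return row[-px:] + [fill] * -px
    return list(row)

def depth_shift(eye_image, px):
    """Apply the same horizontal shift to every row of one eye's image."""
    return [shift_row(row, px) for row in eye_image]

# A 1x6-pixel "layer" for the right eye: +2 px pushes it farther behind
# the screen; -1 px brings it forward.
right_eye = [[1, 2, 3, 4, 5, 6]]
print(depth_shift(right_eye, 2))   # [[0, 0, 1, 2, 3, 4]]
print(depth_shift(right_eye, -1))  # [[2, 3, 4, 5, 6, 0]]
```

Applying different shifts to different composited layers is the essence of rebalancing elements that, as in the Chesney film’s opening titles, were “jumping forward or backward out of place.”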
Adamou appeared at NAB last month on the Quantel stand showing off Pablo’s latest 3D post production tools in Version 5. Besides new color and geometry-correction capabilities, he likes the “very advanced 3D analysis tools that will show you where things are in 3D space, as well as being able to do things [like sharpness] on both eyes or just one eye.” One 3D bellwether Adamou notes is time-to-completion. Post on earlier 3D concert shows was measured in months. “For this we had to complete the post process in four weeks. You’ll find this in post production,” he opines. “Things will get faster and faster, and at some point it’ll plateau and reach a comfortable place where we say, ‘Okay, this is going to take X amount of time.’”


Repercussions from the success of Avatar and other 3D films are being felt around the production world.
Bartosz Malina and his partner, editor Pawel Witecki, own three-year-old MasterShot Studio in Warsaw, Poland. MasterShot is dedicated to Red acquisition and Assimilate Scratch post production, all in 4K resolution. Malina and Witecki have been busy using Scratch to conform and grade commercial projects, network TV branding and music videos, and MasterShot has finished about eight feature films on Scratch.
Malina is a director as well as VFX expert and MasterShot’s next challenge is a feature film, Don’t You Dare Scare Me, due out next year. The plan is to shoot the horror/comedy in 3D using two Red cameras and post the show in 3D on Scratch with the consulting help of Arnaud Paris, head of Paris-based Sysmic Films. Paris himself is an experienced Scratch/stereo 3D finishing man.
Based on recent test results MasterShot will use the Red cameras with Angenieux Optimo zoom lenses and shoot in stereo via an Element Technica Quasar Beamsplitter rig. The film will be shot in 4K and the deliverable will be a 2K DCP. Witecki edits on Final Cut. As the production’s stereographer, Paris will use Scratch to conform 3D content via the EDL, color grade, correct the 3D effect, online elements and export dailies for digital cinema package preparation, Malina says.
“There will be a lot of VFX,” he says. “We will shoot a lot of pre-passes for the FX team.” Malina says the team wants to have convergence correction in its compositing software. “We will use standard 3D and 2D apps to achieve effects. Materials from the effects team will be conformed in Scratch. Final 3D effects we will correct on a 3ality/JVC 3D monitor combo. Then we will prepare a DCP master in Stereo/2K/24fps.”
Meanwhile in Paris, France, Arnaud Paris’s own Scratch-based 3D finishing business is going strong and has joined with Paris-based StereoCorp3D to provide producers with a script-to-screen service that streamlines workflow for shooting and posting in 3D. Paris’s service combines Scratch with Red cameras using the same Element Technica Quasar rig Malina has been working with in Poland.
In post, Paris uses Scratch for realtime 4K stereo grading with active glasses in a digital theatre. The Scratch/Red workflow, Paris says, and its ability to handle native R3D in realtime at 4K has created a buzz. “We’ve fitted Scratch with dual Red Rocket cards, so we can work with two streams of 4K R3D files at the same time. But we can work with dual DPX streams just as efficiently.”
The Quasar rigs are available through Sysmic Films’ sister company, the Paris rental house LocaRed.
The Sysmic/StereoCorp3D alliance also offers a popular 3D production training course — highly regarded French stereographer Alain Derobe is one of the teachers. With excitement over 3D production increasing, the French educators have also been invited to launch the first REDucation program in Paris.
“Recently we were hired by Bartosz [at MasterShot] to help them prep for a feature in Poland,” Paris says, and this form of consulting is becoming a trend for Sysmic in other countries as well. “The tools to correct convergence are very intuitive,” Paris says of Scratch. “And we have a 3D projection system that lets us prepare larger screen size masters as well as home/Blu-ray releases very easily.”


Dimitris Athos, 3D stereoscopic producer at New York’s UVphactory, has served as program director for BEFILM, The Underground Film Festival, since 2004. BEFILM’s entries suggest that stereo 3D filmmaking by independents is maturing from a hobby into a profession. The festival claims bragging rights as the first short-film event to establish a stereo 3D category. (Last year’s Venice Film Festival added a stereo category too.) As soon as Athos and BEFILM initiated the category in the fall of 2008, they knew they were onto something. “All of a sudden, out of the woodwork, came all these independent 3D stereoscopic filmmakers,” Athos says, committed artists and small studios who churned out shorts in stereo on little or no budget. Last year BEFILM screened nine 3D shorts out of 30 entries. This year’s festival — held in New York April 27 through May 1 — saw more 3D entries and screened at least 14 3D shorts, some of them commercials.
This year’s UVphactory 3D entry, Drown in the Now, is a 4:30 music video for The Crystal Method. The video, nearly all stark, black-and-white (with no gray scale) CG with some surprising red images, was “recreated” in stereo. Athos is careful not to confuse the UVPH process on Drown in the Now with garden-variety 3D conversion. “Ninety percent of the video is CG,” he says. “It was recreated in true stereo 3D.”
Dolby has been supporting last year’s and this year’s BEFILM Festival 3D filmmakers with discounts on DCP creation, the final render of all the 3D stereoscopic files for distribution. BEFILM DCPs can be screened in RealD- and Dolby 3D-equipped private screening rooms, as well as Disney’s Manhattan screening room.
The DCP makes life for 3D stereo producers — indies included — much easier, Athos says. “Up until [recently] we’ve been screening left-and-right or side-by-side files” created with “home-brew methods,” he says, to produce, edit and display 3D. Going back a couple of years, UVPH did the CGI and post on Bjork’s Wanderlust video — a Tibetan fantasy/tour de force including yaks — in 3D. “That was designed and filmed in stereo, and all the post production, live action and CG creation was done in stereoscopic.”
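A “side-by-side file” of the home-brew variety Athos mentions is simple to sketch: each eye is squeezed to half width and the two halves are packed into one frame of the original width, to be unpacked and stretched back by the display. The naive column-dropping below is illustrative only; real tools resample with proper filtering.

```python
# Home-brew side-by-side 3D packing, sketched on toy frames: the left-eye
# and right-eye images are squeezed to half width and placed next to each
# other in a single frame of the original width. Frames here are lists of
# pixel rows; dropping every other column stands in for real resampling.

def half_width(frame):
    """Squeeze a frame to half width by keeping every other column (naive)."""
    return [row[::2] for row in frame]

def side_by_side(left, right):
    """Pack squeezed left/right eyes into one frame: [ L | R ]."""
    lh, rh = half_width(left), half_width(right)
    return [lrow + rrow for lrow, rrow in zip(lh, rh)]

# Toy 2x4 frames: 'L' marks left-eye pixels, 'R' right-eye pixels.
left  = [["L"] * 4 for _ in range(2)]
right = [["R"] * 4 for _ in range(2)]
packed = side_by_side(left, right)
print(packed[0])  # ['L', 'L', 'R', 'R'] -- both eyes in one source-width frame
```

Because the packed frame fits any ordinary 2D container, it could be cut, stored and played with conventional tools, which is exactly what made the method attractive before DCP delivery was within reach of independents.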
Looking ahead, Athos wants to see adaptations of stereo 3D in “the big NLEs” shown at the recent NAB. “The thing with stereo post production is that you have to shoot it correctly,” he says. “There’s a new language in the way you frame the shots; in the way you edit it together and pace things; and the way you bring about the thematic realization of your project. You can’t always fix something in post. Even if you shoot it correctly and post it correctly, if your final projection is off, even by a little bit, the whole 3D effect disappears.”
Athos recommends framing a shot a little wider than usual. As far as editing, “You tend to cut from masters to close-ups and then, maybe, to medium [shots]. You will never cut from wide to medium to close-up — it’s too jarring for the eye.” With help from like-minded friends, Athos is putting together a tips-and-tricks book for those interested in getting into 3D stereo. This collection might also be helpful for those involved in big-budget 3D shoots.
“The stereo palette is completely new — you need to function differently,” Athos says. Sometimes, when producers and film students dive into 3D and get back their footage, they wonder “Why doesn’t this work?” “The baton fell to us to try and help because the more good 3D there is out there, the more it encourages our work artistically and commercially.”