The work is becoming more sophisticated, and increasingly that sophistication is being required for 3D stereoscopic projects. Sometimes artists start a job not knowing that they will be asked to go stereo halfway through. Other times they know it’s coming but still have to prepare intricate stereo workflows so there are no surprises during the process. All of this keeps a visual effects team on its toes and ready for anything.
And these days, being prepared for anything often means more than just knowing the latest compositing tools and techniques; it involves learning other aspects of filmmaking that aren’t typically in a compositor’s job description... and, most importantly, looking at things in an entirely new way.
THE MOLECULE
The Molecule is a six-year-old New York City-based VFX and motion graphics production company whose primary work has been visual effects for TV series such as Rescue Me, Royal Pains, Damages and Blue Bloods — all deceptively loaded with invisible effects, along with some traditional big effects. The studio has recently been fielding requests for feature film work thanks to its newly minted Hollywood office.
Chris Healer, president/CEO of The Molecule (www.themolecule.net), sees that resolutions keep getting bigger and bigger, and that frame rates keep getting higher and higher, and he’s not certain it’s all very necessary. “Personally, I think we exceeded all practical limitations a long time ago. Maybe you want to see 4K in a theatrical screening, but from then on out HD is fine for all practical purposes — Blu-ray discs, downloading on the Internet, even viewing HD in a theater is fine.”
At this year’s NAB he saw 8K recorders and 120fps playback in stereo. “It’s like, come on guys, do we really need stereo, do we need multi-view, do we need more pixels, more frames per second?” It remains to be seen what of this is going to stick.
The Molecule is currently providing VFX on the indie film Hellbenders. “It’s 5K in stereo, which is like 120MB per frame,” he explains. “It’s cool that we can do it, but it begs the question: why do we always have to be at the absolute maximum limit of what our machines can do? At this point, when an HD job comes in, I am thrilled. Everything is so fast and so fluid. When a standard def job comes in it’s hilarious; it’s virtually realtime compositing.”
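Healer’s figure is easy to sanity-check with back-of-envelope arithmetic. As a rough sketch (the resolution, channel count and bit depth below are illustrative assumptions, not production specs), an uncompressed half-float RGBA frame at roughly 5K lands in the neighborhood of 100MB per eye:

```python
# Back-of-envelope uncompressed frame size for a stereo 5K plate.
# Assumptions (not from the article): 5120x2700 per eye, 4 channels (RGBA),
# 16-bit half-float storage, no compression.
width, height = 5120, 2700
channels, bytes_per_channel = 4, 2  # RGBA, half float

per_eye_bytes = width * height * channels * bytes_per_channel
stereo_bytes = per_eye_bytes * 2  # left + right eye

print(f"per eye: {per_eye_bytes / 2**20:.0f} MB")   # roughly 105 MB
print(f"stereo:  {stereo_bytes / 2**20:.0f} MB")
```

Compression and actual channel counts vary per show, but the order of magnitude explains why an HD job feels fast by comparison: an HD frame under the same assumptions is roughly an eighth the size.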
Healer and crew have done a good amount of stereo, including work for The History Channel, a short film and bits for the feature The Mortician. Hellbenders is the studio’s first full-length stereo feature. “It’s the first real stress test of our pipeline. It’s one thing for one person to do a shot in stereo or a couple of people to do four shots, but it’s an entirely different thing when you have 15 people trying to push hundreds of shots through a pipeline. And some of them are thinking in stereo, some are thinking purely operationally and some are figuring out how to make the software work better. Everybody is at a different place and that’s an entirely different strain to put on a pipeline.”
Thinking about stereo in a different way creatively is a must, reports Healer. “When I first started doing stereo I thought that even a static frame — a non-moving camera — would need some kind of a 3D track on it in order to extrapolate 3D information. That was difficult to do. I spent a lot of time trying to work out the process of extracting 3D information from a stereo pair to flow it into our software so the 3D guys could add things in true depth and make that mathematically accurate. In an ideal world you would be doing one thing to one eye and parameterizing the space in such a way that having done one eye, the other eye would kind of come for free. Lately our approach has changed dramatically. I don’t believe that anything is exact. From a purely didactic sense you can do it that way, and yes it’s valuable and sometimes you have to go there, but more often I find if it looks right, it’s right.”
Healer acknowledges that when he started working in stereo he was frustrated. “So much of compositing is just smoke and mirrors, and so many of our techniques for doing things just flew out the window with 3D. Then as time went on I realized there is something very pliable about 3D stereo. When you are blinking right eye, left eye, right eye, your brain is only receiving that information from one eye, but when you look with both eyes, your brain is selecting what it wants to look at and throwing out quite a bit of information. I find that is kind of happening in 3D compositing as well. If you direct a viewer to look in the right place using convergence and focus and color, little stereo bugs can be forgiven.”
Healer is a former Shake artist. “I loved it, even though at its core it had some limitations.” Now a Nuke user, he says this tool addresses many of the issues he had with Shake, like time remapping, scripting, and even undo commands. “I love that Nuke can do compositing of single-view material and stereo material, but they take that idea even farther to call it multi-view. To me, that shows they have a very forward-thinking perspective: not just that stereo is coming, but so is multi-view… maybe 10 years from now.”
TIP: “There are a few stereo books out there and I think they should be read, but also taken with a grain of salt. Again, if it looks right, it’s right. I constantly close my left eye and look, close my right eye, and start thinking in stereo. I want two images that look right when I look at them in separate eyes. Which is a different mindset than ‘let’s build this accurate 3D world and view it through a stereo camera.’ That is almost the wrong way to think of it.”
NITROUS VISUAL EFFECTS
Veteran VFX artist Geoff Leavitt and partner Jonathan Bourgoine started Calabasas, CA’s Nitrous Visual Effects just over three years ago. Supplying VFX services for television and film, the studio has three full-time employees and ramps up as needed. This helps them keep quality up and overhead down.
Leavitt, whose extensive VFX credits include Zookeeper, Grown-Ups and X-Men: The Last Stand, reports that today’s compositing landscape increasingly features stereoscopic work. “A lot of projects are going stereo, but there is 2D-to-3D conversion too, and that is a lot of heavy-handed compositing,” he says.
Nitrous (www.nitrousvfx.com) recently completed over 80 shots for the independent feature Julia X, a true stereo production with a budget between $2 million and $4 million. The film was shot with two Reds using a beam splitter to capture the left and right eyes. “We got double the footage of a 2D film, but we were prepared,” explains Leavitt. “We spent about six months researching stereo compositing pipelines as well as stereo 3D pipelines because we didn’t want to get the footage and then be at a standstill trying to figure it out as we go.” The process went so smoothly that instead of needing the allotted six weeks, they delivered in four and a half.
One effects sequence they worked on involved adding feathers to a fight scene with scissors. “They needed more feathers floating in front of the zero plane so the audience could almost reach out and touch them,” says Leavitt. “We built them using particle systems in Maya, then replaced the particles with 3D feather models we built and composited everything together.”
There was another shot where one of the characters peels out in a Mustang, sending smoke out of the screen toward the audience. The tires then pick up gravel, dirt and debris, and start throwing them at the audience. They also created and composited CG blood spray coming off of scissors and a knife at the screen; replaced Kevin Sorbo’s legs in a scene where his feet were being nailed to the floor; and added CG rain to another scene. PFTrack was used for matchmoving.
There is no question that preparation is key to a successful stereo job. “It depends on the compositing package, but when you are compositing in 3D stereo you have to be conscious of the left eye and right eye, whereas in regular 2D compositing you only have a single frame to worry about.”
Shops that don’t have their pipeline down could end up doing twice the work, he says. “Because whatever you do on the left eye you have to duplicate on the right eye with whatever offset is set up in the camera.”
Most compositing packages now allow you to do the work once and make minor adjustments for each eye. “With Autodesk Toxik, which they now call Composite, you are able to transfer the footage over and give it a little bit of an offset so it lines up. When you check your work it should drop right into place,” explains Leavitt. “With Nuke, when you are doing that you just do your work once in one stream and it tends to just offset itself to the left and the right eye.” Leavitt has worked on Nuke, but Nitrous uses Composite (their main tool for Julia X), After Effects and Shake.
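The “work once, offset per eye” idea Leavitt describes can be sketched in a few lines. The function and numbers below are illustrative, not taken from Composite or Nuke: one creative decision about where an element sits, with each eye’s version derived by a horizontal disparity offset rather than composited from scratch.

```python
# Minimal sketch of per-eye placement from a single comp decision.
# Names and values here are hypothetical, not from any specific package.

def place_element(x, y, disparity):
    """Return per-eye screen positions for an element composited at (x, y).

    Positive disparity pushes the element behind the screen plane;
    negative disparity pulls it out toward the audience.
    """
    left = (x - disparity / 2.0, y)   # left eye: half the offset to the left
    right = (x + disparity / 2.0, y)  # right eye: half the offset to the right
    return left, right

# One comp decision, two outputs: place the element once at (960, 540),
# and each eye receives a small horizontal offset around that position.
left, right = place_element(960, 540, disparity=12)
```

A shop without this discipline in its pipeline ends up doing exactly what Leavitt warns about: hand-duplicating every operation for the second eye.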
Another way to get prepared early is working with production. Nitrous requests camera reports. “We’ll put that in and get what the zero plane was set at, which is what defines the 3D space,” he explains. “When we input that into the stereo camera it accounts for the offset, so if you put it in once it knows which is the left eye and which is the right eye, and there might only be minimal adjustments, if any. There are times when you have to dial it in because it doesn’t look right, and you’ll know because your eyes tend to cross.”
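The zero-plane bookkeeping Leavitt describes falls out of simple projection geometry: for an off-axis stereo rig, a point at the convergence distance projects to the same place in both eyes, which is why knowing where the zero plane was set pins down the per-eye offset. The rig model and every number below are assumptions for illustration, not values from the Julia X camera reports:

```python
# Hedged sketch of screen parallax for an off-axis (shifted-sensor) stereo
# rig. All values are illustrative, in millimeters.

def screen_parallax_mm(focal_mm, interaxial_mm, convergence_mm, depth_mm):
    """Horizontal parallax on the sensor for a point at depth_mm.

    Zero at the convergence distance (the zero plane), positive behind it,
    negative in front of it (i.e., out of the screen toward the audience).
    """
    return focal_mm * interaxial_mm * (1.0 / convergence_mm - 1.0 / depth_mm)

# A point sitting on the zero plane has no offset between the eyes:
on_plane = screen_parallax_mm(35.0, 65.0, 4000.0, 4000.0)
# A point closer than the zero plane gets negative parallax (pops out):
in_front = screen_parallax_mm(35.0, 65.0, 4000.0, 2000.0)
```

When a composited element’s depth disagrees with this geometry, the parallax is wrong for where the eye expects it to be, which is the “your eyes tend to cross” symptom Leavitt mentions.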
Nitrous is also busy with non-stereo jobs, including regular work as the outsourcing facility for CSI: NY. Anything the show’s in-house VFX department can’t get to because of overload is handed off to Leavitt and team. For the past two seasons they have averaged about 35-40 shots per episode with a three- to five-day turnaround. The show is set in NYC but shoots in LA, so creating New York City locations is a big part of the job. They use After Effects for compositing since CSI: NY’s in-house department uses it, making for easier back and forth in terms of assets.
TIPS: “Everybody works differently, and what I tell people is, ‘Figure out what works best for you and don’t let anybody tell you what you are doing is wrong.’ We are creative problem solvers and I expect the people I bring in here to be able to think outside the box. If they have ways of doing something that is different than the way I like to do things but it still gets things done on time and the quality is good, I say go for it.”
ARC PRODUCTIONS
Compositor/lead lighter Paul Stodolny of Toronto’s Arc Productions (www.arcproductions.com) has seen a number of compositing trends developing recently, including the return of the camera as a character.
“For a few years everyone was trying to make pristine, clear, beautiful images, but as the existence of the camera has been reintroduced into the shot, it has definitely been affecting compositing. Now everybody wants lens flares…thank you J.J. Abrams,” he laughs. “And during explosions, compositors are required to add things like dirt and excessive light bleeding on the lens to create a realistic, documentary style that’s now typically done in post.”
Today’s filmmakers are sophisticated and well aware of what kind of tools and abilities compositors have. “They know we can do more than painting out dolly tracks and light stands,” explains Stodolny. “The art department and set decorators don’t even have to be as exact because we can paint anything out or anything in for that matter. For example, on our current production of Robosapien we worked on a basketball sequence, painting out the trademarked logos that production didn’t have rights to.”
Arc Productions, which until recently specialized in full-length animated feature films (Gnomeo & Juliet, 9), has taken on live-action projects as well, and its first visual effects project is the aforementioned Robosapien. This feature film, based on the Wow-Wee toy line and co-produced with Arad Productions, was shot live action. Arc created and integrated a CG robot that befriends a little boy. They provided about 900 shots in total for the film, which was shot on Red and Sony F23 cameras. “This is something else compositors need to educate themselves about — digital camera technologies, because it’s no longer just matching film grain, it’s matching digital noise and other effects of digital filmmaking,” says Stodolny.
Another trend that he sees is on the compositing software side of the business. “Most software companies are competing to find new ways to manipulate 3D renders in post,” reports Stodolny. “To provide our artists maximum flexibility and to decrease iterations of CG renders, our in-house developers created tools allowing artists to spend less time in the CG package and more time refining their final image. For example, rather than having to go back to render to do fine lighting changes, we have tools allowing us to alter color, falloff, volumetric and even texturing in our lighting all from within our compositing packages.”
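The pass-based relighting Stodolny describes can be illustrated with a toy single-pixel example. If the renderer writes each light’s contribution as a separate pass, the comp can rebuild the beauty and re-tint individual lights without a new 3D render; the pass names, tint values and helper function here are hypothetical, not Arc’s in-house tools:

```python
# Toy single-pixel sketch of rebuilding a beauty from per-light passes.
# Pass names and pixel values are made up for illustration.

def rebuild_beauty(light_passes, tints):
    """Sum per-light render passes, each scaled by an adjustable RGB tint."""
    beauty = [0.0, 0.0, 0.0]
    for name, rgb in light_passes.items():
        tint = tints.get(name, (1.0, 1.0, 1.0))  # default: leave light as-is
        for c in range(3):
            beauty[c] += rgb[c] * tint[c]
    return beauty

# One pixel, two lights; warm up the key light in comp instead of
# going back for another CG render.
passes = {"key": (0.4, 0.4, 0.4), "fill": (0.1, 0.1, 0.1)}
pixel = rebuild_beauty(passes, {"key": (1.2, 1.0, 0.8)})
```

Because the passes sum linearly, a tint change in comp is mathematically equivalent to changing that light’s color at render time, which is what makes skipping the re-render safe for simple lighting tweaks.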
While Arc has been using Fusion for its feature animation work, the studio has begun using Nuke as well for some of its live-action integration projects like Camelot, a TV series for Starz. “Due to the architecture of each of these packages we’re now able to use helpful tools in both programs, opening up options to all of our artists,” he says.
Arc itself has created its own trend: about three years ago, its lighting and compositing teams joined as one to give the lighting artists control of bringing their shots through to final picture. “However, even on complex productions requiring compositing specialists, our lighters, who are familiar with using Fusion, build their beauty passes within Fusion and Maya and then hand them off to our newer Nuke compositors.” They employed this workflow on the 3D stereo film Dolphin Tale, for which they created a CG dolphin. It’s due out this fall.
Stodolny, who worked as a stereo compositor on Beowulf for Sony, says Arc saw the stereo trend developing and prepared. Gnomeo & Juliet was released in stereo, but that decision was made halfway through production. “Even though it was a full CG movie, we had written plug-ins in Eyeon Fusion to be able to render the second eye and eliminate the need to go back to 3D renders for the bulk of the show.”
Stodolny sees the line between 2D and 3D blurring. “It used to be that you needed to look at one final image to tell if something integrated well. Now to be a compositor you need an understanding of how the two images work in stereo, how animation works and how texturing works. You need to know a lot of different jobs, and there are a lot more tools to learn than there used to be.”
TIP: Stodolny, who came into the industry from the cinematography world, teaches a photography and cinematography class to compositors/lighters before every project. “The best thing for people to understand is what’s being captured within the frame and what’s happening within the lens. A lot of artists will add things like flares and light effects, but if they don’t actually know how the camera lens is picking up that information it’s hard to layer that properly within the image.”
He encourages artists to watch movies and freeze frame on shots with and without VFX and to shoot HD video and study it. “A lot of artists forget about motion because they are looking for color accuracy and film grain at a frame-by-frame level. Everything we do is imitating photography, so if you understand photography you’ll know immediately what to look for.”
IMAGE: Before & After: Arc is providing VFX for Camelot, a series on Starz.