Graphics for Stereo 3D
Issue: February 1, 2011

Although stereo 3D for broadcast and non-feature film applications is still in its infancy, it’s not too soon for designers to discover what creating graphics for the medium entails.

“It’s time for designers to get their heads around it, to become known for it among leading-edge clients and to build the muscles required to do it well,” declares Alex Lindsay, founder of San Francisco-based Pixel Corps, a guild for media producers with 1,500 members in 40 countries. “Graphics for stereo 3D requires a big rethinking of how you do things. It’s a whole different beast and often requires customization to be effective.” 


Pixel Corps has been crafting 3D elements for the guild’s own informational podcasts that reach about 100,000 people a week. “We use our own productions for R&D,” notes Lindsay. “We did it when 4K became available on YouTube, and now we’re doing it with 3D. If it was easy, everybody would be doing it; we try to figure out how to do things while they’re still hard!”

He reports that, “as soon as you get into 3D you realize that what you learned about how to frame and how to edit doesn’t work anymore. Most of the rules we’ve had don’t translate one-to-one in 3D. 3D is a lot more like the way we look at the world: things tend to want to be centered in the frame so your eyes can focus on them, and that breaks a cardinal rule of 2D filmmaking.”

Opinions vary about the degree of 3D to implement, he adds. “Some supervisors want to converge behind the main subject, others right on the main subject. It depends on how your eyes feel afterwards: They’re working harder than in the real world where things are seen in one long shot. Focusing here, then there, then somewhere else puts a lot of stress on the eyes. As you start to shoot 3D and build 3D elements you have to stay conscious of the issue.”

Complexities arise when graphics are composited over backgrounds, Lindsay points out. “If you want graphics over top of a background, you have to look at convergence and the interaxial distance in the background plates,” he says. “With lower thirds, for example, you want them to have some dimension, but they need to mix in with the physical plate and not blind the user.”
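The relationship Lindsay describes, between convergence distance, interaxial separation and where an element appears to sit relative to the screen, can be sketched with a simplified parallel-rig parallax formula. This is an illustrative model with hypothetical parameter names, not the math of any particular stereo tool:

```python
def screen_parallax(interaxial_mm, convergence_m, object_m):
    """Horizontal parallax for a shifted-sensor (parallel) stereo rig,
    in the same units as interaxial_mm -- a simplified illustrative model.

    Zero parallax means the element sits on the screen plane (at the
    convergence distance); positive pushes it behind the screen,
    negative pulls it out toward the viewer."""
    return interaxial_mm * (1.0 - convergence_m / object_m)

# A lower third placed at the convergence distance has zero parallax,
# so it reads as sitting on the screen, in front of a deeper background.
on_screen = screen_parallax(65.0, 3.0, 3.0)     # 0.0
behind    = screen_parallax(65.0, 3.0, 6.0)     # positive: behind screen
in_front  = screen_parallax(65.0, 3.0, 1.5)     # negative: pops out
```

Matching a graphic’s parallax to the background plate’s convergence is what keeps a lower third feeling like part of the scene rather than a card floating at the wrong depth.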

Pixel Corps’ primary distribution medium is the Web, which uses “the lowest common denominator anaglyph process with old-fashioned cyan and magenta glasses,” he explains. “That means we can’t use some blues or red in our graphics.”
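The color restriction Lindsay mentions follows directly from how anaglyph combines the two eyes. A minimal red/cyan combine (the classic scheme, shown here as a sketch rather than Pixel Corps’ actual pipeline) takes the red channel from the left-eye image and green and blue from the right, so a fully saturated red or blue element survives in only one eye’s view and breaks the stereo fusion:

```python
import numpy as np

def anaglyph(left_rgb, right_rgb):
    """Naive red/cyan anaglyph combine for float RGB images
    shaped (height, width, 3): red from the left eye, green and
    blue from the right eye."""
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]  # replace red channel with left eye's
    return out
```

Production anaglyph encoders add color matrices and gamma handling on top of this, but the channel split is why designers working for Web anaglyph delivery steer clear of certain reds and blues.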

The guild is in the midst of creating 3D graphics for podcasts that will stream the Macworld conference in 3D. “It’s a big R&D test for us,” says Lindsay. “We hope to do more at NAB.”

Designers on staff at Pixel Corps are using Maxon Cinema 4D for the full 3D open, lower thirds and titles and are tapping the SV stereo rig for the CG cameras instead of writing proprietary rig software. “It’s a great little pro rig that sets everything up so we can see where the convergence is, and it gives us a lot of control of the cameras. Then we render two passes and bring them into my own DV Garage Conduit nodal compositing system. Soon we’ll be able to composite stereo 3D in realtime; we’ll render then fine tune to make sure the graphics look good on top of the video.”

Lindsay believes we’re still “two or three years away from people wanting to see 3D graphics over their TV shows.” In the meantime, he feels the tools will get a lot better and easier to use at the next two NABs and SIGGRAPHs. He advises designers to start getting their feet wet now.


New York City’s UVPhactory (www.uvph.com) has been working in stereo 3D for about three years, commencing with the Bjork Wanderlust music video, whose many animated elements served as “a crash course” in how to deliver stereo 3D, says principal/co-founder Damijan Saccio. You might remember it as a Post cover (Feb. ‘08).

Most recently, the company created a stereo 3D college football graphics package for cable network Versus last fall for the California/Oregon Pac-10 game. Not only was it Versus’s first 3D telecast, it also turned out to be the most-watched college football game in Versus history.

“Sports is ahead of the curve for 3D,” Saccio notes. “We created a logo animation for the end of the open, a bumper and transitions in and out of replays for a Pac-10 game that aired in early December. Our task was to tie into the look of the open while taking advantage of all the new stereo 3D capabilities.”

One of the biggest challenges posed by 3D is that most clients have no way to view it properly. “That complicates matters and requires us to create anaglyph versions for 2D computer screens, which can be seen with the old red and cyan 3D glasses, so a client can get some idea of what the elements are going to look like in 3D,” he explains. “Then, to accompany that, we also show a full-color left eye render so that the client can see what it looks like in full color. Ideally, a client would have a 3D monitor like we have in our office, which allows for full 3D viewing with the same kind of polarized glasses that you use in a movie theater.”

Rendering left and right eye versions doubles the amount of rendering time, which “can be significant” depending on the length of the deliverables. “As much as the software has improved and become automated, there are so many adjustments you can make to fine tune things that you have to factor that into your schedule,” notes executive producer Paul Schneider. “We get a lot of 3D package requests with the same schedule as 2D packages; it’s part of the education process to realize it’s going to take longer.”

UVPhactory used Autodesk’s Softimage|XSI as its primary tool for the college football package where banks of stadium lights and spinning logos serve as transitions, and a football sails through the ring of the Versus logo poised over the goalpost at the end of the open. “XSI has a new release for subscribers that contained some stereoscopic tools, which were very helpful to us,” reports Saccio. Adobe After Effects and Apple’s Final Cut Pro were also employed.

Previous 3D projects had been delivered digitally, but this project had to be delivered on tape. “We found that Sony’s SRW-5800 recorder, which we booked at PostWorks, can input the left and right eye into two separate video channels. So it can go out live and feed full-size left and right eyes, which the live production truck required,” Saccio says.

UVPhactory is currently finishing stereo 3D titles for a TV show and a graphics package for another 3D broadcast job.


Having created stereo 3D main-on-end credits for Shrek Forever After and the new 3D logo for the 3ality camera system, Hollywood’s Yu + Co turned its attention to crafting smarter-than-the-average main-on-end credits for the Yogi Bear 3D feature.

The company pitched several ideas for what was supposed to be the main title sequence, but later became the main-on-end credits. A scenario depicting the original cartoon characters — Yogi, Boo Boo, the Ranger — in silhouette traveling through a cel-animation-style landscape filled with pic-a-nic references (pretzels, watermelon, kabob ladders, cotton-candy clouds, fireworks with turkey drumstick and carrot pinwheels) had “equity that old and new audiences could identify with,” notes art director/design lead Synderela Peng. The credits stood in stark contrast with the style of the live-action/3D-animated film.

But the end sequence’s graphics needed to be crisp and clean with no fuzziness around the edges, something that’s not always easy to achieve in stereo 3D. 

“There are very specific things you have to think about in designing in 3D,” notes animation director and VFX legend Richard Taylor, who was a VFX director on the original Tron. “You have to be aware of the contrasts in very graphic sequences. Moving very bright images against a dark background can give you ghosting around the images, and after-imaging and strobing can be more accentuated depending on the speed you move from left to right.”

Because the credits did not have rounded or shaded objects but, instead, “lots of planes of material going back in space,” care was given to make sure the graphic elements appeared dimensional, he says. “When you add a bit of texture, even very subtle texture or tonal changes, behind the elements all of a sudden you have 3D.”

He believes that Peng’s biggest design challenge was the transitions from scene to scene. “You want to be clever and find movements to bring into negative space without being heavy handed,” he says. “We had to be very careful about edge violation in negative space so images didn’t fade or pop away. And because the characters were silhouettes they didn’t have a lot of dimension, but the things around them did. So we had to let other things be more dimensional, which was kind of tricky.”

The team at Yu + Co employed After Effects for the graphics and Flash for the character animations, in order to capture the fluidity and naturalness seen in traditional cel animations, explains Peng. 

Animators worked on their own computers in anaglyph to rough out sequences, then moved on to large 3D monitors requiring active shutter glasses to preview the 3D effects and convergence. A series of approval sessions in the theater at FotoKem followed.

“You have to see the work in a theatrical environment where it’s projected and you’re wearing polarized glasses like the movie audience,” says Taylor.

He finds that the lessons learned from the Yogi Bear end credits were “more about design than the technology: the speed at which you move objects, the contrast range of objects, the amount of depth to build into scenes. This isn’t instinctual — it comes from trial and error. The more you experiment in designing for 3D the better you get.”

At Yu + Co, Garson Yu was the creative director, Sarah Coatts producer and Sean Hoessli effects coordinator. The design team led by Peng featured Edwin Baker, John Kim, Daryn Wakasa, Etsuko Uji; the 3D stereo compositors were Stevan del George and Mark Velacruz; the After Effects team was composed of Jill Dadducci, Andres Barajas, Gary Garza, Wayland Vida, Alex Yoon; and the animators were Josh Dotson, Eddie Moreno, Noel Belknap, John Dusenberry, Dae In Chung, Ben Lopez and Pota Tseng. Jason Sikora and Latoria Ortiz handled the editorial.


The stereo 3D storytelling that Hollywood’s PIC Agency did for the prologue and titles for the feature My Bloody Valentine led to a recent assignment for the fourth film in The Final Destination franchise.

Their original brief for The Final Destination main titles was to focus on the continuing premonitions the main character has of his friends dying in horrific ways. But creative director Jarik van Sluijs suggested revisiting the grisly deaths of the previous films via an X-ray technique.

The design team was faced with a big challenge, however: They couldn’t show clips of any of the actors in the first three movies. But recreating some of the death scenes using an X-ray technique would “make fanboys happy” without revealing actors’ identities, he reports.

They staged HD shoots for some elements and animated still photography for others, rotoscoping the results to give depth. In a testament to the fidelity of their live-action recreations, the designers were at one point cautioned about including clips from the previous films in their titles and had to explain that the footage was their brand-new material, not original excerpts.

The monochromatic main titles, stained with red blood spatters, show drills boring, car parts tumbling, a light pole falling, pyro blazing and bodies suffering impalings and decapitations. All are depicted with an X-ray perspective that effectively conveys the horror The Final Destination aficionados have come to expect.

The titles would have stood out in 2D, but now PIC Agency had another dimension to work with. “We had to determine how to do the 3D and make it beautiful, how to use all the gags and still tell the story,” says executive producer Pamela Green.

“You know how you look at an X-ray and see scratches, dust and dirt?” asks van Sluijs. “We added them in one pass but they made everything look flatter because they accentuated the plane of the screen. We went back and did separate dirt passes — some deeper, some outside the screen, none on the screen level — so it felt like a real window.”

The two virtual cameras in their software of choice, Autodesk Maya and Cinema 4D, “sync up with the software so we could control convergence and 3D,” he points out. “We did some painting in Photoshop, composited in After Effects and edited in Avid.”

The most important step was testing the titles’ convergence and pop-out effects. Although the designers were able to do that in-house, it was critical “to see the titles on the big screen,” says Green, a process that took place at FotoKem.

“People usually have in-your-face pop-ups, and we’re against too much of that,” she reports. “We wanted to focus more on the design elements.” PIC Agency also created the premonition sequences within the film and treated them rather “conservatively,” too.

Van Sluijs does admit to one “old-school” 3D moment involving a premonition of a snake. “It reminds you of the 3D pick axe coming at the audience,” he laughs. “But for the titles we tried to subdue the elements coming out of the screen. We played with the typography a lot: It hovers just outside the screen and creates a lot of depth.”

He notes that even had they been crafted in 2D, the titles’ “design, editing and storytelling would have been the same. 3D adds an extra element and takes more time to render and composite.”

“It’s still an education for everyone,” says Green. “When we look at tests with clients we go into the theater, put on the glasses and hope for the best!”


Sky Creative, the in-house creative department of broadcaster BSkyB, based outside London, may have the most continuous stereo 3D experience of any graphics house today. BSkyB launched a stereo 3D channel showing sporting events in pubs and clubs in April ‘10; a domestic 3D channel, Sky 3D, launched in UK homes in October ‘10, airing original sports and arts programming and existing 3D movies.

“We started visual effects and graphics development work about six weeks before Avatar was released, in conjunction with the testing of the OB trucks,” recalls VFX manager Sarah Cloutier. “We produced a cinema spot that ran in front of Avatar that used Softimage|XSI for CG and The Foundry’s Nuke for compositing, and finishing in SGO Mistika. That gave us the basic platform and pipeline we’re using for all our 3D broadcast design, channel branding, bumpers and promos. The only change is now we are using Nuke for desktop compositing, and we correct geometry, finish, master and grade in Mistika or Autodesk Flame.”

Flame’s 2011 release provided “very robust stereo tools for all graphics packages,” she notes. “Every week we do quick football match openers, team badges for transitions and wipes, and goal and trophy wipes. We supply Flame data to Vizrt, which plays out the graphics live from the trucks.”

She says that a year ago “we had the idea of taking the sensibilities of a feature film pipeline and making it fast for TV. We don’t have the time that film people do: we have to turn things around in hours sometimes. We’ve written our own plug-ins for Softimage and are working with SGO to ensure that Mistika continues to be developed at the speed we need.”

Sky Creative now routinely works in anaglyph shaders with three cameras in a scene: left, right and center. “We use the center to render the HD version,” says Cloutier. “When you’re designing titles or channel branding, if you only use the left eye for the HD version it feels weird; it’s not center-framed. When we roto or comp with Nuke we work with the left and right eye separately, and when we color match we use a split screen.”
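A three-camera setup like the one Cloutier describes can be derived from a single center “hero” camera by offsetting the left and right eyes along the interaxial axis. The sketch below uses an illustrative toe-in model with hypothetical parameter names; it is not Sky Creative’s actual shader rig, and many productions prefer parallel cameras with sensor shift instead of toe-in:

```python
import math

def stereo_rig(center_x, interaxial, convergence_dist):
    """Derive left/right camera x-offsets and toe-in angles (radians)
    from a center camera. The center camera renders the 2D/HD
    deliverable so framing stays centered between the two eyes."""
    half = interaxial / 2.0
    # Each eye rotates inward so its axis crosses the center line
    # at the convergence distance.
    toe_in = math.atan2(half, convergence_dist)
    left = {"x": center_x - half, "yaw": +toe_in}
    right = {"x": center_x + half, "yaw": -toe_in}
    return left, right

left_cam, right_cam = stereo_rig(0.0, 6.5, 300.0)
```

Rendering the center camera for the HD master, as Sky does, avoids the off-center framing you get from reusing one eye’s render.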

The chief difference in designing graphics for 3D is “the need to fill the depth budget, to put things in the back, middle and foreground to give a sense of depth. But things can’t be too busy,” she cautions. “The design has to be more refined, simpler. If you put too many graphics and moving elements in 3D space it can be very confusing to the viewer.” 

Sky Creative recently crafted a stylish ID for Panasonic, which sponsors 3D films on Sky Movies, that was such a hit that Panasonic now also runs it in cinemas. They supervised the live-action 3D shoot in Prague, flying in a replacement camera from the UK when one of the cameras on the rig failed. The shoot captured a woman walking through a magical environment; Sky Creative added CG snow and composited in glowing teardrop-shaped lamps on the floor. When the camera pulls back viewers realize the woman is actually inside the TV.

“Fully-integrated CG and compositing in 3D is becoming more common and easy to produce in short time frames due to the robust and efficient workflow Sky Creative [has] developed,” Cloutier reports.


Kim Lee, who heads New York City’s Worlds Away Productions Digital, says his company “jumped into the deep end of the pool” with its autostereoscopic (glasses-free stereo 3D) project for the world launch of the Lexus CT 200h. Featuring eight views, the all-CG intro to the vehicle, displayed in a kiosk at public venues worldwide, had Lee dealing with eight streams of images instead of stereo 3D’s two.

“Currently, the eight-view autostereoscopic technology is great for digital-signage applications,” he notes. “It’s great for venues where you want to make an impact and capture viewers in a transient audience environment.” The Manhattan showcase for the Lexus video was High Line Park downtown.

The project came to Worlds Away via Alioscopy, makers of the glasses-free 3D display. Lee teamed with AMCI agency creatives to co-direct the video, which has a “change your landscape” theme. It integrates a photoreal CG Lexus CT 200h with flying 3D type and a real human hand that drops CG models of iconic structures — the Sydney Opera House, the Chrysler and Empire State Buildings, a pagoda, Seattle’s Space Needle — around the car, building a unique, international landscape.

Worlds Away pared down extremely detailed CG models of the vehicle supplied by Lexus and sourced architectural models from TurboSquid. Lee used Autodesk 3ds Max as his primary tool, tapping Autodesk’s Combustion 2008 (“an old program, but I know it so well because I used to train people on it”) for compositing and motion blur. 

“Combustion spit out eight views of each shot, which we put through Mix8, Alioscopy’s proprietary software for interlacing eight views into one,” Lee explains. After it yielded one sequence for each shot, he conformed the video in After Effects to produce the uncompressed AVI required for playback.
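The interleaving step Lee describes can be illustrated with a simple column-interleave, where each display column shows pixels from one of the N views so the lenticular lens steers each view to a different angle. This is a deliberately simplified stand-in for Alioscopy’s proprietary Mix8 (real lenticular displays typically interleave at the sub-pixel level and along a slanted lens axis):

```python
import numpy as np

def interleave_views(views):
    """Column-interleave N same-sized rendered views, given as a list
    of (height, width, channels) arrays, into one frame: column j of
    the output is taken from view (j mod N)."""
    n = len(views)
    out = np.empty_like(views[0])
    for i, view in enumerate(views):
        # Every n-th column, starting at column i, comes from view i.
        out[:, i::n, :] = view[:, i::n, :]
    return out
```

Because each view contributes only one column in N, autostereoscopic content is usually rendered at full resolution per view and then decimated by the interleave, which is part of why eight views meant eight full render streams for the project.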

With an autostereoscopic project the “cinematic vocabulary most artists are accustomed to” flies out the window. “You have to rethink how you want shots to flow,” he says. “Sometimes the traditional way to do something won’t work for autostereo and you have to use forced-perspective tricks.”

As to pop-out effects, Lee gave all the typography off-the-screen dimensionality and provided the vehicle with subtle 3D effects, such as popping out the tip of a bumper. But the hand, featured building the remarkable CG landscape, remained a 2D element.

“Doing a real-world project like this helps you speak to future clients and give them better ballpark estimates of what they’re in for in terms of costs and time,” Lee notes. “The more you do it, the more you can give absolutes, which is really important to clients.”