Issue: April 1, 2010

VFX FOR TV

By: Ken McGorry
Bad things happen to good people. Really bad things happen to really bad people: burning, freezing, liquefying, disintegration, dismemberment, infestation, varicose-vein affliction, extreme anorexia and, yes, even hair loss. Add to that the ability to fly unaided, violent crashes, deadly explosions, pyro, bad weather, megaton detonations and acts of God. Yep. You’re watching broadcast television on any given night.
Television episodic effects supervisors have a tough, tough job. Andrew Orloff, VFX supervisor at Zoic, asserts, “They have to be a lot more on the ball and a lot more stringent and organized to make sure that there’s no winging it. You don’t have time to think of a new plan.” However, the broadcast veteran says, “What I love about broadcast television is it’s a great crucible for creative ideas in visual effects.”
That’s what television production today is about, especially for visual effects work: turnaround times are brutally short and budgets are not too big.
And then there are the locales. For a TV episodic to differentiate itself, its locale is of great importance. Digital image manipulators are hard at work as we speak tweaking greenscreen shots to look like they were shot anywhere from New York City to Mumbai.
Then there’s the relatively instant gratification TV broadcast can bestow on its practitioners. When their shows are broadcast, millions of people can see their work. And it lives on in cyberspace and possibly in DVD release (and don’t forget “reruns”).
Meanwhile pros working in broadcast visual effects attend movies just as television audiences do and they are quite intimate with what the biggest budgets and craftiest artists can put up on the big screen. Here, five VFX houses — all headquartered in the LA area — discuss how they grapple with the tight deadlines and increasingly sophisticated audience expectations that broadcast production faces.

INHANCE-ING CSI: MIAMI

Inhance VFX in Los Angeles counts CSI: Miami among its many clients. CBS promoted a recent episode of the long-running crime series — episode 815, called Miami, We Have A Problem — during the Super Bowl and hyped its unusual nature: cops in space. The Super Bowl promo was also exciting for Inhance’s VFX supervisor, Eddie Robison; he got to see his own effects work broadcast during one of the world’s biggest sporting events.
CSI: Miami prides itself on visualizing how a fictional attack or murder was accomplished and, since episode 815’s killing took place in an orbiting civilian spacecraft, the audience needs to see it. Robison has worked 15 years in TV and feature film VFX production, and his seasons on Star Trek: Enterprise gave him a lot of experience visualizing outer space and strange planets.
Inhance (www.inhancevfx.com) provided seven hero shots for this episode and followed that with about 31 rig-removal shots to erase the harnesses supporting the various actors appearing weightless aboard the spacecraft. Inhance has a number of tools at its disposal. Robison is a NewTek LightWave 3D artist and, in his lead VFX role, brought Eyeon Fusion to the shop when he joined about three years ago. (Inhance also uses Adobe After Effects.)
“Television is about budget and time,” Robison says, “and Fusion is great for that.”
The shots include two deceptively simple scenes where the show’s actors examine the spacecraft in its hangar. The craft itself is an amalgam of a Gulfstream G4 corporate jet with details composited by Robison to suggest something Richard Branson might fly into space. The backing plates for this sequence were shot on Genesis cameras — as is the whole series this season.
Robison calls the hangar sequence, including tracking the moving camera, the most challenging: “a very long dolly shot following the actors across the hangar and right up to a real G4 Gulfstream jet.”
The back plate shows a corporate jet parked under the lights as David Caruso and company approach, discussing a murder as the camera follows. Robison removed the rear of the plane and replaced it with his own design — a double set of portholes and big rocket engines. It’s common for Robison to go on set with CSI: Miami VFX supervisor Larry Detwiler for effects shoots. Here Robison shot stills from numerous angles using a Nikon D90 to create environment maps capturing the reflection of the hangar’s overhead lights off the jet’s fuselage, which he then used on the CG portion of the craft.
“A ‘clean’ hangar wall was painted and merged over the plate using the tracking data,” Robison says, creating a new plate with the rear half of the real jet removed. “This new background was then 3D tracked by artist John Karner and the data was brought into LightWave to render out the new, modified section of the spacecraft.” Inhance VFX used Boujou software on tracking shots.
Robison seamlessly blended the CG with the real fuselage. “Depth blur, color correction and film grain tied everything back together,” he says, “and then the actors were rotoscoped back on top of the completed shot.” The hangar shots also required compositing in hangar staff people ambulating in the background.
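The blend recipe Robison describes — matched blur, a color correction and shared film grain — can be sketched in a few lines of NumPy. This is a generic illustration of the technique, not Inhance’s pipeline; the blur, color-match method and grain strength are invented placeholder values.

```python
import numpy as np

def _soften(img):
    # Crude 3x3 box blur as a stand-in for a proper depth blur.
    out = img.copy()
    for axis in (0, 1):
        out = (np.roll(out, 1, axis=axis) + out + np.roll(out, -1, axis=axis)) / 3.0
    return out

def blend_cg_into_plate(plate, cg, matte, grain_strength=0.02, seed=0):
    """Composite a CG element over a live plate, then tie the layers together.

    plate, cg: float arrays in [0, 1], shape (H, W, 3)
    matte:     float array in [0, 1], shape (H, W) -- CG coverage
    """
    # 1. Soften the CG slightly so its edges sit in the lens, not on top of it.
    cg_soft = _soften(cg)

    # 2. Crude color match: shift the CG's mean color toward the plate's.
    cg_matched = cg_soft + (plate.mean(axis=(0, 1)) - cg_soft.mean(axis=(0, 1)))

    # 3. Matte the CG over the plate.
    comp = plate * (1.0 - matte[..., None]) + cg_matched * matte[..., None]

    # 4. Regrain the whole frame so plate and CG share one noise signature.
    rng = np.random.default_rng(seed)
    comp += rng.normal(0.0, grain_strength, comp.shape)
    return np.clip(comp, 0.0, 1.0)
```

The order matters: grain goes on last, over the whole frame, precisely so the real and synthetic halves pick up an identical noise signature.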
Robison says LightWave “is a great tool for television” with its time and budget restrictions, especially for solid-body vehicles. He also appreciates how LightWave can work with unlimited render nodes.
This sequence’s four Earth-orbit shots were satisfying for Robison because he could call upon his experience on series like Star Trek. The orbital shots show Florida (of course), and the production benefitted from an extremely detailed, 26,000-pixel map of the Earth newly provided by NASA. Besides the CG sphere carrying the terrain map, separate spheres are needed for clouds and shadows, for the atmosphere’s glow, and to make CG stars “fade” when close to Earth’s atmosphere.
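One simple way to get that star “fade” is to attenuate each star’s brightness by its angular distance from the Earth’s limb. The sketch below is a generic approach, not Inhance’s actual shader, and the atmosphere band width is an invented parameter:

```python
import math

def star_brightness(star_angle, earth_angular_radius, atmosphere_band=0.05):
    """Attenuate a star as it nears the Earth's limb.

    star_angle:           angle (radians) between the view ray to the star
                          and the view ray to the Earth's center
    earth_angular_radius: apparent radius of the Earth's disc (radians)
    atmosphere_band:      angular thickness of the glowing atmosphere (radians)

    Returns a brightness multiplier in [0, 1]: 0 behind the disc,
    ramping up to 1 outside the atmosphere band.
    """
    if star_angle <= earth_angular_radius:
        return 0.0                        # occluded by the planet
    edge = star_angle - earth_angular_radius
    if edge >= atmosphere_band:
        return 1.0                        # clear of the atmosphere glow
    # Smoothstep across the atmosphere band for a soft fade.
    t = edge / atmosphere_band
    return t * t * (3.0 - 2.0 * t)
```

Applied per star in the sky layer, this keeps the starfield from reading as a flat card pasted behind the glowing limb.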
Other VFX shots included bloodshed in weightless space and a push-in showing a flock of micro-meteorites piercing the ship’s hull. The micro-meteorites, Robison says, were motion control 1000fps Photosonic plates shot by Detwiler and “sandwiched together into a CG shot to create the effect of micro-meteors rupturing their oxygen tank.”
Inhance VFX artists Bruce Coy and Mike Underwood were key in the subsequent rig-removal work showing performers in weightless space. “Fusion is a fantastic wire-removal tool,” Robison adds.

VAMPING IT UP

Entity FX, with facilities in Santa Monica and Vancouver, has been busy with this season’s effects work on The CW’s Smallville and AMC’s Breaking Bad. But one newcomer to TV has created a stir — perhaps fueled by the public’s rabid interest in vampirism — The CW’s Vampire Diaries.
One audience-satisfying vampire sequence has long been the transformation from the pale and thirsty to the fully outed rampant vampire. Besides the fairly straightforward speed-ramp and harness effects that help depict superhuman powers, such as jumping off a building, another key factor in exposing the undead characters in Vampire Diaries is their eyes. When, er, agitated, a vampire’s eyes grow dark and the surrounding skin, including the cheekbones, will sprout a mild case of bulging veins. With their realistic pulsation and sickly color, these CG veins actually help convey a vampire’s mood. Usually a bad mood.
“Perhaps the most challenging thing is that a lot of the effects that we do are kind of performance-related,” says Mat Beck, LA-based senior VFX supervisor at Entity FX (www.entityfx.com), which has done all the effects for Diaries. These dramatic effects “tie in with the actors and feed off the actors and contribute to the moment. Their eyes change in a lot of different ways that reflect their emotional state.”
Various vampire emotions include lustful, hungry for blood and emotionally conflicted, but they also must be readable to the audience — vampires mostly appear in dark environments. Each character is assigned unique vein “geography” and their veins actually behave differently depending on what they’re experiencing. “One of our precepts is that everything has to look photoreal and in-the-scene,” Beck says. “The veins may be an odd thing to see on somebody’s face, but not an incredible thing.” In another example of surreal realism, one episode has a female witch revealing her powers to a friend by playfully levitating a roomful of feathers courtesy of Entity’s CG and effects.
Besides wire removal and super-speed activities, Entity FX has also provided Vampire Diaries with CG fog effects and CG animals such as a crow “to add to the creepiness and mystery of the environment.”
Entity has the capability to create digital doubles for stunts, but traditional rig-removal is still more efficient. The shop has an array of tools for hiding rigs, including After Effects, Nuke and Flame. This can be simple work, but sometimes a wire can get in the way of an actor’s face. “More often,” Beck says, “it’s not the wire that’s the problem, it’s the rig that’s deforming the clothing. You can get a cloth bulge rising out of the small of the back — often you spend much more time on the clothing than the wire. After Effects is a great package for a lot of this stuff and for certain special cases we’ll use Flame.” Entity also uses Nuke, but less so on Vampire Diaries. The show is shot on Sony F-35s along with some smaller cameras, but Entity can work with any mix of formats from 35mm to Red to Canon 7D to HDV.
The lighting for many Vampire Diaries scenes reminds Beck a little of the early days of The X Files. To make such shots mysterious yet readable to the audience, “you have to have really good control of all the parameters of lighting. In terms of the veins we have a mixture of techniques — some of them full-on 3D tracking of the actor’s face as it moves — and 2D and 2 1/2-D techniques. They all benefit from knowledge of the real lighting in the room. A raised vein may show itself by picking up a little bit of highlight on one side and cast a shadow on the other side. Our reference is most often photographs of veins. We have a proprietary system that involves distortion of the surface so that the points on the [CG] skin move with the points on the actor’s skin.” Entity even has software sliders to animate variables like puffiness, color and “crinkliness” of an individual CG vein — even the pulsation caused by blood flowing through it.
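Entity’s slider-driven vein rig can be imagined as a handful of animatable parameters modulating a displacement over time. The parameter names below echo the ones Beck mentions, but the formula is a generic illustration, not the proprietary system:

```python
import math

def vein_displacement(t, puffiness=1.0, crinkliness=0.3, pulse_rate=1.2, pulse_depth=0.25):
    """Height of a CG vein above the skin at time t (seconds).

    puffiness   -- base raised height of the vein
    crinkliness -- high-frequency wobble along the vein (added as jitter here)
    pulse_rate  -- heartbeats per second driving the pulsation
    pulse_depth -- how much each heartbeat swells the vein (0..1)

    All parameters and the math are illustrative stand-ins for
    Entity FX's proprietary sliders.
    """
    # Only the positive half of the sine reads as a heartbeat "push."
    pulse = 1.0 + pulse_depth * max(0.0, math.sin(2.0 * math.pi * pulse_rate * t))
    jitter = crinkliness * math.sin(37.0 * t)   # cheap stand-in for crinkle noise
    return puffiness * pulse + jitter
```

The displacement would then drive both the surface bump and the highlight/shadow response Beck describes, so the vein catches a bit of rim light on one side and casts a soft shadow on the other.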
Entity FX uses Autodesk Maya for 3D CG work. Motion tracking software includes Maya Live, Boujou, SynthEyes and Pixel Farm’s PFTrack. Shotgun production software provides shot data management. Entity is a full-service shop serving feature film and TV production equally. “Our pipeline is based on multi-skilled individuals in different departments,” Beck says, and the company is flexible enough to accommodate features and episodic TV, as well as the low-budget indie.

CASTLE EXTENSIONS

Where are we? That’s a question often floating in the minds of TV viewers. Television is particularly good at establishing a story’s dramatic, gritty locale in the opening and then giving you a clean, wholesome, more affordable setting such as, oh, Toronto.
It makes sense for the producers of the series Castle to work in LA. But Castle — a murder-mystery-with-wit series depicting a bestselling crime author and the beautiful detective he wishes he knew better — is set in rough, tough New York City. The geography gap — as well as any potential credibility gap — is nicely bridged by a kind of one-woman second unit in the person of Leslie Robson-Foster. She’s an Encore Hollywood VFX supervisor who prowls NYC and environs with knowledge of each episode’s needs and a nice Canon 5D camera capturing high-resolution photos that ultimately stand in for the dramatic backdrops Manhattan offers at night, in the day and at magic hour. Her background plates then go to Encore Hollywood (www.encorehollywood.com) and VFX producer Jon Howard, an Encore veteran with many years in both telecine and visual effects.
Is there a connection between the two today? “Technologically there are fundamentals that apply to both disciplines,” Howard says, but he finds VFX a little more to his liking. “It’s actually analyzing the image and pulling it apart to either add something that’s missing or remove something that shouldn’t be there.”
Robson-Foster’s set-extension plates marry with live action via either Inferno or After Effects, or both. Sometimes she captures traffic elements for use in background plates by shooting video with the Canon still camera. “We can also take the cars, duplicate them, and have them driving or move them around in the frame if need be,” Howard says. Inferno serves well for trickier work like this. Output is 1920x1080 HD. “It’s got very fast processing,” Howard says. “It’s such a powerful platform it can handle doing multiple passes pretty easily.”
A memorable episode aired in January featuring a powerful scene between the titular Castle and a love interest on a rooftop. It’s a night exterior to which “they wanted to add a classic, beauty New York City skyline,” Howard says. The sequence comprised about 30 bluescreen shots of the actors, including moving crane shots, and Encore added the backgrounds. Encore took Robson-Foster’s plates — stills shot at night with extended exposure — and put them together to make one big master, adding additional lights and distant heat-shimmer. The whole skyline had to be tracked into the camera moves.
The Castle production team wants audiences to feel that the show takes place in New York and they appreciated the ultimate impact Encore gave the sequence. Robson-Foster works directly with Castle executive producer Rob Bowman who Howard calls “a consummate professional.” Encore scaled-up her nighttime skyline stills to 4K and extended the sides to allow for the camera movement. They also pulled the back plate apart a little to create more of a sense of depth.
Bob Minshall and Brian McIntyre are key Inferno artists on Castle. “The speed of the Inferno is a great asset,” McIntyre says. “With only two artists handling nearly all of the workload on a 25-30-shot scene and a two-week turnaround, it easily allowed us to get everything completed with plenty of time to spare to work on the fine details. It provides a variety of keying options which come in very handy when you’re trying to get that little wisp of hair to come through. And its compatibility with the 3D tracking software Boujou helped solve some of the more difficult tracking problems. Using plug-ins we applied a lens blur to the background, along with a subtle ‘heat haze’ to give the effect of shimmering lights.” McIntyre adds that “Inferno also performs nice color matching, which is necessary when stitching together a background using stills with different lighting conditions.”
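At its core, the “heat haze” McIntyre mentions is a small, time-animated distortion of the background plate. A minimal NumPy version might shift each row by a sine field that drifts over time — the amplitude, frequency and speed values here are invented for illustration, not taken from the Castle pipeline:

```python
import numpy as np

def heat_haze(frame, t, amplitude=1.5, frequency=0.15, speed=3.0):
    """Apply a subtle shimmer to a background plate.

    frame: float array of shape (H, W, 3); t: time in seconds.
    Each row is shifted horizontally by a sine that drifts with time,
    mimicking rising heat over distant city lights.
    """
    h, w, _ = frame.shape
    rows = np.arange(h)
    # Per-row horizontal offset in pixels, animated over time.
    offsets = np.round(amplitude * np.sin(frequency * rows + speed * t)).astype(int)
    out = np.empty_like(frame)
    for y in range(h):
        out[y] = np.roll(frame[y], offsets[y], axis=0)
    return out
```

Rendered over successive frames, the drifting phase is what sells the effect; a static distortion just looks like bad glass.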
Stephan Fleet is the lead After Effects artist responsible for such details as making background building lights twinkle in night exteriors. He will sometimes go on set.
Encore Hollywood also provides effects for series such as 90210, Three Rivers and House, as well as Dirty Sexy Money.

V & FRINGE

VFX supervisor Andrew Orloff is also executive creative director at Zoic Studios, whose Emmy-nominated effects work has been influential on a number of bigger-budget effects-driven TV series, including Fringe, Human Target, Terminator: The Sarah Connor Chronicles and True Blood. Much of Orloff’s and Zoic’s attention is now turned to the alien-visitation event series V, which returned on ABC at the end of March.
Zoic (www.zoicstudios.com) is based in Culver City and Vancouver, and is seven years old. In addition to its VFX for broadcast episodics, the company does effects for feature films, commercials and game cinematics. Staff often totals in the 90s, and considerably more during the facility’s busiest times.
V, Orloff says, shoots “a huge amount — hundreds of shots — of virtual sets per show” and one Zoic task is to provide realtime visual feedback on set, including “a realtime composite and a realtime-rendered environment so they can see temps of shots on the stage, which is super exciting.”
Zoic provides V’s production people with live feeds in two forms — a flat greenscreen feed and their proprietary ZEUS system: Zoic Environmental Unification System. Orloff and company work very closely with the series’ directors, production designers and DPs. The actors benefit, too — they can see their blocking.
“Production design designs the sets as they would traditionally but instead of handing them off to construction, they hand them off to us,” Orloff says, “and we build them virtually.” Zoic works in advance with the show’s DPs to light the sets “with a real, physical light profile.” When they all get on set Zoic loads up a realtime version of the virtual set, which is highly detailed, and it’s automatically composited into the shot. “The camera operators look at it because it allows them to compose much better with than without.” Marking the floor for actors, or even the newer practice of projecting a virtual set’s floor plan onto the floor does not provide the camera operator or director with “spontaneous opportunities” to follow a character in a more natural way, Orloff says.
So everyone can see, “we split it off to sometimes three or four monitors at a time.” The direct visual feedback is “fantastic,” Orloff says. It gives DPs “the opportunity to light into the scene in a way that normally isn’t done for a virtual-set show.”
The ZEUS temp render is not as high-res as the final will be, but it is full color and displays “a very, very comprehensive temp composite in realtime.”
The crew on V got used to the benefits of ZEUS right away. “They’ve pushed the creative limits of it — they’re doing much more complicated shots; much more complicated blocking; going hand-held with the camera; going SteadiCam; going on cranes with the camera. The system we have is very flexible and it works on permanent tracking markers on the ceiling and a lipstick-cam that points up to them — we can take it from set-up to set-up to set-up.”
V shoots with Arri D-21 cameras capturing 1920x1080 24p HD — de rigueur these days for top-of-the-line TV — to HDSR 4:4:4.
Zoic converts the HDSR video to files. The production company gets an Avid DMX media file for approval and the final delivery to the network for air is also DMX. Zoic uses Maya/Mental Ray for 3D CG and mostly The Foundry’s Nuke for 2D effects, but they also use LightWave and After Effects. Zoic also uses three Flame licenses. “Everything is run through one master renderfarm we have here,” Orloff says, “including the composites. It allows us to make a huge volume — the broadcast volume is just huge!” A “light” show can require 150 shots and the heavier shows can demand over 300 “on an incredibly tight turnaround.”
Mike Romey is Zoic’s pipeline TD. The robustness of Zoic’s pipeline allows artists to excel at what they do because it relieves them of all the heavy lifting of data wrangling and data management and allows them to “express themselves more efficiently.” 
For Fringe, Orloff says Zoic creates many digital prosthetics. “We’re taking prosthetic makeup and tracking digital elements onto it, enhancing the performance, chopping people’s jaws off, having parasites crawl out of people’s mouths and stretch their face around.” Zoic artists are encouraged to devise their own methods for visualizing such shockers. The Zoic method is expedient — have “generalists” leapfrog over the traditional department-by-department workflow — in television, there simply isn’t enough time. “We have like three days to do all that stuff!” Orloff says. “Model, texture, rig, light, maybe hand off to one specialist who’s really good at animation. Then a compositor will put it all together and maybe generate some of their own lighting passes.”
After a given round of effects production, Orloff and Romey and staff will review which artists’ most innovative effects could work best for Zoic as a whole and then codify the workflow for that effect into a script. “Then everybody can have access to it at a click of a button. We let the artists’ discoveries and innovations drive what we do in the pipeline side.”
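That codification step can be pictured as a small recipe registry: an artist’s proven workflow gets wrapped as a callable and published under a shared name, so anyone in the shop can invoke it. The names and steps below are invented examples in the spirit of the approach, not Zoic’s actual tools:

```python
# Shared registry of codified effect workflows.
EFFECT_RECIPES = {}

def recipe(name):
    """Decorator that publishes an artist's workflow under a shared name."""
    def register(fn):
        EFFECT_RECIPES[name] = fn
        return fn
    return register

@recipe("digital_prosthetic")
def digital_prosthetic(shot):
    # Placeholder standing in for the model/texture/rig/light/comp steps.
    return f"{shot}: tracked prosthetic element composited"

def run_recipe(name, shot):
    """Run a published workflow by name -- the 'click of a button.'"""
    return EFFECT_RECIPES[name](shot)
```

The point is the direction of flow: individual artists’ discoveries feed the pipeline, rather than the pipeline dictating how artists work.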
Zoic is often consulted at the very early outline stages of a series’ season, Orloff says. “We’ll go in and say, ‘Here’s 10 cool effects that we think will be great for your show.’ And it’s really cool to see them written into the scripts as they come along.” As each episode progresses, VFX artists get to improve an effect over and over — “It’s like getting paid to do R&D!”

OUTSOURCED & MORE

In addition to services like compositing, editing, 3D CG and previsualization, Stargate’s staff offers film and HD production as well as a specialty of its own called the Virtual Backlot. It’s designed to provide productions — particularly those without a lot of time and budget — with an immersive, 360-degree world, which may be “photographed from any angle.” Such virtual locations may be true to life or a fantasy world (with help from CG) or a mixture of both.
Just last month Stargate founder Sam Nicholson was in Mumbai shooting 360-degree locations for a new sitcom called Outsourced. It’s not for Indian consumption, though; it’s a pilot for NBC featuring an American whose employer has sent him to India to work at a call center that serves customers in the US.
“The television business is becoming the lead innovator in [VFX] because we can move so much faster,” says Nicholson. As far as he’s concerned, effects-driven filmmakers need to start paying more attention to broadcast TV. “The innovations that we do in television are actually going to start appearing in feature films now. Digital imaging is all about television.”
Based in Pasadena and Vancouver, Stargate Studios (www.stargatestudios.net) has provided VFX for numerous TV episodics, including 24, Heroes, Grey’s Anatomy, ER and CSI: New York.
Nicholson’s early background is in film and optical printing. While he recalls optical printing as “a pain,” Nicholson allows that today’s digital processes are evolved from the old techniques. But what’s done today has been completely enabled by digital technology, he says, adding, “Where it’s going is completely off the charts.”
For gathering location footage, Nicholson shoots Canon cameras exclusively: “The 1D, 5D, 7D — still cameras that shoot motion in high def.” Regarding Outsourced, he says, “Traditionally pilots don’t have enough time and money — particularly in the half-hour format — to go a lot of places. But this is the way we’re ‘opening up’ the show and making it stand out — it’ll look like they’re in Mumbai each week.”
To shoot a 360-degree environment for use on Outsourced Nicholson and crew set up an array of eight Canon cameras: “When we’re driving, we’re looking straight forward, straight back, sideways, three-quarter-front, three-quarter-back and reflection passes, all rigged on a car in traffic. We get a 360-degree view of most of the places we shoot; then we can go [to LA] and remap that onto a three-dimensional set. It allows you full freedom of camera movement, hand-held, as if you were really shooting a scene and wanted to walk around someone.”
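Remapping an eight-camera driving array onto a surrounding virtual set starts with bookkeeping: each camera covers a heading, and each direction of view on the CG set pulls from the nearest plate. The toy selection step below assumes evenly spaced headings, which is a guess at the rig layout, not Stargate’s actual specs:

```python
# Assumed headings (degrees) for an eight-camera car rig:
# forward, back, the two sides, and four three-quarter angles.
CAMERA_HEADINGS = [0, 45, 90, 135, 180, 225, 270, 315]

def camera_for_heading(heading_deg):
    """Pick the rig camera whose view axis is closest to a desired heading.

    Returns (camera_index, angular_error_deg). In a real remap each
    pixel of the virtual set would then be projected back into the
    chosen camera's plate; here we only do the camera selection step.
    """
    heading = heading_deg % 360.0
    best_idx, best_err = 0, 360.0
    for i, cam in enumerate(CAMERA_HEADINGS):
        # Wrapped angular difference, folded into [0, 180].
        err = abs((heading - cam + 180.0) % 360.0 - 180.0)
        if err < best_err:
            best_idx, best_err = i, err
    return best_idx, best_err
```

The wrap-around subtraction is the detail that matters: a virtual camera pointed at 350 degrees should pull from the forward plate, not the three-quarter-back one.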
Once the footage is back at Stargate Studios the plan is to mount an auto-rickshaw (a three-wheeled moped popular in India) onto a turntable on a greenscreen stage. “We have a stage specifically built for shooting realtime visual effects on greenscreen. It’ll look like we’re doing 360’s around an auto-rickshaw.” So there’s your Mumbai.
On the greenscreen stage Nicholson shoots Sony F-35s and is getting into the new Alexa camera from Arri — both digital cameras are designed to emulate 35mm. The camera operators can circle the actors in the rickshaw with the Mumbai location footage providing a backdrop from every side. “The plates that I shoot [in Mumbai] will be streaming in realtime and be composited so the actors can look out the windows and see ‘where they are’ in Mumbai — which is a tremendous help for acting and for directing.” To accomplish this, Nicholson needs to be extremely thorough on location and shoot the scene from every conceivable angle — close-ups, wides, establishing shots, etc. — without the actors.
By contrast, a show like Heroes requires labor-intensive post production to depict a virtual environment like a devastated New York City or to create a photoreal virtual human being. Meanwhile, Nicholson says, “in 24, we’ve been putting New York in this entire season and the actors have not had to travel to New York. Wouldn’t it be cool if you could go out in a backlot and take a camera and look up in the air and New York was there? Or Paris or London? That’s what this is and we’re seeing it now in realtime, which is what’s so exciting onstage. We don’t have to wait for weeks to see every painstaking composite done — we’re compositing live onstage.”
Stargate uses After Effects for compositing and Maya and LightWave for CG. Outside LA, Nicholson has digital artists set up in the Vancouver sister-shop as well as talent working in Toronto, Malta and Mumbai. All can communicate and collaborate easily using a proprietary Stargate program called VOS — Virtual Operating System.
Back in Mumbai Nicholson has trained people ready to feed him additional 360-degree plates for upcoming scenes in Outsourced. He’s also planning to set up an official Stargate India facility to provide outsourced back plates and related services.
Stargate is presently creating a “Virtual Backlot” set that will be square miles in size with shops and buildings that you can virtually enter — a “3D virtual reality that looks photoreal. It’s essentially texture-wrapped reality — wrapped like wallpaper onto a CG environment.”
Nicholson asserts that the age of punishing writers for writing “unproduceable” scenes is coming to a close. “The innovations that come out of television are going to be absolutely stunning over the next 10 years. There are some very exciting things coming down the pipeline — ambitious filmmaking that’s never been tried before in television.”