Issue: April 1, 2009

VFX FOR TV SERIES

By: Randi Altman
While overt visual effects, like a liquid-metal terminator, a morphing Mustang, a frozen man or a ping-pong-playing mirror, ask you to suspend your disbelief, they can also draw you deeper into the story. It’s the job of the following visual effects pros to make you accept what you see, no questions asked.

TERMINATOR: THE SARAH CONNOR CHRONICLES

Anyone who has tuned into Fox’s series Terminator: The Sarah Connor Chronicles knows that it’s an exciting ride, a race to stop the dreaded Skynet before Judgment Day. And all along the way, our carbon-based heroes take on terminator robots: those that look like people, those that look like robots and those made of liquid metal.
Santa Monica- and Vancouver-based Entity FX (www.entityfx.com) is responsible for creating all these different kinds of terminators and then some. The “and then some” includes a ton of environments, like a post-apocalyptic Los Angeles, a burned out aircraft carrier, a nuclear reactor and various ruined urban war zones, along with flying machines like the Hunter Killer, explosions, fluids, fire, weaponry and wire removal.
The number of shots Entity FX provides varies per episode, and the team prefers as much lead time as possible. “There are some lead effects that we can start on early, but basically the show gets done in about two, two-and-a-half weeks,” reports president Mat Beck.
Entity FX helps speed up the process by modifying existing assets or building new ones, such as the “Prototype Hunter Killer,” a new variant for this season. “The modeling of this was challenging because it’s based on a sketch that looked cool in the sky but had to be transformed into a model that was capable of folding into itself and matching some important story points,” explains Beck. The 3D model was eventually built as a full-sized prop, which wreaks havoc on set in concert with its CG cousin. “There is a cool shot where the CG and full-sized prop stand in for each other in the same shot, each doing what it’s best at. It has become a truism that the most challenging and rewarding shots are the ones that combine real and CG.”
An example is Catherine Weaver (Shirley Manson), who appears human on the outside but is really a terminator. There is a scene where she is walking nonchalantly through a warehouse as her arms turn into liquid metal swords, cutting unsuspecting employees in half as she just keeps on walking. “It’s a deceptively simple sequence,” he says, “but she’s not really touching those people in the live action. When she slices them up, it’s a choreography of two real people and some CG hardware that connects them. So Junji Hirano had to build CG blades that were scary, tracked with her real arms and still motivated her interaction with her victims. To make that work, we had to replace some background and CG part of her human arms as well. Brian Harding did a super job supervising that sequence. Lead CG artist Kaz Yoshida worked on the look development for this scene.”
One of the bigger effects jobs Entity provided for this season took place in a Mexican church. The good guys trap T888 Cromartie and, with a hail of armor-piercing shotgun shells, bring him down and damage his control chip, allowing them to remove it and kill him. “There is a succession of increasing amounts of damage to his head with each hit,” explains Beck, who says the progression was based on the live-action cut, on sketches by VFX supervisor Jim Lima and on maquettes by make-up artist Robert Hall. “The internal structure was consistent with previous shows, but we had to cheat it slightly so you could see the cylinder inside it that contains the chip. At the end of the sequence he’s lying there and it’s his POV. They reach down toward camera and everything goes black. Some of the creative execs were a little emotional at his demise.”
Beck says the most challenging part of a sequence like this is the tracking. “It has to be absolutely perfect. Cromartie is moving around and we have to replace some of his head with 3D parts that are locked in place. Any high- or low-frequency wiggles, lens distortion, any variable that changes the orientation of his face has to be accounted for so it really looks like his head is half human: half the actor’s and half hardware.”
Entity FX’s senior producer, Trent Smith, was on set that day to make sure all necessary data was collected. “He made sure we got all the lens data, the camera data, and good tracking points so the software could narrow down the solution for what is actually happening in three dimensional space,” says Beck. While Entity uses a variety of tracking solutions, they called on 2d3’s Boujou for this sequence.
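Beck’s point about on-set data can be illustrated with a toy matchmove check. The sketch below is not Boujou’s algorithm, just a minimal pinhole projection and the reprojection error that camera-tracking solvers minimize; all names and numbers are invented for illustration. The better the lens data and tracking points, the tighter this error can be driven toward zero.

```python
import numpy as np

def project(points_3d, focal, rotation, translation):
    """Project 3D points through a simple pinhole camera.
    rotation: 3x3 matrix, translation: 3-vector, focal: focal length in pixels."""
    cam = (rotation @ points_3d.T).T + translation   # world -> camera space
    return focal * cam[:, :2] / cam[:, 2:3]          # perspective divide

def reprojection_error(tracked_2d, points_3d, focal, rotation, translation):
    """Mean pixel distance between tracked 2D features and the projection of
    their solved 3D positions -- the quantity a matchmove solver minimizes."""
    projected = project(points_3d, focal, rotation, translation)
    return np.mean(np.linalg.norm(projected - tracked_2d, axis=1))

# A perfectly consistent solve has (near-)zero reprojection error.
pts = np.array([[0.0, 0.0, 5.0], [1.0, -0.5, 6.0], [-1.0, 0.5, 4.0]])
R, t, f = np.eye(3), np.zeros(3), 1000.0
tracks = project(pts, f, R, t)
print(reprojection_error(tracks, pts, f, R, t))  # ~0.0
```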
Another challenge was making his skin look damaged, but not so damaged that it would cause a problem with the network. “You can’t have enormous amounts of hamburger or of skin flying off,” he says, “but we had to reveal underlying structure. So you have some reddish skin particles flying off, but not too reddish, and just the right amount of metal.”
Entity built a 3D head with varying amounts of damage; they also had the damaged maquettes and the actor scanned (by Nick Tesi at Eyetronics) so they could model his head, which needed to house a damaged terminator skull. “Each shot removed more skin and metal,” explains Beck. “All the techniques fit together so that some of the head was the real actor and some was the CG version of the actor. Some was completely synthetic metal.”
Entity FX uses Autodesk Maya for modeling and animation, and they write their own proprietary scripts for manipulating particles and creating effects. Textures are mostly via Maya, with some done in Photoshop and some being procedural.
Beck reports that they have “a pretty substantial” Linux-based renderfarm. In terms of storage they call on two systems: “We have two offices; our main office server is based on Isilon; our satellite office is served by NetApp. We also have a lot of home-built nearline storage, and proprietary software for linking up two offices so that everything is automatically backed up and everyone sees the same data.”

IT’S BACK!

Knight Rider is back, but instead of David Hasselhoff and a Pontiac Trans Am, the new NBC series stars Justin Bruening (as Michael Knight’s son) and a Ford Mustang on steroids. The visual effects house responsible for this supercharged KITT is Valencia, CA-based Master Key Visual Effects, supplying between 200 and 300 shots per episode.
Twin brothers Elan and Rajeev Dassani run Master Key (www.mkvfx.com), which set up shop in Valencia to be closer to where the show is shot.
While half the shots are car comps — KITT is shot on greenscreen to save budget — the other half involve a CG version of the car that often transforms into other vehicles, such as an F-150 pickup or an E-150 van, or is pictured in non-traditional settings, like parachuting out of a CG plane.
So how can one tiny shop put out this many effects shots per episode? “We specifically engineered our pipeline to get these shots through very fast,” explains Master Key founder/president Elan Dassani, who serves as VFX producer for the show. “We know how to do all the reflection plates, mirrors and the backgrounds, and we automated a lot of that. Our guys just open a pre-made comp with dirt passes and reflections, created by our 2D supervisor Stephan Fleet, and they can import the background and foreground plates for the specific shot. This makes for a fast process.”
When the writers realized what Master Key was capable of and on what schedule, they let their imaginations run wild. “They started doing things like putting KITT underwater in submarine mode for one episode,” laughs co-founder Rajeev Dassani. “They could toss the car in water, set it on fire, make it transform into anything they want. They knew we would find a way to do it. It was freeing for them and us.”
About two weeks prior to shooting, says Elan, “the writers would come to Master Key, let us know what they were thinking and we’d evaluate what was a manageable amount.”
In terms of the transformations, the numbers vary from episode to episode, but average between 10 and 30. “And that’s when our timeframes got really tight,” reports Elan. “They had to talk to us way in advance… prior to even knowing what the shots were, otherwise we wouldn’t have time to transform the car into a van.” 
One of the heavier effects jobs was a four-minute sequence where KITT was fighting a giant robot. “That is as big a sequence as you’ll see on TV, with the exception of maybe Battlestar Galactica,” says Elan. The car and CG robot are fighting, and that had to be composited onto live-action footage. “It’s a giant robot fighting a real car that was filmed on a big desert plain in Valencia. We designed and modeled the robot based on original sketches by concept artist John Eaves, who worked on the Star Wars films. The finished model was textured and rigged with a hybrid character/vehicle rig, so the animators could come in and use the articulated limbs in combination with the robot’s wheels. With vehicle interaction, machine guns, missiles, jumps, explosions and flaming debris, it made for a pretty huge battle sequence tied into the live-action footage.”
There was another episode that featured a CG airplane landing on a freeway. KITT drives into the plane and it takes off, shearing its wheels on the freeway bridge.
Because of Master Key’s flexibility, there were times when the producers called on them for input on shots where they didn’t know exactly what they wanted. Elan gives an example: There was a character who injects himself with something that transforms him into a Hulk-type guy, with extended veins and eyes. “Normally you do that in make-up or with contacts, but they didn’t know how they wanted him to look so they had us do it.”
In addition to using Maya for part of the show’s 3D effects, Master Key built the CG KITT in Autodesk 3DS Max running on Boxx workstations. “Max has a V-Ray plug-in that is really nice for cars, and it’s fast,” says Elan. For 2D work, they call on After Effects for Mac. They use Nuke on occasion, but they prefer the speed of After Effects. Master Key’s 100-blade renderfarm was built by Bell Technologies — each blade has eight Xeon processors. NetApp is their server of choice.
Because of their tight turnaround, Master Key will buy models — things like cars and planes — often calling on TurboSquid or 3D Export. “It depends on who has the best model for the job,” says Elan. “The best model for the C130 airplane was on TurboSquid.”
Things that are custom or real, they will model themselves or have scanned. For instance, KITT has a form called Attack Mode, so a real car was built in Attack Mode and the show’s production had it scanned. “It’s an actual car they built to do stunts with, and it was borrowed for a day,” explains Elan. “It’s easier to scan real things by laser and to get the exact data, after which we have to go in and do clean-up work.” They also have CAD models of the Ford cars used in the series.
Things that have no basis in reality other than a drawing, like the robot, were created from scratch. “We have the sketches and get the guys modeling and texturing them so we could get approvals,” says Elan.
Another thing that helped Master Key was their constant collaboration with post via data links. “Shot versions would come in and they were linked into our system, so they could post notes and we would see them the same minute,” explains Elan. “So the writers and the editors could work in collaboration, like dailies. Otherwise it would be impossible to get the notes and the changes fast enough.”
Shortly before our interview, Master Key had completed the first season of Knight Rider. The total number of effects they created for the series? 4,125 shots. Insane!

WAREHOUSE 13

Keyframe Digital Productions (www.keyframe.ca) in Niagara, Ontario, is the house of record on the upcoming Sci-Fi series Warehouse 13. The show, scheduled to debut in July, focuses on two Secret Service agents who find themselves in charge of a storage facility housing cursed and magical artifacts collected by the government. The two work with the warehouse’s caretaker, Artie, not only to find new relics to house, but to control the ones already stored.
With the show currently in production in Toronto, the Keyframe team is gearing up for a massive amount of effects — between 50 and 150 per episode, and anything from monitor comps and rig removals to full CGI set extensions with matte paintings — including an entirely-3D warehouse. “There is a gigantic, seamless greenscreen surrounding one of the sets,” says Keyframe VP/co-founder Clint Green, describing a scene that has Artie dangling from the warehouse ceiling as he attempts to repair an electrical failure. “As they are pushing in, we will use 2d3 Boujou to track the shot and replace and add the expanse of a 3D warehouse around the actor. Artie and pieces of the practical set will be taken and comped into the 3D warehouse scene.”
One of the magical objects is a mirror in a frame that agent Pete uses to pass the time by playing a game of ping-pong with himself. To get the timing right, they will have a metronome on stage so the actor (Eddie McClintock as Pete) can hit the ping-pong ball every second or third beat.
“We are going to put a greenscreen in place of the mirror, but leaving the practical frame,” explains Green, who is onset during the production. “We’ll shoot over the actor’s shoulder toward the greenscreen and he’ll be pretending to hit the ball, which we’ll build in CG. Then we will remove the mirror and move the camera to the other side and get him playing — again timed with the metronome — only with different actions. Then we will take that image, reverse it, put it back into the greenscreen and put the CG ball in.”
Another item they will be creating visual effects for is the guillotine that killed Marie Antoinette. “It has a cursed blade,” explains president/visual effects director Darren Cranford. “Pete has to sneak into a museum where the guillotine is being shown to retrieve the blade.”
One thing standing in Pete’s way is a giant laser grid put in place by the museum to protect the item. “We are creating the grid, which we have to track fully in 3D space around the character and the guillotine,” explains Cranford. “Once the blade is released we see a magical effect, like shock waves.” The blade is practical until it is released and becomes 3D.
According to Cranford, another effect they are working on is a ghost effect for a character that time travels, thanks to an infamous compass. When he gets caught in a limbo world “the audience will see this floaty, articulate ghost, but as the show progresses, he will become less transparent.
“When he appears in present day, he’s trapped in a different dimension; he is not completely himself,” he continues, adding that the character draws power, life force, from his sister, who is not a ghost. “She’s dying and he’s getting stronger in order to appear in the present day.”
Cranford says when he appears ghost-like “we’ll do a 3D character with a lot of particles and a lot of work in Combustion to give it that aura and electrical look. As he becomes more realistic, we’ll be shooting the actor on greenscreen and treating his image.”
Keyframe uses Autodesk 3DS Max for modeling and animation and Autodesk Combustion for compositing, all running on custom-built PC workstations. Both Green and Cranford have been using 3DS Max for years. They like that it seems to be written by coders for artists instead of by coders for coders, and they like how open it is to customization. “You can manipulate the source code to make Max do what you think it should be doing,” says Cranford.
Technical director Eric Harvey writes all of Keyframe’s scripts and helps build its pipelines for VFX and animation.
Green points to an automation function written by Harvey and used when Keyframe was working on the animated kids’ series Pinky Dinky Doo. “Say we needed to replace an object — instead of us having to load every shot and find the object, he would write code that would open up the files by itself, find the object, replace the object, hit render, and then open up another shot until the whole episode was done.”
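The unattended loop Green describes can be sketched in a few lines. This is a hypothetical stand-in, not Harvey’s actual script: JSON files play the part of scene files, and `render` is a caller-supplied stub, but the shape of the automation (open, swap, save, render, repeat) is the same.

```python
import json
import pathlib
import tempfile

def batch_replace(scene_files, old_asset, new_asset, render):
    """For each scene file: load it, swap the named asset, save, and render.
    JSON dicts stand in for real scene files in this sketch."""
    for path in scene_files:
        scene = json.loads(path.read_text())
        scene["objects"] = [new_asset if obj == old_asset else obj
                            for obj in scene["objects"]]
        path.write_text(json.dumps(scene))
        render(path)  # hand the shot to the renderer, move on to the next

# Demo with throwaway scene files and a render stub that just records calls.
tmp = pathlib.Path(tempfile.mkdtemp())
for i in range(3):
    (tmp / f"shot{i}.json").write_text(json.dumps({"objects": ["old_prop", "tree"]}))
rendered = []
batch_replace(sorted(tmp.glob("*.json")), "old_prop", "new_prop", rendered.append)
print(len(rendered))  # 3
```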

LIFE

LA- and NY-based Look Effects supplies visual effects for many feature films, and it’s this experience that helps them when creating shots for TV series. “It’s a big advantage for us when working in television that we have that experience with film directors who noodle everything,” says VFX supervisor Max Ivins. “In noodling every detail, you gain the knowledge of how to make things look good, and that benefits us when working with television deadlines and sensibilities.”
The studio has been providing visual effects shots for NBC’s Life since it began last season. While the one-hour drama is story-driven not effects-driven, Look Effects (www.lookfx.com) primarily works on specialty episodes like “The Business of Miracles,” which begins with our detectives walking into a crime scene in which a doctor was murdered while working late in his lab. Because of his long hours, this doctor often used an oxygen tank to keep himself awake. The murderer switched the doctor’s oxygen tank for one filled with liquid nitrogen, which when inhaled, froze his entire body. The big effect happens when detective Charlie Crews touches the doctor with the tip of his pen, and slowly, very slowly, the frozen man starts to crack and break apart.
“This is not something that is possible in real life,” laughs Ivins, “and that is one of the things that was fun about what we were doing. There is a kind of a fantasy-reality thing.” So the challenge was figuring out what someone would look like if they were totally frozen.
The producers had a full-size prosthetic dummy built, based on a cast of a man, and that is what the actors played to on set. So the viewers are seeing the dummy — with some Look Effects color treatment added to make him appear frosty — until the doctor starts to crack, then he becomes a 3D model built by Look.
Because this was going to be a one-off model, Look Effects chose not to scan it and instead took measurements and built a model that was approximately the size of the dummy. “So we are transitioning from a photographic version of him to our model,” says Ivins. “He is actually projected onto our model; it’s the same photography, so you don’t see the transition when it crosses over.”
The hair, clothing and oxygen mask were all on the dummy originally and recreated in 3D via Maya for the cracking and falling apart, which was the tricky part, says Ivins. “We took the model and broke it up — some was procedural, but we modified that. We used dynamic solutions to do the falling. Then we had to write modifications to the dynamics with MEL scripting.” Dynamics is a tricky thing, he says. “Making the pieces repulse or stick to each other. We have some very funny tests where he is sort of exploding instead of falling apart.” Shawn Lipowski was the main animator on the piece.
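As a rough illustration of the “repulse or stick” tuning Ivins mentions, here is a toy dynamics step with heights only and invented constants; real rigid-body setups in Maya are far more involved, but the failure mode is the same: crank the repulsion too high and the fragments explode instead of falling apart.

```python
import numpy as np

def step(pos, vel, dt=0.02, gravity=-9.8, repulse=0.5, min_gap=0.1):
    """One integration step for falling fragment heights (1D for brevity).
    Pairs closer than min_gap get a velocity push apart -- the repulsion
    term that keeps pieces from interpenetrating."""
    vel = vel + gravity * dt                   # gravity acts on every piece
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            gap = pos[i] - pos[j]
            if abs(gap) < min_gap:             # overlapping fragments
                push = repulse if gap >= 0 else -repulse
                vel[i] += push
                vel[j] -= push
    return pos + vel * dt, vel

# Two fragments start almost on top of each other in mid-air; the
# repulsion term separates them within a few steps.
pos, vel = np.array([5.0, 5.02]), np.zeros(2)
for _ in range(5):
    pos, vel = step(pos, vel)
print(pos[1] - pos[0])  # gap has grown past min_gap
```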
Ivins says getting the effect right was a combination of technical solutions with a sensitivity to the animation. “And we came up with a happy medium, but it’s not very natural because it starts from the top and goes down to the bottom. We thought, logically, big parts would crack off and hit the floor and break into littler parts, but for the comedy of the scene, which starts with the close-up and gets wider, the best way was to start at the top and work down.”
He says the battle was making all the dynamics work correctly. “Just getting the textures to work right so you didn’t see the lab coat on the interior pieces and making sure the interior pieces hinted at the inside of a body but didn’t look gory because it’s a humorous moment. We had to balance that all.”
He says the artists benefitted from the fact that it’s a very in-your-face effect and not realistic. “It gave us a little license to play with it and give it some little humor beats, like the last piece that falls in the wide shot. There were a lot of technical hurdles to get over, but I think what sells it is the artistry of how it blends into the scene.”
Look Effects used Apple Shake for the color treatment and compositing. “We used some pretty standard multipass lighting systems, rendering out highlight, specular and ambient occlusion passes,” says Ivins.
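A standard multipass combine of the kind Ivins describes can be sketched with arrays standing in for render passes. The pass names and the simple `diffuse * ao + specular` formula below are illustrative, not Look’s actual comp; the point is that keeping lighting components in separate passes lets the compositor rebalance them without re-rendering.

```python
import numpy as np

# Toy 2x2 "render passes" -- in production these would be EXR layers.
diffuse  = np.full((2, 2, 3), 0.4)
specular = np.full((2, 2, 3), 0.2)
ao       = np.full((2, 2, 1), 0.8)   # ambient occlusion, 0 = fully occluded

def composite(diffuse, specular, ao):
    """Simple additive multipass comp: occlusion attenuates the diffuse
    lighting, then specular highlights are added on top."""
    return diffuse * ao + specular

beauty = composite(diffuse, specular, ao)
print(beauty[0, 0])  # [0.52 0.52 0.52]
```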
Textures are mostly hand painted, using projections of the actual dummy and cleaning those up. The interior used modified photographs. Look Effects modified an ice shader they had written for another project to get a frostier look. “We had to make sure it wasn’t too gory on the inside but still maintained the ice feeling without being too shiny. We didn’t want it to be ice sculptor-y; we wanted it to look more popsicle-ish.”
The 3D department at Look uses a mix of PC and Linux workstations, while their renderfarm is a hybrid of Macs, PCs and Linux machines. Ivins credits 3D supervisor Michael Capton as the genius behind the system that built “the frozen man.”