Artists working on visual effects for television series take pride in both the quality of their work and the efficiencies they’ve achieved to get results in such a short timeframe. Jim Clark of Hive FX says VFX for a television series often have to be produced on one-third the budget and in one-third the time of a feature film. He credits his studio’s pipeline with making that kind of turnaround possible.
Cinematographer/VFX supervisor Sam Nicholson of Stargate Studios agrees, noting that every year his studio’s shot count increases while budgets decrease. It’s the nature of the television business, he notes. Nicholson feels that feature film productions are going to have to look to television workflows for efficiencies in the future to be competitive. The experience Stargate has gained through demanding workloads and intense deadlines is preparing the studio for work on Internet series, which Nicholson sees as having unlimited potential as a new form of business.
The pros we spoke to this month are creating effects for television; some are obvious, others are invisible. Here’s their take on the business and how they are achieving the results many of us see in our living rooms.
Stargate Studios (www.stargatestudios.net), which was founded in 1989 by Sam Nicholson, ASC, has locations in Los Angeles, Toronto and Vancouver, as well as partners in Mumbai and Malta. Its proprietary Visual Operating System enables artists, supervisors and clients to interact with each other from any location in the world as well as operate a 24-hour-a-day production cycle for maximizing efficiency.
Stargate’s credits include visual effects for The Walking Dead, Revenge, The Event, Grey’s Anatomy, 24, ER, Heroes and CSI. Nicholson says the company can be working on more than 20 shows at any given time, and a new program that’s dependent on the studio’s VFX expertise is Touch, the new Fox series starring Kiefer Sutherland.
The show continues a long-time collaborative relationship with creator/writer Tim Kring, who’s used Stargate’s VFX skills on Crossing Jordan and Heroes. Touch looks at the interconnection of people’s lives throughout the world and a young autistic boy who recognizes the effect a person’s action can have on someone they’ve never met in another part of the world.
Mark Spatny is a visual effects supervisor on the show, which is heavily dependent on virtual sets and mattes to create locations such as Moscow, Mumbai, Baghdad and Dubai. The show is in its first season, but already the storyline is calling for challenging locations such as the site of a plane crash and even the International Space Station.
Stargate worked on the pilot episode and is on board for another 13. At press time, the show was getting ready to shoot episode 10. Each episode can incorporate three or four storylines, and the studio can be responsible for anywhere between 50 and 100 shots.
Nicholson says Stargate’s virtual backlot allows the studio to pull off the different locations. Stargate has an extensive library of high-resolution location imagery, and the studio specializes in re-dimensionalizing it, mapping imagery to animated elements to create 2.5D/3D sets.
The studio uses Adobe After Effects for compositing. “It’s the best program for allowing us to do custom programming to tie into our network,” Spatny explains. Maya is primarily used for 3D, though artists do call on NewTek LightWave when appropriate. Digital extras are animated in Massive and landscapes are executed using Planetside’s Terragen.
Stargate produces as many as 10,000 effects each year and Nicholson sees the business following what he describes as a “reverse” Moore’s Law. “Each year we turn out twice as many shots for half the amount of money,” he notes. The feature film business, Nicholson feels, will ultimately have to study the efficiencies that television effects houses have found to work within future budgets. And those working on television effects will continue to refine their workflows to allow them to work on Web series, which Nicholson says, could represent a limitless business opportunity.
Stargate is also working on Beauty and the Beast for television, which will make extensive use of virtual sets, and a series of six Sony Crackle Webisodes featuring high-quality visual effects.
Portland, OR’s Hive FX (www.hive-fx.com) has been providing visual effects services for the NBC series Grimm since the pilot episode. The drama — in its first season — is inspired by the classic Grimm’s Fairy Tales and centers around a Portland homicide detective, played by David Giuntoli, who discovers that he is a descendant of an elite line of criminal profilers. He’s charged with keeping a balance between humanity and the mythological creatures of the world.
According to Hive FX founder Jim Clark, the studio typically has two to three episodes in-house at any given time. Their work includes creating the different creatures that appear throughout the series — some new and some recurring — as well as compositing them into production plates.
“Television is so fast now, and so furious,” says Clark. “The work we’re doing now for TV is what you would have expected two years ago for features. The expectations are so high and we have to maintain that because we could potentially lose the contract.”
Clark is a partner in Hive FX with executive producer Gretchen Miller. The two have worked together for 13 years, initially in Santa Barbara, and ultimately set up shop in Portland four years ago. Hive FX is a 6,000-square-foot facility that contributes to both spots and television series.
The studio uses Maya for modeling, rigging and animation, and ZBrush for facial sculpting. Hive also calls on Maxon’s Cinema 4D V.13 for creating hair effects. Maya and V-Ray are used for lighting and rendering. Compositing is performed using After Effects and Mocha Pro. All compositing and editorial is performed using Macs. The studio has 40 quad-core processors in its renderfarm.
A typical episode of Grimm could involve 40 effects shots and each episode could introduce as many as five new characters. “We mostly do characters,” says Clark of their work on Grimm. “That’s mostly what we are known for, especially characters with hair.”
Clark says when the studio initially began working on the series, they would create a two-dimensional morph to change the human characters into their creature personas — a process that Clark says was very complicated. “Now, the process is so much faster,” he notes. “We take photographs of the actors on set and our 3D sculptor builds the actor in about two days. From that model, they then build all of the creatures. The character can morph from a 3D human to a 3D creature. It allows us to have a lot more control over how that morph happens.”
Rarely do they work with greenscreen footage. Instead they shoot the actors on-set under normal lighting conditions. “While we are there we’ll take little dots and mark key places on their face for tracking and reference, and then the first thing we do is paint out those dots,” he notes. “A lot of the time we’ll paint out the entire head and upper body and replace it entirely with the CG version.”
The studio has approximately three weeks to work on each episode. “The only reason we were able to take Grimm on was that we put a really smooth pipeline in, even on a low budget,” says Clark.
DAMAGES, ROYAL PAINS
The Molecule (www.themolecule.net) in New York has been around for approximately seven years. The studio was founded by a team of freelancers, including VFX supervisor/COO Luke DiTommaso, each of whom brought a different skill to the table. Working out of a 2,600-square-foot loft in Soho, The Molecule also recently opened a Los Angeles office, and has been busy contributing visual effects to a number of television shows.
DiTommaso cites the FX series Rescue Me as The Molecule’s first big break in VFX for TV. The studio created numerous explosions and fire effects for the recently concluded series, which starred Denis Leary as an NYC fireman. Their work on Rescue Me led to work on another FX series — Damages — and at press time the studio was also working on NYC22 and Royal Pains.
“Royal Pains starts shooting early, before the leaves come in, and the trees are still bare,” says DiTommaso. “It’s supposed to be a summer show. We do a lot of background matte paintings, adding trees.” The Molecule also handles monitor burn-ins and medical enhancements, as well as cosmetic fixes and effects that help maintain continuity.
One obvious effects shot required the look of a live x-ray. The Molecule secured some x-ray footage and composited a hand, shot against a blackscreen, to make it look like a surgeon was performing a procedure on a patient.
“They also shoot on a stage, so they’ll shoot in front of a massive curtain, which looks pretty good for the most part,” DiTommaso notes, “but you can see that the water isn’t moving, so we will add water reflections and birds passing, and it brings the scene to life.”
For Damages, which stars Glenn Close as a ruthless lawyer, the studio recently had to create a number of video monitor effects. “The protagonist is a Julian Assange/Wikileaks-type of person,” DiTommaso notes, “so there is a lot of iChat type of stuff. We are creating a lot of graphics and user interfaces. In one scene, they were doing a Skype thing, but they wanted to show that the people’s faces were being blurred out. We created an interface that they used on-set for live playback of this After Effects-style interface.”
For the musical drama Smash, about the Broadway theater business, The Molecule regularly works on 80-90 shots per episode that run the gamut of VFX needs.
When it comes to shooting elements, The Molecule will most often use the same camera as the show’s production — Arri’s Alexa much of the time. For simpler, one-off shots, the team might use a Canon 7D.
The Molecule uses Nuke for compositing and Maya for creating photoreal elements that get married with live action. “We’re usually creating elements that are comped in. Rarely are we doing full on CG shots,” notes DiTommaso. “On the graphics side, we use After Effects and Cinema 4D.”
The Molecule is based around Mac and Linux systems. The studio has a 400-processor renderfarm that it can share with its Los Angeles office, and can also tap into the West Coast studio’s mini-renderfarm if they need additional processing.
Encore (www.encorehollywood.com) provided visual effects and post services for the ABC series The River, which debuted this season. The show centers around a nature explorer/television host — Dr. Emmet Cole — who’s gone missing in the Amazon while on a shoot. Cole’s wife and son, along with several crew members, set out to find him, and the rest of the missing expedition, unsure of what they might find in the unexplored region. The show is shot in Hawaii.
Stephan Fleet is VFX creative director at Encore, and says the studio handled approximately 80 percent of the visual effects that appeared in the pilot. Zoic also contributed. Encore then went on to handle all the VFX for the additional seven episodes that made up the first season. While it’s unclear whether a second season of The River will be produced, Fleet says the storyline allows for future adventures with the existing cast.
One of the show’s recurring visual effects is a stylized blue-green, dragonfly-like insect that leads the characters throughout the entire series.
“There were a lot of challenges on the show, but the number one biggest challenge was that, unlike other shows on television right now that take place in a mystical world, this is a show that is supposed to be 100 percent grounded in reality,” Fleet explains. “Everything is supposed to be 100 percent believable. So we had to make our stuff photoreal and feature quality on a television schedule.”
Also adding to the challenge was the show’s decision to stay away from the typical high-end cameras used in television production. “They were using GoPros, 5Ds, 7Ds, and I think the hero camera was a Sony EX3, which is a prosumer camera,” recalls Fleet. “Sometimes they were shooting 10 to 12 cameras at a time. It’s a very different type of show.”
The camera set-up was a stylistic choice, designed to support the show’s handheld feel. “They had the Alexa available, but since the cameramen are actually characters in the show, if you used the Alexa, you essentially doubled the time it would take to get the coverage because you would have the wrong camera in the cameraperson’s hand,” says Fleet. “So it became quickly apparent that using the Alexa on this show was highly uneconomical.”
Fleet says it’s hard to tell which footage captured by the actors actually made it into the final edit. The on-camera talent wasn’t necessarily expected to capture shots perfectly, but if they did get something good, it stood a chance of being used.
“If you are a VFX supervisor like me, the first thing you say is, ‘OK, we are going to shoot it with the best camera. Give me the highest quality footage and the most dynamic range. Give me everything picture perfect and if we need to mess it up, we’ll mess it up in post,’” says Fleet. “That’s the traditional way of thinking. In this case, we were getting some messed up stuff, so we were going to be creative and come up with some workflows and tools to integrate what we do into their footage.”
Camera movement can help mask the seams of CG, says Fleet, but it also meant having to perform 3D tracking for every effects shot.
“Bluescreen and greenscreen pretty much don’t work with these cameras,” he notes. “Other than some monitors, we didn’t use any on the show. We rotoscoped everything and cut out people. That was a decision from the beginning.”