VFX for Films
Issue: August 1, 2013

If ever a summer called for cooling off in a multiplex, it’s this one. VFX studios around the world have enhanced sci-fi and superheroes, the paranormal and the undead, helping pack cinemas with fans eager to see their latest handiwork.

ELYSIUM

After earning an Academy Award nomination for his work on Neill Blomkamp’s 2009 feature District 9, Peter Muyzers, a partner at Image Engine, Vancouver (www.image-engine.com), has reteamed with the director on the new sci-fi film Elysium.

“We started talking about Elysium while we were working on District 9, so it’s been about four years for me,” says Muyzers, who served as VFX supervisor on the new movie. “With films, you’re sometimes chasing a vision that’s not very clear, but Neill knows very much what he wants and can easily articulate it. For Elysium it ultimately boiled down to ‘is the audience going to believe this, or are they going to think of what they see as visual effects?’ Neill didn’t want Elysium to be Star Wars. He wanted all the VFX rooted in reality. In all our briefings Neill provided references based on real-world examples. If he didn’t believe the VFX could exist in the real world, they didn’t work.”



Image Engine formed an in-house art department to handle a lot of concept work on the Elysium ring, the idyllic space station-like environment inhabited by the elite as Earth becomes overpopulated and disease- and crime-ridden.    

Elysium is “an extreme version of Malibu,” Muyzers chuckles. “In fact, we took helicopter [reference] footage of Malibu to get the feel of flying over Elysium.” Image Engine got considerable creative input from legendary sci-fi concept artist Syd Mead (Blade Runner, Tron). “It was an inspiration for all our artists to know that Syd was working on the line drawings,” says Muyzers.

The complex build of the fully CG Elysium ring was split between Image Engine, the film’s lead VFX vendor, and San Rafael, CA’s Whiskytree, which became “almost an extension of our environment team,” Muyzers says. Image Engine was tasked with all of the structural components, exterior surfaces, the hub and glass, while Whiskytree handled the terrain, vegetation, trees, mansions and bodies of water.

“For the sheer scale of things — the panels, beams, support structure — we looked at today’s engineering feats for inspiration,” says Muyzers. “We asked questions like, ‘if a span was this long, how many cables would it really need to hold it up?’ We wanted everything to feel real world-like, not flimsy. Audiences have to believe that Elysium could exist out there.”
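
As a toy illustration of that kind of engineering sanity check — with entirely invented numbers, not figures from the production — the arithmetic is straightforward:

```python
import math

# Toy sanity check in the spirit of "how many cables would a span this
# long really need?" All numbers are invented for illustration; nothing
# here comes from the actual Elysium asset.

def cables_needed(span_load_tonnes, cable_capacity_tonnes, safety_factor=2.5):
    """Cables required to carry a span's load with a safety margin."""
    required_capacity = span_load_tonnes * safety_factor
    return math.ceil(required_capacity / cable_capacity_tonnes)

# e.g., a 12,000-tonne span on cables rated at 400 tonnes each
print(cables_needed(12_000, 400))  # -> 75
```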

The daunting complexities of the ring tested every piece of hardware and software in the shop, he notes. “We made healthy assumptions about how complex it would be, but Neill kept adding more and more to get the level of realism he wanted. That meant more storage, more renderers — [Solid Angle’s] Arnold and 3Delight — and working with Whiskytree to gain more bandwidth and people.”

In addition to a data set heavy with polygons and textures, an added challenge was working at 4K resolution. “They shot on Red Epics with anamorphic lenses, so they captured 3.3K resolution that was bumped up to 4K,” Muyzers reports. “We built a 4K review theater, the first in Vancouver, so we could assess all the detail viewers would see when the film was released.



“4K is here to stay, and we’re setting new standards with these features. We worked closely with the Academy [of Motion Picture Arts and Sciences] and their ACES color encoding system to refine what it did on the show. It’s still early days for ACES, but it’s an absolutely key component to 4K — a powerful way of managing color workflow for the future.”
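
The article doesn’t detail the studio’s implementation, but a minimal sketch of an ACES-managed transform, using the open-source OpenColorIO library that commonly carries ACES configs, might look like this (the config path and colorspace names depend on the config in use and are assumptions, not details of the Elysium show):

```python
# Minimal sketch of an ACES-managed color transform with OpenColorIO
# (OCIO 1.x-style API). Config path and colorspace names are assumptions.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("aces_config.ocio")   # hypothetical path
processor = config.getProcessor("ACES - ACES2065-1", "Output - sRGB")

pixel = [0.18, 0.18, 0.18]                 # scene-linear mid-gray
print(processor.applyRGB(pixel))           # display-referred RGB
```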

Besides building out the Elysium ring, Image Engine added droids and flying vehicles to numerous plates. Actors who performed on-set in gray suits were painted out and replaced with digital droids, with care taken to ensure “that the performance of the actor was clearly visible in the performance of the droid,” Muyzers explains. “There was a motion-capture shoot for a few crowd shots, and our animators added more detailed animation to the droids, like pop-up gadgets that would show highly advanced robotics. These included weapons, sensors, lights and even a collapsible police baton.”

Helicopters were photographed as stand-ins for space vehicles. By painting out the helicopters and inserting space ships of a completely different design, Image Engine was able to preserve the vehicles’ interaction with the environment — landing and kicking up dust and dirt, for example. “The integration made the vehicles real and tangible,” says Muyzers. “That was absolutely key to getting performances that worked for the space ships.”

There was room for miniatures in Elysium, too. “I don’t believe miniatures are dead,” Muyzers says. A space ship crash was done with one-sixth scale miniatures at Kerner Optical, now 32Ten Studios in San Rafael. “Neill said let’s do this as miniatures — let’s not even try CG,” he recalls. “It would have been hard for a digital artist — the space ship launch, trees breaking and snapping, dirt everywhere. If you got it wrong it would feel fake. So they shot miniatures at Kerner with five Red cameras rolling simultaneously at high speed.”

Muyzers says that the variety of work Image Engine handled for District 9 seemed “hard at the time but looks simple now” compared to the diversity and complexity of work for Elysium. “The droids, flying vehicles, massive environments and the complexity of the ring itself — we’re so proud of our team. There’s not one shot that I think could look better.”

THE WOLVERINE

Wellington, New Zealand’s famed Weta Digital (www.wetafx.co.nz) returned to the Marvel Comics universe with The Wolverine, the latest in the hit X-Men franchise, which follows Logan/The Wolverine as he tangles with the Yakuza. The company’s work on the film was diverse, encompassing characters and environments, which gave Weta’s artists “more of a sense of the whole film,” says Weta VFX supervisor Martin Hill.

The main CG character in the movie is the Silver Samurai, The Wolverine’s nine-foot-tall adversary, who sports not only traditional samurai armor and weapons but also a mechanized battle suit. “They built a full practical armature of him on-set, and we needed to bring it to life,” Hill says. “It couldn’t walk or move; it was effectively a placeholder, or was used when seen from behind or at a distance.”



Recreating the design in CG was a challenge because “when the model was created, it wasn’t necessarily designed with all the range of motion and articulation needed,” he reminds us. “We made changes so it could walk, move its arms and swing its sword around without looking conceptually different from the original.”

Although director James Mangold wanted “a dynamic and characterful performance” from the Silver Samurai, the villain’s suit was “mirror-like and closed” and allowed “no facial expressions,” says Hill. “So all the emotion was put into the animation — head nods, dazed or confused body movements. It would look quite robotic if those nuances weren’t there.”

Motion capture footage was used as reference for the Silver Samurai’s martial arts moves and swordplay, as was video footage of visiting kendo experts. “Jim wanted the Silver Samurai to move with fluidity and confidence although he was very tall, broad and bulky,” says Hill. “The reference footage gave us the nuances to apply to augment the animated performance.”

Weta developed new render techniques for the Silver Samurai’s metal body, without which the character could have looked “too chromy, unreal and CG,” Hill notes. “Our in-house shaders created his surface as damage built up from the fight. We used arc welding footage as reference and discolored the surface to show hits.”  

Animators also looked at footage of traditional Japanese sword forging techniques to craft the Silver Samurai’s weapon, which cuts through bone like butter. “Beating and folding the metal, the striated metal effects — we used them as inspiration to create a hot weapon that could slice through The Wolverine’s skeleton,” says Hill.

The company also extended the sequence’s two-story, high-tech lab set to appear as if it were 20 to 30 stories tall.

In another complex character animation, Weta was charged with helping the wounded mutant Viper shed her skin. As the actress appeared to slice her face open with a fingernail, Weta match moved the performance and added digital elastic skin that pulled at the new skin beneath it. “It took a lot of 2.5D effects and 3D with lighting effects to deform her body,” says Hill.

Weta aged and de-aged Yakuza boss Yoshida in a battle with The Wolverine using highly detailed scans of the younger and older actors portraying Yoshida and morphing between them. The company added, extended or ripped out digital claws for Logan and crafted digital doubles to extend stunt doubles’ wire work.

For a night sequence featuring a dying animatronic bear, the director wanted more emotion. “So we needed to match move the animatronic bear with a 2.5D technique, where we animated on top of the performance to give more expression to the eyes, brows and tongue,” Hill explains.  “Once we made a new performance we warped it onto the animatronic bear, and with skilled comp work it looked seamless. It was easier working with the animatronic plates than creating a full CG bear to match.”

For the exciting bullet-train fight, which was shot on a soundstage in Sydney, Australia, Weta extended a short segment of the train set and built Tokyo in the background.



“Creating a fully digital city is a huge undertaking, and not one I wanted to get into if there was a better way,” recalls Hill. Tracking a real bullet train to capture authentic backgrounds was impossible, so a van with a rig sporting eight Red Epic cameras in a panoramic configuration drove down an elevated freeway in Tokyo, recording “almost a Google street view” of the train’s backdrop. “We could speed it up and take out the road and cars,” Hill explains. “It was never going to be quite the right perspective, so we did some distortion and rebuilding — but the footage still gave us a solid template. It all went much faster than building a full CG city.” Weta also added signage and advertising to bring the colors of Tokyo to the scene.

Hill notes that for a company that “tends to build worlds entirely digitally,” Weta’s extensive use of 2.5D techniques for The Wolverine was something new. “Augmenting plates is a very powerful technique,” he says. “You’re automatically working with reality that the director has filmed, the DP lit and everyone’s happy with, so there’s a lot of value in that. You have to be clear about its limitations, but you can finesse what you’ve already captured to a big extent.”

THE CONJURING

Real life meets the supernatural in The Conjuring, the true story of paranormal investigators Ed and Lorraine Warren, the couple involved in ridding Amityville of its horror, and the Perron family who confronted witchy doings in their new home in 1971. Visual effects by Pixel Magic (www.pixelmagicfx.com), which has offices in Toluca Lake, CA, and Lafayette, LA, help tell the tale in ways both undetectable and spooky.

Pixel Magic created 150 VFX shots, keeping in mind the brief from Warner Bros. and director James Wan. “Things couldn’t look so supernatural that people wouldn’t believe they could have happened for real,” says Raymond McIntyre, Jr., partner/VP at Pixel Magic, who served as the film’s on-set VFX supervisor. “There couldn’t be any Harry Potter in it — no magic. When we couldn’t get the results we wanted practically, we used CG. But we tried to keep everything grounded in reality as much as possible.”



The Perrons’ story was grounded in reality: the house they moved into had a history of possessed witch mothers and murdered children stretching back a century.

Signs of trouble appear almost from the outset. The Perrons’ youngest daughter, who will soon be in peril from her own mother, finds an antique music box, which serves as a portal to the past. It reveals, in composite shots by Pixel Magic, reflections of long-dead child victims in its spinning mirror. Digital breath shows how cold the house has become. UV lights reveal ghostly digital handprints and footprints on the walls and floors. Flocks of digital birds menace the house at night.

A complex shot that first reveals the witch begins with Lorraine Warren gathering laundry from a clothesline. A sheet flaps on the line and flies off, drapes itself into a human form, falls off to show nothing’s there, wafts up to cover a second-story window, then falls away once again to reveal the witch in the bedroom window where Mrs. Perron is asleep.

McIntyre says the sheet was initially intended to be a practical effect. “But given the time we had, we decided to do it digitally,” he reports. Still, having witnessed the attempts to rig the practical sheet proved an advantage. “I saw how the real sheet behaved in space, with the light, as it was flying. That really helped create the CG portions of the shot.”

Mrs. Perron develops subtle, then more severe, bruising on her body as the witch quite literally grabs her and takes possession during the film. Pixel Magic used a 2.5D technique to project Photoshop files of makeup-effects bruises onto her arms, where tracking markers had been placed, and created depressions in the skin showing the grip of the witch’s unseen hand.
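
As a toy analogue of that 2.5D projection idea — not Pixel Magic’s actual setup — a painted element can be warped onto four tracked marker positions with a simple perspective transform, sketched here with OpenCV (filenames and coordinates are invented):

```python
# Toy analogue of projecting a painted bruise onto tracked skin: warp the
# texture with a homography fitted to four tracked markers, then blend it
# over the plate. Illustrative only -- not Pixel Magic's pipeline.
import cv2
import numpy as np

plate = cv2.imread("plate_frame_0101.png")    # hypothetical filenames
bruise = cv2.imread("bruise_paint.png")       # Photoshop-style element

h, w = bruise.shape[:2]
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
# Per-frame tracked marker positions on the arm (invented values)
dst = np.float32([[412, 330], [508, 322], [516, 410], [420, 420]])

H = cv2.getPerspectiveTransform(src, dst)
warped = cv2.warpPerspective(bruise, H, (plate.shape[1], plate.shape[0]))

# Simple weighted blend wherever the warped element has content
mask = (warped.sum(axis=2) > 0)[..., None]
comp = np.where(mask, cv2.addWeighted(plate, 0.6, warped, 0.4, 0), plate)
cv2.imwrite("comp_frame_0101.png", comp)
```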

Stunt work and digital effects combined in the terrifying attack on daughter Nancy, who’s lifted up by her hair and flung and dragged across the room. Individual strands of Nancy’s hair start to lift in the air, then more and more hair rises until she’s lifted off the floor and flung at her parents, breaking glass French doors. Then Nancy is pulled across the floor by her hair, spinning in all directions.

“The big chunks of hair we could do practically with clips and wires, but to get the individual strands of hair to perform the way we wanted them to in an extreme close-up of Nancy we had to use CG,” says McIntyre. A stuntwoman was hoisted by practical wire, and her hair was created in CG and stretched to lift and pull her around the floor. The wires were digitally removed in post and replaced with more flooring.

To release Nancy from her torment, Lorraine Warren grabs a pair of scissors, cuts a hank of digital hair and watches it fall to the floor. But when it hits the floor the shot transitions to practical hair shot by McIntyre in post. “I got a wig, cut it and dropped it to capture the real physics of hair hitting the floor,” he says. “It would have meant a ton of render time and simulations otherwise.”



During take one of the shot, Lili Taylor, who plays Mrs. Perron, wasn’t quite prepared for the breaking glass and had a genuinely frightened reaction to it. The director was delighted with her response but preferred the action that played out in a subsequent take. So Pixel Magic was charged with stitching the best of two takes together with painstaking rotoscoping and background replacement.

Another seamless VFX sequence from Pixel Magic opens the movie, combining plates of the location house and the stage interior into what appears to be a single Steadicam shot. It shows the Perrons driving up to the house on location, moving through the front door, traveling through the inside of the house on-stage, then moving out the back door and into the location house’s yard.

Pixel Magic removed some of the scary from a shot where the possessed Mrs. Perron changes her physical features via prosthetics — contact lenses, veins, lips, skin treatments — then returns to normal. Artists used 360-degree photos of Lili Taylor’s four stages of makeup, along with a digital head they’d made, as guides to help them paint her back to normal.

Pixel Magic also created a set extension in post for the rocking chair sequence to deliver a camera angle not captured on the set. “I photograph everything and take a lot of HDRI photos for lighting reference,” says McIntyre. “We couldn’t have created the digital matte painting needed to complete that scene without the photos we had — both my own and those from the on-set still photographer. We were generating a camera angle that didn’t exist.”

The company’s toolset included The Foundry’s Nuke and Adobe After Effects for compositing; Imagineer Systems’ Mocha for planar tracking; Andersson Technologies’ SynthEyes for 3D tracking; NewTek’s LightWave, Autodesk’s 3DS Max and Maya for animation; and LightWave, Cebas FinalRender and Mental Ray for rendering.

PACIFIC RIM

The ever-busy Industrial Light & Magic (www.ilm.com) created more than 1,500 VFX shots for Pacific Rim, a sci-fi film that honors kaiju and mecha genres while standing on its own as a unique film that director Guillermo del Toro has described as “operatic.”

Set in the near future, international soldiers pilot giant mecha called Jaegers in a battle against monster kaiju invaders who have arrived on Earth via a Pacific Ocean portal. ILM’s San Francisco headquarters handled the majority of the VFX shots, with its facilities in Singapore and Vancouver contributing to the fight sequences. Ghost VFX in Copenhagen, Rodeo and Hybride in Montreal, and Base FX in China also crafted shots; Virtuos was tasked with asset building.



“We did a variety of work — CG characters, matte paintings, set extensions and pure effects work,” says ILM VFX supervisor Lindy De Quattro. Although del Toro’s art department provided a starting point for the mecha and kaiju creatures, ILM’s aptly named art director, Alex Jaeger, worked with model supervisors Paul Giacoppo and Dave Fogler, as well as De Quattro and her fellow VFX supervisor John Knoll to refine the final look of the characters.

“We pulled reference footage of animals, from gorillas to crocodiles, to find weird eyes and other component parts for the kaiju,” which represent a wide range of species, she says. Del Toro also provided cultural references — from fine art, films and graphic novels — for inspiration.

The kaiju are “definitely unlike anything we’ve done before,” De Quattro says. “Each one is unique, and they all look, move and fight differently.” ILM animation director Hal Hickel spent a lot of time “developing specific movements for each character so they’d be threatening and ominous” and not merely huge and silly. The crab-like Onibaba had myriad small inner claws that had to be functional in a fight, for example. 

On the other side of the fight card, del Toro didn’t want the mecha Jaegers to remind viewers of Transformers, she notes. “We decided not to use any motion capture; we didn’t want them to look or move in a human way. There had to be machines behind the action.” So animators used vehicles as references, including a lot of US and Soviet WWII tanks, and Alex Jaeger determined where the mechanics of a piston or ball joint would provide the movement required.

Like the kaiju, each of the Jaegers has a different personality. “They represent the countries on the Pacific Rim that have an emotional stake in the battle,” De Quattro explains. “The American Jaeger is a bit of a cowboy with a wide-legged stance and a swagger in its walk. The Russian one looks like Cold War technology; the Chinese one is more agile and adept at martial arts.”

Battle sequences take place in Hong Kong and at the bottom of the sea. Although ILM started with real Hong Kong location footage, the digital settings had to be amped up in scale to accommodate the giant warriors. “They were so huge that they couldn’t walk down the biggest street in Hong Kong without knocking down buildings,” De Quattro notes. “So we split streets to widen them, if nothing else.”

Fluid sims were required for the ocean surface. With the Pirates of the Caribbean series and Battleship to its credit, ILM was “confident that we were in a good place with water,” she quips. But fluid sims for vast expanses of water are still time-consuming and expensive to do. Water was “art directed” to create giant waves “that were physically correct but would get out of your field of vision” and not block the action that followed. Del Toro considers the film’s digital water its most exciting visual effect and has called ILM’s water dynamics “technically beautiful but also artistically incredibly expressive.”

Digital rain played a big part in shots as well. “Everything was wet all the time,” De Quattro says. “A lot of fights take place in the rain and a lot of those feature slo-mo sprays of rain flying off the surfaces of the Jaegers in a Raging Bull kind of slo-mo sweat moment. It really humanizes them.” 

A pipeline for digital rain was developed to layer precipitation in shots, adding atmosphere and color, including the “beautiful super-saturated washes of rain” that del Toro wanted to show off the neon lights of Hong Kong.

ILM relied less than usual on its in-house software for Pacific Rim. Instead it tapped Side Effects’ Houdini for its rigid-body sims pipeline, The Foundry’s Mari for texture painting, Nuke for compositing, Katana for lighting, and Arnold, Chaos Group’s V-Ray and Pixar’s RenderMan for rendering. ILM retooled its pipeline to accommodate the additional data per layer that the show’s deep compositing demanded.
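
Deep compositing stores multiple depth samples per pixel, so elements merge correctly in depth without hand-drawn holdout mattes — hence the extra data per layer. A minimal Nuke Python sketch of a deep graph, with placeholder paths and no claim to ILM’s pipeline code, looks like this:

```python
# Minimal sketch of a deep-compositing graph in Nuke's Python API.
# Paths are placeholders; this is not ILM's pipeline code.
import nuke

jaeger = nuke.nodes.DeepRead(file="jaeger.####.exr")
rain = nuke.nodes.DeepRead(file="rain_layer.####.exr")

# DeepMerge interleaves the per-pixel depth samples from both streams
merged = nuke.nodes.DeepMerge(inputs=[jaeger, rain])

# Flatten the deep samples into a regular 2D image for the final comp
flat = nuke.nodes.DeepToImage(inputs=[merged])
nuke.nodes.Write(inputs=[flat], file="comp.####.exr")
```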

De Quattro says it was fun working with del Toro, who proved to be an “inspirational” force. “He’s such a fan of filmmaking, of the genre, of ours! His enthusiasm was very contagious — the whole crew caught it.”

During the production of Pacific Rim, SIM Digital/Bling Digital in Hollywood (www.simdigital.com) handled the front-end workflow at the Pinewood lot in Toronto, providing camera support, DIT equipment and the Avid offline set-up, and building out a custom data lab.



“One of our permanent data labs was nearby, but they wanted their own data lab within the production offices, so we built a system tying together the data lab and dailies room for the offline and VFX editors,” explains Chris Parker, chief technical officer for SIM Digital. “We also tapped into Pinewood’s internal network so, from the stage floor, production systems supervisor Ben Gervais could set the looks for the dailies with DP Guillermo Navarro and transfer them through Pinewood’s network to the data lab.”

Early on it was determined that a purpose-built system would best serve the film. It needed to handle data management and all dailies processing, which encompassed setting the looks, managing the color files, processing the files to Avid-friendly specs, transferring them to Warner Bros.’ internal screener system and doing VFX pulls.

Since Pacific Rim was largely a stage-based show, the director wanted to get footage to the editors quickly. “At the end of the shoot day, del Toro wanted the offline editors to start working on cuts with that day’s footage. So we tied the data lab to offline and VFX editorial to tighten the turnaround,” says Parker.
 
SIM Digital supplied several Blackmagic Design DaVinci Resolve dailies workstations, several Avid workstations and separate ISIS shared storage, all tied together for dailies and data, to promote ease of use and enhance security. Ben Gervais wrote custom scripts for the VFX pulls that efficiently sourced shots in the master file and transcoded them to ILM’s specs.
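
Those scripts aren’t public, but a rough sketch of what a VFX-pull step can look like — shot list in, per-shot media transcoded to a delivery spec out — might use ffmpeg like this (paths, naming and codec settings are invented, not SIM Digital’s):

```python
# Rough sketch of a VFX-pull step: read a shot list, trim each shot's
# frame range from its source and transcode to a delivery spec.
# Everything here is invented; these are not SIM Digital's scripts.
import csv
import subprocess

DELIVERY_ARGS = ["-c:v", "prores_ks", "-profile:v", "3"]   # assumed spec

with open("vfx_pulls.csv") as f:        # columns: shot, source, in, out
    for shot, source, start, end in csv.reader(f):
        subprocess.run(
            ["ffmpeg", "-i", source,
             "-vf", f"trim=start_frame={start}:end_frame={end},"
                    "setpts=PTS-STARTPTS",
             *DELIVERY_ARGS, f"pulls/{shot}.mov"],
            check=True,
        )
```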

When shooting wrapped, del Toro remained in the Toronto production offices, working on his director’s cut. But when it came time for him to move back to LA for a screening, the question of how fast SIM Digital could move the Avids came into play. The solution was to build a mirrored system for del Toro in LA.  

“He was cutting in Toronto, took a plane to LA and picked up where he left off,” says Parker. “Then editorial shifted to the LA post production offices, which we continued to build up after they decided to post-convert the film to 3D — we beefed up offline by adding 3D Avid workstations and screening monitors.”

Parker notes that it was important to come to Pacific Rim with “no preconceived plan. We wanted to hear what the show was all about: locations, stage, cameras, editorial needs, ILM’s needs. With a blank slate and a lot of discussions we were able to build out a system specifically for their needs. Ben Gervais and dailies producer Jesse Korosi were instrumental in the design and implementation of this purpose-built system.”



The system helped del Toro “get into offline faster than he ever had before,” says Parker, and it enabled DP Navarro to “maintain a comfort level and apply his immense talents to his first major digital project. By all accounts it was the best [front end] experience they’ve had; it allowed them to shift from production into digital post seamlessly.”

WORLD WAR Z

Hands down the go-to company to record massive zombie swarms is Audiomotion Studios (www.audiomotion.com), the Oxford, England-based motion capture provider, which gained its undead expertise working on World War Z.

Audiomotion deployed 160 of its Vicon cameras on location at Shepperton Studios, creating one of the world’s largest-ever motion capture stages. The 50-by-85-foot volume was used to capture hundreds of stunt moves later transformed by The Moving Picture Company and Cinesite into thousands of frenzied digital zombies.



“We had already executed quite a few shots for the show when the animation consultant realized they were going to need a huge capture volume, so we started looking for a stage big enough to accommodate our needs,” says Mick Morris, managing director/co-founder of Audiomotion.

They set up on an unheated stage at Shepperton during the depths of winter, hanging all of the Vicon cameras from trusses; preproduction planning helped configure the cameras quickly since time on the stage was limited.

“We’ve been doing motion capture for 15 years and had never been tasked with a capture volume of this size,” Morris says. “We had never put together a rig of 160 cameras before — we were pioneering and breaking new ground. We captured hundreds of moves that populated a lot of set pieces,” including zombie swarms in Philadelphia and Jerusalem. 
 
In addition to the scale of the motion capture volume and the freezing on-set conditions, Audiomotion also had to deal with myriad stunt performers who obscured markers. “Even with 160 cameras it was hard to see every marker because of the sheer amount of occlusion,” Morris explains. “The Vicon software does a good job of filling in the gap when a marker disappears along its trajectory and reappears later. But we still needed a huge amount of manual intervention in post to make sure the clips delivered to MPC and Cinesite were as high quality as possible. This wasn’t just a case of pushing buttons and capturing motion.”
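
The article doesn’t describe Vicon’s solver, but the general idea of trajectory-based gap filling can be sketched with a spline fitted through the frames where a marker was visible (invented data, generic approach):

```python
# Generic illustration of trajectory-based gap filling for an occluded
# marker: fit a spline through the frames where the marker was seen and
# evaluate it across the gap. A sketch, not Vicon's solver.
import numpy as np
from scipy.interpolate import CubicSpline

frames = np.arange(100)
x = np.sin(frames / 15.0) * 200 + 400     # invented marker x-trajectory
visible = np.ones(100, dtype=bool)
visible[40:55] = False                    # marker occluded for 15 frames

spline = CubicSpline(frames[visible], x[visible])
x_filled = np.where(visible, x, spline(frames))
print(x_filled[40:55])                    # reconstructed positions
```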

Indeed, Audiomotion’s expertise was evident on two fronts: having “the technical chops to set up this large a capture area,” says Morris, and fielding a team capable of solving any problems with the zombie character rigs to create clips ready for the animators at MPC and Cinesite to transform into swarms of the undead.



On-stage stunt performers clambered up netting to simulate scaling the walls around Jerusalem and smashed into and crawled over stand-in cars, buses and trucks. “These were not lumbering zombies,” Morris reminds us. “We worked closely from the early days with the choreographer, movement directors and animation consultants, exploring different ways for the zombies to move and performing movement studies and tests to develop their distinct motion.”

Audiomotion didn’t have to write any new code for the project, and the Vicon hardware, “the Rolls-Royce of motion capture cameras,” never let the team down, Morris reports. “You can take them into almost any environment and know they’ll get the job done.”

The resulting digital zombies blended in “amazingly well” with zombie stunt performers and extras shot in camera. “Our motion capture was meant to be seen up close and personal, and the CG characters are almost indistinguishable from those captured in camera,” Morris says.