By contrast, the universal-capture system used to create virtual heads of Keanu Reeves, Laurence Fishburne and Hugo Weaving for The Matrix Reloaded works differently, he says. In that case, "the actor is filmed from several directions under one relatively diffuse lighting condition, where they use crossed polarizers to separate out the shininess of the face. The diffuse lighting yields a diffuse texture map for the face, and the way that the face reflects light is then simulated according to analytical models of surface reflectance and subsurface scattering."
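The polarizer trick Debevec describes exploits a physical property: specular ("shiny") reflection off skin preserves the polarization of the incident light, while diffuse reflection scrambles it. A cross-polarized analyzer therefore blocks the specular component and passes roughly half the diffuse light. A minimal NumPy sketch, with illustrative pixel values and image names that are assumptions rather than the production pipeline:

```python
import numpy as np

# Two captures of the same pixels under polarized illumination:
#   analyzer parallel to the source polarizer -> half diffuse + specular
#   analyzer crossed with the source polarizer -> half diffuse only
parallel = np.array([[0.80, 0.90],
                     [0.30, 0.25]])
cross    = np.array([[0.30, 0.35],
                     [0.30, 0.25]])

# Diffuse is unpolarized, so the crossed analyzer sees half of it;
# doubling recovers an estimate of the full diffuse texture map.
diffuse_map = 2.0 * cross

# The specular component is whatever the crossed analyzer removed.
specular = np.clip(parallel - cross, 0.0, None)
```

Pixels where the two captures match (bottom row above) are purely diffuse; where they differ, the difference is the separated shininess.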
The Light Stage/Sony system works differently, explains Debevec. "To create a rendering of the actor in a real or virtual set, the computer program can simply recombine the images of the face to simulate its appearance under any combination of lighting directions - the light from a real set, or the light envisioned by a lighting designer, or a combination of both.
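The recombination Debevec describes works because light transport is linear: the face under any new lighting environment is just a weighted sum of the basis images captured one light-stage direction at a time. A minimal sketch of that idea, where the array sizes, light indices and weights are illustrative assumptions, not the actual Light Stage data:

```python
import numpy as np

# basis[k] stands in for the photograph of the face lit by light k alone.
num_lights, h, w = 156, 4, 4
rng = np.random.default_rng(0)
basis = rng.random((num_lights, h, w, 3))

def relight(basis, weights):
    """Weighted sum of basis images. The weights can come from sampling
    a real set's lighting, from a designer's virtual rig, or both."""
    w_arr = np.asarray(weights, dtype=float).reshape(-1, 1, 1, 1)
    return (w_arr * basis).sum(axis=0)

# Example: a strong key light plus a dim fill.
weights = np.zeros(num_lights)
weights[10], weights[42] = 1.0, 0.3
image = relight(basis, weights)
```

Because the combination is a simple weighted sum, a lighting designer can dial individual light intensities up and down and see the relit face respond exactly as a real one would.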
"Algorithms implemented by Mark Sagar and his team at Sony Imageworks made it possible to animate the shape and viewpoint of the face based on several expressions of the actors, creating fully animated photoreal faces," he continues. "A particular advance was a novel rendering pipeline created at Sony Imageworks that allowed the digital lighting team to light the reflectance field faces and traditional CG elements using the same sets of traditional lighting controls."
"We get all the textural information from the light stage," adds Stokdyk, "but the facial expressions we get from motion capture." Animators were able, he says, to combine bits and pieces of performances chosen by director Sam Raimi and animation director Anthony LaMolinara, and customize them to get a specific expression that matched the action and could work under any lighting condition.
Whenever possible for reference, adds Dykstra, "we have the actual actor, preferably in the actual lighting that was used, actually performing the scene."
DOC. OCK
In Spider-Man 2, Dr. Otto Octavius is a brilliant scientist trying to harness fusion energy. During a demonstration in which he manipulates raw plasma with mechanical arms, an accident occurs. The robotic appendages fuse to his body, and he transforms into the deranged, multi-tentacled Doc Ock, who becomes obsessed with destroying Spider-Man. Animating those arms became a personal challenge for Dykstra.
He says, "We started out with Doc Ock as a concept" that evolved over more than a year of tight collaboration with animation director Anthony LaMolinara, costume designer James Acheson and production designer Neil Spisak. They were aided by the concept drawings of James Carson, Alex Tavoularis and Paul Catling, with Raimi "orchestrating the entire deal."
Doc Ock shouldn't move like a spider, explains Dykstra, and he shouldn't move like a jointed quadruped. Making him look like an octopus doesn't work either: an octopus looks great under water, but on dry land it loses its strength and power. The trick, he notes, was coming up with a design for the tentacles that gave them personality... a sense of them being an organic extension of Doc Ock's physiognomy, while still maintaining some sense of dangerous mechanics.
Artists gave Doc Ock's tentacles personality and a sense that they were an organic extension of the character's physical make up.
"The determination was made that we would put practical tentacles or puppets in the scene to interact with the actors as much as possible," Dykstra continues. He says Raimi loved having the puppets in the scene because he could be spontaneous with his direction. In full rig, 16 puppeteers were required to move in concert with Molina. Even in scenes where the puppets would ultimately be replaced by CG, they were used for realistic action and lighting reference.
For the CG versions of the tentacles, the crew broke down the practical tentacles built by Edge FX piece by piece and cyber-scanned them. Imageworks then textured the models and set them up for the animators.
"For our CG tentacles," explains Stokdyk, "we needed the ability to stretch and un-spool out of Doc Ock's back, so we set up slinky-like squash and stretch controls. That helped a lot on impact when the tentacles hit a building, where we could put a little shock wave, a little ripple, through it."
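The ripple effect Stokdyk describes can be pictured as a damped pulse traveling down the chain of tentacle segments after an impact. A toy sketch of such a control, with the function name, timing constants and segment model all hypothetical illustrations rather than the Imageworks rig:

```python
import math

def ripple_scales(num_segments, t, impact_seg=0, speed=4.0,
                  amplitude=0.3, damping=2.0):
    """Per-segment squash/stretch scale at time t after an impact.

    The pulse starts at impact_seg and travels outward at `speed`
    segments per unit time; each segment oscillates around its rest
    scale of 1.0 with an exponentially decaying amplitude.
    """
    scales = []
    for i in range(num_segments):
        delay = abs(i - impact_seg) / speed   # when the pulse arrives here
        phase = t - delay
        if phase <= 0.0:
            scales.append(1.0)                # pulse hasn't reached segment i
        else:
            scales.append(1.0 + amplitude * math.exp(-damping * phase)
                          * math.cos(8.0 * phase))
    return scales
```

An animator-facing control like this keeps the shock wave art-directable: speed, amplitude and damping can be keyed per shot instead of running a full physics simulation.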
THE IMPORTANCE OF INFRASTRUCTURE
Visual effects digital content creation would be very difficult to accomplish without an ultra-sophisticated computer infrastructure to support it.
"We are a 24/7 operation," says Alberto Velez, executive director of systems engineering at Sony Pictures Imageworks. "A challenge we face is scaling the technical infrastructure to handle the complexity of digital shots; on this project we were always moving toward more dynamic photorealism. This includes providing additional processing power, network bandwidth, and high-performance, highly available storage."
Two of the biggest changes that Velez has managed over the past few years are the migration from SGI Irix systems to higher performance commodity workstations and render servers and the transition to the Linux OS, both on the front and back end of the facility. In an ideal world, once all the applications Imageworks uses are ported to the open-source operating system they will be "Linux on all fronts."
"At the beginning of Spider-Man 2," Velez recalls, "we were in transition to a Linux-based renderfarm. Some of the creative leads were a bit concerned about the transition, but once they realized its benefits they welcomed the significant performance gains."
Another key technology deployed at Imageworks for Spider-Man 2 production was an improved color-correct viewing process using in-house proprietary tools and an NEC DLP projector. This configuration let production leads review digital dailies in the correct color space, resulting in quicker feedback and turnaround on shots for their artists.
Imageworks' technical infrastructure scales up to bring as many workstations and servers online as needed during a project, sometimes numbering in the hundreds during crunch time. "In the last three months on Spider-Man 2 we brought on a lot more machines, particularly for the renderfarm. Since the first Spider-Man movie," continues Velez, "the renderfarm evolved from a thousand 1GHz processors to roughly twice that many with 3GHz processors."
Imageworks uses a mixed array of IBM, Hewlett-Packard and Dell workstations and servers in single- and dual-processor 3GHz configurations. IBM and HP workstations are used interactively and include Nvidia Quadro FX graphics cards and an average of 2GB of memory. Dual-processor IBM and Dell servers comprise the renderfarm. All systems are networked via a Gigabit Ethernet backbone LAN.
Visual effects productions store all their files on Network Appliance network-attached storage (NAS) filers with a current capacity of more than 50 terabytes to handle all in-house projects. The assets of Spider-Man 2 alone took up over 10 terabytes. Imageworks has developed its own Oracle-based asset management system.
Not resting on their laurels, Velez says his department is investigating the eventual migration from 32-bit to 64-bit systems. "Workstations and servers in a 64-bit architecture," he says, "offer inherently more precision, higher memory addressability and enable rendering of large, complex datasets at an increased level of detail. They also give us the ability to simulate physical phenomena much more realistically."
ADDITIONAL SHOTS
Santa Monica- and San Francisco-based Radium worked closely with Sony Imageworks on 52 effects shots for the film. The Radium team, led by VFX supervisors Scott Rader and Jonathan Keeton, as well as VFX producer Tom Ford, performed complicated wire and rig removal, CG and compositing work for various shots and sequences. The wire and rig removal involved re-creating buildings and removing the cranes and apparatus that enabled the characters to fly, as well as CG geometry and projection mapping of high-res images. They used Discreet Inferno, Combustion, After Effects, Maya and Boujou.
"What was most challenging from an artistic standpoint was that each sequence required different looks," explains Rader. "Each round of shots entailed a fresh start, and we completed a wide array of effects work. We approached each shot differently, even though there were eight or 10 shots per sequence."
"Another interesting aspect to our work was that there was a high level of protection with the film," adds Ford. "We were provided with single frame references, but were never allowed to see the context of our shots. It presented an exciting challenge for us to create effects for certain shots and sequences without seeing how our shots fit together in the final film."