HOLLYWOOD — After years in the making and hundreds of millions of dollars spent, James Cameron’s dream of taking filmmaking to the next level has become a reality. The individual components of Avatar — adventure/love story, visual effects, motion capture technology and 3D — have all existed before, but arguably, this is the first time these elements have been so deftly integrated. And never has such an epic ambition been so singularly controlled by one man’s individual vision.
Cameron just may be responsible for a new era in modern digital cinema: finding a way to make non-human characters more human with advances in every aspect of digital filmmaking.
When I asked Cameron why it took hundreds of millions of dollars of talent and technology to make a love story, he laughed. “Well it’s an adventure. It’s a science-fiction film, a fantasy film. It takes place on another planet. The characters are non-human, so it couldn’t be done with actors and make-up.”
Cameron wasn’t interested in “redefining gluing rubber onto actors’ faces.” He was interested in finding a way for non-human characters to express humanness. His vision was to see non-human, CG characters believably reveal human emotions on screen.
Cameron says creating the world of Avatar and its people started even before there was a story. “We were going down the road of performance capture and computer generated character creation, and that was one of the stated goals of the film. In fact that was one of the stated goals before the story. [Even when] I was the CEO of Digital Domain I was looking for a way to push my team at the time...to go beyond what was possible then in terms of computer generated creature and character creation.”
Years were spent developing a high-resolution performance capture system that could faithfully reproduce the fidelity and subtlety of an actor’s live performance at the level that Cameron was looking for. “I wasn’t interested in being an animator,” he explains. “I wasn’t interested in making an animated movie. Animation is a tool of these films. I wanted the character to be created by the actor, owned by the actor. I wanted it to be transparent from my perspective as a director that the performance that I got with the actor that day is ultimately what the character would do.”
EDITING’S NEW ROLE
“Jim was originally going to edit the whole thing himself,” recalls editor John Refoua (Balls of Fury), “but after a little while he realized that with all the work and directing it was going to be too much. He called me to come in for six weeks and that was two and a half years ago.
“This is such a different movie,” continues Refoua. “It’s not like a normal picture where the editor does this, the director does that.” He notes that the way things are put together, the production and post production departments “aren’t separate, so the director is in the middle of everything. Editorial,” he says, “is very integrated into the production side because of the motion capture.”
When editor Stephen Rivkin (Pirates of the Caribbean trilogy) came on board, one of the first things Cameron did was show him the new technology he was using on Avatar. “My mouth just dropped open,” he says. “I looked at this knowing that if they wanted me I was hooked. This was something I had never seen before: cutting-edge virtual production. On Pirates, we had actors in gray suits with markers all over them and they were on a live-action set interacting with other live actors. Avatar is a whole other level of performance capture.
“This is taking actors, putting them in capture suits on a blank stage and capturing their performances, editing their performances, and building them together,” he continues. “Sometimes stitching pieces of performances together in the same take, so that Jim can ultimately put a camera on them and photograph them from any angle and play back the identical performance.”
Think about the possibilities, he says. Normally you would face the classic editing dilemma: “Well, the actor was really good in this angle, but it’s out of focus, or I wish I had that performance in the close-up. This is no longer an issue because you can play back that performance and create any shot you want, shoot as many takes as you need, without an actor getting tired, and always delivering the best they had in the entire process of capture.”
Cameron recalls, “The hardest thing for us to get our minds around was the fact that we were editing before we had shots. So in a way it was all infinite possibility; you had this great performance, but you don’t have the shots yet. So deciding which part of the performance might be the right moment for that close-up is a different thing than knowing you’ve got a close-up that you like. So as an editor, it was a whole different problem. Normally you go through, you make your selects and find that great magic moment. You eliminate the stuff that’s out of focus, you eliminate the earlier takes that aren’t as interesting and pretty soon your choices are narrowed down, but here our choices never narrowed. So our narrowing process was our own discipline as filmmakers.”
“One of the first things we would do after we reviewed the footage of the capture was to put together some kind of performance edit,” explains Rivkin. “These were the best of the best. The best performance takes of each actor combined into one scene and played back in realtime for Cameron to enter into and shoot virtual cinematography.”
However, the computing system could not play back an entire scene in realtime, especially if there were a lot of characters in the scene. So once a performance edit was reviewed, they would break it down into what were called “loads,” or sections of scenes, referring to the fact that a scene file needs to be “loaded” into the available memory of the computer. The determining factor in where a “load break” would occur was the editing “cut.”
Rivkin, commenting on his new role as “performance editor,” says, “At the load level we are trying to anticipate where ‘cuts’ might be. Jim will say, ‘There is no question I am cutting to a close-up at this point.’ Well, that’s a logical ‘load break.’ We know we have the freedom to break a load and start a new one there.”
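The load-break logic Rivkin describes can be sketched in a few lines of Python. This is purely illustrative, not Avatar's actual pipeline: the function name, frame counts and playback budget are all hypothetical. The idea is simply that a captured scene is split into loads only at anticipated cut points, keeping each section small enough to play back in realtime.

```python
# Purely illustrative sketch of the "load break" idea: split a captured
# scene into sections ("loads") that fit a realtime playback budget,
# breaking only where the editor already anticipates a cut.
# Function name, frame counts, and budget are hypothetical.

def split_into_loads(scene_frames, anticipated_cuts, max_load_frames):
    """Return (start, end) frame ranges covering [0, scene_frames).

    anticipated_cuts: frame numbers where the editor expects a cut.
    max_load_frames:  rough per-load realtime playback budget.
    """
    boundaries = sorted(c for c in set(anticipated_cuts)
                        if 0 < c < scene_frames)
    boundaries.append(scene_frames)
    loads, start, prev = [], 0, 0
    for b in boundaries:
        # If extending to the next boundary would blow the budget,
        # close the current load at the last usable cut point.
        if b - start > max_load_frames and prev > start:
            loads.append((start, prev))
            start = prev
        prev = b
    loads.append((start, scene_frames))
    return loads

# A 1000-frame scene with cuts anticipated at frames 200, 450 and 700,
# and a 400-frame playback budget, splits at every cut:
print(split_into_loads(1000, [200, 450, 700], 400))
# -> [(0, 200), (200, 450), (450, 700), (700, 1000)]
```

Note that a cut-free span stays one load even if it exceeds the budget, mirroring Rivkin's point that breaks are only taken where a cut is already anticipated.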
“So by the time you have completed the performance edit,” describes Cameron, “you are halfway through a four-step process. Step one is capture, step two is the performance cut, or what I like to call the pre-cut. Step three is shoot the cameras. The actors are not even involved with that part of the process. You set up the ‘loads’ and I’ll be out there with the virtual camera and the editor is sitting there working with me. It’s the most collaborative director/editor process I’ve ever been involved with, because the editor is cutting right away as it feeds directly out of my camera into the Avid [Media Composer system]. He’s assembling the scene behind me as I’m working my way through the coverage on the scene.”
The cinematography process, led by Mauro Fiore (The Kingdom, Smokin’ Aces), that takes place after the performance edit is quite extraordinary. “When Cameron is shooting these ‘loads’ played back, he has the ability to manipulate [the actors] spatially or temporally,” says Rivkin. “If someone reacts too late to a line that’s said in a two-shot, he can slip them a second forwards or backwards, or he can move them closer to each other for better composition. Just like you would tell an actor, ‘Could you lean in a little bit?’”
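The "slipping" Rivkin describes can be pictured as a simple data operation on a recorded performance. The sketch below is illustrative only: the function name and the (time, position) sample layout are hypothetical, not Avatar's actual file format, but they show how offsetting a track in time or space leaves the performance itself untouched.

```python
# Illustrative only: a captured performance treated as a list of
# (time, (x, y, z)) samples. "Slipping" an actor, as Rivkin describes,
# amounts to offsetting the whole track in time and/or space. Names
# and data layout are hypothetical, not Avatar's actual format.

def slip_performance(track, time_offset=0.0, move=(0.0, 0.0, 0.0)):
    """Shift a performance track earlier/later and translate it in space."""
    dx, dy, dz = move
    return [(t + time_offset, (x + dx, y + dy, z + dz))
            for t, (x, y, z) in track]

# Slip an actor half a second later and two units along the x axis:
take = [(0.0, (0.0, 0.0, 0.0)), (1.0, (1.0, 0.0, 0.0))]
print(slip_performance(take, time_offset=0.5, move=(2.0, 0.0, 0.0)))
# -> [(0.5, (2.0, 0.0, 0.0)), (1.5, (3.0, 0.0, 0.0))]
```

Because the edit is applied to the data rather than to a photographed frame, the adjusted performance can still be reshot from any camera angle afterwards.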
So where does this leave the editor who normally, at the end of the day, has a couple of thousand feet of film to edit, yet now the director can shoot forever?
Rivkin leans back in his chair and ponders the question. “This is where it gets interesting and off the beaten path for an editor. We built these ‘loads,’ and [editorial is] on the stage during the camera process because we’d have to tell Jim, ‘This is the load you wanted for this line [of dialogue],’ or, ‘This is the load where this action was good.’ We are streaming his camera angles [from MotionBuilder] into the Avid in realtime. So they are immediately available to look at and analyze.
“Now there’s a whole new thing that’s happening,” he continues. “There’s an interaction between the director and editor that is similar to what you would do in post, but you’re in production. We’d take a break and Jim would say, ‘Let’s see what you’ve got so far.’ And we’d immediately be temping in sound effects and putting in music and doing all kinds of stuff while he’s shooting it. So what would normally be something you would see much later on in the process you’re seeing instantly.”
Rivkin leans over and taps the space bar on his Avid system and the play head starts moving. In the timeline, all 24 tracks of video and audio are being used. At the top of the timeline, the completed film plays back on a giant Panasonic flat screen. “It’s still missing a few effects shots here and there, but it’s mostly done,” he says, explaining that the track directly under what we are seeing is the other eye of the 3D experience. The remaining tracks beneath that master stereo final cut are the history of the project: like rings on a tree or layers in an ice core, every track moves us back in time.
Rivkin shows the same scene shot months ago of actors in performance capture suits scurrying around on an empty, brightly lit stage. When the actors’ performances were captured, four to eight HD cameras were also shooting close-ups so that at any time there was a real reference of each actor’s face. In addition to capturing the actors’ body movements, tiny cameras that look like headset mouthpieces were recording all the facial expressions of the actors’ green-dotted faces. I notice the head gear they’re wearing has the prosthetic pointed ears of the Na’vi characters they’re playing. Nice touch.
Rivkin isolates and plays back another track of that scene. Here, blue, low-polygon rigged CG characters duplicate the movements from the previous version. Their heads also carry low-resolution renditions of the facial motion capture. The semi-transparent look of the features got them dubbed the “Kabuki mask” versions.
When Cameron was shooting the scenes, it could take an hour to get two or three shots exactly the way he wanted: really refining them, doing the smoothing and adjusting the lighting. Now you come to the fourth step in the process. “You finally got your dailies,” the director says, smiling.
Even with the very basic process of moviemaking changed in this revolutionary, paradigm-shifting way, Rivkin still feels that “the same things that apply in any feature film situation apply here. We’re trying to find the best way to tell the story. We’re dealing with performance and basic storytelling. Trying to make the best movie possible using the same editing techniques and intuition and instinct that come into play in any film.”
It’s a whole new world.