By Ann Fisher
Issue: March 1, 2003

Creating Digital Characters: Mocap, Keyframe, or a Combination of Both

You want realistic action? Use motion capture. You want range of emotion? Keyframing is the only way to go. Distinctly different tools for distinctly different results. Even the motion capture fans, though, often like to throw a little keyframing into the mix.

For this aardvark character in Jungle Jam, Rhonda Graphics opted for keyframe animation via Alias|Wavefront's Maya.
The Creative Assembly, with offices in Sussex, England, and Brisbane, Australia, is a videogame developer currently producing Rome: Total War, the latest installment of Activision's Total War PC game series. Most of the game's character animations are mocap-based, but for some of the faster combat and attack moves that the performers could never achieve on their own, the artists used a hybrid approach, keying exaggerated motion on top of their mocap data. The Motionbuilder control rig, Kaydara's automated character rigging tool, made that approach possible by letting them start keying onto the character right away, without rigging it first.

"It gave us the best of both worlds of using the subtleties of mocap and the exaggeration of handkey animation," says director of motion capture Greg Alston. "We wanted the animations to have a realistic quality to them. You can't beat the subtleties you get from motion capture. [However] we knew with the power of Motionbuilder, we could apply classic animation techniques on top of our mocap data, so we were no longer just constrained to the actors' performances."
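The layering technique Alston describes, hand-keyed exaggeration added on top of dense mocap samples rather than replacing them, can be sketched in miniature. The flat one-value-per-frame curve format and the function names below are illustrative assumptions for this article, not Motionbuilder's actual API:

```python
# Sketch of additive animation layering: a sparse hand-keyed offset
# curve is interpolated at every frame and summed with the mocap
# samples, exaggerating the capture while keeping its subtleties.
# Curve format (one value per frame) is a simplifying assumption.

def interpolate_keys(keys, frame):
    """Linearly interpolate a sparse {frame: value} key set at 'frame'."""
    frames = sorted(keys)
    if frame <= frames[0]:
        return keys[frames[0]]
    if frame >= frames[-1]:
        return keys[frames[-1]]
    for f0, f1 in zip(frames, frames[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return keys[f0] + t * (keys[f1] - keys[f0])

def layer(mocap, offset_keys):
    """Add the keyed offset layer to each per-frame mocap sample."""
    return [v + interpolate_keys(offset_keys, f) for f, v in enumerate(mocap)]
```

A single key at the peak of a swing, for example, pushes the captured arc further than the performer could, while every frame still inherits the capture's noise-free detail.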

Psyop used keyframe animation for this Lugz "The Arrow" spot, employing Softimage|XSI running on Pentium III and 4 workstations.
Alston credits Motionbuilder with helping smooth out other challenges usually associated with motion capture. "Because Motionbuilder is a great motion capture editor and an excellent 3D animation software, it gave us an even greater choice of solutions for occlusion, noise and marker swapping. Here we could assess whether it would be more efficient to reconstruct the optical data, such as filling gaps and filtering, or to handkey the problems out," he says. Creative Assembly uses its own in-house Vicon 370 motion capture system: it captured the performers' moves, imported the raw data onto a virtual actor in Motionbuilder, then mapped the motion onto the skinned character.

Alston does note one downside of the technique: "Overkeying on top of mocap may look inconsistent next to plain motion capture, making it too cartoonlike."

Creative Assembly has used mocap for several sports PC games and is also working on Total War products for the PS2, Xbox and GameCube consoles.


3DBob Productions, a Hollywood-based character animation and visual effects studio, used a similar hybrid approach for The GodMan, a 48-minute photoreal CG "life of Jesus" movie, completed this month for client The Book of Hope (Fort Lauderdale). The film was produced completely in-house at 3DBob, using 30 individual character rigs that were multiplied into a cast of more than 100.

The film was directed like a live-action production.

Mocap performances were taped on miniDV using Ascension's MotionStar Wireless magnetic system. That footage was edited together and cleaned up with Kaydara's Filmbox before being exported to Discreet's 3DS Max/Character Studio.

Toronto's Dan Krech Productions spent 14 months creating Scourge of Worlds: A Dungeons and Dragons Adventure, a 140-minute animated feature film. Using Alias|Wavefront Maya v4, the DKP team created 14 characters for the feature, all based on the same reference model. Textures were added using Photoshop and BodyPaint. Scourge will be distributed by Rhino Records in June.
"Motion capture was the best way for our characters to have a true human-like performance," says Bob Arvin, president of 3DBob. "The great thing about the hybrid technique is that you can refine each portion of the character's performance individually. On the mocap stage, we get the broad performance. When we ADR the mocap reel, we can then bring in the appropriate voice talent and adjust the dialogue timing when necessary without having to worry about lipsync. Once we lay out the scene and begin animation, we can further refine the facial and body performance of each character [with keyframing].

"The only con was the indirect method of getting data from Filmbox to Character Studio," he adds. "The .bvh format was the only way to communicate between the two packages, and we lost all neck and spine data during that step, causing extra work for the animators."


Duck Soup, an animation studio in West Los Angeles, pursues a completely different path when creating its digital characters. It has just finished its second CG animated short film, all keyframed using Alias|Wavefront Maya 4.0 on Pentium 4 and Athlon PCs.

"Years ago, when we realized the wave of the future was going to be CGI, we decided the only way people were going to take us seriously was if we did something on our own. So we did Snowman and it worked the way we hoped," says executive producer Mark Medernach. Duck Soup's first short, Snowman, hit the festival circuit about a year ago, showing in places like SIGGRAPH's Electronic Theater, and has served as a valuable marketing tool. "We just finished our first two M&M spots and I know Snowman led them to us."

Duck Soup's new short, Kozo, expands the range of the studio's character animators. Snowman was about aliens abducting a snowman that melts, so the emotional range was fairly limited. Kozo is about a purple hippo that has a run-in with a vending machine, and its expressions range from joy to frustration to anger. "Also, in Kozo we wanted to show more of a true stop-motion look. We lit it with global illumination so it looks almost photoreal, as if you photographed him… true lighting as opposed to CGI lighting," he says.

Kozo was designed and directed by Lane Nakamura, the Duck Soup director who also directed Snowman. The short took about a year to produce, though that timetable was dictated by other Duck Soup jobs that had to be scheduled around. Medernach estimates the entire short could have been done in about four months.

"Kozo cost us quite a bit of money to do it, but it's been worth it because now it's paying off," says Medernach. "And it's also made us realize that we can provide content as well." Nakamura is currently developing Kozo into a feature length project.


Rhonda Olsen, president/creative director of Phoenix, AZ's Rhonda Graphics Animation Studios, is in the keyframe camp, too. Her shop specializes in 3D computer animation and in combining 3D elements with live action for spots, videos, broadcast and films. They recently animated an aardvark character named Sully for a direct-to-video series titled Jungle Jam. It was all 3D CGI, done with Maya on an NT workstation.

"Most people would say that keyframing takes a lot of time, but I think the end result is well worth it. We're not big on motion capture at our shop," says Olsen. "Personally, I have not seen a motion captured animation that I was really impressed with, except for Gollum in Lord of the Rings: The Two Towers. I read that most shots were a combination of motion capture and keyframing, which makes sense. With really great motion capture data you can get close, but adding the keyframing takes the whole shot up a notch.

"We prefer keyframing to motion capture," she states. "It allows the animators to bring out the true emotion of the character and, in this case, we were animating an aardvark who talks, so we weren't looking for a photorealistic aardvark by any means. He is a 'character' by design, an aardvark whose favorite pastime is doing cannonballs at his swimming hole. In this sequence, we were going for a Roadrunner and Wile E. Coyote type of schtick, where Sully's legs spun around underneath him like a buzzsaw [a la Roadrunner] - very cartoony and exaggerated. I don't think you could achieve that with motion capture."


Mark Mayerson agrees with Olsen. He is the director and creator of Monster By Mistake, a children's CG television series produced in Canada, now in its fourth season. Monster By Mistake is a creation of Catapult Productions in Toronto.

"We don't feel that motion capture works on characters who are supposed to be cartoon characters. Motion capture would have to be extensively edited to eliminate the realistic timing and action," he says. "By the time we did that, it would be less efficient than keyframing. Keyframing allows us to get the performance we want for these characters."

The main characters in Monster By Mistake are a brother and sister. The brother, Warren, got mixed up in a magic spell and turns into a seven-foot blue monster every time he sneezes. His sister Tracy has the book of spells that caused Warren's condition. Their best friend is a ghost who lives in the attic.

Currently, episodes 27 to 52 are in production; air dates are fall 2003. Two dozen animators work in three teams; each team animates a half-hour episode in seven weeks using Side Effects Houdini software.

"One of the great things about Houdini is that it is an integrated package and procedural," says Mayerson. "We have an effect where the boy changes into the monster. Because Houdini has such a comprehensive scripting language, we were able to create a macro that lives inside the Houdini interface to take care of this effect. The effect renders out images of the boy and the monster, turns them into high contrast images, traces them into geometry, and forces the geometry to have the same number of points so that we can blend between the outlines of the boy and the monster. It also lines up the new geometry with the camera. The same macro uses the traced geometry as a particle source for a particle system that accompanies the transformation.

"Everything takes place within the same Houdini file through a simple front end interface. The macro takes care of dealing with rendering, compositing, geometry creation and particle systems. All the animators have to do is specify which way the transformation is going (boy to monster or vice versa), the start frame and the length of the transformation."
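The core of the transformation effect Mayerson describes is forcing two traced silhouettes to the same number of points so their outlines can be blended. The step can be sketched outside Houdini; the plain-Python functions below are a hypothetical illustration of that idea, not Houdini's scripting interface:

```python
# Sketch of the outline-blend step: resample two closed polylines to
# the same point count by walking each perimeter at even arc-length
# intervals, then linearly interpolate point-for-point between them.
import math

def resample(outline, n):
    """Resample a closed polyline (list of (x, y)) to n evenly spaced points."""
    pts = outline + [outline[0]]          # close the loop
    seg = [math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1)]
    total = sum(seg)
    cum = [0.0]
    for s in seg:
        cum.append(cum[-1] + s)
    out, j = [], 0
    for k in range(n):
        target = total * k / n            # distance along the perimeter
        while cum[j + 1] < target:
            j += 1
        t = (target - cum[j]) / seg[j] if seg[j] else 0.0
        (x0, y0), (x1, y1) = pts[j], pts[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def blend(outline_a, outline_b, t, n=64):
    """Interpolate between two outlines at blend factor t in [0, 1]."""
    a, b = resample(outline_a, n), resample(outline_b, n)
    return [(ax + t * (bx - ax), ay + t * (by - ay))
            for (ax, ay), (bx, by) in zip(a, b)]
```

Stepping t from 0 to 1 over the transformation's length then morphs one silhouette into the other, which is the blend the macro drives between the boy's outline and the monster's.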


Pixel Blues, a digital post production company and graphics department in Burbank, uses whichever method is most appropriate for the job. When Raval Media and i25 Post out of Colorado contracted with Pixel Blues for a :30 Taco John's restaurant spot featuring two 3D talking quesadillas, keyframing got the nod.

"We used keyframe instead of motion capture because of the budget and the short duration of the animation. Also, the character had no arms or legs, and motion capture didn't seem to be the best solution," says Pixel Blues CEO/co-founder Randy Tede. Because the characters had no body, all emotions had to be delivered via the mouth and eyes.

Pixel Blues 3D artist Jim Hanna modeled the characters. Maya on a PC was used for the animation, and a Discreet Flame for final compositing.

Psyop, a collective of creative and technical directors in New York, uses keyframing for its CG character work, most recently a :30 spot now on air for Lugz shoes, created for client Avrett, Free & Ginsberg. The Arrow celebrates hip-hop culture by weaving through three different characters in a gritty urban environment.

"If we hand animate our characters, we can add more personality to them, more freedom and more attitude," says co-director Marco Spier. "And because of the interaction with the graffiti - they're interacting with this arrow around them - the action had to be hand animated."

The spot was animated using Softimage|XSI on P3 and P4 workstations, then composited on a Flame. This project had nearly 100 layers that included real objects like paint spatter on plexiglass and handheld camera footage.

"We do mostly design and animation motion graphics," says Hyon, the other co-director. "We've done all graphic spots before, but we've been specializing in characters the last couple of months. Most clients would like to see characters in their spots. It was a fad for a while in the world of advertising, and I think we're kind of going away from characters in the future."