Avengers: Endgame VFX supervisor Dan DeLeeuw
Karen Moltenbrey
Issue: May/June 2019

If we have learned anything in the decade since the characters of the Marvel Cinematic Universe (MCU) began gracing the big screen, it’s that each Avenger can easily carry his or her own film to resounding box-office success. But when they band together, it’s a total game changer, and never more so than in the recent Avengers: Endgame, which toppled box-office records even before its release on April 26th. As of press time, its success continues: after just over one month, it is nipping at the heels of Avatar as the top-grossing movie of all time.

In fact, of the top 10 highest-grossing movies, half are Avenger-related films — and only one of those five centers on a single Avenger (Black Panther), while the other four movies feature a team — proving their power in numbers.

With more Avengers on screen come more visual effects scenes. And in Endgame, the 22nd film in the MCU, there are many: nearly 2,500 of the film’s roughly 2,700 shots contain VFX. In comparison, Avengers: Infinity War had approximately 2,700 VFX shots, and since the two films were shot back-to-back, many of the studios on Infinity War continued their work on Endgame, a direct sequel, as did directors Joe and Anthony Russo. According to Dan DeLeeuw, Endgame’s visual effects supervisor, 13 vendors worked on the show, with Weta Digital and Industrial Light & Magic generating the largest shares (494 and 533 shots, respectively), while Digital Domain, Framestore, Cinesite, DNEG and several other vendors took on substantial workloads as well.

Despite Endgame being a continuation of Infinity War, the facilities did not rest on their laurels, making a number of advancements for the latest film, particularly in the work on Thanos, Smart Hulk and the final battle.

“Interestingly, Endgame has about 200 fewer VFX shots than Infinity War, but the actual complexity and length of the shots far exceed anything we’ve ever done,” says DeLeeuw, who was also visual effects supervisor for Infinity War and other MCU films. “It was definitely a progression in terms of understanding the size, scope and especially the density of the effects.”

Endgame picks up following the devastating events of Infinity War, in which an injured Thanos managed to activate the Infinity Gauntlet and half of all life across the universe disintegrated. The surviving Avengers unite to take back the Infinity Stones in order to reverse Thanos’s destruction, but soon learn that Thanos has destroyed the stones. Tony Stark (Iron Man) and Bruce Banner (Hulk) then successfully build a time machine in an attempt to resurrect those whom Thanos disintegrated.


Avenger fans are used to seeing two sides of Hulk (Mark Ruffalo): the angry, green, muscular version and the scientist Bruce Banner. In Endgame, they see a whole new side — Smart Hulk, with the stature of the big green guy but with the intellect and demeanor of the scientist. “I think we pushed Smart Hulk further in terms of what we could do with a digital character,” says DeLeeuw of the work, which was handled by ILM.

ILM has worked with Hulk a few times in the past. But this time, the single character had both brains and brawn, requiring a new approach that gave the CG character more human-like qualities. ILM began with scans from Disney Research Studios’ Medusa Performance Capture System, a mobile rig of eight cameras and lights coupled with proprietary software for reconstructing a high-resolution version of an actor’s face in full motion, without the use of traditional motion-capture dots. Ruffalo (and Josh Brolin as well, for the character Thanos) sat in front of the cameras, practicing facial shapes and performing dialogue. That information was captured and used as the basis both for building the underlying Banner mesh and for ILM’s Hulk facial shape library.

Medusa tracks the pores of an actor’s face, from which a photoreal 3D model is derived. However, the eight-camera solution requires the actor to sit in a studio environment, and the directors wanted him on the set with the other actors. So, ILM began the solve using SNAP, its facial solver at the time, which comprises two head-mounted cameras positioned in front of the actor’s face, on which tracking markers are placed, enabling the facial animation to be acquired on set. The trade-off is that a low-resolution mesh is captured, which ILM then reapplied to the high-resolution Medusa mesh to drive the facial shapes.
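
The core of reapplying a sparse marker solve to a high-resolution shape library can be sketched as a least-squares fit: given where the tracked markers moved, solve for the combination of facial shapes that best explains that motion, then let those weights drive the full-resolution mesh. A minimal sketch with synthetic data (the arrays and names below are invented for illustration, not ILM’s actual SNAP or Medusa code):

```python
import numpy as np

# Hypothetical stand-ins for the capture data: m tracked markers (3D points)
# and k facial shapes in the library.
rng = np.random.default_rng(0)
m, k = 40, 8
neutral = rng.normal(size=(m, 3))           # marker positions on the neutral face
deltas = 0.1 * rng.normal(size=(k, m, 3))   # per-shape offsets at the markers

# Synthesize an "observed" on-set frame from known weights.
true_w = rng.uniform(0.0, 1.0, size=k)
observed = neutral + np.tensordot(true_w, deltas, axes=1)

# Least-squares fit: find weights w minimizing
#   || neutral + sum_i w_i * deltas_i - observed ||^2
A = deltas.reshape(k, -1).T        # (3m, k) blendshape basis matrix
b = (observed - neutral).ravel()   # (3m,) displacement to explain
w, *_ = np.linalg.lstsq(A, b, rcond=None)
# w now recovers true_w and would drive the full-resolution shape library.
```

A production solver would regularize and constrain the fit, but the principle of explaining sparse on-set motion through a richer shape basis is the same.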

“We started there but felt like we needed a higher level of fidelity given Ruffalo’s performance, the nuance of his face,” says Russell Earl, ILM’s VFX supervisor. “So, we started to improve the SNAP solve system by taking the meshes we were solving, and then basically comparing them back to the shapes we had captured in that initial Medusa session.”

The results were better with this newer system, known as Xweave, but then ILM learned about Disney Research Studios’ Anyma, which up to that point had been used as an ADR-style booth with three stationary cameras for recording performance dialogue after the fact. Disney Research Studios adapted the Anyma solver to work with the head-mounted camera footage ILM had already shot. Anyma doesn’t rely on just the low-resolution mesh generated from the tracked points; rather, it generates a mesh per frame and performs a photometric solve based on the footage from the head-mounted cameras, for a much higher-fidelity result.
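
At its core, a photometric solve adjusts the model until its predicted appearance matches the camera footage, rather than matching only a handful of tracked points. A deliberately tiny 1D sketch of that idea (the setup is invented for illustration and is not the Anyma formulation):

```python
import numpy as np

# The "render" of the current model is a 1D intensity profile; the camera
# saw the same profile shifted by an unknown amount.
x = np.linspace(0.0, 2.0 * np.pi, 200)
true_shift = 0.3
observed = np.sin(x - true_shift)   # what the camera captured

def photometric_error(shift):
    """Sum of squared intensity differences between the shifted model
    prediction and the observed footage."""
    predicted = np.sin(x - shift)
    return np.sum((predicted - observed) ** 2)

# Brute-force search over candidate shifts; a real solver would run a
# Gauss-Newton-style optimizer over thousands of parameters per frame.
candidates = np.linspace(-1.0, 1.0, 2001)
best = candidates[np.argmin([photometric_error(s) for s in candidates])]
# best recovers the true shift to within the search resolution.
```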

While the performance solve was being completed with the new system, ILM was simultaneously rebuilding the re-targeting aspect, whereby Ruffalo’s performance would be applied onto Smart Hulk. Once ILM reviewed the solves on a Banner mesh, the crew compared them back to plates of Ruffalo, making sure they had a one-to-one match. Then, using a system called Blink, they re-targeted the Banner solve onto the Hulk model with new code. At the same time, that re-target broke the per-frame mesh solve into Hulk facial shapes, providing animators a much more user-friendly version to work with.

When it came to animation, ILM used new deformers that were also more user friendly for dialing up or down the different aspects of Ruffalo’s performance. (ILM provided its Hulk model, shader information and base rigging to Framestore, which also worked on some Smart Hulk shots.) 


As DeLeeuw points out, Thanos, like Smart Hulk, was another character who was pushed further for this film, particularly for the end battle. Whereas Hulk’s performance is very broad and his face very elastic, Thanos’s face is intense yet subdued. “You’re dealing with a really precise performance,” says DeLeeuw, comparing Thanos and Smart Hulk. “There’s not a lot of movement in [Thanos’s] face.” Nevertheless, the character’s performance is crucial to the storytelling. In the final shots, Thanos does not speak, but it is clear what he is thinking based on his CG body language and facial expressions.

Early in the film, Digital Domain again handled shots of Thanos, as it had for Infinity War, with Weta taking control of the character when he attacks the Avengers’ compound well into the film and creating a few hundred Thanos shots for Endgame, building on the work it had done for Infinity War. 

In the short time between the two films, Weta continued to work on the character. According to Matt Aitken, VFX supervisor at Weta, the facility’s facial modeling team created some new target shapes around the edges of the mouth. The team also took advantage of new developments in Weta’s facial animation pipeline, using Deep Shapes, which enables the animators to procedurally add more fine-level detail to the facial performance. “When Thanos’s facial performance changes from one expression to the next, we’re not changing either of those end points of that transition. They’re still staying exactly the same,” says Aitken. “Our facial animators have complete control over the shape of Thanos’s face, so we can control the facial performance at a very high level. But we wanted to add some complexity to that transition itself, so we’re modeling a little bit of inertia.”
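
Layering inertia onto a transition while leaving its endpoints untouched, as Aitken describes, can be sketched as a damped-spring filter run over an animation curve, followed by a correction that re-pins the first and last frames. A toy version (the function and constants below are invented for illustration, not Weta’s Deep Shapes code):

```python
import numpy as np

def add_inertia(curve, omega=0.5, zeta=0.25, dt=1.0):
    """Follow an animation curve with a damped spring (zeta < 1 gives
    overshoot), then linearly re-pin the endpoints so frames 0 and N-1
    are left unchanged."""
    pos, vel = curve[0], 0.0
    out = np.empty_like(curve)
    for i, target in enumerate(curve):
        acc = omega**2 * (target - pos) - 2.0 * zeta * omega * vel
        vel += acc * dt
        pos += vel * dt
        out[i] = pos
    # Endpoint correction: the end points of the transition stay the same.
    out += np.linspace(curve[0] - out[0], curve[-1] - out[-1], len(curve))
    return out

# A blendshape weight ramping 0 -> 1 over 12 frames, then holding.
ramp = np.concatenate([np.linspace(0.0, 1.0, 12), np.ones(24)])
soft = add_inertia(ramp)
# soft starts and ends exactly where ramp does, but overshoots in between,
# adding a touch of physical inertia to the transition.
```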

Here, Thanos is a bit younger, having come from four to five years in the past, and is more agile and powerful, requiring keyframe animators to dial in his movements on top of the motion captured from Brolin and the motion captured from a stunt performer in the fight scenes during the end battle. 


Endgame is filled with many impressive visual effects sequences, none more so than the end battle, which brings back a plethora of characters, both CG and live-action, who join the fight to defeat Thanos. The past version of Thanos arrives, too, attacking the compound with his warship, and a battle of epic proportions ensues, with various heroes stepping in, relay-style, to carry the gauntlet to safety. “Going back to Civil War, we had about a dozen heroes running at each other and fighting in the big battle scene. In this one, we had hundreds of heroes and villains, and every Avenger using his or her powers at the same time. The complexity of the shots went through the roof!” says DeLeeuw.

Weta built, and then destroyed, the CG compound, making sure set pieces could be moved to accommodate various camera angles. Weta also built digital doubles of all the characters and creatures, essentially a kit used to create the battle, which breaks out within the crater where the compound used to be. Much of the sequence was shot on a soundstage in Atlanta. “No matter how big the stage was, it was never big enough to photograph all the action,” says DeLeeuw.

The Sakaarans from Guardians were actors dressed in black armor. The Chitauri, from the first Avengers, were CG, as were the Outriders from Infinity War. “You are on the stage with stunt people, then the CG has to carry that all the way to what you actually see in the film, where the battlefield is populated and extends for miles beyond what you’re photographing,” DeLeeuw notes. “You’re starting with very little and adding to it.”

The final battle is filled with explosions, starting with the compound and continuing as Captain Marvel (in shots by ILM) destroys Thanos’s ship, all calling for large-scale simulations. “The water sims and explosion sims [in Endgame] are light-years ahead of where we were just six, seven years ago in terms of what we were able to do,” says DeLeeuw.

Make no mistake, visual effects carried this major battle. And in the thick of it all were two vendors in particular: Weta and ILM. Weta’s first shot in the film is in the third act, when Thanos destroys the compound, and its work culminates when [spoiler alert] Stark succumbs to his injuries. The environment is predominantly CG throughout the sequence.

For the explosions, Weta extended the work it did on War for the Planet of the Apes for the destruction of the base at the end of the film, incorporating volumetric pyro physics and modeling the transfer of heat into its simulations for a more physically correct and realistic result. Weta also employed its crowd simulation software, Massive, to populate battle scenes with tens of thousands of soldiers. “There’s a shot when the two armies clash with each other at the start of the battle. It’s one of the longest shots we produced. There are plate elements, but essentially it’s a CG shot that easily could have gotten muddled. Instead, our animation team produced something spectacular and very easy to watch,” says Aitken.
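
Modeling the transfer of heat in a pyro simulation comes down, at its simplest, to a diffusion step on the temperature field plus cooling toward the ambient. A toy explicit finite-difference sketch (not Weta’s actual solver; the grid, constants and names are invented for illustration):

```python
import numpy as np

def diffuse_heat(T, alpha=0.1, dt=1.0, steps=1, ambient=0.0, cool=0.02):
    """Explicit finite-difference steps of the heat equation on a 2D
    temperature grid (periodic boundaries via np.roll), plus simple
    Newtonian cooling toward the ambient temperature.
    Stable for alpha * dt <= 0.25 on a unit grid."""
    for _ in range(steps):
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
               np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T)
        T = T + alpha * dt * lap           # conduction spreads the heat
        T = T - cool * dt * (T - ambient)  # cooling toward the surroundings
    return T

# A hot "explosion core" in the middle of a cold grid.
T = np.zeros((64, 64))
T[28:36, 28:36] = 1000.0
T2 = diffuse_heat(T, steps=50)
# Heat has now spread outward while the peak temperature has dropped.
```

A production solver couples the temperature field to buoyancy and velocity, but the heat-transfer term itself follows this diffusion-plus-cooling pattern.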


In addition to Thanos, Weta also crafted Iron Man’s Nano suit, having a history with the character dating all the way back to the first Avengers film and having seen him progress over time. Additionally, Weta did some Hulk work for the first time, along with shots of Thanos battling Cap, Thor, Iron Man and Captain Marvel at the end of the movie. And the artists handled the Scarlet Witch, whose power is amped up in this film, interpreting graphically styled reference images and turning them into CG simulations for a physical, volumetric look.

Weta also opened portals. In Endgame, just about every hero and villain from the MCU emerges and partakes in the battle, unleashing their full potential. The characters materialize through Dr. Strange’s portals, which Weta reworked at an immense scale, starting with the portal technology it had devised for Infinity War for the more human-size Dr. Strange portals. “It was important that when the portals start to appear, the audience is able to identify what they are and then realize these characters are now alive after having turned to dust at the end of Infinity War,” Aitken says. Moreover, the environments inside the portals, Wakanda, New Asgard, Contraxia, Kamar-Taj and Titan, were full-3D environments built by Weta.

“The days when all the characters were there were some of the most amazing days on set, because you are looking at every character who has ever existed in the first 10 years of the Marvel Universe together at the same time,” DeLeeuw says.

ILM also had a big role in the end battle, creating Strange’s magic. The studio opened the battle as the heroes fall to the lower levels of the compound after the initial attack, animating Hulk and Rocket in the scene. The ILM artists continued the action after Hawkeye hands off the gauntlet, with a shot of Hulk and Cap together, followed by shots of Black Panther once he picks it up and before he is stopped by Thanos’s blade. Spider-Man flies in to assist, then rescue. Strange uses the Winds of Watoomb to form a water tornado, then we’re back to Spidey before Captain Marvel swoops in and takes out Thanos’s ship.


Emotions ran high and low during the battle and in the aftermath. Indeed, the portal openings became a key turning point of the film — in addition to eliciting a good deal of emotional response from audiences. DeLeeuw witnessed the full effect that the scene had on viewers as he watched the film in a theater. “It’s one thing to think [the sequence] is going to work, then you see it with an audience and realize how much it really works and affects them,” he says.

Without question, Endgame weaves the entire MCU saga together into the biggest superhero movie of all time. And just as the characters unite toward a common goal in the film’s end battle, so, too, did the VFX artists from the various studios, overcoming daunting challenges to achieve a level of success never before reached in a cinematic production. Ah, the power of teamwork.

Karen Moltenbrey (karen@cgw.com) is the chief editor of Computer Graphics World, Post’s sister publication.