Cover Story: Rogue One: A Star Wars Story
Issue: January 1, 2017

The first in the “Star Wars Anthology” series, Rogue One: A Star Wars Story takes place shortly before the events of the original Star Wars, when Rebel spies, led by Jyn Erso, set out to steal the design schematics for the Galactic Empire’s new superweapon, the Death Star. The film is produced by Lucasfilm and distributed by Walt Disney Studios Motion Pictures.



“The idea of the anthology stories is to explore stories in the Star Wars universe in ways we haven’t seen before,” says Nigel Sumner, VFX supervisor at Industrial Light & Magic (ILM; www.ilm.com). Director Gareth Edwards and DP Greig Fraser “have created unique visuals through cinematography, lensing and lighting, but the design language segues directly to ‘A New Hope’ — there’s a wonderful legacy look and feel to Rogue One. I’d love to see a double feature with both films!”

Sumner saw his first Star Wars film — Return of the Jedi — when he was seven. “It held such a warm place in my heart,” he says. Little did he know that he’d grow up to work on episodes 2 and 3 at ILM and play a major role in a new chapter in the Star Wars franchise.

Rogue One is “comparable to other Star Wars pictures in the number of VFX shots: These films tend to have a high shot count,” notes Sumner. While prepro was centered at ILM’s San Francisco and London studios (principal photography was done at Elstree Studios near London), VFX-shot production involved the creative talents of all four ILM facilities: San Francisco, London, Vancouver and Singapore. “For every project we reinforce and strengthen the pipelines already established between studios,” Sumner explains. “The volume of work required for a picture of this scope and scale can really challenge a system.”



Work was “placed strategically” among the studios, he says, with “sizable chunks owned creatively by each studio, from layout to final effects and compositing.”

The Third Floor (www.thethirdfloorinc.com) created previs, techvis and postvis for the movie under previs supervisor Barry Howell and UK lead Margaux Durand-Rival. The team began working with Lucasfilm and ILM’s art departments in San Francisco in November 2014, then started production in the UK at the beginning of 2015 with artists from The Third Floor London. 

The company brought its own Star Wars heritage to the project: Its six cofounders met on the third floor of the Main House at Skywalker Ranch during the making of episode 3. “When we finished, we had enjoyed working together so much that we decided to form our own company,” says Howell. “Now we’re one of the largest previs companies around, employing over 250 people worldwide, so it was quite enjoyable to be able to return to our roots and work with some of the same colleagues again.”

In providing previs for Rogue One, The Third Floor was asked to show “the feel of the movie’s aesthetics — the mood, the atmosphere, the lighting — so you felt like you were watching the first pass of the movie,” Howell explains. “Gareth was very clear about wanting to feel the emotion. Lighting played a huge part in his storyboards, which were more like mood paintings. He wanted us to capture that in the previs.”

The Third Floor’s previs touched “all of the big action pieces” as well as vehicle interiors and “smaller emotional moments you don’t normally associate with previs,” he says. The most fun was developing ways to reveal the Death Star and the destruction it brings.



“We knew it would be the big moment in the movie, the moment everybody remembers, so we wanted to nail it as best we could,” Howell recalls. “Gareth gave us sketches and the variables and let us play. We provided options that progressed through multiple iterations that integrated old and new elements.”

Although The Third Floor employed its usual suite of proprietary software for previs and postvis, the company invented Random Cam for Rogue One. The Third Floor had previously provided previs for Edwards’ Godzilla and realized that the director wanted to introduce “those happy on-set accidents, that sense of randomness” into previs.

So The Third Floor wrote a new script that takes the previs environment and the object of interest, sets up variable camera angles and auto-generates different perspectives. “You wouldn’t want to use the majority of those images, but there are always a couple of new and unique ones,” says Howell. “Gareth sorted through the images for the big destruction scene on planet Jedha and gave three or four to a concept artist to paint over. Moviemaking often falls back on tried and true ways of showing action, and this allowed us to view shots with fresh eyes using angles one would not normally think of.”
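The article's description of Random Cam is all that has been published about the tool, but the core mechanic is easy to illustrate. The Python sketch below is a hypothetical stand-in, not The Third Floor's code; the function random_camera and its parameters are invented for illustration. It scatters candidate cameras around an object of interest and slightly jitters the framing, mimicking the happy accidents Howell describes.

```python
# Hypothetical sketch of a Random Cam-style tool; The Third Floor's actual
# script is proprietary. Given an object of interest, auto-generate varied
# camera placements that still frame the subject.
import math
import random

def random_camera(target, min_dist=10.0, max_dist=200.0, seed=None):
    """Return a (position, look_at) pair aimed at `target` from a random angle."""
    rng = random.Random(seed)
    azimuth = rng.uniform(0.0, 2.0 * math.pi)   # horizontal angle around the subject
    elevation = rng.uniform(-0.3, 1.2)          # radians above/below the horizon
    distance = rng.uniform(min_dist, max_dist)
    x = target[0] + distance * math.cos(elevation) * math.cos(azimuth)
    y = target[1] + distance * math.sin(elevation)
    z = target[2] + distance * math.cos(elevation) * math.sin(azimuth)
    # A small random offset on the look-at point gives imperfect, "accidental" framing.
    look_at = tuple(t + rng.uniform(-2.0, 2.0) for t in target)
    return (x, y, z), look_at

# Generate a batch of candidates; most get discarded, but a few surprise you.
subject = (0.0, 50.0, 0.0)  # e.g., a point of interest in the previs scene
candidates = [random_camera(subject, seed=i) for i in range(100)]
```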

Rogue One was shot with Arri Alexa 65 65mm cinema cameras. “The 6.5K image sensor captures a wonderful amount of detail, but it’s a lot of data to transfer and that adds to the volume component,” says ILM’s Sumner. “A large percentage of shots were driven and created in 4K; the rest were created in 2K and upscaled.”



According to Sumner, one of the biggest challenges was tying the ethos of 2016’s Rogue One into 1977’s A New Hope. “The technology at our disposal is very different now, but we wanted to recapture the essence of the original movie and recreate it in this storytelling. Part of that is achieved with the characters, creatures and designs, but it was up to us to come up with new ways to bridge the aesthetic and technical gap while retaining the ethos of the movie.”

One method involved the legacy vehicles seen in Rogue One. When ILM’s model makers built the miniatures for the original Star Wars, they detailed them with parts from commercial model kits. “Our model supervisor Russell Paul embraced the same notion,” says Sumner. “He tracked down the original model kits and scanned them in 3D, and we populated a parts library to help build out digital models of the iconic vehicles and flesh out detail on new vehicles using the design language of 1977. For me, accessing the practical models from our archives was a childhood dream come true!”

Environment supervisor Enrico Damm and his team used Autodesk 3ds Max and Chaos Group’s V-Ray to craft set extensions for the planet Jedha based on location photography in Jordan and to recreate stunning aerials shot in the Maldives. They also used SpeedTree to create the lush tropical landscape surrounding the Imperial base on Scarif.



ILM also introduced a successful new approach to shooting virtual sets, one that involved a basic form of set construction. “It didn’t make sense to do a full set build for most scenes in the cockpits, so we created a low-form-factor proxy version of the set for the actors to work in and the DP to frame and light, then we replaced that with mattes of the virtual set,” Sumner explains. “That gave us an advantage over bluescreen because there was more natural base lighting and ambience from the shape of the environment. It provided a grounding in lighting the foreground characters we wouldn’t have had with traditional bluescreen.”

The Third Floor created footage to project onto massive LED screens when the actors were inside any spaceship. “We customized a plug-in that allowed us to create a 4K 360-degree spherical image using our previs environments,” Howell explains. “We could provide these to ILM for projection on the LED screens through TouchDesigner. As the environments almost encapsulated the actors, it was possible to light them and the set with proper lighting and color temperature so the plates would integrate well with the final composites.

“Because of the nature of how Gareth shoots, often keeping the background out of focus, some of the footage we saw in the dailies with the LED screens looked almost like a first pass at a final,” he notes.
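Howell’s 4K 360-degree spherical image is what is commonly called an equirectangular, or lat-long, frame: twice as wide as it is tall, with longitude mapped across and latitude mapped down. The plug-in itself is proprietary, but the mapping at its heart can be sketched in a few lines of Python; pixel_to_direction is an invented name, and the dimensions are assumptions based on the 4K figure quoted above.

```python
# Hypothetical sketch of the lat-long mapping behind a 360-degree spherical
# frame; The Third Floor's plug-in is proprietary. Each output pixel maps to
# a view direction along which the previs environment is sampled.
import math

WIDTH, HEIGHT = 4096, 2048  # assumed 4K equirectangular frame, 2:1 aspect

def pixel_to_direction(px, py):
    """Map an output pixel to the unit view direction it represents."""
    lon = (px / WIDTH) * 2.0 * math.pi - math.pi   # longitude wraps horizontally
    lat = math.pi / 2.0 - (py / HEIGHT) * math.pi  # latitude runs zenith to nadir
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

# A renderer fills the frame by sampling along each pixel's direction; software
# such as TouchDesigner can then warp the result onto the physical LED layout.
assert pixel_to_direction(WIDTH // 2, HEIGHT // 2)[2] > 0.99  # center looks straight ahead
```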



The Third Floor went beyond its usual toolset to tackle some of the more complicated LED scenes. “We knew we couldn’t replicate volumetric clouds with Maya, so we used the Unreal game engine, which worked very well,” Howell reports.

ILM’s proprietary on-set motion-capture system, nicknamed iMocap, was used for the character of K-2SO. The reprogrammed security droid now serving the Rebels is played by actor Alan Tudyk. “We wanted to capture K-2SO live on-set with the other actors,” says ILM animation supervisor Hal Hickel. He saw the original Star Wars at age 13 and grew up to work on episodes 1 and 2 and special edition content for episode 4.

“We developed iMocap 10 years ago for Davy Jones and his crew in Pirates of the Caribbean: Dead Man’s Chest,” he explains. “Since then, live on-set motion capture has become more commonplace when the character is roughly human size and you want it to interact with other actors. K-2SO is about seven feet tall, so Alan wore high-tech stilts with motorized ankles, created by Neal Scanlan’s team, that gave him a natural gait and made him the right height for eye lines.”

Hickel notes that before principal photography began in the UK, Tudyk rehearsed on painter’s stilts on ILM’s motion-capture stage. “He could look in our monitors and see himself as K-2SO in realtime, which allowed him to figure out certain things, like how to carry his body and his arm movements, before shooting began.”

ILM animators were charged with extracting Tudyk’s motion-capture performance and applying it to his CG character, making sure that “we didn’t change his performance but preserved it,” says Hickel. “K-2SO had to portray Alan’s intentions as an actor and communicate faithfully what he was going for.”

Although Hickel reminds us that droids in Star Wars don’t have expressive faces, the animators gave K-2SO “extra expression in his rotating and darting eyes. That was fun to play with. We even tried a blink but decided that went one step too far.”



Perhaps the biggest challenge for ILM animators in San Francisco was creating two iconic digital human characters: Grand Moff Tarkin, originally played by the late Peter Cushing, who appears in about 40 shots, and the late Carrie Fisher’s young Princess Leia, who is seen in one shot.

“It took a lot of hard work by a rock star team who took a really rigorous approach,” says Hickel. “It’s easy, midway through, when problems arise, to be tempted to use band-aids to fix things and get the shot done. But we knew we had to be rigorous in evaluating the model, comparing it to archival footage, painting and shading it. If there were any problems, we couldn’t use cosmetics to hide them — we had to lift the hood and really fix things.”

Animators replaced the head of the actor performing the Tarkin character with a younger Peter Cushing head; sometimes they replaced his entire body with CG. “An interpretation has to happen — Guy Henry, the actor playing Tarkin, wasn’t doing a Peter Cushing impression. We had to make sure his performance fit in the scene but that it felt like Cushing,” says Hickel.  “We all got very focused on the details. Toward the end of the process, when we were getting notes from Gareth, we were beyond discussing skin and teeth and really got into the acting.”

As in most Star Wars films, there is an epic space battle, this one surrounding the Shield Gate, and it was a massive undertaking for the animators. “What made it interesting and fresh was Gareth’s direction: It was like shooting live action with immersive sets on our virtual production stage,” says Hickel. “During rehearsals and the early takes, he and the camera operators hunted for angles as the scenes played out — they figured out the best place to be in the middle of the action so we could follow through with the animation.”



To achieve this, “We pre-animated beats of the battle action and loaded it into the virtual cameras on a loop. On our virtual production stage, here in San Francisco or in London, Gareth could look through the viewfinder and see how the action played out over and over. He could conceive shots and find angles, he could ask us to attach a camera to a particular ship. It’s an interesting way to work beyond storyboards and previs,” Hickel notes.
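ILM’s stage software is proprietary, but the looping mechanic Hickel describes can be sketched simply. In this hypothetical Python fragment, every name and number is invented for illustration: a pre-animated beat wraps continuously so the director can watch the action play out over and over, and the virtual camera can be either free-roaming or attached to a particular ship.

```python
# Hypothetical sketch of the looping virtual-camera setup; ILM's stage
# software is proprietary and all names here are invented. A pre-animated
# beat wraps endlessly, and the camera can be parented to a ship within it.
import time

BEAT_DURATION = 8.0  # assumed seconds of pre-animated battle action per loop

def beat_time(wall_clock):
    """Wrap wall-clock time so the beat replays over and over."""
    return wall_clock % BEAT_DURATION

def camera_position(t, ship_path=None, offset=(0.0, 1.8, -4.0)):
    """Free-roaming by default; rides behind a ship when a path is attached."""
    if ship_path is None:
        return (0.0, 1.8, 0.0)  # the operator hunts for angles on foot
    return tuple(p + o for p, o in zip(ship_path(t), offset))

def ship(t):
    """A canned flight path standing in for one pre-animated vessel."""
    return (40.0 * t, 120.0, 10.0 * t)

# Each frame, sample the looped beat and place the camera relative to the ship.
t = beat_time(time.time())
print(camera_position(t, ship_path=ship))
```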

Autodesk Maya was ILM’s workhorse tool for animation and some modeling. ILM also relied on The Foundry’s Nuke for compositing, Side Effects’ Houdini for particle effects, its proprietary Zeno and Plume for simulation and rendering, and the newly developed ILM Flux for digital character reconstruction based on archival footage.

During the film’s 18-month production schedule, ILM “made the switch” to Pixar’s RenderMan/RIS interactive rendering, says Sumner. “From a lighting and rendering standpoint, it was a new paradigm and gave us a new shader set.”

RIS had a few on-set applications, including on-set LED lighting. “When they were shooting the cockpit on a gimbal, we pre-rendered a number of sequences that were played back on an array of LED panels for natural, interactive lighting on the environment and the actors,” he says.

Rogue One also marked the first time the output of ILM’s realtime rendering engine was used in a film. “We’ve been working with Lucasfilm’s Advanced Development Group in San Francisco on realtime rendering for features,” says Sumner. “A handful of shots for K-2SO were lit and rendered through this pipeline and composited into shots. It’s a very positive advancement, and I hope we’ll continue to use it in the future.”

The Third Floor’s postvis fed editorial in London, where artists filled in backgrounds or added CG characters and elements to produce temp comps.

Sumner notes that it was “humbling” to work under ILM chief creative officer John Knoll on Rogue One. “For four years, from the movie’s inception to completion, he was the cornerstone,” says Sumner. “From the concept of the story through the initial drafts and acting as senior VFX supervisor on the show and one of the executive producers, he poured his energy and creative passion into Rogue One. We all wanted to do the best job we could do for him.”