Previs: Ncam goes on-set for Solo: A Star Wars Story
Issue: October 1, 2018

Solo: A Star Wars Story is the most recent release in the space-fantasy franchise. For this installment, the teams at Lucasfilm and ILM took visual inspiration from Star Wars Episode IV: A New Hope to tell a new story of Han Solo’s early adventures. 

Ncam’s augmented reality platform was selected to provide live pre-visualisation on-set, enabling director Ron Howard, director of photography Bradford Young, the cast, camera teams and editors to better visualise complex bluescreen VFX scenes, as well as speeding up the post production process.

Rob Bredow, visual effects supervisor and co-producer on Solo: A Star Wars Story, and Hugh Macdonald, head of technology at visualisation house Nvizage, recently discussed how Ncam’s technology enhanced the production process while saving time and costs, most notably on a major stunt sequence set on a speeding train.

“From the very beginning, we set out to be inspired by the era of VFX from Episode IV: A New Hope and ground our choices of VFX techniques and cameras back to that era, while modernising those techniques with the advanced technology available to us today,” explains Bredow. “We created practical creatures, props and sets wherever possible. We would only place cameras where real cameras could be positioned and we employed rear projection techniques when shooting the cockpits…and modern technologies like Ncam’s helped us to enhance every shot we utilized it on.”

The ‘Imperial Train Heist’ is a key scene in Solo, a sequence of more than 10 minutes’ duration that was months in the making. The train features industrial-looking carriages stacked above and below a narrow chain-like track, tilting precariously as they hurtle along a rocky, mountainous route at high speed. The action primarily takes place atop the carriages as Beckett and his gang, including Han Solo, climb and jump from carriage to carriage of the moving train on their mission to seize valuable cargo, while simultaneously fighting off a fleet of marauders on speeders.

“We were driven by the desire to make it all as realistic as possible,” adds Bredow. “We did some practical tests, including building a mock-up of the train on a runway, and driving a semi-truck at 80mph to see what it would really look like out there. Ultimately, we decided to shoot the sequence on a bluescreen stage so we would have full control of the lighting environment, knowing we could duplicate the wind with fans and the carriage motion with a specialized motion base.”

To create the mountainous backdrop of the planet Vandor, a camera crew travelled to the Italian Dolomites during winter to ensure the right weather and lighting conditions. With an Arri Alexa 65 on the nose of a helicopter, they shot over 100 aerial plates that were used as backgrounds and references for digital environments throughout the sequence. In addition, the ILM team photo-modeled thousands of photographs into 250 square miles of mountains, which were assembled in MotionBuilder into one continuous track that was used as the basis for the previs.
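To make the idea of that continuous track a little more concrete, the sketch below is a hypothetical Python illustration (not ILM’s actual MotionBuilder setup): it chains a few named environment sections end to end and reports where along the combined route the train would be at a given time. The section names, lengths and speed are invented for the example.

```python
# Hypothetical sketch: chain terrain "sections" into one continuous track and
# sample the train's position along it over time. All names and numbers here
# are illustrative assumptions, not values from the production.
from dataclasses import dataclass


@dataclass
class TrackSection:
    name: str
    length_m: float  # length of this stretch of track in metres


def position_on_track(sections, distance_m):
    """Map a distance travelled along the full track to (section, local offset)."""
    remaining = distance_m
    for section in sections:
        if remaining <= section.length_m:
            return section.name, remaining
        remaining -= section.length_m
    last = sections[-1]
    return last.name, last.length_m  # clamp at the end of the track


# Invented "sections" of mountain environment stitched end to end.
track = [
    TrackSection("canyon_approach", 4_000.0),
    TrackSection("bridge_span", 2_500.0),
    TrackSection("summit_ridge", 6_000.0),
]

train_speed_mps = 36.0  # roughly 80 mph, purely as an illustrative figure
for t in range(0, 301, 60):  # sample once a minute for five minutes
    section, offset = position_on_track(track, train_speed_mps * t)
    print(f"t={t:4d}s  section={section:15s}  offset={offset:7.1f} m")
```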

“We deployed the Ncam system on-set to provide a live preview of the CG background showing the mountains going past the train. Ordinarily, with a bluescreen shoot, everyone on-set has to imagine what the final composited shot will look like, which makes it hard for actors to react and can result in errors that aren’t picked up until postvis or post production,” Bredow explains. “With the Ncam system, we could see the on-set action and the CG background composited together in realtime, live on-set, which really helped our cast, camera operators and editors to understand what we were shooting for each of the complicated beats of that sequence. We could see the mountains racing by the train and frame up appropriately; when a corner was coming up, the camera could pan or tilt from the actor’s face up to the corner ahead, and we could intuitively find those shots on the set that simulated being out in the real environment.”
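As a rough conceptual illustration of what such a live composite involves, the hypothetical Python/OpenCV sketch below keys blue pixels out of a camera frame and drops a stand-in background behind the remaining foreground. It is only an approximation of the idea; Ncam’s realtime tracking and rendering pipeline is far more sophisticated, and every threshold and test image here is made up.

```python
# Conceptual sketch of a live bluescreen composite: key out the blue backing
# and place a CG background behind the foreground action. Illustrative only.
import numpy as np
import cv2


def composite(camera_frame_bgr, cg_background_bgr):
    """Replace bluescreen pixels in the camera frame with the CG background."""
    hsv = cv2.cvtColor(camera_frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough blue range in OpenCV's HSV space (hue 0-179); these values are
    # guesses and would be tuned to the stage lighting in practice.
    lower = np.array((100, 120, 60), dtype=np.uint8)
    upper = np.array((130, 255, 255), dtype=np.uint8)
    blue_mask = cv2.inRange(hsv, lower, upper)
    mask_3ch = cv2.merge([blue_mask] * 3) > 0
    return np.where(mask_3ch, cg_background_bgr, camera_frame_bgr)


# Stand-in frames so the sketch runs without any footage: a solid bluescreen
# plate with a grey "actor" block, and a horizontal gradient as the mountains.
h, w = 540, 960
plate = np.full((h, w, 3), (200, 80, 20), dtype=np.uint8)  # bluescreen (BGR)
plate[200:450, 380:580] = (90, 90, 90)                     # foreground subject
gradient = np.tile(np.linspace(40, 220, w, dtype=np.uint8), (h, 1))
cg = cv2.merge([gradient, gradient, gradient])

cv2.imwrite("preview_composite.png", composite(plate, cg))
```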

Nvizage supplied the Ncam systems for the production, along with a team led by Macdonald. For the majority of the production a single Ncam Reality system was on-set, though two systems were used on a few key scenes where it was useful to capture data from two cameras simultaneously. The Ncam camera bar was rigged onto various cameras throughout the shoot, including an Arri Alexa 65 and an Arri Alexa Mini, and was mounted under or over the camera as needed.

One of the biggest challenges on this mammoth sequence was keeping all of the action beats organized. 

“Working closely with Hugh, our previs team at The Third Floor, and the artists at ILM, we planned out long beats, or what we called ‘chapters,’ that we could photograph and give to the director, Ron Howard, or the second unit director, Brad Allan,” continues Bredow. “It required planning and collaboration, but that pre-visualization work gave us lots of flexibility. Whether we wanted a short action beat or to shoot for four minutes continuously, we had around 25 chapters of previs material built in advance and could call them up at a moment’s notice on-set and be ready for the next setup.”

Hugh Macdonald recalls one particular section when Beckett (Woody Harrelson) hears the speeders coming and spins around to shoot as they fly past. “The unique ability of the Ncam system to enable the visualization of VFX shots through the camera lens really aided how this scene was shot. It meant that the second unit could watch the Ncam preview and show Woody exactly where the speeders were coming from. Brad could then make sure that Woody’s action was going to fit with when the VFX came in, taking away the need to retime him or adjust the speed of the vehicle later. It really helped the creative process and sped up the production. It also meant that everyone could be confident that the work on the day would tie together seamlessly in post,” he recalls. 

The flexibility of the Ncam system also enabled the production to adapt with the changing needs of a highly complex action sequence, as Bredow explains: “The sequence evolved as it was shot, so if Ron Howard or [co-producer and writer] Jon Kasdan or I had an idea for a revision, we could talk to our previs team at The Third Floor and they could turn around a new chapter, generate a new MotionBuilder file and run it over to the set the next day or even that same day. We could be up and running very quickly with a new action beat that met the director’s requirements for the story.” 

There were some key features of the Ncam system that further eased the workflow, whether adapting to the director’s preferred working method or enabling on-the-fly changes to the animation, as Bredow explains:

“Ron likes doing long, continuous takes that give the actors the opportunity to try different performances. Hugh created shortcuts within the system to play specific background clips that coincided with various sections of the script. Ron could therefore do several takes in a row without cutting the cameras, and Hugh would trigger the playback for each take so that the correct section of the mountains was going by as the dialogue was delivered. We could keep going as long as necessary and the background shot would be consistent.” 
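A toy sketch of the idea behind those shortcuts follows, with entirely invented chapter names, keys and frame ranges: each key press cues the stretch of pre-rendered background matching the section of script about to be played, so a long take can roll on while the operator jumps the background to the right place.

```python
# Illustrative sketch: bind keys to "chapters" of pre-rendered background so
# the correct stretch of mountains can be cued for each part of a long take.
# Chapter names, frame ranges and key bindings are invented for this example.
CHAPTERS = {
    "1": ("canyon_approach", 0, 2880),      # frame ranges at 24 fps
    "2": ("bridge_span", 2880, 4320),
    "3": ("summit_ridge", 4320, 7200),
}


class BackgroundPlayback:
    def __init__(self, chapters):
        self.chapters = chapters
        self.current = None

    def trigger(self, key):
        """Jump playback to the chapter bound to this key, if any."""
        chapter = self.chapters.get(key)
        if chapter is None:
            print(f"no chapter bound to key '{key}'")
            return
        name, start, end = chapter
        self.current = name
        print(f"cueing '{name}': frames {start}-{end}")


playback = BackgroundPlayback(CHAPTERS)
for key in ["1", "2", "1", "9"]:  # simulated key presses between takes
    playback.trigger(key)
```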

Macdonald adds: “The Ncam system also came into play when there were some elements of camera positioning that the second unit couldn’t completely plan in advance. In the section when Val [Thandie Newton] is coming to the attention of some droids, Brad [Allan] might ask if a droid could come in at a different point within the frame, so that the camera unit could line up the cameras in a better position. We were able to do some basic animation on the fly to reposition objects in a matter of minutes, which helped to confirm the camera positions very quickly and easily.” 

While the on-set visual element of the Ncam system provided significant benefits to the production process, there was another major advantage that Bredow felt was just as important. “When we walked off the set for the day, we not only had better, more informed and more intuitive camera moves that felt more in the style of the rest of the sequence — we also had our postvis already done,” he says. “With a bluescreen production, you would traditionally do the shoot and then later composite in all those digital backgrounds so that editorial has media to cut with. In this case, we left the set having already built all that postvis work, which gave the editors much more coherent material. Imagine having to edit this 10-minute action sequence, with a bunch of people on the tops of trains; it would have been so much harder for editorial to put all that together if they didn’t have all those backgrounds. It really helped us create the sequence on an aggressive schedule and put it together efficiently.”

Ncam’s augmented reality system provided many benefits on this production, but Bredow had initially been hesitant about employing this real-time technology due to perceived obstructions to the shooting process, a perception that he was happy to have disproved.

“I’ve tried to use realtime technology on-set before and it hasn’t always been this successful. You don’t want a lot of extra people or hardware, or any complexity that slows down your shooting process — but with this system we only had one extra person assisting on the camera and Hugh operating on the box, and we really didn’t have to slow down production very much at all to get a great deal of extra advantage,” he says. 

However, Bredow knew that he would need the buy-in of the camera department to make it work, in particular director of photography Bradford Young. Before principal photography, Bredow took Young to visit the set of another Disney film, The Nutcracker and the Four Realms, which was already benefiting from Ncam’s technology.

“The DoP on that film really liked the system – for him, it was a very important part of his workflow and not at all intrusive,” says Bredow. “Bradford quickly became an enthusiastic supporter and helped to get the camera department behind putting a little extra hardware on the camera, and we also worked hard to integrate Hugh in with the camera department. With everyone working as one team, the workflow became very smooth, which was critical to making this successful and seamless on-set.”

The finished train heist scene met all its creative objectives, and Bredow gives major credit to Ncam’s contribution. 

“I think this sequence in particular was improved by our choice to use real-time technology on set,” he says. “Thanks to the Ncam system, we could get a sense of the whole context of the shot, and we got a wider variety of more interesting shots. It allowed us to be more intuitive with the kind of camera coverage we could get, and hopefully that helps the sequence feel more realistic to the audience when they watch the completed piece.”