Visualization steps up its game
Issue: May/June 2019

Post recently caught up with Eric Carney, founder and visualization/visual effects supervisor at The Third Floor about the studio’s work on Game of Thrones, Avengers: Endgame, Spider-Man: Far From Home and on advancing next-level content-making through previs, virtual production and immersive planning. 

What are some of the major technical advancements in visualization in the last 10 years?

“Ten years is a long time! I think I got my first iPhone 10 years ago… We’ve seen a lot of technical advancements since then!

“Virtual camera technology has come a long way. The first VCam we used had sensors hard-mounted in one room and cost $30,000. Today we can do the same thing on a hand-held tablet in any room.

“Faster, more cost-effective Lidar and photogrammetry have had a big impact on what we do with previs and in technical planning. Now we can quite often have very accurate scans of a set or location so that we are creating the previs with all of that in mind, and our ‘techvis’ can be down to the inch because we know exactly where things are.

“Camera tracking technology has advanced as well, leading to an expansion in on-set simulcam. It used to be that you needed to set up an optical motion capture volume to do camera tracking, or you were limited to oversized motion-control cranes or encoders. Now there are through-the-lens solutions that can be quickly bolted to any camera, with setup that is quicker and more streamlined.

“Virtual reality has become a major tool in the last two years. We can bring our environments into VR and the director and other creatives can do a virtual scout to explore how they want to cover a scene without being at the location or before the set is built. You can explore a scene with VCam but with VR you really feel like you’re in the location. It’s impossible to describe—you just need to see it.

“The biggest advancement by far is the use of real-time rendering platforms. Tools like the Unreal Engine have the potential to really transform a lot of what we are doing. We’re able to iterate more quickly, at higher visual quality and with more complexity, and it allows for greater collaboration across departments. It’s becoming a big part of what we do — from previs and postvis to virtual production, virtual reality and augmented reality.”

What do you think are the most exciting new tech developments that will improve the content visualization process in the near future?

“Real-time rendering using game engines is certainly very exciting right now. It is allowing things that previously seemed impossible or impractical to become commonplace. It’s probably the biggest advancement we’ve had since we started doing previs. It will give directors a better way to interact and engage with the previs, making it more of an active process for them. It will also allow us to offer new, customized visualization tools to the other key creatives on the show.

“Another exciting new technology is the use of AI. We’re just starting to scratch the surface but the possibilities look fantastic. You’ll always want to have bespoke assets and animation for the hero work, but for background elements we’ll be able to utilize AI to build out the rest of the forest or city or populate that city with people that can react to what’s going on.”

Where do classic previs and the process of working out shots with creators fit or feed into the future visualization workflow? How is previs evolving through key tech?

“You can do some really exciting things with mocap and VCam. You can have actors perform a scene in mocap suits and then shoot it with a VCam and have five minutes of previs before lunch. That idea isn’t new; we’ve been doing it for several years now. It’s just that now the tools are better, faster and easier to use and the experience is better and more collaborative for everyone.

“It won’t be a process for every scene or creator, and artists visualizing shots at the computer will continue to be a big part of the workflow for the foreseeable future. One exciting innovation in that workflow is that the director can walk up to any artist’s workstation, grab a tablet and start interacting with the shot or scene. They can use that device as a virtual camera and quickly shoot as many shots as they want on the fly, without a big setup.”

Tell us about visualization techniques and technologies The Third Floor helped develop and use for the final season of Game of Thrones.

“In work for Seasons 3 and 4, we were doing traditional previs for selected scenes, either for story purposes or because there was a discrete technical challenge that needed to be studied. As seasons progressed, the previs process really became a hub for the show to shape ideas and produce technical playbooks with input from all key departments.

“For the final season, we visualized 2,000 shots and produced thousands of technical diagrams that informed everything from fire strafes to crowd tiles to drone plates to dragon rides. We based the previs on Lidar scans or photogrammetry accurate to the sets and locations, which was greatly beneficial. The Lidar-based previs meant anything designed in previs reflected the real world, and that technical diagrams could come out of the previs to precisely inform camera positions, actor positions, distances, props, greenscreen placements and hundreds of other shooting details.

“This also meant we could take our previs blocking, as well as fleshed-out ‘pre-animation’ files from final VFX vendors like Pixomondo and Image Engine, and use it to drive specialized production rigs — dragon motion bucks, flamethrower-equipped cranes, suspended cablecams — to capture real elements in sync with CG. Our team from The Third Floor on the ground in Belfast grew to include dedicated techvis artists, virtual production supervisors, supervisors to shoot and solve motion control and element shoots and, in Season 8, a special technical lead focused on virtual scouting.

“One of the tenets was believability with the imagery. We worked with visual effects supervisor Joe Bauer and visual effects producer Steve Kullback continually across multiple seasons on methodologies to shoot dragon flight and fire elements for real, supporting that with virtual production to have those effects match what the dragons would be doing in CG. With Jon Snow’s first dragon ride, an aerial dragon fight, wights in Winterfell hanging from Drogon’s tail and Daenerys doing strafe runs throughout King’s Landing, we had a lot of new challenges in Season 8 that saw some pretty interesting combinations of equipment, motion control and real-time tech. 
 
“An example is when Drogon attacks Euron Greyjoy’s ship in Episode 5. We produced a moving eyeline using tracked LEDs that gave the actors a view of the CG dragon to react to and had a simulcam composite as the crane filmed the shot. Later, in collaboration with Sam Conway and the special effects team, we delivered third-scale fire as 1:1 elements for plates already shot, piping the dragon performances seamlessly into the pyrotechnic rig.

“In collaboration with the team, on the season we also flew practical flamethrower elements from techvis data via Spydercam, using a Libra head mount to control pan and tilt of the dragon’s throat. We used NCAM to provide real-time overlays in shooting Episode 3 plates for the giant, Crum, attacking Winterfell. And we had an NCAM system bringing pre-animation of Drogon live into the Episode 6 Throne Room set on a display that was viewable to the directors and crew. 
“Season 8 saw some of the sets — the Red Keep Plaza, Arya’s Ship, the Throne Room — being scouted virtually, using created 3D assets, to study spatial relationships and frame-ups ahead of time. The Episode 6 director of photography used a tool we built and ran, called Pathfinder, to interact with the sets and make photoboards while immersed in the scene via a head-mounted display or with animated virtual cameras.
 
“On Game of Thrones, the goal was to do as much in-camera as possible. We filmed hundreds of shots of Emilia Clarke on the dragon base in motion control, and that’s why the imagery looks so real: it is as real as it gets without actually having dragons and castles! Needing to plan for complex shots with real-world locations, sets and actors as well as account for characters, creatures and effects in CG led us to push the possibilities of both the real equipment and the virtual toolsets, and all of the intersections in between.”

How does visualization relate to virtual production? How does The Third Floor work within the virtual production space?

“Done right, visualization can be directly connected to virtual production, which is really an umbrella term to describe many different tools and techniques for both production and pre-production. For example, we use a lot of those same tools in our visualization work.

“At The Third Floor, we’re working hard to make sure what we do in pre-production visualization flows through to set or into virtual production, and from there into finals work. We work in many virtual production areas ourselves — VCam, simulcam, motion capture, motion control, techvis, VR, AR — and we also work with visual effects companies to support them.
 
“In some cases we might be providing the entire ‘virtual production’ solution, and in others we’re working closely with a large team of vendors or collaborators to support the production’s virtual production workflow. Every production and show is different, so that’s how we approach the work. We’re not trying to sell a ‘one-size-fits-all’ approach but rather trying to understand each project’s unique needs and working style and developing solutions that fit them.”

What do newer techniques and processes for visualization mean for other creative and technical areas around a production?

“Visualization will increasingly become a tool for all departments and we’ll be able to create new apps and tools to meet the unique needs of many collaborators. They’ll be able to interact with the visualization on tablets or laptops in ways that are easy and make sense for them. The tools will positively impact and accelerate the work they are doing and help validate how it fits into the bigger picture.

“A lot of the nuts and bolts of production come down to planning and information-sharing, along with logistics management. Right now that’s done in separate ways that are often isolated or not very visual. There are a lot of PDFs in the production world, and it can be a challenge to know what all departments are doing and, for example, how one thing might impact something else. We want to build a collaborative visualization platform that everyone can use to visualize what is going on.

“Everything done in the different department apps, in the production department and by the visualization team will feed into this. Then the platform will be able to display information for each day of shooting and each location to show what’s needed or where things need to go. You’ll be able to see that in an interactive visualization environment on a device or laptop.

“For visual effects, it’ll mean we’ll have that much more informed planning and targeting for what to shoot and hopefully get more ‘in the can.’

“Visualization teams will be able to provide final visual effects facilities with assets and animation that are closer to the desired effect or performance.  There will be less wastage and ‘re-doing’ of work so there’ll be a better starting point for getting to the final shot and hopefully a faster process with fewer revisions.”

How will virtual production influence future television and OTT?

“Virtual production can help bring down the cost of high-concept content no matter the platform. If you are trying to create fantastical environments and characters that do not exist, then virtual production is a fantastic tool to do that in an environment that more closely resembles traditional filmmaking. I think this is really exciting for television and over-the-top content, where virtual production can help validate what’s being done more quickly and save on costly re-shoots, which are not typical in TV work.  With virtual production it can be possible to achieve visuals and tell stories that previously may have only been the domain of high-priced feature films.”