VFX: Apple TV+'s Emmy-nominated Five Days at Memorial
August 23, 2023


Based on actual events and adapted from the book by Pulitzer Prize-winning journalist Sheri Fink, Apple TV+’s Five Days at Memorial chronicles the impact of Hurricane Katrina and its aftermath on a local hospital. With high floodwaters, intense heat and a lack of power, exhausted caregivers at a New Orleans hospital were forced to make decisions that would ultimately follow them for years to come.

“Day 2” of the eight-episode drama is nominated for an Emmy for Outstanding Special Visual Effects In A Single Episode. Eric Durst and Matthew Whelan worked as VFX supervisors on the show and shared insight with Post on the series’ visual effects challenges.



What were the needs of the series?

Durst/Whelan: “Hurricane Katrina and the flooding of New Orleans was such a massive event that, from the beginning, there was great interest from Carlton Cuse and John Ridley, the show’s EP writers/directors, in quickly wrapping everyone’s heads around how to visualize this immensely dense and large-scale world.

“With this in mind, a great deal of research took place. This enabled all of us to immerse ourselves in and truly understand the events, and, importantly, to create a workflow that let us show those events with true, real-world authenticity.

“Sheri Fink, whose original article for The New York Times won the 2010 Pulitzer Prize for Investigative Reporting and became the basis for her book Five Days at Memorial, was extremely important in this process. Sheri gave us access to her vast libraries of visual reference, which took years to gather and helped enable us to achieve this task.”



Can you talk about the city’s hospital and houses?

Durst/Whelan: “It was clear right away that we needed to create a lot of the city. To do this, we would need a vast number of assets — 3D models of the hospital and all the houses and buildings in the various neighborhoods being portrayed in the series. We hired Crafty Apes out of Baton Rouge to send a LiDAR and drone/photogrammetry team to New Orleans to scan the structures at the locations we had identified. We scanned the hospital and the surrounding neighborhoods, as well as 27 individual houses. Because the architecture in New Orleans is so unique, these 27 houses were chosen because they were similar, yet different enough, to give us the ability to multiply these structures and build out vast neighborhoods if called on.”

This was a well-covered recent event, so were you able to use stock footage at all?

Durst/Whelan: “There was a great deal of archival and news footage showing the before, during and after moments of Hurricane Katrina, along with footage of the flooded streets, bodies, and people on roofs waiting to be rescued in the aftermath of the flooding of the city. However, there was no photography or visual documentation showing the most iconic events of the tragedy at the moments they occurred.

“The roof of the Superdome being ripped off and the levee breach in the Lower 9th Ward were two of the best known. These were moments that we wanted to see, as they were pivotal to the story. To do this, we took a forensic approach, deconstructing and then rebuilding these events with the knowledge gained from our research, so they could be seen accurately, and for the first time.”



How were you able to use Google Earth?

Durst/Whelan: “Google Earth has a wonderful feature called Time Machine, which enables you to go to a location and step back in time through satellite imagery taken in the past. This let us travel to our Lower 9th Ward location and see what it looked like the week before Hurricane Katrina hit and the day after the levees breached, destroying the neighborhood with such ferocity it looked like a massive bomb had exploded. We could see exactly how the levee walls broke, and the exact structures that were taken away by the incoming water.

“This process greatly helped all of us understand both the magnitude and the physics of what had actually occurred. The visual imagery from the exact moments we needed showed us, in clear detail, how to re-create these events as authentically as possible.”

Can you explain your development process?

Durst/Whelan: “With our research as guidance, we set out to visualize the scenes as quickly and clearly as we could. Traditionally, this is done with storyboards or previs, where the directors give a brief and the previs team comes back with their interpretation of the directors’ instructions. This time, though, we wanted the experience to be more realtime, as if we were all on the set together, blocking out the action and camera angles as a group, as if it were a live event. To teleport us into a future where the set had already been built, and to place everyone on the ground so we could work with actors and cameras as if they were there too, we put everything in Unreal Engine, which enabled us to work in realtime. This gave the directors the most flexibility to find their shots.



“As we were still under COVID protocols, these meetings were remote, with everyone on Zoom and on shared screens. However, in many ways, this actually helped, because we were all seeing the exact same thing at the same time. Another advantage was that we could record all these sessions, which helped us go back and review notes, as this process went at lightning speed.

“The approach became very comfortable for both Carlton and John, because we could adjust in realtime to what they were thinking, which helped them find their shots and become comfortable with the sets and locations far in advance of set construction.”

What were some of the more involved sequences?

Durst/Whelan: “The largest sequences using this process of pre-visualizing in Unreal were the scenes on the helipad and the ER ramp, where the main rescues took place. A foundational part of giving the directors the ability to quickly block out both the action and the camera angles was having the virtual world reflect the right dimensions and accurate camera lenses, so everyone could clearly understand, as accurately as possible, how it would all look. This was more important than having a visually polished frame; as long as we knew what was going on and where the cameras were placed, that was all that mattered. It was an efficient way to work and enabled us to move smoothly through all the work that had to be achieved.

“This work with the directors also helped inform both Matthew Davies, the production designer, and John MacGillivray, the special effects coordinator, as to what they had to build and prepare for. We were housing the ER ramp in a million-gallon water tank that was 280’x130’, and this pre-planning was pivotal to avoiding any miscues, which the schedule and budget had no allowance for.”
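
As a rough illustration of the lens-matching idea described above, here is a minimal sketch (in Python, not from the production) of how a previs camera’s horizontal field of view can be derived from a real lens’s focal length and sensor width; the 27mm lens and the sensor width are hypothetical example values.

import math

def horizontal_fov_degrees(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view for a simple pinhole-camera model."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Hypothetical example: a 27mm lens on a roughly 24.9mm-wide (Super 35-style) sensor.
print(f"{horizontal_fov_degrees(27.0, 24.9):.1f} degrees")  # about 49.5 degrees

Matching values like these, along with real-world set dimensions, is what allows camera angles blocked out in a game engine to translate reliably to the physical shoot.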



What gear were you using?

Durst/Whelan: “The placement of blue screens on both the helipad and tank sets was important to get complete coverage of any foreground people and objects. We were shooting exteriors for these scenes, and as the sun moved throughout the day, it would change the lighting on the screens. So, we needed a solution that was flexible enough to maintain both coverage and the right lighting on the screens at all times.

“Ramsey Nickell, our DP, had used a technique of placing 20’x30’ blue screens on telehandlers. This gave us the ability to quickly move a large number of blue screens to any area of the set within minutes, moving the screens into place after the camera had been set in its final position. Because the telehandlers could tilt and rotate the screens, we could adjust the blue screens to follow the sun and keep their illumination in the optimum range.

“On set, we used standard grey ball and chrome balls for most lighting setups, with bracketed Theta Z1 HDRIs as backups. We also built an array of eight GoPros, which were placed into both the helicopter and tank sets to allow for triangulation in case in-camera tracking failed. All the configurations of the tank set were LiDARed to allow for object tracking. Our key grip, John Tennant, and his team built cement filled buckets with rebars topped in tennis balls, so we had locked tracking markers in the tank.”



What does this Emmy nomination mean to you?

Durst: “I’m glad that people connected with and liked the work. I love transparent effects, because they are quietly and invisibly present, something people don’t notice until you take them away, and then the audience is astounded. I get a kick out of that, and I feel the work in this show was at the highest bar, so I’m glad the TV Academy voters in the Visual Effects peer group, who are truly our peers, recognized the work. This is my second nomination.”

Whelan: “I think the work in Five Days shows that VFX don’t have to be in your face. It’s great to feel that our VFX serviced the story. It was a huge weight to tell this story in a truthful and respectful way — it’s a story where nearly 1,400 Americans lost their lives. I feel that the VFX team did a really great job, and it’s nice to hear that audiences felt the VFX were invisible, that they helped them feel immersed in the story and heightened their pathos. I think that’s the power of amazing SFX and VFX, and I’m really proud of the work.”