Storage for VFX
Issue: July 1, 2012

As visual effects shots continue to grow in complexity, and more and more are created for stereo 3D content, the demands on storage systems increase. Fortunately, there’s no shortage of storage solutions on the market to fit every need and every size facility — from single-site dynamos to global entities.

MR WONDERFUL

Mr. Wonderful, the design, animation and visual effects division of New York City’s Northern Lights editorial, shared storage with its parent company until increases in business and project file sizes required a storage solution dedicated to VFX. The engineering staff reviewed several options and then settled on a Rorke Data HyperDrive File Level SAN, which uses the Galaxy Aurora RAID family and HyperFS SAN file system. Its file-based administrative architecture was deemed a plus.
“We weren’t locked into partitions as we were with Avid Unity and Facilis TerraBlock; we could expand and contract the size of a job’s storage at will,” says Damien Henderson, executive producer at Mr. Wonderful (www.mrwonderful.tv). “And we could administer the server on our own. Rorke was great helping us develop different protocols — Macs, PCs, Linux, Discreet — to connect to the server; it’s a good solution for a heterogeneous network like ours.”
The 40TB HyperDrive is not quite a year old. “We went for a larger buy at the outset and got close to filling it up on the CBS Upfront project,” Henderson reports. “We archive to LTO-5 tape for long-term storage.”
The bulk of Mr. Wonderful’s work is for television networks, with commercials comprising the balance. “We approach everything from a creative and design perspective,” he says. “Our creative, or the creative we do in collaboration with the client, dictates our VFX and CG.”
The recent CBS Upfront saw the network returning to Carnegie Hall, a popular venue for spotlighting its fall programming line-up. “It was a traditional presentation with entertainment value,” Henderson notes. “The half-hour prior to the presentation was eye candy, focused around the characters on the shows. We had a lot of fun animating strips of photo booth-style pictures selected and edited by the CBS creative team. Then a live DJ came on, with animations projected over the entire proscenium, to give a rave-techno vibe up to zero hour.” 
“Carnegie Hall has a deep stage area for concerts, so we worked with the marketing division of the network, which built huge screens and scenic to cover the interior of the stage, leaving the beautiful proscenium,” says Henderson. “Then we created a trompe l’oeil architectural effect for the interior that was projected back onto the stage, with the CBS eye logo worked into the architectural design.”
He notes that highly detailed reference photos of the famous hall’s gingerbread architecture helped Mr. Wonderful animators repaint and retexture the interior elements, working the iconic eye logo into the décor. This enabled CBS Corporation “to brand Carnegie Hall for CBS,” he explains.
Mr. Wonderful used Maxon Cinema 4D and Adobe After Effects to create about 90 minutes of content at 4320x2700 resolution. WorldStage, the new brand for Scharff Weisberg and Video Applications, distributed the content from 22 Dataton Watchout servers, supplied by Corporate Imaging, with three Christie Vista Spyder X20s switching. WorldStage also furnished nine sets of double-stacked Christie projectors, which projected all the elements except the tape rolls; those came from the broadcast truck parked outside.
“It was the biggest job we’ve put through [the storage] to date,” says Henderson. “The system worked great. We had 15-20TB with all the data uncompressed before it hit Watchout.”
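Those numbers are easy to sanity-check with back-of-the-envelope math. In the sketch below, the resolution and running time come from the article, while the frame rate, bit depth and layer count are assumptions for illustration:
```python
# Rough estimate of uncompressed storage for 90 minutes of 4320x2700
# content (figures from the article). Frame rate, bit depth and layer
# count are assumptions, not reported by Mr. Wonderful.

WIDTH, HEIGHT = 4320, 2700      # reported resolution
MINUTES = 90                    # reported running time
FPS = 30                        # assumed frame rate
BYTES_PER_PIXEL = 3             # assumed 8-bit RGB

frames = MINUTES * 60 * FPS
total_bytes = frames * WIDTH * HEIGHT * BYTES_PER_PIXEL
print(f"One 8-bit RGB pass: {total_bytes / 1e12:.1f} TB")    # ~5.7 TB

# Multiple element layers and higher bit depths multiply that figure;
# three uncompressed layers already land in the reported 15-20TB range.
print(f"Three layers: {3 * total_bytes / 1e12:.1f} TB")      # ~17.0 TB
```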

COSA VFX

Television and film visual effects studio CoSA VFX (www.cosavfx.com) finished a busy broadcast season with shots for Alcatraz, Person of Interest, Fringe, Pan Am and J.J. Abrams’ pilot, Revolution, directed by Jon Favreau. The Toluca Lake, CA, company is gearing up for new seasons of Fringe and Person of Interest, and the launch of Revolution.
While visual effects storage for features extends over a long period of time, episodic television “balloons in a week, then cascades off in a week and a half,” notes David Beedon, one of the CoSA VFX partners. “It goes up and down violently.”
To handle such fluctuations, the company invested in Promise Technology’s Pegasus R4 4TB RAID and LaCie’s 4big Quadra 4TB RAID, which are coupled with Mac servers and configured as mirrored pairs for redundancy and safety reasons, Beedon explains. The RAIDs, attached to the Mac towers, create backups nightly.
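A nightly mirror like that can be driven by something as simple as a scheduled rsync. The sketch below illustrates the mirrored-pair idea only; the volume paths and pairings are hypothetical, not CoSA’s actual configuration:
```python
#!/usr/bin/env python3
"""Nightly mirror of primary RAID volumes to their backup partners.
A generic sketch of the mirrored-pair idea; the volume paths are
hypothetical examples, not CoSA VFX's actual setup."""
import subprocess

# Hypothetical primary -> mirror pairings, one volume per show.
PAIRS = {
    "/Volumes/Revolution_A": "/Volumes/Revolution_B",
    "/Volumes/Fringe_A": "/Volumes/Fringe_B",
    "/Volumes/PersonOfInterest_A": "/Volumes/PersonOfInterest_B",
}

for primary, mirror in PAIRS.items():
    # -a preserves permissions and timestamps; --delete keeps the
    # mirror an exact copy of the primary.
    subprocess.run(
        ["rsync", "-a", "--delete", primary + "/", mirror + "/"],
        check=True,
    )
```
Run from cron or launchd, a script along these lines yields the nightly backups Beedon describes.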
“For a company of our size, we put out a lot of VFX shots really fast; we delivered 1,500 shots last season with 15-16 artists. We had enough projects to necessitate splitting out Revolution on one RAID, Fringe on another and Person of Interest on a third so as not to have a bottleneck.”
CoSA VFX has been using LaCie RAIDs since its inception. “We wanted something that would work right out of the box and be economical enough to meet our needs,” says Beedon. “We added the Pegasus early last season.” Artists use Luxology’s Modo for 3D animation, The Foundry’s Nuke and Adobe After Effects for compositing and finishing, and Imagineer’s Mocha and Andersson Technologies’ SynthEyes for tracking.
When the Revolution pilot came in, the company had just enough high-speed storage capacity on hand to handle its nearly 100 visual effects shots plus a regular slate of work. “The reality of TV is that schedules become truncated and artists have to hit the RAIDs hard; the RAIDs have to work flawlessly,” says Beedon. 
For Revolution’s look at a world that has lost its electrical grid and gone electronically black, CoSA VFX created matte paintings, set extensions, clean-ups and wire removals, a CG Earth and some de-aging effects. The pilot arrived in-house as the company produced 30-40 shots for another pilot and was still working on Fringe and Person of Interest.
“We found our storage was filling up fast, in part because we were probably saving more renders than necessary,” Beedon reports. “We offloaded prior episodes of Person of Interest and Fringe to free up space. Once a show airs we archive the episode and move it off the RAID, but Revolution had no official air date that we knew of, so we kept the whole show online along with upcoming episodes of Fringe and Person of Interest. That brought us pretty close to capacity.”
But as business continues to increase, “we can just keep adding RAIDs as needed and keep plugging and playing,” he forecasts.

SCREEN SCENE

Twenty-seven-year-old Screen Scene (www.screenscene.ie) in Dublin offers comprehensive picture and sound post production and a specialist VFX area under one roof. Picture infrastructure includes offline editing and Avid DS, Avid Symphony and Autodesk Flame finishing rooms; sound is edited and mixed in its 12-room audio infrastructure. Screen Scene services the feature, commercial and broadcast sectors.
Screen Scene’s storage needs are obviously legion. About four years ago the online department moved from an Avid Fibre Channel Unity to a Facilis TerraBlock solution, putting the company on the path for additional TerraBlock systems. Today Screen Scene sports a 36TB TerraBlock 24EX for online storage, a 12TB TerraBlock 24EX for offline, and another 12TB 24EX for visual effects.
The single-platform solution makes it easy to “coexist and share media among departments and removes any need to copy or duplicate media,” says chief engineer Greg Tully. “We’re all interconnected via Fibre or Ethernet.”
TerraBlock enables Fibre and network-attached clients in the VFX department to access the same media at the same time using Facilis Multi-user Write volumes, a capability that “is quite unique,” says Tully. “We can even hook in the Fibre client and do screenings and share files among the online and DI departments without any duplication of media.”
TerraBlock has been serving the rapidly growing VFX department well for the last two years. Screen Scene’s credits cover a broad range of work, from clean-ups and set extensions to CG crowds, destruction and pyro. Recent projects include previs for A Good Day to Die Hard, effects for season one of Game of Thrones, VFX for the acclaimed feature Albert Nobbs, as well as the UK series Skins, Ripper Street, Coup and Loving Miss Hatto.
“TerraBlock acts as our main visual effects storage and is accessed via Fibre on Nuke [workstations] that require realtime playback and the standard Ethernet network for 3D and 2D workstations,” says Tully. “The process of reading rushes, working project files and rendering all flows through TerraBlock.”
The previs for A Good Day to Die Hard took five months, he notes, with eight artists “churning out 3D sequences of potential effects scenes for the upcoming movie.” All the post and VFX for Game of Thrones’ first season were facilitated by TerraBlock during heavy compositing and rendering sessions.
Screen Scene’s VFX department is continuing to expand, and the seat and render node count is getting higher and higher. “We continually re-assess our storage needs and solutions,” says Tully. “We may go with more Facilis expansion units or another storage solution. These days it’s about who is delivering optimum price/performance/reliability; right now we are very happy with the TerraBlock.”

CRAZY HORSE EFFECTS

When Crazy Horse Effects (www.crazyhorseeffects.com) in Venice, CA, considered VFX storage for its all-Mac workflow some four years ago, it discovered Active Storage, founded by the team that built Apple’s Xserve RAID. Who better to understand the Mac?
“Active was a young company at the time, and we invested in a 16TB unit to get us going with four or five artists,” recalls Brian Sales, compositing supervisor and IT lead. “Now we have 25 people on two RAID sets: our original ActiveRAID 16TB system and the 32TB ActiveRAID we added.”
VFX supervisor/co-owner Paul Graff notes that Crazy Horse Effects (CHE) had non-centralized storage at the outset. “All the workstations were connected together and shots were distributed over the network. That works for a small company — it allowed us to keep everything local on a hard drive. But as we grew it just didn’t work any more for us.”
Sales says the Active Storage solutions have “been rock solid.” Graff notes that the busy company has already stepped up from its initial 1TB drives to 2TB and 3TB drives, as much as tripling volume.
CHE, which does VFX for television and feature films, won Emmys for HBO’s Boardwalk Empire and John Adams, and completed over 300 shots for the film Water for Elephants. It was this last film that prompted the acquisition of the 32TB RAID. More recently, CHE supervised/produced and completed several hundred shots for Oliver Stone’s Savages and contributed work to Ang Lee’s Life of Pi.
“We created a few set extensions, many wounds and blood hits, and video chat for Savages,” says Graff. “With our Red and Red Epic cameras we can shoot live-action elements ourselves. There was a night highway sequence where a shot through a mirror had been focused incorrectly. So we mounted a camera on the roof of a car, drove the car and replicated everything outside the car’s front window with the correct focus.”
By mounting a Canon EOS 5D or 7D camera with a fish-eye lens on a laptop during 1st unit photography, CHE was also able to capture some surprising footage. They intended to gather content to use, in an undistorted mode, for video playback during videoconferencing sequences, but director Stone loved the super close-up fish-eye shots so much that he ended up using much of CHE’s photography.
CHE also crafted “some gorgeous Indian environments” for Life of Pi, says Graff. “For that film we did everything from all-CG wide establishing shots to adding backgrounds or foregrounds. Those shots were stereo environments done in After Effects and Nuke.” Working in stereo 3D ups storage requirements because it “doubles scans and produces a lot of extra renders,” Sales reminds us.
Graff explains that a film is likely to stay online on the RAIDs for a few months, so he’s always “pushing for more space. We moved to 2TB and 3TB drives, but the bottom line is you can never have too much storage. It can get really tight with space.”

METHOD

With facilities in Los Angeles, New York, Sydney, London and Vancouver working on film effects, CG features, and effects and CG for commercials, Method Studios (www.methodstudios.com) has opted for an individualized approach to visual effects storage rather than a one-size-fits-all solution.
Currently, London, Vancouver and New York use Isilon storage; Sydney has a Panasas system; and LA uses both EMC’s Isilon and Hitachi Data Systems’ BlueArc storage.
“We’re happy with each facility using the technology that works best for them,” says Method’s VP of technology Paul Ryan. The choice of solutions not only depends on the work and the workflow but also is “driven by the support for the system in each region and the relationship of the facilities with their local vendors,” he notes.
Neither Isilon nor Panasas required a huge initial investment, and both can “grow in quite manageable chunks,” Ryan says. “They really shine with their ability to grow capacity or bandwidth independently.” Panasas gives Method “some exposure to parallel NFS. We’re interested in seeing how it performs in a VFX environment and have been very happy with the test results from our Sydney facility, which is building out its infrastructure.”
He explains that the emerging parallel NFS standard makes it possible to build more scalable, higher-throughput storage networks; the data bandwidth and metadata bandwidth can expand independently as needed. “Panasas has really spearheaded this storage system architecture. Other vendors are working toward employing it, but Panasas has been there for quite a few years already.”
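A minimal model of that flow, with illustrative server names and stripe sizes (this is a conceptual sketch of the pNFS idea, not any vendor’s implementation):
```python
# Minimal model of the pNFS flow: a metadata server hands the client a
# striping layout, and the client then moves file data directly to and
# from the data servers in parallel. Names and sizes are illustrative.

STRIPE_SIZE = 1 << 20                        # 1 MiB stripes (assumed)
DATA_SERVERS = ["ds01", "ds02", "ds03", "ds04"]

def layout_for(file_size: int) -> list[tuple[str, int, int]]:
    """Metadata-server role: map each stripe of a file to a data server."""
    stripes, offset = [], 0
    while offset < file_size:
        server = DATA_SERVERS[(offset // STRIPE_SIZE) % len(DATA_SERVERS)]
        length = min(STRIPE_SIZE, file_size - offset)
        stripes.append((server, offset, length))
        offset += length
    return stripes

# Client role: with the layout in hand, I/O fans out across all the data
# servers at once. Adding data servers widens the data path; the metadata
# server stays out of the bulk transfer entirely.
for server, offset, length in layout_for(3 * STRIPE_SIZE + 512):
    print(f"read {length} bytes at offset {offset} from {server}")
```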
Method LA perhaps enjoys the most varied workflows of the group, from Flame-based commercial finishing to film and commercial visual effects, says the facility’s director of technology Olivier Ozoux. 
“Right now we have close to 350TB of storage between our Isilon, BlueArc and SANs,” Ozoux reports. “We split work between Isilon and BlueArc more for our convenience than for performance. Features are done with BlueArc storage and commercials with Isilon.”
Method LA’s biggest recent film VFX job was for Wrath of the Titans, which consumed 120+TB of BlueArc storage. “BlueArc’s two heads give us 80TB of online, high-speed storage and another 80TB of near-line storage,” says Ozoux. “High-speed online volumes transparently connect to slower near-line disks so, from a user perspective, it looks like files are sitting on a single piece of storage even though they are migrating from faster to slower disks. That allowed us to keep the Wrath data easily accessible to artists without resorting to archiving on tape.”
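The migration Ozoux describes happens inside the BlueArc filer, but the policy behind it is easy to picture. Here is a minimal sketch of the idea, with hypothetical paths and an assumed idle threshold; it is not Method’s actual configuration:
```python
# Sketch of transparent online/near-line tiering: files idle past a
# threshold move to slower disks, with a link left behind so the
# artist-facing path never changes. Paths and the 30-day threshold are
# hypothetical; the real migration is handled by the filer itself.
import os
import shutil
import time

ONLINE = "/mnt/online/wrath"         # hypothetical fast tier
NEARLINE = "/mnt/nearline/wrath"     # hypothetical near-line tier
MAX_IDLE = 30 * 24 * 3600            # demote after 30 idle days (assumed)

now = time.time()
for name in os.listdir(ONLINE):
    src = os.path.join(ONLINE, name)
    if os.path.islink(src) or not os.path.isfile(src):
        continue                     # already demoted, or not a plain file
    if now - os.path.getatime(src) > MAX_IDLE:
        dst = os.path.join(NEARLINE, name)
        shutil.move(src, dst)        # relocate the file to near-line
        os.symlink(dst, src)         # keep the original path resolvable
```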
The facility implements the same process for commercial visual effects on its Isilon storage. Artists use Autodesk’s Maya for character animation, Side Effects’ Houdini for effects and Pixar’s RenderMan and Chaos Group’s V-Ray for rendering. 
“We’re always reviewing our needs: Storage is sort of an arms race year after year,” says Ozoux. “You find out you always need 20 percent more than the amount you have: You want to do more complex things or the artist expands to take up all available resources. We try to manage that [latter] process, to give artists the tools they need and show them where they’re spending their data and at what cost.
“From a technical point of view, we’re very happy with both platforms. Looking forward, we are always looking for the best performing hardware and, ideally, it would be a single solution.” 

RHYTHM & HUES

With more than 1,400 employees in six locations (Los Angeles; Mumbai and Hyderabad, India; outside Kuala Lumpur, Malaysia; Vancouver; and Taiwan), Rhythm & Hues Studios (www.rhythm.com) boasts over 1PB of storage globally. In LA, the largest facility has about 300TB of Tier 1 storage in EMC’s Isilon IQ 10000 Series 28-node cluster and several 6000 Series clusters.
According to chief technology officer Gautham Krishnamurti, Rhythm & Hues has “gone through pretty much every solution out there,” including “home brewed” storage systems. “Typically, we like to have two different vendors, but two years ago we became solely Isilon. We’ve been very happy with Isilon’s performance and support. If we need to add more storage, we add another node to the cluster so it’s an incremental cost.”
Storage is broken into three tiers. Tier 1 Isilon storage is used for media currently in production, which needs to be accessed by artists and the renderfarm; it’s high-performance and scalable, and is broken down into job data and RLA data. Media is moved to slower Tier 2 near-line storage to free up space in Tier 1; Tier 2 content is not immediately in use or heavily accessed. Tier 3 far-line storage is used for long-term archiving.
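A toy version of that routing logic, with status flags inferred from the description above (an illustration only, not Rhythm & Hues’ pipeline code):
```python
# Illustration of the three-tier routing described above. The status
# flags and the mapping are inferred from the article, not taken from
# Rhythm & Hues' actual pipeline.
from enum import Enum

class Tier(Enum):
    TIER1 = "Isilon: high-performance, artist and renderfarm access"
    TIER2 = "near-line: online but not heavily accessed"
    TIER3 = "far-line: long-term archive"

def tier_for(in_production: bool, still_referenced: bool) -> Tier:
    if in_production:
        return Tier.TIER1       # current job data and RLA data
    if still_referenced:
        return Tier.TIER2       # freed from Tier 1 but may be recalled
    return Tier.TIER3           # finished work headed for the archive

print(tier_for(in_production=False, still_referenced=True).value)
```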
Shows are typically split among the Rhythm & Hues facilities, with every location sharing assets. “Every facility is treated exactly the same and expected to do the same quality work,” says Krishnamurti. “We have a completely distributed workflow that’s pretty seamless.”
With a 24/7 global operation, artists need to be kept up to date constantly. “We proactively move assets worldwide,” says Krishnamurti. “When a new version of an asset is generated and a user at another facility is using an older generation of that asset, the new version proactively migrates so it’s at that location if they need it.”
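In pseudocode terms, that propagation rule might look like the sketch below; the site list, version registry and push() helper are all hypothetical stand-ins for whatever Rhythm & Hues actually runs:
```python
# Sketch of proactive asset propagation: when a new version is published,
# it is pushed ahead of need to every site holding an older version.
# Site names, the registry and push() are hypothetical stand-ins.

# site -> asset version currently held there (hypothetical state)
held = {"LA": 12, "Mumbai": 11, "Vancouver": 12, "Taiwan": 11}

def push(asset: str, version: int, site: str) -> None:
    """Stand-in for the actual wide-area transfer mechanism."""
    print(f"transferring {asset} v{version} to {site}")

def publish(asset: str, new_version: int) -> None:
    # Only sites already using the asset receive the update, so the new
    # version is local by the time an artist there reaches for it.
    for site, version in held.items():
        if version < new_version:
            push(asset, new_version, site)
            held[site] = new_version

publish("tiger_rig", 13)
```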
For the upcoming Life of Pi, animators and lighters in all the Rhythm & Hues facilities worked on the film’s tiger, orangutan, zebra, hyena and whales, and did a lot of digital water work. “You’d be hard pressed to tell what was rendered where,” Krishnamurti says. The film is the first show to make extensive use of the new data center in Taiwan, which has a large number of render nodes. “We have more compute power in Taiwan than in the rest of our facilities combined,” he notes.
Life of Pi was done in stereo 3D, which increases storage needs, he points out. “Our storage needs are up significantly in the last two years. We increased our compute power about 16-fold and our storage about 3.5-fold. It’s just amazing what people can do with the technology, but the side effect is that the amount of storage required is staggering. In the effects water simulations for Life of Pi, a single sim file took up 3TB of storage.”
He says employees frequently quip that he should run out to a local store and buy additional 3TB drives for $150. “Sure you can buy cheap,” says Krishnamurti. “But you need cheap, scalable and high-performance, and you get to pick two.”
On a more serious note he says Rhythm & Hues is “always looking for cheaper and more viable solutions. We’re always R&D’ing storage; it would be nice to have more options. We’re also looking at cloud solutions to see where the future is going to be.”