STORAGE FOR VFX
With post houses and studios handling massive amounts of visual effects data for projects ranging from high definition spots to 2K and 4K feature films, demands run high for storage solutions that offer fast access, creative collaboration and reliable service.
ZOIC STUDIOS
Visual effects house Zoic Studios (www.zoicstudios.com) maintains facilities in Los Angeles and Vancouver, both of which rely on Isilon clusters for extensive VFX storage and backup. The Los Angeles office alone boasts a 1,200-processor renderfarm and some 200 workstations.
Main production storage in LA consists of a 20-node Isilon 1920 cluster; Vancouver deploys a 10-node system. Both locations have Isilon 1200 series clusters (eight nodes in Vancouver, four in LA) for disaster recovery storage, and LA also has an older 9-node Isilon 4800 series cluster for reusable assets stored in the vault.
"We started seven years ago as a pretty small company with some direct-attached storage, then went to a Dell system with RAID arrays attached," recalls head of engineering Saker Klippsten. "Then we tested SGI CXFS, BlueArc and NetApp systems, which were pricey at the time. Their bang-for-the-buck didn't do it for us."
But a friend who owned a VFX house got a pre-NAB peek at an Isilon system, which piqued Klippsten's curiosity. He arranged for a demo of the alpha version of Isilon's 6-node cluster (which Zoic still uses for textures and photos). "We were amazed how easy it was to set up compared to other products," he says. "Racking it took the longest, but turning it on and joining the nodes into a cluster was amazingly fast: We only had to configure one machine and the rest picked it up 'automagically.'"
Zoic tends "to push storage to the edge" and "pounded on" the alpha version of the system for the feature film Serenity, which the studio was working on at the time. "We knew it was the way to go," Klippsten reports.
The company initially invested in a 4-node production cluster for series such as Buffy the Vampire Slayer and Angel, and now has a 20-node cluster for its roster of 20 shows, which includes the CSI franchise, Fox's Fringe, Sci Fi Channel's Eureka and ABC's new remake of the 1980s series V.
When Zoic opened its office in Vancouver about two-and-a-half years ago, it was a "no brainer to put an Isilon cluster up there," Klippsten notes. Aspera software enables the facilities to transfer data between the studios easily and at high speed. "We work on a lot of shows jointly and divide VFX shots, but the Isilon system allows us to feel like we're one company," he says.
Looking ahead, Klippsten believes Zoic can still grow with Isilon. The company's workstations run Windows XP64, which copies data to the cluster at 45-50MB per second; Zoic has tested Windows 7, which achieves 100MB/s. "So Isilon has a lot more power behind it," Klippsten observes.
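The gap between those two copy rates is easy to put in concrete terms. A quick back-of-the-envelope sketch (the 500GB dataset size is hypothetical; the rates are the ones quoted above):

```python
# Rough copy-time comparison between the XP64 and Windows 7 rates quoted
# above. The 500GB dataset size is an assumed example, not a Zoic figure.
def copy_time_minutes(dataset_gb: float, rate_mb_per_s: float) -> float:
    """Minutes to copy dataset_gb gigabytes at rate_mb_per_s MB/s."""
    return dataset_gb * 1024 / rate_mb_per_s / 60

xp64 = copy_time_minutes(500, 47.5)   # midpoint of the 45-50MB/s XP64 rate
win7 = copy_time_minutes(500, 100)    # the 100MB/s Windows 7 rate
print(f"XP64: {xp64:.0f} min, Windows 7: {win7:.0f} min")
```

At these rates, moving the same hypothetical 500GB to the cluster drops from roughly three hours to under an hour and a half.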
Zoic has also been testing Isilon's X series, whose newer hardware and faster processors and hard drives provide another potential upgrade path. "It's so much faster and allows us to play HD in realtime using their accelerator nodes," he says. "These nodes can independently grow the performance of the cluster's throughput. You used to need to increase storage to get more throughput, but now you can grow throughput independently."
RING OF FIRE
Santa Monica's Ring of Fire Studios (www.ringoffire.com) has opted for a NetApp FAS 3020 filer as "the workhorse for all our storage and server needs," says VFX senior producer Casey Conroy.
"We had a general file server, but what prompted us to get involved with NetApp was the groundbreaking 2004 film Sky Captain & the World of Tomorrow: The entire movie was shot on bluescreen and all the environments were CGI. With the schedule and volume of shots we had to do, we needed a very reliable file server solution to handle the throughput of 30, 40, 50 workstations and get all the renders done. And NetApp was rock solid from day one."
Once Ring of Fire wrapped Sky Captain the studio "rolled the FAS 3020 into our workflow of VFX for commercials, other features, episodic TV, special venue films and music videos," he reports. "It's still our file server backbone; we utilize it all the time. It's still very fast and continues to be extremely reliable with no drive failures."
The company has made various firmware upgrades but "no significant hardware upgrades," just "scaling capacity to get it up to another level," Conroy explains. "Nowadays nearly every spot has an HD finish, and when you're dealing with VFX, CGI and multilayered composites, data adds up quickly. Transcoding R3D (Red camera) files to 2K or HD resolution TIFF or DPX sequences means dealing with massive amounts of data. So you must have a very reliable file server with the capacity you require and quick access. You need a system like this to keep pace with what you're doing and keep the integrity of your data completely secure."
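The "massive amounts of data" Conroy describes can be quantified with simple arithmetic. A rough sketch of 2K DPX data rates, assuming a full-aperture 2048x1556 frame with 10-bit RGB packed into 4 bytes per pixel (the standard DPX packing; header overhead ignored):

```python
# Back-of-the-envelope data rates for a 10-bit 2K DPX sequence.
# Assumes full-aperture 2048x1556 frames and the usual DPX packing of
# 10-bit RGB into one 32-bit word (4 bytes) per pixel; headers ignored.
def dpx_rate(width=2048, height=1556, bytes_per_pixel=4, fps=24):
    frame_mb = width * height * bytes_per_pixel / 1e6
    return frame_mb, frame_mb * fps  # (MB per frame, MB per second)

frame_mb, mb_per_s = dpx_rate()
print(f"~{frame_mb:.1f} MB/frame, ~{mb_per_s:.0f} MB/s at 24fps")
```

At roughly 12.7MB per frame and 24fps, a single transcoded 2K sequence consumes on the order of 300MB per second of footage, which is why capacity and sustained throughput both matter.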
Ring of Fire recently worked with director James Lima on a four-minute animated piece that's part of Madonna's "Sticky & Sweet" tour. The combo of motion graphics and 2D and 3D animation is displayed on a big screen in slightly higher than HD resolution in an unusual aspect ratio. "We had multiple artists on multiple workstations using multiple applications all grabbing and rendering back to the NetApp server," Conroy recalls. "There was no significant slowdown when tons of people were accessing it, and we never had to worry about the data. The system is completely reliable."
Likewise, 2D and 3D artists worked concurrently on Nickelback's Gotta Be Somebody music video, which has been nominated for several Much Music Awards. Once again, the FAS 3020 proved to be "a system you could rely on for handling massive amounts of data."
So it's no wonder that Conroy calls the NetApp filer "an unsung hero of digital effects," which performs a "tireless, thankless job" toiling in the background. "It always works and works well, so you may not appreciate how awesome it is," he declares.
DOUBLE NEGATIVE
At London's feature film VFX house Double Negative (Dneg, www.dneg.com), seven BlueArc Titan 3200 servers (five with 27TB each and two with 45TB each) are configured in two clusters, which act as source and destination locations for rendering.
Prior to acquiring the Titans, Dneg used SCSI-attached RAIDs on Linux servers. Then the company migrated to NetApp filers. "When we reached the capacity/performance threshold of the NetApps we turned to a 2-node BlueArc Titan II cluster," recalls systems engineer Steve Lynn. "At the time, it was the only option that could give us the performance we needed at a competitive price. We've slowly added to this and were very early adopters of the Titan III servers, which have given us vast gains in throughput from our storage. This additional throughput allows us to render scenes faster and more reliably as our ever-increasing renderfarm accesses their disks."
All of Dneg's major features use the Titans for their primary storage, including The Dark Knight, The Boat That Rocked, Hellboy 2 and Angels and Demons.
For Angels and Demons, Dneg was charged with 258 shots, creating the Vatican's intricate and detailed Basilica and St. Peter's Square and populating them with CG crowds, and crafting a CG helicopter, parachute, exploding fountains and more.
The majority of Dneg artists on Angels and Demons worked directly to the BlueArc clusters via a distributed file system used internally at the studio; others worked locally on workstations, published to the Titans and rendered to and from the clusters. "With so many people accessing so much at once to read and write data, performance can drop off on lesser-performing, lower-capacity servers. Smaller servers can stop serving data altogether," Lynn points out. "You don't see that with BlueArc.
"Our renderfarm has 400 to 500 machines, most of them quad-core processors, each core with one job running. That's a huge amount of reading and writing."
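The scale of that read/write load follows directly from the numbers Lynn gives: one job per core across 400 to 500 quad-core machines. The per-job I/O rate below is a hypothetical illustration, not a Dneg figure:

```python
# Concurrency math behind the render load described above: one job per
# core on 400-500 quad-core machines (450 used as a midpoint). The
# per-job I/O rate is an assumed illustration, not a figure from Dneg.
machines, cores_per_machine = 450, 4
jobs = machines * cores_per_machine
per_job_mb_s = 2  # assumed average read+write per render job, MB/s
aggregate = jobs * per_job_mb_s
print(f"{jobs} concurrent jobs, ~{aggregate} MB/s aggregate I/O demand")
```

Even a modest per-job rate multiplies into thousands of megabytes per second of aggregate demand across roughly 1,800 concurrent jobs, which is the load the clustered Titans have to absorb.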
Dneg's new Singapore office has installed a BlueArc Titan server that fills "exactly the same role" as in London, Lynn reports.
"We're very happy with BlueArc and have no plans to look at any other enterprise storage solutions," he notes. "They're developing software reasonably quickly so you can add more storage online. And we can scale up performance by adding another system to the cluster. Performance is the biggest driver for us at this point."
WHITEIRON DIGITAL
A film and broadcast post production facility with a visual effects department, Calgary's Whiteiron Digital (www.whiteiron.tv) has a 24TB Rorke Data Galaxy SAN for editorial and VFX applications.
"We were mainly an Avid house that was adding Final Cut Pro to its toolkit, and year after year we were buying extra local storage, which seemed an inefficient way to operate," notes director of operations Earle Nichol. "So we went to NAB 2007, talked to Vancouver reseller Oceana and the Rorke people, who seemed to have a product that fit our budget, and did what we needed to do — increase storage and [foster] collaboration."
Although Rorke advised against setting up a new system in the middle of a major project, Whiteiron Digital had "no choice," Nichol recalls. "We had been editing the dog-training series At the End of My Leash, which airs on Slice in Canada (and has been reversioned as In the Dog House for the US and is in negotiations with Animal Planet). After NAB 2008 we were told they needed to complete two seasons of the show in the time usually devoted to editing one season. We knew we couldn't do it without shared storage."
Whiteiron technical supervisor Geordie Glazer dealt with the limitations of the software-based system by setting up different zones within the Galaxy. Four Final Cut Studio systems (one of them a digitizing station) have their own zone and access shared storage dedicated to them. An Avid Adrenaline and Media Composer (software only) also have a zone; a DS Nitris suite is not on shared storage. The VFX department, which sports five Boxx workstations running After Effects, Combustion and 3DS Max, has its own zone, too. FibreJet software manages data for editorial; metaSAN software handles the VFX side.
"It took a bit to get the system up and running, but once it was — wow! How did we ever live without it?" asks Nichol.
Glazer says, "It probably took six months to get the SAN working the way we wanted it to. The Final Cut zone, which we needed for the show, was the biggest priority. Then we carved up zones for everyone else. We had purchased the SAN to have all the edit suites access a pool of drives, but our usage changed with the concept of dedicated zones, so the system evolved a lot."
For At the End of My Leash, which is edited on Final Cut, the Galaxy SAN enabled Whiteiron to consolidate the show's toolkit of repeatable elements, like bumpers and opens, in one place so editors could easily pull components they needed with no confusion and assured quality control. With up to five shows on shared storage at the same time for offline and online, editors could cut episodes, make revisions and do final touch-ups quickly and efficiently.
"Due to production delays, the show started seven weeks late, but thanks to being able to schedule edits in different rooms and share media and projects effortlessly, we finished the two seasons a week before they were due," Nichol reports. Whiteiron has now posted five seasons of the popular series.
In the VFX department, prior to acquiring the Galaxy SAN, only one Boxx workstation at a time could render out an After Effects project, Glazer points out. "Each machine was an independent island for VFX creation and rendering. Now all five machines can open up the same After Effects project and work on it, and jobs can be rendered out by up to five machines. It's made a dramatic difference to our throughput."
Enhanced collaboration and faster rendering have also played a part in Whiteiron's ongoing 2D and 3D animated spot campaign for Canada's Shaw Cable featuring the Snailski family (of snails) who are aghast at the speed of cable and Internet delivery.
Whiteiron is just coming up on a year with its Galaxy SAN, and Glazer says the system has "matured" and is working well. But with the company's first feature film project beginning soon "we may have to go down the expansion route," Nichol notes. For now, "we'll be cordoning off a zone in the SAN for shared storage for the film so dailies transcoding, syncing and back-up can be done simultaneously which, once again, will get things done for our clients now, not later."
METHOD
At Method, an Ascent Media company with offices in LA and New York (www.methodstudios.com), a wide range of solutions is employed to handle storage for projects "from Web resolution all the way to 2K and 4K features and everything in between, including cinema spots and HD commercials," says chief engineer Robb Cadzow.
Three sets of solutions service virtually every department in the facility, which has recently lent its VFX expertise to the film Wolverine and myriad commercials, among them the "Meet the Volkswagens" campaign featuring the talking VW Beetle and bus.
For the workflow of CG and desktop compositing departments in LA and New York, Method taps three 6-node Isilon clusters (1920 and 6000 systems in New York and a 1440 in LA) to provide central network-attached storage. In LA, the Isilon 1440 teams with a NetApp 3140 filer.
"Both Isilon and NetApp are great at handling heavy, high-speed, concurrent access from many, many hosts: We have hundreds of render nodes and dozens of workstations," says Cadzow. "Both Isilon and NetApp allow us to scale the file system without disrupting workflow. We can make it larger on the fly, and it's relatively transparent to users."
Method was an early Isilon adopter — in fact, the company was a beta tester — and plans to stick with Isilon clusters.
To deal with very high-resolution realtime workflows, Method uses DataDirect Networks (DDN) storage. The LA office has almost 100TB of storage based around DDN's S2A 9550 products to handle realtime VFX, color correction and realtime editorial.
Cadzow recently put another 40TB online, boosting Method's capabilities to close to 100fps for full-aperture 2K. "I'd call DDN our primary storage vendor," he reports. "The company is fantastic, and its storage is pretty much unparalleled at the ultra high end."
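Sustaining close to 100fps of full-aperture 2K is a substantial bandwidth figure. A rough sketch of what it implies, assuming 10-bit DPX frames at 4 bytes per pixel (an assumption for illustration; the article does not state the format):

```python
# What ~100fps of full-aperture 2K implies in bandwidth terms.
# Assumes 2048x1556 frames with 10-bit RGB packed into 4 bytes per pixel
# (typical DPX packing); purely illustrative arithmetic.
frame_bytes = 2048 * 1556 * 4          # roughly 12.7MB per frame
gb_per_s = frame_bytes * 100 / 1e9     # sustained rate at 100fps
print(f"~{gb_per_s:.2f} GB/s sustained")
```

Under those assumptions the storage has to deliver on the order of 1.3GB/s sustained, which is the kind of ultra-high-end throughput DDN's S2A line targets.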
About 24TB of Apple Xsan is dedicated to Method's digital acquisition department in LA for "organizing, rendering and processing shows, commercials and features shot digitally on Phantom, P2, Red or Arri D21 cameras," he adds. "The department needs speed but can't afford the seven-figure expense of DDN. Apple Xsan's bang for the buck is quite high, especially with Apple-based workflows like Final Cut Pro."
In addition, Atempo software running on Sun StorageTek hardware is used in LA and New York to back up and archive data from all devices. "Our work is backed up for protection and versions are kept online for the short term," Cadzow explains. "Automated back-ups occur at varying intervals determined by the workflow or project."
"User-friendly" Atempo software is employed by many Ascent Media facilities so it has also become an easy interchange format for sharing knowledge and expertise, he points out.
"We started with a small set-up in LA three years ago and now have a fairly large configuration; New York only implemented Atempo and StorageTek about a year ago," says Cadzow. "The system scales very nicely and will grow with us as we buy more licenses and modules and StorageTek hardware."