Blackpink: Creating Meta's latest VR experience
January 2, 2024


Blackpink, the South Korean girl group consisting of Jisoo, Jennie, Rosé and Lisa, recently released a VR concert experience in Meta Horizon Worlds’ new venue, Music Valley. Since December 26, fans have been able to view the final show of the “Born Pink” world tour, which was captured at Seoul’s Gocheok Sky Dome. Wearing a Meta Quest headset, fans of the group can watch them perform hit songs that include “Shut Down,” “Pink Venom” and “How You Like That.”

The 70-minute concert special was produced and directed by The Diamond Bros (www.thediamondbros.com) in partnership with Meta, and it appears in VR exclusively in Meta Horizon Worlds.



Jason Diamond, director and executive producer at The Diamond Bros in New York City, recently shared insight into the company’s work on Blackpink: A VR Encore and the challenges the team faced in creating the virtual reality experience.

How did The Diamond Bros get involved in this project?

“We have been long-time creators with Meta on a number of projects, ranging from the initial launch of their live concert platform, Venues, in 2018 to technical whitepapers, so when Meta approached us with the opportunity to film Blackpink in Seoul, we were incredibly excited about not only the creative challenge but also the technical one.”
 


What were the unique needs or challenges compared to past VR work?

“Productions of any kind always have challenges, of course, but VR has its own unique challenges because of the nature of fisheye lenses and the need to be much closer to your subjects than with traditional lensing. In the case of Blackpink, it came down to a few things: how to properly cover the performance area of a massive arena stage with multiple smaller stages, and how to follow and cover four performers, as well as quite a number of backup dancers and an incredible light and laser show. Our camera positions are fixed, so it takes a lot of research to understand and, in effect, Venn diagram the stage area for where the performers will be, to make sure we have coverage for as many opportunities as possible.
 
“Production outside of the US is always challenging due to possible language barriers and specific regional production methodologies or rules we may not be aware of that could add roadblocks to our usual productions. We rely on our ability to problem-solve in realtime to facilitate a smooth execution of our capture goals. Working with arena and concert talent and teams is always about finding the delicate balance between getting the camera positions we know we need for optimal capture and servicing the needs of the artists’ performance while respecting the paid audience, who are the complement to the performance. The Blackpink performance had a large downstage area that rose almost six feet above the main stage height at times, so finding a special camera position that would not only fit the height and headset needs but also service the entire show was tricky, and it was essential to ensuring our ‘viewer’ had an optimal experience.”
 


Can you explain your production process and some of the technology used?

“VR is, of course, a specialized medium for capture and display compared to traditional mediums; however, there are quite a number of similarities when it comes to production in general. Like any production, we needed to determine the specific needs and crew. Since we were traveling to Seoul, Korea, for this project, we determined we would need to bring a reasonably sized crew of specialists in key positions that we could augment with an incredible local crew across broadcast/on-set signal ops and the camera department/grip. Luckily, this was a multi-day, on-site build for Blackpink, so we were able to not only see the stage being built and begin our scouting of the arena, but also have an additional day of complete setup, cabling and testing before the actual shoot.
 
“We captured the performance on eight modified RED V-Raptor VV cameras, each with a Canon 8-15mm zoom lens. We also used three Canon R5 C cameras with the same lenses for some scenarios that wouldn’t fit the Raptors. We cabled each of the eight Raptors back to our home base behind the stage and used RED Control Pro on a Mac laptop to monitor and control all eight. The software gives us control of literally every button and setting in the camera, so we could use them in parent/child mode, or change the settings on a specific camera or cameras, to make sure we had optimal capture. We also used fiber runs for 4K/60 SDI monitoring at our home base through a Blackmagic ATEM Constellation 8K, with two 60-inch 4K TVs for program/preview and multiviews of all eight cameras. This also enabled us to feed SDI/HDMI to a PC laptop running Assimilate Scratch Live, which allowed us to dewarp the fisheye images in realtime and see the cameras live on a Meta Quest 2 headset, so we could be 100% confident of what each camera position afforded us and make any adjustments we needed during our prep day.

“On the shoot day, we were able to confidently monitor the show and the various angles, ‘switching’ camera angles to get a feel for the edit later, as well as to see firsthand any key moments and how they would feel. In post, we cut, color, denoise and dewarp in DaVinci Resolve. For highly technical projects like this, it’s a huge benefit to be able to do everything we need in one app. We had the opportunity to work with the incredible team at Dungeon Beach in Brooklyn, who provided color and final deliverables and also handled the full audio mix, spatializing and sweetening for this project.”
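For readers curious about what that dewarp step involves, the sketch below remaps a single, centered 180-degree fisheye frame into the half-equirectangular layout commonly used for VR180. It is a minimal illustration, not the production pipeline (which ran through Assimilate Scratch Live and DaVinci Resolve); the equidistant lens model, frame size and file names are assumptions.

```python
# A minimal sketch of an equidistant fisheye -> VR180 half-equirectangular
# dewarp, assuming a single centered 180-degree fisheye image. The lens
# model and file names are illustrative, not the production setup.
import cv2
import numpy as np

def fisheye_to_vr180(src: np.ndarray, out_size: int = 4320) -> np.ndarray:
    """Remap a centered 180-degree equidistant fisheye frame to a
    half-equirectangular (VR180 mono) frame of out_size x out_size."""
    h, w = src.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    radius = min(cx, cy)  # fisheye image circle assumed to fill the frame

    # Longitude/latitude grids for the output projection, each spanning +/-90 deg.
    lon = np.linspace(-np.pi / 2, np.pi / 2, out_size, dtype=np.float32)
    lat = np.linspace(np.pi / 2, -np.pi / 2, out_size, dtype=np.float32)
    lon, lat = np.meshgrid(lon, lat)

    # Unit view vectors (camera looks down +z).
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Equidistant model: distance from the image center is proportional to
    # the angle off the optical axis (90 degrees lands on the fisheye edge).
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    rho = np.hypot(x, y)
    rho[rho == 0] = 1e-9  # avoid divide-by-zero at the exact center
    r = radius * theta / (np.pi / 2)

    map_x = (cx + r * x / rho).astype(np.float32)
    map_y = (cy - r * y / rho).astype(np.float32)
    return cv2.remap(src, map_x, map_y, interpolation=cv2.INTER_LINEAR)

frame = cv2.imread("fisheye_frame.png")  # hypothetical test frame
vr180 = fisheye_to_vr180(frame)
cv2.imwrite("vr180_frame.png", vr180)
```

Real fisheye lenses only approximate the equidistant model, so production tools typically dewarp with calibrated lens profiles rather than this idealized math.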
 


What was the timeframe to produce and deliver this project, and what were the deliverables?

“The timeframe to prep was less than a month before we were on a flight to Seoul, but we had a solid five days on the ground to problem-solve ahead of the show. I think everyone in production works to get as much time on the ground as possible, and we felt very positive about the schedule we were able to maintain. We had about two months to edit, color and dewarp the footage, and the final deliverable was a 4320x4320-pixel, 180-degree mono VR file with a spatialized stereo audio mix.”
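To put that deliverable in perspective, 4320 pixels spread across a 180-degree field of view works out to 24 pixels per degree on each axis, or roughly 18.7 megapixels per frame. The snippet below is just that arithmetic, derived from the spec quoted above rather than from any Meta documentation.

```python
# Back-of-the-envelope check of the VR180 deliverable spec quoted above:
# a 4320x4320 mono frame spanning 180 degrees on both axes.
width_px = height_px = 4320
fov_deg = 180

pixels_per_degree = width_px / fov_deg   # angular resolution per axis
megapixels = width_px * height_px / 1e6  # total pixels per frame

print(f"{pixels_per_degree:.0f} px/deg, {megapixels:.1f} MP per frame")
# -> 24 px/deg, 18.7 MP per frame
```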