Image courtesy of All of it Now

How BTS joined Coldplay for a live hologram performance of 'My Universe'

March 15, 2022
From massive spaceships to dancing aliens, the music video for Coldplay and BTS’ recent single, My Universe, brought two of the world’s most famous bands together in spectacular fashion. It was a historic moment, and not just because the single was the first by two co-billed headliner groups to become a US chart-topper. Behind the scenes, five VFX studios were working with volumetric video—a technology that could change the way entertainment is made.

Led by Ingenuity Studios, the VFX houses used a 360-degree, 108-camera volumetric capture rig to record each singer from every angle as they performed. Using volumetric capture meant that each member of the two supergroups was recorded as a volume, similar to a hologram, rather than as a flat image. It also meant both Coldplay and BTS were able to sing together in the video, even though BTS was shot on green screen in Korea and Coldplay was in Spain. The volumetric recording of each singer could be composited in from any angle—all with realistic depth, color, and lighting.

The final result was impressive. But when LA-based creative agency All of it Now came onto the campaign, they started working with Coldplay’s creative team to take the concept even further.

“We wanted to bring the holograms into the real world, but needed an event,” says Danny Firpo, All of it Now CEO & co-founder. “Our search led us to [NBC’s] The Voice, which had all the infrastructure we needed to get the groups performing perfectly in sync.”
 

Traditional holograms versus volumetric video

The first step to bringing such an ambitious idea to life was to build real-time production into the pipeline. “We could have brought BTS on stage as a traditional projection, like the famous Tupac hologram Digital Domain made with an angled piece of glass,” says Nicole Plaza, All of it Now Executive Producer. “But we knew that a traditional approach would not have worked for this. The installation time, cost, and limited viewing angle would have made it less engaging, and also significantly less flexible than what we had with Unreal Engine.”

By funneling Dimension Studios’ volumetric capture data into Unreal Engine, the team could quickly and easily adjust an AR performer’s position and timing to suit physical stage limitations. This speed and agility were essential to the production; near-instant changes like these would have been next to impossible with Hologauze screens or the Pepper’s ghost method.
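As a rough, hypothetical illustration of the kind of adjustment a real-time pipeline makes trivial, the sketch below stores each AR performer’s stage placement and clip-start offset as plain data and clamps positions to a usable stage area. The names and numbers are illustrative only, not All of it Now’s actual tooling.

```python
# Hypothetical sketch: per-performer placement and timing tweaks of the kind
# a real-time pipeline makes cheap to iterate on. Illustrative values only.
from dataclasses import dataclass

@dataclass
class PerformerPlacement:
    name: str
    x: float                # stage position in meters (stage left/right)
    y: float                # stage position in meters (downstage/upstage)
    yaw_deg: float          # facing direction
    start_offset_s: float   # nudge applied to the volumetric clip's start time

# Usable stage area discovered during blocking (made-up numbers).
STAGE_X = (-6.0, 6.0)
STAGE_Y = (0.5, 8.0)

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def apply_stage_limits(p: PerformerPlacement) -> PerformerPlacement:
    """Pull an AR performer back inside the physically usable stage area."""
    return PerformerPlacement(p.name, clamp(p.x, *STAGE_X), clamp(p.y, *STAGE_Y),
                              p.yaw_deg, p.start_offset_s)

# Example: a placement authored outside the usable stage gets pulled back in,
# and its clip start is delayed a quarter second to match the live band.
print(apply_stage_limits(PerformerPlacement("performer_1", 7.2, 1.0, 180.0, 0.25)))
```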
Image courtesy of All of it Now
Using a real-time workflow also meant the team could camera-block the entire performance in Unreal Engine. This not only let the bands preview the pacing behind each shot, but also helped the production team check whether there was enough coverage from each camera position, informing decisions about which cameras and lenses were needed for the shoot.
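The coverage question in particular comes down to simple geometry. The hypothetical previs helper below, with made-up camera positions and lens choices, checks whether each candidate camera can see a given stage mark based on its horizontal field of view. It is a simplified pinhole-camera model, not the team’s actual blocking setup.

```python
# Hypothetical previs helper: which stage marks does each candidate camera
# and lens actually cover? Positions, yaws, and lenses are illustrative.
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view for a simple pinhole-camera model."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

def covers(cam_pos, cam_yaw_deg, focal_length_mm, target_pos):
    """True if the target sits inside the camera's horizontal FOV.
    Yaw uses the same convention as atan2: 0 deg = +x, 90 deg = +y."""
    dx, dy = target_pos[0] - cam_pos[0], target_pos[1] - cam_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    offset = abs((bearing - cam_yaw_deg + 180.0) % 360.0 - 180.0)
    return offset <= horizontal_fov_deg(focal_length_mm) / 2.0

stage_marks = {"performer_A": (-2.0, 5.0), "performer_B": (2.5, 4.0)}
cameras = {"cam_1": ((0.0, -10.0), 90.0, 35.0),   # position, yaw, lens (mm)
           "cam_7": ((-8.0, 0.0), 0.0, 50.0)}

for cam, (pos, yaw, lens) in cameras.items():
    seen = [m for m, p in stage_marks.items() if covers(pos, yaw, lens, p)]
    print(f"{cam} ({lens}mm, HFOV {horizontal_fov_deg(lens):.1f} deg): covers {seen}")
```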

“Originally, we had planned for four cameras, but upon seeing the quality levels we could deliver in real time, The Voice and Coldplay teams quickly agreed that seven cameras would increase the overall production value even more,” explains Berto Mora, All of it Now Virtual Production Supervisor.

Translating the idea to the stage

Once production started, Unreal Engine was used to run the compositing and playback of all seven feeds alongside the original BTS volumetric footage captured for the My Universe video shoot. “Knowing that we had excellent 360-degree coverage of all seven performers really opened up the doors of possibility as far as what shots were possible with the footage,” Mora continues. “We had a dedicated server for each camera, so that the director was able to see all the feeds simultaneously, and cut the show as if the AR BTS members were actually on stage with the real Coldplay.”
Image courtesy of All of it Now
To transition the BTS performers on and off stage in real time, the team also created a ‘glitch’ effect using Fresnel effects and Sequencer in Unreal Engine, so the performers would instantly read as holograms to the audience. Fans of the music video might remember this look, as it was also used to imbue the galactic “transmissions” with the same imperfect characteristics we associate with holographic data.
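Conceptually, the look combines two ingredients: a Fresnel (rim) term that brightens a performer’s silhouette edges, and a glitch intensity keyframed over time, the way a Sequencer float track would drive a material parameter. The Python sketch below only illustrates that math; it is not the production material.

```python
# Conceptual sketch of a hologram-style "glitch": a Fresnel rim term plus a
# keyframed glitch intensity evaluated along a timeline. Illustrative only.
import math, random

def fresnel_schlick(cos_theta, f0=0.04, power=5.0):
    """Schlick's approximation: strong at grazing angles, weak face-on."""
    return f0 + (1.0 - f0) * (1.0 - max(0.0, cos_theta)) ** power

def glitch_intensity(t, keys):
    """Linear interpolation between (time, value) keyframes, like a float track."""
    keys = sorted(keys)
    if t <= keys[0][0]:
        return keys[0][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return keys[-1][1]

# Performer "materializes" over the first half second, with a flicker on top.
keys = [(0.0, 1.0), (0.5, 0.0)]
for frame in range(5):
    t = frame / 30.0
    rim = fresnel_schlick(cos_theta=0.3)
    flicker = random.random() < glitch_intensity(t, keys)
    print(f"t={t:.3f}s rim={rim:.2f} glitch_on={flicker}")
```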
Stype’s Stypeland plugin was then used to ingest all the camera tracking and lens distortion data. This was recorded with Live Link so the team could re-render selected segments of each performer’s solo moments and make frame-level adjustments to the lip-sync for particular performers in post. Meanwhile, on stage, All of it Now’s team worked with award-winning designer Sooner Routhier to use specific floor lights that would help the Coldplay musicians identify which AR performers were on stage, and when.

The benefits of real-time technology

Like all disruptive workflows, using volumetric video involved some challenges. At one point, the team at All of it Now even collaborated with Microsoft to rewrite plugin code so that BTS performer assets—which were provided as a special MP4 export using Microsoft’s SVF plugin—remained perfectly in sync with Coldplay.
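The specifics of that plugin work aren’t public, but the underlying problem is familiar timecode arithmetic: work out which frame of a prerecorded clip should be on screen at a given moment of show time. A minimal, hypothetical sketch, assuming non-drop-frame timecode and made-up values:

```python
# Illustrative timecode math for keeping a prerecorded volumetric clip locked
# to a live performance. Frame rate, timecodes, and names are hypothetical.
FPS = 30  # non-drop-frame for simplicity

def tc_to_frames(tc, fps=FPS):
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def clip_frame_for_now(show_tc, clip_start_tc, fps=FPS):
    """Which frame of the volumetric clip should be showing right now."""
    return tc_to_frames(show_tc, fps) - tc_to_frames(clip_start_tc, fps)

# Example: the clip was triggered at 01:02:10:00 show time; it is now 01:02:13:12.
print(clip_frame_for_now("01:02:13:12", "01:02:10:00"))  # -> 102 frames in
```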

Facing these challenges, however, was more than worth it. By using volumetric capture data, the team was even able to re-composite certain shots in post, a process made easier because the entire pipeline had been designed from the start for real-time performance and playback. “Each video file was played back from an Atomos Shogun, with audio and video fed into the same servers used for the onsite production, essentially re-creating the exact live scenario that we had onsite,” reveals Mora.

“Using this method, we could fix minor sync issues or modify the glitch keyframes to add or subtract the glitch effect for each shot. We also used our custom pipeline tool to read the timecode range, find the take with the recorded tracking data, and create a new sequence with just the tracking data required for each shot that needed to be fixed in post. This saved time compared to manually identifying the take with the correct frame range, which was crucial given the rapid turnaround times required for post,” says Jeffrey Hepburn, All of it Now Lead UE Artist.
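A hypothetical reconstruction of that lookup step might look like the sketch below: find the recorded take whose timecode range contains a shot, then slice out just the tracking samples needed for the fix. The data layout and function names are assumptions, not the team’s actual pipeline tool.

```python
# Hypothetical take lookup: match a shot's timecode range to a recorded take
# and extract only the tracking samples it needs. Data layout is illustrative.
from dataclasses import dataclass

FPS = 30

def tc_to_frames(tc):
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

@dataclass
class Take:
    name: str
    start_tc: str
    end_tc: str
    tracking: dict  # frame number -> recorded camera transform sample

def find_take(takes, shot_in, shot_out):
    """Return the take whose recorded range fully contains the shot."""
    lo, hi = tc_to_frames(shot_in), tc_to_frames(shot_out)
    for take in takes:
        if tc_to_frames(take.start_tc) <= lo and hi <= tc_to_frames(take.end_tc):
            return take
    raise LookupError("no recorded take covers this shot")

def slice_tracking(take, shot_in, shot_out):
    """Keep only the tracking samples inside the shot's frame range."""
    lo, hi = tc_to_frames(shot_in), tc_to_frames(shot_out)
    return {f: s for f, s in take.tracking.items() if lo <= f <= hi}

# Example: a four-second shot inside a longer recorded take.
take = Take("take_03", "01:02:00:00", "01:06:00:00",
            {f: ("cam_xform", f) for f in range(tc_to_frames("01:02:00:00"),
                                                tc_to_frames("01:02:20:00"))})
shot_take = find_take([take], "01:02:05:00", "01:02:09:00")
print(len(slice_tracking(shot_take, "01:02:05:00", "01:02:09:00")))  # 121 samples
```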

By the time The Voice was on air, All of it Now’s team was so happy with the results that they are now exploring parallel pipelines into other volumetric video applications using Arcturus HoloSuite, which can change a performer’s head position to “follow” a specific tracked camera. But wherever they go, it sounds like Unreal Engine is coming with them.
Image courtesy of All of it Now
“Unreal Engine came through for us at every stage of this project, which ended up being a milestone for volumetric recordings and an exciting precedent for both mixed reality and metaverse applications,” explains Firpo.

He cites several advantages to using real-time playback of volumetric video, such as the ability to capture highly detailed facial expressions, wardrobe elements, and finger gestures without the need to dive into 3D character design, rigging, and animation pipelines.

Firpo goes on to say that this technology provides an incredible opportunity to explore alternative workflows while avoiding “the ‘uncanny valley’ problem that holds teams back.”

“Real-time playback of volumetric video opens up some big possibilities,” he says. “When you prove that you can record once and reuse assets across multiple applications, it’s exciting to artists who are always thinking of the next big thing. Volumetric video is going to lead them in some interesting directions.”
Image courtesy of All of it Now
Pictured (from left to right): Danny Firpo, Nicole Plaza, Preston Altree, Berto Mora, Jeffrey Hepburn, Neil Carman
