TPi May/June 2022 - #269


BEHIND THE SCREENS

MY UNIVERSE IN MIXED REALITY

LA-based production studio All of it Now (AOIN) reflects on a mixed reality performance created in collaboration with Coldplay Creative, BTS and Dimension Studio.

Photo: AOIN

Coldplay and BTS joined forces to perform My Universe on NBC’s The Voice in mixed reality – the British rock band performing in person, and the BTS members appearing as holograms in augmented reality (AR), using 3D virtual avatars created via volumetric capture and rendered live in Unreal Engine 4.27.1. LA-based production studio All of it Now (AOIN) reflects on this landmark performance, created in collaboration with Coldplay Creative and Dimension Studio.

Has AOIN ever attempted such an ambitious AR performance before?

“We first worked on a live AR performance with BTS during their 2019 tour. I believe this was one of the first AR experiences to be used on a tour, which spanned four continents – the US, South America, Europe and Asia. This time around, we were able to pilot seven simultaneous volumetric captures playing back across seven different AR cameras.”

What are some of the biggest misconceptions about this style of hybrid performance?

“The biggest misconception with AR is how the in-person audience sees it. The conversation about the live audience generally comes down to how prominent a role the IMAG screens play in arena- and stadium-sized shows: the vast majority of the audience ends up watching the performance on the IMAG screens, largely because of the distance between their seats and the stage, and the sheer size of the screens at these shows. The same factor applies to broadcast, and for The Voice we worked with production on ‘coaching the audience’ to respond to the key AR moments as if the holograms were really there on stage.”
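To make the first answer concrete: fanning seven volumetric captures out across seven tracked broadcast cameras means, in engine terms, one playback actor per captured performer, each visible to every AR camera feed. The following is a minimal, hypothetical sketch in Unreal Engine 4.27 C++ – not AOIN’s actual code – where SvfActorClass stands in for the volumetric-capture playback actor supplied by the (modified) Microsoft plug-in.

```cpp
// Hypothetical sketch only, not AOIN's production code. SvfActorClass is
// assumed to be the volumetric-capture playback actor class supplied by the
// (modified) Microsoft plug-in; everything else is stock UE 4.27 API.

#include "CoreMinimal.h"
#include "Engine/World.h"
#include "GameFramework/Actor.h"

// Spawn one volcap performer per stage mark (seven for this performance), so
// every tracked AR camera sees the same set of holograms from its own angle.
void SpawnVolcapPerformers(UWorld* World,
                           UClass* SvfActorClass,                 // hypothetical plug-in actor class
                           const TArray<FTransform>& StageMarks,  // one mark per captured performer
                           TArray<AActor*>& OutPerformers)
{
    OutPerformers.Reset();
    for (const FTransform& Mark : StageMarks)
    {
        FActorSpawnParameters Params;
        Params.SpawnCollisionHandlingOverride =
            ESpawnActorCollisionHandlingMethod::AlwaysSpawn;

        if (AActor* Performer = World->SpawnActor<AActor>(SvfActorClass, Mark, Params))
        {
            OutPerformers.Add(Performer);
        }
    }
}
```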


What technical challenges did you and the team have to overcome?

“The biggest difficulty in this project was technical – we were trying to do something that had not been done before: using seven volumetric capture recordings in real time, tracking to timecode. This required adding lines of code to an existing plug-in from Microsoft, along with extensive testing, on a relatively short timeline. We were able to get the modified plug-in to work, with one major caveat: we needed to run about 10 seconds of timecode pre-roll before the seven volcap recordings would track properly to timecode. Once we made that discovery and determined a viable solution, the largest technical hurdle had been crossed.” (A sketch of such a pre-roll gate appears after the interview.)

What logistical challenges did you and the team have to overcome?

“The other major hurdle was more logistical: loading in and calibrating seven AR cameras in the middle of ongoing production of The Voice. This meant we had limited stage time and had to find small pockets of time between acts to get the cameras and lenses tracked and calibrated for the AR performance. This was achieved with ‘down to the minute’ scheduling and planning, along with clear communication and collaboration with The Voice production teams.”

Do you foresee this technology being utilised further in the live setting?

“We are only starting to scratch the surface of what volumetric capture and AR performances can do. We absolutely see a future where volumetric capture and AR will be used in live performances, taking it to the next level: the volumetric capture itself is live, and performers can literally be in two places at once.”
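Returning to the timecode pre-roll the team describes: one plausible shape for that fix, sketched here in Unreal Engine 4.27 C++ under stated assumptions, is a gate that keeps every volcap player seeking to house timecode during a roughly ten-second pre-roll window before trusting the lock. USvfPlayer and its two methods stand in for the modified Microsoft plug-in’s playback API, which is not public; the timecode calls are stock engine API.

```cpp
// Hypothetical sketch only. USvfPlayer stands in for the playback object of
// the modified Microsoft volumetric-capture plug-in; it is not a real engine
// class. FApp::GetTimecode() and friends are stock UE 4.27 API.

#include "CoreMinimal.h"
#include "Misc/App.h"
#include "Misc/Timecode.h"
#include "Misc/FrameRate.h"

class USvfPlayer // hypothetical stand-in for the plug-in's volcap player
{
public:
    void SeekToFrame(FFrameNumber Frame); // prime the decoder at a frame
    void PlayAtFrame(FFrameNumber Frame); // chase timecode while playing
};

// Keeps all seven volcap players re-seeking to house timecode for a pre-roll
// window (about 10 seconds, per the interview) before treating sync as locked.
struct FVolcapPrerollGate
{
    static constexpr double PrerollSeconds = 10.0;

    double Elapsed = 0.0;
    bool bLocked = false;

    // Call once per engine tick with the active players.
    void Tick(double DeltaSeconds, TArray<USvfPlayer*>& Players)
    {
        // Read the incoming house timecode (e.g. LTC) as exposed by the engine.
        const FTimecode HouseTC = FApp::GetTimecode();
        const FFrameRate Rate = FApp::GetTimecodeFrameRate();
        const FFrameNumber Frame = HouseTC.ToFrameNumber(Rate);

        if (!bLocked)
        {
            // During pre-roll, keep priming every decoder at the current
            // frame but do not yet trust the lock.
            Elapsed += DeltaSeconds;
            for (USvfPlayer* Player : Players)
            {
                Player->SeekToFrame(Frame);
            }
            bLocked = (Elapsed >= PrerollSeconds);
            return;
        }

        // After pre-roll, all players chase timecode normally.
        for (USvfPlayer* Player : Players)
        {
            Player->PlayAtFrame(Frame);
        }
    }
};
```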
www.coldplay.com
www.allofitnow.com
www.dimensionstudio.co
