
TPi September/October 2021 - #265

Page 18

BEHIND THE SCREENS

THE MADISON BEER IMMERSIVE REALITY CONCERT EXPERIENCE Sony Music’s ‘digital Madison Beer’ project rewrites the rules of virtual concerts, harnessing the creative capabilities of Unreal Engine.

Photos: Hyperreal & Epic Records

While most concerts are limited by worldly constraints, a virtual concert can be whatever an artist wants it to be, giving them the power to shape fan experiences and realise fantastical concepts at a much higher level than is possible in real life. The Madison Beer Immersive Reality Concert Experience takes this idea and runs with it, turning one piece of content into the type of transmedia campaign that can thrill fans from YouTube to VR. For all the leeway afforded to them by 3D, the production team, led by Sony Immersive Music Studios, Magnopus, Gauge Theory Creative, and Hyperreal, still saw value in maintaining a measure of realism. “When we started with a blank canvas, our creative goal was to construct a virtual concert through photoreal recreations of a real venue and a real artist, but which also layered in enough magic to reimagine the concert experience itself,” said Sony Immersive Music Studios Head, Brad Spahr. “You start with things that are totally plausible in a physical setting, because that’s what’s going to make your fans get into it and accept the experience,” said Magnopus Co-Founder, Alex Henning. “Once you’ve got them hooked with that kernel of truth, you start to build on top of that with the fantastical.”

Hyperreal started by capturing Madison’s face and body with two separate arrays of high-resolution camera systems in Los Angeles. The first system produced a volume for her face, neck, and shoulders, recording photometric data at the sub-pore level. By capturing the way she moved from every angle, Hyperreal was able to gather enough data to construct an ultra-realistic avatar, or “HyperModel,” that steers clear of the Uncanny Valley. With the help of 200 cameras, Madison’s body, muscles, and shape were then recorded in a range of biomechanical positions to ensure deformation accuracy in Hyperreal’s real-time HyperRig system. After adding Madison’s preferred performance gear, Hyperreal brought the avatar into Unreal Engine to experiment with movement before the live capture session at PlayStation Studios in LA.

While this was happening, Magnopus was hard at work on the venue and VFX systems. After considering a full LiDAR scan, Sony Immersive Music Studios decided to construct the venue from scratch to allow more control over the lighting. They started with the original CAD files, which

were imported into Autodesk Maya and given the full artistic treatment, including all the nuances that make Sony Hall unique. Magnopus was then able to build upon that with lighting and VFX. “Sony Hall is an intimate venue with a lot of character, detail and beauty, which made it an ideal environment for the experience,” said Spahr. “It is also great for VR, because of the scale. It’s not a giant, cavernous arena or a tiny hole-in-the-wall club,” said Henning. “It’s got almost the perfect amount of dimension.” Magnopus made use of Unreal Engine’s built-in virtual scouting tools to set up their cameras and test the lighting before diving into the special effects.

VIRTUAL MUSIC PRODUCTION BENEFITS
Unlike most motion capture shoots, The Madison Beer Immersive Reality Concert Experience was a remote affair driven by teams across the US. In LA, Madison Beer performed in a mocap suit and head-mounted camera. In Philadelphia, Hyperreal CEO, Remington Scott, directed her in real time, using a VR headset that not only allowed him to view Madison’s avatar face-to-face live within the virtual Sony Hall, but also to adhere to the COVID-19 restrictions that were keeping them apart.

