VFX Voice Summer 2023


THE CHEMISTRY OF ELEMENTAL

VFXVOICE.COM SUMMER 2023 VFX EMMY CONTENDERS • GLOBAL TV VFX • SPIDER-MAN: ACROSS THE SPIDER-VERSE VIRTUAL CINEMATOGRAPHY • PROFILES: GALE ANNE HURD, DENNIS MUREN & ROGER GUYETT

Welcome to the Summer 2023 issue of VFX Voice!

Thank you for your continued and enthusiastic support of VFX Voice as a part of our global community. We’re proud to keep shining a light on outstanding visual effects artistry and innovation worldwide.

In this issue, our cover story goes inside the character design of Pixar’s Elemental, where fire, water, land and air residents live together. Read our preview of the Emmy Awards contenders, the inside story of Spider-Man: Across the Spider-Verse and our TV/Streaming coverage of global TV and CG characters. Check out the latest trends in virtual cinematography, global production, location-based VR and avatars of the stars. Enjoy our up-close-and-personal profiles of VFX luminaries Dennis Muren and Roger Guyett. Read about the inspiring career of this year’s VES Lifetime Achievement Award honoree: culture-shaping producer Gale Anne Hurd. And get to know our VES Washington Section in the spotlight… and more.

Dive in to meet the visionaries and risk-takers who push the boundaries of what’s possible and advance the field of visual effects.

Cheers!

P.S. Continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

2 • VFXVOICE.COM SUMMER 2023 [ EXECUTIVE NOTE ]

FEATURES

8 THE VFX EMMY: THE 75TH EMMY AWARDS

Meet the contenders in the running for this year’s prize.

14 VFX TRENDS: GLOBAL PRODUCTIONS

Multiple vendors sharing more on big-budget projects.

20 TV/STREAMING: GLOBAL TV VFX

Inside VFX-driven hits from South Korea and Europe.

28 PROFILE: DENNIS MUREN

Multiple Oscar-winner still cherishes the creative process.

34 COVER: ELEMENTAL

Effects were an essential aspect of Pixar’s character design.

42 VR/AR/MR TRENDS: AVATARS OF THE STARS

AR, VR and VFX add new dimensions to live entertainment.

50 PROFILE: GALE ANNE HURD

Producer/writer has been shaping pop culture for decades.

56 VFX TRENDS: VIRTUAL CINEMATOGRAPHY

Maintaining visual language in a shifting cinematic paradigm.

64 ANIMATION: SPIDER-MAN: ACROSS THE SPIDER-VERSE

Sequel expands the multimedia aesthetic of the Spider-Verse.

72 PROFILE: ROGER GUYETT

VFX Supervisor specializes in effects that advance great stories.

78 TV/STREAMING: CG CHARACTERS

CG characters are increasingly playing more central roles.

84 VR/AR/MR TRENDS: LBVR

Location-based VR is briskly growing again post-pandemic.

DEPARTMENTS

2 EXECUTIVE NOTE

92 VES SECTION SPOTLIGHT – WASHINGTON

94 VES NEWS

96 FINAL FRAME – THE EMMYS

ON THE COVER: Fire (Ember) and Water (Wade) meet in Elemental. (Image courtesy of Pixar/Disney)


WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER

Jim McCullaugh publisher@vfxvoice.com

EDITOR

Ed Ochs editor@vfxvoice.com

CREATIVE

Alpanian Design Group alan@alpanian.com

ADVERTISING

Arlene Hansen Arlene-VFX@outlook.com

SUPERVISOR

Nancy Ward

CONTRIBUTING WRITERS

Naomi Goldman

Trevor Hogg

Chris McGowan

Oliver Webb

ADVISORY COMMITTEE

David Bloom

Andrew Bly

Rob Bredow

Mike Chambers, VES

Lisa Cooke

Neil Corbould, VES

Irena Cronin

Paul Debevec, VES

Debbie Denise

Karen Dufilho

Paul Franklin

David Johnson, VES

Jim Morris, VES

Dennis Muren, ASC, VES

Sam Nicholson, ASC

Lori H. Schwartz

Eric Roth

VISUAL EFFECTS SOCIETY

Nancy Ward, Executive Director

VES BOARD OF DIRECTORS

OFFICERS

Lisa Cooke, Chair

Susan O’Neal, 1st Vice Chair

David Tanaka, VES, 2nd Vice Chair

Rita Cahill, Secretary

Jeffrey A. Okun, VES, Treasurer

DIRECTORS

Neishaw Ali, Laurie Blavin, Kathryn Brillhart

Colin Campbell, Nicolas Casanova

Mike Chambers, VES, Kim Davidson

Michael Fink, VES, Gavin Graham

Dennis Hoffman, Brooke Lyndon-Stanford

Arnon Manor, Andres Martinez

Karen Murphy, Maggie Oh, Jim Rygiel

Suhit Saha, Lisa Sepp-Wilson

Richard Winn Taylor II, VES

David Valentin, Bill Villarreal

Joe Weidenbach, Rebecca West

Philipp Wolf, Susan Zwerman, VES

ALTERNATES

Andrew Bly, Johnny Han, Adam Howard

Tim McLaughlin, Robin Prybil

Daniel Rosen, Dane Smith

Visual Effects Society

5805 Sepulveda Blvd., Suite 620

Sherman Oaks, CA 91411

Phone: (818) 981-7861

vesglobal.org

VES STAFF

Jim Sullivan, Director of Operations

Ben Schneider, Director of Membership Services

Charles Mesa, Media & Content Manager

Colleen Kelly, Office Manager

Brynn Hinnant, Administrative Assistant

Shannon Cassidy, Global Coordinator

P.J. Schumacher, Controller

Naomi Goldman, Public Relations

Tom Atkin, Founder

Allen Battino, VES Logo Design

SUMMER 2023 • VOL. 7, NO. 3

VFX Voice is published quarterly by the Visual Effects Society.

Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com

Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com

Comments: Write us at comments@vfxvoice.com

Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.

Copyright © 2023 The Visual Effects Society. Printed in the U.S.A. Follow us on social media.

2023 EMMY CONTENDERS CELEBRATE THE CENTRAL ROLE OF VISUAL EFFECTS

At the 74th Primetime Emmys, The Book of Boba Fett scooped Outstanding Special Visual Effects in a Season or Movie, with The Mandalorian having taken home the award the previous year. It remains to be seen whether the much-anticipated third season of The Mandalorian will take home the award after successive wins for the Star Wars universe. In the Outstanding Special Visual Effects in a Single Episode category, Squid Game won for the “VIPS” episode. With several new productions and more prequels and sequels from acclaimed trilogies and series such as The Lord of the Rings, Vikings and Game of Thrones, as well as an Addams Family spin-off, it has been an excellent year for visual effects. It will undoubtedly be a close race at this year’s awards.

At the 2023 VES Awards, The Rings of Power garnered three awards for the episodes “Adar” and “Udûn.” “Rodeo did 580 shots on the series,” says Visual Effects Supervisor Ara Khanikian. “I associate Lord of the Rings with epic landscapes, so having a hand in that was really interesting for us, in terms of classic matte paintings and CG environments. We had a lot of fun with the Harfoots, with the scale comps and just kind of playing with the scale of the smaller species versus human size. We had all the work surrounding the Stranger as well. That started with him crash-landing in Middle-earth, where he creates a crater of fire, and that involved some very complex work. We had a lot of very complex effects of the Stranger not being in control of his powers and just seeing how his powers interact with the environment for the first time.”

“The most challenging sequence in terms of scope was definitely the one from the final episode with the battle between the mystics because of how much had to happen in it,” adds Rodeo FX Effects Supervisor Nathan Arbuckle. “We had all of the nature effects such as leaves and trees and power from the Stranger. We also had all of the fire that the mystics were going to put in there. We had all the celestial energy, and we had to have the wraith look as well when the Stranger hits them with the celestial energy. The final episode involved a full CG forest build and doing lots of simulations for all of the CG plants as well as the effects between the Stranger and the mystics. Then, of course, having the mystics get blasted away and turned into moths. We also did the Morgul blade effect, the glowing sword that builds out of a dark energy.”

TOP: Rodeo FX did about 580 shots in total for The Rings of Power. (Photo: Ben Rothstein. Image courtesy of Prime Video)
OPPOSITE, TOP TO BOTTOM: Thing and Jenna Ortega as Wednesday Addams in Wednesday. (Image courtesy of Netflix)
Union VFX delivered over 300 shots for The Sandman. (Image courtesy of Netflix)
Vikings: Valhalla could find itself among the nominees once again after a strong start to the second season. (Photo: Bernard Walsh. Image courtesy of Netflix)

The Netflix series Wednesday features an array of visual effects, from fantastical monsters to supernatural abilities. The Rocket Science visual effects team delivered more than 300 shots across the series, with the majority of the scope of work focused on Thing, along with Wednesday’s Scorpio-Pet, digital doubles, dynamic CG fire, explosions and FX simulations, plus additional supporting visual effects. “Tom Turnbull, Rocket Science President/VFX Supervisor, was very clear that the visual effects needed to be grounded in reality while set in this fantastical world,” says Visual Effects Supervisor John Coldrick. “As good as the miming of a disembodied hand was, there were shots where issues such as incorrect center of gravity and the appearance of hovering over the ground would crop up. While the motion of a hand with no arm was magical, it still had to obey the laws of physics. This would sometimes involve altering the fingers so they connected with the ground during runs and gallops, removing sliding and altering axes of rotation. The goal was to make it difficult to imagine a ghostly arm on the end of that wrist, and make Thing live in a natural way.”

“The biggest challenge was to seamlessly integrate Thing into each shot, regardless of any obstacles on location, set and time of day,” Coldrick adds. “The disembodied hand needed to interact with the surrounding environment as well as with other actors, imparting its own personality and performance. Production landed on a combination of hand-acting from Victor Dorobantu dressed in a blue suit, and prosthetics to add the stump to his real hand, and the Rocket Science visual effects team seamlessly integrating Thing into frame. The RS VFX team created a partial rig consisting of the wrist and the hand down to the finger base. This was tracked onto the stunt performer in the plate. We often had to tweak the wrist performance to create the illusion of a bodiless hand.”

“Realizing Thing was more challenging than it looks,” explains Kayden Anderson, Rocket Science Visual Effects Producer. “For many of the complex shots, we had to troubleshoot to devise unique solutions. Maintaining the actor’s hand performance and recapturing the environment with fidelity was not a one-size-fits-all scenario.”

MARZ Visual Effects Supervisor Ed Englander also worked on the series. “We did a couple of sequences of shots with Uncle Fester, when he first runs into Wednesday out in the forest and she tries to draw a sword on him and he shocks her,” Englander remarks. “There were a couple of shots when Fester is resuscitating Thing, and we did some of the electricity effects there as well. One of the larger shots we had to do was an environment shot, which was an aerial drone shot of Jericho. It was a wide shot of the whole town and you can see across the river. We had to fill in the backsides of many of the buildings on the main street because outside of this one shot, you never see the whole town. It was all standard fronts and facades, and we had to build out the backs of those and do a bit of aesthetic modification on some of the edges of the town. There were houses in rural residential areas that extended out past the main street which we had to add as well. They wanted it to feel like one of those fall postcards for anywhere out in Vermont. I believe the entire production was filmed in Romania, but we had a lot of reference footage, and you can’t do a Google search of Vermont without autumn trees showing up everywhere. There were plenty of good bits of material to flesh that out.”

TOP TO BOTTOM: Rachel Weisz portrays twin sisters in Dead Ringers, based on David Cronenberg’s 1988 horror film. (Image courtesy of Prime Video)
The Last of Us masterfully brings the video game to life. (Photo: Liane Hentscher. Image courtesy of HBO)
Angus Bickerton and his visual effects team picked up a nomination at the 21st VES Awards for their work on House of the Dragon. (Image courtesy of HBO)

Another strong contender is the fantasy drama series The Sandman. Based on Neil Gaiman’s original comic book, The Sandman follows Dream as he embarks on a journey after years of imprisonment in order to regain his power. “The team and I at Union VFX worked on the show for over a year and delivered over 300 shots spread across all 11 episodes of Season 1,” explains Visual Effects Supervisor Dillan Nicholls. “Bringing this rich, complex and much-loved series of comics to the screen was always going to be a challenge and open to interpretation. We read the comics and tried to immerse ourselves in the world of The Sandman as much as possible in advance of joining the project. Initial discussions focused on some key sequences and early concepts from the Warner Bros. team, establishing what kind of aesthetic we were looking for and the level of realism of different sequences as the more abstract, surreal world of ‘the dreaming’ crosses over into the real world.”

“Some of the work had a very open brief, particularly for sequences taking place in the world of dreams, or ‘the dreaming,’” Nicholls adds. “We were encouraged to be creative, using reference from the comics as a starting point, but with freedom to experiment with different techniques and aesthetics to achieve surreal and abstract results. On the other hand, some of the work was very much grounded in reality and required fairly traditional ‘invisible’ effects work such as set extensions and greenscreens. Often in the show, the two worlds of the dreaming and the waking meet and we would create fairly traditional, realistic-looking VFX shots, but with something not quite right, a surreal twist.”

TOP TO BOTTOM: Chris Sumpter as Jake in Episode 104 of The Midnight Club. (Photo: Eike Schroter. Images courtesy of Netflix)
Chloë Grace Moretz in The Peripheral, based on a William Gibson novel. (Photo: Sophie Mutevelian. Image courtesy of Prime Video)
Paz Vega as Ava Mercer, left, and Giancarlo Esposito as Leo Pap in the “White” episode of Kaleidoscope. (Photo: David Scott Holloway. Image courtesy of Netflix)

Discussing the most challenging effect to realize on the series, Nicholls notes that the views across the Thames by the tavern in Episode 6 were shot against greenscreens and required extensive full CG/DMP environments and a full CG tavern. “These scenes took place in the ‘real world’ in London but across dramatically different time periods, spanning 600 years, and required a lot of research as well as an evolving CG build of the tavern through the ages. We needed to maintain a balance between making it recognizably the same location – a crucial story point – while also showing that area of London evolving over hundreds of years, from green fields to present-day Canary Wharf,” he says.

Five Days at Memorial won Outstanding Supporting Visual Effects in a Photoreal Episode at this year’s VES Awards. “In total, UPP delivered 265 shots for this project,” says Visual Effects Supervisor Viktor Muller. “As on similar projects of this kind, it was desirable that the visual effects weren’t apparent at first sight. If a shot revealed itself as a VFX one, it would, in fact, be incorrectly done. For us, the biggest challenge was creating the New Orleans Superdome, where we were depicting the battle with the hurricane. The most demanding aspect of this shot was figuring out how to approach it so that it looked as real as possible but simultaneously remained visually attractive and interesting for the audience. In other words, since hardly anything would have been visible on the basis of real references, we needed to find the balance between making sure that the viewers were able to see something and keeping the shot realistic.”

Set nearly 200 years before the events of Game of Thrones, House of the Dragon depicts the events leading up to the Dance of the Dragons. Visual Effects Supervisor Angus Bickerton and his visual effects team picked up a nomination at the 21st VES Awards for their work on “The Black Queen” episode. By contrast, Vikings: Valhalla, in its second season, is set 100 years after the events of Vikings. Valhalla was nominated at last year’s awards for Outstanding Visual Effects in a Single Episode for “Bridge,” with the original series having been awarded Outstanding Special Visual Effects in a Supporting Role at the 72nd Primetime Emmy Awards. Valhalla could find itself among the nominees once again after a strong start to the second season.

New to the mix is the post-apocalyptic drama The Last of Us, which masterfully brings the video game to life. Adapted by the game’s creator, Neil Druckmann, and Chernobyl creator Craig Mazin, The Last of Us follows the hardened, middle-aged Joel, who is tasked with escorting 14-year-old Ellie across a treacherous and barren America in what may be the final hope for the survival of humanity. Another potential newcomer in the running for a nomination is Amazon Prime’s Dead Ringers. Based on David Cronenberg’s 1988 horror classic, Dead Ringers is centered around twin gynaecologists (portrayed by Rachel Weisz) in a gender-flipped version of the original film.

When the winners of the Primetime Emmy Awards are announced September 18, it will certainly be a close race. Nonetheless, the visual effects work over the course of the last year has been nothing short of remarkable. Visual effects have played a central role in some of the biggest series released this year, and each of the mentioned series can be extremely proud of its groundbreaking and beautifully crafted work.

TOP TO BOTTOM: The boom age of the 1880s in New York is stylishly captured in The Gilded Age. (Photo: Alison Cohen Rosa. Image courtesy of HBO)
The Rings of Power took home three awards at the 2023 VES Awards. (Photo: Ben Rothstein. Image courtesy of Prime Video)
One of the biggest challenges for Rocket Science VFX on Wednesday was seamlessly integrating Thing into each shot. (Image courtesy of Netflix)
The Sandman is a strong contender for a nomination at this year’s Emmys. (Image courtesy of Netflix)

A SHARED SURGE: PRODUCTION WORK MOVES AROUND THE WORLD

The VFX boom goes on, and it is a shared surge. There has been continued growth in the visual effects business, with vendors from many different countries working together on movies and series. And often the individual VFX companies themselves have multiple facilities located around the world, including ILM, Wētā FX, DNEG, BOT VFX, Technicolor Creative Services, Streamland Media, Pixomondo, Digital Domain, Outpost VFX and Framestore. The collaborations range across North America, Europe, Australia, New Zealand and South and East Asia, with more participation on the horizon from emergent visual effects houses in Latin America and Africa as well. Consequently, great opportunities and new challenges have emerged with the increasing cooperation between geographically distant studios.

TOP: Method Studios and MPC did some heavy lifting on Top Gun: Maverick, assisted by Lola VFX, BLIND LTD, Intelligent Species and Gentle Giant Studios. (Image courtesy of Paramount Pictures and ViacomCBS Inc.)

OPPOSITE, TOP TO BOTTOM: The Indian Hindi-language fantasy-adventure film Brahmāstra: Part One – Shiva is an example of an “in-house two-vendor” solution at scale. In this case, multiple facilities of sister companies ReDefine and DNEG delivered over 4,000 VFX shots for the epic film. (Image courtesy of DNEG and Dharma Productions)

Framestore was the lead VFX vendor on Fantastic Beasts: The Secrets of Dumbledore and was joined by Rodeo FX, Digital Domain, Image Engine, One of Us, Raynault VFX, Clear Angle Studios and RISE Visual Effects Studios. (Image courtesy of Warner Bros. Pictures)

Wētā Digital and ILM led the way with The Lord of the Rings: The Rings of Power, collaborating with Rodeo FX, Cause and FX, Method Studios, DNEG, Outpost VFX, The Third Floor, Rising Sun Pictures, Atomic Arts and Cantina Creative. (Image courtesy of Amazon Studios)

Says Jeanie King, ILM’s Vice President of Production, “The entertainment industry is far more globalized than it has ever been in the past, and it is to everyone’s advantage to spread the work. It gives us all more capacity to get all the work done. Also, we are able to access more talent in different regions, which benefits everyone. Effects vendors and clients alike need to be more flexible and organized now because everyone is spread [over] more time zones.”

King adds, “Due to the high volume of projects requiring VFX throughout the industry, studios have had to spread the work around. Many projects have had 10 to 20 VFX studios involved. Vendors need to have a diversified portfolio, as do the studios/clients, in order to protect their deliveries.”

“On Marvel Studios’ Doctor Strange in the Multiverse of Madness, we worked with multiple vendors sharing assets, environments and FX in order to complete the work on shared sequences. This was also true on Marvel Studios’ Black Panther: Wakanda Forever,” remarks King, who adds, “Most recently, on Avatar: The Way of Water, ILM collaborated with Wētā FX, which had created the majority of assets for the show. So, we were ingesting and manipulating all of that data into our pipeline so we could work efficiently on the sequences we were contracted to create.”

On The Lord of the Rings: The Rings of Power, ILM and Wētā also collaborated with a stellar group of VFX studios that included Rodeo FX, Method Studios, DNEG, Outpost VFX and Rising Sun Pictures. King sees the movement of the work around the globe as an advantage. “The sharing of work between companies has become easier due to increased standardization in process and technology,” King comments. “Companies have had to evolve as more of the studios engage multiple vendors on one project. Vendors are partnering with each other more than ever before. We all want to get the work done as efficiently as possible. It’s a competitive market out there, and each vendor wants to make sure they are working as productively as possible.”

Using multiple vendors is about resource availability and identifying the right artists for the right work, according to Patrick Davenport, President of Ghost VFX. “We’re very much a global industry now.” He feels that the increasing spread of VFX work around the planet has been an inevitable process. “The industry has been headed in this direction for some time. The work follows artist resources, tax incentives, etc. And now with work from home and hybrid models, it allows artists to work anywhere, anytime.”

Davenport continues, “From a client perspective, it’s about getting the work done on time, within budget, and mitigating the risk of having all your work with one sole vendor, which may end up struggling to deliver. From our perspective, having a global studio allows us to operate 24/7 and identify talent in different locations best suited to the work.”

A scenario of multiple VFX houses on bigger shows “can enhance important creative factors that come from having different supervisors and production teams with different strengths and specialisms overseeing different sequences on a show,” affirms Rohan Desai, Managing Director of ReDefine, part of the DNEG group.

There can also be downsides. Desai explains, “It can add overhead in terms of managing multiple vendors. Asset sharing is also a hurdle with increased costs. This can create additional work for vendors when they are required to use assets made by other vendors as there can be duplication of effort. Finally, consistency in look can be an additional challenge as each show has a certain aesthetic, and this needs to be matched by all vendors. Different teams may approach aesthetics differently and this can result in inconsistencies.”

To achieve a high standard across all sequences and studios, “I have found that communication and collaboration is the best way to make this work,” comments Christian Manz, Framestore’s Creative Director, Film. “I have had as many as five companies working on a shot/sequence in the past, and in that instance I brought all of the key supervisors together to discuss approach and kept them communicating with each other. I also find that by showing early WIP as soon as possible to the filmmakers and [having] frequent reviews keep the work on track to a fantastic, consistent final result.”

Some recent multi-vendor projects led by Framestore include Fantastic Beasts: The Secrets of Dumbledore (for which Manz served as Production VFX Supervisor), His Dark Materials Season 3 and 1899 (these were almost entirely Framestore, but involved multi-site work across the firm’s London, Montreal, Mumbai, Vancouver and New York studios), Top Gun: Maverick (completed as Method Studios, now part of Framestore) and The Wheel of Time Season 2.

Jabbar Raisani was a VFX Supervisor for the Stranger Things Season 4 series, which he says utilized over two dozen studios from around the world. Raisani comments, “Stranger Things S4 was challenging due to the high shot count, high complexity and short delivery schedule. We made it work by spreading the work over numerous vendors in order to maximize throughput.” The VFX studios involved included Rodeo FX, Important Looking Pirates (ILP), Digital Domain, DNEG, Lola VFX, Crafty Apes and Scanline VFX, among others.

To help studios communicate, says Raisani, “One piece of software we used extensively was SyncSketch, which allowed us to visually communicate notes to vendors in an interactive setting. SyncSketch uses cloud-based technology, and our VFX team collaborated daily using shared Google documents. We also used Evercast for daily remote VFX reviews and PacPost.live for editorial reviews.”

TOP TO BOTTOM: Ghost VFX was the main VFX studio for Troll, a fantasy tale of a giant troll terrorizing contemporary Norway, and was joined by Copenhagen Visual, Swiss International, Shortcut VFX, Gimpville and Static VFX Studio. (Image courtesy of Motion Blur and Netflix)
Doctor Strange’s (Benedict Cumberbatch) magic was conjured up with the help of multiple VFX studios. (Image courtesy of Marvel Studios and Walt Disney Pictures)
The planet Vulcan in Star Trek: Strange New Worlds. Unreal Engine and Arnold Rendering are popular tools for ensuring smooth collaboration between vendors. (Image courtesy of CBS Studios, Inc.)

The sharing of assets between VFX studios is often a challenge. “I think it is getting better, but it is still challenging,” comments Niklas Jacobson, VFX Supervisor and Co-Founder of ILP. “Different companies have their own toolsets and workflows, and many companies use proprietary tools. In some cases, there could also be licensing issues due to the use of third-party texture or model services.”

Jacobson notes, “Even with different toolsets, there are some generally accepted techniques, for texture channels and shaders in particular, that all the big vendors converge towards. With the development of standards like USD and MaterialX, and as they become widely adopted, hopefully sharing will be easier.”

Sharing work can also be quite challenging from the bidding side. Explains Jacobson, “Clients and vendors will want to be as efficient as they can, but in particular sharing hero assets means that one vendor will generally not be able to anticipate the requirements of an asset across other vendors’ sequences. This tends to leave both parties guessing a bit at the planning stage and definitely requires some out-of-the-box thinking.”

ILP has had positive experiences working with its peers and colleagues. Jacobson observes, “We try our best to be good creative partners with everyone we work with, and it’s satisfying to feel that the vast majority of vendors we work with tend to have this stance as well. Stranger Things S4 is a great example – the season finale was such a massive episode and had to be split between vendors. We did a sequence featuring the Demogorgon, a creature which was created by Rodeo and even featured in an earlier episode in the same environment and lighting conditions. Our collaboration with Rodeo was very smooth, and we had no trouble ingesting their creature into our pipeline.”

Pixomondo and various other vendors worked together on the Star Trek: Strange New Worlds series. Pixomondo Virtual Production and Visual Effects Supervisor Nathaniel Larouche notes, “Vendors have been able to coordinate their efforts through the use of a variety of software and hardware solutions. At the top of this list is Unreal Engine, an industry-leading 3D game engine with powerful tools for creating realistic, high-fidelity 3D environments. As well as enabling vendors to create detailed and realistic scenes, Unreal also features tools which allow vendors to quickly and easily collaborate on a project in real time. Furthermore, Unreal offers cross-platform compatibility across PC and Mac.”

Continues Larouche, “In addition to Unreal Engine, Arnold Rendering is another popular software option among vendors. By using Arnold’s advanced ray-tracing capabilities, 3D artists can achieve incredibly lifelike images without having to spend much time tweaking lighting and textures. Additionally, Arnold supports common image processing formats such as OpenEXR and HDR (High Dynamic Range) formats which allow for greater flexibility when it comes to sharing shots between vendors.”

In addition, hardware also plays an important role in helping vendors coordinate their efforts, according to Larouche. “For example,” he says, “computers that are powerful enough to run both Unreal Engine and Arnold Rendering are essential for ensuring smooth collaboration between different teams working on the same project. High-end graphics cards are also beneficial in providing faster rendering times, which can help reduce wasted time spent waiting for updates from other parties involved in the project.”

From the Willow series on Disney+. ILM and Hybride supplied the visual effects along with ILP, Image Engine, Luma Pictures, SSVFX, Creative Outpost, Misc Studios, Midas VFX, Ombrium and The Third Floor. (Image courtesy of Disney+)

TOP TO BOTTOM: The visual effects in Black Panther: Wakanda Forever were handled by ILM, Cinesite, RISE Visual Effects Studios, Digital Domain, Wētā FX, Storm Studios, Whiskytree, Scanline VFX, Barnstorm VFX, Mammal Studios, SDFX Studios, Territory Studio, Clear Angle Studios, PixStone Images, SSVFX and Luma Pictures. (Image courtesy of Marvel Studios and Walt Disney Pictures)
The dreaded Demogorgon for Season 4 of Stranger Things. Among the many VFX providers were Rodeo FX, ILP, Digital Domain, DNEG, Lola VFX, Crafty Apes, Scanline VFX, BOT VFX, Clear Angle Studios, Rogue One VFX, The Resistance VFX, Cadence Effects, Jellyfish Pictures, Alchemy 24 and FutureWorks Media. (Image courtesy of Netflix)

The Cloud is playing an increasing role in helping VFX studios interact. “The Cloud has had a significant impact on VFX so far; it enables storage and backup of assets in a secure environment that can be accessed by multiple users at any time,” Larouche explains. “It also allows for real-time sharing of files and data between members of a production team, allowing them to work more quickly and efficiently.”

The Cloud offers other advantages as well. Larouche comments, “Cloud-based solutions provide scalability options tailored to specific teams or businesses’ needs. For example, if a vendor’s workload grows unexpectedly over time, they can easily scale up their storage on the Cloud without needing to purchase additional hardware or software licenses.”

Continues Larouche, “Recently emerging technologies in the field are further enhancing our collaborative capabilities when it comes to virtual production pipelines. In particular, cloud-based platforms such as Shotgun allow studios to track progress across all departments in real-time while providing customizable tools like asset management and review functionality that help streamline processes, while maintaining complete visibility into projects at all times. This allows vendors greater control over their workflow while cutting down costs associated with manual processes that tend to slow down progress significantly when attempting larger productions spanning various remote locations.”

States ILM’s King, “Even though pipelines may be different from company to company – because we have had these conversations over the years and work has passed back and forth – we have developed in-house tools to make it easier. Also, due to the remote work situation that many VFX studios are still working in, more people are able to connect via video conferencing, which makes the entire process more efficient, productive and also personal.”

Clients are more open now to vendors talking between themselves than ever before. Comments King, “Because we have been sharing work over the years, conversations have taken place and processes have developed to make ingesting work easier. Also, relationships have formed. Supervisors and artists have worked together at the same facilities and friendships are made, which makes it easier to discuss workflows and have much more helpful, in-depth conversations.”

TOP: For Doctor Strange in the Multiverse of Madness, ILM worked with Wētā Digital, The Third Floor, Luma Pictures, Trixter, Crafty Apes, Digital Domain, Framestore, Sony Pictures Imageworks, Clear Angle Studios and Spin VFX. (Image courtesy of Marvel Studios and Walt Disney Pictures) BOTTOM: Pixomondo paved the way with the VFX for Star Trek: Strange New Worlds and has worked on the show with Crafty Apes, Ghost VFX, FX3X (Cinesite), Vineyard VFX, Boxel Studio, Barnstorm VFX and Storm Studios. (Image courtesy of Pixomondo and CBS Studios, Inc.)
“The sharing of work between companies has become easier due to increased standardization in process and technology. Companies have had to evolve as more of the studios engage multiple vendors on one project. Vendors are partnering with each other more than ever before. ...
It’s a competitive market out there, and each vendor wants to make sure they are working as productively as possible.”
—Jeanie King, Vice President of Production, ILM

HIGH-END EPISODIC VFX ON A GLOBAL SCALE

TOP: Gulliver Studios was responsible for extending more than 10 environments which included six unique game settings for Squid Game. (Image courtesy of Netflix)

OPPOSITE TOP TO BOTTOM: No practical location could be found for the Moon, so a set and LED wall were utilized for The Silent Sea. (Image courtesy of Netflix)

The communication tower that team leader Han Yoon-jae (Gong Yoo) climbs and falls from had to be extended in CG. (Image courtesy of Netflix)

Practical gimbals were combined with CG augmentation provided by Westworld to get the required interaction for The Silent Sea. (Image courtesy of Netflix)

Breaking out in a big way to showcase the digital artistry of the visual effects industry in South Korea was the Netflix series Squid Game, which won the 2022 Primetime Emmy for Outstanding Special Visual Effects in a Single Episode and received a VES Awards nomination for Outstanding Supporting Visual Effects in a Photoreal Episode for Episode 107, titled “VIPS.” “The Korean visual effects industry has evolved drastically in the past 10 years from simple comp tasks to complex shots, which involve various 3D skills and technologies,” states Moon Jung Kang, VFX Supervisor at Gulliver Studios. “Previously, Korean filmmakers tried to avoid too much digital augmentation because they were not sure of Korean visual effects quality. But as Korean content has gained global popularity, filmmakers started trying more diverse genres, and the Korean visual effects industry also evolved. These days, it’s easy to find heavy visual effects shots in Korean content.”

Kang was a member of the Primetime Emmy award-winning team for Squid Game, as Gulliver Studios was the sole vendor. “We had about 2,000 shots [for nine episodes], and the post-production period was about nine months.” Squid Game deals with human and social problems through classic Korean children’s games. Remarks Kang, “In the earlier stage, there was a concern that these games were unique to Korea and the gameplay was too simple. In general, visual effects environment work is mostly focused on realism, but in the case of Squid Game it was necessary to express the feeling of a realistic, yet artificially created set at the same time.” Most of the visual research was centered around environments. “In most cases, we research through images and clips from the real world, but in Squid Game we extended our research to classic paintings and illustrations. For the maze environment with a bunch of stairways, whose basic concept was surrealism, we referenced classic surrealist paintings and illustrations a lot during the asset and layout process,” adds Kang.

“In Squid Game, we had more than 10 environment extensions, not only the six unique game environments, but also a cave and an airport,” Kang notes. “Some game environments, such as the marbles game town, the playground and the dormitory, were large builds, and we extended mostly walls and ceilings. But the rest of the environments, such as the schoolyard, tug-of-war tower, circus tent and maze ways, were heavily extended and reconstructed in CG. The circus tent environment included three different set locations, the main glass bridge, VIP room and floor area, and we had to combine them all into one and seamlessly connect plates which were shot under different lighting conditions. Furthermore, the director wanted the circus tent lighting to be very dark overall with a hot spotlight on the bridge, but we had plates with a huge fill light above the set. Interestingly, the sequence which gave us the hardest time brought us the honor of winning the Emmy Award.”

Every sequence had its own complex elements. “The piggy bank shots were the hardest for us to execute,” Kang reveals. “It wasn’t because of shot solution complexity, but about getting the exact look of the piggy bank that the director wanted. It wasn’t just about making the material look realistic; we also had to emphasize the falling money inside the piggy bank.” The schoolyard sequence in Episode 101 is a personal favorite of Kang’s. “The schoolyard environment was the first space to show the boundary between real and fake space, and the practical set was mostly filled with blue matte. We didn’t have a final concept image, and whether the schoolyard should be treated as an indoor or outdoor space hadn’t been decided at that time. Early in the process, we looked through various options and suggested an indoor space with a big opening on top and walls with a hand-painted touch. The key concept of Squid Game’s environments is the mix of real and fake space.”

Another major Korean Netflix series is The Silent Sea, which originated as a short film called The Sea of Tranquility by Choi Hang-yong, who subsequently expanded the sci-fi concept into eight episodes starring Bae Doona, Gong Yoo, Lee Joon, Kim Sun-young and Lee Moo-saeng. Key sequences had storyboards and previs. “They were used for planning the crash landing in Episode 101, Yunjae falling in Episode 103 and the appearance of Luna in Episode 105,” states Kim Shin-chul, VFX Supervisor at Westworld. “We used techviz for the actual filming, to make a filming plan, and used Ncam [virtual production] in the filming of the elevator fall scene while checking the appearance of the Balhae Lunar Research Station through a monitor.” A creative challenge was to convincingly convey what the future will look like. Notes Kim, “It was fun to create the propellant that is in charge of fuel and how the lander docks with it. The structure of the docking station was created with the idea that it would be easy for passengers and astronauts to board for the Moon in an era of relatively easy travel. For the part of walking on the Moon, it was difficult to interpret the reference material and the actual appearance in a cinematic way.”

TOP TO BOTTOM: Oni: Thunder God’s Tale was to be stop-motion animation but in the end became CG that emulated the characteristics of the original intent. (Image courtesy of Netflix) Megalis VFX constructed an asset pipeline that consisted of Solaris, USD and Arnold to handle the work required for Oni: Thunder God’s Tale. (Image courtesy of Netflix) The characters in Oni: Thunder God’s Tale were meant to feel as if they were made out of felt. (Image courtesy of Netflix)

No practical location could be found for the Moon. “A set and an LED wall were used,” Kim remarks. “The powerful directional sunlight had to be implemented with only a limited number of lights and without atmospheric scattering, but the position of the light could not be changed for each camera setup, so an LED wall was used to cover all angles. And the terrain was filmed by changing only the position of the actor in one setup, using the characteristic rock. Since LED wall shooting is still unfamiliar, we did R&D and tested it together.” The monochrome starkness of the lunar environment contrasts with the dystopian Earth. Adds Kim, “Due to temperature changes, the topography of the sea level changes, and yellow dust and green algae are frequent. Maritime workers lost their jobs and abandoned their fishing tools and boats. In a previous concept, we tried to show a corroded image of landmarks around the world, but it was deleted because it was judged to be an excessive expression of other cultures.”

Getting the proper performance for the reveal of Luna was critical. “The director thought a lot about the Luna character as somewhere between an animal and an innocent child,” Kim explains. “To get the desired movement, a digital double was used for her first appearance. After that, it could be more relaxed, and the director let the actor act as much as possible. It seems that the writer paid attention to the conflict between characters and the narrative rather than the visual.” Many concepts were discussed for the Balhae Lunar Research Station. Notes Kim, “We decided on the location and width of the passages in detail based on the movements of the characters.” Yunjae falls while attempting to repair the communication tower. “In pre-production,” Kim adds, “we decided on the movement line with the action team through previs. However, it had to be expressed as a descent from a height of more than 35 meters, while the actual tower set was only about five meters tall. In order to set the camera angle, Ncam was used to film while the background, made in advance, was viewed on the monitor in real-time.” A dramatic moment is the flooding and destruction of the Balhae Lunar Research Station. “With the concept of the base being destroyed by the nature of water increasing indefinitely, the water was expressed with effects simulations and hallway miniatures, and the base was destroyed by freezing all of the outlets,” Kim says.

TOP: Graphical elements such as numbers were incorporated into the imagery to show the contestants realizing the type of game they are playing in Alice in Borderland. (Image courtesy of Megalis VFX)

MIDDLE AND BOTTOM: Not many public images exist of the Japanese Supreme Court, which made it a creative challenge for Megalis VFX to envision the interior of the building for Alice in Borderland. (Images courtesy of Megalis VFX)

SECOND FROM TOP: To create the impression that Tokyo is deserted, the famous Shibuya scramble intersection was recreated as part of a massive open set at Ashikaga City for Alice in Borderland. (Image courtesy of Megalis VFX and Netflix)

BOTTOM TWO: In order to better control the environment, the drive scenes for Babylon Berlin were shot against greenscreen with plates composited in later. (Images courtesy of RISE)

Japan has a small visual effects industry that does the vast majority of the digital augmentation on domestic projects. “You don’t see many companies here doing visual effects,” notes Jeffrey Dillinger, Head of CG at Megalis VFX. “In Japan, they haven’t pushed visual effects studios to get the quality of work like Wētā FX because they haven’t had projects go overseas. They are fine with a stylized approach.” A lot of the initial work that the company got to do was effects simulations. Remarks Dillinger, “Oni: Thunder God’s Tale was our first full project. It was initially going to be stop-motion animation and we were going to do CG enhancements, but eventually they decided to do full CG because it would have taken years.”

Character designs had to be translated from 2D to 3D for the Netflix limited series, which consists of four episodes. “The most important thing we had to establish was the asset pipeline [Solaris, USD, Arnold] because we didn’t have one. We ended up doing more than 2,000 assets,” Dillinger adds. Most of the challenges were technical. “When we first decided to use Arnold and Solaris, you couldn’t render hair, and the most important thing for our characters is the hair. Our protagonist has an Afro, and the characters are meant to be made out of felt, which you can’t accomplish without having a layer of fuzz. We were able to achieve quite a tactile feel where you can almost reach into the screen and touch these characters at times,” Dillinger describes.

Getting to contribute to Season 2 of Alice in Borderland was an exciting opportunity for Megalis VFX. “It’s a good project. Season 1 was fun to watch, and you could tell creatively the director and DP were good,” Dillinger remarks. “In each episode there are these games that happen. In the case of Episode 206, if people lose the game, acid falls on them and they melt. On set, instead of acid, they used water and smoke, which makes sense, but when water hits somebody it behaves differently than acid. Usually, acid has more of a yellowish tint to it, so they’re trying to give it a little bit of that without going over the top. In a lot of those cases, it was a static camera and body, so it was a lot of 2D work to put on top of the skin.” Alice in Borderland does not hold back on the blood and gore. “On set, they actually built a maquette of what a human looks like after acid has been dropped on them. We didn’t use it per se, but it ended up being concept art for us,” Dillinger notes. The game takes place at the Japanese Supreme Court. “Apparently, you cannot shoot there, and on Google we only found three photos that show the interior. It’s an artistically interesting building and has these concentric circles that rise up like an upside-down funnel. We spent a lot of time trying to stay true for those who might have seen it in person. There are also a lot of graphical elements, like numbers in the air, which is a visual, stylistic way of showing the contestants realizing the type of game that they’re playing.”

TOP: For Alice in Borderland, on-set water and smoke were used and subsequently replaced with CG acid, which involved a lot of 2D work to place over the skin. (Image courtesy of Megalis VFX)

Entering its fourth season on Netflix is Babylon Berlin, which has visual effects produced by RISE Visual Effects Studios. “No German series before had featured so many visual effects shots,” observes Robert Pinnow, Managing Director and VFX Supervisor at RISE Visual Effects Studios. “In the first season, we delivered over 830 and worked on over 900, which at that point was an insane amount. There were approximately 50 full CG shots. The way that we suggested using visual effects was new here, though quite common in the U.S. market, which was, ‘Don’t solve it conventionally by putting something there and still not have the right image. Just shoot it; we’re going to roto and exchange the whole background rather than just that little antenna.’ The freedom they had was new for German visual effects.” Quite a few Berlin landmarks make an appearance. Comments Pinnow, “We rebuilt the Alexanderplatz correctly because they were insisting on shooting there, but it has a tiled ground that didn’t exist back in the 1930s. The residential areas are made up. Some of it got shot in the city, especially things like the railway station, and other parts got shot on the backlot at Babelsberg Studio.” For a full year, the entire production office was housed in a former federal building that was due to be renovated. “Shooting took place on one level within that building for all of the interior sets and the police department. For Season 2, the production office had to move to another one, and now they’re in a former school.”

Every episode is directed by creators Henk Handloegten, Achim von Borries and Tom Tykwer. “One was shooting in one location, no matter what episode it ended up in, and everyone else had to follow it,” Pinnow explains. “Sometimes, someone had 50% of an episode and, on another, 10%. The three of them were doing it together.” The trio had to approve the visual effects work. “That was interesting, indeed,” Pinnow observes. “One was like, ‘It looks good.’ The other one was like, ‘It doesn’t fit my needs. It could be this or that way.’ And the third one, Tom, was straightforward and had everything in context.” During the early days of Babylon Berlin, the decision was made to support all of the driving sequences with digital backgrounds. Describes Pinnow, “They could drive the whole city day and night. The background plates in the streets wouldn’t have been usable anyway because of modern elements. The buildings we made for that were the key to designing the streets that served as backgrounds for normal shots. That then became part of the concept process. We delivered a turntable of every house. In this way, they could grab one frame that was in the right perspective, put it together in Photoshop and send it back to us. On Season 2, we did it ourselves because they trusted us. There was not much concept art unless it was something specific.”

It is important that visual effects are applied constructively. Concludes Pinnow, “There were a few shots where one of the directors said, ‘If we had known how good they looked, we would have used more of them.’ And the other one said, ‘I like that the good and great shots in this sequence are a side effect.’ You need the visual effects to show the sequence, but it’s not on the eye. It’s helping the storytelling.”

TOP: Alexanderplatz was rebuilt by RISE for Babylon Berlin because the tiled ground did not exist back in the 1930s. (Image courtesy of RISE) BOTTOM THREE: The piggy bank shots for Squid Game were the hardest to execute for Gulliver Studios as the director had a specific look in mind. (Images courtesy of Gulliver Studios)

At the age of seven, Dennis Muren, VES was taken by his mother to see The Beast from 20,000 Fathoms and The War of the Worlds, and was subsequently driven to create cinematic spectacles that did not exist in the real world of La Cañada, California, where he grew up. In an interesting career twist, the awe-inspired child would go on to remake the H.G. Wells alien invasion classic as Steven Spielberg’s ode to the 9/11 attacks, and in the process added to his tally of 13 Academy Award nominations, which includes six Oscar wins as well as two Special Achievement Awards and one Technical Achievement Award. Even though essentially retired after a half century in the visual effects industry, Muren is a Consulting Creative Director at Industrial Light & Magic and on the Advisory Board of VFX Voice, and, more importantly, has retained his childhood fascination with creating believable, narrative-driven effects.

What started with taking still photographs of toy dinosaurs after seeing King Kong graduated into motion pictures captured on 8mm and 16mm film stock, where the teenager would attempt to recreate his favorite movie moments. “It could be a copy of The 7th Voyage of Sinbad,” Muren recalls. “I wanted to bring those experiences back home. It was interesting seeing The Fabelmans because I had forgotten about this completely, [but] the only way we could see it was, you had to find a dark place in your house to look at it. Steven had a closet, which he shows in The Fabelmans. I had a hallway with doors, and you could close all of the doors and you were in the dark. It’s amazing that we all have this shared experience. Steven had never talked about that.”

DENNIS MUREN, VES PUSHES THE EMOTIONAL BUTTONS OF VISUAL EFFECTS

Images courtesy of Dennis Muren and Lucasfilm Ltd.

TOP: Dennis Muren, VES

OPPOSITE LEFT TO RIGHT: Visual Effects Art Director Joe Johnston, Special Visual Effects Supervisor Richard Edlund, VES and Dennis Muren, VES have a discussion with filmmaker Irvin Kershner about The Empire Strikes Back, with concept art for Cloud City in the background.

Phil Tippett, VES and Muren on a go-motion set for Dragonslayer, with the technique winning a Technical Achievement Award at the 54th Academy Awards in 1982.

Muren was responsible for the digital character supervision on Casper (1995).

Muren is unfazed by an AT-AT Walker appearing in front of him.

When tasked to incorporate a tauntaun into the aerial photography for The Empire Strikes Back, Muren realized that there was never only one solution.

To achieve the dynamic speeder bike chase in Return of the Jedi, Muren insisted that the plate photography be shot by a Steadicam in an actual forest.

It was inevitable that The Equinox would get made. “I had completed my first year of college and didn’t want to make another short film during the summer vacation. I just wanted to make a feature film,” Muren recounts. “I realized that Ray Harryhausen’s films are like five sequences that are effects, and he finds a writer to fill it all in. I had my friend Dave Allen, a stop-motion guy, and Jim Danforth, who could do some stuff. Then I did my stuff. We came up with three sequences that I wanted to do – one stop motion, one forced perspective – and Dave Allen had this puppet we could use. Then we find some actors. Simple. I had $3,500 that my grandfather had saved up for me to go to USC; however, my grades weren’t good enough to go there. I totaled it all up. Shooting 16mm on my Bolex, we did it the French way, which is silent and incredibly cheap, because then you put the sound in later, and that cuts the price down to 10% of what it would have been. I knew it would work and was actually surprised that we sold the movie.”

Getting work in visual effects was not an easy task, as the industry was still in its infancy and opportunities were scarce. “I gave up at one point and was going to be an inhalation therapist, which was something I figured I could do,” Muren reveals. “There were no effects movies being made. I got a little work at a union house called Cascade doing commercials for the Pillsbury Doughboy, Jolly Green Giant and Alka-Seltzer. Phil Tippett, VES, John Berg, Dave Allen, Jim Danforth and I were there at that time. It’s important to find your people who like the same things, and you learn from each other. Most of us are still really good friends.” Muren pitched himself as a visual effects cameraman. “I never thought of myself as a cameraman, but I could always look at movies and ask, ‘Why is it shot this way? Why didn’t they fix that? It’s obviously wrong.’ If I was doing this, I would make sure that I was the cameraman because he’s the guy with the button, and if it’s not right, I’m not pushing it. Which isn’t really the way it was, but it kind of has been that way.”

Everything changed when filmmaker George Lucas established Industrial Light & Magic to produce the visual effects for his space opera Star Wars. “I heard that George was doing some sort of effects film, but I didn’t know any of the people working on it or what it was about,” Muren remarks. “I said, ‘I have to get on this show because I want to understand what this stuff is and how it works.’ I got the number of [ILM Co-Founder/Visual Effects Artist] Robert Blalack, called him up and had an interview. [VFX/SFX Supervisor] John Dykstra thought that my understanding of stop motion would be applicable to the slow-camera motion-control stuff that he envisioned for Star Wars, where a motor would take a minute to go through motions, and you would speed it up and slow it down, and when you played it back over four seconds it would look like it was flying accurately. He was right. I could give personality to the ships by having them bank and skid, which I thought added a fun factor to the movie.”

Close Encounters of the Third Kind and the end sequence with the mothership was the next project. “I loved working with George and wanted to see what Steven Spielberg was like, and I admired Douglas Trumbull, VES,” Muren remarks. “I made sure that the shots got finished in those five months, and they all worked, looked good and matched the others. We were dealing with light, smoke and mist instead of hardware, speed and energy on Star Wars. Both sides of my brain got boosted, in a period of a little more than a year, to a new way of looking at things.”

An important lesson was learned when Lucas asked him to insert a tauntaun into one of the opening helicopter shots in The Empire Strikes Back. “I said, ‘There is no way to track those moves because we don’t have any recording of it. It’s all white down there. It’s going to look fake. We should build this as a big model.’ George said, ‘Just think about it.’ And he walked out. Within 15 minutes I had figured out how to do it. I learned so much from that moment. There are many ways you can do everything, but there are usually a few ways that are the best, and something only gets you 85% there. Then there is another way to tweak that for the 15% to make it look like it’s 100%.”

For Muren, the real breakthrough for digital effects was Terminator 2: Judgment Day. “CG was something I’d been looking into at ILM since 1983 or 1984. It was always a puzzle and a possibility. We did it, but it was always expensive. It wasn’t until Terminator 2: Judgment Day that we figured out how to make that as something which is repeatable and affordable with a department that could do it.” A dramatic moment is when the liquid metal T-1000 transforms and gets temporarily stopped going through metal bars by the gun he is carrying. “Not only that, the room is quiet, so when that gun hits the bar it’s really loud. James Cameron is thinking it through so deeply and does that all the way through. The guy is great.” The studio reaction was surprising. “I was expecting Hollywood to go nuts, and they were like, ‘That was interesting.’ They didn’t know what they were seeing until Jurassic Park and dinosaurs. Everybody loves dinosaurs!”

LEFT TO RIGHT: Muren mapping out the CG stained-glass knight sequence from Young Sherlock Holmes (1985). In order to properly track the T-1000 in Terminator 2: Judgment Day, a grid literally had to be drawn onto the body of Robert Patrick. Muren got to remake The War of the Worlds, one of the most influential films of his childhood, with Steven Spielberg. Muren reenacts one of the scenes from Raiders of the Lost Ark while making the movie.

Visual effects have become a standard filmmaking tool from indie productions to Hollywood blockbusters, which begs the question: Can you have too much of a good thing? “I would certainly say so,” answers Muren. “A lot of the stuff looks fake and the audience doesn’t seem to care. TV shows can have 100 shots in them. It looks like it’s shot in old Chicago but is really not shot there at all. But we fixed the buildings up and matted people into the settings. That’s great. It’s invisible. However, when you get into this chaotic action where they throw out any sort of thought of gravity, I get bothered by that.” That is not to say that there aren’t great current examples of effects utilized wisely. “The thing that they did so well in Top Gun: Maverick is that they started with the correct initial elements, which was actually having actors in the planes flying out there; they’re really doing it, so there’s lots of surprise and lots of things you don’t see when you’re onstage.” A personal favorite is Bardo by Alejandro G. Iñárritu. “That film just knocks me out. All of the stuff that he is trying to tell is in the effects. The effects aren’t like a storyboard that an effects guy brought to life.

LEFT TO RIGHT: There are times when it is appropriate to heighten the reality of a shot to get the desired emotional moment, which is something that Muren did for E.T. the Extra-Terrestrial.

As with Close Encounters of the Third Kind, Muren got to shoot a dramatic sequence with an alien spaceship for E.T. the Extra-Terrestrial.

Whereas Muren viewed Terminator 2: Judgment Day as the big CG-character breakthrough, it wasn’t until dinosaurs got resurrected in Jurassic Park that Hollywood finally took notice.

A love for stop-motion animation helped to forge a life-long friendship between Muren and Phil Tippett.

A miniature airplane was used for an aerial shot overseen by Muren for Indiana Jones and the Temple of Doom.


Despite being a digital innovator, Muren has not lost his enthusiasm for in-camera effects as was the case with A.I. Artificial Intelligence.

Getting close and personal with the Rancor from Return of the Jedi.

No, it’s all emotion that has been revealed at the pacing desired by the director. You have to get into the director’s head and figure out what he is trying to do. That guy is great.”

Technology is continually evolving. “I don’t know where AI is going to go, but it sure is fascinating,” Muren states. “Gaming has surely influenced the film industry, and the films have certainly influenced gaming. To think at one time there were no computers and visualization except for pie charts and text. Now they’re all affecting everything else. With AI you can create images in a movie that are synthetic, and the gaming industry can figure out ways to do it in real-time interactively which is phenomenal. Now we have something coming up that is going to be able to decide what image we want to see and don’t want to see, or look at it 300 different ways based on whatever criteria we can give the AI. What I love is it’s almost out of the lab and schools and into the homes. Once you get people in their homes doing stuff, they’re going to come up with ideas that no one else has thought out. However, people are worried about it for rightful reasons. There has to be checks and balances.”

The imagery should be driven by the story, not the other way around. “It’s about the movie, it’s not the shot you’re doing,” Muren observes. “How does it fit in? What is the emotion of it? For the shot of the kids on the bikes in E.T. the Extra-Terrestrial flying off into the sunset, I made the sun a little too orange and the backlight on the bikes a little too bright. The color palette is amped up a tiny bit because it added a magical feeling that I thought those kids would feel when they were experiencing it, especially when those kids were telling what just happened to somebody. That’s the way it should look. In a lot of Steven’s films, I look at them as amped up in places. Anybody nowadays can take a storyboard or animatic and make it look real with the tools that we have, but it needs to have the truth and heart of the moment that the director is going for. Your shot doesn’t want to overpower the story or take away a moment that could have helped. It’s all subtle. Directors and actors go through it all of the time. Editors are the ones picking out those parts and assembling them together. That’s what filmmaking is. When you get there, then it’s not just that the shot is terrific; the movie is terrific. I love movies.”

32 • VFXVOICE.COM SUMMER 2023 PROFILE
TOP TO BOTTOM: Muren got to do his own version of Fantastic Voyage with Innerspace (1987). Muren witnesses the CG creature animation that Steve Williams created for Jurassic Park.
“CG was something I’d been looking into at ILM since 1983 or 1984. It was always a puzzle and a possibility. We did it, but it was always expensive. It wasn’t until Terminator 2: Judgment Day that we figured out how to make that as something which is repeatable and affordable with a department that could do it. ... I was expecting Hollywood to go nuts, and they were like, ‘That was interesting.’”
—Dennis Muren, Consulting Creative Director, ILM

FINDING THE RIGHT FORMULA FOR ELEMENTAL

Hardly elementary to put together is the original romantic comedy Elemental from Pixar, where a new arrival to a city inhabited by Water, Earth, Air and Fire enters into a relationship that crosses the class divide. The story was inspired in part by a science class joke. “When I looked at the Periodic Table, all I could see was this apartment complex,” chuckles filmmaker Peter Sohn (The Good Dinosaur). “To make the pitch more acceptable to everyone, I boiled it down to the classical elements fire, water, earth and air.” The subject matter is personal in nature, he reveals. “I lost both of my parents during the making of Elemental, and this film has a great deal to do with appreciating our parents and the sacrifices that they make for us. It has been this interesting emotional ride.”

Given the nature of the characters, effects were an essential aspect of their design, with Sohn having to readjust his expectations. “There were lots of articles last year about how tremendously difficult the hours can be in the visual effects industry, and with so many projects going into streaming and features getting so big, that there were some eye-opening ways to produce this material that I was guilty of. I pulled back on a lot of that in the middle of last year. I went in knowing the gameplan of what we were going to get to do, but because my parents had died, I was like, ‘This is to honor them! We’ve got to go further.’ The crew has been a tremendous support in this process, and they have lifted the film in ways that I will forever be grateful for.”

One of the hardest aspects was to make sure that the characters actually look like they are made from their designated element, such as “Ember [Leah Lewis] and Wade [Mamoudou Athie] as our main characters, Fire and Water,” states Sohn. “Ember was the most challenging to get to her look. I remember seeing Ghost Rider as a kid and going, ‘A character with a fiery head.’ The fire was so realistic and meant to be scary. Then there was Jack-Jack in The Incredibles who goes on fire. It’s hard to make a fire character that reads as gaseous without a solid substructure. Character Designer Daniel López Muñoz took iPhone footage of a fire in his backyard, pulled out the frames and painted this fire character over them; he made eyes blink on it that were caricatured enough to fit. Ember’s fire was more forgiving, whereas Wade became more difficult as the production went on because of the way his rig worked and the simulations on top of his shaders and the caustics inside of him; there weren’t a lot of places to hide.”

Images courtesy of Disney/Pixar. TOP: Ember had to look as if she was made of fire, while Wade required several simulations, including drips, splashes and bubbles, to make him a believable water character. OPPOSITE TOP TO BOTTOM: The Lighting department integrates the characters, sets and effects to produce the final image. In order to discover the performances, the animation team manipulates the previs models of the characters, and then computer simulations are activated to make them appear and feel like their element. Production Designer Don Shank creates the sets within Presto, which is Pixar’s proprietary animation system.

Reinventing the character pipeline was Bill Reeves, Global Technology Supervisor. “The character gets animated in animation using the standard Pixar pipeline in terms of what you see on the animator’s screens. Its surfaces, polygons, if you will. You don’t see the fire. Then they check in their work saying, ‘This shot is done.’ We convert it over to feed into Katana and RenderMan and we render. In the course of that conversion, we feed it into Houdini to generate volumetric pyro simulations and stylize it. It comes out the back end and eventually gets into Katana and then into RenderMan. [For] the part of the pipeline that goes into Houdini and back out again, we had little bits of it here and there. However, Ember is in 95% of the shots, so we had to run that pipeline over and over again. There is a lot of simulation involved with Wade because his hair is like a fountain that is bubbling. That’s another Houdini set of tasks. Then the Air character is another set of simulations to generate the flowing air wisps. The only simple set of characters are the Earth characters, but they’ve got a lot more geometry than Woody and Buzz Lightyear.”
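The round trip Reeves describes is essentially a fixed chain of per-shot stages: Presto geometry in, a Houdini volumetric pyro pass in the middle, Katana and RenderMan at the end. A minimal sketch of that flow, with stage names and file paths invented for illustration (these are not Pixar's actual APIs):

```python
# Hypothetical sketch of the per-shot conversion Reeves outlines.
# Each function stands in for a real pipeline stage.

def presto_checkin(shot):
    """Animator checks in surfaced polygon animation (no fire yet)."""
    return {"shot": shot, "geo": f"{shot}/anim.geo"}

def houdini_pyro(anim):
    """Generate and stylize a volumetric pyro sim on top of the geometry."""
    return {**anim, "volume": anim["geo"].replace("anim.geo", "pyro.vdb")}

def katana_renderman(sim):
    """Assemble the look in Katana and render the frame with RenderMan."""
    return f"rendered:{sim['shot']}"

def run_shot(shot):
    # Because Ember appears in ~95% of shots, this chain runs over and over.
    return katana_renderman(houdini_pyro(presto_checkin(shot)))

print(run_shot("e100_12"))
```

The point of the sketch is that the Houdini detour sits between check-in and rendering, so every Ember shot pays for it.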

An effort was made to limit the number of elemental characters to 10 in a shot. “But we blew past that,” Reeves laughs. “There is a sequence called Air Stadium in the movie where there are thousands of them.” A new approach was developed for crowd simulations. “It doesn’t actually go into Houdini for every character; it is a volume-deform kind of thing. When the characters are further away, it’s one simulation that we’re copying around and deforming in Houdini, but it’s a 10- to 20-second Houdini call rather than an Ember simulation, which is four or five hours easy.” Compositing was leaned on heavily when it came to lighting. “It’s mainly a way of dealing with the complexities of this world by pushing a lot more lighting layers into Nuke and compositing and tweaking the end result there, rather than having to go back and re-render. You can work faster because it’s a more interactive system when you have the data. That was something we worked hard on and did a lot more on this show than on other ones,” Reeves observes.
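The cost trade-off Reeves quotes, a four-to-five-hour hero simulation versus a 10- to 20-second deform of one cached simulation, amounts to a level-of-detail switch on camera distance. A hedged sketch, with the threshold, labels and costs invented for illustration:

```python
# Level-of-detail choice for crowd characters, per Reeves' description:
# hero characters get a full simulation, distant crowd members reuse one
# cached sim that is copied around and deformed.

HERO_DISTANCE = 20.0  # scene units; beyond this, reuse the cached sim

def sim_strategy(distance_to_camera):
    if distance_to_camera < HERO_DISTANCE:
        return "full_sim"        # ~4-5 hours per character
    return "deform_cached_sim"   # ~10-20 second Houdini call

crowd = [3.0, 15.0, 40.0, 250.0]
print([sim_strategy(d) for d in crowd])
# -> ['full_sim', 'full_sim', 'deform_cached_sim', 'deform_cached_sim']
```

With thousands of Air Stadium characters, almost all of them fall on the cheap side of the threshold, which is what makes the sequence tractable.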

To assist animators, a toolkit was developed by Sanjay Bakshi, Visual Effects Supervisor. Comments Bakshi, “We had to animate things like when Ember gets mad, not just the facial expression and the body language, but what does the fire do? Since animators are experts at the timing of that, it had to be synchronized with their acting choices. We had to give them some visual indication. Our simulation and shading artists were changing a bunch of knobs to get it to feel like what sadness is. Then we map that to one number so there is a sadness control. It was like a zero to 10 kind of thing. More effort was put into fire because Ember is the main character and goes through the most emotions.” Transparency and the speed of the fire help to convey emotion. “When Ember becomes vulnerable her flames become a lot more transparent and candle-like in the movement,” Bakshi notes. “For anger, we did use color. Ember goes into more purple; that’s her signature anger look. A lot of the story is about her being angry and not understanding why, then learning through the movie how to control her anger and why she is angry.”
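Bakshi's "one number" control can be sketched as a single 0-to-10 slider that interpolates every underlying simulation and shading knob between a neutral preset and an emotion preset. The knob names and values here are made up for illustration; the real toolkit's controls are not public:

```python
# Illustrative mapping of one emotion slider onto many low-level knobs.
# Endpoint presets encode what "neutral" and "sad" look like; the slider
# blends between them so animators only touch one control.

NEUTRAL = {"flame_speed": 1.0, "transparency": 0.2, "hue_shift": 0.0}
SAD     = {"flame_speed": 0.4, "transparency": 0.8, "hue_shift": 0.1}

def apply_sadness(amount):
    """Map a 0-10 sadness control onto every underlying knob."""
    t = max(0.0, min(amount / 10.0, 1.0))  # clamp and normalize
    return {k: NEUTRAL[k] + t * (SAD[k] - NEUTRAL[k]) for k in NEUTRAL}

print(apply_sadness(5))  # halfway: flames slow down and grow more transparent
```

This matches the article's description of Ember's vulnerability reading as more transparent, candle-like movement: the slider drives those traits together, synchronized with the animator's acting choices.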

Much of the action takes place in Element City. “The Fire folks live in Firetown, and there are Water, Air and Earth districts,”

TOP TO BOTTOM: Major crowd simulations had to be produced for the Air Stadium sequences. Concept art by Lauren Kawahara exploring the color and design for Element City as well as how the characters interact with the urban environment. Locking down the look of the characters was difficult for Directing Animator Gwendelyn Ederoglu because they were so effects-dependent.

Bakshi explains. “Pete wanted these districts to have the elements built into the architecture and infrastructure, so we did a bunch of set dressings, like a streetlamp in the Fire district would have some fire on it. Our set dressers could place them, and the fire simulations would come along for the ride. The buildings and architecture have fire and water simulations built into them, so that the set dressers could do their work and get these simulations that would just happen. That was another big component.” Instancing was essential in making rendering manageable. “For fire simulations in Firetown, there were probably 25 to 30 of them that get reused over and over and instanced. The lamps are all instanced. It’s all of the same simulation, just offset in time so they don’t look identical,” Bakshi notes.
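The instancing trick Bakshi mentions, one cached fire simulation reused across every street lamp but offset in time so no two lamps flicker in sync, can be sketched like this. The cache read is a stand-in and the offset scheme is invented:

```python
# Time-offset instancing: every lamp shares one cached, looping fire sim,
# but each instance samples it at a shifted frame so the copies don't
# look identical.

import math

SIM_LENGTH = 240  # frames in the cached, looping fire sim

def sample_cached_fire(frame):
    """Stand-in for reading one frame of the shared sim cache."""
    return math.sin(2 * math.pi * (frame % SIM_LENGTH) / SIM_LENGTH)

def lamp_flame(frame, instance_id):
    offset = (instance_id * 37) % SIM_LENGTH  # cheap deterministic offset
    return sample_cached_fire(frame + offset)

# Two lamps at the same frame read different parts of the same cache:
print(lamp_flame(100, 0), lamp_flame(100, 1))
```

The render cost is that of one simulation plus trivial per-instance lookups, which is why 25 to 30 cached sims could dress all of Firetown.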

Getting the look of the characters locked down was difficult with them being effects-dependent. “We partnered with Dan Lund, who is a traditional effects animator from Disney,” remarks Gwendelyn Ederoglu, Directing Animator. “He talked about applying animation principles to 2D effects, and how the same principles that we use in 3D on our characters could still be pulled and learned from. One of them being: what you are trying to tell in the shot is key, and how can effects support that? What we did learn from our 2D tests early on was that there was going to be a lot of fun to be had with volumetric changes, which is something that our characters don’t typically do. We also learned quickly how tear-offs of fire or water droplets added an immediate believability to their ‘elementalness,’ so we worked with our rigging team to develop prop fire and water, which we could then use in our animation testing early on. That became a critical element in the film. If Ember has a limb detached, rather than breaking off the rig of the limb, we would cheat in prop fire that would match the traits of Ember’s fire, and then it could dissipate.”

TOP TO BOTTOM: It was important to not have Ember touch anything that was flammable. Lauren Kawahara depicts what a street would look like in Element City, which utilizes a canal system similar to Venice and Amsterdam.
Peter Sohn conducts one of many review screenings of Elemental, which is a tribute to the sacrifices his parents made for their family.

Essential was the close collaboration established between the animation and character effects departments. “We typically work closely with simulation on a film, but jumping all the way to character effects, that was a new relationship that was formed,” Ederoglu notes. “A term that one of us might use in animation about stylization could be interpreted so differently by a different department. It took us a while to create a more common language to be able to talk about shots, problems and issues, or what looks we were chasing in animation.” Most of the animators came from working on Lightyear, which was grounded in real human acting. Ederoglu adds, “It did take a bit to adjust to the looseness and constant sense of motion that we needed to have for these characters to feel believable. Something would be funny in the Presto version because of the snappy timing, but we needed those four frames for the pyro to catch up. We did have to learn the language of that as we went.” Lattice controls were overhauled. “Those were far more robust in terms of regional controls than what we have had in the past. That was critical. A quarter of the shots required lattices to do some organic and major shape changes. Beyond that, a lot of the animation was within the rigs themselves.”

Unlike previous Pixar movies where the top portion of a character’s head or the chin could be cropped out of frame, this was not possible for Elemental. “We knew right away, talking early on with Pete, that 1.85:1 was going to have to be the way to shoot something like this,” remarks David Juan Bianchi, DP, Camera. “We were aware that the pyro and flames of Ember were going to be so important, not just for her look but for how to tell her emotional state. What is it doing? What is the color in her flames? We needed to go wide and vertical. I asked the engineering teams to try to give us a camera that matched large format photography, and this allowed me to shoot wider lenses. We wanted to have that extra dial of being able to have a shallow depth of field at certain moments, and having this large format version of our Pixar camera allowed us to do that.”

TOP TO BOTTOM: In order for the characters to be appealing, the elements could not be completely realistic, but at the same time not so cartoony that viewers forgot what they represented. A nighttime examination of Ember by Jason Deamer, Character Art Designer. Concept art by Jonathan Hoffman and Maria Yi experimenting with the facial expressions of Ember.

Reflecting various emotional tones and worlds throughout Elemental is the camera style. “There was a language for Firetown,” Bianchi explains. “We primarily shot that with wide-angle lenses, and the camera is more at the character’s eye level, as if someone was hand-operating the camera. When going to Element City we introduced Water characters, like Wade’s family living in an apartment made of water. Then we started to have a different camera language. There were Steadicam and Technocrane moves and longer lenses, so it felt like the camera was rotating and floating. When Ember and Wade from these two worlds intersect and interact with each other, we cherry-picked elements from both to make what I called the romance love element or Ember Wade language. Hopefully, it underscores and supports the story of where our two characters are at and where they ended up.”

Rather than just dealing with practical sources like lamps and natural ones such as sunlight, for Jean-Claude (JC) Kalache, DP, lighting had to take into account the characters themselves being light sources. “Ember is a self-illuminated gas, and the way we solved the exposure issue is we exposed everything except Ember. We made a conscious decision that animation would drive her energy, and lighting would treat her almost like a lightbulb on a dimmer switch. When you think of water, everything is a light source. Water is reflective, refractive and shows through. It captures the whole environment. It was a big mind-bender. The brain is good at realizing what water looks like, but lucky for us, we were stylizing water, so you can break down the five or 10 components that make water look like water, but then you can move them around. It took a good part of the full year just to learn how to make our main Water character appealing.” Then there were the Air characters. Adds Kalache, “I remember looking at the overnight render and it was wonderfully beautiful pink light filling the whole train. Offscreen there was an Air character that was blasted by the sun.”

“The premise of this whole movie is that these elements cannot exist together,” Kalache remarks. “Yin and yang. Firetown is dry, smoky and less reflective. What is the opposite of that? It is a city with glass, water, reflections, and everything is bouncing. Lighting glass buildings is a pain because you can’t shape them. We literally treated the buildings of Element City as a character, and we were lighting them as if we were studio-lighting a human, putting special rim and kick lights [on them]. One thing that was revealed quickly was that a Water character is dependent on the environment around them, especially what is behind them. When Wade goes to or is in Element City, we quickly noticed that if the buildings behind him were busy, it was impossible to look at him because you could see right through him. However, if we took the sunlight and made it slightly ramped down right behind him, conveniently things calmed down and he looked appealing.”

Kalache made an observation that surprised the director. “I remember telling Pete, ‘The world is the light. What do you expect in your character?’ Soon after, animation would come in to talk to him about what they expected from the performance. Then soon after, the effects people would come and talk about what they expected from their character effects. It took all of these conversations to eventually end up with characters that worked for Pete.”

TOP TO BOTTOM: Concept art by Carlos Felipe León that visualizes candlelight silhouettes of Ember and Wade. Daniel López Muñoz depicts a street in Firetown that has a fire motif. Peter Sohn attempts to find the visual aesthetic of the Earth district in Element City. One of 97,760 storyboards produced for Elemental with filmmaker Peter Sohn, who started off as a storyboard artist for Pixar.

THE RISE OF VR/AR/VFX AND LIVE ENTERTAINMENT

In one breakthrough after another, AR, VR and VFX are augmenting live entertainment, from ABBA’s avatars to XR concerts to Madonna dancing live on stage with her digital selves.

ABBA: THEIR ‘70S SELVES

When the Swedish group ABBA returned to the stage last May after a 40-year hiatus, they did so digitally with the help of ILM. The foursome – Björn Ulvaeus, Benny Andersson, Anni-Frid Lyngstad and Agnetha Fältskog – appeared in ABBA Voyage via their de-aged digital avatars, virtual versions of themselves on huge screens in the purpose-built, 3,000-capacity ABBA Arena, which was constructed in Queen Elizabeth Olympic Park in London. (ABBA Voyage won the 21st Annual VES Award for Outstanding Visual Effects in a Special Venue Project.)

ABBA’s 20-song “virtual live show,” over five years in the making, is a hybrid creation: pre-recorded avatars appear on stage with a physically present 10-piece band to make the experience more lifelike and convincing. The avatars meld the band’s current-day movements with their appearances in the 1970s.

ILM supplied the VFX magic, with more than 1,000 total visual effects artists in four studios working on the project, according to the show’s spokespersons. ILM Creative Director and Senior Visual Effects Supervisor Ben Morris oversaw the VFX of the show, which was directed by music-video veteran Baillie Walsh.

First, Morris and his team scanned thousands of original 35mm negatives and hours of old 16mm and 35mm concert footage and TV appearances of the band. The supergroup quartet spent five weeks singing and dancing in motion-capture suits as ILM scanned their bodies and faces with 160 cameras at a movie studio in Stockholm. The same process was undertaken with younger body doubles, who followed their moves, guided by choreographer Wayne McGregor, and whose movements were blended with those of ABBA to give the band more youthful movements.

TOP: ABBA seen on a massive LED screen. The de-aged avatars were created by ILM in a project that took over five years. (Image courtesy of ILM and ABBA) OPPOSITE TOP TO BOTTOM: ABBA avatars on stage in lower center. The show’s elaborate lighting and live musicians help bring ABBA’s music to life. (Image courtesy of ILM and ABBA) The digital versions of ABBA appear on the stage and to the sides of the arena on towering ROE Black Pearl BP2V2 LED screens powered by Brompton Tessera SX40 4K LED processors. (Image courtesy of ILM and ABBA) Silhouettes of Björn Ulvaeus, Benny Andersson, Anni-Frid Lyngstad and Agnetha Fältskog – whose virtual versions appear on huge screens in the purpose-built 3,000-capacity ABBA Arena in London. (Image courtesy of ILM and ABBA)

The digital versions of ABBA appear on the stage and to the sides of the arena on towering ROE Black Pearl BP2V2 LED walls, powered by Brompton Tessera SX40 4K LED processors. Each screen is 19 panels high, and there are an additional 4,200 ROE LED strips in and around the arena. Solotech supplied the LED walls. Five hundred moving lights and 291 speakers connect what is on the screens to the arena. The result is spectacular and suggests that many large-scale digital shows may be on the way for music stars who are getting old or simply don’t like touring.

VIRTUAL TUPAC, VIRTUAL VINCE

Digital Domain created digital representations of the rap star Tupac Shakur and legendary Green Bay Packers football coach Vince Lombardi in 2012 and 2021, respectively, which raised the visual-quality bar for virtual appearances projected live.

On April 15, 2012, at the Coachella Valley Music & Arts Festival in Indio, California, Tupac Shakur appeared in a CGI incarnation on stage at the Empire Polo Field. The virtual Tupac sang his posthumous hit “Hail Mary” plus a “duet” of “2 of Amerikaz Most Wanted” with Snoop Dogg, who was on stage in the flesh.

The computer-generated realistic image of Shakur was shown to some 90,000 fans on each of two nights; YouTube videos of the event reached 15 million views, according to Digital Domain. Some called it the “Tupac Hologram” – it wasn’t a hologram, but it was 3D-like. Unlike ABBA Voyage (2022), which featured the participation of the band in creating the group’s avatars, the Shakur on stage was created long after the singer’s death in 1996. The project took about two months to complete, with 20 artists of different disciplines, according to Digital Domain’s Aruna Inversin, Creative Director and VFX Supervisor. The virtual Tupac was the vision of Andre “Dr. Dre” Young, and Digital Domain created the visual effects content. AV Concepts, an audio-visual services and immersive technology solutions provider, handled the projection technology.

The virtual Shakur was a two-part effect, according to Kevin Lau, Digital Domain Executive Creative Director. “In the first, we recreated Tupac using visual effects, and the other was a practical holographic effect created using Hologauze. For the performance, we started by filming a body double performing the set. The actor had an incredible likeness to Tupac in both stature and movement. Once that performance was in the can, we began digitally recreating a bust of Tupac’s likeness. This digital head was then combined with the body performance and animated to match the song. The two were blended together in compositing to create a seamless recreation of a seemingly live performance.”

Lau continues, “The performance asset was then synched to the live band and projected on Hologauze. This material has the ability to reflect bright light, while remaining fairly transparent in areas that are dark. The result is a figure that appears to exist physically in a three-dimensional space.”

“The Vince Lombardi project was similar to the Tupac hologram, but we enlisted a slew of new techniques – many of which we created ourselves – that were not available at the time,” comments Lau. The virtual Lombardi – evoking the NFL football coach as he was in the 1960s – appeared on the jumbotron at Raymond James Stadium in Tampa on February 7, 2021.

Lau explains, “We began by filming a body double to do the bulk of the performance. But instead of having to create a full digital recreation of Coach Lombardi’s head, we enlisted Charlatan, our machine learning neural rendering software. This allowed us to train a computer off a data set of available Vince Lombardi images. The software can then synthesize those images and recreate a likeness of the subject – in this case, Coach Vince Lombardi – based off our actor’s [the puppeteer] performance.”

For the virtual Lombardi, a slightly different projection technique than before was used, according to Lau. “Because of the nature of the Super Bowl, we didn’t have access to a sequestered location or a secured space to prep screens and projectors. The stage had to be wheeled out into the end zone and set up, then broken down within a matter of minutes. For this, we enlisted more of a ‘Pepper’s Ghost’ effect, in which a translucent mylar screen is put at a 45-degree angle from a large LED screen. The resulting image is reflected back to the viewing audience, allowing you to see through parts of the screen, giving the holographic illusion.”

The digital Shakur and Lombardi are two “tentpole projects” for Digital Domain in that particular area, and they have also done “a handful of other executions for promotional and concert events,” says Lau. “We are always looking for new opportunities to surprise and delight viewers. AR/VR and holographic immersive entertainment have a huge role to play in the future. Being able to re-contextualize our environment will truly bring the audience in and allow for higher engagement with fans.”

VR/AR/MR TRENDS
TOP TO BOTTOM: A holographic-like virtual Tupac Shakur was created by Digital Domain and shown at night at the 2012 Coachella festival. (Image courtesy of Digital Domain) The digital version of the late rapper was the vision of Andre “Dr. Dre” Young. AV Concepts, an audio-visual services and immersive technology solutions provider, handled the projection technology. (Image courtesy of Digital Domain) The virtual Tupac Shakur with a live musician in the background. A practical holographic effect was created by projecting the performance asset onto a material called Hologauze. (Image courtesy of Digital Domain)

MADONNA AND MARIAH

At the 2019 Billboard Music Awards, five Madonnas and Colombian singer Maluma performed the reggaeton-pop song “Medellín.” Or, more precisely, one Madonna and four avatars sang with Maluma, in an AR-enhanced performance that utilized volumetric capture and Unreal Engine.

“To the surprise and shock of fans around the world, four copies of Madonna – each wearing one of her signature outfits from over the years – appeared and danced alongside the original, creating a show unlike any other and leaving fans to wonder how she pulled it off,” says Piotr Uzarowicz, Head of Partnerships & Marketing for Arcturus, a leader in volumetric video technology.

Arcturus’s HoloSuite software “was instrumental in the post-production of the four pre-recorded Madonna performances that were integrated into the live TV broadcast,” says Uzarowicz. “This was the first time that volumetric video was used in a television broadcast, and it signaled an emerging new era of entertainment.”

The augmented reality content could be seen by TV viewers of the telecast and was presented alongside live components, including Madonna herself, Maluma, and over a dozen dancers.

Sequin AR was tasked with bringing the Madonna clones to life in AR. “We worked with Jamie King [Madonna’s Creative Director], Madonna and several fantastic companies to create the performance, including Dimension Studios [where the volumetric capture of Madonna took place], StYpe [camera tracking], Reflector Entertainment [co-founded by Cirque du Soleil creator Guy Laliberté] and Brainstorm [InfinitySet],” says Sequin AR CEO Robert DeFranco.

“The ‘Madonnas’ were created with volumetric capture, and additional elements were created, including rain, stage extensions and clouds,” explains DeFranco. “The elements were rendered in real-time using Brainstorm and Unreal Engine.”

Sequin AR was founded as a virtual production and augmented-reality solutions company, and it has extended its capabilities to include immersive web3, mobile and VR solutions for its clients. “We consider ourselves a 3D immersive company,” says DeFranco. “AR and real-time render technology allow content producers the ability to tell new stories that were previously only possible in post-production. This helps them make a greater emotional impact and connect with audiences in more engaging ways.”

DeFranco explains, “AR viewing can be accomplished with a number of screen types, whether it be AR glasses, mobile, monitors or broadcast. Broadcast AR leverages the TV, [and] we integrate the real-time technology into the standard production pipeline, creating the virtual production pipeline for AR.”

Sequin AR provided virtual production and AR broadcast services for Mariah Carey’s Magical Christmas Special on Apple TV+ on December 4, 2020. The production utilized “virtual sets being rendered in real-time with Zero Density and Unreal Engine for seven cameras. The shoot was done in five days, which would have not been possible if it were not for virtual production,” according to DeFranco.

He notes that the 2021 avatar singing competition series Alter Ego, which Sequin worked on, “was all AR, motion capture and facial capture. We provided virtual production technical engineering on the show, supporting the technical pipeline and playout in real-time,” says DeFranco.

TOP TO BOTTOM: Legendary Green Bay Packers coach Vince Lombardi was brought to life in digital form by Digital Domain for the 2021 Super Bowl in Tampa, Florida. (Image courtesy of Digital Domain) Digital Domain filmed a body double and then enlisted Charlatan, its machine learning neural rendering software, to train a computer off a data set of available Vince Lombardi images. The software synthesized the images to recreate a likeness of Lombardi. (Image courtesy of Digital Domain) The stage for Madonna’s performance with four AR versions of herself at the 2019 Billboard Music Awards show. (Image courtesy of NBC and Dick Clark Productions)

Regarding tech utilized, he notes, “In addition to our technical pipeline and processes, we leverage Unreal Engine and a variety of solutions depending on the production needs, including Mo-sys [virtual production and camera tracking], StYpe [camera tracking technology for AR or virtual studio], Zero Density [real-time visual effects, real-time broadcasting and a markerless talent tracking system] and Silver Draft [supercomputing high-end rendering], to name a few.”

DeFranco adds, “Augmented reality and virtual production continue to push the boundaries with innovative companies creating new content and experiences. We are thrilled to be helping advance the adoption of these technologies and creating new ways to use them.”

XR CONCERTS

Last year, Snap signed a pact with Live Nation Entertainment for audiences to access AR experiences via Snapchat at Live Nation music festivals such as Lollapalooza, Bonnaroo, Governors Ball and Electric Daisy Carnival. Snapchat users will be able to find their friends in the audience, locate landmarks on the festival grounds, try on merchandise and experience other AR content.

Snapchat has a large potential audience for its AR experiences – it had 363 million active daily users worldwide as of the third quarter of 2022, according to the firm.

U2, Eminem and Maroon 5 are among the artists who have added various types of AR experiences to their shows via smartphones and apps. In 2022, for Flume’s show, Coachella partnered with Unreal Engine to add AR elements to the Coachella YouTube live stream. In addition, the Coachella Valley Music and Arts Festival’s Coachellaverse app offered various immersive AR effects, including geo-specific experiences created in partnership with Niantic. Last December, the virtual band Gorillaz performed AR concerts in both Times Square in New York City and Piccadilly Circus in London.

In October 2020, at a peak of the pandemic, Billie Eilish’s “Where Do We Go?” XR concert from XR Studios in Hollywood served up “a meticulous visual affair, replete with lofty LED screens and extended-reality [XR] effects, that felt determined to recapture weary viewers’ attention,” according to Amy X. Wang of Rolling Stone. She continued, “With the feel of a highly produced music video, the show, which charged $30 a ticket, hit on all the strengths of livestreaming. Enormous animated creatures and chimeric landscapes whirled by around Eilish, her brother Finneas and her drummer Andrew Marshall as they played into multiple roving cameras from a 60-by-24-foot stage; the trio’s sparse physical presence made for a striking silhouette to the rapidly shifting scenery.”

BTS, The Weeknd and Metallica are among the big names who have offered VR songs or concerts. And MelodyVR, NextVR, VeeR, Meta Platforms and Wave are some of the platforms that offer XR concerts and have large libraries of such fare.

TOP: Five Madonnas with singer Maluma in background. Sequin AR, Arcturus, Dimension Studios, Reflector Entertainment, stYpe and Brainstorm collaborated on the project powered by Unreal Engine. (Image courtesy of NBC and Dick Clark Productions) SECOND FROM TOP: Five Madonnas with Maluma in background. Volumetric capturing before the show took place at Dimension Studios. (Image courtesy of NBC and Dick Clark Productions) BOTTOM TWO: Examples of volumetric capture choreography on the stage with different stage extensions plus clouds. (Image courtesy of NBC and Dick Clark Productions)

Moreover, visual effects are now responsible for an increasing number of virtual performers. South Korea’s Eternity is a virtual K-pop band that uses AI technology, while Aespa has both physical and virtual members.

VR VARIETY SHOW

Kiira Benzing is Creative Director and Founder of Double Eye, which presents interactive theater and live entertainment in virtual reality. Her latest effort is the VR variety show Skits & Giggles, which is published by Meta Platforms and available on Horizon Worlds.

“We wanted to make a variety show and test how comedy in a scripted format might play out in VR for a live audience,” says Benzing.

The entire show is interactive, according to Benzing. “The show is mostly scripted between the skits, monologues and musical numbers; but there are moments of spontaneity and improv that also arise in every show.” Skits & Giggles received a 2022 nomination for Best Immersive Performance at the Raindance Film Festival.

Benzing comments, “VR is an incredible medium for live performances. I see a wonderful niche deepening in its growth as more and more live performances are being created across different Social VR platforms. Now there are so many new solutions with the hardware becoming more affordable and the Social VR platforms growing in numbers. All of these elements together make an ecosystem more possible for live VR performance to flourish.”

Benzing continues, “VR, AR and XR as a whole are amazing evolutions of experiences we can share with our audiences. The player’s journey is quite different if we use AR to overlay onto the physical world or VR to take them into a brand-new world, but both forms of the tech can be transformative.” She observes, “We are seeing festivals, concerts, bands and theater companies venture into these mediums. Since XR has such a power to transform an experience, I believe the more audiences are introduced to AR and VR, the more they will begin to expect live performances to include an XR extension.”

And, Benzing notes, “There is growing potential for attending concerts and live VR theater and performance from the comfort of your home with VR. Just as we are finding a shift in the film industry due to the growth of streaming, I believe we will find audiences who seek XR entertainment. The possibility of being able to connect with your favorite performers and bands while also attending with your friends and doing so from your home environment is compelling.”

She concludes, “As audiences have a taste for the combination of XR and live performance, they will crave more and more.”

TOP: Drinks in hand, these avatars are part of the VR variety show Skits & Giggles, published by Meta Platforms and available on Horizon Worlds. (Image courtesy of Double Eye Studios and Meta Platforms) BOTTOM: In this virtual living room and elsewhere, the VR show is interactive – scripted in the skits, monologues and musical numbers, but with moments of spontaneity and improv in between. (Image courtesy of Double Eye Studios and Meta Platforms)

VES LIFETIME ACHIEVEMENT HONOREE GALE ANNE HURD: CELEBRATING SHEROES IS HER SUPERPOWER

From the legendary Lt. Ellen Ripley battling aliens in deep space to the unwitting target of an unstoppable robotic assassin, to a group of survivors in the aftermath of a zombie apocalypse, Gale Anne Hurd has brought forth iconic characters and cinematic experiences that have transported and transfixed audiences worldwide. One of the most respected and influential film and television producers of our generation, acclaimed producer-writer Hurd has been instrumental in shaping popular culture for nearly four decades. And in the process, she has revolutionized action cinema and delivered transformational depictions of women on screen.

Hurd is one of the entertainment industry’s most prolific producers of film and TV projects that shatter both box office and ratings records. After rising from Roger Corman’s executive assistant to head of marketing at his company, New World Pictures, Hurd’s producing career took off when she co-wrote and produced The Terminator. Her unprecedented success was quickly followed by Aliens, which received seven nominations and two Academy Awards, and additional Academy Award-winning films including The Abyss, Terminator 2: Judgment Day, The Ghost and the Darkness and Armageddon. When Hurd entered the television industry, she met similar success in shepherding the juggernaut The Walking Dead and serving in producing roles on Fear the Walking Dead, Talking Dead and Tales of the Walking Dead. Her latest documentary, The YouTube Effect, a cautionary tale on the impact of social media, will be distributed this summer.

In recognition of her enormous contributions to visual arts and filmed entertainment and the advancement of unforgettable storytelling through cutting-edge visual effects, Hurd was honored at the 21st Annual VES Awards with the Lifetime Achievement Award. In presenting the award, filmmaker James Cameron celebrated Hurd as a ‘true gale… a force of nature.’

On that note, Hurd was visibly moved in accepting her award in front of an audience of more than 1000 VFX artists and practitioners at the awards ceremony: “This is one of the proudest moments of my life. That little girl sitting in a movie theater staring wide-eyed at the extraordinary images on the big screen, never dreaming that one day she’d be producing them herself, is still in awe of the magic you create, each and every day. You are my heroes, and to receive your Lifetime Achievement Award is more meaningful than you could possibly imagine. And to be presented the award by Jim – we grew up in the industry together – and to have it be an evening where Avatar got such accolades – was literally perfection.”

VFX Voice sat down with trailblazer Gale Anne Hurd to talk about breaking barriers, her love of craft and celebrating heroic women – on and off screen.

VFXV: Tell us about your origin story – what were your early sources of inspiration that led you to your career in filmed entertainment?

Hurd: I was an early reader, and I read every science fiction and horror book I could get my hands on. So much so that the Palm Springs Library asked me to be their consultant and help them acquire books for young people. I even wrote a column for the local paper and wrote sci-fi and fantasy book reviews.

PROFILE
Images courtesy of Gale Anne Hurd, except 21st VES Awards photos by Danny Moloshok, Phil McCarten and Josh Lefkowitz.
TOP: Gale Anne Hurd strikes a shero pose.

I’ve always been a fan of science fiction, fantasy and horror. Growing up, I was lucky enough to have TV that aired both chiller and thriller movies. My local movie theater was essentially my weekend babysitter. I’d watch double features each and every Saturday and Sunday. I was a huge fan of Ray Harryhausen’s work, in particular Jason and the Argonauts and The 7th Voyage of Sinbad, and I think he was my first visual effects artist/hero/crush. There have been countless since then, but you always remember your first...!

VFXV: What most captivates you about the action-adventure genre?

Hurd: I was always an adrenaline junkie and love sharing the theater experience. It’s an art to tell a story that engages many different audiences and finds a way to make them identify with the character in jeopardy and root for them. There is nothing better than being in a dark theater with an audience on the edge of their seats who are cheering, screaming as an integrated part of the experience.

If you boil down the story I strive to tell over and over, it’s of ordinary people who find themselves in extraordinary circumstances and find the strength within themselves that they never knew they had, to succeed and overcome… and in some cases, save the world!

TOP LEFT TO RIGHT: Hurd and Jim Cameron on the set of Academy Award-winner The Abyss. Hurd and Jim Cameron promoting their sci-fi sequel Aliens. Hurd consults with Charlize Theron on the set of Aeon Flux.

Hurd on the set of Aliens with Sigourney Weaver.

Hurd on the set of The Terminator with Linda Hamilton.

Hurd with mentor and friend filmmaker Roger Corman.

Hurd meeting with director Ang Lee on the set of The Hulk.

VFXV: What was your pathway from school to your first job in the film industry?

Hurd: At Stanford, I studied economics, political science and communications. It was my original intent to be a marine biologist, but I realized I would never do well enough in math and science, so I embraced the social sciences. I had a seminal event in my junior year when I was part of the Stanford in Britain program. I fell in love with British film and broadcasting and knew what I wanted my future path to look like.

During my college years at Stanford University, I was lucky enough to have the late producer Julian Blaustein as my mentor. He and I bonded over our mutual love for science fiction. Julian produced the original The Day the Earth Stood Still and was one of the few producers at the time who valued sci-fi as an art form in which to tell powerful stories. For a film class my senior year, I chose to write my final paper on Stanley Kubrick’s 2001: A Space Odyssey that touched on its groundbreaking visual effects. As fate would have it, Roger Corman hired me after reading my thesis on 2001: A Space Odyssey, clearly disregarding my less than stellar personal interview with him. So you could say that my fascination with visual effects was instrumental in my Hollywood career from Day 1 and you’d be 100% correct.

LEFT TO RIGHT: Hurd with feminist icon Gloria Steinem and director Valerie Red-Horse Mohl.

VFXV: You have been credited with ushering in the era of strong female protagonists. What did it take to get the industry to support films with “sheroes” in the lead?

Hurd: There were a lot of early challenges in getting the industry to support women as heroes in the lead; Jim [Cameron] and I were lucky that The Terminator was the success it was. At its heart, the movie really is the story of Sarah Connor, and it was wonderful to tell that story through the female gaze. The Kyle Reese moment with Sarah – ‘I came across time for you Sarah, I love you, I always have’ – is one of my favorite lines, but we knew better than to sell it as a love story. We had everything stacked against us and prepared like we were defending our dissertation. We knew it was an easier sell as The Terminator with an unstoppable villain at the center point. Yes, it was sold as the story of the robotic soldier and Arnold Schwarzenegger. But it was her story.

With Alien, we were lucky, because the only character left alive was Ellen Ripley, brought to power by Sigourney Weaver… So if there was going to be a continuation, it would either be her… or the cat! I hope audiences take away from seeing strong women on screen, that women are capable of living their own truths and being the protagonists of their own stories.

VFXV: As we’re celebrating Women’s History Month, why was Wilma Mankiller a subject you wanted to focus on in a documentary? And what drew you to produce True Whispers: The Story of the Navajo Code Talkers and Choctaw Code Talkers?

TOP: Hurd and the cast of The Walking Dead celebrating its 100th episode. BOTTOM: Hurd consults with Jeffrey Dean Morgan on the set of The Walking Dead.

Hurd: The road to making these documentaries was rich and inspiring. I reached out to a Native American woman director, Valerie Red-Horse, who asked to work with me on a documentary on Navajo Code Talkers and their service during World War II. I had read a terrific script when I was Chair of the Nichols Screenwriting Committee at The Academy. When we went to the Navajo Nation and asked for support from the Navajo Code Talkers Association, they said ‘Please tell our real story.’ So we got financing from ITVS and PBS and made a well-received documentary. What was most rewarding was to see these code talkers, these men who had to keep their service classified for all these years, finally lauded by young people. That project led to our making of the documentary on the Choctaw Code Talkers.

Then the Cherokee Nation proposed a documentary on Wilma Mankiller, the first woman elected to serve as Principal Chief of the Cherokee Nation. I had never heard of her, and that shocked me. As someone interested in women’s studies and leaders, the fact that there was such an amazing woman recognized around the world that I didn’t know about – I was compelled to take this on. We raised most of the funding for MANKILLER on Kickstarter – and wow, the cast of The Walking Dead and the Indigo Girls helped immensely by providing great rewards to donors. I’m proud that our film helped bring national recognition for Wilma, who is now rightly emblazoned on the U.S. quarter coin.

VFXV: You’re known for embracing daring material. What is the essential “it” in taking on a new project?

Hurd: Whether it’s Wilma Mankiller or The Walking Dead, I return to similar themes when seeking out new projects. Since I am such a hands-on producer, I have to make a visceral decision to leap and dedicate so much of my time to projects so that I don’t regret that choice later. That’s my first litmus test. I really like telling stories of ordinary people thrust into extraordinary circumstances in new ways. I love examining the human condition and posing the question that the audience is thinking – ‘What would I do in that situation?’

I want people to see themselves and especially women in a different and ‘enlightened’ light. Not victims cowering in a corner waiting to be saved by an alpha male. There is a rich tapestry of roles that women can and are playing in real life as well as in film and TV. I’m inspired by what we’re seeing here with diverse and older actresses coming to the forefront. They have been doing that for years in British cinema with rich roles for Dame Judi Dench and Maggie Smith. But what’s different now in actresses being lauded in the U.S. [such as Michelle Yeoh and Viola Davis] is that the roles are rather transformative.

TOP: Hurd in Peru for her Amazon anthology series Lore. BOTTOM: Hurd in Prague for her Amazon anthology series Lore.

When it comes to diversity, equity and inclusion and the ‘state of women’ in the business, here’s the thing: Socially, culturally, even to this day and from my own experience, women are taught not to stand out or speak up. And we are criticized when we do. Women are more frequently interrupted in meetings and quieted when we are the ‘interrupters,’ and at a certain point, subconsciously, you absorb that and adapt. We often see ourselves differently and sell ourselves short.

I wouldn’t have gotten anywhere without mentors who not only believed in me, but pushed and challenged me, and encouraged me to value myself and continue – even when I wasn’t feeling likely to succeed. From the beginning, I always wanted to work with women in every capacity and recognize and support talent. I’m hoping that with strong mentorship and the success of films and TV series that feature strong women and non-traditional heroes in front of and behind the camera, our collective influence will grow and make an impact on everyone our work touches.

VFXV: What excites you about using visual effects technology to advance character-driven and highly visual storytelling?

Hurd: What I love about VFX and where it’s going – it can be used to make sets safer, and that should be a top priority on film and TV shoots. And it can be used to broaden our horizons so that a filmmaker can bring anything they can imagine clearly on the screen and be real enough for audiences to suspend their disbelief and feel they are engaged with a character or immersed in an environment – not an effect. I love that filmmakers like Jim [Cameron] and Guillermo del Toro constantly embrace innovation, and as a result are giving people a reason to go back to movie theaters and see bold visual storytelling done like never before.

TOP TO BOTTOM: Jim Cameron presents Hurd with the VES Lifetime Achievement Award. Hurd accepts the Lifetime Achievement Award at the 21st Annual VES Awards. Hurd backstage at the VES Awards flanked by VES Executive Director Nancy Ward and VES Chair Lisa Cooke.

FOCUSING ON VIRTUAL CINEMATOGRAPHY

Virtual cinematography has long been grounded in practical expertise with cameras, lensing and lighting, and its influence keeps extending: real-time game engines have become the cornerstone of virtual production, and previsualization is now a staple of live-action blockbuster productions. However, the paradigm that defines the cinematic language is shifting, as drones make once-impossible shots achievable and several generations have grown up playing video games. The animation, film and television, and video game industries are being brought closer together as their tools become more universal and integrated. Whether this trend continues, whether controversies such as Life of Pi winning the Oscar for Best Cinematography dissipate, and whether it will become commonplace for animation DPs to be invited into the ranks of organizations like the American Society of Cinematographers remain to be seen. There is also the question of whether the cinematographer is consulted during post-production to ensure that the camera, lens and lighting choices stay consistent, thereby maintaining the visual language.

To develop a complete understanding of the evolving relationship between virtual and live-action cinematography, professionals from film, television, animation, video games, commercials and virtual production have been consulted.


Greig Fraser, who won an Oscar for his contributions to Dune: Part One and received a nomination for Lion, lensed the pilot episode of The Mandalorian, which is credited, along with the COVID-19 pandemic, with accelerating the adoption of the virtual production methodology. “I wish that I was really good at using Unreal Engine because if I was training to be a cinematographer right now, the cheapest tool they can get in their arsenal is Unreal Engine,” Fraser notes. “With Unreal Engine, you have MetaHumans so you can build yourself a face, light it and start to explore how top lights work. You can begin to figure out emotionally how you feel when putting a light on the side, top, diffused or cooler. You can do that quickly. They can then apply that to the real world, but also have a huge positive base of knowledge when getting into the virtual world by knowing what is and isn’t possible.” A better understanding and integration into the world of filmmaking is required more than additional tools, according to Fraser. “You can’t put up an 18K, flag and diffuse it the same way you can on set. There is a number to change the softness, width and distance. There are differences between those things. I would like to see correct film lights put into Unreal Engine, as it will allow a lot of cinematographers who have trained with an old system to be able to come in and use that in a new world.”

VFX TRENDS
TOP: The disguise workflow is being used for the xR stage at Savannah Film Studios to train students. (Image courtesy of disguise and Savannah College of Art and Design) TOP TO BOTTOM: Greig Fraser is able to create unique lens aberrations that stem from his knowledge of live-action cinematography. (Image courtesy of Warner Bros. Pictures) Director Matt Reeves with Greig Fraser shooting The Batman, which was a combination of shooting real locations and virtual production stagework. (Image courtesy of Warner Bros. Pictures) ILM’s StageCraft technology was used for the background shots of Gotham featured in The Batman. (Image courtesy of Warner Bros. Pictures)

Observes Cullum Ross, Chief Lighting Technician on Man vs. Bee, “The biggest problem I have with interactive lighting and creating something virtually is, try as you might, you cannot add specular light. That’s so difficult in a virtual environment because you’re dealing with a large LED array, and that is soft light. Currently, if you want to create highlights and flares you need to do it for real.” Three to four separate passes were done for shots involving the bee. Ross notes, “We had lots of different sizes of bees, some more reflective, while others were matte and satin.” When shooting Man vs. Bee, Cinematographer Karl Óskarsson dealt with an antagonist in the form of a CG insect. “You can create a dinosaur or elephant in post, but you need to see something in reality that gives you the shadows and roughly the theme of what you’re doing. Then you can add all of that in post.” Óskarsson adds, “When the bee is between the main actor and camera, the focus of the eyeline has to be right; that’s where puppeteer Sarah Mardel came in with something on a stick. The approach was that the bee would always fall into what we were doing. Occasionally, we could do a strong backlight because we had to see a small backlit bee over in the frame. There were occasional close-ups. It was much more about Rowan Atkinson. The beauty of what Framestore did was to add what was meant to be.”

WALL-E leveraged the live-action expertise of Cinematographer Roger Deakins. “The big comment made by Roger Deakins was, ‘You are making decisions in layout without knowing where the lighting is going to be. I light the set and then film it,’” recalls Jeremy Lasky, DP, Camera at Pixar. “Danielle Feinberg [DP, Lighting at Pixar] and I looked at each other and said, ‘He’s right.’ Previously, we could never manage to get these things working together due to software, complexity and time.” That was the first film where the two DPs could work together visually at the same time. Lasky adds, “Danielle could put some lights in, and I could start exploring. We could see shadows and how you could open a hatch in a dark room and the light would spill out. You could time it in editorial and compose to it.”

Practical understanding and testing are important. “The happy accidents that you have on a live-action set, you have to manufacture in CGI,” notes Ian Megibben, DP, Lighting at Pixar. “We need to borrow from both sides as they can complement each other. When we started on Lightyear, I was dissatisfied with the way our lens flares and aberrations looked because one artist would approach it one way and another artist would approach it a different way. There wasn’t a lot of consistency. Chia-Chi Hu [Compositing Supervisor] and I spent a lot of time studying various affectations on the lens that we rolled into our lens package and that informed the look.” Caution has to be exercised. “The computer tools are so flexible that you have infinite possibilities, but if you use every color in the crayon box, it can start to lose its focus,” Megibben says.

TOP: Bridging the gap between virtual cinematography and virtual production is an important goal for Impossible Objects. (Image courtesy of Impossible Objects) BOTTOM: Cinematography has evolved over the decades, but at its core it’s still moving images. (Image courtesy of Impossible Objects)

“The paradigm of what happens in live-action does not equal the same components of artistry that happens in animation,” states Mahyar Abousaeedi, DP, Camera at Pixar. “What our department does in layout overlaps multiple disciplines. Storyboards communicate the essence of the story while the execution is more about expanding those ideas and making sure that we’re still building a visual language to escalate until we reach the peak of that sequence. The reason I spent 48 hours looking at boy band references was to find out what makes a dance sequence authentic from 2000 for [the concert in Turning Red]. It’s not just how it was shot. What are those characters doing? I can see boards of the character dancing, but are there imperfections in how they do it and should we see those imperfections? One thing that I enjoy about what I do is seeing certain ideas that live in this rough draft become much more thought out. You need to actually see that idea first because there’s nothing to shoot [except for the set] and you’re depending on a bunch of artists to choreograph it [with the characters].”

In an effort to balance fantasy and reality, live-action and animation filmmakers approach the material from opposite directions to accomplish the same thing. “In animation, we are trying to make our characters and world feel real to the audience so that the audience can connect with them, so we attempt to do it more like how you do on a live-action set,” remarks Jonathan Pytko, DP, Lighting at Pixar. “Live-action sometimes feels like it’s going in the opposite direction where they’re trying to make it more fantastical and take you out of reality and give it a different vibe. They’re both valid.”

Time spent at Lucasfilm Animation laid the foundation for Ira Owens, who is a cinematographer in the video game industry. “Some key elements that I use is when you’re showing a wide shot, make it beautiful and epic,” Owens explains. “With medium shots be clear and concise, make sure that they’re telling the story. Punctuate with your close-ups. That train of thought I learned from my time on The Clone Wars series has allowed me to thrive with different positions that I’ve held in animation and eventually gaming, which has changed so much over the years to become more cinematic in its storytelling.” Owens works directly in the game engine. “I can break the camera off and scout a location quickly. The controller literally becomes my camera. I have pan, pitch and crane functions. I cruise the camera around and look at a scene. I’m not saying that I don’t utilize storyboards or concept art, or if some live-action footage is available I’ll watch that to see if there are some ideas I want to explore.”

TOP: Video game cinematics have become more filmic over the years, as displayed by Ghost of Tsushima, which is in the process of being adapted into a movie. (Image courtesy of Sucker Punch Productions and Sony Interactive Entertainment)

MIDDLE: Ira Owens believes that close-up shots are meant to punctuate the storytelling, as displayed by this still taken from Ghost of Tsushima. (Image courtesy of Sucker Punch Productions and Sony Interactive Entertainment)

BOTTOM: Meptik is responsible for hybrid xR events such as “Combat Karate” for Karate.com. (Image courtesy of Meptik)

To better handle the proliferation of visual effects, various on-set virtual tools have been produced. In fact, Game of Thrones led to the creation of the iOS app Cyclops AR, which enables CG objects to be composited into the camera view in real-time. “Director Miguel Sapochnik asked me, ‘How big is Drogon?’” remarks Eric Carney, Founder and VFX Supervisor at The Third Floor. “My stock answer was, ‘About the size of a 747,’ which is really not that helpful. I remember thinking that it would be great if I had a way on my iPad, which I always carried with me for notes, to quickly show everyone a composite of Drogon in the actual physical space. Later in the year, when we went to Italy to shoot the Dragon Pit scene for Episode 806, we created an early prototype of Cyclops. In 2022, we produced a new tool called Chimera that uses AR HoloLens glasses from Microsoft and works in conjunction with a laptop to render the content, so it is not as portable, but it has higher visual quality, allowing for a lot of flexibility. Multiple users can come together, either in person or remotely, and share an experience of reviewing sets or scenes virtually.”

“Virtual cinematography is much bigger than virtual production,” states Janek Lender, Head of Visualization at NVIZ. “I’m focused on using it in pre-production and storytelling. When I’m working with the director and visual effects supervisor, it’s always about getting a virtual camera in a virtual space, to look around and make a sequence of shots, because my end goal is to have a previsualization. Then they take my previs and break it down for visual effects and work out what’s LED walls and spaces. But you can’t do that until the visualization is there and the director can go, ‘That’s what I want to make.’” The virtual camera system is paired with Unreal Engine to allow the director and cinematographer to take shots of the previs world in real-time. “The good thing about doing your previs there is that those assets can be reused for your postvis. Once they shoot their plates, you can fill the screens with what you did in the previs world or use it as b-roll to help gauge the people who are going to make the volume.”

TOP: The physical actions of Rowan Atkinson dictated the placement of the insect adversary in the Netflix series Man vs. Bee. (Image courtesy of Netflix) BOTTOM: Framestore was responsible for the CG bee, with practical proxies of various sizes standing in to get the proper framing and lighting. (Image courtesy of Netflix)

ILM and its StageCraft technology are among the pioneers of virtual production. “Now we’re going into this different world with virtual production and LED walls where we’re closely tied to the director of photography and camera teams,” remarks Chris Bannister, Executive Producer, Virtual Production at ILM. “One thing that has always defined ILM is the blend of the physical and technology; that’s what makes the beautiful images. There’s always a back and forth. We spend a lot of time making our StageCraft toolsets be in a language that people want to speak, to make sure that the color temperatures match, or when the DP says, ‘Go up or down one stop,’ it’s a real thing. That’s something that is not traditionally in digital toolsets. Working in CG, sometimes people don’t work in those units. What we always try to blend well together is having those two things in dialogue because it’s what gets you the best results.”

A driving force is the adoption of real-time game engines. “The real benefit of real-time that we’re seeing across all of these shows is that it gives our creatives more bites at the apple,” states Landis Fields, Real-Time Principal Creative at ILM. “With traditional visual effects, visual effects supervisors and all of the folks who are involved will be looking at a monitor to review a shot through screenshare and with a little tool draw around a thing and say, ‘This needs to be bluer.’ You’re trusting a lot of folks to interpret what you’re trying to say. When we do that now in real-time, we don’t guess. We have a screenshare up of a living, breathing world in 3D and ask, ‘What do you want?’ ‘The light should come up from down there.’ ‘Like this?’ ‘A little bit lower. That, right there.’ I cannot stress enough the exponential value of the savings that just happened. Because now we don’t have to drag the whole thing through the process of guessing that we’re right, showing it a week later and learning that we’re not.”

Technology is constantly changing, and virtual production is no exception. “You can start seeing some of those changes now with the recent updates in AI,” remarks Kevin De Lucia, VFX Supervisor at Vū. “AI is going to become a driving factor. Eventually, we’ll be able to start creating environments from certain AI programs. You see how fast that is growing, and it will help change the industry as we move forward. If you go into the mindset that AI is a tool, it will get you a starting point. AI is something that gets your creative juices flowing. It’s not the one click, you’re done and ready to go. You still have to finesse it and put your secret sauce on it and get things optimized for virtual production. It’s an amazing concepting tool. For the immediate future, I see

TOP TWO: Previs remains a central element for working out shots and story beats for directors and cinematographers, as was the case for Matilda the Musical. (Image courtesy of NVIZ and Netflix)

The assets created for previs can be reused in postvis.

(Image courtesy of NVIZ and Netflix)

BOTTOM TWO: The system for lens flares and aberrations was revised to ensure that they were consistent throughout Lightyear.

(Image courtesy of Disney/Pixar)

Cinematographer Roger Deakins consulted on WALL-E, which led to Pixar figuring out how to do lighting during layout.

(Image courtesy of Disney/Pixar)

LED screens being the main platform, but those will evolve, too, like the resolution and processing power behind them, everything from your pixel pitch and the different ways of pushing the resolutions that you need to be able to have high fidelity. There are a couple of things that we’re working on in R&D where we’re not using an LED panel. There are these LED transparent screens where you can do things that are more interactive with gestures and motions.”

Luc Delamare, Head of Technology at Impossible Objects, has a passion for bridging live-action photography with virtual production. “Cinematography has evolved over the decades, but at its core it’s still moving images, how you approach coverage and the way you create emotion out of frame,” Delamare observes. “All of those things are rooted in the same concepts, and it’s up to the cinematographer, artists and director on how to use those tools. I would like to think when you break that language, your audience will notice even if they don’t understand it.”

Delamare continues, “We’re able to stage previs and imagery upfront in pre-production at a much higher fidelity level. You’re not spending a week or two looking at grey-scale images of your scene, but actually doing something where you’re already making lighting choices well before you normally would in a CG pipeline. In terms of VR scouting, you can scan a room or exterior with your phone and bring it into Unreal Engine in a matter of minutes. It’s freeing, as opposed to a director and cinematographer having to work with a visual effects supervisor and artists to explain the things they want.” AI is a scary proposition. “I used to think that my job was the safest from robots. I’m sure that you’ll be able to tell an AI, ‘This is the style I want.’ You can see it already with some of the images that people are doing.”

“We still have those same challenges of traditional cinematography where we have to overcome sense of scale and grounding, where the actors are in space, and how the lighting is hitting the actors; with that in mind, I don’t know if virtual cinematography is really changing the game,” notes Addy Ghani, Vice President of Virtual Production at disguise. “When you build a digital world in Unreal Engine, it’s hard to visualize it all. You don’t know how big those trees are, so putting on some headsets and location scouting is helpful for directors and cinematographers to get a sense of scale and space. You can actually do it without the glasses. An LED volume is useful to stand in and take in the environment; you could always look through the camera to see how you are going to frame up a shot.”

An important element is getting individuals educated properly as technology advances old-school techniques such as rear projection, which is the foundation of virtual production. Ghani comments, “You still need training to be able to harness the power of the tools. Making the sunrise and environment look realistic enough for high-end feature films is the difficult part, and that is where training and expertise come in.” There is room for improvement, Ghani adds. “I would love to get even better photorealistic, higher-quality output out of Unreal Engine; that would game-change a lot of shots. Right now, numerous shots still require some post-visual effects enhancement to get them to that final level of completion. Bringing that stuff to principal photography would save time and money and give directors and cinematographers immediate creative decision-making.”

62 • VFXVOICE.COM SUMMER 2023 VFX TRENDS
TOP TO BOTTOM: Vhagar is visualized live in rehearsals for Episode 104 of House of the Dragon utilizing The Third Floor’s portable AR Simulcam app Cyclops. (Image courtesy of The Third Floor and HBO) The Blight sequence in Ant-Man and the Wasp: Quantumania where Scott Lang confronts himself. (Images courtesy of Marvel Studios and The Third Floor) Vū believes that AI is going to become a starting point for creating environments. (Image courtesy of Vū)
“If you go into the mindset that AI is a tool, it will get you a starting point. AI is something that gets your creative juices flowing. It’s not the one click, you’re done and ready to go. You still have to finesse it and put your secret sauce on it and get things optimized for virtual production. It’s an amazing concepting tool.”
—Kevin De Lucia, VFX Supervisor, Vū

DIVERSITY REIGNS SUPREME IN SPIDER-MAN: ACROSS THE SPIDER-VERSE

Much like The Matrix, which sent a ripple through the film industry with its innovative storytelling, Spider-Man: Into the Spider-Verse had a similar impact, its creative influence going beyond its Oscar win for Best Animated Feature to inspire several imitators. The first of two proposed sequels, Spider-Man: Across the Spider-Verse, ups the ante by showcasing not only the world inhabited by a slightly older Miles Morales but also those serving as the homes of Spider-Gwen, Spider-Man 2099 and Spider-Man: India. And let’s not forget that the cast of Spider-People has been multiplied by 10, and Miles finds himself clashing with them over how to handle the portal-empowered villain known as the Spot.

OPPOSITE: Taking part in the battle at the Guggenheim are the motorcycle-driving and pregnant Jessica Drew/Spider-Woman and the anti-hero Miguel O’Hara/Spider-Man 2099.

To handle the expanding multimedia aesthetic of the franchise, Sony Pictures Imageworks created a new department called Look of Picture. “On this one, there are so many technical hurdles and different looks that I’ve spent 80% of my time developing tools,” states Bret St. Clair, Senior Look Development and Lighting TD at Sony Pictures Imageworks. “We saw a lot of art that depicted paint being thrown around in-frame in ways where strokes aren’t specifically connected to a character or to anything. They’re floating in space. The problem for us is we try to approach everything by rendering in a traditional way. Then we try to have all of our tools handle the rest in compositing. When it comes to positioning brushes in 3D, we needed to be able to reconstruct where those positions are, even when they don’t lie specifically on the surface that you’re applying the brush to, and the rendering tools, especially in 2D, don’t give you that information for free.”

Watercolor was the media of choice for Spider-Gwen’s world. “A lot of the look early on was inspired by the Jason Latour art, and everything was running vertically in that world,” St. Clair remarks. “As you start to move cameras around and we started to get through shots, it became clear we can’t have everything running

Images courtesy of Sony Pictures Animation. TOP: One of the most difficult tasks was to slightly age Miles Morales and Gwen Stacy without them appearing unfamiliar. TOP TO BOTTOM: Concept art of the adversary known as the Spot, who is made of paint and ink and has the ability to dissolve worlds into nothingness. Medieval Vulture, inspired by the drawings of Leonardo Da Vinci and made of sepia-toned paper, turned out to be the most complicated character to execute.
ANIMATION

vertically. It’s a balancing act, because if you’re painting brushed volumes in a scene and the character starts to move, the brush volumes themselves get your attention as opposed to the thing you’re supposed to be looking at, or they give you the impression that they’re something other than a volume. There are a lot of things that you can do in 2D that work because it’s a still frame, but as cameras move, it falls apart. Over the time we’ve been working on Gwen’s World, the tools have had to evolve so that the brushes automatically understand the surfaces that they’re painting onto and paint in the right directions, or we have the ability to add hand-drawn strokes to something interactively.”

Linework is a major component of the look of Spider-Man: Across the Spider-Verse, with the tool development responsibility given to Pav Grochola, FX Supervisor on the film and Lead VFX Artist at Sony Pictures Imageworks. “When you’re creating linework, you’re starting off with a 3D model, and we try to make it look as hand-drawn as possible, which means that you’re trying to emulate the artistic process,” Grochola explains. “What an artist would do in the case of Medieval Vulture is emulate Leonardo Da Vinci as much as you can. But all of that stuff is camera dependent, so you can’t be 100% procedural, because 100% procedural looks like toon shading. We actually created our linework inside of Houdini, and it was driven by artists. The linework itself was built from curves. We had a dedicated team of artists looking after the linework just for Medieval

“There are so many technical hurdles and different looks that I’ve spent 80% of my time developing tools. We saw a lot of art that depicted paint being thrown around in-frame in ways where strokes aren’t specifically connected to a character or to anything. They’re floating in space.”
—Bret St. Clair, Senior Look Development and Lighting, Sony Pictures Imageworks

Vulture. The way you make something has a big impact on how it looks, so as much as possible we try to create procedural linework in Houdini, but also have artists draw linework on keyframes. Then that linework would be interpolated by our systems. It was a combination of hand-drawn and procedural. That’s the big secret with this stuff.”
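One way to picture the hybrid Grochola describes, where artists draw strokes on keyframes and the system interpolates the in-betweens, is a point-by-point blend between matching stroke curves. A minimal sketch (the stroke representation and linear blending here are assumptions for illustration, not Imageworks’ actual system):

```python
# Sketch: interpolate hand-drawn line strokes between keyframes.
# A stroke is a list of (x, y, z) points; artists draw strokes on
# keyframes and the in-between frames are blended point by point.
# Illustrative only -- a real system would also resample strokes and
# re-project them onto the animated 3D surface.
import bisect

def lerp(a, b, t):
    """Linearly blend two points at parameter t in [0, 1]."""
    return tuple(pa + (pb - pa) * t for pa, pb in zip(a, b))

def interpolate_stroke(stroke_a, stroke_b, t):
    """Blend two strokes with the same point count."""
    if len(stroke_a) != len(stroke_b):
        raise ValueError("strokes must share a point count (resample first)")
    return [lerp(p, q, t) for p, q in zip(stroke_a, stroke_b)]

def stroke_at_frame(keyframes, frame):
    """keyframes: {frame_number: stroke}. Returns the stroke shown at `frame`."""
    keys = sorted(keyframes)
    if frame <= keys[0]:
        return keyframes[keys[0]]
    if frame >= keys[-1]:
        return keyframes[keys[-1]]
    i = bisect.bisect_right(keys, frame)  # find the bracketing keyframes
    f0, f1 = keys[i - 1], keys[i]
    return interpolate_stroke(keyframes[f0], keyframes[f1], (frame - f0) / (f1 - f0))
```

As St. Clair notes above, production tools went further, teaching the brushes to understand the surfaces they paint onto so the strokes survive a moving camera.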

For the Spot, Sony Pictures Imageworks worked with Slovakian creative software company Escape Motions. “In terms of the technique, we did something interesting in this film in that we partnered with Escape Motions, which creates this painterly software called Rebelle that has an inbuilt watercolor solver, and you can do cool stuff like wet the canvas,” Grochola remarks. “You can put ink [on the wet canvas] and the ink spreads into the paper. We always tried to simulate the natural organic detail of painting in real life. Right at the beginning, we were working with that tool, and one of our biggest goals was to try to make the movie look as hand-drawn and handmade as possible; that tool to us was the perfect thing for experimenting with natural media. Not only can Rebelle do watercolor but also oil paint and charcoal, in such a convincing way that it would be hard for us to develop from scratch. We merged that software into our software and created this cool 3D and 2D combination where the Spot is made of paint and ink, and moves around leaving ink droplets; he is himself constantly redrawn with natural media.”

Medieval Vulture is made out of sepia-toned paper. “We have him as translucent like paper, so he scatters light through,” remarks Mike Lasker, VFX Supervisor at Sony Pictures Imageworks. “His character is probably one of the most complicated characters we have ever made because he is this Leonardo Da Vinci Medieval-style Vulture character with tons of pulleys and feathers, and he’s got all of this stuff going on plus all of his linework. A lot of linework

TOP TO BOTTOM: The visual aesthetic for Spider-Gwen was based on the original comic book artwork by Jason Latour. Sony Pictures Imageworks took advantage of the inbuilt watercolor solver of Rebelle by integrating the hyper-realistic painting software into its pipeline. The visual aesthetic of Mumbattan was inspired by Indrajal Comics, which was a comic book series in India.

we do hand-drawn also. We mix actual 2D-drawn lines in with a lot of our stuff to give it that last piece of hand-drawn quality. But you need materials. What was a challenge on the first one, as on this one, is: what does a Da Vinci metal look like versus a Syd Mead metal? We do our traditional look development to a point and then we hand it off to our Look of Picture team, where we do a style on top of that. I was just looking at an environment yesterday, and it’s in a world of pistons and gears and you’re in an entire environment of metal. Our look development looked realistic, but I tried to get the team to paint enough texture in there, in the style, so that when we apply the Look of Picture tool it all works cohesively.”

Simulations have to take on the characteristics of the world in which they occurred. “In just the principles of film work and photography, we have to come up with how do we do depth of field in a painterly way versus a more architectural way versus how we do it in Miles World versus in India,” Lasker notes. “You have to experiment and be constantly ready to fail over and over again or at least try a lot of different things. One of the things we learned on the first one is you can’t inch your way to the finish line. You have to overdo it, and then figure out, what do we like from that, what is working and what is not? What does a lens flare look like in Spider-Man 2099 world versus Spider-Man: India? Every aspect you have to reinvent. The first film was really just one look.” Explosions have to be stylized and believable. “A lot of it is constantly playing with exposure, blowing out the lens, and how the light from the explosion affects the world around it,” comments Lasker. “If the explosion is in Gwen’s World, you want to feel the heat of the fire on the surfaces, and the shadows being cast are brushed like a painting, or the warmth on the surfaces needs to be brushed. How do those brushes react differently to what has already been brushed in the environment? You’re not

The industrial artwork of award-winning visual futurist and conceptual artist Syd Mead, member of the VES Hall of Fame, was the inspiration for Nueva York, which is located on Earth 928, where Spider-Man 2099 lives.

Videos on YouTube were referenced to better understand how artist/designer Syd Mead went from the blank page to creating fully-realized, retro-futuristic worlds.

TOP TO BOTTOM: Concept art of Spider-Man: India, Spider-Gwen and Miles Morales battling the Spot in Mumbattan.

only creating a painting, you’re having to paint the effects of the effect on the environment.”
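One way to think about what Lasker describes, reinventing each camera effect per world, is as a per-world style table that the same depth-of-field or lens-flare operator consults before rendering. A toy sketch (the world names and style labels are invented for illustration, not the studio’s actual setup):

```python
# Toy sketch: each universe renders the same camera effect in its own
# medium, so the effect operator looks up a per-world style first.
# World names and style labels are invented for illustration.

WORLD_STYLES = {
    "gwen":      {"dof": "watercolor_wash", "flare": "brushed_streak"},
    "miles":     {"dof": "halftone_blur",   "flare": "comic_starburst"},
    "2099":      {"dof": "architectural",   "flare": "neon_anamorphic"},
    "mumbattan": {"dof": "ink_bleed",       "flare": "painted_bloom"},
}

def camera_effect_style(world, effect):
    """Return how a given camera effect should be stylized in a world."""
    try:
        return WORLD_STYLES[world][effect]
    except KeyError:
        raise KeyError(f"no {effect!r} style defined for world {world!r}")
```

The point of the table is exactly Lasker’s observation: a lens flare is not one look but one look per world, and the lookup keeps that decision explicit.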

Spider-Punk evolved in animation. “We had to figure out all of the ways to handle his layers, and when you have a character who is meant to look like a magazine cut-out in a fleshed-out world that already has all kinds of crazy art direction in it, it’s hard to make it all cohesive,” observes Alan Hawkins, Head of Character Animation at Sony Pictures Imageworks. “But we found a formula once we got into the shots, like offsetting the frame rates of certain aspects of his body from other parts of his body. There was speculation on the first one that there was meaning behind the frame rates, but that was not the case. His jacket sometimes is at a different frame rate than his body, and it’s to give him a patchwork collage feeling, and there are some slightly different layers to him being pasted together, which we did a lot of tests on; he is on 3s and 4s sometimes whereas everyone else is on 2s.” The voice acting of Daniel Kaluuya was a surprise. “It had such a swagger to his read that made the character so clear to us. Spider-Punk is probably one of my favorites now because he is consistent in his presentation to everyone else, except for a few key moments where you get to see his real truth, and that’s an amazing thing that we got to work with.”
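The “on 2s, 3s and 4s” idea Hawkins describes is easy to make concrete: a part animated on Ns only gets a fresh pose every N frames, holding the previous pose in between, and offsetting where each part’s holds land keeps the layers out of sync. A minimal sketch (the hold/offset scheme here is an assumption for illustration, not Imageworks’ rig logic):

```python
# Sketch: "on 2s / 3s / 4s" playback -- a part gets a fresh pose only
# every `step` frames and holds the previous pose in between.
# Offsetting the step per body part (jacket vs. body) makes their
# updates land out of sync, giving the patchwork collage feel
# described above. Illustrative only.

def held_frame(frame, step, offset=0):
    """Map a playback frame to the pose frame actually displayed."""
    return max(0, ((frame + offset) // step) * step - offset)

body   = [held_frame(f, 2) for f in range(8)]            # on 2s
jacket = [held_frame(f, 3, offset=1) for f in range(8)]  # on 3s, offset
# body   -> [0, 0, 2, 2, 4, 4, 6, 6]
# jacket -> [0, 0, 2, 2, 2, 5, 5, 5]  (new poses land on different frames)
```

Note how the jacket refreshes on frames 2 and 5 while the body refreshes on 0, 2, 4 and 6, so the two layers never settle into lockstep.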

“In the upgrade of Miles from a young teen to an older teen, his growing up a little was one of those things where, when the new shapes came into play, we handled it wrong; he no longer looked like himself,” Hawkins reveals. “Miles has a particular cheekbone structure and eye-corner shaping, a distinctive silhouette from the side, and the way his mouth sat in the first movie gave him a bit of an overbite. As they aged him up a little bit, you want to give someone a more pronounced jawline. That stuff did affect the way Miles looked, and he became a little unfamiliar. We had to make adjustments to how the rig would behave and how we handled the posing to make him look like a grown-up version of that same Miles everybody loves.”

TOP TO BOTTOM: Concept art exploring what the watercolor skyline of Manhattan would look like on Earth 65, which happens to be the home world of Spider-Gwen. Linework is a major component of the look of Spider-Man: Across the Spider-Verse and had to appear hand-drawn despite starting off with a 3D model. Simulations take on the characteristics of the world in which they occur.

A massive focus was placed upon cinematography, composition and camera language. “That’s one of the things that we would say to every animator when they would join the show: you have to learn that language about overs, French overs, two shots, what lenses mean, which lenses to use in certain circumstances. Every animator was basically an honorary DP on this film because that was half of what we were handling.” Hawkins would love to do a reference cut for the whole movie. “I would like to point out how much the animators’ soul goes into these performances. For every scene where you see someone crying up there, that animator probably got to that place and filmed themselves doing it and re-referenced that stuff. It’s so complex and adult in a great way that we don’t often get in animation.”

Masks were incorporated into the texture paint workflow. “For the texture painter, it’s color, bump, roughness and maybe a metallic,” observes texture painter Nicolle Cornute-Sutton. “Those are the traditional PBR types. On Spider-Verse, the laws of physics don’t apply, so physically-based rendering means nothing! Metal doesn’t shine like metal in our world. You have to think of things from a 2D perspective. For example, there is a dinosaur in our movie, and usually we would displace the scales to sell this bumpy, scaly reptilian creature, but you can’t do that in the Spider-Verse, so it’s almost working in a drop shadow and tracing things and giving it a more graphic read.” Templates were created. “The art department wanted it more stylized, and the only way you can do that is by hand. As our team grew, we had to make sure that we laid down foundations in Mari for new painters. We had to develop a lot of templates using node graphs so people could pick up a template and at least have a starting place, so no one had to figure out for themselves, what is Miles’ World? Then we had to have a robust library of textures to show people that this is the target. That was new to me, having to start from scratch.”

A complex battle takes place in the Guggenheim Museum. “It’s in Gwen’s World, but the Guggenheim is actually filled with photorealistic artwork created by Jeff Koons,” Cornute-Sutton states. “It was one of the only times in the movie where we actually did go into Substance 3D Painter. We had photorealism, an architecturally significant building that we had to recreate, but in Gwen’s World style, and there is a helicopter bursting through the skylight in the ceiling. Vulture is in this sequence, and he’s from the Leonardo Da Vinci Verse, so he’s in a completely different sepia tone style. Then Spider-Man 2099 comes in with this digital suit that comes on in bits and bytes; his right-hand gal Lyla pops in and she’s a hologram. Jessica Drew hops in on her motorcycle. And there’s Gwen and her father. They’re all in their styles.”

Diversity is paramount in the Spider-Verse. Observes Cornute-Sutton, “I felt like this was the first time I got to paint people who look and sounded like me and my family. That was an incredibly exciting and thrilling thing. The audience is going to see that and feel that too. It’s not just about race or gender. I hope that this is just the beginning of many films where there’s a cognizant realization on the part of the filmmakers that we all want to see ourselves in CG features. I cannot help but say kudos to Sony Pictures Animation for taking a chance on making a movie that speaks to a broader range of audiences than has been done before.”

TOP TO BOTTOM: The webbing for each of the Spider-People, including Spider-Gwen, had to reflect the visual aesthetic of the world from which they come. To handle the expanding multimedia aesthetic of the Spider-Verse franchise, Sony Pictures Imageworks created a new department called Look of Picture. The world of Miles Morales is visually defined by nongradient colors and a graphic comic book style.
“The way you make something has a big impact on how it looks, so as much as possible we try to create procedural linework in Houdini, but also have artists draw linework on keyframes. Then that linework would be interpolated by our systems. It was a combination of hand-drawn and procedural. That’s the big secret with this stuff.”
—Pav Grochola, FX Supervisor/Lead VFX Artist, Sony Pictures Imageworks

CRISS-CROSSING THE DIGITAL UNIVERSE WITH ROGER GUYETT

Roger Guyett grew up in the Farnborough area in Hampshire, England, where he went to school, then college in Bristol, before eventually finding his way into the world of computer animation in the mid-1980s. “After college, I didn’t know what to do with myself and ended up doing construction jobs, played music in bands around London, lived in France, and generally had a good time, but it wasn’t really taking me anywhere. I did an art foundation course and developed an interest in animation and the idea of making images with computers. Eventually, a friend told me that the British government was funding people to retrain in computer science, so I applied and did an MSc at University College London. I wasn’t quite sure where it was going to take me, but it felt like a positive step. This was all before people had home computers or even email addresses, which sounds mad as it really wasn’t that long ago!” says Guyett.

After completing the course, Guyett saw an intriguing job advertisement in the Evening Standard for a position at a London post-production company. “As part of the interview process, they asked me if I could do an animation of a flag, an analysis of how it might move,” Guyett explains. “I did the best I could and fortunately was offered the job. At that time, people with all sorts of backgrounds were starting to work in computer animation. There was no specific training, no college courses in Computer Animation – there were probably only 20 of us working in the entire industry in London. I hadn’t known this world existed, but I’d stumbled into something that was a perfect fit for me, and I loved it. Working at a post-production company was incredibly exciting, and you were exposed to all sorts of work – TV idents, commercials, pop videos. The people were amazing, and you got to learn every aspect of production from storyboarding to editing. We even acted in some of the work we did when we didn’t have much budget.”

OPPOSITE: Guyett at Pinewood with many familiar faces from the Star Wars franchise. Guyett at the premiere of Star Wars: Episode VII – The Force Awakens, 2015.

Continues Guyett, “In those days, commercials were often shown in cinemas on film. The commercials were authored on video and then crudely transferred onto film. To do proper digital effects, you had to be able to scan film to digital and then, of course, get the resultant digital images transferred back onto film at a much higher fidelity. This issue was an ongoing development in the industry and crucial to the process. ILM, of course, had solved that problem a few years earlier. One of my proudest moments at that time was working on a Perrier commercial that was one of the first examples of a hi-res digital film-out in the U.K. The technology had been developed by Mike Boudry, who had a company called the Computer Film Company, which was eventually acquired by Framestore. That was the first film work I was involved with,” Guyett notes.

Guyett’s big break came when he was hired by Pacific Data Images (PDI) in 1992 and moved to California to work in the rapidly expanding world of digital film effects. He worked at PDI for two years before moving to ILM in 1994. “When I was a kid, I loved movies, but I never imagined I’d get to work on one! The digital VFX world suddenly exploded after movies like Jurassic Park came out, and there were a lot of great opportunities in America for people who knew how to do that work. When I arrived

Images courtesy of Roger Guyett. TOP: Roger Guyett joined ILM in 1994 to work on the groundbreaking computer animation for Casper. LEFT TO RIGHT: Guyett with daughter Ella at the 2009 Academy Awards for Star Trek. Guyett at the 2018 BAFTAs for Ready Player One. Mission: Impossible III VFX team in China, 2005. From right: Guyett, Tom Peitzman, Marty Bosworth, Duncan Blackman and Chris Raimo. Guyett with Star Wars: The Force Awakens co-writer Lawrence Kasdan and J.J. Abrams’ executive assistant, Morgan Dameron. Guyett on location in China for Star Wars: Episode III – Revenge of the Sith.
PROFILE

at ILM, there was actually still a mix of more traditional VFX, like miniatures. A lot of work was still done using optical techniques, but the new horizon was the digital side. My first show was Casper with legend Dennis Muren as Digital Character Supervisor. I was a senior technical director there – you lit the shot, did any effects, and then composited it all together. There was a strong delineation between the disciplines, which became even stronger as each of the disciplines became more and more complex. From there, I was lucky enough to become a CG supervisor on shows like Mars Attacks! and Twister. That show pushed particle systems to a whole new level – all sorts of tools and ideas were developed for that film. There were a lot of very smart people working at ILM. It was amazing seeing ideas like ambient occlusion, for example, get developed there.”

Guyett worked with VFX Supervisor Stefen Fangmeier while at ILM. “He was one of the original digital guys at ILM and had worked on Terminator 2: Judgment Day and Jurassic Park. We’d worked on Twister together. He needed a co-supervisor for Speed 2 and he was kind enough to ask me. That got me on the VFX Supervisor path at ILM, which I’m eternally grateful to him for,” Guyett adds.

In 1997 Guyett worked on Saving Private Ryan, which proved to be a turning point. “Although Stefen was the main supervisor,


he got really busy on another show, and it meant I was essentially on my own. There’s nothing like feeling the heat and pressure of your first solo show as a VFX Supervisor! It was a great mix of both digital and practical effects. The biggest digital shot was the beach establisher with all the ships, landing craft and troop replication – doing all the interactive water was such a challenge then. We did a lot of effects work – explosions, bullet hits and flying debris and plenty of grotesque wound FX work, standard fare in a war movie these days. And, of course, all the tracer fire! Steven Spielberg was a real force of nature at that time and at the top of his game. I was pretty inexperienced on set and suitably nervous, but had to step up and deal with it all. At that time, you didn’t have the ability to do the number of takes you can do now; the computers were a lot slower and disk space was at a premium. It was actually one of the first shows where we used the internet as a research tool. I remember typing in ‘D-Day’ and we got back three or four images! What an incredible experience that show was. I was dealing with one of the greatest filmmakers in the world and watching him work firsthand,” Guyett explains.

Guyett’s credits also boast two Harry Potter films. “Rob Legato was the Production Supervisor on The Philosopher’s [aka The Sorcerer’s] Stone. I came and joined him as one of the main supervisors, working through the shoot and then supervising ILM’s work. One of the first sequences I helped shoot was the snake in the zoo. I learned so much from Rob – he was a big influence on me. He had

LEFT TO RIGHT: Star Wars: Episode VII – The Force Awakens second unit on location in Iceland. Guyett worked with some of his closest collaborators on Star Wars: Episode VII – The Force Awakens, including Pat Tubach and Animation Supervisor Paul Kavanagh. Saving Private Ryan town set. Saving Private Ryan beach set. J.J. Abrams with the camera crew on Star Wars: Episode IX – The Rise of Skywalker.

such a strong perspective as a filmmaker, something that I tried to develop through my career. You’re not just looking at the VFX work artistically and technically, but also trying to understand the context of the work and how it contributes to the film as a whole. Warner Bros. was so anxious about the first movie, which is so funny to think of now. You don’t get to work on bigger or more complicated movies than something like Harry Potter and, again, it was a great learning experience. It was an amazing mixture of miniatures, practical and digital work.”

Guyett served as the main visual effects supervisor on Harry Potter and the Prisoner of Azkaban. “It’s still one of the strongest visual movies I’ve ever worked on. Alfonso Cuarón is such a visual director – the shot compositions, the way his shots develop. He didn’t have a lot of experience doing VFX, but he had such great ideas. I learned a lot about filmmaking from him. It was a big and complex show and we had a great team. Mike Eames was the Animation Director, and Tim Burke was the other main supervisor on the show. The complexity of the work often demanded innovation and a tremendous amount of planning to set the shots up for success. That’s something that seems sadly less important these days. For example, we developed some great motion-base work tied into the animation for the sequence when Harry rides Buckbeak the hippogriff. It was really gratifying seeing the final shots come together, seeing that Harry’s movement was so integrated with the animation because of the mo-base work we’d

TOP TWO: Guyett also worked on the action-packed Mission: Impossible III.

BOTTOM LEFT: Working as both the Second Unit Director and VFX Supervisor on Star Trek allowed Guyett to have more control and authorship of the shots.

BOTTOM RIGHT: Guyett on the set of Star Trek.


done. The Time-Turner sequence was another very memorable moment for me. It was probably one of the most complicated and most planned shots I’ve ever worked on. You had so many different elements. Different plates of the cast and background actors moving at different frame rates, but all tied together with moving lighting. Then multiple miniature shots of models at different scales all comp’d together,” Guyett details.

The Knight Bus sequence also proved to be particularly challenging to create. “Alfonso wanted the freedom to move the camera at will inside the bus. We had to figure out a way of shooting a 360 environment plate, something that people do all the time now!” Guyett says. “We built a 14-camera film rig on a vehicle. All the heads were stabilized, and we worked out the visual overlap between all the cameras to create the necessary plate for the interior of the bus. Of course, the bus windows helped because they restricted the view a bit. We’d go to all the locations, which were super expensive as it was all shot in the middle of London, and only needed one or two runs of the vehicle to do all the plates we needed, which, of course, production loved. Then, in post we used a terrifying number of Avids to sync all the footage together, which then allowed us to figure out how to move the interior bus set on stage, essentially syncing the motion of the bus set to the background plates we’d shot. If the bus turned left, it leaned appropriately, and because we had a full background plate, Alfonso could shoot in any direction. Trying to explain all this to Alfonso was near impossible, but fortunately he trusted me and it all worked out. Another trick was using a bunch of projectors, which then projected the same footage back into the bus itself. It gave you this great sense of texture and speed and excitement. Alfonso certainly tested your ingenuity and imagination.”

One of Guyett’s most significant contributions to visual effects was his work on J.J. Abrams’ Star Trek. “At this point, I was more confident and had developed and refined my approach more. Working as both the Second Unit Director and VFX Supervisor really allowed me more control and authorship of the shots. One of those touchstones on Trek was that famous image of the Earth photographed by the Apollo astronauts from the moon. Half the Earth is in shadow and half is lit. When we made Star Trek, that image really inspired my approach to the work. It seemed to encapsulate the idea of exploration, the potential danger, the unknown, traveling into the shadows, but also light and the strong contrast of the potential outcomes and, of course, simply put, the kind of lighting style I was excited about. We also famously

TOP TO BOTTOM: Guyett and J.J. Abrams have collaborated on five movies. Guyett on the set of Star Trek. One of Guyett’s most significant contributions to visual effects was his work on J.J. Abrams’ Star Trek.
“[J.J. Abrams and I] have worked on five movies together. He was somebody that I learned from tremendously and was able to grow with. He’s extremely visual but complements that with his background as a writer. I think we have common sensibilities and similar tastes, which of course really helps. We’ve done so much work together now that we have developed a shorthand. You just don’t have to spend so much time communicating your ideas, you can cut to the chase. He still surprises and challenges me...”
—Roger Guyett, Visual Effects Supervisor/Second Unit Director, ILM

developed a whole ‘lens flare’ language for the film, something that I see now in other movies. It created this visual energy, like you were seeing into the future or something. It was certainly vibrant but also very dark. J.J. and DP Dan Mindel did such a great job with that movie. It was so well cast and so much fun.”

Discussing his relationship with Abrams, Guyett notes that he was fortunate enough to work with Abrams at the beginning of his career as a film director. “We’ve worked on five movies together. He is somebody that I learned from tremendously and was able to grow with. He’s extremely visual but complements that with his background as a writer. I think we have common sensibilities and similar tastes, which of course really helps. We’ve done so much work together now that we have developed a shorthand. You just don’t have to spend so much time communicating your ideas, you can cut to the chase. He still surprises and challenges me – he’s incredibly inventive. I’m very proud of the work that we’ve done together. I’ve been very lucky, he’s been a great partner.”

Guyett also collaborated with Abrams on Star Wars: Episode VII – The Force Awakens. “I worked with some of my closest collaborators again, people like Pat Tubach and Animation Supervisor Paul Kavanagh. On that show we finally saw the opportunity to upgrade the famous lightsabers! It always bugged me, especially having done Revenge of the Sith, that you didn’t get the correct interactive light from the sabers themselves, you were always cheating the interactive light as a separate light source. On Force Awakens there’d been some really amazing advances in LED tech, so we could finally build a lightsaber that was literally a tube of LED lights. Finally the prop was a real light source! It really helped the quality and realism of the work. DP Dan Mindel was also a great partner on that show, and he really took advantage of the new lightsabers in some key moments in the film. It’s great to work on movies that have the budgets to do that kind of work. Regardless, you’ve always got to be cognizant of budget and try to design the shot within whatever parameters you have for the best-looking shot.”

“As a visual effects supervisor, you’re often one of the first to join a production and the last to leave – you get to see the entire process unfold,” Guyett reflects. “I’ve been lucky enough to work on such a variety of projects, from Pirates to Mission: Impossibles, and work with some incredibly talented people. I enjoy every aspect of the process: the planning, the shoot and, of course, working with the team to create the shots and see the movie come together. In hindsight I was lucky to start supervising at that crossroads between more traditional VFX techniques and the digital world, so I was fortunate to do a lot of miniature work, which I always loved. For example, on Star Wars: Episode III we had a massive practical model crew, despite the huge amount of digital work. It had one of the biggest miniature shoots in ILM’s history: the Mustafar scenes, which I supervised. It’s such a shame that it’s hard to do that work anymore. One of the fundamental ideas I learned from working with directors like J.J. and Steve Spielberg was about trying to make sure that each shot you worked on really advanced the story, even in a small way. Each shot has a purpose – it has a reason to be in the movie.”

TOP TO BOTTOM: On location in the Skellig Islands, Ireland, for Star Wars: Episode VII – The Force Awakens. Guyett and the crew on location in Petra, Jordan, for Star Wars: Episode IX – The Rise of Skywalker. Guyett with DP Dan Mindel and Production Designer Darren Guildford on Star Wars: Episode VII – The Force Awakens. Guyett on location in Alaska for a Star Trek plate shoot.

DIGITAL CAST MEMBERS IN HIGH-END EPISODIC AND ANIMATION

Even with the proliferation of visual effects in high-end episodic, few shows feature central CG characters, probably because of tight budgets and schedules, and because a season’s screentime can be the equivalent of a trilogy of feature films. However, things are beginning to change with the plethora of soulful animals and supernatural beings in His Dark Materials, a legally smart green giant in She-Hulk: Attorney at Law, a faithful companion that is a detached hand in Wednesday, a toy in search of its owner in Lost Ollie, a CG-enhanced puppet that channels an up-and-coming Jedi master in The Mandalorian, indistinguishable digital doubles in The Falcon and the Winter Soldier, and a softball team a week away from a championship game in the CG-animated series Win or Lose.

Visual effects supervisors are utilized in different ways and given various levels of access on shows. “It’s not a job with a clearly defined role as most people think,” notes Russell Dodgson, Creative Director of Television, Framestore and Senior Visual Effects Supervisor for His Dark Materials. “I was fortunate to be in a production that understood the show, to a degree, lived and breathed through the right use of visual effects, and that giving me a seat at the table afforded them the potential to get through that work as cost-effectively, efficiently and creatively as possible.”

Understanding anatomy is critical in creating believable creatures. Remarks Dodgson, “We do a lot of deep dives into our creatures, and sometimes we have external experts come in and educate us on how everything is truly structured.” All creatures

TV/STREAMING

have a personality but many of them are not given character arcs. “At the beginning of Season 1,” Dodgson explains, “Iorek Byrnison has a completely different body language and his ability to look at somebody in the eye isn’t there. By the end of the season, Iorek has regained his pride, his posture is different and he can keep eye contact. It’s such a subtle thing, but it flavors the performance in exactly the same way an actor would.” When anthropomorphizing natural creatures, less is more, he says. “I always work body, head and then face in terms of what I care about. Most of the time when you have good animators, like we did on His Dark Materials, and the body is doing the right thing, you feel the emotion in the shot anyway, and the face is something you sweeten it with. You try to do as little as possible so it doesn’t get weird.”

There were numerous things that Jan Philip Cramer, VFX Supervisor at Digital Domain, had to learn when creating the CG title character for She-Hulk: Attorney at Law. “She-Hulk is a female character who is bubbly, positive, uses makeup and wears high heels and lots of different outfits,” Cramer notes. “One of the problems for us was how much subsurface you would see coming through, which is stuff that you would sometimes cover up with makeup. Similar with rouge; we would put some on lightly because we had to be so careful that her overall skin tone would stay untouched.” Unlike her green-skinned cousin, the physique was not modeled on a bodybuilder. Comments Cramer, “The whole story is about Jennifer Walters struggling with everybody wanting the prettier She-Hulk version of her, who is meant to be strong

OPPOSITE TOP: The great advantage Thing has is that he can move from all fours to upright to back to all fours.

(Image courtesy of MARZ and Netflix)

OPPOSITE BOTTOM: On Wednesday, magician Victor Dorobantu was able to provide a solid foundation for the character of Thing by having his hand as an onset stand-in.

(Image courtesy of MARZ and Netflix)

TOP: Animators on Lost Ollie treated the ears of Ollie as if they were a character in themselves.

(Image courtesy of Netflix and ILM)

BOTTOM: One of the hardest shots for ILM on Lost Ollie was when the price tag attached to the ear of Ollie gets caught on a pair of scissors, which leaves him dangling.

(Image courtesy of Netflix and ILM)


and elegant-looking. We had 12 different outfits and she needed to be wearing different bras. We had to develop a shape-wear system because we realize that when you wear a pyjama or super tight-fitting legging or ballgown, the body is a completely different shape [created] by the outside forces. We were running a muscle system that we needed to maintain. You basically have different base geometries, depending on the costume, that is completely different whether it is pulled together or not. This proved to be challenging.”

Given an opportunity to stretch its fingers in Wednesday is the disembodied hand known as Thing, which is fiercely loyal to the Addams Family and a constant companion of Wednesday Addams as she attends her parents’ alma mater, Nevermore Academy. “They were smart and found a way around the budgetary constraints by using [magician] Victor Dorobantu, who as an individual is very good with his hands to bring on what that character is and capture the audience,” states Lon Molnar, Co-President and Co-Founder of Monsters Aliens Robots Zombies VFX (MARZ). “We had a lot of ramp-up because Tom Turnbull [Production VFX Supervisor] took the initiative to say, ‘This is going to be challenging. We need to take advantage of every day now until we start shooting and the plates start coming in.’”

The cinematic aesthetic of filmmaker Tim Burton and his background as an animator were taken into consideration. States Molnar, “When you’re a fan of animation and studied it, you know he is going to be looking at frames, and posing and silhouettes are important to him.” The mandate was not to emulate previous versions of Thing. Adds Molnar, “Instead of studying hands, it’s studying characters in film. One of the things we were spit-balling was Buster Keaton. Some of the early tests were how a walk cycle would look with a hand. It’s looking at it like an upright human being walking. The great advantage Thing has is he can move from

TOP AND MIDDLE: Adding to the performance of Iorek Byrnison in His Dark Materials is a character arc which sees the panserbjørn regain his regal pride over the course of Season 1. (Images courtesy of HBO and Framestore)

BOTTOM: Creating believable creatures, even when they are fantastical like the Mulefa from His Dark Materials, requires an understanding of anatomy. (Image courtesy of HBO and Framestore)

all fours to upright to back to all fours. The direction we were getting was almost like parkour; when you see them, they’re leveraging their body parts.”

Embarking on an odyssey to reunite with his owner is a toy rabbit in Lost Ollie. “For Lost Ollie, we wanted something that had a flexible performance,” states Hayden Jones, VFX Supervisor at ILM. “A lot of the scenes are incredibly emotional, so we needed a character to convey that, but similarly we also had to restrict it to something that didn’t belong in the real world. Ollie was a handmade, stuffed toy rabbit, so we could not give so much animation control over the face.” Long patchwork ears are a signature physical attribute of Ollie. Remarks Jones, “In a traditional production, you would animate the character and leave the ears to a creature effects team to put on and do the dynamics for. But we couldn’t afford that iterative cycle between animation and creature effects. Instead, a system was designed where we would give animators control over the physics on the ears, so while animating they could hit a button and there was a Houdini engine plug-in that simulated the ears for them. As the animators were iterating their animation, they were also iterating the creature development on the ears, so both could be brought onboard at the same time. If needed, we could override it and let the animators take control of the ears and craft the performance that the directors really wanted. That was really tricky.” The CG supervisors came up with a semi-automated system for controlling the level of polyester fiber filling inside Ollie. Explains Jones, “An animator would pose Ollie to get a certain reaction, but when you simulate all of the internal stuffing, it would slightly repose him. What we needed to do was to dial it back in certain areas. We came up with a flexible system not only to make Ollie as real as possible, but also dialing back the reality so we could give him the intended performance in certain areas as well.”

TOP TO BOTTOM: Unlike her green-skinned cousin who was modeled upon a bodybuilder, She-Hulk had to be strong and elegant-looking.

(Image courtesy of Digital Domain and Marvel Studios)

Digital Domain developed a shape-wear system to accommodate the physique changing, depending on what outfit is being worn by She-Hulk. (Image courtesy of Digital Domain and Marvel Studios)

Tatiana Maslany uses old-school techniques to make sure that her co-stars maintain their proper eyelines when talking to She-Hulk.

(Image courtesy of Digital Domain and Marvel Studios)


Jones was also the Visual Effects Supervisor for the Grogu shots in London for Season 1 of The Mandalorian. Jones states, “We did some tests early on where we created a much more realistic Grogu that wasn’t based on the puppet. But as soon as we saw Grogu in his puppet form, everyone knew that was what we needed to do and keep. The puppet gives such a nice, nuanced performance and has such a feeling of heritage back to the Star Wars tradition of how the franchise has used puppetry and practical creatures throughout the shows. We still had lots of work to do because sometimes shots work and others need a little bit of enhancement. In fact, we ended up creating a great CG double for Season 1, and many times you won’t realize that you’re looking at a CG version of Grogu; that is down to the sheer artistry of everyone involved in the creation of the character and the animators who pored over what the puppet looked like to make sure that the CG version matched 100%. Sometimes we will take the puppet and only add eye blinks to it. Other times you’re doing more complex movements.” Jones adds, “Even though there is a lot of character animation in there, you know that the ground entry is the world’s greatest puppet, so you design your performance around that.” Jones notes that the costume of Grogu proved to be difficult. “The sheepskin collar with flowing cloth underneath was quite tricky to get right, but it all worked out in the end.”

Digital doubles have become a major part of action-oriented shows like The Falcon and the Winter Soldier for performance and safety reasons. “There is no time to waste if the sequence is over five minutes and you’re not sure where the digital double is going to be shown,” remarks Sébastien Francoeur, VFX Supervisor at Rodeo FX. “We have pictures, LiDAR scans and photogrammetry. Those guys are in booths and scanned, but the detail is not all there. We align everything and double check the model because it needs to fit. The eyes and teeth need to be perfect, and the folds on the clothing need to be near perfect. If you change half a centimeter between two eyes, it’s not the same guy again. It’s not working if you’re not doing the right smile with the right volume in the cheeks. You can go forever and forever when you’re doing that, but at some point you have to think about if the clothes need to be the same kind of folds or if it looks a bit different, nobody will care. It’s not about R&D. The client needs to be aware when we’re doing

TOP TO BOTTOM: The onset puppet of Grogu in The Mandalorian maintains a Star Wars tradition of relying on puppetry and practical creatures. (Image courtesy of Disney and Lucasfilm Ltd.) For The Mandalorian, a digital double was made of Grogu that matches both the appearance and physicality of the puppet, thereby allowing for a seamless integration. (Image courtesy of Disney and Lucasfilm Ltd.) The sheepskin collar with flowing cloth underneath was quite tricky to get right when creating the digital double for Grogu in The Mandalorian. (Image courtesy of Disney and Lucasfilm Ltd.)

that. The price also comes when we’re doing that. For The Falcon and the Winter Soldier, we had the time frame to build it right. If it’s not the case, we need to know every shot, where we’re going to fit in and how we’re going to see it, then we can make decisions based on that. We get less and less of previs, so it’s up to us to come up with the staging, how the creature will fight and where they’re going to be based on the plate. With The Falcon and the Winter Soldier, we received a previs, but it kept changing until the last minute.”

Not everything is live-action centric, as a cast of CG characters had to be created for a Disney+ animated series created by Pixar that can be described as The Bad News Bears meets Rashomon. “That’s a fair way to characterize it, I suppose,” laughs Lou Hamou-Lhadj, Character Supervisor/Director for Win or Lose. “Given the conceit of the show, that we have a different point of view per episode, you potentially have eight protagonists, and we want to make sure that each of those designs speaks to that particular character, their world view and experiences. For an ordinary production, you would probably build your secondary cast in a fairly bespoke way. What we tried to do, as we were filling out the rest of the world, was leveraging those characters who are carrying the emotional arcs as a pool that we can pull from as a starting place for everyone else. That hopefully means we are being smarter and more efficient in how we can shift, redress and pad out, especially our tertiary cast, as much as possible.”

Expressiveness was paramount for the character-driven show. “There were some new technologies that we developed to allow us to hit what on a feature budget would be an expansive, broad character range of expression. We don’t want to water down the performance at all.” Personality traits influence the rigging of characters. “If someone is a loud mouth and boastful, we want to evoke that in their design and make sure we have the real estate on that face to go broad and do big things. We do have a baseline of a shared shape and visual language so everything feels of the world. But we would certainly have other characters who are quieter and have different expectations. Maybe their mouth range is much smaller, but there is some other aspect of their physicality that we need to capture in the way we design and rig them so that their personality comes through.”

TOP AND MIDDLE: Changing half a centimeter between the eyes makes the digital double look like an entirely different person in The Falcon and the Winter Soldier. (Images courtesy of Rodeo FX and Marvel Studios)

BOTTOM: Because each episode is told from a different point of view, Win or Lose has eight protagonists designed to reflect their view of the world and experiences. (Image courtesy of Disney/Pixar)


STEPPING OUT FOR AN IMMERSIVE ADVENTURE WITH LBVR

Sometimes home VR isn’t enough. That’s when location-based virtual reality steps in. LBVR (or LBE VR, to specifically denote entertainment) is a subset of VR that offers “out-of-home” immersive experiences, ranging from compact-sized setups (arcade-style seats or cockpits, often with motion simulation) to room- or warehouse-scale, free-roaming experiences with haptic feedback, motion tracking and sensory effects. In the latter, family or friends can team up to battle aliens, immerse themselves in movie settings, navigate escape rooms or engage in other adventures. Getting out of the house and socializing are two of the principal appeals of LBVR attractions, which are typically located in arcades, theme parks, family entertainment centers or dedicated sites. The global location-based VR market, which also includes business and education uses, is growing at a CAGR of 34.21% and is projected to reach $19.9 billion by 2028, according to Verified Market Research. In terms of gear, the LBVR market has until now been dominated by the headset manufacturers HTC (Vive), Oculus (now Meta), Valve, HP and Samsung, according to Octopod, and Sony has recently debuted its PlayStation VR2 headset.
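
For readers who want to sanity-check projections like these: a compound annual growth rate (CAGR) simply compounds a base value year over year. The short sketch below is illustrative only; the report’s base year and starting market size are not stated in the article, so the assumed five-year window and the implied 2023 figure are back-of-the-envelope assumptions, not figures from Verified Market Research.

```python
def project_cagr(start_value: float, cagr: float, years: int) -> float:
    """Compound a starting value forward at a constant annual growth rate."""
    return start_value * (1.0 + cagr) ** years

# Working backward from the article's figures (assuming a 2023 base year,
# which the report may not use): a market reaching $19.9 billion in 2028
# at a 34.21% CAGR implies a starting size of roughly $4.6 billion.
implied_2023 = 19.9e9 / (1.0 + 0.3421) ** 5
print(f"Implied 2023 market size: ${implied_2023 / 1e9:.2f}B")
```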

The pandemic caused great losses for location-based entertainment centers, shutting the doors of many for good, and slowed LBVR’s momentum, but the platform is growing briskly again. LBVR’s technology is advancing, and many high-profile experiences have taken root or are arriving in 2023, such as Sony/Hologate’s Ghostbusters VR Academy and, later in the year, Netflix/Sandbox VR’s Squid Game title.


LBVR companies like Octopod VR believe in the platform’s powerful appeal. “Virtual reality allows us to create an ultra-immersive experience that can transport players. Whether it’s in a small individual area or in large, shared spaces, there’s something for every type of customer. The most obvious difference between LBE VR games and consumer [video] games is the consideration of the needs and constraints of the end user,” says Jean-Noël Charlier, Sales & Product Manager of Octopod VR, founded in 2017 under the umbrella of Wanadev, which also formed the VR game studio WanadevStudio, based in Lyon, France. “Customers who come to a [VR] center generally want to play in a group – friends, family, team building – over a planned period of time, on games where they can have quick fun.”

Octopod VR is a “distribution ecosystem” for multiplayer virtual reality games. It offers a large catalog of games and is designed to provide services and a complete solution for LBVR, including “the installation and configuration of VR spaces on the client’s premises by our teams. Today, we have more than 80 partner centers. Historically, we have had a strong presence in France and Western Europe, but more recently we have regularly established partnerships in North America, Asia and Africa,” Charlier says.

Many LBVR titles are tied to movies and series. Sandbox VR, along with working with Netflix on Squid Game, previously developed Star Trek: Discovery – Away Mission. Dreamscape Immersive worked on a Men in Black experience as well as the Harry Potter New York flagship store’s experiences Chaos at Hogwarts and

OPPOSITE TOP: The Blitz version of Ghostbusters VR Academy is a driving game with a vehicle motion simulator, while the Arena version is a free-roaming action-adventure that involves working as a team in ghost encounters. (Image courtesy of Sony Pictures VR)

OPPOSITE BOTTOM: The Ghostbuster VR Academy LBVR games were developed by Munich-based Hologate and published by Sony Pictures VR in 2023. (Image courtesy of Sony Pictures VR)

TOP: The Lost Lab from Divr Labs, which has three LBVR locations: London, Stockholm and Prague. (Image courtesy of Divr Labs)

BOTTOM: Vertigo Games’ Arizona Sunshine, a popular arcade game built exclusively for VR, pits the user and up to three fellow survivors against a zombie apocalypse. (Image courtesy of Vertigo Games)


Wizards Take Flight. The last two also had the participation of Warner Bros. Studios, WEVR, Counterpunch Animation and Beyond-FX, among others.

ILMxLAB collaborated with The Void on the pioneering LBVR free-roaming adventure Star Wars: Secrets of the Empire, experienced with realistic 4D effects and warehouse scale at The Void locations, running from 2017 to 2020.

Another LBVR landmark came in 2018 when Jurassic World VR Expedition debuted in over 100 Dave & Buster’s entertainment centers, making it the biggest location-based VR launch to that date, according to Universal. The ride combines seats on a motion simulation platform with HTC Vive VR headsets, allowing up to four players at a time to visit the jungles of Isla Nublar and engage in an epic rescue adventure together. Jurassic World VR Expedition was developed by Marina del Rey, California-based The Virtual Reality Company (VRC), with the collaboration of Steven Spielberg’s Amblin Entertainment and Universal Studios, and was directed by VRC Co-Founder Robert Stromberg, an Emmy-winning visual effects artist and two-time Oscar winner for art direction.

Framestore and Thinkwell Group worked on several LBVR experiences for the Lionsgate Entertainment World theme park in Zhuhai Hengqin, China. Karl Woolley, Framestore’s Global Head of Immersive, comments, “Over the past decade, our Immersive team has seen the complexity and scale of experiences that we’re asked to partner on continue to grow. VR and LBE projects [have been] the mainstay of the Immersive team for many years since its inception, often the two areas combining for projects such as Samsung’s A Moon for All Mankind [a lunar gravity simulation VR experience, 2018], or the work done for Lionsgate Entertainment World where multiple LBE VR [experiences] were delivered.”

TOP: The user explores a CG world in The Twilight Saga: Midnight Ride, with the VFX created by Framestore’s immersive team for Thinkwell Group. The LBVR is located at Lionsgate Entertainment World in China. (Image courtesy of Lionsgate Entertainment)

BOTTOM: In The Twilight Saga: Midnight Ride, guests sit on motorbikes that are mounted on CAVU Designwerks motion bases and use custom-designed VR headsets. (Image courtesy of Lionsgate Entertainment)

In Lionsgate’s The Twilight Saga: Midnight Ride, guests sit on motorbikes that are mounted on CAVU Designwerks motion bases and use custom-designed VR headsets. The motion control is integrated seamlessly into what’s seen inside the VR experience, so when a rider pulls the throttle on their bike, they both see and feel the sensation of accelerating as they speed through the forest alongside the character Jacob – with 90fps real-time virtual media created by Framestore, according to the firm.

Meanwhile, Divergent: Dauntless Fear Simulator at Lionsgate Entertainment World is an open-world, free-roaming VR experience that invites fans of the Divergent franchise to take on their fears and see if they can become a member of the Dauntless faction, immersed in a completely CG world created by Framestore’s immersive team for Thinkwell Group. The LBVR attraction gives participants the agency to progress through the world however they choose, so they can repeat the experience and get different outcomes each time.

Sandbox VR’s Squid Game virtual reality experience is set to open in late 2023. “In it, players are transported to iconic Squid Game locations, where they become contestants in a variety of pulse-pounding challenges inspired by the Netflix series. They compete against each other to be the last one standing. Teams of up to six friends can freely roam and explore virtual worlds together, while relying on each other to succeed,” according to Dacyl Armendariz, a spokesperson for Sandbox VR.

Sandbox VR, founded in 2016, has attracted celebrity investors such as Justin Timberlake, Katy Perry, Orlando Bloom, Michael Ovitz and Kevin Durant. Founded by CEO Steven Zhao, Sandbox VR has headquarters in San Francisco and Hong Kong, and over 35 venues across five countries. All are owned by Sandbox VR, except for one franchise location, located in Paramus, New Jersey,

Hologate, which has 450+ installations in 42 countries, offers both free-roaming LBVR adventures (in its Arena platform) and experiences with its motion simulation platform (Blitz). (Image courtesy of Hologate)

TOP TO BOTTOM: Men in Black: First Assignment was created by Dreamscape Immersive, whose VR experiences use VR headsets, full-body mapping, motion capture technology, haptics and physical effects. Investors have included Steven Spielberg, IMAX, Westfield Malls, AMC Theaters and Nickelodeon. (Image courtesy of Dreamscape Immersive and Universal Pictures) The LBVR experience Chaos at Hogwarts, located at the Harry Potter New York flagship store, had the participation of Warner Bros. Studios, WEVR, Counterpunch Animation and Beyond-FX, among others. (Image courtesy of Warner Bros.)

and some additional franchise-owned stores in Europe. Sandbox currently has six experiences: Star Trek: Discovery – Away Mission, Deadwood Valley, Deadwood Mansion, Curse of Davy Jones, Amber Sky 2088 and UFL: Unbound Fighting League. Sandbox recently started a partnership with Vicon for real-time, precise motion-tracking tools for collaborative VR gaming experiences.

Sony Pictures VR, in cooperation with Ghost Corps, has published two Ghostbusters VR Academy games developed by Hologate – for the multiplayer VR platform Hologate Arena and for the VR motion simulator platform Hologate Blitz. Launched at 450+ sites, the location-based VR experiences “continue to expand the world of Ghostbusters in a way that honors the legacy of the franchise and offers something entirely new,” says Jake Zim, Senior Vice President, Virtual Reality at Sony Pictures Entertainment (SPVR). “For the first time ever, players have been able to train to be a real Ghostbuster in an amazing academy setting [Arena version] and [also] race a new prototype ECTO hovercraft using Hologate’s vehicle motion simulator [Blitz version].”

In the Ghostbusters Arena game, players are academy trainees who strap on their proton packs and work together as a team in high-risk ghost encounter scenarios. As they are guided through each challenging level, the ghosts become more menacing than the last, demanding more collective skill, strategy and teamwork. Zim comments, “Players can create memorable moments while fulfilling longtime wishes to operate iconic Ghostbusters equipment. This fantasy-fulfilling experience is only possible in virtual reality and with Hologate’s location-based entertainment technology.”

Hologate, founded in 2013, has 450+ installations across 42 countries and more than 16 million players, and is “the largest VR network worldwide and the worldwide leader in compact location-based virtual reality entertainment,” according to CEO & Founder Leif Arne Petersen. “The lion’s share of this milestone figure is provided by our pioneering multiplayer turnkey VR platform Hologate Arena, which is not only our best-selling product, [but] also happens to be the most successful turnkey multiplayer VR system in the world,” Petersen adds.

TOP: Framestore’s immersive team, working for Thinkwell Group, created the visual effects for Divergent: Dauntless Fear Simulator, an experience based on the Divergent film franchise at Lionsgate Entertainment World. (Image courtesy of Lionsgate Entertainment) BOTTOM: Sandbox VR created the LBVR free-roam experience Star Trek: Discovery – Away Mission with celebrity investors. (Image courtesy of Sandbox VR and Paramount)

Also, a new Hologate X platform promises to stream high-resolution, wireless VR worlds directly into headsets, with full-body tracking and 4D effects such as scent and wind, plus THX 5.1 surround sound, physical props and full-body haptics, according to Petersen.

Octopod is also transitioning from backpack PCs to Wi-Fi streaming. Jean-Noël Charlier, Sales & Product Manager at Octopod VR, explains, “Our decision to move away from recommending a PC backpack was driven by supply issues and the limitations of the PC backpack – weight, battery changing, design issues. It was Wi-Fi 6 and the ability to stream from a PC to standalone headsets, such as the HTC Vive Focus 3, that led us to make this change.” He adds, “Wi-Fi streaming has many advantages, including the fact that it requires no cables, making it easier to move around the dedicated space. It’s a small revolution in the industry that will allow studios to be more creative with fewer technical constraints.”

“The success of LBE’s VR games has to do with the technology,” says Charlier. “The sector has been able to grow rapidly thanks to the constant improvements offered by headset, hardware and technology manufacturers, as well as independent studios specializing in this field. The technical and technological boundaries of the sector are rapidly being pushed back, allowing LBVR to offer players a more immersive experience. A few years ago, we were forced to use complex technology to enable free roaming, but now it is quite easy thanks to the tools, technologies and products offered by the manufacturers.”

Dreamscape Immersive uses VR headsets, full-body mapping and motion capture technology to place six to eight users at a time in full-roam virtual reality experiences with haptics and physical effects in a room-scale set. Current adventures include Alien Zoo, The Blu: Deep Rescue and Men in Black: First Assignment, appearing at Dreamscape locations in Los Angeles, Columbus, Paramus, Dubai, Riyadh and Geneva. Dreamscape seeks to engage its players with “emotional cinematic narrative in our content. Yes, there are elements of gameplay, but our audience’s reaction to our experience tends to be equally emotional and visceral,” says Dreamscape Immersive Chairman and Co-Founder Walter Parkes.

Vertigo Games is a multi-platform VR entertainment company with offices in Rotterdam, Amsterdam and Los Angeles. In 2021 it acquired SpringboardVR, the leading provider of VR venue management software and the largest content marketplace for LBE, according to the firm. Vertigo’s VR experiences are available in more than 700 sites through SpringboardVR, Vertigo’s HAZE VR distribution platform and other services. Its games include the perennial arcade hit Arizona Sunshine.

TOP: Created by the Virtual Reality Company (VRC), Jurassic World VR Expedition debuted in over 100 Dave & Buster’s entertainment centers in 2018, making it the biggest location-based VR launch at that time. The ride provides seats on a motion simulation platform, utilizes HTC Vive VR headsets and allows up to four players at a time to visit the jungles of Isla Nublar. (Image courtesy of VRC, Amblin Entertainment and Universal Studios)

BOTTOM: Zero Latency VR teamed with Ubisoft on the free-roaming LBVR title Far Cry VR: Dive into Insanity, which debuted at the former’s 4,200-square-meter arena in Melbourne. (Image courtesy of Zero Latency VR and Ubisoft)

Zero Latency VR opened the first free-roam VR entertainment venue in 2015 in Melbourne, Australia, according to the firm, and now has 45 venues in 22 countries. Its latest platform, developed in partnership with HTC, streams to up to eight players simultaneously using Wi-Fi 6E wireless technology and an HTC Vive Focus 3 headset, dispensing with the backpack computer. Zero Latency VR teamed with Ubisoft on the title Far Cry VR: Dive into Insanity, debuting it in 2021 at the firm’s 4,200-square-meter arena in Melbourne.

Other key LBVR firms include gaming company Survios (now creating an Aliens video game that will include a VR version), VRstudios (free-roaming, multiplayer LBVR systems and attractions), Tyffon (a developer and operator of location-based VR/AR/MR content) and Neurogaming (a developer of VR content and technology). There are LBVR attractions at theme parks, including roller coaster add-ons; at indoor zones such as Play DXB in Dubai Mall and the Nanchang VR Theme Park in China; and at the expansive VR Star Theme Park (formerly Oriental Science Fiction Valley) in Guiyang, Guizhou province, China. That virtual reality theme park offers 35 VR attractions on 330 acres and cost an estimated $1.5 billion to construct.

“There is no denying that the pandemic had a negative effect on the amount of LBE and/or VR work that was happening over the past 10 years, but we are seeing a strong resurgence in both spaces and are very positive about the future of the two combining at scale again, so much so that we are actively recruiting in this space,” concludes Karl Woolley, Framestore’s Global Head of Immersive.

VR/AR/MR TRENDS
TOP: ILMxLAB collaborated with The Void to create the pioneering LBVR free-roaming adventure Star Wars: Secrets of the Empire, placed in several Disney locations before closing in 2020. (Image courtesy of ILMxLAB and Lucasfilm Ltd.) BOTTOM: In Wizards Take Flight, the guest sits on a “broom” and flies above London and Hogwarts, feeling wind, rain and turbulence, with Dobby providing crucial mission information. (Image courtesy of Warner Bros.)

Washington Revisited: Celebrating Synergy in the Pacific Northwest

The VES’ global community gets stronger every year – and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together.

VES Washington sits at the center of a vibrant gaming and tech community, and the visual effects market is tightly tied to the wealth of visual computing and video game design companies, small production studios and indie boutiques in the region. Founded in 2016, the Washington State Section boasts a dynamic and diverse membership that reflects the unique makeup of the area’s visual effects community. As a game development hub, the Section’s 85+ members include game developers and engineers, as well as educators, recruiters and production artists.

“It’s true: we think and show up differently,” said Andy Romine, VES Washington Co-Chair and Senior VFX Instructor at the Academy of Interactive Entertainment in Seattle. “Because we are home to a lot of independent studios, our membership brings that ‘scrappy’ perspective to our VFX community. And one of the truly great things about working and living in Washington is that you might run into someone working at Apple or NVIDIA, doing comp work for commercials or VFX for a Triple-A game. The ability to highlight all of the cool state-of-the-art innovation in our backyard and make those connections is great – and a big selling point for new members.”

In the realm of recruiting new members, the Washington Section has benefited from strong word of mouth and fun evenings akin to pub nights with a ‘Washington flair’ hosted at gaming experience haunts like Mox Boarding House. Now, the Section is leaning into social media to engage prospective members.

“LinkedIn is proving to be a very successful vehicle for us in recruiting new members,” said Eric Greenlief, VES Washington Co-Chair and Visual Effects Artist at Bungie Studios in Bellevue. “Beyond recruitment, having strong communication with our current and prospective members where we trumpet all the amazing things we’re doing is a top priority. And we want to foster more real-time communications with and among our fellow members around the world to share what we’re learning and champion the building of our global network. We believe that compelling communication is key to growing our membership and deepening the involvement of our members.”

When it comes to programming, the VES Washington Co-Chairs are proud of the robust roster of virtual programs they produced during the pandemic. Virtual programs over the past few years included fun Netflix watch parties – with Grubhub certificates sent to members so everyone could eat together – and a series of moderated conversations and demos under the Creative Spotlight theme, including: a roundtable discussion on Shotgun’s project management software with panelists from Netflix, MGM, Bungie, Shotgun and Autodesk; an overview of Unreal Engine VFX; and a conversation with Independent Visual Effects Producer and Consultant and former VES Board Chair Mike Chambers, VES, as he discussed his journey from The Pickle to Tenet.

[ VES SECTION SPOTLIGHT: WASHINGTON ]

TOP: VES Washington members served as proud participants in the 2023 VES Awards global nominating events. BOTTOM: Washington Section members celebrate the holidays with a festive games night.

“We are excited to be back in the realm of hosting live events, and the decision to hold our VES Awards nominating event in person earlier this year was a testament to how hungry our group was to have that in-person interaction,” said Romine. “But it’s going to be a balancing act with some virtual events because we all understand how the online world helps in booking speakers and getting attendees to easily join in from across the state.”

Moving ahead, programming ideas in development include a panel on “Lighting in Games” with various studio representatives; a demo of the real-time fluid simulation software EmberGen; and a VES member demo reel night where members can show their work and share stories from the journey. There is also a lot of conversation within the Section about AI and how it will affect the games and film industries, which may lead to hosting an educational forum mindful of how AI might impact jobs and career opportunities.

“And with the move towards Virtual Production in the film industry… who knows more about Unreal Engine than game developers?” said Greenlief. “As we think about programs, we have a great opportunity for these two industries to mix. Making VFX in games is different than doing it in film, but the spaces are coming closer together, and there is a lot of learning to be done in the crossover.”

The Section also hosts a bevy of film screenings and Q&As, thanks to its longstanding partnership with the Seattle International Film Festival (SIFF), which holds the screening events in its theaters, including the SIFF Egyptian and SIFF Cinema Uptown.

“This is the perfect time to share gratitude for our former Section Chair Neil Lim Sang, who is our liaison with SIFF and works to ensure we have these great venues for our screenings,” said Romine. “Eric and I are so grateful to Neil who started the Section and has been our core. We feel like Neil was the entrepreneur who created the start-up, and now we are coming in to take things to the next level and realize the vision of what we can do.”

When asked about the value of their VES membership, the Co-Chairs offered these testimonials: Said Greenlief, “I love networking with people who are all working on different things because there is always more to learn. I can draw a dotted line from the VES to people I have hired for my studio. And we are so fortunate to have the platform from the VES to go out and talk with students about our tales from the journey and their futures. That school outreach is an incredible process that gives me such perspective on my pathway and what I can give back to those that follow. The ability to have an impact on aspiring young artists energizes me to do what I do.”

In closing, Romine remarks, “I’ve been a film and VFX nerd since I was five years old, and finding a community who loves this as much as I do has been such an amazing part of my life. Talking shop and learning how people get things done – the excitement is contagious. And the VES is also the reason I have a job at the Academy of Interactive Entertainment. I was invited to speak on a VES roundtable at AIE, they invited me back, and the rest is history. Let me end with this: VES Washington is hitting another growth period in membership, and we have the largest Board of Managers we’ve ever had. We all are bringing high energy this year and expect to deliver great things for our VES community!”

TOP TWO: The VES Washington Creative Spotlight series features a dynamic roster of guest speakers and presentations. BOTTOM: Washington Section members and guests gather for an exclusive LED Wall demo presented by immersive experience experts from Vossler Studios.

VES Welcomes the 15th and Newest Section in Oregon

This Spring, the VES proudly announced the establishment of its newest regional Section in the U.S. state of Oregon. The VES Board of Directors formally authorized the Section’s creation at its March Board meeting. The Oregon Section joins the dynamic worldwide community, which has Sections in Australia, San Francisco’s Bay Area, France, Georgia (U.S.), Germany, India, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington state. Now in its 26th year, the Society is thriving with upwards of 4,500 members in more than 45 countries worldwide.

“We are thrilled to welcome our newest VES Section in the state of Oregon,” said VES Board Chair Lisa Cooke. “The VES’ presence gets stronger every year – and so much of that is because of our regional visual effects communities around the world and all that they do to advance the Society and bring people together. Our determination to reach all corners of the globe and to all of the disciplines across the VFX spectrum has yielded us a very rich, talented membership, and that commitment to diversity and inclusion will continue to be a driving force of the organization.”

“The VES Oregon Section has a unique and timely opportunity to build a rich network of VFX artists and innovators and tap into the thriving local visual effects market,” said Eric Wachtman, CG Supervisor at LAIKA Studios in Portland and veteran VES member who was instrumental in forming the new Section. “Creating a VES Section in Oregon has long been a goal and dream – and we are excited this day is here so we can formally come together as part of the VES community and provide immense value to our colleagues. We look forward to becoming the VFX hub in the Portland Metro area and hopefully throughout the state.”

[ VES NEWS ]
Photo of Portland skyline courtesy of Kenji Sugahara.

VES Education Committee Collaborates on Careers in VFX Speaker Series

Earlier this year, the VES Education Committee – in partnership with the California Department of Education’s Arts, Media and Entertainment Initiative and the BRIC Foundation, a nonprofit dedicated to increasing representation in entertainment, gaming, media and tech – launched the dynamic Careers in Visual Effects Speaker Series. With the goal of creating and nurturing pathways for future generations of artists and innovators, the 12-part streaming webcast series, which continues with live webcasts through October 2023, brings together an impressive array of internationally acclaimed VES artists and professionals to speak about their work and career trajectories in the rapidly evolving field.

Directly offered to high school students, each interactive session includes a Q&A forum with the speakers, and teachers are provided with learning materials to supplement their curriculum. Students are introduced to topics ranging from asset painting and compositing to marketing and an introduction to game VFX. The series also serves as a venue to introduce students to career paths through the BRIC registered apprenticeship for animation, VFX and games, providing a tangible opportunity to receive training and hands-on experience.

Curated by Rose Duignan, the diverse roster of VES members and friends who are guest speaking in this year’s program includes:

· Fortunato Frattasio, VFX Artist/Supervisor/Educator
· Billy Shih, Technical Artist, Bonfire Studios
· Mark Andrews, Director, ReelFX
· Jean Bolte, Senior Painter/Digital Artist, Industrial Light & Magic
· Hal T. Hickel, Animation Director, Industrial Light & Magic
· Erin Ramos, Head of Effects Animation, WDAS
· Rebecca Forth, Look Development Lighting Technical Director, Industrial Light & Magic
· Yulien Tso, Experienced Film/TV Compositor & Trainer
· Kaitlyn Yang, VFX Supervisor and Founder, Alpha Studios
· Neishaw Ali, President & Executive Producer, Spin VFX
· Rachel Rose, R&D Supervisor, Industrial Light & Magic
· Rita Cahill, International PR/Marketing/Business Development Consultant

Replays for the prior episodes of the Careers in Visual Effects Series and signups for the upcoming sessions are available at: https://www.vesglobal.org/careers-in-vfx-speakerseries/. These public videos are available to share with anyone around the globe who is exploring careers in the industry. Additional educational information sessions can be found at https://www.vesglobal.org/ves-education-initiative-resources-page/ under Presentations and Material.

TOP THREE: Rebecca Forth discusses “Lighting for VFX” for the Careers in VFX Speaker Series.

BOTTOM THREE: Billy Shih discusses “Introduction to Game VFX” for the Careers in VFX Speaker Series.


Emmy VFX Evolution

The first Emmy Awards were presented 74 years ago, in 1949. The name Emmy was inspired by “Immy,” a nickname for the image orthicon, a camera tube used in early television. Cognizant of scientific know-how, the Academy gave out its very first tech award that year to Charles Mesak of Don Lee Television for introducing the Phasefader TV camera technology. In 1955, the TV Academy handed out an award for Best Engineering Effects for Robert Shelby’s Four Quadrant Screen.

Beginning with the 1998 awards ceremony, the category has been divided into Special Visual Effects for a Series (The Book of Boba Fett in 2022) and Special Visual Effects for a Miniseries, Movie or Special (Squid Game in 2022).

Along the way, the Academy had various names for these types of technical umbrella awards, including Outstanding Engineering Effects, Outstanding Special Electronic Effects, Outstanding Special Mechanical Effects and Outstanding Special Photographic Effects.

Among the more notable winners in the effects categories over the years have been Voyage to the Bottom of the Sea, The Time Tunnel, Cosmos, Battlestar Galactica, Dinosaur!, Star Trek: Voyager, Babylon 5, The X-Files and Lost. Rick and Morty collected two Emmys for Outstanding Animated Program. The most-awarded show in Emmy history is Saturday Night Live, with 87 Primetime Emmy Awards.

[ FINAL FRAME ]
Image of Carl Sagan for his Emmy-winning Cosmos TV series courtesy of Turner Entertainment.