VFX Voice Spring 2023

Page 1

RRR: GLOBAL FILM VFX

21ST ANNUAL VES AWARDS & WINNERS

NEXT-GEN ARTISTS

AI ADVANCES

THE LAST OF US

PROFILES: JACQUELYN FORD MORIE & PAUL DEBEVEC

VFXVOICE.COM SPRING 2023

Welcome to the Spring 2023 issue of VFX Voice!

Coming out of a much-anticipated awards season, this issue of VFX Voice shines a light on the 21st Annual VES Awards gala. Congratulations again to all of our outstanding nominees, winners and honorees.

This issue delves into TV/streaming and the juggernaut HBO series The Last of Us and goes inside the big-screen VFX of Shazam! Fury of the Gods. Blockbuster Indian Telugu-language action epic RRR – named for the Tollywood star power behind the film – is central to our cover story on the surge of global film production captivating audiences worldwide. We explore trends in AI’s creative advancement and showcase booming activity in virtual production – LED stages, creative storytelling solutions and an industry roundtable on what comes next in this dynamic space. We shine a light on building the pipeline of the next generation of VFX talent. And VFX Voice sits down with VR pioneer Dr. Jacquelyn Ford Morie and award-winning production visionary Paul Debevec for up close and personal profiles.

We serve up insightful guidance from our award-winning VES Handbook of Visual Effects and put our New York Section in the spotlight. And, of course, please enjoy our VES Awards photo gallery, featuring the winners in 25 categories of visual effects excellence, show highlights, our VES Lifetime Achievement Award honoree, acclaimed producer-writer Gale Anne Hurd, and VES Board of Directors Award honoree, former VES Executive Director Eric Roth.

Dive in and meet the visionaries and risk-takers who advance the field of visual effects.

Cheers!

P.S. Please continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

2 • VFXVOICE.COM SPRING 2023 [ EXECUTIVE NOTE ]

FEATURES

8 VFX TRENDS: INDUSTRY GROWTH

VFX work continues to expand across the globe.

14 VFX TRENDS: THE AI EXPLOSION

AI’s creative advances provoke ethical and legal concerns.

20 COVER: GLOBAL FILMS SURGE

Film productions and VFX are thriving around the world.

26 TV/STREAMING: THE LAST OF US

Detailing post-apocalyptic America for the HBO series.

32 PROFILE: JACQUELYN FORD MORIE

Virtual reality pioneer traverses virtual worlds and avatars.

38 FILM: SHAZAM! FURY OF THE GODS

Making Philadelphia the epicenter of a VFX eruption.

44 PROFILE: PAUL DEBEVEC

A visionary on the front lines of innovation and application.

50 THE 21ST ANNUAL VES AWARDS

Celebrating excellence in visual effects.

60 VES AWARD WINNERS

Photo Gallery

66 VIRTUAL PRODUCTION: VP ROUNDTABLE

Industry experts weigh in on VP’s current surge and future.

72 VIRTUAL PRODUCTION: LED STAGES

Activity is booming at companies offering LED stages.

78 VIRTUAL PRODUCTION: CREATIVE IMPACT

VP provides new options and real solutions to storytellers.

82 EDUCATION: NEXT-GEN ARTISTS

The search expands for new talent to meet demand.

DEPARTMENTS

2 EXECUTIVE NOTE

88 ON THE RISE

90 THE VES HANDBOOK

92 VES SECTION SPOTLIGHT – NEW YORK

94 VES NEWS

96 FINAL FRAME – VIRTUAL PRODUCTION

ON THE COVER: RRR (Rise Roar Revolt), the epic action drama from India, is becoming an international sensation. (Image courtesy of RRR)

4 • VFXVOICE.COM SPRING 2023 [ CONTENTS ]

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER

Jim McCullaugh publisher@vfxvoice.com

EDITOR

Ed Ochs editor@vfxvoice.com

CREATIVE

Alpanian Design Group alan@alpanian.com

ADVERTISING

Arlene Hansen Arlene-VFX@outlook.com

SUPERVISOR

Nancy Ward

CONTRIBUTING WRITERS

Naomi Goldman

Ian Failes

Trevor Hogg

Jim McCullaugh

Chris McGowan

ADVISORY COMMITTEE

David Bloom

Andrew Bly

Rob Bredow

Mike Chambers, VES

Lisa Cooke

Neil Corbould, VES

Irena Cronin

Paul Debevec, VES

Debbie Denise

Karen Dufilho

Paul Franklin

David Johnson, VES

Jim Morris, VES

Dennis Muren, ASC, VES

Sam Nicholson, ASC

Lori H. Schwartz

Eric Roth

VISUAL EFFECTS SOCIETY

Nancy Ward, Executive Director

VES BOARD OF DIRECTORS

OFFICERS

Lisa Cooke, Chair

Susan O’Neal, 1st Vice Chair

David Tanaka, VES, 2nd Vice Chair

Rita Cahill, Secretary

Jeffrey A. Okun, VES, Treasurer

DIRECTORS

Neishaw Ali, Laurie Blavin, Kathryn Brillhart

Colin Campbell, Nicolas Casanova

Mike Chambers, VES, Kim Davidson

Michael Fink, VES, Gavin Graham

Dennis Hoffman, Brooke Lyndon-Stanford

Arnon Manor, Andres Martinez

Karen Murphy, Maggie Oh, Jim Rygiel

Suhit Saha, Lisa Sepp-Wilson

Richard Winn Taylor II, VES

David Valentin, Bill Villarreal

Joe Weidenbach, Rebecca West

Philipp Wolf, Susan Zwerman, VES

ALTERNATES

Andrew Bly, Johnny Han, Adam Howard

Tim McLaughlin, Robin Prybil

Daniel Rosen, Dane Smith

Visual Effects Society

5805 Sepulveda Blvd., Suite 620

Sherman Oaks, CA 91411

Phone: (818) 981-7861

vesglobal.org

VES STAFF

Jim Sullivan, Director of Operations

Ben Schneider, Director of Membership Services

Jeff Casper, Manager of Media & Graphics

Colleen Kelly, Office Manager

Brynn Hinnant, Administrative Assistant

Shannon Cassidy, Global Coordinator

P.J. Schumacher, Controller

Naomi Goldman, Public Relations

Tom Atkin, Founder

Allen Battino, VES Logo Design

SPRING 2023 • VOL. 7, NO. 2

VFX Voice is published quarterly by the Visual Effects Society.

Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com

Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com

Comments: Write us at comments@vfxvoice.com

Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.

Copyright © 2023 The Visual Effects Society. Printed in the U.S.A. Follow us on social media.

VFX INDUSTRY – GROWTH, GLOBALIZATION AND CHANGE

TOP: Guardians of the Galaxy, Vol. 3. VFX: RISE Visual Effects Studios, Framestore, Rodeo FX, Crafty Apes, Wētā FX, Gentle Giant Studios, ILM, Weta Digital and Clear Angle Studios. SFX: Marvel Studios and Legacy Effects. (Image courtesy of Marvel Studios)

OPPOSITE TOP TO BOTTOM: 65. VFX: Framestore, Method Studios, New Holland Creative, Quantum Creation FX, Captured Dimensions and 22DOGS. Visualization: OPSIS. (Image courtesy of Columbia Pictures/Sony)

Spider-Man: Across the Spider-Verse. (Image courtesy of Marvel and Columbia Pictures/Sony)

Mission: Impossible – Dead Reckoning – Part One. VFX: ILM, Rodeo FX, BlueBolt and Clear Angle Studios. Visualization: Halon Entertainment. (Image courtesy of Paramount Pictures)

Mission: Impossible – Dead Reckoning – Part One. (Image courtesy of Paramount Pictures)

Over the last few years, the VFX industry has surged due to an infusion of visual effects in almost all films and series, the expansion of the streamers, a boom in animation, and the growth of video games and immersive formats. This is happening while LED stages and virtual production have been altering filmmaking and post-production processes, bringing them closer together.

At the same time, VFX work has continued to expand across the planet. “The globalization of the VFX business has been happening for a while, but the opportunities for remote working that have been accelerated by the pandemic have been a great enabler of this trend with artists and teams able to collaborate ever more easily and effectively across geographies,” says Namit Malhotra, Chairman and CEO of DNEG.

Streaming platforms Disney+, Apple TV+, HBO Max, Peacock and Paramount+ launched between 2019 and 2021, joining Netflix and Amazon Prime, collectively boosting production. “The strong growth in demand for content driven largely by the streaming companies has opened new avenues in content creation for VFX and animation companies,” Malhotra says. However, he feels the accelerated recent growth of the industry is now becoming more balanced. “The number of productions globally is now being rationalized to the reality of what can be produced. Demand was outstripping supply to such an extent that it was actually becoming unsustainable. What we are seeing now is more of a sensible and sustainable approach to content creation, and it is finding equilibrium – which is a good thing. There is still growth, but it is a lot more structured and sustainable.”

Because of the streamers, “there is a lot more episodic content than there was five years ago,” comments Pixomondo CEO Jonny Slow. “This is not a new factor, and the growth of it may slow down a little in the short term, but I don’t see this trend going away. This is a whole new sub-genre of content. In publishing terms, it’s like the invention of the novel, and it has created millions more viewing hours per week.”

The streamers have generated plenty of filmmaking work in domestic production centers across Europe and Asia, from Oslo to Seoul, which in turn has generated more VFX work around the world. India has become an especially important pole of visual effects with many foreign-owned and locally-owned VFX houses working at full throttle there.

Vantage Market Research forecasts that “the increased demand for advanced quality content among consumers across the globe and the introductions of new technologies related to VFX market by industry players are expected to augment the growth of the VFX market,” and predicts that VFX global market revenue will climb from $26.3 billion in 2021 to $48.9 billion in 2028, growing at a compound annual growth rate (CAGR) of 10.9% during the forecast period.

“VFX is now an integral component of cinematic narrative in film, episodic, commercials and themed entertainment. Due in part to the convergence of gaming workflows, GPU-accelerated computing functions and cloud computing, VFX is increasingly accessible to all levels of complexity and budgets in storytelling,” says Shish Aikat, Global Head of Training at DNEG.

ACQUISITIONS AND EXPANSIONS

The dynamic activity of the VFX business in the last year includes acquisitions and foundings. One of the biggest deals in 2022 was Sony’s purchase of Pixomondo, which has facilities in Toronto, Vancouver, Montréal, London, Frankfurt, Stuttgart and Los Angeles. Recent projects include: Avatar: The Last Airbender (Netflix) and the next seasons of House of the Dragon, The Boys, Halo, Star Trek: Discovery, Star Trek: Strange New Worlds and many others. About the sale, Slow comments, “It allows us to benefit from being fully aligned with the whole Sony group, both creatively and from a technology development perspective.”

Also last year, Crafty Apes acquired Molecule VFX. The Fuse Group (owner of FuseFX) bought El Ranchito, which has studios in Madrid and Barcelona. Outpost VFX and Framestore opened Mumbai facilities and BOT VFX a Pune branch. After purchasing Scanline VFX at the end of 2021, Netflix acquired Animal Logic in 2022 and signed a multiyear deal with DNEG through 2025 for $350 million. DNEG will open an office in Sydney this year to go with its existing facilities in London, Toronto, Vancouver, Los Angeles, Montréal, Chennai, Mohali, Bangalore and Mumbai.

INDIA

DNEG’s four Indian studios played a role in how India has become a significant source of global VFX production. MPC and The Mill (owned by Technicolor Creative Services), FOLKS VFX (The Fuse Group), Framestore, BOT VFX (based in Atlanta and with three studios in India), Rotomaker India Pvt Ltd, Mackevision, Outpost VFX and Tau Films are other multinationals with facilities in India.

TOP TO BOTTOM: Ant-Man and the Wasp: Quantumania. VFX: ILM, ILM/StageCraft, Digital Domain, Spin VFX, Method Studios, MPC, Luma Pictures, Barnstorm VFX, Sony Pictures Imageworks, MARZ, Territory Studio, Rising Sun Pictures, Perception, Folks VFX and Clear Angle Studios. Visualization: The Third Floor. (Image courtesy of Marvel Studios)

Ant-Man and the Wasp: Quantumania. (Image courtesy of Marvel Studios)

Beau Is Afraid. VFX: Hybride and Folks VFX. (Image courtesy of A24)

Beau Is Afraid. (Image courtesy of A24)

One of India’s leading local visual effects firms is FutureWorks, which has 325 total employees in facilities in Mumbai, Hyderabad and Chennai. CEO Gaurav Gupta comments, “Of these, our Chennai studio is geared as a global delivery center to service our international clients. Our Mumbai studio, which was our first one, is focused on Indian filmmakers, and also works closely with platforms like Amazon Prime Video and Netflix for their Indian productions. Early next year will see us relocate to a larger studio in Mumbai and expand our Hyderabad operations with a bigger facility in Q2.” He notes that FutureWorks’ recent portfolio “spans global hits including: The Peripheral for Prime Video, Westworld for HBO, Netflix’s Lost in Space and [the Hindi-language movies] Jaadugar, directed by Sameer Saxena, and Darlings, directed by Jasmeet K. Reen.”

FutureWorks currently has “around a 50% split between our domestic and international customers, and our business strategy is to continue along those lines as we grow,” according to Gupta. “Global demand for VFX services has fueled the rapid increase in VFX studios in India. Indian studios are now full of creative sequences and shots, not just RPM or back-office work.”

Gupta notes, “Global demand and supply have increased concurrently, and there is plenty of room for everyone. What we see is a truly global marketplace, [with] more choices for clients in terms of where and who can execute the top-end work. However, [having] more studios also means that the industry needs more talent, and that talent has a wider range of options than ever before.”

Gupta adds, “This is the most excited I’ve been about the industry in India since I founded the company. There is a huge demand for content from OTT networks and filmmakers. Relationships are global, technology is global, vendors are global. The scene here currently is creative, ambitious and evolving at a rapid pace.”

ACROSS THE PLANET

Ghost VFX was set to open a studio in Pune in 2023 and has facilities in Los Angeles, Vancouver, Toronto, London, Manchester and Copenhagen, with nearly 600 total global employees (Streamland Media purchased Ghost VFX in 2020). “Having studios across the globe means we’re able to work together across a single technology workflow so we can react to the ebbs and flows of demand,” says Patrick Davenport, President of Ghost VFX. “We’re also in several key locations for tax incentives. Although we offer our employees the option of working from home, hybrid or in-studio, having global studios helps us retain talent who want to work in different countries as well as [work] in-studio.”

Davenport adds, “We have larger projects which we share across the studios, but still focus on being able to support local productions. For example, our Copenhagen studio just worked on Troll, a Norwegian film for Netflix. On a global scale, we’ve worked on several projects including Star Trek: Strange New Worlds, which artists in Copenhagen and Vancouver worked on, and for Fast X we currently have teams in our U.K., Copenhagen, Pune and L.A. studios working on the film.” Another is the new season of The Mandalorian, “one of several projects we’re working on for Lucasfilm.”

Glassworks has studios in London, Amsterdam and Barcelona. “Speaking as a studio with multiple locations across Europe, I can definitely see [having them as an] advantage. The benefits come in different aspects, including a shared pool of resources, access to specialized talent across offices, and opportunities within each market or in tandem across facilities,” says Glassworks COO Chris Kiser. “Scalability is always important in our business, and it’s great to be able to work with artists and producers that you know and trust before needing to go outside of your own studio.”

For Glassworks, “The year started with a couple of big commercial projects, including the Turkish Airlines Super Bowl spot featuring Morgan Freeman and Apple’s Escape the Office film,” Kiser says. “Young adult and fantasy fans will have seen VFX from our team in both Vampire Academy and Fate: The Winx Saga, [and] we have other projects in the works for Netflix, HBO and Amazon Prime.”

VIRTUAL PRODUCTION IMPACT

Virtual production has greatly transformed the VFX business and inspired the construction of hundreds of LED stages, both fixed-location and bespoke/pop-up. Last year ended with the completion of two notable facilities in Culver City. Amazon’s stage, located on Stage 15 of the Culver Studios lot, has an 80-foot diameter with a 26-foot-high volume, a virtual-production takeover of what had been the production scene of many famous movies in the analog era. Nearby, a new LED stage rose at Sony Innovation Studios on the Sony lot, in the same year that the firm purchased Pixomondo and its three LED stages.

Virtual production was valued at $1.46 billion in 2020, projected to reach $4.73 billion by 2028 and expected to grow at a CAGR of 15.9% during the forecast period from 2021 to 2028, according to a market report by Statista.
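As a back-of-the-envelope check on figures like these, the compound annual growth rate implied by two endpoint values can be computed directly. The short Python sketch below is an illustration only (not part of the Statista report); it recovers roughly the quoted rate from the 2020 and 2028 market sizes.

```python
# Implied compound annual growth rate (CAGR) from two endpoint values:
# CAGR = (end / start) ** (1 / years) - 1

def cagr(start: float, end: float, years: int) -> float:
    """Return the constant yearly growth rate linking start to end."""
    return (end / start) ** (1 / years) - 1

# Virtual production market: $1.46 billion (2020) -> $4.73 billion (2028).
rate = cagr(1.46, 4.73, 2028 - 2020)
print(f"Implied CAGR: {rate:.1%}")  # ~15.8%, in line with the quoted 15.9%
```

The small gap between the implied 15.8% and the quoted 15.9% is consistent with the dollar endpoints being rounded figures.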

“Things are normalizing a little now, but along the way virtual production became a more widely adopted production solution, and for a few players, including Pixomondo, there is no turning back from this as we have created some very effective tools. That said, we see it as very complementary to our VFX services business, not a replacement – and in fact, we are able to integrate the processes to deliver additional value and speed of delivery,” Slow says.

INCENTIVES MAKE A DIFFERENCE

Tax breaks are still having an impact on the geography of VFX. “Incentives do create additional production spend overall, as they directly impact how far a production budget will go,” Slow says.

“However, not all of the benefit stays in VFX ultimately – our clients have to balance production books overall somehow. But a more generous scheme in one region will influence where our clients want the work to happen, so a small change in the rules can create a very big shift in demand for work in a particular region. This has been very effectively used as a tool to drive investment and jobs into Canada, the U.K., Germany and many other places. It’s a big success story, and I think we will see this continue to evolve.”

TOP TO BOTTOM: Transformers: Rise of the Beasts. VFX: MPC, ILM and Wētā FX. Visualization: Halon Entertainment and The Third Floor. (Image courtesy of Paramount Pictures)

John Wick: Chapter 4. VFX: Rodeo FX, Crafty Apes, Mavericks VFX, One of Us, The Yard VFX, Tryptyc VFX, Light VFX, Pixomondo, Outlanders VFX, Boxel Studio, Atomic Arts, McCartney Studios, Track VFX, WeFX and Clear Angle Studios. Visualization: NVIZ and Halon Entertainment. (Image courtesy of Lionsgate)

Indiana Jones and the Dial of Destiny. VFX: ILM, Clear Angle Studios, Important Looking Pirates, Rising Sun Pictures, Crafty Apes, The Yard VFX, Soho VFX, Midas VFX and Capital T. Visualization: Proof Inc. (Image courtesy of Paramount Pictures)

Renfield. VFX: Crafty Apes, ILM, Outpost VFX, Connect VFX, Pixel Magic, Weta Workshop and Spectrum Effects. Visualization: Proof Inc. (Image courtesy of Universal Pictures)

Shazam! Fury of the Gods. VFX: DNEG, Pixomondo, Clear Angle Studios, Wētā FX, Scanline VFX, Method Studios, Weta Digital, BOT VFX and RISE Visual Effects Studios. Visualization/Effects: OPSIS and The Third Floor. (Image courtesy of Warner Bros.)

Servant. Series VFX: PowerHouse VFX, Cadence Effects, Ingenuity Studios and Vitality Visual Effects. (Image courtesy of Apple TV+)

Slumberland. VFX: DNEG, Scanline VFX, Outpost VFX, Rodeo FX, Important Looking Pirates, Ghost VFX, BOT VFX, MARZ and Incessant Rain Studios. Visualization: Halon Entertainment. (Image courtesy of Netflix)

Malhotra notes, “Frankly, these incentives make the use of visual effects more competitive for our clients, allowing them to create higher-quality content. This is important all round, as it creates more sustainable employment, as well as great quality of work for our clients, while helping them mitigate the costs of content creation.”

ARTIST CHALLENGES AND OPPORTUNITIES

Davenport comments, “Demand for VFX looks likely to continue, though there is still the challenge of delivering the work within budgets and compressed schedules at a time of rising costs, particularly of labor.”

Malhotra observes that the talent gap is another challenge. “We need more talent in our industry with more experience – not just to creatively deliver projects, but also to manage and produce them,” he comments. “Training is a key focus – the fact that everyone is working from home compromises the culture of learning from your team and those around you, where you gain more experience by asking questions and looking at each other’s work. This has an effect on the time it takes to bring new recruits in our industry up to speed, which poses some interesting challenges for us as an industry — it’s a universal issue.”

Kiser adds, “The biggest challenge to our industry is bringing in the next generation and providing them with training and opportunities to succeed. Many of us were able to get a break somewhere or discover the potential for a career in VFX thanks to technical training or personal connections. We need to take advantage of the momentum and interest in film and TV to reach a wider, more diverse group of young people.”

“The convergence of visual effects and real-time gaming technologies, and the emergence of opportunities in the metaverse, virtual reality, immersive experiences and web 3.0, all significantly contribute to the possibilities for visual effects artists to leverage their skill sets beyond movies and television. It is a very interesting time for our industry and the people that work within it,” Malhotra says.

“We certainly hope the trend for VFX and animation will continue as it has been thrilling and made for some amazing content. The industry and budgets are likely to fluctuate in the same way they have over the years, although the demand has never been higher,” Kiser says. “The key now, as it has always been, is retaining talent and fostering creativity within our teams. We can’t control what happens in the outside world, but we have the ability to build a productive and enjoyable environment where the best creative work happens.”

“[In the] short term,” Slow concludes, “activity levels are calming down a little, but this is a good thing for the industry and the people working in it. Long-term, I don’t think there is any change to the overall trend of continued, healthy, year-on-year growth.”

TOP TO BOTTOM: Dungeons & Dragons: Honor Among Thieves. VFX: ILM, MPC and Clear Angle Studios. SFX: Legacy Effects. Visualization: Day For Nite. (Image courtesy of Paramount Pictures)

PROMPTING AN EXAMINATION OF AI ART

Much has been made lately of the proliferation of artificial intelligence within the realm of art and filmmaking, whether it be AI entrepreneur Aaron Kemmer using OpenAI’s chatbot ChatGPT to generate a script, create a shot list and direct a film within a weekend, or Jason Allen combining Midjourney with AI Gigapixel to produce “Théâtre D’opéra Spatial,” which won the digital category at the Colorado State Fair. Protests have shown up on the art platform ArtStation in the form of images with a red circle and line through the letters ‘AI,’ declaring ‘No to AI Generated Images,’ while U.K. publisher 3dtotal posted a statement on its website declaring, “3dtotal has four fundamental goals. One of them is to support and help the artistic community, so we cannot support AI art tools as we feel they hurt this community.”

“There are some ethical considerations mainly about who owns data,” notes Jacquelyn Ford Morie, Founder and Chief Scientist at All These Worlds LLC. “If you put it out on the web, is it up for grabs for scrubbers to come and grab those images for machine learning? Machine learning doesn’t work unless you have millions of images or examples. But we are out at an inflection point with the Internet where there are millions of things out there and we have never put walls around it. We have created this beast and only now are we getting pushback about, ‘I put it out to share but didn’t expect anyone would just grab it.’”

‘In the style of’ text prompts are making artists feel uneasy about their work being replicated through an algorithm, as in the case of Polish digital artist Greg Rutkowski, whose name is one of the most commonly used prompts for the open-source AI art generator Stable Diffusion. “I just feel like at this point it’s unstoppable, and the biggest issue with AI is the fact that artists don’t have control of whether or not their artwork gets used to train the AI diffusion model,” remarks Alex Nice, who was a concept illustrator on Black Adam and Obi-Wan Kenobi. “In order for the AI to create its imagery, it has to leverage other artists’ collective ‘energy’ [and] years of training and dedication to a craft. Without those real artists, AI models wouldn’t have anything to produce. I believe AI art will never get the artistic appreciation that real human-made art gets. This is the fundamental difference people need to understand. Artists create things, and hacks looking for a shortcut only know how to ‘generate content.’”

TOP: One of the leading artists incorporating AI into his creative process is Refik Anadol, who created a sculpture inspired by high-frequency radar data collections called Bosphorus. (Image courtesy of Refik Anadol)

OPPOSITE TOP LEFT: The iconic photo of Buzz Aldrin on the moon, taken by Neil Armstrong on the 1969 Apollo 11 mission, reimagined in the style of Van Gogh’s Starry Night. This is one of the earliest artworks ever made with NightCafé Creator, and still one of the best. (Image courtesy of NightCafé Studio)

OPPOSITE TOP RIGHT: Trevor Hogg experimenting with DreamStudio by Stability AI to create a futuristic environment by using prompts such as steampunk city block, nighttime, dance club, wet snow falling, in the style of Syd Mead and Ralph McQuarrie. (Image courtesy of Trevor Hogg)

OPPOSITE BOTTOM THREE: Machine learning is seen as an essential element by Autodesk for developing future tools for artists and has been incorporated into Flame. (Image courtesy of Autodesk)

Rather than rely on the Internet, AI artists like Sougwen Chung are training robotic assistants on their own artwork and drawing alongside them, which follows in the footsteps of another collaboration that lasted 40 years and produced the first generation of computer-generated art. “There was some interesting stuff going on there that nobody knows about in the history of AI and art,” observes Morie. “Harold Cohen and the program AARON, which was an automatic program that could draw with a big plotter that learned as it drew and made these huge, beautiful drawings that were complex and figurative, not abstract at all.”

TOP: The following prompts were entered into AI Image Generator API by DeepAI: An angel with black wings in a white dress holding her arms out, a digital rendering, by Todd Lockwood, figurative art, annabeth chase, at an angle, canva, global radiant light, dark angel, 3D appearance. (Image courtesy of DeepAI)

BOTTOM: The following prompts were entered into AI Image Generator API by DeepAI: A girl in a white dress and a blue helmet, a detailed painting, by wlop, fantasy art, paint tool sai!! blue, [mystic, nier, detailed face of an asian girl, blueish moonlight, chloe price, female with long black hair, artificial intelligence princess, anime, soft lighting, old internet art. (Image courtesy of DeepAI)

OPPOSITE TOP TO BOTTOM: This particular Refik Anadol image was commissioned by the city of Fort Worth, Texas. (Image courtesy of Refik Anadol)

An interactive art exhibit created by Refik Anadol that utilizes AI. (Image courtesy of Refik Anadol)

Refik Anadol created projections to accompany the Los Angeles Philharmonic Orchestra performing Schumann’s Das Paradies und die Peri. (Image courtesy of Refik Anadol)

AI is seen as an essential element for developing future tools for artists. “Flame’s new AI-based tools, in conjunction with other Flame tools, help artists achieve faster results within compositing and color-grading workflows,” remarks Steve McNeill, Director of Engineering at Autodesk. “Looking forward, we see the potential for a wider application of AI-driven tools to enable better quality and more workflow efficiencies.”

Does the same reasoning apply to the creation of text-to-image programs such as AI Image Generator API by DeepAI? “When we saw the research coming out of the deep learning community around 2016-2017, we knew practically every part of our lives would be changed sooner or later,” notes Kevin Baragona, Founder of DeepAI. “This was because we saw simultaneous AI progress in very disparate fields such as text processing, gameplaying, computer vision and AI art. The same basic technology [neural networks] was solving a whole bunch of terribly difficult problems all at once. We knew it was a true revolution! At the time, the best AI was locked away in research labs and the general public didn’t have access to it. We wanted to develop the technology to bring magic into our daily lives, and to do it ethically. We brought generative AI to the public in 2017. We knew that AI progress would be rapid, but we were shocked at how rapid it turned out to be, especially starting around 2020.” Baragona sees AI as having positive rather than negative impact on the artistic community. “We’ve seen that text-to-image generators are practically production-ready today for concept art. Every day, I’m excited by the quality of art that takes a couple seconds to produce. Visual effects will continue to get more creative, more detailed and much cheaper to produce. Basically, this means we’ll have vastly more visual effects and art, and the true artists will be able to create superhuman art with the aid of these computer tools. It’s a revolution on par with the rise of CGI in the 1990s.”

Undoubtedly, there are legal issues as to who owns the output and whether the original sources should be given credit. “As the core building blocks of new AI generative models continue to mature, a new set of questions will arise, like it has happened with many other transformative technologies in the past,” notes Cristóbal Valenzuela, Co-Founder and CEO at Runway. “Ownership of content created in Runway is owned by users and their teams. Models can also be retrained and customized for specific use cases. We are also building together a community of artists and creators that inform how we make product decisions to better serve those community needs and questions.” The AI revolution is not to be feared. “There are always questions that emerge with the rise of a new technology that challenges the status quo,” Valenzuela observes. “The benefits of unlocking creativity by using natural language as the main engine are vast. We will see so many new people able to express themselves through various multimodal systems, and we’ll see previously complicated arenas like 3D, video, audio and image be more accessible mediums. Tools are only as good as the artist who wields them, and having more artists is ultimately an incredible benefit to the industry.”

Should AI art be the final public result? “I think that the keyword prompts used should also be displayed along with it, and any prompt that directly references a notable piece of existing art or artist should require a licensing deal with that referenced artist,” remarks Joe Sill, Founder and Director at Impossible Objects. “For instance, if an AI artwork is displayed that has utilized the prompt of ‘a mountaintop with a dozen tiny red houses in the style of Gregory Crewdson,’ you’re likely going to need to have that artist’s involvement and sign-off in order for that art to be displayed as a final result. If AI art is simply used in a pipeline as a starting point to inspire an artist with references, I think it’d be great for the programs themselves to start being listed in credits. Apps like Midjourney or DALL·E being credited as the key programs that help artists develop early inspiration only helps with transparency and also accessibility. If an artist releases a piece of work that was influenced by AI art, they can credit the programs used like, ‘Software used: DALL·E, Adobe Photoshop, Sketchpad.’”

AI has given rise to a new technological skill in the form of “the person who can write a compelling prompt for a program like DALL·E 2 to extract a compelling image,” states David Bloom, Founder and Consultant at Words & Deeds Media. “To some extent it’s a different version of what artists have always faced. If you are a musician, you had to learn how to play an instrument to be able to reproduce the things that you were hearing in your head. I remember George Lucas saying in the 1990s when they put out a redone version of Star Wars, ‘I’m never going to show the original version again because the technology now allows me to create a film that matches what I saw in my head.’ It’s just like that. The technology is going to allow new kinds of people to see something that they have in their head and get it out without necessarily having the same or any technical skills, if they can articulate it.”

Part of the fear of AI comes from misunderstanding. “It’s not as powerful as people think it is,” notes Jim Tierney, Co-Founder and Chief Executive Anarchist at Digital Anarchy. “It’s certainly useful, but in the context of how we use AI, which is for speech-to-text, if you have well-recorded audio and someone who speaks well, it’s fantastic but falls off the cliff quickly as the audio degrades. There is a lot of fear around AI. We were supposed to have replicants by now! Blade Runner was set in 2019. Come on. Where are my flying cars?” As for the matter of licensing rights, Tierney references what creatively drives artists in the first place. “If you say, ‘Brisbane, California, at night like Vincent van Gogh would have done,’ that’s going to create something probably Starry Night-ish. But how is that different from me painting it using that visual reference and an art book? It’s complicated. I spent a bunch of time messing around with Midjourney. If you go in there looking for something specific and say, ‘I want X. Create this for me,’ you will have to go through many iterations. People can make some cool stuff with it, but it seems rather random.”

What is affecting the quality of AI art is not the technology. “People have this idea that you type in three simple words and get some sort of masterpiece,” observes Cassandra Hood, Social Media Manager at NightCafé Studio. “That’s not the case. A lot of work goes into it. If you are planning on receiving a finished image that you are picturing in your mind, you’re going to have to put the work into finding the right words. There is a lot of experimenting that goes with it. It’s a lot harder than it seems. That applies to many people who think it’s not as advanced or not too good right now. It can be good if you put the work in and actually practice your prompts. We personally don’t create the algorithms, but give you a platform and an easy-to-use interface for beginners who move onto the bigger, more complicated code notebooks once they graduate from NightCafé. We are focusing on the community aspect of things and making sure to provide that safe environment for AI artists to hang out and talk about their art. We have done that on the site by adding comments and contests.”

“My opinion on this AI revolution has changed,” acknowledges Philippe Gaulier, Art Director at Framestore. “It was probably a year or a year and a half ago that AI really exploded in the concept art world. At the beginning I thought, ‘Oh, my god, our profession is dead.’ But then I realized it’s not, because I saw the limits of AI. There is one factor that we shouldn’t forget, which is the human contact. Clients will never stop wanting to deal with human beings to produce some work for whatever film or project that they have. However, there will be fewer of us to produce the same amount of work, in the same way that tools in any industry evolve to become more efficient. The tools haven’t replaced people, because people are still needed to supervise and run them, because machines don’t communicate like we do. But there has been a reduction in the number of people for any given task. I have been in this industry long enough to understand that things evolve all of the time. I have already started playing around with AI for references. I’m not asking myself whether it’s good or bad. I’ve accepted the idea that it’s going to be part of the creative process, because human beings in general like shortcuts.”

In the middle of the AI revolution is Stable Diffusion, which was designed by Stability AI in collaboration with Ludwig Maximilian University of Munich and Runway. The release of the free, open-source neural network for generating photorealistic and artistic images from text prompts was such a resounding success that Stability AI was able to raise $101 million in funding. “A research team was looking at ways to communicate with computers and did different experiments looking at text-to-image,” remarks Bill Cusick, Creative Director at Stability AI. “Their goal was to get to a place where, instead of being limited by your physical abilities, you could be able to translate your ideas into images. It has evolved in a way where now we can see what is possible, and there is a bifurcation of what the approaches are. Stable Diffusion and DreamStudio is a tool. DreamStudio gives you settings and parameters. I had a meeting with these guys doing a workflow using Stable Diffusion, and it’s as complex as a Hollywood workflow would be, and the results are incredible. There are people doing Blender and Unreal Engine plug-ins, and I reach out to them to help accelerate their development. I want concept artists to treat it as a tool because it’s going to be more powerful in their hands than anybody else’s.”

TOP AND BOTTOM: Example of completely AI generated character concept art by Joe Sill of Impossible Objects, created through Midjourney’s new V4 update which “makes people look like people.” (Images courtesy of Impossible Objects)

“If you put it out on the web, is it up for grabs for scrubbers to come and grab those images for machine learning? Machine learning doesn’t work unless you have millions of images or examples. But we are at an inflection point with the Internet where there are millions of things out there and we have never put walls around it. We have created this beast and only now are we getting pushback about, ‘I put it out to share but didn’t expect anyone would just grab it.’”
—Jacquelyn Ford Morie, Founder and Chief Scientist, All These Worlds LLC

Cusick adds, “AI is never going to whole cloth recreate someone’s picture. By the time this article comes out, there is going to be text-to-video, and the question of did my art get stuck into a dataset of billions of images is meaningless when the output is animation that is unique and moving in a totally different way than a single image. I agree with all of the problems with single images and the value of labor. But it’s momentary. We are moving towards a procedurally generated future where there is a whole other method of filmmaking coming.”

TOP AND BOTTOM: Stability AI Creative Director Bill Cusick experimenting with the possibilities of AI art. (Images courtesy of Stability AI)
“I want concept artists to treat [AI] as a tool because it’s going to be more powerful in their hands than anybody else’s.”
—Bill Cusick, Creative Director, Stability AI

FEATURE FILM VISUAL EFFECTS ON A GLOBAL SCALE

Visual effects have proliferated far beyond Hollywood productions, with talented filmmakers and digital artists in China, South Korea, Mexico, Germany, Australia and India collaborating to produce a wide range of stories. “One of the major changes that I’ve seen over the past four to five years is the model changing from outsourcing to insourcing,” states Sudhir Reddy, Senior Vice President & Head of Studios, Canada & India at Digital Domain. “Almost every major post house has set up in India physically or virtually. India is playing a big part in work sharing. There is so much content being created that there is work for everybody. The domestic studios are thriving doing local and outsourcing work.”

TOP: The whirlpool that appears above the beach grave of Seo-rae (Tang Wei) is part of a visual motif in Decision to Leave designed to depict the growing power she has over Hae-Joon (Park Hae-il). (Image courtesy of MUBI)

OPPOSITE TOP: Previs was critical in figuring out the obstacles needed to make a believable chase between the tiger and Komaram Bheem (N.T. Rama Rao Jr.) in RRR. (Image courtesy of RRR)

OPPOSITE MIDDLE AND BOTTOM: A clever transition in Decision to Leave showing the beginning of Detective Hae Joon’s (Park Hae-il) obsession with Seo-rae (Tang Wei) occurs when the shot goes from his arm to the x-ray of her dead husband she is suspected of killing. (Images courtesy of MUBI)

Visual effects for films and television are common nowadays in China. “A lot of small productions and even online series have access to visual effects in China,” remarks VFX Supervisor Samson Sing Wun Wong. “The use of AI and game engines, and virtual LED screen stage shooting, are allowing visual effects companies to finish huge tasks in less time with more contained teams. The size of a company will no longer determine the quality of work; it eventually will change the way companies are structured and formed. There is always demand for visual effects, but it’s really about how a company and the artists are able to readapt themselves with new technologies and skillsets within the new platform. That’s the future.”

“The central question as a visualist is what method will be the best way to express that core emotion of a character or to drive the
narrative forward,” notes acclaimed South Korean director Park Chan-wook (Oldboy, The Handmaiden), who digitally simulated fog, extended a mountaintop set, had CG ants crawling over the face of a corpse and inserted crime scene photographs onto a wall when making Decision to Leave. “Visual effects is one of the important tools. But you have to be cautious and only use visual effects when it’s absolutely necessary. Visual effects have great advantages, as directors can imagine executing things that were not possible in the past, and it cuts down on the production costs, too.” In Decision to Leave, Detective Hae-joon has a fatal obsession with murder suspect Seo-rae. “The one scene that I would want the audience to never miss out on is the finale where we see the whirlpool that is made on top of the tomb of Seo-rae,” remarks VFX Supervisor Lee Joen-hyoung, who has collaborated with Park ever since Oldboy. “That whirlpool or vortex is something that I wanted to put in different places throughout the film because it represents Hae-joon’s emotional entrapment to Seo-rae. When he makes coffee, the steam that comes out produces a little vortex movement, and when Hae-joon is scattering the ashes of the dead mother of Seo-rae, they swirl around him before going away, as if it’s Seo-rae’s presence.”

Renowned Chinese director Zhang Yimou (Hero, House of Flying Daggers) achieves his ambitious visions with the help of Visual Effects Supervisor Samson Sing Wun Wong. “My first show with Zhang Yimou was Shadow in 2018, which is stylized like a traditional Chinese ink painting and has a lot of visual effects,” he explains. The partnership has gone on to produce Cliff Walkers, Snipers and the upcoming Under the Light. “In the last five years, the movies that Zhang Yimou directed are all invisible, seamless visual effects, requiring a lot of environment extensions, effects elements and paint-outs. My role requires a close collaboration with the action director, art director and cinematographer to find the best methodology to achieve the best results. We shoot lots of references and avoid right-in-your-eyes visual effects, but help the storytelling in a subconscious way. Visual effects are no longer about fanciness; they are about precision.” A creative challenge was the car chase in Cliff Walkers. “In one or two cases where certain driving actions were slow, we did a full background replacement, but for the majority we managed this by increasing the speed of the snow,” explains Adam Hopper, VFX Supervisor at House of VFX. “For each shot requiring a little more danger, stunt vehicles were used to perform the action. We used these quite successfully as animation reference, but since the stunt vehicles were slightly larger and heavier, we had some alterations to consider in getting the weight distribution correct.”

TOP: There is no shortage of creature effects in RRR, especially with the climactic animal havoc sequence. (Image courtesy of RRR)

MIDDLE AND BOTTOM: A major atmospheric that had to be created by House of VFX for Cliff Walkers was snow. (Image courtesy of House of VFX)

OPPOSITE TOP TO BOTTOM: The visual aesthetic that Zhang Yimou desired for Shadow was that of a Chinese ink painting that visual effects vendors such as Digital Domain had to recreate. (Image courtesy of Digital Domain)

One of the most complex visual effects shots to execute in Bardo was Silverio (Daniel Giménez Cacho) interacting with his mirror reflection in the desert. (Image courtesy of Netflix)

Practical and digital effects were seamlessly integrated for All Quiet on the Western Front. (Image courtesy of Netflix)

Effects simulations play a big part in the shaping of the Djinn’s (Idris Elba) character in Three Thousand Years of Longing. (Image courtesy of Kennedy Miller Productions)

Getting award circuit nominations for Best International Feature Film and Best Visual Effects is the German remake of All Quiet on the Western Front. “Visual effects did little things in the background where we had planes chasing each other to illustrate that even when they are in the barracks resting, there is still a war going on in the skies,” explains Production Visual Effects Supervisor Frank Petzold, who reunited with German director Edward Berger after working together on the television series The Terror. “We had to do research as to what planes existed at that time and the same goes for guns.” A tank travels over a trench. “That was one of my favorites,” Petzold notes. “I come from the film days of multiplane downshooters and optical printers. To make it look absolutely photoreal, I wanted to use as much photographic stuff as I could get and stay away from CG models or particle simulations for explosions. The tanks going over the trenches is traditional A and B plates. We shot the foreground, and literally on the day we rushed to our special set where there was a narrower trench dug out like a car pit, we rested the camera. Then we did a quick video lineup and had the tank drive over the camera. In CG, we had to fix a lot of stuff on the closeup shots, because when you look under the tank, the wheels and chains were slightly different.”

Whether it be creating the impression that the entirety of Birdman or (The Unexpected Virtue of Ignorance) was captured in one continuous take or depicting a graphic grizzly bear mauling in The Revenant, Mexican director Alejandro González Iñárritu astutely uses visual effects, and in the case of Bardo, False Chronicle of a Handful of Truths, he utilizes them to illustrate the mental state of Mexican journalist and documentarian Silverio Gama. “The movie navigates between memories, reality and surrealism,” notes VFX Supervisor Guillaume Rocheron. “You can’t be too stylized because you never want the audience to understand where you enter a visual effects sequence. It’s constantly mixed. When Silverio Gama meets with his dad, they
go into the bathroom and start to talk. Over one cut you enter the surrealism where Silverio, who is a 50-year-old man, now has a kid’s body and adult head; he feels like a kid in front of his dad but is still his own self. You have to be mindful of driving things as photographically as you can.” Joining the project during post-production was VFX Supervisor Olaf Wendt. “The sequences involving the digital babies were some of the most challenging work, especially doing a digital human in these long shots that come so close to the camera. There are interesting things, like the opening shot of Silverio’s shadow racing over the desert, just because Alejandro wanted this shadow to communicate the personality of his protagonist. It’s something that I’ve never seen before.”

Becoming an international sensation is the epic action drama RRR [Rise Roar Revolt] by Indian director S.S. Rajamouli, which is a cinematic spectacle that blends CG animals, major action set pieces and musical numbers within the era of colonial British rule of India. A signature moment is Gond Tribal leader Komaram Bheem (N.T. Rama Rao Jr.) literally battling a tiger in the jungle. “The tight closeup was a benchmark for us when doing the actual tiger asset,” states VFX Supervisor Srinivas Mohan. “I needed to have a log mark on the right side of the face that was previously on the left side. We had to build an extra blend mark there and add more detail to it. We did extensive previs for it, mainly to determine the speed of the tiger, as it can run 50 km/h and he can only do 10 km/h. We added some obstacles because the tiger was reaching him too quickly.” The reflection in the eye of revolutionary leader
Alluri Sitarama Raju (Ram Charan Teja) as he stands in front of the police station was captured in-camera. “I genuinely thought we were going to be here for a while,” admits Pete Draper, CEO and Co-Founder of Makuta VFX. “But it’s Charan’s eye, a live background, one shot. There are no layer comps and cinematographer K.K. Senthil Kumar got it literally in a single take. Here’s the fun thing: There is no camera paint-out at all. But what there is is digital crowd extension. The guys standing on the hill and the falling watchtower man were digitally added in as well as a few extra flags.” Miniatures were utilized for the bridge explosion and rescue. “The festival area where Ram Charan comes running out chasing the guy, a small piece of road on top of the bridge, two pillars and a full-size train car [for a few shots] were actual set pieces,” explains Daniel French, Producer, VFX Supervisor and Co-Owner of Surpreeze. “But other than that, there was a fairly large miniature that was built with fully functional train cars at 1/8 scale. Because the eight train cars had to be set on fire and blown up, we had to build them in metal to get the proper weight. When you have fire elements you need to construct them as large as possible, but still keep it at a practical scale so you can lift the train cars up, do resets and work with it in a practical manner.”

Australian director George Miller has exacting standards, and for Three Thousand Years of Longing a collaboration was forged with VFX Producer Jason Bath and VFX Supervisor Paul Butterworth to bring to life the time-traveling genie Djinn and his unique relationship with British narratologist Alithea. “Effects simulations play a big part in the shaping of the Djinn’s character, with many different aspects requiring their own look development pathway,” Bath explains. “The Djinn Bottle Vortex that shows his body being ripped apart is the most complex, but he’s also defined by the seductive vapor of his kiss, the energy from his fingertips as he ‘reads’ books and the aura he creates when he ‘tunes’ the modern world’s noise into music. The Djinn is subtly scaled up throughout the film with both 2D and in-camera trickery. The scene where the Djinn first appears to Alithea, oversized and squashed into the hotel room, was achieved with classic scaled photography techniques using a 1/5th miniature hotel room set and motion control to film Alithea separately on the full-sized set.” One of the vendors contributing to the 596 visual effects shots was Fin Design + Effects. “The hero Djinn’s legs were originally meant to be filmed on set, but in an effort to match George’s vision for their unique look they became full CGI in every frame across over 100 shots,” explains Roy Malhi, VFX Supervisor at Fin Design + Effects. “We created a feathers and scales system in Houdini specifically for this challenge, enabling us to have intricate control and flexibility to achieve the 4K closeup realistic feathers we were required to render.”

The cosmic energy powers in Brahmāstra: Part One – Shiva were effects driven. (Image courtesy of DNEG and ReDefine)

The concept art for the various cosmic powers in Brahmāstra: Part One – Shiva had to be refined further once the effects layers started to be added. (Image courtesy of DNEG and ReDefine)

One year of look development was spent on the Giant Snake created by Pixomondo for The Magic Flute. (Image courtesy of Pixomondo)

Brahmāstra: Part One – Shiva is part of a proposed trilogy and cinematic universe known as Astraverse, created by Indian director Ayan Mukerji, that had visual effects support provided by DNEG and ReDefine totaling 4,200 shots. “We shot the climax first, which was the most complex part of the film, and that gave everyone a deep learning,” remarks Jaykar Arudra, VFX Supervisor at DNEG. “Interactive lighting was the biggest thing that was required because fire has to move to various places and different energies are casting so much light. Designing the entire shot and ensuring that so much of the interactive light was happening correctly was the largest part of the on-set stuff. We programmed these large LED screens.” The cosmic energy powers were effects driven. “Even fire didn’t look like real fire,” remarks Viral Thakkar, VFX Supervisor at ReDefine. “It had a lot of details, like galaxy particles. We call it love fire so the color is different, and the embers are connected to the galaxy, so they had to be floaty. We could use nothing from practical fire.” Arudra adds, “Even when you do concept art of a frame, it’s not the entire story. We would take one scene and say, ‘Let’s develop the look of this power based on this concept.’ Once we start putting effects layers into that, it develops more and more and gets refined. The love fire had gone through iterations of development of how exactly we need to add the color and cosmic particles into the fire.”

An ambitious production was a modern interpretation of Wolfgang Mozart’s classic opera The Magic Flute by German director Florian Sigl. “The musical aspect of the production and how it transports into the visual effects were quite challenging in a lot of aspects, but as creative partners, working closely with the client, we were able to provide bespoke solutions throughout every step of the project,” states Max Riess, VFX Supervisor at Pixomondo. “The dress simulation of the Queen of the Night took the longest time to develop. We extended the real dress of the singer with five to 10 giant CG cloth ribbons moving in sync with the voice track. We’d never done anything like this before, and it took some time to get the setup right.” One year of look development was spent on the Giant Snake, which went through several iterations. “We had versions with dragon-like horns and venomous fangs but, in the end, for colors we decided to create an animal native to its environment, with sand colors and realistic animal features. The jaw and teeth anatomy are inspired by several other big creatures, like a shark and crocodile. For a giant snake, it would not make sense to have two venomous fangs. Another challenge was the movement, to find the right pattern, speed and behavior so that the viewer would see the snake as not out to catch and eat the prince, but to taunt and scare him.”

TOP TO BOTTOM: The Djinn (Idris Elba) is subtly scaled up throughout Three Thousand Years of Longing with both 2D and in-camera trickery. (Image courtesy of Kennedy Miller Productions)

The Korean War drama The Battle at Lake Changjin cost $200 million to make and grossed $913 million worldwide, making it the highest-grossing Chinese film of all time. A trio of directors was responsible: Chen Kaige, Tsui Hark and Dante Lam, while the extensive digital augmentation was handled by VFX Supervisor Dennis Yeung. “We created two movies in 10 months, and the films also contained a large number of CG shots,” states Yeung. “In addition to the huge workload, most of the shooting sites were in the suburbs, and we shot many night scenes. There were many details that had to be worked out before shooting scenes, such as the model of the car, and the clothes of the soldiers had to be historically accurate.”

Among the visual effects vendors recruited was DNEG. “There was one plate element for the opening scene that was shot on a crane with the actor, and we had to insert that into the flying camera and make him work in the overall CG world,” remarks Lee Sullivan, VFX Supervisor at DNEG. “You see a column of 100 soldiers in uniform marching out of a landing ship, and they built just the big doors as a front. A couple of real tanks and jeeps are driving as well as 20 tents, which is a big build, but at the same time, we had to extend that crane move, transition into a drone shot and extend everything around the beach. Then you cut down to more surface-level shots where we are extending out the tent village, vehicles and planes flying overhead; meanwhile, the burning Inchon is always in the background.” Being able to balance stylization and realism was not an easy task. “I remember there was this tank fight, and you’re following the bullet in a bullet-time type of shot that was full CG,” recalls Philipp Wolf, the film’s Visual Effects Executive Producer and Executive-in-Charge, Corporate Strategy at DNEG. “That was interesting to get it right – to create a stylized shot that ultimately needs to feel real, which is hard when you’re doing something a camera can’t do. But it was neat in the end.”

TOP TWO: Despite the extensive set builds, vehicles and 100 extras, the aerial beach shots in The Battle at Lake Changjin had to be extended by DNEG to get the necessary scope. (Image courtesy of DNEG) BOTTOM TWO: One of the sequences that DNEG was responsible for in The Battle at Lake Changjin is the bombing of a supply train. (Image courtesy of DNEG)
“One of the major changes that I’ve seen over the past four to five years is the model changing from outsourcing to insourcing. Almost every major post house has set up in India physically or virtually. India is playing a big part in work sharing. There is so much content being created that there is work for everybody. The domestic studios are thriving doing local and outsourcing work.”
—Sudhir Reddy, Senior Vice President & Head of Studios, Canada & India, Digital Domain

GOING BEYOND GAMEPLAY FOR THE LAST OF US

As part of a computer class assignment at Carnegie Mellon University in Pittsburgh, students had to pitch ideas for a zombie video game to the filmmaker credited with establishing the zombie subgenre, George Romero. One of the participants was Neil Druckmann, who despite getting turned down would go on to become the Co-President of the acclaimed video game company Naughty Dog. “Whatever concept he picked, we were going to spend a semester developing a prototype of a video game,” recalls Druckmann. “Back then, it was much simpler. A cop and the daughter of a senator had to travel across the country. I tried to make it into a comic book. It evolved again when pitched at Naughty Dog, and we made it into a video game that became The Last of Us. The core emotional trappings remained consistent.” The fully evolved premise of the 2013 video game revolves around an actual fungal parasite called Cordyceps contaminating the food supply and turning its human hosts into volatile monstrous beings. Twenty years later, immune teenager Ellie Williams is escorted through post-apocalyptic America by smuggler Joel Miller to a medical facility with hopes of producing a cure.

Within the first 13 months of being released, Sony sold seven million copies of The Last of Us, and talk commenced about a feature film adaptation helmed by Sam Raimi, but creative disagreements over the proper tone and the studio’s request for more action sequences derailed the project. But everything changed when, coming off the success of the HBO miniseries Chernobyl, Craig Mazin got involved and forged a partnership with Druckmann that would see the entire original game adapted as a nine-episode HBO series, with the two of them serving as co-creators, Executive Producers, directors and writers.

Images courtesy of HBO. TOP: The prosthetic design work of Barrie Gower spread to set dressing with the infected embedded into the environment. OPPOSITE TOP TO BOTTOM: Principal photography for The Last of Us took place in Calgary and throughout Alberta.

Neil Druckmann was as integral to creating the video game as to producing the live-action adaptation for HBO.

“I’m obviously a big science dork and love that Neil based this story on science,” notes Mazin. “It’s terrifying but beautiful what the fungus can do. Also, dramatically, I am interested in seeing regular people dealing with extraordinary circumstances and not in the typical B-movie way. Watching people twist, turn and squirm, you find out who they are.”

“[W]hen you’re watching a show or telling a story, violence has a different purpose because you’re watching it, not committing it. We tried hard to make acts of violence relevant to characters and their relationships because that’s the matrix through which the audience processes that this is a good story.”
—Neil Druckmann, Co-President, Naughty Dog

To do a straight retelling of The Last of Us would have been misguided, as watching television and gameplay are not the same experience. “The Last of Us Part I remake, which we just released, can only render as fast as a PS5 allows, and that game runs at 60 frames per second, which means that we have 1/60 of a second to render the Bloater as detailed as we can,” Druckmann notes. “The amount of detail is proportional to how long we have to render it and the computational power. With the show, Wētā FX can have its render farms rendering a frame for a week if they wanted to, so our imagination is the only limit.”
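Druckmann’s 1/60-of-a-second figure is simple arithmetic, and the gap he describes can be made concrete with a quick sketch. The offline render time below is a hypothetical illustration, not a figure quoted by Naughty Dog or Wētā FX:

```python
# Illustrative only: contrast the per-frame render budget of a 60 fps game
# with an offline film pipeline, where a single frame may render for hours.
# The 12-hour farm render is an assumed figure for the sake of the arithmetic.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

game_budget = frame_budget_ms(60)       # ~16.7 ms per frame in real time
offline_hours = 12                      # hypothetical farm render per frame
offline_ms = offline_hours * 3600 * 1000

print(f"Real-time budget: {game_budget:.1f} ms/frame")
print(f"Offline render:   {offline_ms:,} ms/frame "
      f"(~{round(offline_ms / game_budget):,}x more compute time)")
```

Even under this modest assumption, the offline frame gets millions of times the compute of its real-time counterpart, which is the point both creators make about detail being proportional to render time.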

Then there is the matter of violence, blood and gore. “I love playing The Last of Us Part I and The Last of Us Part II,” Mazin remarks. “Part of the enjoyment is being given these obstacles of NPCs [non-playable characters]. How do I get around them or through them? Am I going to sneak or go in with guns blazing? But when you’re watching a show or telling a story, violence has a different purpose because you’re watching it, not committing it. We tried hard to make acts of violence relevant to characters and their relationships because that’s the matrix through which the audience processes that this is a good story.” Druckmann adds, “It’s not like we’re sanitizing the violence in this world, because our world is violent. We’re reducing the quantity of it so when it’s there it’s much more impactful than if you’re going to watch the same body count from the game.” Iconic moments from the game were recreated for the series. “There were certain things that I felt like as a fan I would want to see,” states Mazin. “It’s not even an Easter egg. We’re putting it right there front and center, and it’s not forced. How would they get across from here to there? Why not do the thing with the ladder? It made total sense, looked great and made me feel something when I played.”

Principal photography took place in Calgary and throughout Alberta. “A big point of discussion was that after 20 years nature would take over and everything would flood,” explains Production Designer John Paino. “We always tried to make the streets look broken up and as ridged as possible. The other big part of the design was making sure that we were building everything in nature, which looked like we were living in the States. Going from the East to West Coast, especially with Bill’s Town, which is supposed to be one of the hamlets dotted throughout Massachusetts, we didn’t have a lot of luck finding locations until we got to the part of the story where we are crossing through the middle of America, and the look of Calgary and the feeling of tundra in the Midwest was great. We looked at a lot of reference of natural disasters to help to inform it.” The Quarantine Zone in Boston and Bill’s Town were practically constructed. “The QZ was the first backlot that we built because that kind of federal urban brick architecture doesn’t exist in Calgary. We built that. All of those streets were built as well as the interiors to match them. As we got to Kansas City, where they are walking outside more, we were able to find streets that look like that in parts of Calgary and also in Edmonton. Almost all our interiors were built on stages.”

Brought onto the project in December 2020 was Prosthetics Designer Barrie Gower, who previously worked on Chernobyl and House of the Dragon. “There is quite a menu of prosthetics in the show,” Gower observes. “It’s predominantly these infected fungal Cordyceps characters, and then you have the Clicker. There were about six Clickers and a child Clicker as well in Episode 105. The big guy who climbs out of the hole had a lot of digital work going on there, but we built a fully practical suit for British stunt performer Adam Basil to wear. For the infected, we established several different stages of a human being bitten or infected, going from two-dimensional makeup and artwork on a face to conjunctivitis around the eyes, to slightly raised veins. It’s like the fungus burrowing into the skin, and things start to blossom out of the skin like little stalks. We had five different stages going from human to just prior to becoming the Clicker, with big mushrooms on the head. The R&D for that and establishing the look of those probably took about six months.”

Gower’s team also contributed to the set decorations. Comments Gower, “In Episode 102, where they are walking through a warehouse by torchlight and see a body fused into the wall, that was the first set piece that we built. We sculpted it in modeling clay, made molds and created sections. When we made the whole piece, it was split like a jigsaw puzzle, disassembled and shipped to Canada from the U.K. We had already pre-painted a lot of it, and we shipped one of our key artists, Joel Hall, who had painted it here, to Canada as well. He worked with the art department in Calgary installing it into the set against a turquoise ceramic wall with a TV and bits of covered units.”

The weather was not cooperative, causing a logistical nightmare. “Snow was the biggest thorn in our side,” admits Special Effects Supervisor Joel Whist. “The episode we shot in Waterton is supposed to be a town where the cannibals live, and it was to be completely covered in 15 feet of snow with big drifts forming off buildings. They chose Waterton because normally the town shuts down for the winter and becomes a big snowbank. But when we went, there were some remnants of drifts up against the houses and buildings. We could only use snow from within the park and brought in 350 dump truck loads of snow in four days. Me and 10 guys and a bunch of equipment hand dressed, hand blew, hand raked and hand shoveled for four days straight to get the town to look like it had snow. It would melt as we were doing it, but we got it done. When we came in the next day to shoot, there was an inch or two of fresh snow over everything that we had dressed, so it looked perfect.”

OPPOSITE TOP: Part of the special effects work was to flood sets with water.

TOP: Critical in making the Cordyceps infection believable was to create a sense that the fungus was coming out of the host.

BOTTOM TWO: A view of the walls of the quarantine zone and the infected open city portion of Boston.

Fire was a dominant element, such as the house being set ablaze. “The house crash was a one-shot deal,” Whist reveals. “We did a test to see how the truck reacted. This truck had a giant snowplow on the front designed to come in and knock abandoned cars out of the way. The truck practically hit these lightened cars loaded up with dust, and sparks would fly. Then that truck had to crash into the house. There was a lot of rehearsal and testing to say, ‘Does this work? How does it look when hitting the house? We can tweak it and cross our fingers that on the day the stunt guy really hits it hard and it works.’ It was constant testing. We did a complete fire set inside of a stage, with the burning of a steakhouse all controlled by us. Tons of spot fires on the street. We did lots of burning cars. We had a cop car on fire, driven by a stunt guy in a fire suit, hooked up with propane, that would come in and smash an upside-down pickup. A lot of old school effects.”

Around 3,000 visual effects shots were produced by DNEG, Wētā FX, Distillery VFX, Zero VFX, Important Looking Pirates, beloFX, Storm Studios, Wylie Co., RVX, Assembly, Crafty Apes, UPP, RISE, Framestore, Digital Domain and MAS. Amongst the digital augmentation was how the parasite spreads. “That was probably one of the most abstract, challenging and fun creations for visual effects,” states Visual Effects Supervisor Alex Wang. “We call them the tendrils. In the game, we have the spores – that’s how the ‘infected’ transfer their infection to other human beings. The tendrils come out of their mouth. Craig wanted that to feel real, in the sense that when the Cordyceps are done with the insect, they burst out of its brain; that was our foundation. Then there was the whole other research part of what characteristics they carry. The minute the animation started to feel like they had a personality, it automatically felt like aliens on the Syfy Channel. We didn’t want to go there. These Cordyceps have one motive, which is to keep infecting to survive.”

Two teams were sent twice to Boston to LiDAR scan, photograph and capture drone footage to ensure the devastated open city appeared authentic. “As far as the vines, it’s trickier than you think, especially when you’re trying to deal with buildings at scale and you’re not used to seeing vegetation growing on skyscrapers,” observes Wang. “When they’re overlooking the hotel terrace and see a swarm of infected, that is a crucial defining moment of the show where we understand that the infected are connected to each other in some ways. We call it a dog pile. You can see them, like synchronized swimmers, start to move in unison. That was challenging because it’s abstract in nature. We were talking about everything from their movement to where we are high up looking at it from scale. All the aspects of it made it a challenging shot. Wētā FX was in charge of that work and used its Massive software for crowd simulations and did a lot of performance capture to feed the Massive library.”

TOP TWO: The iconic video game moment of Ellie crossing the ladder was recreated for the HBO series. BOTTOM TWO: The tight confines of the quarantine zone situated in Boston.

DNEG was the lead vendor, handling the majority of the VFX work across most episodes. Remarks Stephen James, VFX Supervisor at DNEG, “I traveled to Boston with our DNEG shoot team to do the data and reference capture needed to recreate a lot of the iconic [Boston] locations seen in Episodes 1 and 2. One sequence in particular that was brought to life directly from reference was the ladder-crossing sequence in Episode 2. This is such a memorable scene from the game that we wanted to do it justice. Even though the characters are six stories up and looking out at a bombed city, the scene provides a sense of relief, beauty and hope to the characters and the audience with them.”

Most challenging for James and his team was capturing the deteriorating post-apocalyptic environments. Comments James, “The main creative and technical challenges came in our environment work which often involved building or augmenting real-life locations and adding the heavy destruction and overgrowth that the series is so well known for. Much like in the games, the environments had to set the tone and tell their own story. What happened to this place 20 years ago? How has Mother Nature impacted it since then?”

“For any building destruction,” he adds, “it was very important to Alex Wang that you could feel the weight of the buildings pressing down over decades. Our DNEG Environment Supervisor, Adrien Lambert, built custom tools for Houdini that would give us base destruction simulations. This gave us a realistic collapse, broken windows and framework. Those were then cleaned up and manually set dressed in great detail. Every floor was filled with furniture, debris, cables, curtains blowing in the wind and so on. Once our destruction was in place, we would be able to add weathering and overgrowth. A great deal of thought went into the types of vegetation, the seasons and where and how it was growing.”

A complex shot occurs as the pandemic begins, when Joel, along with his daughter Sarah and brother Tommy, is fleeing. “It’s a long shot that goes through many stages of that town. In the end, they encounter a plane that is starting to fall and crashes and explodes,” Wang recounts. “A piece of landing gear is pushed out of that explosion, and that’s what knocks their truck over. We’re talking thousands of frames. Because the camera was essentially inside the truck, they drove over a wedge, and that was enough to knock it over. On top of that, we would add some camera shake to that as well.” And, of course, one cannot forget the cameo of a particular animal. Explains Wang, “In Episode 109, we have an iconic moment with Ellie and a giraffe. We built a proprietary scanning booth that enabled our hero giraffe to come into it because we were cutting between a live-action and CG giraffe, and it needed to feel seamless.” Seamless is a key word when describing the visual effects work in The Last of Us. “I wanted the world to feel like I embraced the location and shot it without replacing every single thing, but touching every portion,” says Wang. “It has to feel grounded and believable that we are there. The best compliment would be, ‘Where are the visual effects here?’”

TOP TO BOTTOM: An intense moment is when a swarm of infected comes rushing out from their underground dwelling after a house blows up. A stunt performer portraying a Bloater emerges from the inferno in a prosthetic suit that was digitally enhanced. With humanity devastated by the parasitic fungus Cordyceps, nature reclaims the world. The special effects team led by Joel Whist built a rig for when a steakhouse gets set on fire.

IMMERSED IN THE MANY WORLDS OF JACQUELYN FORD MORIE

Considering her love for mathematics, arts and science, it is not surprising that Jacquelyn Ford Morie is a pioneer of virtual reality, as those three disciplines are the foundation for the medium that offers new ways for people to expand their experiences within a digital construct. Before becoming a Founder and Chief Scientist at All These Worlds, LLC, which explores the future uses of virtual worlds and the societal impact of avatars, the recipient of the Accenture VR Lifetime Achievement Award at the 6th International VR Awards created training programs for the animation and visual effects industry and was a senior researcher at the University of Southern California’s Institute for Creative Technologies.

Life began for Morie in Frankfurt, Germany, where her father was a U.S. Air Force engineer working on planes, redistributing what they could carry for the Berlin Airlift after World War II. The family continued to move around until settling in West Palm Beach, Florida. “I feel like I grew up in a little wonderland of Florida before it became so overbuilt. Our vacations were fishing trips. I knew how to deal with alligators and water moccasins. It was an interesting childhood.” Grade nine was a defining year for her, as she went to public high school after attending a small Catholic school for eight years. “I had three classes that shaped my whole future,” Morie says. “An art class that was practical and art history, a math class taught by a guy that looked like Rudy Vallée, and a science class where the teacher gave us foundational science research projects.” It was important to be exposed to all three subjects. “We end up pushing people into choosing one or the other, but they’re compatible, and I’ve always thought that. My mother didn’t try to set boundaries. It was nice not to be steered into a profession that would make money,” Morie reveals.

While attending the Fine Arts photography program at Florida Atlantic University, where she earned her Bachelor’s degree in 1981, Morie decided to take a computer class to prove that computers were detrimental to society: the action backfired. “It turned out that I was one of the five people out of 90 who got an A. What was interesting to me was that you had to go into this logical mindset.” An Apple II personal computer became a part of her household in 1981 and came in handy at the University of Florida. “I spent an extra year looking at the computer as an image-making device. When I would bring something in, my fellow graduate students would be outraged because I was using a machine to make art. We’re talking about photography students here! The whole photography realm had gone through that same argument. I took these overly pixelated images from the Apple II that I would draw with a program or graphics tablet or some other mechanism, and I would photograph them from the back of the room with a telephoto lens to minimize the curvature of those big CRT screens back then. I would take them into the darkroom and multiple print them as regular photo images. That was called ‘Integrated Fantasies.’ Almost 40 years later, I took those images and made a virtual reality experience titled ‘When Pixels Were Precious.’”

Off to college, 1968

In her darkroom, c. 1978

What followed was a Master’s degree in Computer Science from the University of Florida and a career in training, whether it be teaching professors how to use CAD software at the IBM-funded CAD/CAM facility at the University of Florida, redesigning the computer graphics program at Ringling College of Art and Design, or implementing a year-long training program for incoming computer animators at Walt Disney Feature Animation. “As much as it was some of the best people I’ve ever worked with, it was a hard system to do something new in, so I left Disney. I heard that VIFX was looking for somebody to set up a training program.” Originally a visual effects company owned by 20th Century Fox, VIFX was sold by Fox to Rhythm & Hues. “There were vicious wars at that time in the 3D animation industry. Everybody was trying to get into the game and undercutting each other. It hasn’t changed since! The production schedules were not humane, and I had many discussions with production artists and technical directors about what it takes to make another explosion along Wilshire Blvd., or make this lava look good in this Armageddon movie. I could see all these kids feeling that they’d wasted their lives making entertainment for people. I thought, we have this powerful new medium [in virtual reality] that nobody is using: How can we make it so that it provides meaning for people?”

Images courtesy of Jacquelyn Ford Morie. TOP: Jacquelyn Ford Morie OPPOSITE TOP TO BOTTOM: Inside Paul Debevec’s Light Stage at Google Research, 2020. (Photo: Jay Busch) Jacki Ford, aged 2, and her father James Anthony Ford on a trip across the Southwest desert.
PROFILE

Morie was a guest speaker at a workshop held at the Beckman Institute in Urbana, Illinois, titled “Modeling and Simulation: Linking Entertainment and Defense,” which had evenly split representation from the entertainment and defense industries. “In three days, we went through what it would mean if we took everything we knew about simulations and entertainment and put them together to make something new,” Morie explains. “Ed Catmull from Pixar had the first talk and I had the last one. One of Ed’s things was that we need to be free to pursue crazy and wild ideas.” The climate surrounding grants had moved in the opposite direction. “For grants today, you have to have deliverables and do everything that you say that you’re going to do,” Morie says. “The military funded those things back then and just said, ‘Discover stuff.’ That is so freeing and amazing to just pursue something for the joy of pursuing something new. Out of that workshop, the Army decided to fund a lab to do exactly that. They put it out to three universities; USC got the contract in 1999 and that became the Institute for Creative Technologies. I moved from the visual effects and animation industry into starting up this lab.”

Is there a different dynamic between academic grants as opposed to military funding? “The VR lab that I went to in Orlando was military-funded primarily and had a few civilian things,” Morie notes. “It was early and explorational. By the time we started the Institute for Creative Technologies, there was some entrenched military thinking. Luckily, the Institute for Creative Technologies was a line item in the Presidential budgets, so it couldn’t be messed with too much. It was more of a discussion on how we use it, and our director, Richard Lindheim, was good at letting us researchers say what we wanted to do and then finding a way of fitting us into that budget. It’s how you manage things and let the voices be heard. It was a unique time at the beginning of the Institute for Creative Technologies. One of the smartest things that they ever did was not make it part of a department at USC. It was reporting directly to the provost for research, so we did not have to dance to Engineering or Cinema. It was the most freeing thing to pursue interesting intellectual and creative pursuits. In my long career, what I have found is, one or two people can change the way something unfolds because of their passion and voice.”

LEFT TO RIGHT: Morie and team at the University of Florida IBM CAD/CAM Grant Facility, Gainesville, Florida, 1985. Working on her Master’s thesis on her Apple II computer, 1983. Morie in the Leep System’s Cyberface 2 head-mounted display, wearing the VSL’s own data glove, designed by Daniel P. Mapes, in 1991. Morie and Mike Goslin, creators of the VR work Virtopia, at the University of Central Florida Institute for Simulation and Training’s Visual Systems Lab (IST/VSL), 1992. (Goslin later worked at Disney Imagineering on VR.)

After spending 13 years at the USC Institute for Creative Technologies, Morie established All These Worlds, LLC in 2011, which developed virtual worlds on behalf of NASA to provide astronauts relief from the social and psychological isolation associated with long-duration space flight missions. “We did the NASA project, which was an open simulation server that we ran ourselves in California. I did a lot of work in Second Life [video game] before the new VR gear started coming out in 2013. It was a good platform, especially for social research. One of the big things that I did was a warehouse full of hazardous equipment for Texas A&M University, and they used that for studies.”

Virtual reality has been hampered by the need to make it story-driven, according to Morie. “Virtual reality is a medium most akin to our real-life experiences. The story comes out of what your experience is. VR allows us to make our choices, have agency, and make every single instance of experience different if we do it right. Very few people came at it from a foundation of thinking about what VR is, like Nonny de la Peña with her immersive journalism.”


Morie contemplates the team brainstorming board for the VR environment Darkcon at the USC Institute for Creative Technologies, c. 2003

LEFT TO RIGHT: Morie, ever the photographer, at Kings’ Canyon, 2001. The NSF-funded Interfaces Exhibit at the Boston Museum of Science. Morie was an integral part of bringing these interactive virtual humans to life as twin guides to the museum. Jeff Rickel, Lewis Johnson, William Swartout and Jonathan Gratch, the Mission Rehearsal Project team at the USC Institute for Creative Technologies, 2000. Cover of the 2020 Handbook of Research on The Global Impacts and Roles of Immersive Media, co-edited by Dr. Morie and Kate McCallum. Morie showing off her Women in XR t-shirt at the 2022 Augmented World Expo (AWE) Conference.

Other notable innovators are Nanea Reeves, Co-Founder and CEO of TRIPP, Inc., who creates trippy VR journeys that are good for one’s mental health, and Virtual World Society Founder Tom Furness, who is driven by a mandate to have immersive technology lift humanity. “This whole idea of the metaverse has made us think how much bigger VR can be,” Morie observes.

Populating the metaverse are user avatars, which raises questions as to whether this will alter how people interact with one another socially. “This is a great unanswered question,” Morie remarks. “The last project I was going to do at the Institute for Creative Technologies was called the ‘Avatar Investment Metric,’ because I truly believe avatars do influence the way we interact. The way we use avatars depends on the look, feel and functionality of the avatar, and on certain personalities or deep psychological traits that the person has. That’s what the Avatar Investment Metric was going to tease out. It was going to say, ‘We’ve done this study and people with this type of personality tend to use avatars as a substitute for self or experimental personas or a record of their life.’” There are dozens of uses for avatars that even date back to the existence of shamans. “In the 2000s and 2010s, kids growing up were more involved in these virtual spaces than anyone realized, such as Club Penguin, Poptropica and Minecraft. These kids have grown up with avatars as their social glue, and we have not looked at that from a research perspective, but I know that for the students that I’m teaching at Otis College of Art and Design, avatars are part of their lives. I don’t have an answer for how it influences their face-to-face communication. We need studies on that. Maybe it makes us feel better about ourselves or helps us to find our true person. Avatars are key to a lot of mental health advances that we could be making. I see them as critically important to the development of our technological society and how we interact.”

There is much to be explored and discovered about virtual reality. “Virtual reality only works because we put our bodies into it and track everything that our bodies are doing,” Morie states. “What is it about our bodies that we are ignoring for this? That’s a new phase of research that will help with mental and physical health and therapy. We are starting to see it. I always say, ‘We have underestimated the brain’s plasticity and VR is a way to unlock it.’”

The scalability of gear and the constant need to upgrade firmware and hardware are major issues. “It’s not easy to use, but all of that is progressing. We do need more sensors and have to think about the ethics of looking at people’s eye tracking. What do you do with that data? The form factor of these ways that we put ourselves into VR is changing.” The future lies in WebXR, Morie says, with “things like Mozilla Hubs, where you can build stuff in an immersive application and then share it with people through a web-based interface. The program knows what is connected to your system and will serve you the correct functionality for whatever gear you’ve got. That’s the way it has to go. I believe in five years WebXR will be the primary way we experience much of this immersive stuff. It’s not on the radar of most ordinary people, but I do believe WebXR can crack that scalability aspect.”

TOP TO BOTTOM: At the 1992 IAAPA Convention in Orlando, Florida, in the CyberTron VR interface. At the 1992 IAAPA Convention in Orlando, Florida, in the Virtuality VR System. Trying out the early Oculus DK1 headset, 2013. Working on the VR Navigator project at the University of Central Florida Institute for Simulation and Training’s Visual Systems Lab (IST/VSL), 1993. Morie made the wand she is holding in her left hand as part of the system interface.

ILLUMINATING THE VISUAL EFFECTS OF SHAZAM! FURY OF THE GODS

Best described as the movie Big with a superhero complex, Shazam! brought an irreverent teenage attitude to the DC Extended Universe, where Billy Batson (Asher Angel) and his fellow foster children can transform themselves into adults with extremely heightened abilities and powers. The mix of comedy and action resulted in a worldwide box office of $366 million. Filmmaker David F. Sandberg has returned, along with a cast of Asher Angel, Zachary Levi, Jack Dylan Grazer, Adam Brody, Djimon Hounsou, Faithe Herman, Meagan Good, Ian Chen and Ross Butler, for Shazam! Fury of the Gods.

“In the first movie, Shazam [Zachary Levi] broke the Wizard’s [Djimon Hounsou] staff, which was the gateway for the bad guy to have the power,” explains Bruce Jones (Reminiscence) who, along with Raymond Chen (Togo), served as co-visual effects supervisor for the sequel. “In this story, the Wizard’s staff had imprisoned gods from Greek mythological times, so when Shazam broke it, he released three daughters of Atlas who have come back for revenge on mankind because human wizards had imprisoned them. They plant a seed from the god realm that grows into a giant tree in our world. Roots sprout throughout the city of Philadelphia and these seed pods burst open, releasing manticores, minotaurs, harpies and cyclops that terrorize the fine folk of Philadelphia. Our heroes come to the rescue and fight these evil powers until they ultimately overcome them.”

Certain effects were carried over into the second installment of the franchise, which consists of just under 2,000 visual effects shots. “The way Shazam fires electricity bolts was a look that had been developed by MPC,” Jones states. “This had new suit designs and bolts that looked partly electrified and metallic. Early on, David Sandberg did concepts of creatures that were built with ZBrush and had turntables of them. Those were our guides. Our special effects team, led by J.D. Schwalm, built big mechanical bucks for the unicorns that our heroes ride. The dragon was primarily DNEG, and Pixomondo supplemented 100 shots. Wētā FX handled a lot of the mythological creatures. Scanline VFX looked after the Benjamin Franklin Bridge sequence. RISE did big, beautiful environments like the Realm of Gods that takes place on Mount Olympus and the Library of Eternity, which is this endless library where the flocks of books fly around. Method Studios was responsible for when Hespera (Helen Mirren) and Kalypso (Lucy Liu) enter the Acropolis Museum in Athens, which has the broken staff on display. We had BOT VFX and Stereo 3D enhancing 400 bolts on the costumes, an in-house company called Hollywood VFX did a ton of look development, and Territory Studio created the motion graphics.”

Images courtesy of Warner Bros. Pictures. TOP: The chest bolt was a practical lighting element that subsequently had to be replaced in post-production to fit into the suit properly. OPPOSITE TOP TO BOTTOM: Lucy Liu’s character Kalypso rides the dragon, so there was a lot of hydraulic buck shooting that happened on set. Anthea (Rachel Zegler) has the Power of Axis, which is the ability to manipulate the world around her.
FILM
The daughters of Atlas, Hespera (Helen Mirren) and Kalypso (Lucy Liu), escape from the broken magical staff of the Wizard Shazam (Djimon Hounsou) and reassemble the ancient relic to wreak havoc on humanity.

A physical chest-bolt appliance on Shazam meant to provide Cinematographer Gyula Pados with interactive lighting proved to be problematic. “It would go on top of the metal chest bolt. Then, being LED, there was some foam on top of that to cast a soft light on the underside of the chin,” Chen reveals. “This added quite a bit of bulk to the chest, and, in some cases, it protruded quite a bit. There were a large number of shots where we had to take out the appliance, put in the CG version of the glowing bolt and fix everything. There were shots where the entire chest was replaced. It was a lot more shots than we had anticipated.” Other practical elements worked quite well. “We had a number of dragon shots, and one of the things that we worked out with J.D. Schwalm was an industrial robot that he actually used on Black Adam, which was originally built to move around car chassis. We programmed it to have the motion of the dragon so that Lucy Liu’s character, Kalypso, could actually ride it, as the more traditional six-piston hexapod motion bases didn’t have enough travel for us for some of the shots. We went through and pre-animated the scenes, converted the files, worked out some of the motion, and basically filmed those. Those actually worked out quite well to get the interactivity of Lucy reacting to the movement of the motion base. For an actor, it’s hard to fake that real physical act of being tilted.”

“[The cyclops] is quite a Ray Harryhausen cyclops, which I appreciate. The minotaur has what looks like armor plates growing out of its flesh and skin.”
—Thrain Shadbolt, VFX Supervisor, Wētā FX

Omnidirectional carts were constructed for the unicorns. “They could carry a 1,000-pound payload, and it allowed us to have essentially a mechanical unicorn rig that would replicate the backbone and neck and follow the same motion for a walk and run,” explains Schwalm. “Each cart had an effects operator on it. With a remote control you could drive them left, right, back, spin them around, bring them all into a circle, do their dialogue, peel out and take off running down the stage with the actors on it.”

Principal photography took place in Atlanta, which is receptive to film productions. “We got to do a lot of neat stuff right in the middle of downtown, and even got road closures on some busy intersections to do that stuff,” Schwalm says. “Three cars were flipped at the same time, with one of them going over the top of the others.” A complex gag that required robotics and motion control cameras was having the dragon crush a motorhome in front of the young adult actors. “We had to time the speed at which the dragon crushed the motorhome with the speed of the dragon in the previs,” Schwalm notes. There is no lack of breakaways. “In the underground warehouse, we had a huge number of breakaways consisting of concrete ceilings and floors, walls and shelving units as Shazam is getting thrown around like a rag doll,” Schwalm adds.

Ladon, the dragon that protects the Tree of Life, had to go through design modifications. “The creature was originally designed with small or evenly spaced limbs so the front and back legs are the same lengths, but as soon as we got into this lioness motion, a lot of that power needed to come from the front. So we started building the character up with a much bigger chest and longer front limbs, and then started getting into the length of the neck and size of the head,” explains Russell Bowen, VFX Supervisor at DNEG. “Ultimately, the dragon was still the same size, but the proportions were manipulated so we could get a more powerful creature out of it and a much more elegant adult dragon.”

A major environmental element is the Power of Axis, which is an ability that belongs to Atlas’ daughter Anthea (Rachel Zegler). “Anthea can manipulate the world around her. It almost has this kaleidoscope feel where everything starts to become like a puzzle piece that starts to rebuild itself, and she can move a building from one place to another. The city build of Philadelphia had to be detailed, including interiors. Obviously, you can’t do that for every single building, so we had to come up with a robust procedural way of building a massive city on a large scale that was as close as we could to the real thing, because if it doesn’t look like Philadelphia then everyone is going to know that it’s CG.”

TOP: Ross Butler as Super Hero Eugene, Adam Brody as Super Hero Freddy, Grace Caroline Currey as Super Hero Mary, Zachary Levi as Shazam, Meagan Good as Super Hero Darla and D.J. Cotrona as Super Hero Pedro. MIDDLE AND BOTTOM: A portion of the Benjamin Franklin Bridge was created in the parking lot at Blackhall Studios.

Around 760 shots were spread over DNEG facilities in Vancouver, Montreal, Toronto, Mumbai and Shanghai. “We had those conversations at the beginning, conceptualizing a dragon that is made out of wood and breathing fire, but it’s magical fire, so it works!” laughs Christine Neumann, VFX Producer at DNEG. “The bigger thing turned out to be the reshoot where the third act got redone. Then you get new plates and have an entirely new segment that you need to plan out all of a sudden and fit in.” A particular shot comes to mind for Neumann. “I would talk about DB7120, which was the shot where Shazam punches the dragon in the face and it’s in slow motion, just because we couldn’t fit it in the timeline and had to get creative in production on how to schedule it. There was a plate for Shazam where he does a backflip, and that’s what we got in the reshoot. He was in this rig and they flipped him around. It ended up being 95% CG.”

Referring back to the stop-motion animation icon is the design of the cyclops. “He is quite a Ray Harryhausen cyclops, which I appreciate,” remarks Thrain Shadbolt, VFX Supervisor at Wētā FX. “The minotaur has what looks like armor plates growing out of its flesh and skin. The harpies look humanoid around the face and are the creepiest creature that we did. Then, of course, they have

TOP TWO: The smoke coming from the damaged cars was digitally added by Scanline VFX. BOTTOM TWO: A major environmental element for DNEG was creating the roots from the Tree of Life spreading throughout Philadelphia.
“The city build of Philadelphia had to be detailed, including interiors. Obviously, you can’t do that for every single building, so we had to come up with a robust procedural way of building a massive city on a large scale that was as close as we could to the real thing, because if it doesn’t look like Philadelphia then everyone is going to know that it’s CG.”
—Russell Bowen, VFX Supervisor, DNEG

wings and feathers. The manticore was our most complex creature. The head is like a lion, but at the same time it’s scaly and has a crocodile mouth, bat-like wings and a scorpion’s tail.” Extensive simulations were generated to depict the giant seed pods from the Tree of Life that give birth to the mythological creatures. “There is an outer skin to the pod, which is rough, and an inner membrane that is more like a caul. In between those, you have another set of fluid simulations. We had to start with the animated creature, and then we had to simulate the initial interior layer of the pod. Then, one layer begets the next, and then we had to simulate the fluids in and around those as well. Then, there was dirt on the outside of the pod, so there would be a rigid body simulation for that. Each creature had its own birthing moment, and the actions within those were bespoke for the shot and creature.”

Not everything is associated with Greek mythological mayhem, as is the case with the Benjamin Franklin Bridge sequence, where a wayward crane smashes into a Jersey rail [barrier], causing a disastrous domino effect. “This whole sequence is like a short film,” observes Joel Delle-Vergin, VFX Supervisor at Scanline VFX. “The set itself was a carpark at Blackhall Studios where we added a zipper rail and had Jersey rails on both sides. Then, the rest of it is bluescreen. Each day would have its own set pieces as to where these cars are, and we could change the direction of the cars for the other lanes. We used The Third Floor’s Cyclops, which is like Simulcam, to see the bridge in the shot to make sure that eyelines were correct.”

The bridge had to be photoreal and dynamic. Delle-Vergin explains, “Everything down to the rivets was built, and the bridge is moving the whole time. It ended up being almost 200 million polygons. It had to be constructed in a way that was modular enough so it could be like a puzzle piece, in order to be flexible and detailed enough for effects to pull it apart to reflect the different stages of destruction.” Another major build was the surroundings. “Another guy and I ended up taking 20,000 photos to produce giant bubbles from all parts of the bridge, not knowing exactly where things were going to be placed,” Delle-Vergin adds.

“You probably wouldn’t think that heavy visual effects would be the most difficult work,” states Chen. “For example, we used LED screens on the school rooftop, but we needed to replace a large number of them. We were shooting with multiple cameras on set, so you couldn’t get two cameras to have the right perspective, and the color and look of the sky weren’t necessarily right. We did a lot of sky replacement on stuff that we shot on LED screens. There is a lot of fine work in terms of roto, hair and integration. The end sequence is satisfying because it’s a nice final battle between Shazam and Kalypso. The destruction is fun, but it’s also visually quite striking in the nighttime setting and using the lightning to illuminate things.” Getting the proper interaction for things that didn’t exist in reality was difficult.

Schwalm was kept busy every day because of the scope of the work. “[I’m looking forward to] the bridge sequence and when the monsters are running wild downtown,” he says. “I’m excited about the scene that takes place in that underground warehouse with Shazam going through all the concrete walls. That is going to look neat.”

TOP TO BOTTOM: Ladon the Dragon is made out of wood just like the Tree of Life that he protects. Hespera (Helen Mirren) creates a forcefield dome around Philadelphia that creates a lightning-in-a-bottle scenario for the trapped Shazam. The proportions of Ladon the Dragon were manipulated by DNEG to get a more powerful, elegant adult creature. The design of the Cyclops was inspired by stop-motion animation icon Ray Harryhausen.

CONJURING MOVIE MAGIC WITH PAUL DEBEVEC

In the middle of the innovation centered around photogrammetry, HDRI, image-based lighting, digital actors and virtual production has been Paul Debevec, VES, who is currently the Director of Research, Creative Algorithms and Technology at Netflix and Adjunct Research Professor of Computer Science in the Viterbi School of Engineering at the University of Southern California. “As I accepted the Charles F. Jenkins Lifetime Achievement Award [at the 2022 Technology and Engineering Emmy Awards], one of the things I noted is that they were recognizing all the stuff that I did 20 years ago,” Debevec recalls. “I was a different person back then! I am still trying to do some cool stuff. There was a bit of magic at Berkeley that was nice to ride the wave of.”

For Debevec, motivation comes from trying to figure out how things are made. “Movie visual effects were so astounding and exciting to watch, and there’s a reason why they call it movie magic because it looks like one thing but is actually a different thing. It’s a miniature or bluescreen or CGI render but looks like a spaceship or dinosaur, or your lead character is careening off a railroad track.” Academia and the visual effects industry are technological partners, Debevec says. “So much of what happens in visual effects is made in this virtual collaboration with academia; they inspire and feed back into each other. It comes together at the SIGGRAPH conference.”

Academia was not an unfamiliar career path for Debevec as his father earned a Ph.D. in Nuclear Physics from Princeton University and became a professor at the University of Illinois, which had a nuclear accelerator. “My dad being a nuclear physicist and a professor was definitely influential, because I felt that I should have some relationship with academia. His work was technical, and when I visited his nuclear physics laboratory, there would be all sorts of interesting technology. There were PDP-11 and VAX computers. My parents also happened to do some darkroom photography. At some point in elementary school, the computer van came by, and students were allowed to play with the TRS-80 or Apple II.” His fascination for computer programming led to the purchases of a Commodore VIC-20, Commodore 64 and Commodore 128, which are preserved in a display case along with an 8mm Bell & Howell film camera that was gifted by his grandfather. “It’s like your typical story where you get a movie camera as a kid and would make some stop-motion films. You could hear it click and shoot the individual frames. I also had the Canon AE-1, which I used when I was my high school yearbook photographer.”


Debevec earned degrees in Math and Computer Engineering at the University of Michigan in 1992 and a Ph.D. in Computer Science from the University of California, Berkeley in 1996. “You try to find something that maximizes how many of your talents and interests it leverages because that’s going to give you the most capability in that particular area. Finding something that leverages an interest in photography and computers, especially using computers to take pictures, and my love of film visual effects, all of that converged. At the same time, since my father was a professor, I don’t think it would have ever occurred to me to directly work at Digital Domain or Industrial Light & Magic. However, I did apply for an internship at Industrial Light & Magic in the summer

TOP: Paul Debevec, VES. BOTTOM: A self-portrait taken in 1988. OPPOSITE TOP TO BOTTOM: Debevec stands beside the Light Stage rig that resulted in him receiving his second Technical Achievement Award at the 91st Academy Awards, 2019. Debevec in the early 1980s with his Commodore 64, which he used for computer programming rather than playing games.
Debevec developing photographs in the University of Illinois Laboratory High School – also known as Uni High – darkroom in 1987.

of 1994 but never heard back from them. But I was lucky at USC that I could be a research professor, publishing at the SIGGRAPH conference and collaborating with the visual effects industry, scanning actors and consulting on image-based lighting for films.”

Photography made Debevec aware of the possibilities for lighting. “Being in the darkroom and shooting all that film gave me a sense of the dynamic range of light in the world – seeing those little numbers on the shutter reel that went from 1 to 1,000. The work that I ended up doing in high-dynamic-range imaging was inspired by realizing back then that there’s this huge range of light in the world that is not getting captured on a single exposure of a negative. When digital cameras came out, they had even less dynamic range.”
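The technique this realization inspired, merging a bracketed series of exposures into a single radiance map, is what Debevec later published with Jitendra Malik at SIGGRAPH 1997. A minimal sketch of just the merging step, assuming already-linearized images (function and variable names are illustrative, not from any production tool):

```python
import numpy as np

def merge_exposures(images, times, eps=1e-6):
    """Merge linear-response exposures into one HDR radiance map.

    images: list of float arrays in [0, 1] (same shape), assumed linear.
    times:  matching exposure times in seconds.
    Each pixel's radiance estimate is a weighted average of image/time,
    using a hat-shaped weight that trusts mid-range pixels most and
    ignores clipped shadows and highlights.
    """
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # 0 at black/white, 1 at mid-gray
        num += w * img / t
        den += w
    return num / np.maximum(den, eps)
```

Shadows are effectively read from the long exposures and highlights from the short ones, which is how a range of light far beyond any single negative survives in one image.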

After witnessing the dinosaurs in Jurassic Park during his second year of graduate studies at Berkeley, Debevec was surprised by the methodology utilized by Industrial Light & Magic to integrate the digital creatures into the scenes with actors. “They said, ‘We try to write down where the lights were and then try to recreate it. If it doesn’t look right, then we iterate until it finally looks right.’ That sounded laborious and difficult for me to practice. I liked the idea of adding things that were never there for me to tell a story. When I made my film Fiat Lux, where we put the big monoliths and spheres in St. Peter’s Basilica, I was able to use this process where I captured the real light in the real world


using HDR photography panoramically, so you see light from all directions and then light your CGI objects with this measurement of the light that was actually there. The data worked, and nowadays when ILM needs to put a dinosaur into a shot, they’re likely to use an HDRI map to light it.”
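At its simplest, the image-based lighting process Debevec describes convolves that panoramic HDR map against the surface’s reflectance lobe. A toy diffuse-only version over a latitude-longitude environment map might look like the following (names and layout are assumptions for illustration, not any production renderer’s API):

```python
import numpy as np

def diffuse_irradiance(env, normal):
    """Diffuse image-based lighting from an equirectangular HDR map.

    env:    (H, W, 3) linear radiance, latitude-longitude layout.
    normal: unit 3-vector (y up).
    Sums every texel's radiance weighted by its cosine term and its
    solid angle (sin(theta) for lat-long maps); dividing by pi gives
    the radiance a white Lambertian surface would reflect.
    """
    h, w, _ = env.shape
    theta = (np.arange(h) + 0.5) / h * np.pi        # polar angle per row
    phi = (np.arange(w) + 0.5) / w * 2 * np.pi      # azimuth per column
    st, ct = np.sin(theta), np.cos(theta)
    dirs = np.stack([st[:, None] * np.cos(phi)[None, :],
                     np.repeat(ct[:, None], w, axis=1),
                     st[:, None] * np.sin(phi)[None, :]], axis=-1)
    cosine = np.clip(dirs @ np.asarray(normal, dtype=np.float64), 0.0, None)
    d_omega = (np.pi / h) * (2 * np.pi / w) * st[:, None]  # texel solid angle
    return np.einsum('hw,hwc->c', cosine * d_omega, env) / np.pi
```

A constant map of radiance 1.0 should return about 1.0 for any normal, which makes a handy sanity check; production renderers importance-sample the map rather than summing every texel.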

For Debevec, his best accomplishments occurred when he was able to take a new technical idea and apply it in a creative way that counted for something, such as with his short films, The Campanile Movie, Fiat Lux and The Parthenon. “I was lucky that when I went to SIGGRAPH for the first time I was covered by my internship at Interval Research Corporation, and I saw the SIGGRAPH Electronic Theater and was blown away. I did my Ph.D. work on modeling and rendering architecture from photographs, which got published at SIGGRAPH 1996. One of the examples in the paper was a 3D model of Berkeley’s tower from a single photograph, because it has so much symmetry. I had the idea that this would be far more striking if we could not model just one building, but also model its environment in 360 and then fly around the building. I met an architecture professor named Cris Benton who did kite aerial photography. We took a Canon Rebel film camera with a 24mm lens and got it 400 feet in the air and took pictures of the Campanile with that.” The photogrammetry work in the short captured the attention of Visual Effects Supervisor John Gaeta, who was working on the visual effects of The Matrix. “John realized that tech would solve a visual effects problem that they were having in recreating a CGI version of the scenes that the bullet time shots would take place in. I helped to advise him before he went to take the photos on top of that building in Australia. John hired one of the grad students I had supervised on The

TOP LEFT: A GoPro shot of the Light Stage rig that produced a 3D scan of President Barack Obama at the White House. BOTTOM: A career highlight for Debevec was producing a 3D scan of President Barack Obama, courtesy of the Light Stage rig. TOP RIGHT: Debevec gives an acceptance speech upon receiving the Charles F. Jenkins Lifetime Achievement Award at the Technology & Engineering Emmy Awards in 2022.

Campanile Movie, and they got the software I had developed for my Ph.D. to Manex Entertainment, which was then used to create the virtual backgrounds. Being associated with The Matrix gave me a certain runway to keep doing the stuff that I was hoping to do in the academic world.”

The first light stage at UC Berkeley was a rather low-tech affair with the original goal being to surround actors with LEDs to display images of the set around them. “It was made out of a few hundred dollars of lumber and a single 250W photography kit spotlight that spun around on ropes,” Debevec recalls. “Once we had devices that surrounded people with precisely controlled computer illumination, it became clear that we could do lots of interesting things with this. The polarized spherical gradient illumination was born out of trying to have a process that would get 3D scans of the face that would be as detailed as doing a plaster cast and then laser scanning it. One day, I started playing around with putting polarizers on all the lights and flipping a polarizer in front of the camera, and I was able to isolate the specular shine of the skin. From different lighting conditions, you could figure out the surface orientations, basically what part of the light stage is reflecting in that pixel of the skin so you can get a high-resolution surface normal map photographically. If you write an algorithm that does a photogrammetry solve, plus embossing for all of this detail, you get face scans from this normal map that are 1/10mm accurate. When [Visual Effects Supervisor] Joe Letteri saw our work through Mark Sagar at Wētā FX, he decided to send the Avatar actors over, and we got to start applying this technique in movies.”
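The Lambertian form of that normal recovery is remarkably compact: one photo under full-on spherical illumination plus one under each of three linear gradient patterns determines the per-pixel normal direction up to normalization. A sketch under those assumptions (array names are illustrative):

```python
import numpy as np

def gradient_normals(full, gx, gy, gz, eps=1e-8):
    """Per-pixel photometric normals from spherical gradient illumination.

    full:       image under constant (full-on) spherical lighting.
    gx, gy, gz: images under linear gradient patterns along x, y, z.
    Under a Lambertian model, the ratio of each gradient image to the
    full-on image encodes one normal component: n_i = 2*g_i/full - 1.
    Returns unit normals with shape full.shape + (3,).
    """
    ratios = np.stack([gx, gy, gz], axis=-1) / np.maximum(full, eps)[..., None]
    n = 2.0 * ratios - 1.0
    return n / np.maximum(np.linalg.norm(n, axis=-1, keepdims=True), eps)
```

With the polarizers isolating the specular reflection, the same ratios yield much sharper specular normals, which is where the fine skin detail in the scans comes from.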

Neural rendering and AI are completely transforming how

LEFT TO RIGHT: Debevec is one of the inventive minds behind the Polarized Spherical Gradient Illumination facial appearance capture method. Debevec gives visual effects legend Douglas Trumbull, VES, a tour of the Light Stage. Debevec with one of his mentors, film director Alexander Singer (middle), and his son Jethro Singer at Alex’s 90th birthday. At the 2010 VES Awards with actress Emily O’Brien, of our “Digital Emily” project, and Pixar founder Ed Catmull. Panorama of Debevec standing amongst the team responsible for the Light Stage.

computer-generated images are created. “There are the famous stories about how Phil Tippett thought his career was over when computer-generated dinosaurs became a thing,” Debevec observes. “It turns out that his career was not over. Look at Starship Troopers. This is another moment where a lot of us need to take the time to familiarize ourselves with the power of these algorithms. AI-generated art may be still frames, but the research papers are showing that they can make image sequences. I did an interview for The Hollywood Reporter about how AI would change Hollywood back in May 2020, and I almost cheekily said, ‘A few decades from now you’ll be able to send in the script of your movie and it will output the movie. You just tell it whose style you want to make it in.’ It turns out that’s not decades from now. We will be able to have this PDF-to-MP4 converter within this decade. In fact, you don’t even have to supply your own PDF of the script. We’re going to make the script with ChatGPT, which doesn’t do things that have tons of substance to them. But if you have a person figure out the key elements, story points and character arcs of a film, then generative script and text tools, at least in an augmented way, are going to help you write your script, and then it will output a version of the film. Where the real research is going to be is how to then direct it to fix it up and also get it to be consistent from frame to frame. I firmly believe that as crazy as it is, we won’t be making movies or visual effects the way we are now within a decade. The most optimistic guess of what’s going to happen is this will make it so that many more people have access to creating things that look like high-end feature films. It’s about embracing change and finding out how these tools can elevate your craft,” Debevec states.

Certain achievements stand out for Debevec, such as “building the 3D facial scanner that got used on 100 movies with the light stage and getting invited to the White House to scan President Barack Obama in 2014,” Debevec remarks. “And getting to work with the survivors of the Holocaust through the USC Shoah Foundation and recording them with holographic imaging lightfield arrays we built for that.” There are defining character traits. “Whenever I get to a point where I have a vision for something and it doesn’t exist yet, then I want to close that gap.”

Private research has bigger budgets than academia. “It’s a constant effort to have the powers that be convinced that this is research worth investing in. In the context of a film production, you need to do something that’s not going to fail, so you need a safe space to try new things, or to try something just because it’s interesting. A lot of the stuff that I’ve done which had an impact were things that I didn’t know were going to work or not. However, I knew it was something that I’d learn something from. I’m in a good position at Netflix because I have two decades plus of research results, not all of which have been applied in the industry yet. We can dip into the archives of cool things that are so much more practical to do today because cameras or LED panels are better. I’m hoping that we’ll get to come up with some good stuff that will help our productions and eventually everybody else.”

LEFT TO RIGHT: Debevec working with the Brutus rig, which records light field video with an array of 24 synchronized cameras. Debevec behind the scenes explaining what he has in mind to a colleague. A shot taken from the 2004 short called The Parthenon. A shot taken from The Campanile Movie, which in turn provided the methodology that was used to create bullet time in The Matrix. Paul Debevec presents his thesis on “Modeling and Rendering Architecture from Photographs” to Prof. Jitendra Malik at the University of California at Berkeley, 1996.

VISUAL EFFECTS ARTISTRY IN THE SPOTLIGHT

Captions list all members of each Award-winning team even if some members were not present or out of frame. Some winners provided a video acknowledgment. For more Show photos and a complete list of nominees and winners of the 21st Annual VES Awards, visit vesglobal.org.

The Visual Effects Society held the 21st Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Comedian Patton Oswalt served as host for the 10th time to the nearly 1,200 guests gathered at the Beverly Hilton Hotel in Los Angeles on February 15, 2023 to celebrate VFX talent in 25 awards categories.

Avatar: The Way of Water was named the Photoreal feature winner, garnering nine awards. Guillermo del Toro’s Pinocchio was named top Animated film, winning three awards. The Lord of the Rings: The Rings of Power was named best Photoreal episode, winning three awards. Frito-Lay: Push It topped the Commercial field.

Academy Award-winning filmmaker James Cameron presented the VES Lifetime Achievement award to acclaimed writer-producer Gale Anne Hurd.

The Society’s current and former Board Chairs presented the VES Board of Directors Award to former Executive Director Eric Roth. The group included Lisa Cooke, current VES Chair; Jim Morris, VES, President of Pixar Animation Studios and founding VES Chair; and former Chairs Jeffrey A. Okun, VES; Mike Chambers, VES; Carl Rosendahl, VES; and Jeff Barnes.

Award presenters included Academy Award-nominated filmmaker Rian Johnson; Academy Award-winning filmmaker Domee Shi; and actors Jay Pharoah, Tyler Posey, Randall Park, Angela Sarafyan, Bashir Salahuddin, Josh McDermitt and Danny Pudi. Dara Treseder, Autodesk’s Chief Marketing Officer, presented the VES-Autodesk Student Award.

For the first time, the VES presented The Emerging Technology Award, with the recipient being Avatar: The Way of Water; Water Toolset.

All photos by Danny Moloshok, Phil McCarten and Josh Lefkowitz. 1. Nearly 1,200 guests gathered from around the globe for the 21st Annual VES Awards. 2. Comedian Patton Oswalt hosted the show.
3. VES Executive Director Nancy Ward greeted the crowd.
4. The award for Outstanding Visual Effects in a Photoreal Feature went to Avatar: The Way of Water and the team of Richard Baneham, Walter Garcia, Joe Letteri, Eric Saindon and JD Schwalm. 5. The award for Outstanding Supporting Visual Effects in a Photoreal Feature went to Thirteen Lives and the team of Jason Billington, Thomas Horton, Denis Baudin, Michael Harrison and Brian Cox. 6. The award for Outstanding Visual Effects in an Animated Feature went to Guillermo del Toro’s Pinocchio and the team of Aaron Weintraub, Jeffrey Schaper, Cameron Carson and Emma Gorbey. 7. The award for Outstanding Visual Effects in a Photoreal Episode went to The Lord of the Rings: The Rings of Power; Udûn and the team of Jason Smith, Ron Ames, Nigel Sumner, Tom Proctor and Dean Clarke. 8. The award for Outstanding Supporting Visual Effects in a Photoreal Episode went to Five Days at Memorial; Day Two and the team of Eric Durst, Danny McNair, Matt Whelan, Goran Pavles, and John MacGillivray. 9. The award for Outstanding Visual Effects in a Real-Time Project went to The Last of Us Part I and the team of Erick Pangilinan, Evan Wells, Eben Cook and Mary Jane Whiting.

10. The award for Outstanding Visual Effects in a Commercial went to Frito-Lay; Push It and the team of Tom Raynor, Sophie Harrison, Ben Cronin and Martino Madeddu.

11. The award for Outstanding Visual Effects in a Special Venue Project went to ABBA Voyage and the team of Ben Morris, Edward Randolph, Stephen Aplin and Ian Comley. Accepting the award was Janet Lewin, Senior Vice President, Lucasfilm VFX, and General Manager of ILM.

12. The award for Outstanding Animated Character in a Photoreal Feature went to Avatar: The Way of Water; Kiri and the team of Anneka Fris, Rebecca Louise Leybourne, Guillaume Francois and Jung Rock Hwang.

13. Comedian and actor Jay Pharoah appeared as a presenter.

14. Academy Award-winning filmmaker Domee Shi was on hand as a presenter.

15. The award for Outstanding Animated Character in an Animated Feature went to Guillermo del Toro’s Pinocchio; Pinocchio and the team of Oliver Beale, Richard Pickersgill, Brian Leif Hansen and Kim Slate.

16. The award for Outstanding Animated Character in an Episode, Commercial or Real-Time Project went to The Umbrella Academy; Pogo and the team of Aidan Martin, Hannah Dockerty, Olivier Beierlein and Miae Kang.
17. The award for Outstanding Created Environment in a Photoreal Feature went to Avatar: The Way of Water; The Reef and the team of Jessica Cowley, Joe W. Churchill, Justin Stockton and Alex Nowotny.
18. The award for Outstanding Created Environment in an Animated Feature went to Guillermo del Toro’s Pinocchio; In the Stomach of a Sea Monster and the team of Warren Lawtey, Anjum Sakharkar, Javier Gonzalez Alonso and Quinn Carvalho. 19. Academy Award-nominated filmmaker Rian Johnson attended as a presenter. 20. Actor Randall Park took part as a presenter. 21. Actor Tyler Posey was on hand as a presenter.

22. The award for Outstanding Created Environment in an Episode, Commercial or Real-Time Project went to The Lord of the Rings: The Rings of Power; Adar; Númenor City and the team of Dan Wheaton, Nico Delbecq, Dan Letarte and Julien Gauthier.

23. Actor Danny Pudi was a presenter.

24. Actor Bashir Salahuddin attended as a presenter.

25. The award for Outstanding Virtual Cinematography in a CG Project went to Avatar: The Way of Water and the team of Richard Baneham, Dan Cox, Eric Reynolds and AJ Briones.

26. The award for Outstanding Model in a Photoreal or Animated Project went to Avatar: The Way of Water; The Sea Dragon and the team of Sam Sharplin, Stephan Skorepa, Ian Baker and Guillaume Francois.

27. The award for Outstanding Effects Simulations in a Photoreal Feature went to Avatar: The Way of Water; Water Simulations and the team of Johnathan Nixon, David Moraton, Nicolas James Illingworth and David Caeiro Cebrian.


28. The award for Outstanding Effects Simulations in an Animated Feature went to Puss in Boots: The Last Wish and the team of Derek Cheung, Michael Losure, Kiem Ching Ong and Jinguang Huang.

29. The award for Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project went to The Lord of the Rings: The Rings of Power, Udûn; Water and Magma and the team of Rick Hankins, Aron Bonar, Branko Grujcic and Laurent Kermel.

30. The award for Outstanding Compositing & Lighting in a Feature went to Avatar: The Way of Water; Water Integration and the team of Sam Cole, Francois Sugny, Florian Schroeder and Jean Matthews.

31. The award for Outstanding Compositing & Lighting in an Episode went to Love, Death and Robots; Night of the Mini Dead and the team of Tim Emeis, José Maximiano, Renaud Tissandié and Nacere Guerouaf.

32. The award for Outstanding Compositing & Lighting in a Commercial went to Ladbrokes; Rocky and the team of Greg Spencer, Theajo Dharan, Georgina Ford and Jonathan Westley.

33. The award for Outstanding Special (Practical) Effects in a Photoreal Project went to Avatar: The Way of Water; Current Machine and Wave Pool and the team of JD Schwalm, Richard Schwalm, Nick Rand and Robert Spurlock.


34. The award for Outstanding Visual Effects in a Student Project (Award Sponsored by Autodesk) went to A Calling. From the Desert. To the Sea and the team of Mario Bertsch, Max Pollmann, Lukas Löffler and Till Sander-Titgemeyer. The award was presented to the team by Dara Treseder, Autodesk’s Chief Marketing Officer.

35. The Emerging Technology Award went to Avatar: The Way of Water; Water Toolset and the team of Alexey Stomakhin, Steve Lesser, Sven Joel Wretborn and Douglas McHale.

36. Actor Angela Sarafyan participated as a presenter.

37. Pixar’s Chief Technology Officer Steve May and Chair of the VES Technology Committee Sebastian Sylwan, VES, were on hand to present the first-ever Emerging Technology Award.

38. Actor Josh McDermitt attended as a presenter.

39. Former Executive Director Eric Roth makes an impassioned speech upon receiving his VES Board of Directors Award.

40. Director James Cameron presented the VES Lifetime Achievement Award to Gale Anne Hurd. 41. Honorees Gale Anne Hurd and Eric Roth share a backstage moment. 42. Former VES Executive Director Eric Roth and host Patton Oswalt reminisce about their friendly verbal jousts over the years at the Awards Show.
43. VES Awards Committee enjoys a moment on the red carpet. From left to right: Dave Gouge, Olun Riley, Stephen Chiu, Rob Blau, DJ Johnson, Katie Brillhart, Dan Rosen, Martin Rushworth, Lopsie Schwartz, Diego Rojas, Den Serras (Co-Chair), Scott Kilburn and Reid Paul (Chair). Not pictured: Brent Armstrong, Emma Clifton Perry, Chuck Finance, Chris Ingersoll, Sarah McGee, Jeff Okun and David Valentin.

44. Eric Roth, former VES Executive Director, is flanked by former VES Board Chairs Jeff Okun, VES; Jeff Barnes; Mike Chambers, VES; current Chair Lisa Cooke; Pixar GM & President Jim Morris, VES; and Carl Rosendahl, VES.

45. Producer Gale Anne Hurd holds her VES Lifetime Achievement Award, flanked by VES Executive Director Nancy Ward, left, and current VES Chair Lisa Cooke.

46. James Cameron and Gale Anne Hurd show off Hurd’s VES Lifetime Achievement Award.

47. Jeffrey A. Okun, VES celebrates his 21st year as the VES Awards show producer.


AVATAR: THE WAY OF WATER


The VES Award for Outstanding Visual Effects in a Photoreal Feature went to Avatar: The Way of Water, which garnered nine VES Awards including Outstanding Animated Character in a Photoreal Feature (Kiri), Outstanding Created Environment in a Photoreal Feature (The Reef), Outstanding Virtual Cinematography in a CG Project, Outstanding Model in a Photoreal or Animated Project (The Sea Dragon), Outstanding Effects Simulations in a Photoreal Feature (Water Simulations), Outstanding Compositing & Lighting in a Feature (Water Integration), Outstanding Special (Practical) Effects in a Photoreal Project (Current Machine and Wave Pool) and The Emerging Technology Award (Water Toolset).

(Photos courtesy of 20th Century Studios)


GUILLERMO DEL TORO’S PINOCCHIO


Outstanding Visual Effects in an Animated Feature went to Guillermo del Toro’s Pinocchio, which won three VES Awards including Outstanding Animated Character in an Animated Feature (Pinocchio) and Outstanding Created Environment in an Animated Feature (In the Stomach of a Sea Monster). (Photos courtesy of Netflix)


THE LORD OF THE RINGS: THE RINGS OF POWER


Outstanding Visual Effects in a Photoreal Episode went to The Lord of the Rings: The Rings of Power (“Udûn”), which won three VES Awards including Outstanding Created Environment in an Episode, Commercial or Real-Time Project (“Adar,” Númenor City) and Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project (“Udûn,” Water and Magma).

(Photos courtesy of Amazon Studios)

VP ROUNDTABLE: ON THE SURGING NEW TECHNOLOGIES THAT ARE CHANGING THE WORLD

The acclaimed science fiction novelist Arthur C. Clarke once said, “Any sufficiently advanced technology is indistinguishable from magic.” It’s not a reach to say that virtual production has not only created magic for the movies, but will create magic for most of society in the years to come. VP and its cousin technologies – VFX, virtual reality, AR and others – are destined to transform the world.

Each of these technologies is progressing toward altering not just the world of entertainment, but education, medicine, science, the military, architecture, real estate, travel – you name it. In some instances, VP, VFX, VR and other technologies are working together to accelerate this new world. What will the landscape look like in 10 years? We gathered a virtual roundtable of those on the front lines and asked them to gaze into their crystal balls.

Sam Nicholson, CEO and Founder, Stargate Studios

Virtual production, virtual reality and artificial intelligence are rapidly changing the way we live, work and play. In the next 10 years, these technologies will exponentially accelerate and merge to disrupt a wide range of fields.


In arts and entertainment, AI will greatly reduce the cost of creating original content, enabling realistic virtual characters and photoreal synthetic environments. VR and VP will increasingly employ these assets to create immersive, interactive movies, games and new media. This will also greatly affect education with realistic simulations and personalized learning experiences.

VIRTUAL PRODUCTION
TOP: In the cockpit of the Razor Crest with The Child and the Mandalorian (Pedro Pascal) in ILM’s StageCraft VP volume during shooting of Season 2 of The Mandalorian. (Image courtesy of ILM, Lucasfilm Ltd. and Disney+) OPPOSITE TOP TO BOTTOM: Magicbox provided the mobile VP setup on Catalina Island for a test shoot with Stargate Studios’ Sam Nicholson and crew. (Image courtesy of Magicbox) Pixomondo’s LED volume in Toronto customized for Season 4 of Star Trek: Discovery. (Photo: Michael Gibson. Courtesy of Paramount+) Shooting Season 2 of The Mandalorian in the ILM StageCraft volume. (Image courtesy of ILM, Lucasfilm Ltd. and Disney+)

In the healthcare industry, AI will analyze massive data sets in the cloud to instantly diagnose patients and recommend treatments. Immersive VR will be used to train surgeons and treat complex mental conditions. In architecture, real estate and travel, VR and VP will increasingly be used to allow users to virtually explore building designs and virtual locations, which will optimize industrial design and environmental planning. In the military, AI will interpret large amounts of historical and real-world data, then employ VR and VP to run multiple simulations for various tactical scenarios.

In the near future these technologies will become widely adopted, blurring the line between physical reality and virtual reality and offering new opportunities which will dramatically change our lives in myriad ways.

Alex Webster, Head of Studio, U.K., Pixomondo

Virtual production tools have changed visual content and entertainment forever, but the extrapolation of real-time rendering technology into medicine, education and beyond is also revolutionary. VR and AR offer a solution to visualize complex, stressful or dangerous situations in a real-world environment. Doctors and surgeons can prepare and review clinical skills to remotely build pre-operative experience, as well as remotely connect with colleagues to view and share how they execute a procedure, democratizing skills, experience and the sharing of life-saving information in a tangible way. Virtual production also allows manufacturers to simulate entire production lines and visualize potential points of failure. Architects can fabricate engineering and design choices in real-time and contextualize those decisions in a virtual simulation of a real-world environment.

As for how film VP will look in 10 years’ time, it’s almost impossible to predict because of the current speed of innovation. But we can assume that VP will behave as an intelligent ecosystem that links pre-photography development and previsualization to final-pixel images in one engine, and leverages AI to enhance and optimize critical thinking in filmmaking practice. Machine learning will automatically apply the tropes of film craft to pre-production, VFX and in-camera VFX, and latency in tracking will be reduced to nanoseconds, profoundly improving the performance of on-set, in-camera VFX. Previsualization of sequences and environments and the characters within them will be almost photoreal and will usher in a paradigm shift in how film and episodic content is conceived and created. We have seen the sudden advances in AI-informed concept art; expect this to translate to moving images in our near future.

On one hand, it’s unnerving, as preconceived notions of production are upended by game engine technology. On the other hand, technology and innovation have been inherent in the film industry since its inception, and we are at the early tipping point of the next fundamental change in how filmmakers can tell their stories.

The ability to quickly create realistic, real-time renderings of complex structures or objects via game engine technology will empower adjacent industries to expedite the development, training and delivery of services.

Everyone is so excited about VR, AR, VFX and VP because they each possess attributes that have been present in just about every major advancement throughout history. I’m talking about increases in power capability, greater storage capacity, hardware miniaturization and mobilization. These are the reasons why we can expect this tech to change our work, our lives and our world. All these attributes are incredible on their own, but when they are combined, they give us the power of scale. This is the key I believe we will use to unlock the secrets of the universe.

When I say scale, I’m talking about the size of the world around us and the ability to change it instantly, zooming in and zooming out, like the scale tool in Adobe Illustrator, with infinite resolution. Imagine the perspective we will gain when we harvest real-world data and use it to build virtual environments we can then safely dive into and swim through as we get smaller and smaller, past the atomic level. What will we discover when we use these technologies to see the quantum world up close? What will we learn about ourselves when we look back? This ability is not that far away, and it’s going to make our universe a much smaller place.

Kim Davidson, President and CEO, SideFX

Virtual production is in its infancy, with production tools and techniques constantly evolving. Currently, VP is done with a patchwork of hardware and software from multiple vendors, glued together by proprietary tools from a SWAT team of talented pioneers. In 10 years, the VP workflow will be standardized, common and available for rent to independent content creators. There will still be several vendors of hardware/software solutions, but they will be adhering to VP standards.

TOP TO BOTTOM: Pixomondo’s LED stage in Toronto showcases the set of the Starship Enterprise’s engine room for Star Trek: Strange New Worlds. (Image courtesy of Pixomondo) ILM StageCraft deployed for a night scene taking place on Daiyu for Obi-Wan Kenobi. (Image courtesy of ILM, Lucasfilm Ltd. and Disney+) On Pixomondo’s VP stage in Toronto for House of the Dragon. (Image courtesy of Pixomondo and HBO)

As virtual production becomes more commonplace, the fruits of the advancements pioneered by high-end film will have benefits beyond entertainment. Imagine walking around inside a 3D display dome and having the perspective of the projected environment adjust correctly to your position and respond to your gestures. Military personnel, archeologists and tourists will familiarize themselves with sites before visiting. First responders will be given several scenarios to train in safe and highly realistic settings. Non-moving vehicles set up in the center of the dome will allow for effective training for bus drivers and operators of heavy machinery as they move through virtual traffic and obstacles.

Outside the dome, actors or teachers can be captured in real-time and placed in the projected environment for viewers and students to engage with. Truly, the ability to create highly realistic spaces where anything is possible is only limited by our imagination.

The first thing to say is that virtual production already means several things which apply to different areas of the production process. The production process is already changing although often at a slower pace than the technologies around it. It’s tempting to think, as many people optimistically did around the pandemic, that virtual production might replace aspects of location shooting and make them cheaper and easier. Or, that the side of virtual production geared around prep might replace the need for scouting, recces and getting people together to share reactions to a location or a set.

I personally hope that it doesn’t replace either of these valued, real and human processes – but that instead, it gives us a wider range of tools for putting the visual pieces of a puzzle together more flexibly and creatively.

One of the aspects of virtual pre-production, which has already developed hugely over the last 10 years, has been the digitization of the art department, and the convergence of the design worlds of art department and visual effects. This requires quite a lot of readjustment in terms of department structures and ways of communicating, but it can and will make us stronger together in terms of the coherence of the worlds we plan and create together. Virtual production in the sense of sharing models and visualizing collectively what the director’s vision can be is definitely an evolving and powerful principle. More a principle of communication than technology, but certainly enabled by modern tools.

On the other end of things, the convergence of photography and visual effects in the in-camera VFX side of virtual production is a powerful means of getting creative decisions made by the right people in collaboration on set. The ability to conceive of lighting and backgrounds as part of the same language, as we do in response to a location with available light, but with control over the time of day and weather, is definitely interesting!

SPRING 2023 VFXVOICE.COM • 69
TOP TO BOTTOM: Filming Thor: Love and Thunder with ILM StageCraft at Sydney’s Fox Studios. (Image courtesy of ILM, Lucasfilm Ltd. and Marvel Studios) Pixomondo’s electric “red stage” in Toronto, one of three LED stages maintained by the company, which was purchased by Sony last year. (Image courtesy of Pixomondo and Sony Pictures Entertainment) Chris Hemsworth as Thor walks on water in Thor: Love and Thunder, filmed with ILM StageCraft on site at Sydney’s Fox Studios. (Image courtesy of ILM, Walt Disney Studios and Marvel Studios)

It is only an answer to a relatively small set of shooting circumstances, but if it can be part of replacing what we have traditionally used blue and green screens for without completely upending traditional shooting schedules, it will be a step in the right direction.

I don’t foresee this technology replacing traditional filmmaking in any significant sense, but hope it will continue to enable us to re-align creative disciplines and visualize how they converge – in prep, on set and in post.

Irena Cronin, CEO, Infinite Retina

Virtual production is still in its infancy. Within 10 years, in combination with VFX, VR, generative AI and technologies not even thought up yet today, VP will have a wide range of uses beyond entertainment. VP will be useful anywhere a virtual environment is needed that can be built relatively fast, adjusted in real-time and flexibly swapped out, with the added perks of being created and updated remotely. This includes realistic visualization for architecture planning, education, training exercises (military, first responders, law enforcement, etc.), marketing and advertising, and staging and backgrounds for real estate. It could be that the expense of VP will come down so much that it becomes even more commonplace, featured in restaurants and public spaces. The time for realistic visuals on demand has come and will increasingly dazzle.

Rene Amador, CEO and Co-Founder, ARWall

VP has completely changed the way genre blockbusters are shot. It’s currently the fastest-growing segment of the movie industry. ARwall’s technology has been designed to democratize VP tools for independent creators and home studios. We envisage creators having a complete toolset to create blockbuster-quality effects with their existing hardware, to the extent of being able to win major film awards from their living room within the next few years.

In 10 years’ time, I expect to see the lines blur between filmmaker, audience and performer. The ultimate goal of entertainment is as yet unrealized: epic, world-spanning stories that can be enjoyed collectively, but are also meaningful on a personal level. Through XR, viewing experiences could be tailored to interweave details of an individual’s real life into the narrative. We’re not far from a society in which these virtual worlds may become more meaningful and ‘real’ than the physical world, and this is both an opportunity and a danger I hope ARwall can help navigate.

Steve Calcote, Director and Partner, Butcher Bird Studios

Imagine turning a script into a blockbuster movie in a single day – with one filmmaker – within the next decade. That’s the inevitable level of creative power being unleashed by the current fusion of virtual production, AI-augmented creativity, real-time game engines and VFX, generative machine learning, reality scanning, digital doubles, automated cloud-based post-production and a global hunger to share visual storytelling like never before.

What place then for the artist? The same place as always: at the forefront of using revolutionary tools to invent visions the world has never seen and inspire humanity anew. We’ve been through this recently: smartphones allow every person on the planet to take perfect photos. Yet relatively few photos cut through the noise to truly move us. The current creativity singularity will empower individuals across every industry and every aspect of society to share their vision more clearly. But it will also enable artists and storytellers to create transformative works that take our breath away… all over again.

TOP TO BOTTOM: Netflix has made a significant investment in VP’s global potential. 1899 was shot at the Dark Bay VP stage in Germany, one of the largest VP stages in the world. (Image courtesy of Dark Bay and Netflix) In the ILM StageCraft VP volume for The Book of Boba Fett, a Mandalorian spin-off that extensively utilizes volume technologies. (Image courtesy of Lucasfilm Ltd. and Disney+) Filming a battle scene for Thor: Love and Thunder with ILM StageCraft at Sydney’s Fox Studios. (Image courtesy of Lucasfilm Ltd. and Marvel Studios) In the Dark Bay volume for 1899. (Image courtesy of Dark Bay and Netflix)

Virtual production is changing the world with digital humans and in how we produce content and tell stories in general. VR relies heavily on the capabilities of hardware. The drive to photorealism is completely dependent on it. As the technology progresses to where digital content is indistinguishable from reality, this is where the possibilities become endless. AI and ML aid in reducing artist time and help bridge the gap in photorealism by using libraries of content, but having the ability to create fully photorealistic, close-up humans, either based on real people or completely invented, means we can have posthumous interactions with people from the past, tell stories with celebrities from any point in their lifetime and tell pretty much any story we can think of.

The trick is then, how do we create the worlds and characters for storytelling? Our whole current production workflow is very much “building the car as we drive it,” at least in film and television. Storytellers shoot their content, edit and then fix it all in post. However, with worlds and characters becoming more digital, the work to create them now needs to be done up front. Producers and creatives are asked to make decisions at the start and plan ahead before production begins. It’s something many people are very resistant to. We find those prepared to make decisions early on are few and far between. Most creatives are just not used to it, and it is hard to adapt to.

Libraries of photoreal content will eventually be accessible to anyone, and the ability to create all types of digital content will be democratized, in much the same way smartphones now have decent cameras and allow anyone to shoot films and tell stories. We are headed toward a Ready Player One future. It’s not that far off. I can’t even imagine what storytelling will be like when it’s happening in real-time and we’re both creating and interacting with worlds and characters that are photoreal, with the potential of simulating empirical data through neural networking. This is all the stuff you see in Black Mirror or read about in science fiction, but hey, looking around the world today, most of what I see was the stuff of science fiction when I was a kid.

Virtual production, or using digital tools in places we may have traditionally used analog techniques, continues to change the way movies and shows are made. In just a few short years, the LED Volume technique has gone from one stage in Manhattan Beach driven by a visionary filmmaker, Jon Favreau, and his team, to many stages all around the world supporting productions of all scopes and budgets. Ten years from now, we’ll look at an LED stage as one of the many powerful tools in the VFX practitioners’ growing arsenal of techniques, alongside bluescreen, on-set compositing and real-time AI tools that will further transform filmmaking in the future.

Casey Schatz, Head of Virtual Production, The Third Floor

Virtual production is changing the world by opening the door to computer graphics for creative individuals from the arts and sciences who may not have prior experience with CG software. The tech has become more intuitive and easier to engage with in the day-to-day work of many different personnel. The interface to technology has become more accessible. Within moviemaking, we may have an actor in a mocap suit, a cinematographer holding a virtual camera, a production designer scouting with VR and a key grip doing a tech scout with AR. In the sciences, we currently have car engineers using AR to collaborate on design and assembly, medical students using VR to practice heart surgery and motion capture being used in biomechanics. At home, we can use AR within our phones to preview furniture, put a funny face on someone and translate languages.

Virtual production is only just getting started. It’s difficult to imagine any industry that wouldn’t benefit from immediate feedback, infinite iterations and the democratization of these tools beyond the tech savvy.

Michael Ford, CTO, Sony Pictures Imageworks

As seen in VFX and virtual production, visual complexity is increasing, and the speed at which we need to calculate and render pictures is straining to keep pace. We will need a massive amount of compute and network bandwidth to power the next generation of visualization, simulation and spatial computing that will dramatically impact the virtual and augmented worlds that will be created and ultimately explored. Our digital future, whether in virtual reality, augmented reality or whatever is next, is going to be produced for us not by a machine that is close to us, but by a network of machines distributed around us.

Mariana Acuña Acosta, Senior Vice President, Virtual Production, Technicolor Creative Studios

Virtual production unifies how content is made and brings more types of content into mainstream media. Just as we have seen streaming services blur the lines between feature films and episodic, virtual production will blur the lines between film and less linear types of media such as games. It’s likely that in five years, A-list film directors will be directing games. In 10 years, they may be involved with types of content production even further removed from film. This is not because content across different media types is becoming more similar; rather, storytelling has always been the common core of all these media types, and virtual production is unifying how we can create them.

Joe Letteri, VES, Senior VFX Supervisor, Wētā FX

Beyond entertainment and advanced design workflows, we’re likely to see improvement in the layering of digital “realities” on top of our everyday experiences. For Avatar: The Way of Water we created a new on-set depth compositing workflow that allowed James Cameron to see digital characters with accurate positioning and occlusion on a live-action set. These types of mixed-reality techniques are often pioneered in VFX and quickly adopted for consumer applications, aiding everything from autonomous vehicle navigation to advanced medical imaging, surgical capabilities and daily wayfinding.

SPRING 2023 VFXVOICE.COM • 71

LED STAGES: THE BOOM IN EQUIPMENT AND TECH

LED stages have boomed in popularity in a short time and have been used in many high-profile films, series and commercials. While only three LED stages were operating in 2019, that number jumped to 300+ tracked in late 2022 by Epic Games, according to a company spokesperson. The latter number refers only to stages using Unreal Engine, the leading real-time 3D creation tool. The growth of virtual production with LED stages has created a VP ecosystem full of names including Unreal Engine, ROE Visual, disguise, ARRI, Brompton Technology (and its Tessera system), Mo-Sys, OptiTrack and StageCraft, among others. Following is a sampling of activity at companies offering LED stages and technology. It is not intended to be a complete list.

disguise

“We see LED volumes as the future standard in film production setups due to the many efficiencies they bring to filmmaking, not to mention the environmental benefits and cost savings from removing the need to travel on location to shoot a particular scene,” says Addy Ghani, Vice President of Virtual Production at disguise. In the past two years, the disguise xR platform has helped generate over 600 real-time productions in 50 countries, including LED-based virtual productions for Netflix and Amazon, according to the firm.

Ghani explains that disguise “is a technology platform enabling media and entertainment industries to imagine, create and deliver [live] visual experiences.” He adds, “Our solution provides an end-to-end workflow for creatives and technical producers... to design, sequence and control their production from concept through to showtime.” In recent years, disguise developed its software to enable users to deliver such productions in a virtual setting.

TOP: Star Trek: Strange New Worlds has utilized Pixomondo’s LED stage since the series began. (Image courtesy of Pixomondo) OPPOSITE TOP TO BOTTOM: A snowy mountain backdrop on an LED wall in a Pixomondo Toronto facility. (Image courtesy of Pixomondo) Astronaut on wires is suspended above the Earth on an LED wall in the Virtual Production Innovation documentary series, as DNEG teamed with Dimension and Sky Studios. (Image courtesy of DNEG) Cityscape on LED wall with crew and actor in foreground for the Virtual Production Innovation documentary series. (Image courtesy of DNEG)

Ghani notes that for the past two years disguise has been powering an “array of immersive virtual productions across broadcast and film and episodic TV that essentially place actors, presenters and performers into photorealistic virtual worlds, all in real-time.” In 2022, disguise received an Engineering, Science and Technology Emmy Award for its Extended Reality (xR) technology.

Although disguise is often employed along with the leading VP tech names mentioned above, “content creation can come in many shapes,” Ghani says. “Generative engines like Notch can often produce spectacular results with little effort. Having very well-designed driving plates from drivingplates.com can be a huge lift to production quality. Other innovative technologies we work with include: GhostFrame, stYpe [camera tracking] and Ncam [real-time visual effects (RVFX) solutions].”

Ghani points to the development of sub-2mm LED panels as important. “As the LED pitch distance shrinks, this enables cameras to get closer to the LED panels and alleviates a lot of aliasing issues on set,” he says. “With denser pixel pitch comes the added need to compute higher-resolution pixels, but with photoreal content it’s a welcome upgrade in the near future.”

Ghani mentions MOVEAi (move.ai) as an innovator and notes that “having the ability to track performer movements without the need for suits and traditional mocap hardware is huge. This will unlock digital doubles and many more digital elements that [are] dynamic during performance.” And he points to Midjourney, an AI art generator from a research lab based in San Francisco. “In the near future, having AI-driven content generation that can go up on an LED volume will create new creative opportunities,” Ghani says.

According to Ghani, disguise’s Virtual Production Accelerator, in partnership with ROE Visual, “is focused on training up filmmakers on virtual production workflows powered by disguise and equipping them with the knowledge to build their own LED volumes to drive the future of film production.”

GhostFrame

ROE Visual, AGS and Megapixel VR joined forces to deliver the aforementioned GhostFrame, which makes it possible to show multiple full dynamic images on an LED screen simultaneously. GhostFrame has been tested in the field by partners such as Lux Machina and was demo’d on a VP screen at FMX last May, in partnership with disguise, Epic Games, ARRI, TrackMen and a GhostFrame team represented by ROE Visual.

Lux Machina Consulting

Last year, Lux Machina Consulting installed stages in the U.K. (Warner Bros. and Apple), built a temporary one for FOX Upfront and completed its Prysm Stage at Trilith Studios in Atlanta, among other projects, according to President Zach Alexander. The Prysm Stage is owned and operated by NEP Virtual Studios (the division of NEP Group that Lux Machina is part of).

Lux Machina also worked on Amazon’s enormous LED stage in Culver City, which is 80 feet in diameter, 26 feet high and integrated with AWS, Amazon’s cloud computing platform, to optimize the workflow. The volume occupies historic Stage 15 at the MBS Group’s Culver Studios. Epic Games, Fuse Technical Group, 209 group and Grip Nation also contributed their tech know-how to the project. The facility debuted at the end of last year and is located down the street from Sony’s LED stage, which utilizes Sony Crystal LED display panels and was unveiled last fall at Sony Innovation Studios on the Sony Pictures lot in Culver City.

“We saw a bit of a transition in 2022,” Alexander comments. “In 2021, the majority of stages I saw doing larger-scale work were all owned by the studios directly, even if we were the ones operating them. From mid-to-late 2022, we’ve seen that model shift to a renewed interest in owner-operated stages and temporary, production-specific volumes. Permanent LED stages will always have a place for shows of a certain scale that need the pre-existing infrastructure. Still, the ability to erect and strike temporary volumes provides a level of flexibility that will be very attractive to productions.”

Alexander continues, “The overall mission is to advance the art and science of storytelling, and these days the focus is to grow a digital production ecosystem that supports that advancement.” He mentions House of the Dragon (HBO), Masters of the Air (Apple TV+) and Barbie (Warner Bros.) as some of the productions Lux Machina worked on in 2022.

TOP TO BOTTOM: A conifer forest enlivens an LED wall in the background in the Virtual Production Innovation documentary series. (Image courtesy of DNEG) Shutterstock acquired TurboSquid, which offers a vast array of 3D models and environments (images and videos) for virtual productions, including this model of an “Atlas Starship” created by GrafxBox, a TurboSquid 3D model artist. (Image courtesy of TurboSquid/Shutterstock) This TurboSquid model is a Steampunk fantasy airship created by Cordy, a TurboSquid 3D model artist. (Image courtesy of TurboSquid/Shutterstock)

In terms of its VP partners, Alexander notes, “As game engines go, Unreal is our preferred real-time rendering environment. For plate playback, we’ve used disguise for years, but PIXERA is very interesting and something we are using more and more as well. As far as LED processing goes, Brompton is a good choice, but Megapixel’s Helios processor and their GhostFrame solution are things that are making their way into more and more of our systems. As for the LED itself, ROE and Sony are probably the main brands we have been using.”

StageCraft

ILM is a leader in LED stages, and a historic presence because of its work on The Mandalorian. Its StageCraft system has also been used for the movies Rogue One: A Star Wars Story, Solo: A Star Wars Story, The Midnight Sky and Thor: Love and Thunder, and series such as The Book of Boba Fett and Obi-Wan Kenobi. ILM currently has three purpose-built StageCraft volumes located in the greater Los Angeles area and one apiece in Vancouver and London (at Pinewood Studios). Ian Milham, ILM Virtual Production Supervisor, comments, “We’re adding offerings to our existing standing stages including bespoke volumes, each tailored to the needs and requirements of the production it’s being designed for. We will still have our large permanent stages, but we’ve also been doing mobile deployments of various scales for a couple of years now and anticipate the opportunities for those to grow.”

Milham continues, “We’re happy to run an industry-standard Unreal-based pipeline if that’s what works for a show and have done many that way. ILM has developed our own custom StageCraft toolset, including our Helios Cinema Render Engine with specialized tools for filmmakers, which has benefited from the many lessons learned from our hundreds of virtual locations shot so far. Ultimately, we have the flexibility to adapt to a show’s needs. Our hardware setups are mostly industry-standard equipment, with variations by location, all calibrated by our proprietary color science technology.”

Milham explains, “Over the years of doing this, we’ve developed a series of methodologies for color performance and alignment that have given us the ability to set up a stage quickly and build exactly as much as needed to get the desired shots, which has been a big benefit in mobile situations where time is of the essence.”

Magicbox

While doctors may not make house calls, LED stages soon will. Virtual production has gone fully mobile with Magicbox, which transforms a semi-trailer truck into an LED volume and computer control center in minutes. The LED volume inside the “Gen 2.0 Beta model” (due this month) will expand to 23 feet x 28 feet x 10 feet, according to Magicbox Founder and CEO Brian T. Nowac. He explains, “We will provide three trained operational technicians with every Magicbox rental.” The firm, based in San Francisco, has satellite locations in Burbank and Berkeley. It plans to open various regional locations in the future. Nowac comments, “Magicbox is on a mission to democratize technology accessibility within the motion picture industry.”

TOP TO BOTTOM: The xR stage at SCAD, the Savannah College of Art and Design, is powered by disguise. (Image courtesy of disguise and SCAD) A massive Pixomondo LED stage in Toronto. Sony acquired the large VFX firm and its three LED stages. (Image courtesy of Pixomondo) Sequin AR provided virtual production and AR broadcast services for Mariah Carey’s Magical Christmas Special on Apple TV+ in 2020. (Image courtesy of Sequin AR)

He recalls, “When I discovered modern virtual production, I could see the impact it could have in the production of more creative content, faster and cheaper than ever possible before. I was crushed when I realized just how expensive it was to build and operate an LED volume. I needed to solve this problem, not only for me but for at least [the] 75% of the motion picture and video production industry who would love to take advantage of the technology but can’t afford [it].” To pull it all together, Magicbox worked with Unreal Engine, ARRI, Vū, Megapixel Visual Reality, Mo-Sys, Stargate Studios and Craftsmen Industries, among others.

According to Nowac, Magicbox has many advantages over a dedicated LED stage. He explains, “Magicbox can go anywhere, wherever you want or need production to happen. This is tremendously advantageous to the content producer because we create logistical convenience and cost efficiencies previously impossible with a fixed volume.”

DNEG

Looking at the growth of LED stages, “We are coming out of the ‘slope of enlightenment’ stage, so [there will be] some more growth, but it’ll be cautious growth,” comments Steve Griffith, Executive Producer, DNEG Virtual Productions.

DNEG has two of its own stages – one in L.A. and one in London – and is working on other LED stages in multiple locations globally, according to Griffith. He says that one of the tools in DNEG’s shed “is that we provide expertise and services for creatives and producers. We have come a long way in the last few years in terms of what works and what doesn’t. We are becoming more efficient with higher success of ICVFX.”

Unreal Engine, ROE panels, disguise, ARRI, Brompton Technology and Mo-Sys tracking are among the tech used in DNEG’s volumes. Griffith observes, “There is a large list of gear that can be added to that, but generally speaking those are the top list of suppliers and brands used. Other tracking solutions include Vicon, OptiTrack, RACELOGIC and stYpe (RedSpy).” For DNEG, he notes that “Color workflow and VFX post-workflow are key areas of focus of our development.”

Arcturus

Arcturus offers tools for volumetric video editing and streaming. Its HoloSuite is a capture-agnostic post-production platform that makes it easier for creators to edit, compress and stream volumetric video. The suite is composed of two products: HoloEdit and HoloStream. Last year, Arcturus announced an $11 million round of funding, led by CloudTree Ventures, including investments from Autodesk and Epic Games.

Piotr Uzarowicz, Arcturus Head of Partnerships & Marketing, comments, “Virtual production studios are now using live-action performances recorded with volumetric capture to create 3D characters. These characters are ideal for use on LED walls thanks to the human-real nature of the medium, the 360-degree view of each character and lack of uncanny valley. Our software makes it possible to puppeteer these characters in real-time on the set. Volumetric video and HoloSuite benefit entire productions from the virtual art departments [to] pre-viz [and] VFX and in some cases all the way to final pixel.”

TOP TO BOTTOM: TurboSquid, via Shutterstock, has models such as this Manhattan District Times Square created by FraP, a TurboSquid 3D model artist. (Image courtesy of TurboSquid/Shutterstock) A bespoke StageCraft LED installation was built into the set itself for the award-winning The Midnight Sky. (Image courtesy of ILM)

Zero Density

Zero Density’s TRAXIS talentS is an AI-powered markerless stereoscopic talent tracking system that can identify the people inside the 3D virtual environment without any wearables, according to Yavuz Bahadiroglu, Global Channel and Growth Manager at Zero Density, which is based in Izmir, Turkey and Las Vegas. “Zero Density’s hardware and software have been used on various LED and hybrid productions,” he says.

Shutterstock

Shutterstock calls itself a “360-degree content creation solution” and has a vast library of royalty-free images and videos for virtual production. Paul Teall, Vice President of 3D Strategy and Operations at Shutterstock, comments, “Our library of stock 3D models and other content acts as a ‘virtual prop shop.’ If you need something in your scene, we’ve probably already got a large variety of options built. You can drop 3D objects quickly into your virtual scene rather than doing any prop or set build-outs.”

He continues, “We can also generate custom environments that can be fine-tuned ahead of the production date. These environments can be anything from real-world locations to custom interiors or even fantastical worlds.” Shutterstock purchased TurboSquid last year, a deal that, according to the company, made it the world’s largest 3D marketplace.

Teall comments, “In addition to our stock libraries, we also offer full custom services through our Studios division. We can do anything from assisting on a production to handling the entire shoot for you.” Teall is optimistic about virtual production with LED stages in general. He lauds their “speed, control, flexibility.”

Pixomondo

In 2022, Sony purchased Pixomondo, which at year’s end owned and operated three large-scale LED stages (two in Toronto and one in Vancouver) that have seen the filming of the series Star Trek: Discovery Season 4, Star Trek: Strange New Worlds and Avatar: The Last Airbender. Pixomondo’s Josh Kerekes, Head of Virtual Production, thinks LED stages have ironed out many of the platform’s wrinkles. “We believe there are fewer technical challenges now than when this industry was in its infancy, and that’s largely due to the mass adoption in recent years. There’s a saying that ‘the last 10% takes 90% of the time.’ Well, we’ve collectively solved the first 90% and all that remains [are] the little nuances that will make what is virtual and captured in camera indiscernible from reality.”

TOP TO BOTTOM: ILM, which utilized StageCraft, was responsible for the terrestrial visual effects work on The Midnight Sky, plus views from the observatory. (Image courtesy of ILM) A rugged fantasy mountain scene by Color Farm from TurboSquid. (Image courtesy of TurboSquid/Shutterstock) Green grids light up on the LED walls for the Virtual Production Innovation documentary series, which was filmed at the ARRI mixed reality stage in London. (Image courtesy of DNEG)
“In the near future, having AI-driven content generation that can go up on an LED volume will create new creative opportunities.”
—Addy Ghani, Vice President of Virtual Production, disguise

HOW VP RETHINKS THE WAY STORIES ARE BEING TOLD

Today, virtual production encapsulates so many areas – visualization, real-time rendering, LED wall shoots, simulcams, motion capture, volumetric capture and more. With heavy advances made in processing power for real-time workflows, virtual production tools have exploded as filmmaking and storytelling aids.

The idea, of course, is to give filmmakers and storytellers more flexible ways to both imagine and then tell their stories, and in that way virtual production has no doubt improved creative outcomes.

Via on-the-ground stories told by visual effects, virtual production and virtualization supervisors, we look at examples of where virtual production has provided new options and practical solutions to storytellers and how it has made a clear difference on real productions.

VISUALIZATION IS STORYTELLING

Proof


Studios specializing in previs (now more commonly referred to as visualization) have always helped filmmakers shape stories. In the past few years, those same studios have also spearheaded a number of virtual production innovations. Proof Inc., for example, recently added real-time motion capture during its virtual camera (VCAM) sessions. “We can now direct performance and cinematography at the same time,” notes Proof Creative Technology Supervisor Steven Hughes. “The data captured during a VCAM session is quickly passed to the shot creators, who know that the staging and composition are already in line with the director’s vision.”

On Amsterdam, Proof’s ‘viz’ work was directly used by the filmmakers for a period New York sequence. It started with a virtual production set build of 1930s New York and then blocking of animation in Maya before moving into Unreal Engine 4. “Then we were ready for our virtual camera scout with Cinematographer Emmanuel ‘Chivo’ Lubezki and VFX Supervisor Allen Maris,”

TOP: Proof Inc.’s VCAM or virtual camera setup as used on Amsterdam. (Image courtesy of Proof Inc.) OPPOSITE TOP TO BOTTOM: A Proof Inc. VCAM screenshot of a previs frame from the 1930s New York sequence in Amsterdam, which includes details such as lensing. (Image courtesy of Proof Inc.) Proof Inc.’s previs was able to be used to map out the action of the New York sequence to work out staging, camera placement and also greenscreen positions. (Image courtesy of Proof Inc.) During the early stages of planning on Comandante, a virtual production test on an LED wall with real-time rendered ocean water and ‘live’ compositing via Foundry’s CopyCat tool was undertaken. (Image courtesy of Foundry)
The AI tool CopyCat inside of Nuke runs on the Comandante test set. (Image courtesy of Foundry)

details Proof Previs Supervisor George Antzoulides. “The filmmakers were able to use our virtual sandbox to plan out complex shots and camera movements before filming ever began.”

“With a scene that required visual effects work in nearly every shot, with Proof’s help, we were able to create a master scene file and then using the VCAM, go in and block out the camera moves with Chivo,” says Maris. “This allowed us to figure out the bluescreen details on the main unit set and then to plan the specific plate needs in New York.”

“As the only sequence they were going to previs,” adds Proof Head of Production Patrice Avery, “Allen really wanted to use it to help inform the mood of the overall shoot. He wanted to capture the noir, low-light, period New York feel. It also wasn’t an action scene as much as an accident that just happens that we needed to figure out.”

The result, shares former Proof Inc. Virtual Art Department Lead Kate Gotfredson, now with ILM, was a way to use previs and virtual production to give the filmmakers a way to explore all options. “It was a road map that helped drive the rest of the production.”

GOING FOR NEAR REAL-TIME

Getting an early sense of what will be in the final frame has become one aim – and benefit – of virtual production. For Edoardo de Angelis’ World War II submarine film Comandante, Visual Effects Designer Kevin Tod Haug enlisted a group of virtual production professionals to build what he describes as a ‘near real-time’ workflow for the film.

These included Foundry with its Unreal Reader toolset and AI CopyCat tool used for real-time on-set rotoscoping and compositing, Cooke with its lens calibration and metadata solutions, in-camera VFX approaches from 80six, High Res and DP David Stump, previs from Unity, and visual effects R&D and finals from Bottleship and EDI.

Previs first proved very effective in finding storytelling moments. “The previs definitely impacted the way Edoardo shot the movie,” describes Haug. “He had this idea that the camera should feel like a member of the crew on the boat, but mostly that didn’t work so well, especially since a real submarine is quite thin. It became pretty clear that you had to be able to reach out with the Technocrane and shoot scenes ‘off-boat,’ and that’s how we did the shoot.”

Furthermore, the on-set real-time compositing workflow and pre-made virtual environments provided accurate templates of the final shots on the director’s monitor, which, while production was still ongoing in Italy, provided one of the major benefits of adopting a virtual production approach: it allowed for some very advanced rough cuts.

“Everyone’s looking at shots that start to verge on what is usually called post-vis,” Haug says, “and we haven’t even finished production yet, so it’s not final pixels in camera, but by the time Edoardo is done with his director’s cut, he’s going to know what the movie looks like. That’s streets ahead of the old days.”


VIRTUAL PRODUCTION SHOWS ‘WHAT’S GOING TO GO THERE’

Beginning on Robert Zemeckis’ performance capture film A Christmas Carol as a digital effects supervisor, and then working with the director as Production VFX Supervisor on such projects as The Walk, Allied, Welcome to Marwen, The Witches and Pinocchio, Kevin Baillie has continually been exploring the storytelling benefits of virtual production.

“I think Bob Zemeckis has adopted virtual production so enthusiastically over the years because it empowers him to have more control over the filmmaking process and be closer to the filmmaking process while making the film,” says Baillie. “There’s nothing worse for a director than to have to sit there on set and be looking at a big bluescreen and have no idea what’s going to go out there. What virtual production does is it just shows everybody, ‘This is what’s going to go there,’ if you’re using a simul-cam or, say, an LED wall.”

On Pinocchio, Baillie sought to capitalize on numerous virtual production techniques, including virtual real-time environments made in Unreal Engine by Mold3D, previs utilizing those environments by MPC, virtual stages and animation brought together by Halon, simulcam tech from Dimension Studio and set-wide camera tracking by RACELOGIC.

Interestingly, LED walls were not used, although a variation – ‘out-of-view’ LED walls devised by Dimension Studio for interactive lighting – came in handy.

“All of this meant we actually shot and cut together a temp version of almost the entire movie before we hammered a single nail into a real set,” Baillie advises. “And then it meant we had ways to view live-action actors with CG characters during filming. Also, our DP, Don Burgess, could set lighting looks with real-time tools. These real-time and VP tools all meant we could focus on and plan for all the eventualities that were in the movie.”

THE ESSENCE OF A PERFORMANCE

When a life-sized dancing and singing reptile was required for Lyle, Lyle, Crocodile, the filmmakers quickly recognized that telling this story would require some kind of performance capture and virtual production approach for achieving the character in pre-production, production and into post. So, they engaged Actor Capture.

“We presented a solution early on to visualize a scale-accurate 6-foot 1-inch Lyle with an actor of the same height, later portrayed by Ben Palacios,” outlines Motion Capture Supervisor James C. Martin. Palacios wore an inertial suit, head-mounted camera and proxy pieces of tail and jaw during filming.

“We then had to work out how to deliver Lyle in real-time, in camera and with zero percent latency for face, body and hands,” Martin says. “We implemented workflows in both Xsens’ MVN Studio and Unreal Engine 4 to achieve the on-screen feed directors Josh Gordon and Will Speck requested.”

Martin suggests that this approach, from a production and storytelling point of view, had several benefits. Firstly, the live-action actors had something to interact with, and then the filmmakers immediately had a performance to review and carry through to post-production.

TOP TO BOTTOM: An early previs mock-up for the submarine in Comandante. (Image courtesy of Foundry) One of the goals of the virtual production workflow on Robert Zemeckis’ Pinocchio was to help the filmmakers imagine scenes where live-action actors, such as Tom Hanks, here, would be interacting with CG characters. (Image courtesy of Walt Disney Pictures) The bluescreen set for Pleasure Island in Pinocchio. Interactive light was enabled via ‘out-of-view’ LED walls. (Image courtesy of Walt Disney Pictures and MPC) Final Pleasure Island visual effects environment by MPC. (Image courtesy of Walt Disney Pictures and MPC)

Ultimately, the impact was to save time and money while still allowing for intricate scenes that would feature a complex CG character. “Hundreds of cleaned takes were provided with accurate time-code-synced data that the animators could review directly from the VFX witness reference cameras,” Martin adds.

“Actor Capture also provided a way for the directors and VFX Producer, Kerry Joseph, to audition Lyle in the beginning of the film and complete the run of show with visuals to coincide with what was already in editorial. A big cost solution that was fixed in pre-production.”

BLACK ADAM EMBRACES VIRTUAL PRODUCTION

Cinematographer and Virtual Production Supervisor Kathryn Brillhart has spent the past several years immersed in virtual production, managing volumetric capture and VAD teams, working as a cinematographer consulting with studios in the area, directing her own short film that utilizes LED walls and real-time tools, and as Virtual Production Supervisor on Black Adam and the upcoming Rebel Moon and Fallout. That makes her well-placed to engage with the myriad of ways that productions are adopting virtual production, as well as work with a wide range of directors. “The director and content drive what virtual production is on a project,” notes Brillhart, “and the goal is to create workflows that support the story and their creative vision.”

On the production of Black Adam, just some of the VP approaches included: real-time and pre-rendered CG environments for playback on LED walls (produced by UPP, Wētā FX, Scanline and Digital Domain), facial and performance capture, and synchronization between those elements with robotic SPFX rigs and motion bases to emulate flying, helicopters, driving and flybike chases.

“One of the biggest challenges on a VFX-driven project like this is actually capturing as much closeup footage of the actors as possible during production to improve the realism of the shots later in post,” Brillhart says. “It was also important to capture accurate interactive lighting on subjects in-camera and use final-pixel ICVFX to make complex VFX shots, and make less-complex and simple VFX shots possible to capture in-camera.”

“Also,” says Brillhart, “one of the most cutting-edge techniques we used was the combination of real-time environment playback in Unreal Engine displayed on the inside of a small, closed LED volume, with cameras mounted inside the rig at every angle (360 degrees), creating video volumetric capture of ‘The Rock’s’ performance. This process created a super-photorealistic digital puppet of any actor we captured.”

Continues Brillhart, “It was a great collaboration between Eyeline Studios’ technology and Digital Domain’s Charlatan FACS procedure that gave us high-quality digital characters to use. Due to the position of the cameras inside the rig, new camera moves could be created after the capture process as well. Our director, Jaume Collet-Serra, was really open to this process. From prep to production, the use of virtual production techniques ended up benefiting every department on the project.”

An Actor Capture test setup to work out the real-time capture of an actor and playback as a crocodile creature for Lyle, Lyle, Crocodile. (Image courtesy of Actor Capture)

Actor Ben Palacios performed the role of Lyle on set.

(Image courtesy of Actor Capture)

The capture process included an inertial motion capture suit and head-mounted camera. Palacios wore wire pieces and extra body pieces in different scenes to represent the shape of a crocodile. (Image courtesy of Actor Capture)

Dwayne “The Rock” Johnson films a scene for Black Adam. The production utilized several virtual production technologies, including LED walls and 3D volumetric capture. (Image courtesy of Warner Bros. Entertainment)


NEXT-GENERATION VISUAL ARTISTS: WHERE WILL THEY COME FROM?

VFX work has grown tremendously in recent years and, consequently, so too has the search for talented new visual effects artists around the globe. The best places to look include VFX and animation schools, job fairs, conventions and social media. Help also comes from head-hunters and inclusion programs. Once studios have hired a potential visual artist, they can develop that artist through apprenticeships, mentoring and in-house training.

“The studios are spreading the net wider and wider as their demand for talent grows,” says Dr. Ian Palmer, Principal at Escape Studios in London. “We are certainly finding more people than ever are contacting us about our range of courses, so there is an equal growth in the desire to become a VFX artist.”

EXPANDING OUTREACH

Escape Studios conducts outreach with high schools and colleges to make more young people aware of employment opportunities in the studios. Palmer explains, “The VFX industry used to be a well-kept secret, especially in the U.K., but that is less the case now. We also run a ‘Saturday Club’ (saturday-club.org) where young people can come and try out some of the techniques that are used and use resources that they may not have access to in their school or college.”

Shish Aikat, DNEG Global Head of Training, comments, “People who aspire to work in a VFX studio should be aware that there are technical and production roles in a studio that have promising growth potential and high levels of job satisfaction.”

TOP: Solving visual effects problems at Exceptional Minds, which cultivates skills of artists on the autism spectrum. (Image courtesy of Exceptional Minds)


“Future VFX artists are out there; they just don’t know it yet,” says Framestore’s Simon Devereux, Director, Global Talent Development. “Through our work we meet young people who create content, make video games on their phones and teach themselves animation software because it’s fun and makes for good TikTok posts. It’s our responsibility to guide them toward a flourishing VFX and animation industry where they can harness their talents beyond a hobby. There are plenty of entry-level options out there, so we need to get into towns and communities where outreach work doesn’t usually take place and blow their minds.”

Framestore’s Amy Smith, Global Director, Recruitment & Outreach, comments, “There is still a visibility issue for the industry in many of our locations and, more often than not, this is tied to social inequity, but [is] also a result of education systems in many countries around the world still prioritizing academic over creative learning. The VFX industry is in a relatively unique position in the sense that it requires both technical/scientific and artistic skills, a combination that schools often persuade young people away from. Add to this the fact that a lot of adults, who are the key influencers in a young person’s life — parents, teachers, career advisors, etc. – aren’t aware of our industry and don’t understand the skill requirements.” She continues, “Going into schools and running workshops and presentations not only opens the eyes of the students but also helps their teachers and career leaders to understand all of the options that are out there for their young people.”

Devereux adds, “Outreach with schools is vital. There is talent out there, we just need to keep finding ways to create that inspiration that leads to aspiration and ultimately application!”

STUDIO TRAINING AND APPRENTICESHIPS

DNEG has made a concerted effort to connect with local and global talent pools through various means, including apprenticeship programs such as its Greenlight program. Aikat explains, “Our Marketing Communications and Talent Acquisition departments work closely to identify and develop relationships with communities online and academies worldwide. Our Greenlight apprentice programs for recent graduates have facilitated hundreds of participants into successful careers in the creative, technical and production disciplines. Many of our apprentices have gone on to leadership roles on tentpole films.”

Learning stop-motion filming at Escape Studios. (Image courtesy of Escape Studios)

“Future VFX artists are out there; they just don’t know it yet.”
—Simon Devereux, Director, Global Talent Development, Framestore

TOP TO BOTTOM: VFX artists working at Exceptional Minds, which addresses an underrepresentation of neurodiverse artists in the industry. (Image courtesy of Exceptional Minds) Escape Studios Principal Dr. Ian Palmer with senior industry professionals at the VFX Festival, a conference hosted by the school. (Image courtesy of the Rhode Island School of Design)

“The growth in demand is generally outpacing the growth of talent at all levels,” says Ron Edwards, Technicolor Creative Services (TCS) Global Head of Commercial Development for L&D. To prepare more artists, the TCS Academy “develops early career talent at scale with over 1,000 students per year graduating intensive eight-to-10-week courses in most VFX and animation disciplines. Courses are 40 hours per week and include daily interaction with highly experienced industry artists who’ve become trainers and work with the creative production team to provide additional ongoing feedback throughout the courses to help develop the students to become production ready. Courses are run in the cloud by providing studio-grade workstations, software, rendering and training assets to increase accessibility and provide a consistent learning experience.”

Edwards adds, “A key aspect of the recruitment process is making it clear that degrees are not required if the skills are there. The [TCS] Academy is skill-based enrollment versus credential based. We test each candidate, and while we encourage a show reel, we don’t require it for entry.”

VFX SCHOOLS AND READINESS

Aikat comments, “Since there is no standard VFX curriculum worldwide, the graduates of most schools fall across a range of proficiency levels and production readiness. Most schools focus on the creative disciplines [such as modeling, lighting, FX, animation, compositing, etc.] while some VFX schools are starting to place emphasis on additional disciplines such as pipeline development and production management. All in all, there appears to be a gap between the global needs [for] qualified VFX artists and the supply of production ready VFX artists.”

Interchanges between VFX studios and VFX and animation schools can help close that gap. Escape Studios’ Palmer notes, “We work very closely with our studio partners on the curriculum and the way it’s delivered, making sure we’re up to date with the software and the way it’s used in a typical studio pipeline. Besides the core technical and creative skills, we put a lot of emphasis on working in teams, communicating, and giving and receiving feedback. Even if you’re a highly talented artist, VFX is a ‘team sport,’ so being able to work effectively as part of a large project is key to success. Having professional artists come in and review our student projects, helping them become aware of what it takes to succeed and the expectations that will be asked of them from day one is a big part of what we do.”

Palmer thinks more scholarships would expand the labor pool. “It’s about targeting those that would find it difficult to support their studies with just the normal financial support available.” Edwards adds, “Increasing access to the industry is important to us, and targeted scholarships could be part of the mix of activities to generate interest, enrollment and success.”

TOP TO BOTTOM: Student Carter Hiett and alumna and faculty member Judy Kim gather around Carter’s experimental film installation in the Film/Animation/Design department at the Rhode Island School of Design. (Image courtesy of Jo Sittenfeld and RISD) Vancouver Film School’s Animation campus includes a render farm. (Photo: Danny Chan. Image courtesy of Vancouver Film School) Young students try out visual effects with Access VFX, which is powered by over 50 VFX firms and schools and seeks to foster inclusion, diversity, awareness and opportunity within the VFX, animation and games industries. (Image courtesy of Access VFX)

THE TUTORIAL GENERATION

Many next-generation visual artists are training themselves with online tutorials to get ready for VFX and animation schools or to apply for work at the studios. “There’s never been more availability to online resources for helping people get a start with the software,” comments Palmer. States Edwards, “We refer potential [TCS] Academy candidates to external training resources to ensure they have the foundation to succeed in the Academy. We’ve made many of these available on our website and continue to supplement them with our own videos. Many of our artists share external resources as do our trainers to supplement their curriculum. There is a lot of quality content out there, especially from software suppliers.”

Yet, VFX artists cannot absorb everything necessary online. Smith observes, “There are fantastic courses at all sorts of levels for young people wanting to get into the industry. Often the issue is how young people can take what they have learned and translate it into a production environment. That’s where opportunities such as apprenticeships, internships, trainee programs and academies can really help as they not only provide a first step into the industry but also immediately create mentoring relationships in a hybrid world that can be difficult to navigate as a new entrant.”

INCLUSION AND DIVERSITY

There are also places to find new artists where no one bothered to look before. Palmer adds, “We shouldn’t be prescriptive about this; I think we need to look far and wide. As I said, we’re seeing the highest-ever demand for places on our courses, and organizations like Access: VFX are doing a great job at getting the message out to less-represented groups about the opportunities that there are. Raw talent doesn’t know any boundaries, and it’s great that we can point to so many success stories to inspire people from all backgrounds to put themselves forward.”

To expand outreach further, Framestore’s Devereux founded the non-profit Access: VFX in London to foster “inclusion, diversity, awareness and opportunity within the VFX, animation and games industries on a global scale.” About the organization, Devereux notes, “We have always been driven by raising awareness of careers in our industry that continue to remain hidden and rarely get discussed when young people are making decisions about their future. From a diversity perspective, we simply want to level the playing field so that everyone has a line of sight to available opportunities no matter who they are and what their background is. We have certainly made strides to focus on specific areas of under-representation that includes the growth of XVFX, our race equity community that we built in 2020, and QVFX, our LGBTQI+ group that came together during Pride 2019.”

The official debut of Access: VFX took place during National Inclusion Week in 2017 and included 28 separate events, screenings and workshops. “Today, we have chapters in the U.K., across the U.S., Canada, New Zealand and Australia. Most recently, we launched our new European chapter with a mentoring program that supports emerging talent.” Framestore, The Mill, MPC, Epic Games, ILM, Foundry, DNEG, Glassworks, Ghost VFX, The Third Floor and Escape Studios are among the nearly 50 firms and schools that are part of Access: VFX.

Created as part of an industry-first partnership between DNEG, Dimension and Sky Studios, the Virtual Production Innovation documentary series showcases state-of-the-art virtual production technologies, workflows and creative concepts, and is designed to educate and upskill the creative community by demonstrating the advantages and versatility of virtual production. (Image courtesy of DNEG)

The Virtual Production Innovation series, filmed at ARRI’s mixed reality stage in London, is among the many resources that aspiring VFX students can seek out on their own. Here, Planet Earth looms on an LED wall. (Image courtesy of DNEG)

TOP TO BOTTOM: DNEG artists working with a motion capture actor in the London studio. (Image courtesy of DNEG)

Neurodiverse talent is under-represented in the industry, a concern addressed by Exceptional Minds, founded in 2011 and based in Sherman Oaks, California. Head of Studio Scotty Peterson comments, “The mission of E-M is to cultivate the skills of artists on the autism spectrum through customized training and hands-on experience to launch careers in digital arts and animation.” After more than a decade of existence, Peterson says, “We have a really good following and great support from all the major studios and our donors. They see the work of the artists and graduates that come out of the Academy, on major Hollywood projects, and believe what is happening here is a great thing. That spreads quickly and is a huge part of the success.”

Palmer comments, “There’s also Animated Women UK (www.animatedwomenuk.com), which is doing some excellent work promoting more women in both VFX and animation. So, we all have to keep striving to ensure the information is available [with] access to all. One of the most important things is celebrating the successes of a more diverse set of role models; seeing someone in your own image that you can aspire to be is a strong message for people.” TCS’s Edwards says, “We’ve trialed a few initiatives to increase access, including women-only courses in India to help encourage participation and enrollment and recruiting from different education levels and backgrounds for production, and we are launching a new campaign in India to make it easy and exciting for women artists to return to work after taking career breaks.”

GEOGRAPHIC INCLUSION

Speaking of India, VFX studios outside of North America and Europe are growing and are tasked with creating a large share of visual effects work for India, China, Hollywood and elsewhere. BOT VFX, DNEG, TCS (which owns MPC and The Mill), ILM, Digital Domain, The Third Floor, Outpost VFX, Ghost VFX (owned by Streamland Media), Mackevision, Scanline VFX, VHQ Media and Rotomaker Studios are among the multinational VFX firms with facilities in Asia.

“Up-and-coming talent will continue to come from all over the world, and a challenge is encouraging those not in school with talent and self-learning to engage and apply,” says Edwards.

From Aikat’s perspective, remote work and cloud computing have greatly expanded access to international VFX artists, who are in great demand. “Today, VFX is a global enterprise and remote-computing workflows established over more than a decade have enabled VFX studios to tap into talent from any location that suits the technical and artistic needs of a project.”

Concludes Aikat, “The world is our talent pool.”

TOP TWO: Before-and-after shots from a “Robot Garage Exercise” for the TCS Academy. (Image courtesy of Technicolor Creative Studios) BOTTOM: At the TCS Academy, an animation student uses his or her own reference for the lip sync module. (Image courtesy of Technicolor Creative Studios)
“Increasing access to the industry is important to us, and targeted scholarships could be part of the mix of activities to generate interest, enrollment and success.” —Ron Edwards, Global Head of Commercial Development for L&D, Technicolor Creative Studios (TCS)

Talent in Action

Jamie Jasso

Visual Development Director at Tencent Games

VES Member: 7 Years

VES Section: Los Angeles

VES benefit most enjoyed: Networking events

Jamie Jasso, born and raised in Guadalajara, Mexico, was about eight years old when he became fascinated with Star Wars and filmmaking. Alien and Blade Runner also got him hooked. Jasso began a major in Industrial Design but dropped out, teaching himself 3D and early compositing in After Effects.

His career started in Mexico, working at local production companies on TV commercials while building a personal portfolio of his own concepts and ideas that later helped him earn the opportunity to move to Los Angeles. “I had the chance to meet Craig Barron and he kindly invited me over to Matte World Digital back in 2001, where I had my first exposure to a real Hollywood visual effects studio and where my interest in digital matte painting techniques started. From there, and thanks to what I learned, I got my first official job at Blur Studio in 2006. Three years later, I finally got a job at ILM where I had the chance to meet my heroes and work alongside them.”

Jamie has been part of numerous movies over the years, and he is most proud of his work on Avatar (2009) as well as his childhood dream projects, Star Wars: The Force Awakens (2015), Rogue One: A Star Wars Story (2016) and Star Wars: The Last Jedi (2017).

His advice to fellow young aspiring artists: “Create your own opportunities, knock on doors constantly and be willing to help people behind you.” On what distinguishes Jamie and his work in VFX: “My personal style, my own vision, is always working on the minor details that bring complexity and reality. I’d say I have a filmmaker’s eye rather than a shot maker’s. I try to use the least resources to create a complex-looking shot.”

In his free time, Jamie loves writing, producing and directing and is currently working on his personal “indie” horror short film, pursuing more dreams.

Pramita Mukherjee

Senior CFX Development Artist – DreamWorks Animation

VES Member: 6 Months

VES Section: Los Angeles

VES benefit most enjoyed: VES Awards and networking

Pramita was born and raised in Kolkata, India. She started in India’s animation industry at the age of 19. “Back then, our industry awareness on that side of the world was really very low. For any young girl to even step into animation and VFX was just a dream. When I started I never thought someday I would get the opportunity to move to L.A. and be part of making the kind of movies I grew up watching. It still feels surreal.”

The animated Spider-Man series and Good Morning, Mickey! in the early ’90s enticed Pramita to join the animation industry. She started as a character rigger at a studio in Mumbai, India in 2007, and after working there for a decade, she moved to London in 2017 as Creature FX Lead at DNEG. She eventually joined DreamWorks Animation in L.A. to work on The Croods: A New Age (2020).

Her advice to other up-and-comers: “I have worked globally starting from India to London and now in L.A. Honestly, I would say surviving in this industry can be tough at times, but if one is really passionate about moviemaking and open to possibilities, then with time, they will find success.”

When not working, Pramita likes to learn new skills and loves mentoring young aspiring women who are trying to break into the industry. “I have been a WIA (Women in Animation) mentor for the past three years, and so far I have mentored more than 40 young women, many of whom have successfully stepped into the business.”

The films she is most proud of are her latest animated movie, Puss in Boots: The Last Wish (2022), along with Wonder Woman 1984 (2020), The Croods: A New Age (2020) and The Boss Baby: Family Business (2021).


Real-Time Motion Capture

Abstracted from The VES Handbook of Visual Effects – 3rd Edition

Real-time can be a very useful tool on the motion capture stage. Its uses include the following:

Character Identification

It can be invaluable for an actor to see their performance as it will look on the character they are portraying. The actor can work with the director to account for the differences between himself and the character and incorporate those differences into the performance.

Virtual Cinematography

By deploying a virtual camera, the director or DP can virtually fly around in the 3D set and play with camera, composition and lighting. The freedom of the virtual environment means that cameras can be placed and moved in ways not possible on a live-action set. In fact, the set itself can be rearranged near-instantaneously for composition, light, color, time of day and a variety of other factors that are under the control of the director while the performance takes place.

Character Integration

A motion capture stage is often sparse. For this reason, it can be useful for an actor to see himself in the virtual environment. This allows for a sense of space and an awareness of one’s surroundings.

Real-Time Limitations

Real-time motion capture is an emerging field. For now, however, significant compromises must be made to achieve real-time performance.

Line of Sight

Optical motion capture is computer vision-based and therefore the cameras must have an unobstructed view of the markers. In post, it is common for the tracking team to sort out artifacts induced by cameras losing line of sight or markers having been occluded. Because real-time MoCap forgoes post-processing, it is common to reduce the number of actors, limit the props and govern the performance to maximize marker visibility.
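To make the occlusion constraint concrete, here is a minimal sketch (not from the Handbook; the sphere-shaped occluders and specific camera layout are illustrative assumptions) that counts how many cameras keep an unobstructed line of sight to a marker:

```python
import numpy as np

def visible_camera_count(marker, cameras, occluders):
    """Count cameras with an unobstructed straight line to a marker.
    Occluders (bodies, props) are approximated as spheres (center, radius)."""
    count = 0
    for cam in cameras:
        seg = marker - cam
        length = np.linalg.norm(seg)
        direction = seg / length
        blocked = False
        for center, radius in occluders:
            # Closest point on the camera->marker segment to the occluder center.
            t = np.clip(np.dot(center - cam, direction), 0.0, length)
            closest = cam + t * direction
            if np.linalg.norm(center - closest) < radius:
                blocked = True
                break
        if not blocked:
            count += 1
    return count

# A passive optical system needs at least two unobstructed views of a
# marker to triangulate its 3D position at all.
```

When the count drops below two, the marker's position is simply unrecoverable for that frame; offline, the tracking team patches such gaps in post, which is exactly the second pass that real-time forgoes.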

Markers

Trying to make sense of the markers on a stage in real-time is daunting. Beyond simply using fewer markers, introducing asymmetry into the marker placement makes markers easier to identify: because the labeler measures the distances between markers, it is less likely to confuse and swap markers that are placed asymmetrically. An asymmetrical marker placement, however, may be less than ideal for solving.
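The distance-based labeling idea can be sketched in a few lines. This is an illustrative toy, not production tracking code: each marker is fingerprinted by its sorted distances to the other markers, and those fingerprints are unique only when the placement is asymmetric.

```python
import numpy as np

def distance_profile(points):
    """Per-marker sorted distances to every other marker; rigid motion
    leaves these unchanged, so they serve as fingerprints."""
    diff = points[:, None, :] - points[None, :, :]
    return np.sort(np.linalg.norm(diff, axis=-1), axis=1)

def label_markers(observed, template):
    """Assign each observed (unlabeled) marker the template label whose
    distance profile it matches most closely."""
    obs, tmpl = distance_profile(observed), distance_profile(template)
    return [int(np.argmin(np.linalg.norm(tmpl - row, axis=1))) for row in obs]
```

With a symmetric placement, two template rows become identical and the match can flip between them, which is precisely the real-time swap described above.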

Solving

Some methods of solving are very processor intensive. Solving methods that are currently possible in real-time often involve approximations and limitations that will yield a resulting solve that does not look as good as an offline solve. Given this, plus the compromises induced by marker placement modifications and performance alterations, what comes through in the real-time previsualization might not be indicative of the final quality of the animation.
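As one example of the kind of approximation that keeps solving cheap enough for real-time, each rigid segment can be fitted independently with the closed-form Kabsch least-squares rotation rather than optimizing the whole skeleton jointly. The per-segment shortcut here is our illustrative assumption, not a method the Handbook prescribes:

```python
import numpy as np

def rigid_fit(template, observed):
    """Least-squares rotation R and translation t with R @ p + t ~ q
    for paired template points p and observed markers q (Kabsch algorithm)."""
    ct, co = template.mean(axis=0), observed.mean(axis=0)
    H = (template - ct).T @ (observed - co)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ ct
    return R, t
```

Because each segment is solved in isolation, joints can drift slightly apart; an offline solve that enforces skeletal constraints avoids this, which is one source of the quality gap between the real-time preview and the final animation.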

Visualization

Live Performances

Aside from the many benefits real-time affords to motion capture and virtual production, it can also be used as a final product. Real-time puppeteering of characters and stage performances is becoming very common.

Real-time rendering technology is catching up with offline rendering technology at a radical pace. Some of the games on the shelves look nearly as good as the movies in the theaters. Using real-time game rendering technology to previs a film is a common practice but one that requires significant investment. Expect that real-time rendered characters and environments will come with compromises to visual quality, most notably in lighting and lookdev (look development).

Figure 4.16 Real-time Motion Capture at House of Moves. (Image courtesy of House of Moves)

VES New York Revisited: Thriving in the Tri-State VFX Metropolis

Thanks to our regional VFX communities and their work to advance VES and bring people together, the Society’s global presence gets stronger every year. The VES New York Section continues to thrive with 200+ members from New Jersey, Connecticut and all across New York – forming a tightly knit tri-state VFX community of professionals spanning feature film, television, commercials, gaming and art installations/special venues.

“The visual effects landscape here feels like a small town within the fabric of a big industry,” said Jose Marin, VES New York Section Chair. “Our region is home to a lot of small studios and many artists need to do multiple jobs, which creates both a bench of good general artists and a rich collaborative environment. And with our public transit system, it’s easy to take advantage of opportunities across New York and the rest of the region and further build that sense of camaraderie and shared experience.”

“The industry is healthier than ever, and our members and colleagues have never been busier,” said Tara Marie Jacobson, former VES New York Section Co-Chair. “The downside of the shift to more virtual work during the pandemic is that new graduates entering the field don’t get as much time in the studio and miss the important learning that comes from face-to-face relationships. And so we are working to help bridge that gap with on-site mentorship and support the next generation of VFX talent.”

VES New York launched its virtual program, Live At 5, to great success. “Live At 5 started as an online space to hang out with fellow members and friends, where we would talk about everything from VES benefits that offer support, like the Member Assistance Program, to interviews with studio representatives,” said Jacobson. “What started as a substitute for live events [during the pandemic] evolved into a rich ongoing discussion forum where people could gather for talk shop, like R&D on projects, and non-shop talk. Sponsors started joining the mix and getting to know our members in an informal space, which has only strengthened our relationships to benefit the Section and our local industry. So Live at 5 keeps going!”

VES New York hosted virtual events on topics ranging from VR 360 light field captures and LED walls to primers on a number of evolving virtual production technologies, resources and opportunities for hands-on learning. The Section’s roster of programs also included virtual events that reached across the world to forge connections and partnerships with a shared vision of bringing VFX education to new and emerging VFX markets. The Section hosted a forum with Action VFX and GridMarkets that led to networking with communities in South Africa, Taiwan and Azerbaijan, and a burgeoning relationship with Afro FX and CG Africa to share ideas and extend opportunities for outreach and education.

TOP: VES New York members were proud to serve as judges for the 21st Annual VES Awards nominations events held around the world. MIDDLE AND BOTTOM: VES New York members and guests gathered for a festive holiday party and rang in the New Year.

As restrictions eased to allow for in-person events, the Section has hosted a strong line-up of screenings and panel conversations in partnership with the School of Visual Arts (SVA) in New York City, along with festive summer and holiday parties in Brooklyn and Manhattan. The Section is continuing to plan and partner with allied organizations, including the Producers Guild of America, to bring their combined expertise to recruitment forums, reel/portfolio reviews and other ways to guide professionals seeking to enter, transition or advance in the field.

The new Section Board of Managers has hit the ground running with an exciting slate of goals and initiatives. “With a larger number of Board members, who bring a spectrum of expertise and their passion projects, we are well poised to serve our members and the greater visual effects community,” said Marin. “I am grateful to work with such a dedicated group of professionals, who give of their volunteer time and talent. We have a lot of ideas to innovate our approach to programming and member services, so 2023 and beyond will be a great time to be a part of VES New York.”

One focus is to pilot educational programming with the global VES Education Committee to reach students and recent graduates and help build the pipeline of VFX professionals. This includes speaking engagements at both colleges/film schools and K-12 classrooms and building a database of VFX-oriented lesson plans.

The Section aims to recognize and profile its members – to shine a light on the exceptional talent in the Section and showcase them as role models to inspire new artists and innovators. This spotlight initiative will harness the Section’s social media and direct e-blasts to members, as well as videos/webcasts in the spirit of the VES luminary interviews and the Ask Me Anything: VFX Pros Tell All webcast series.

Last year, the Section surveyed members to take its pulse, and saw the highest demand for educational courses and tutorials, panels with industry leaders and content to learn fundamental business skills – from contracting and taxes to tips for freelancers. The Section is building out programs to meet the needs and interests of the members.

The Section is also committed to amplifying the benefits of VES membership, both to existing members and to prospective members it plans to attract. “We are excited to personalize and reinforce the reasons why the VES is such a valuable asset to visual effects professionals, and encourage people to run for positions of leadership and serve on committees to have a hand in shaping the future of the Society as a whole, our Section and our regional community,” said Marin. “VES is a great place to be a working professional and share our unique perspectives with one another. We are the sum of our collective experience and offer something really special throughout the world that we can be immensely proud of.”

TOP TO BOTTOM: VES New York members and guests celebrated summer at the annual Summer party.

VES Announces New Executive Director Nancy Ward

In December 2022, the VES Board of Directors announced the selection of Nancy Ward as the Society’s new Executive Director, at the culmination of a comprehensive search process. Ward, who joined the Society in 2014 as the organization’s Program and Development Director, was appointed Interim Executive Director upon the retirement of Eric Roth in September 2022.

“Nancy has a passion for the VES and a vision to further uplift the Society and bring it to the forefront of the global entertainment community,” said VES Chair Lisa Cooke. “She has earned a tremendous reputation among the Board, staff, Sections, worldwide membership and industry partners, and we are confident that the VES will achieve new heights under her leadership. I am thrilled to have someone of Nancy’s caliber to helm our next chapter.”

“I am honored to have the opportunity to serve as the Society’s Executive Director,” said Nancy Ward. “It’s a privilege to connect, educate, honor and celebrate the hardest working and probably most underappreciated entertainment professionals. The VES is a beacon of creative and technological innovation and excellence, and it’s my intention to build on the strong foundation created by Eric Roth and help the Society cement its position as a leading voice at the epicenter of the entertainment industry.”

As VES Program and Development Director, Ward oversaw fundraising, partnerships, alliances and new programs. She drove annual sponsorship revenue; directed the publishing team for VFX Voice; spearheaded initiatives around diversity, equity and inclusion and virtual production; oversaw the annual VES Honors Celebration, VES New York Awards Celebration and other VES and Section events; and led the VES Archives initiative and development of the Society’s forthcoming VES digital museum.

VES Elects 2023 Board of Directors Officers; Lisa Cooke Re-Elected Board Chair

The 2023 VES Board of Directors Officers, who comprise the VES Board Executive Committee, were elected at the January 2023 Board meeting. The Officers include Lisa Cooke, who was re-elected Board Chair, and is the first woman to hold this role since the Society’s inception.

“It is my privilege to continue to Chair this global society of outstanding artists and innovators,” said VES Chair Lisa Cooke. “The esteemed leadership on our Board bring passion, diverse experience and enormous commitment to volunteer service to our organization. I’m excited about what we can achieve, working together, to grow the Society and take our worldwide membership to new heights.”

“The Society is so fortunate to have this year’s Executive Committee members guiding our mission-driven work,” said VES Executive Director Nancy Ward. “Individually, they’ve each devoted countless hours in service to VES, and together, they have a deep understanding of the organization and an unwavering commitment to serve the membership. I look forward to working with this year’s Executive Committee to evolve the Society.”


The 2023 Officers of the VES Board of Directors are:

• Chair: Lisa Cooke

• 1st Vice Chair: Susan O’Neal

• 2nd Vice Chair: David H. Tanaka, VES

• Secretary: Rita Cahill

• Treasurer: Jeffrey A. Okun, VES

A VFX Producer at Tippett Studio and an Executive Producer at Tippett Productions, Lisa Cooke brings her experience as an animation/VFX producer, story consultant, screenwriter and actor. She is the founder of Green Ray Media, where she produced animation and VFX to create scientific, medical and environmental media for a broad audience. Cooke served six years on the VES Bay Area Board of Managers before joining the global Board of Directors, where she has served as 2nd Vice Chair, 1st Vice Chair, Co-Chair of the Archives Initiative and is honored to serve her third term as Chair.

Susan O’Neal, a VES Founders Award recipient, has served on the global Board of Directors as Treasurer and 2nd Vice Chair. For many years, she served as the Chair for the legacy global Education Committee and currently co-chairs the Membership Committee, playing an instrumental role in the Society’s growth. She is currently a recruiter for BLT Recruiting, Inc. and has worked as an Operations Manager at The Mill, Operations Director at Escape Studios in L.A., and as an Account Manager at Side Effects Software, Inc.

David H. Tanaka, VES, is an Editor, Producer and Creative Director with experience in visual effects, animation, documentary and live-action feature filmmaking. He is a staff VFX Editor at Tippett Studio and Adjunct Professor at the Academy of Art University, San Francisco, and previously served at Industrial Light & Magic and Pixar Animation Studios. A VES Fellow and Lifetime Member, Tanaka served three terms as Chair of the VES Bay Area Section, 2nd Vice Chair on the global Board of Directors, and on the Archives, Awards, Outreach and Work From Home Committees.

Rita Cahill is an international business and marketing/PR consultant and has worked with a number of U.S., Canadian, UK, EU and Chinese companies on visual effects and animation projects. Previously, Cahill was VP Marketing for Cinesite and a founding Board member of the Mill Valley Film Festival/California Film Institute. A VES Lifetime Member and Founders Award recipient, Cahill served as Chair or Co-Chair of the VES Summit for eight years, and this is her eighth term as Secretary.

Jeffrey A. Okun, VES, is known for creating ‘organic’ and invisible effects as well as spectacular ‘tent-pole’ visual effects that blend seamlessly into the storytelling aspect of the project. Okun is a VES Fellow and recipient of the VES Founders Award. He created and co-edited the award-winning VES Handbook of Visual Effects, and is currently co-editing the VES Handbook of Virtual Production, due out in Summer 2023. Okun served as VES Chair for seven years and as Chair of the Los Angeles Section of the VES for two years.

Opposite top to bottom:

Nancy Ward

Lisa Cooke

Top to bottom:

Susan O’Neal

David H. Tanaka, VES

Rita Cahill

Jeffrey A. Okun, VES


Yesterday and Today – Reimagining How It’s Done

In 1958, the grueling chariot race in Ben-Hur, winner of 11 Oscars, required 15,000 extras on an 18-acre set built at Cinecittà Studios near Rome. A total of 18 chariots were built, and the race took approximately 10 weeks to capture on camera. The sequence cost the producers roughly $4 million, about a quarter of the film’s entire budget.

Ben-Hur and many films before and after it reflected the long tradition of classical moviemaking – using physical cameras, sets, locations and actors – and extensive post-production. Today, that movie would almost certainly employ the newer approach of virtual production (see special section, page 66).

Virtual production, as the term is used today, is a newer filmmaking and TV production technique that combines live-action elements with computer-generated imagery (CGI) to create film and TV content. This method uses virtual reality technology, motion capture and computer graphics to create and visualize scenes in a virtual environment, reducing the need for physical sets and locations. This technique allows filmmakers to pre-visualize and design the look and feel of a scene before it’s actually shot and make changes on the fly during filming. Virtual production also enables filmmakers to create scenes that would be impossible to achieve in the physical world, such as fantastical landscapes, futuristic cities or large-scale action sequences. The result is a hybrid of live-action and CGI elements that are seamlessly integrated to create a more immersive and visually stunning viewing experience.

Notably, some recent films and streaming shows using the VP model include The Mandalorian, The Book of Boba Fett, The Suicide Squad, Dune, The Batman, Loki, House of the Dragon, Black Adam and Shazam! Fury of the Gods.

Ben-Hur image courtesy of MGM. The Mandalorian image courtesy of Lucasfilm Ltd. and Disney+.