VFX Voice Fall 2023


VFXVOICE.COM FALL 2023

THE POWER OF OPPENHEIMER

AI AND VFX • KILLERS OF THE FLOWER MOON • THE EQUALIZER 3 • DIVERSITY ROUNDTABLE • THE CREATOR • ANIMATION AROUND THE WORLD • PROFILES: LIZ BERNARD & MIKE CHAMBERS, VES







[ EXECUTIVE NOTE ]

Welcome to the Fall 2023 issue of VFX Voice! Thank you for your continued and enthusiastic support of VFX Voice as a part of our global community. We’re proud to keep shining a light on outstanding visual effects artistry and innovation worldwide and lift up the creative talent who never cease to inspire us all. In this issue, our cover story delves into the blockbuster biographical thriller Oppenheimer. Get the inside story of cinematic feature films Killers of the Flower Moon, The Equalizer 3 and The Creator. Check out the rising animation scene around the world, writing for virtual production technology, the latest in studio marketing, and trends in AI and machine learning. Enjoy our up close and personal profile stories on VFX Supervisor Liz Bernard and VFX Producer and former VES Chair Mike Chambers, VES. Take a seat at our insightful Industry Roundtable on Diversity, Equity and Inclusion with five leading women executives. Catch our tech & tools feature on cloud computing, the latest in AI and video games, get to know our VES India Section in the spotlight, and more. Dive in and meet the visionaries and risk-takers who push the boundaries of what’s possible and advance the field of visual effects. Cheers!

Lisa Cooke, Chair, VES Board of Directors

Nancy Ward, VES Executive Director

P.S. You can continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

2 • VFXVOICE.COM FALL 2023





[ CONTENTS ]

FEATURES
8 VFX TRENDS: AI AND VFX – Exploring the growing impact of AI/ML on the industry.
16 DIVERSITY: INDUSTRY ROUNDTABLE – Five leading women executives share views on achieving DEI.
22 COVER: OPPENHEIMER – Christopher Nolan captures as much practically as possible.
28 PROFILE: MIKE CHAMBERS, VES – Up close with the VFX Producer and VES Chair Emeritus.
34 FILM: KILLERS OF THE FLOWER MOON – Scorsese uses invisible VFX to tell a dark tale set in history.
40 ANIMATION: AROUND THE WORLD – Quality animation is rising up in hot spots outside the U.S.
50 PROFILE: LIZ BERNARD – Animation is a multi-disciplinary puzzle for the VFX Supervisor.
56 FILM: THE EQUALIZER 3 – Postvis artists play a key role creating near final-quality effects.
62 TECH & TOOLS: CLOUD COMPUTING – Paving the way for the virtual highway of the all-digital future.
68 FILM: THE CREATOR – Building an original sci-fi world based on real-world locations.
74 VIDEO GAMES: AI AND GAMES – New AI methods are expanding games in different ways.
80 VIRTUAL PRODUCTION: WRITING FOR VP – VP technology has altered the writing process for users.
84 INDUSTRY: MARKETING & PRODUCTION – VFX studio marketers’ current strategies to boost awareness.

DEPARTMENTS
2 EXECUTIVE NOTE
90 THE VES HANDBOOK
92 VES SECTION SPOTLIGHT – INDIA
94 VES NEWS
96 FINAL FRAME – MUSHROOM CLOUDS

ON THE COVER: The creative explosion of Oppenheimer. (Image courtesy of Universal Pictures)





FALL 2023 • VOL. 7, NO. 4

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com
PUBLISHER Jim McCullaugh publisher@vfxvoice.com

VISUAL EFFECTS SOCIETY Nancy Ward, Executive Director
VES BOARD OF DIRECTORS

EDITOR Ed Ochs editor@vfxvoice.com
CREATIVE Alpanian Design Group alan@alpanian.com
ADVERTISING Arlene Hansen Arlene-VFX@outlook.com
SUPERVISOR Nancy Ward
CONTRIBUTING WRITERS Naomi Goldman, Trevor Hogg, Chris McGowan, Oliver Webb
ADVISORY COMMITTEE David Bloom, Andrew Bly, Rob Bredow, Mike Chambers, VES, Lisa Cooke, Neil Corbould, VES, Irena Cronin, Paul Debevec, VES, Debbie Denise, Karen Dufilho, Paul Franklin, David Johnson, VES, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz, Eric Roth

OFFICERS Lisa Cooke, Chair; Susan O’Neal, 1st Vice Chair; David Tanaka, VES, 2nd Vice Chair; Rita Cahill, Secretary; Jeffrey A. Okun, VES, Treasurer
DIRECTORS Neishaw Ali, Laurie Blavin, Kathryn Brillhart, Colin Campbell, Nicolas Casanova, Mike Chambers, VES, Kim Davidson, Michael Fink, VES, Gavin Graham, Dennis Hoffman, Brooke Lyndon-Stanford, Arnon Manor, Andres Martinez, Karen Murphy, Maggie Oh, Jim Rygiel, Suhit Saha, Lisa Sepp-Wilson, Richard Winn Taylor II, VES, David Valentin, Bill Villarreal, Joe Weidenbach, Rebecca West, Philipp Wolf, Susan Zwerman, VES
ALTERNATES Andrew Bly, Johnny Han, Adam Howard, Tim McLaughlin, Robin Prybil, Daniel Rosen, Dane Smith
Visual Effects Society, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411
Phone: (818) 981-7861
vesglobal.org
VES STAFF Jim Sullivan, Director of Operations; Ben Schneider, Director of Membership Services; Charles Mesa, Media & Content Manager; Ross Auerbach, Program Manager; Colleen Kelly, Office Manager; Brynn Hinnant, Administrative Assistant; Shannon Cassidy, Global Coordinator; P.J. Schumacher, Controller; Naomi Goldman, Public Relations

Tom Atkin, Founder
Allen Battino, VES Logo Design
Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2023 The Visual Effects Society. Printed in the U.S.A.





VFX TRENDS

RAPID EVOLUTION AT THE INTERSECTION OF AI AND VFX By CHRIS McGOWAN

TOP AND BOTTOM: Wētā adapted new deep learning methodologies and utilized neural networks for Avatar: The Way of Water. (Images courtesy of 20th Century Studios) OPPOSITE TOP TO BOTTOM: Machine learning helped Digital Domain meet its deadlines for She-Hulk: Attorney at Law. (Images courtesy of Marvel Studios)

Crafty Apes’s AI division came into existence during the pandemic. Company Co-Founder Chris LeDoux recalls, “It all started during COVID with me watching YouTube videos from authors like Bycloud and Two Minute Papers. Then our VFX Supervisor and resident mad scientist Aldo Ruggiero began to show me a number of incredible things he was using AI for on the film he was supervising.” It became clear to LeDoux “that AI was going to shake up our industry in a massive way.” He explains, “Developments in AI/ML seemed like they would create a fundamental shift in how we approached and solved problems as it relates to shot creation and augmentation. I knew we had to make it a top priority.” Since then, Crafty Apes has applied AI to a wide range of VFX projects, reflecting an accelerating implementation of AI technology by the visual effects industry. LeDoux comments, “I can tell you that we have leveraged machine learning [ML] for tasks like deepfake creations, de-aging effects, facial manipulation, rotoscoping, image and video processing and style transfer, and the list continues to grow.” He notes that once AI tools are integrated into the pipeline, they “speed up the workflow drastically, lower the costs of VFX significantly, and allow the artists to put more time into creativity.” Regarding the teaming up of AI with VFX, “the first challenge is really managing expectations, in both directions,” says Hanno Basse, Chief Technology Officer of Digital Domain. He adds, “We shouldn’t overestimate what AI will be able to do, and there is a lot of hype out there now. At the same time, it will have a significant and immediate impact on all aspects of content creation, and we need to recognize the consequences of that.”

Digital Domain

The industry is looking at “many concepts and implementations for AI and ML that are very promising, and [is] using some of them already today,” according to Basse. Digital Domain has utilized
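The rotoscoping case LeDoux mentions is the easiest to picture in code: a segmentation network emits a soft alpha matte per frame, and that matte then behaves exactly like a hand-drawn roto shape in a standard “over” composite. A minimal NumPy sketch of that downstream step (the function names and the box-blur edge softening are illustrative, not any studio’s actual pipeline; the ML model that produces the matte is assumed given):

```python
import numpy as np

def soften(matte: np.ndarray, r: int = 1) -> np.ndarray:
    """Cheap edge softening: mean over a (2r+1)^2 window, edge-padded.
    Stands in for the matte cleanup a compositor would otherwise do by hand."""
    pad = np.pad(matte, r, mode="edge")
    h, w = matte.shape
    out = np.zeros_like(matte)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += pad[dy : dy + h, dx : dx + w]
    return out / (2 * r + 1) ** 2

def over(fg: np.ndarray, matte: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Standard 'over' composite driven by an ML-generated soft matte.
    fg, bg: HxWx3 floats in [0, 1]; matte: HxW alpha in [0, 1]."""
    a = np.clip(soften(matte), 0.0, 1.0)[..., None]  # broadcast alpha to RGB
    return a * fg + (1.0 - a) * bg
```

Swapping in a better matte source, say a person-segmentation network, changes nothing downstream, which is part of why ML roto slots into existing pipelines so cleanly.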

machine learning on high-profile movies such as Avengers: Infinity War and Avengers: Endgame, and the She-Hulk: Attorney at Law limited series for Disney+. It also created a 3D “visual simulation” of famed NFL coach Vince Lombardi – with the help of Charlatan, Digital Domain’s machine learning neural rendering software – for the February 2021 Super Bowl. “AI, and especially its close cousin, machine learning, have been in our toolbox for five years or so. We use it on things like facial animation, face-swapping, cloth simulation and other applications,” Basse says. “Our work on She-Hulk last year made extensive use of this technology. In fact, we don’t believe we could have delivered that many shots without it given the time and resources we had to work on this project. [We also did] some fantastic work with cloth simulation on Blue Beetle. We’re basically using this technology now on virtually any show we get to work on.” Prior to that, the digital creation of the character Thanos’ face in Avengers: Infinity War was Digital Domain’s first major application of machine learning and utilized the Masquerade facial-capture system. Avengers: Endgame followed close on its heels. “Since then, DD has done a lot more work with this technology,” Basse remarks. “For example, we created an older version of David Beckham for his ‘Malaria Must Die – So Millions Can Live’ campaign and used our ML-based face-swapping technology Charlatan to bring deceased Taiwanese singer Teresa Teng back to life, virtually.” Basse adds, “In general, machine learning has proven very useful to help create more photorealistic and accurate results. But it’s really the interplay of AI and the craft of our artists – which they acquired over decades, in many cases – that enables us to create believable results.”

Wētā FX

“We have been working with various ML tools and basic AI models for a long time,” says Wētā FX Senior Visual Effects Supervisor

Joe Letteri. In fact, Massive software, employed all the way back on The Lord of the Rings, uses primitive fuzzy logic AI to drive its agents. Letteri notes, “Machine learning has also been prevalent in rendering for de-noising for years across the industry. For Gemini Man we used a deep learning solver to help us achieve greater consistency with the muscle activations in our facial system. It helped us streamline the combinations that were involved in complex movements across the face to build a more predictable result.” Wētā changed its facial animation system for Avatar 2 and adapted new deep learning methodologies. Letteri says, “Our FACS-based facial animation system yielded great results, but we felt we could do better. As our animators and facial modelers got better, we needed increasingly more flexible and complex

systems to accommodate their work. So, we took a neural network approach that allowed us to leverage more of what the actor was doing and hide away some of the complexity from the artists while giving them more control. We were also able to get more complex secondary muscle activations right from the start, so the face was working as a complete system, within a given manifold space, much like the human face.” Letteri and his crew created another neural network to do real-time depth compositing during live-action filming. He explains, “During that setup process, we utilized rendered images to train the deep learning model in addition to photographed elements. This allowed us to gather more reference of different variations and positions than we could feasibly get on set. We could train the system to understand a given set environment and the placement of characters in nearly every position on the set in a wide range of poses – something that would be impractical to do with actors on a working film set.” Comments Letteri, “VFX pipelines are always evolving, sometimes driven by hardware or software advancements, sometimes through new and innovative techniques. There is no reason to think that we won’t find new ways to deploy AI-enhanced workflows within VFX. Giving artists ways to rapidly iterate and explore many simultaneous outcomes at the same time can be enormously powerful. It also has great potential as a QC or consistency tool, the way many artists are using it now.”
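The depth-compositing idea Letteri describes reduces to a simple per-pixel test once a depth map exists for each element: whichever of the CG render or the live-action plate is nearer the camera wins that pixel. A toy NumPy sketch of the final merge step (the plate’s depth prediction, which is where the trained network does the real work, is assumed given here):

```python
import numpy as np

def depth_composite(cg_rgb, cg_depth, plate_rgb, plate_depth):
    """Per-pixel occlusion test: the nearer element wins each pixel.

    cg_rgb, plate_rgb: HxWx3 color images.
    cg_depth, plate_depth: HxW camera-space distances (smaller = nearer).
    In production, plate_depth would come from a network trained on rendered
    and photographed reference; in this sketch it is simply an input.
    """
    cg_wins = (cg_depth < plate_depth)[..., None]  # HxWx1 boolean mask
    return np.where(cg_wins, cg_rgb, plate_rgb)
```

An actor stepping in front of a CG character just produces smaller plate_depth values over those pixels, so the plate wins there frame by frame with no hand roto.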

TOP TO BOTTOM: Rising Sun Pictures’ machine learning incorporated data from a “learned library of reference material” to help create Baby Thor for Thor: Love and Thunder. An early adopter of AI, RSP used machine learning to give the baby an uncannily lifelike quality while exhibiting behaviors required by the script. (Images courtesy of Marvel Studios)

Autodesk

“AI has the potential to be revolutionary for VFX as artists design and make the future,” says Ben Fischler, Director of Product Management, Content Creation at Autodesk. “The Internet took shape over many years, and it took time for it to become a part of our daily lives, and it will be similar with AI. For the visual effects industry, it’s all about integrating it into workflows to make them better. It won’t be an immediate flip of the switch, and while certain areas will be rapid, others will take longer.” It has been more than two years since Autodesk embraced AI tools in Flame. “Flame puts a sprinkle of AI into an artist’s workflow and supercharges it dramatically. Things like rotoscoping, wire removal and complex face matte creation are processes that go back to the origins of visual effects when we did things optically, not digitally, and they’re still labor intensive. These are the processes where a little AI in the right places goes a long way,” Fischler explains. “In the case of Flame, we can take a process that had an artist grinding away for hours and turn it into a 20-minute process.” Autodesk recently launched a private beta of Maya Assist in collaboration with Microsoft. “It was developed for new users to Maya and 3D animation and uses voice prompts via ChatGPT to interface with Maya,” Fischler says.

Rising Sun Pictures

Some five years ago, RSP began collaborating with the Australian Institute for Machine Learning (AIML), which is associated with the University of Adelaide, on ways to incorporate emerging

technologies into its visual effects pipeline. AIML post-doctoral researchers John Bastian and Ben Ward saw the potential for AI in filmmaking and joined RSP; they now lead its AI development team with Troy Tobin. One of the multiple projects that benefited from their work was Marvel’s Thor: Love and Thunder, in which RSP applied data collected from a human baby (the grandson of former Disney CEO Bob Chapek) to a CG infant. Working in tandem with the film’s production team, they were able to “direct” their digital baby to perform specific gestures and exhibit emotions required by the script. According to the Senior VFX Producer on the film, Ian Cope, quoted on the RSP website, “The advantage of this technique over standard ‘deep fake’ methods is that the performance derives from animation enhanced by a learned library of reference material.” The look was honed over many iterations to achieve a digital baby that would seem real to audiences. “The work we’re doing is not just machine learning,” adds Ward, now Senior Machine Learning Developer at RSP. “Our developers are also responsible for integrating our tools into the pipeline used by the artists. That means production tracking and asset management and providing artists with the control they need from a creative point of view.” Working together across several projects, the AI and compositing teams have grown in their mutual understanding. Having explored this space early, “we’ve learned a lot about how the two worlds collide and how we can utilize their [AI] tools in our production environment,” observes RSP Lead Compositor Robert Beveridge. “The collaboration has improved with each project, and that’s helped us to one-up what we’ve done before. The quality of the work keeps getting better.”

Jellyfish Pictures

“AI and ML offer exciting opportunities for our workflows, and we are exploring how to best implement them,” says Paul J. Baaske, Jellyfish Pictures Head of Technical Direction.
“For example, how we can leverage AI to create cloth and muscle simulation with higher fidelity. This is a really intriguing avenue for us. Other areas are in imaging – from better denoising, roto-masks, to creating textures quicker. But some of the greatest gains we see [are] in areas like data or library management.” Baaske adds, “The key moving forward will be for studios to look at their output and data through the lens of ‘how can we learn and develop our internal models further?’ Having historic data available for training and cleverly deploying it to gain competitive advantage can help make a difference and empower artists to focus more on the creative than waiting for long calculations.”

Vicon

“One of the most significant ways I think AI is going to impact motion capture is the extent to which it’s going to broaden both its application and the core user base,” comments David “Ed” Edwards, VFX Product Manager for Vicon. “Every approach to motion capture has its respective strengths and shortcomings. What we’ve seen from VFX to date – and certainly during the

TOP TWO: Data on a real baby was collected from the grandson of a former Disney executive in order to create the digital Baby Thor in Thor: Love and Thunder. (Images courtesy of Marvel Studios) BOTTOM TWO: Soccer icon David Beckham participated in the ‘Malaria Must Die – So Millions Can Live’ campaign. His face was aged into his 70s by Digital Domain’s Charlatan technology. The video was produced by the Ridley Scott Creative Group Amsterdam. (Images courtesy of Digital Domain)

proliferation of virtual production – is that technical accessibility and suitability to collaboration are driving forces in adoption. AI solutions are showing a great deal of promise in this respect.” Edwards adds, “The demands and expectations of modern audiences means content needs to be produced faster than ever and to a consistently high standard. As AI is fast becoming ubiquitous across numerous applications, workflows and pipelines, it’s already making a strong case for itself as a unifier, as much as an effective tool in its own right.”

Studio Lab/Dimension 5

“I think with the use of AI we will see many processes be streamlined, which will allow us to see multiple variations of a unique look,” says Ian Messina, Director of Virtual Production at Studio Lab and owner of real-time production company Dimension 5. Wesley Messina, Dimension 5 Director of Generative AI, says, “Some trailblazers, like Wonder.ai, are pushing the boundaries of technology by developing tools that can turn any actor into a digital character using just video footage. This gets rid of the need for heavy motion-tracking suits and paints a promising picture of what’s to come in animation.” Wesley Messina adds, “As the technology becomes more widely available, we can expect to see AI tools being used by more and more creators. This will change the way we make movies and other visual content, bringing stories to life in ways we’ve never seen before.”

Perforce and VP

Rod Cope, CTO of Perforce Software, sees AI as having a big impact on virtual production. He explains, “For one, AI is going to let creative teams generate a lot more art assets, especially as text-to-3D AI tools become more sophisticated. That will be key for virtual production and previs. Producers and art directors are going to be able to experiment with a wider array of options, and I think this will spur their creativity in a lot of new ways.”

TOP TO BOTTOM: Wētā utilized a “deep learning solver” to achieve greater consistency with the muscle activations in the facial system for Gemini Man. (Images courtesy of Wētā FX and Paramount Pictures)

The Synthetic World

Synthesis AI founder and CEO Yashar Behzadi opines that synthetic data will have a transformative impact on TV and film production in a number of areas, such as virtual sets and environments, pre-visualization and storyboards, virtual characters and creatures, and VFX and post-production. Behzadi continues, “The vision for Synthesis AI has always been to synthesize the world. Our team consists of people with experience in animation, game design and VFX. Their expertise in this field has enabled Synthesis AI to create and release a library of over 100,000 digital humans, which serves as the training data for our text-to-3D project, Synthesis Labs.”

More on GenAI

“Now, with the emergence of more sophisticated generative AI models and solutions, we’re starting to look at many more ways to use it,” explains Digital Domain’s Basse. “Emerging tools in generative AI, such as ChatGPT, MidJourney, Stable Diffusion and

RunwayML, show a lot of promise.” Basse continues, “GenAI is really good to start the creative process, generating ideas and choices. GenAI does not actually generate art, it creates variants and choices which are based on prior art. But this process can provide great starting points for concept art. But the ultimate product will still come from human artists, as only they really know what they want. Having said that, I have high expectations for the use of GenAI technology in storyboarding and previsualization. I believe we will see a lot of traction with GenAI in those areas very soon.” Autodesk’s Fischler notes, “Having the ability to generate high-quality assets would be very impactful to content creators in production, but the challenge is making these assets production-ready for film, television or triple-A games. We are seeing potentially useful tools on the lower end, but it’s much harder to have AI generate useful assets when you have a director, creative director and animation supervisor with a creative vision and complex shot sequence to build.” Wes Messina adds that text-to-3D-model technology “could be a game-changer, moving us away from the hard work of starting from scratch in developing 3D assets.” LeDoux argues, “However, it’s important to remember that AI-generated concept art isn’t here to replace human creativity. Instead, it’s a rad tool that can add to and improve the artistic process. By using these AI technologies, artists can focus on the creative side of their work and bring the director’s vision to life more effectively, leading to super engaging and visually stunning productions.”

VFX Taskmasters

Overall, AI will help with many tasks. LeDoux comments, “If we divide it up into prep, production, and post-production, and then think of all of the aspects of VFX for each one, you can help wrap your mind around all of the applications.
In prep, having generative tools such as Stable Diffusion to help create concept art is obvious, but other tools to help plan, such as language models to help parse the script for VFX-based bidding purposes, as well as planning via storyboarding and previz is massive. In production, having tools to help with digital asset management, stitching and asset building for virtual production is a massive time saver. In post-production, the list is endless from rotoscope assistance to color matching to animation assistance.” “We think that AI will impact our entire workflow,” Basse says. “There are so many scenarios we can think about: creating 3D models with text prompts, creating complex rigs and animation cycles, but we also see potential applications in layout, lighting, texture and lookdev. There is also an expectation that machine learning will revolutionize rotoscoping, which is a very labor-intensive and tedious part of our workflow today.” Perforce’s Cope adds, “AI is going to have an impact on quality assurance and workflow as well. I think we will see AI automate some of the more rote tasks in 3D animation, like stitching and UV mapping, and identifying rendering defects – things that take time but don’t require as much creativity. AI is going to accelerate those

TOP TWO: Digital Domain created the digital face of Thanos in Avengers: Infinity War with the Masquerade system and machine learning, and then worked their magic again in Avengers: Endgame. (Images courtesy of Marvel Studios) BOTTOM TWO: Digital Domain used their Charlatan technology and machine learning to create a CGI likeness of the late Taiwanese singer Teresa Teng for a virtual concert that mesmerized fans. (Images courtesy of Digital Domain, Prism Entertainment and the Teresa Teng Foundation)

TOP: Autodesk’s Maya Assist has a ChatGPT assistant. (Image courtesy of Autodesk) BOTTOM TWO: Autodesk Flame software offers the ability to extract mattes of the human body, head and face with AI-powered tools for color adjustment, relighting and beauty work, as well as to quickly isolate skies and salient objects for grading and VFX work. (Images courtesy of Autodesk)

tasks. And, since AI allows teams to go faster, directors will demand even more with quicker turnarounds. Teams that don’t adopt AI in their workflows will be left behind sooner than later.” VFX tasks that will benefit from AI also include object removal, matchmoving, color grading, and image upscaling and restoration, according to Synthesis AI’s Behzadi.

AI, VR and Video Games

It’s easy to imagine that AI could give a big boost to video games and VR by vastly increasing interactivity and realism. “Thinking on another level, I think that games as we know them will change,” Cope says. For example, “AI is going to open the doors for more natural and unique interactions with characters in an RPG. And could even lead to completely unique in-game experiences for each player and journey.” Synthesis AI’s Behzadi comments, “Virtual reality experiences can be greatly enhanced by AI in several ways, including digital human development, enhanced simulations and training, as well as computer vision applications, to name a few.” Behzadi continues, “AI can generate realistic digital humans or avatars that can interact with users in real-time. These avatars can understand and respond to users’ gestures, facial expressions and voice commands, creating more natural and engaging interactions within virtual environments. When coupled with computer vision techniques, AI has a powerful impact on enhancing the visual quality of VR experiences, including improved graphics rendering, realistic physics simulations, object recognition, and tracking users’ movements within the virtual environment.
These advancements ultimately lead to more visually stunning and immersive VR worlds.”

The Road Ahead

Looking ahead, LeDoux opines, “While it’s true that AI, in its essence, is an automation tool with the potential to displace jobs, historical precedents suggest that automation can also stimulate job creation in emerging sectors.” A look back at the last quarter-century provides a good understanding of this trend, he notes. “The VFX industry has seen exponential growth, fueled largely by clients demanding increasingly complex visual effects as technology progresses.” LeDoux adds that AI will bring significant improvements in the quality and accessibility of visual effects, “thereby enhancing our capacity for storytelling and creative expression.” Letteri comments, “In VFX we are always looking for new ways to help the director tell their story. Sometimes this is developing new tools that enable greater image fidelity or more sophisticated simulations of natural phenomena – and sometimes they are about finding ways to do all of that more efficiently.” Basse concludes, “Not a day goes by where we don’t see an announcement from our tool vendors or new startups touting some new accomplishment relating to content creation using AI and ML. It’s a very exciting time for our industry. For many applications, there is still a lot of work to be done, but this technology is evolving so rapidly that I think we need to measure these major advancements in months, not years.”





DEI

Achieving Diversity, Equity & Inclusion in VFX and Media & Entertainment Tech

IN CONVERSATION WITH WOMEN WHO LEAD

By NAOMI GOLDMAN

TOP: Janet Muswell Hamilton, VES, Senior Vice President, Visual Effects, HBO OPPOSITE TOP: Janet Lewin, Senior Vice President, Lucasfilm VFX & General Manager, ILM OPPOSITE BOTTOM: Leona Frank, Director of Media & Entertainment Marketing, Autodesk

How do we attract and retain a diverse pipeline of VFX and entertainment tech professionals to carry our global industry forward? How can women leaders help achieve parity and diversity in the workplace? How do we create a productive, inclusive culture that supports workforce development, equity and advancement? The issues of diversity, equity and inclusion are paramount and complex, and the clarion call for efforts to compel systemic and sustainable progress is loud and unwavering.

For the past four years, the VES and Autodesk have fueled a partnership dedicated to lifting up voices from often underrepresented communities through our “Ask Me Anything: VFX Pros Tell All” initiative. Working with Autodesk, we have interviewed more than two dozen professionals from diverse backgrounds. These industry leaders have shared lessons learned from their personal career journeys, brought forth bold actions their companies are undertaking to move the needle, and issued calls to action to their peers and colleagues to join the DEI movement.

This year, we sat down and hosted panel conversations with five extraordinary women leading the charge in visual effects and media & entertainment tech and gleaned their insights and ideas. Lending their voices to this ongoing conversation are: Leona Frank, Director of Media & Entertainment Marketing, Autodesk; Nina Stille, Director of Global Diversity & Inclusion Partners, Intel; Barbara Marshall, Global M&E Industry Strategy Lead, Z by HP; Janet Lewin, Senior Vice President, Lucasfilm VFX & General Manager, ILM; and Janet Muswell Hamilton, VES, Senior Vice President of Visual Effects, HBO. As individual leaders and as a collective, they exemplify what can be done to address shared challenges and achieve a shared vision for the global entertainment industry.

VFXV: In service of creating an inclusive work environment for your employees, what are some of the programs and initiatives offered by your company?

Leona Frank, Autodesk: We have nine “Employee Resource Groups,” including groups for Black, Latinx, LGBTQ+ and neurodiverse employees, because we want everyone to feel represented and a sense of inclusion. Anyone can join the groups as allies – you don’t have to be a member of that community. And because running these employee-led groups takes considerable volunteer time, we provide stipends for that important labor and emotional labor, in recognition of our employee dedication, and ensure that each group has the support of an executive sponsor.

Janet Lewin, ILM/Lucasfilm: When I was coming up in the industry, there wasn’t a language for what it took to develop the confidence and competencies to self-advocate and help navigate obstacles in the male-dominated industry. So, we are really investing in tools to help underrepresented groups and women specifically. “Promote Her” was piloted at ILM and focuses on teaching ‘soft skills’ – how to advocate for yourself, raise your hand for consideration, network, and combat that sense of imposter syndrome.



Nina Stille, Intel: We offer a wonderful service to all employees, a confidential “Warm Line,” staffed by advisors trained in resilience, who are available to provide counsel and discuss challenges and solutions. We also have our “Talent Keepers” program, aimed at engaging mid-level Black employees in the U.S. and Costa Rica and their managers. It helps employees with career empowerment and it fosters best practices for managers. Since the tracks ultimately merge, the co-creation of career development plans by employees and their supervisors has resulted in higher employee promotion rates and less race and gender bias in management practices.

Barbara Marshall, HP: We are quite bold in the sustainability and diversity realms; our goal is to be the most sustainable and just global technology company. Our strategy is built around three pillars – climate action, human rights and digital equity – interlinked and with specific programs. The company is very transparent in sharing our progress to meet our goals of achieving gender parity in 2023 and doubling the number of Black executives by 2025. I’m proud that our Board is already 46% women and that we have had two female CEOs, which is unique for a blue-chip, publicly-traded company.

VFXV: You are all committed to building the pipeline to bring up new talent. What are you doing and what needs to be done in the areas of recruitment and outreach to help bring diverse voices to the forefront?

Janet Muswell Hamilton, HBO: We are making strides, but to create that kind of rich workforce, we need to invest more in early education for middle school and high school students. We all need to do more to showcase that these kinds of jobs in VFX and entertainment exist, that they are exciting and viable careers, demystify our industry and lower the barriers to entry. We need to go beyond the schools where we tend to cultivate almost exclusively white men and expand our horizons in every aspect.
I also appreciate the renewed focus I’m seeing in programs that look at training and retraining people who may have left the workforce, are looking to transfer from another industry or have valuable lived experience – like the work we are doing in the VES Education Committee to develop new career pathways for veterans.

Nina Stille, Intel: I’m proud of our “Relaunch Your Career” program, which we initially piloted in 2019 to help people who took a career break (parents, caregivers) to re-enter the workforce. This is an untapped and highly experienced workforce, often overlooked because of a break in their résumé, but rich in unique and valuable soft skills. In 2022, we hired 80 contractors for a 16-20 week ‘returnship’; 87% of those people were converted to full-time roles and 88% of those converted were women and people from underrepresented communities. We are proud of the results and want to keep investing in efforts like these. We also have deep ties to Historically Black Colleges and Universities (HBCU) and partner with these academic institutions and organizations like AfroTech and Lesbians Who Tech to help identify and develop talent that encompasses a diverse spectrum of voices and experience.



DEI

Barbara Marshall, HP: We also work with HBCUs and are a founding member of the HBCU Business Deans Roundtable, and introduced the business case competition, now in its fourth year, which gives students opportunities to develop solutions to real HP business problems and get hands-on experience. As part of our DEI strategy, we also overhauled our internship and graduate recruitment strategy – if you keep recruiting from the same places that are predominately white in the prospective talent pool, you won’t improve your diversity. This needs to be a holistic approach looking at every level from early education to pipeline cultivation to meaningful change in how we all approach hiring, training and employee support.

VFXV: Creating work environments that are more conducive to caregivers and working parents, especially amidst a childcare crisis exacerbated by the pandemic, is a big topic of discussion in many industries. How does your company approach these issues?

“[A] lot of what we see is teaching women or underrepresented minorities how to be successful in existing environments versus rebuilding environments to make space for different ways of being and doing that diverge from the dominant culture.” —Nina Stille, Director of Global Diversity & Inclusion Partners, Intel

Leona Frank, Autodesk: This topic is close to me, as I had two children during COVID. We offer a number of programs including “Flex Forward,” offering hybrid models for people to work from home, which has been a real game-changer and resonates really well with parents and caregivers. We also have working rooms at conferences where moms can pump, and we also pay for breast milk to be shipped home by our working moms while traveling. These kinds of programs put people at the center and help us look at how we can enable them to be their best self and get their best work done.

Nina Stille, Intel: I also had a COVID baby and so this resonates for me. We offer a hybrid and flexible work model that helps accommodate working parents and caregivers. Beyond providing additional paid medical leave, when people are aiming to reintegrate into the workforce we offer pathways to work part-time with full-time pay for a period of time to get back into the work rhythm. Through external vendors we also offer forums to join coaching sessions with other birthing parents to tap into community support, and financial assistance and priority no-fee enrollment for local childcare facilities and emergency no-fee childcare. These are all really attractive benefits for prospective and existing employees.

VFXV: There is often an onus on women and people from underrepresented communities to solve the inequities they did not create and lead the way forward. What can allies do to engage in this work around diversity, equity and inclusion to help create meaningful progress and change?

TOP: Nina Stille, Director of Global Diversity & Inclusion Partners, Intel

Janet Lewin, ILM/Lucasfilm: Be a mentor. Take someone under your wing and formalize that relationship and bring them into your process and mindset. For many people, the process is daunting, and being on set and working with filmmakers requires a lot of tutelage. Raise your hand, use your platforms and step up to help develop someone’s career and enable them to be successful. Align with vendors who share your company’s priorities and commitment to change. And think about hiring for potential and taking a leap of faith on someone.

Janet Muswell Hamilton, HBO: I agree, mentorship is essential, and helping people to grow and learn on-set etiquette – which can be a minefield – is an important element in and of itself, worthy of more training and educational tools. One idea to get someone ready for success is to ‘over-hire’ on a project – hire a second supervisor with enormous potential, mentor them, give them a safe landing to ask questions and feel supported. Job shadowing and job sharing are great models to integrate into talent pipeline cultivation. That kind of on-the-job training and exposure can mean the difference in getting someone ready to play more of a senior role when the next opportunity arises and be successful.

Barbara Marshall, HP: There are subliminal messages in everyday language, so one idea is to go through your documentation and remove language that is rooted in racism and sexism, like ‘blacklist.’ Neutralize the language so it does not undermine any group. And on training – I find it fascinating that we tend to focus on coaching women how to be less emotional or lower our voices, and I think we should also be coaching men on how to understand and tune in to different voices and modes of working to be the best partners and team members.

Nina Stille, Intel: I agree that a lot of what we see is teaching women or underrepresented minorities how to be successful in existing environments versus rebuilding environments to make space for different ways of being and doing that diverge from the dominant culture. It’s also appealing to jump to action, but it must be grounded in your own self-awareness and understanding of how your positional power and privilege influence your world view and approach to work and to life. Operating from a place of intention and with a more intersectional lens can make an enormous difference in engaging in the work to be done.
Leona Frank, Autodesk: True allyship means making someone’s problem your own and pushing for change. As a Black woman, the emotional labor that comes with having to explain issues affecting me and my community is a huge onus and unfair expectation. Allies can take on the labor to self-educate, which keeps my back free. And when you are in a room that I am not in, where decisions get made, use your voice and push for equity issues in all aspects of hiring practices, performance reviews, opportunities for recognition and advancement. Raise critical questions and bold ideas that can really advance our DEI goals and help shape a more just and equitable future where new players have the chance to lend their unique talents – and truly be set up to thrive. We are all so much richer for being surrounded and influenced by a harmonious chorus of diverse voices.

Tune in to the full conversations with these dynamic Women Who Lead at https://www.vesglobal.org/ama

“As part of our DEI strategy, we also overhauled our internship and graduate recruitment strategy – if you keep recruiting from the same places that are predominately white in the prospective talent pool, you won’t improve your diversity. This needs to be a holistic approach looking at every level from early education to pipeline cultivation to meaningful change in how we all approach hiring, training and employee support.” —Barbara Marshall, Global M&E Industry Strategy Lead, Z by HP

TOP: Barbara Marshall, Global M&E Industry Strategy Lead, Z by HP



COVER

CHRISTOPHER NOLAN PUSHES THE LIMITS OF CINEMATIC STORYTELLING AGAIN WITH OPPENHEIMER By CHRIS McGOWAN

Images courtesy of DNEG and Universal Pictures. TOP: Robert Oppenheimer (Cillian Murphy) examines the atomic bomb prior to the weapon’s first test.

Theoretical physicist J. Robert Oppenheimer’s time as the director of the Los Alamos Laboratory during World War II was arguably the most impactful job ever undertaken. Oppenheimer led a team of scientific luminaries as part of the Manhattan Project, which developed the first nuclear bombs, the most powerful weapons in history – changing the world forever. Oppenheimer felt great moral turmoil about his efforts, and when the Trinity test was successful on July 16, 1945, he thought, “Now, I am become death, the destroyer of worlds,” words from the Hindu scripture Bhagavad Gita.

In the movie, Cillian Murphy (as Oppenheimer) says, “They won’t fear it until they understand it. And they won’t understand it until they’ve used it. Theory will only take you so far. I don’t know if we can be trusted with such a weapon. But we have no choice.” After the end of World War II, Oppenheimer positioned himself against nuclear proliferation and the development of a hydrogen bomb, which put him at odds with many in the government and military.

Christopher Nolan had long wanted to tell Oppenheimer’s story, and he has done so with a movie that is part biographical drama and part thriller, shot primarily in IMAX and best viewed on large-format, high-resolution movie screens. Nolan dramatizes the frantic race of the American military to build the first A-bomb before the Nazis did (they were thought to be in the lead) and immerses the audience inside the brilliant



“We wanted all of the images on screen to be generated from real photography, shot on film, and preferably IMAX. The process involved shooting an extensive library of elements. The final shots ranged from using the raw elements as shot, through to complex composites of multiple filmed elements. This process of constraining the creative process forces you to dig deeper to find solutions that are often more interesting than if there were no limits.” —Andrew Jackson, Production VFX Supervisor

Oppenheimer’s conflicted mind, from his deepest fears to imaginings of the quantum realm and the cosmos, and the fate of the planet. He alternated color with black-and-white footage – the first time B&W IMAX 65mm film stock has been used for a feature film. And, typically, he tried to avoid CGI – not an easy task when dealing with the first atomic bomb explosion. In the fall of 2021, Nolan announced his intention to write

TOP: Some of the techniques that the VFX team used to produce the spectacle of nuclear fission were also used to help create the scenes that portray Oppenheimer’s inner world. BOTTOM: Christopher Nolan adjusts IMAX camera over Cillian Murphy (portraying J. Robert Oppenheimer). Oppenheimer was shot primarily in IMAX. (Photo: Melinda Sue Gordon)




and direct the film, with Universal Pictures as the distributor. Nolan’s script was based on the Pulitzer Prize-winning American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer by Kai Bird and Martin Sherwin. Filming took place from February to May 2022, mostly in New Mexico. Emma Thomas, Charles Roven and Nolan produced the movie, in which Murphy portrays Oppenheimer. The star-studded ensemble cast includes Emily Blunt (Kitty Oppenheimer), Matt Damon (Leslie Groves), Florence Pugh (Jean Tatlock), Robert Downey Jr. (Lewis Strauss), Kenneth Branagh (Niels Bohr), Jack Quaid (Richard Feynman), Matthew Modine (Vannevar Bush), Tom Conti (Albert Einstein) and Gary Oldman (Harry Truman). The scientists portrayed make up a veritable 20th century hall of fame for theoretical physics.

Oppenheimer was DNEG’s eighth straight film with Nolan, and the studio has been the director’s exclusive outside VFX house since Inception (2010). Andrew Jackson, Production VFX Supervisor, comments, “As well as having the creative experience from years of working with Chris, DNEG also has a huge benefit when it comes to solving the technical challenges of working with IMAX resolution in a largely optical production pipeline.” Scott R. Fisher, Special Effects Supervisor, Giacomo Mineo, DNEG VFX Supervisor, Mike Chambers, VES, VFX Producer, and Mike Duffy, DNEG VFX Producer, also helped bring the movie to life. Long-time Nolan colleague Hoyte van Hoytema handled the cinematography.

Oppenheimer was Andrew Jackson’s third film with Nolan, following Dunkirk and Tenet. “During that time, I have developed a strong understanding of his filmmaking philosophy,” Jackson says.



“His approach to effects is very similar to mine in that we don’t see a clear divide between VFX and SFX, and believe that if something can be filmed it will always bring more richness and depth to the work.”

Nolan challenged the VFX artists to pull off the visual effects in Oppenheimer without any computer graphics, if possible. Jackson comments, “I spent the first three months of the project in Scott Fisher’s workshop, working with him and his SFX crew to develop the various simulations and effects. We continued working closely with the SFX crew for the duration of the shoot – in fact, it was really one FX team.”

Mineo says, “The entire project was a constant creative brainstorming process, as we had to be imaginative and experimental with the footage to visualize what was in Oppenheimer’s mind. From creating the birth of stars by the talented comp artist Ashley Mohabir, to visualizing sub-atomic chain reactions and even the ‘end of the world’ scenario after the nuclear proliferation, designed by Marco Baratto, we pushed the boundaries of creativity.”

One difficulty was capturing Oppenheimer’s ideas and imagination given the scientific and visual references available during the 1940s. However much the scientist was a visionary, he was also a man of his time. Mineo explains, “Concepts like the Earth seen from space or modern physics were relatively new at the time. To truly portray Oppenheimer’s mindset, we had to let go of our modern understanding and delve into his world. It was a fascinating journey.”

There were about 150 total visual effects shots in Oppenheimer, 30% of which were in-camera, according to Chambers. He notes,

OPPOSITE PAGE AND THIS PAGE: Placing a premium on practical effects, the VFX team generated a library of idiosyncratic, frightening and beautiful images to represent the thought process of Oppenheimer, who was at the forefront of the paradigm shift from Newtonian physics to quantum mechanics, looking into dull matter and seeing the extraordinary vibration of energy that exists within all things.




TOP: Atomic Energy Commission Chairman Lewis Strauss (portrayed by Robert Downey Jr.) carried a grudge against Oppenheimer. The film alternated color with black-and-white footage, and it marked the first time B&W IMAX 65mm film stock was used for a feature film. (Photo: Melinda Sue Gordon) BOTTOM: Christopher Nolan lays out the scene for Cillian Murphy, who portrays Oppenheimer. (Photo: Melinda Sue Gordon) OPPOSITE TOP TO BOTTOM: A young Oppenheimer. The filmmakers had to be imaginative and experimental with the footage to visualize what was in the physicist’s visionary and conflicted mind. Oppenheimer (Cillian Murphy) in an emotional moment with his wife (Emily Blunt) at Los Alamos. (Photo: Melinda Sue Gordon) General Leslie Groves (Matt Damon), who was in charge of the Manhattan Project, confers with Oppenheimer (Cillian Murphy) at Los Alamos. (Photo: Melinda Sue Gordon)

“Being a period piece, some contemporary anachronisms were necessarily cleaned up, but only when glaringly obvious. As a general rule, pure fixes were used very sparingly.”

Mineo says, “We embraced old-style techniques such as miniatures, massive and micro explosions, thermite fire, long exposure shoots and aerial footage. The majority of the VFX work was based on these elements, without any reliance on CGI and mainly involving compositing treatments.”

Oppenheimer and his scientific team worked at the Los Alamos National Laboratory, located some 30 miles from Santa Fe, New Mexico. Mineo comments, “Surprisingly, we needed minimal interventions for Los Alamos as the town had been re-built almost entirely. Our focus primarily involved minor clean-up and retouching work for specific elements such as the tower and the bomb.” Chambers notes, “Some of the magic was intercutting the magnificent set that [Production Designer] Ruth De Jong designed and built with some of the actual locations in Los Alamos itself, including the Lodge and Oppenheimer’s actual house.”

The VFX team was in sync with DP Hoyte van Hoytema. Jackson says, “Throughout the production process we worked closely with Hoyte experimenting, testing and developing ideas to illustrate the concepts in the script. On this project we worked with him and the camera department to develop an IMAX probe lens specifically for some of the FX elements. Hoyte also built some extremely powerful LED lights that we used in the tank shoot.”



The Trinity nuclear explosion was done without any CGI. “We knew that this had to be the showstopper,” Nolan told the Associated Press. “We’re able to do things with picture now that before we were really only able to do with sound in terms of an oversize impact for the audience – an almost physical sense of response to the film.” Explains Chambers, the team strove to “determine how best to illustrate various aspects of a nuclear explosion without relying on CG VFX, stock footage, or actually setting one off.”

The Trinity test was recreated using real elements. Mineo remarks, “We only had the original footage of the Trinity explosion as a precise reference. Chris Nolan was determined to keep the VFX grounded in reality and maintain the raw feeling of the actual footage. We had to find the right balance, striving to be minimal with our treatments while ultimately recreating such a massive event. Our talented team of compositing artists, including Peter Howlett, Jay Murray, Manuel Rivoir and Bensam Gnanasigamani, did an exceptional job working for months to create this iconic moment in history.”

Chambers adds, “Though there were a few shots using 2D compositing in order to utilize multiple elements, all elements were acquired photographically. For some effects, large pyrotechnical explosions were set off in the desert, however for some views we also used a variety of cinematic tricks, including cloud and water tanks, forced-perspective miniatures and high-speed photography for scale.”

For Chambers, “One of the big challenges for me specifically was the logistics of experimenting and shooting our elements and gags




TOP TO BOTTOM: Oppenheimer, in the searing light of his creation, at the Trinity test in New Mexico. The explosion was recreated using real elements and no CGI. Nolan wanted to take the audience “in the room” when the button is pushed. Theoretical physicist Edward Teller (played by Benny Safdie) at the Trinity test. (Photo: Melinda Sue Gordon) Director Chris Nolan with his long-time colleague, DP Hoyte van Hoytema, who is wielding an IMAX camera. The visual effects team worked with van Hoytema and the camera department to develop an IMAX fish-eyed probe lens specifically for some of the VFX elements. (Photo: Melinda Sue Gordon)

while traveling to all the locations and working alongside the main unit throughout principal photography. Though we didn’t have the luxury of a fixed stage for our work, the advantage of having the director at hand to review and expand on our ideas in real-time was priceless.”

Some examples of Oppenheimer gags, according to Chambers, included “a special rig designed to spin beads on multiple axes, a special rig designed to recreate the effect of an implosion, and a special tabletop rig to recreate the effect of a ground-based shockwave. Some of these were relatively simple setups and others were more complex. Some were used as shot in-camera and others were combined with traditional elements to complete the shots. As the nature of our approach was primarily physical and tangible, Scott [Fisher] and the SFX crew helped to devise and create the various rigs and gags that we wanted to shoot.”

Director Nolan also sought a deeper immersion by employing IMAX, his preferred format. “The headline, for me, is by shooting on IMAX 70mm film, you’re really letting the screen disappear. You’re getting a feeling of 3D without the glasses,” Nolan told the Associated Press. Chambers notes, “We used both IMAX 65mm and [Panavision Panaflex] 5perf 65mm film formats. 5perf was sometimes used for more intimate dialogue scenes as the IMAX cameras are not as quiet on the set. On the VFX unit, we also used some 35mm for extreme high-speed photography.”

Mineo adds, “IMAX is undeniably the most cinematic format, but it poses unique challenges for VFX due to its incredibly high resolution. At DNEG, we have a well-designed pipeline that caters to the IMAX workflow, thanks to our experience collaborating with Chris Nolan since our work on Batman Begins. As for the black-and-white footage, our color science team, led by Mario Rokicki, seamlessly adapted our pipeline to accommodate it.”

Jackson remarks, “The two biggest challenges were also the most rewarding aspects of working on this movie. Firstly, the script describes thoughts and ideas rather than specific visual images. This was both exciting and challenging as we searched for solutions we could build and shoot that were both inspired by the ideas in the story and were visually engaging.”

Continues Jackson, “The second challenge was the set of creative rules that we imposed on the project. We wanted all of the images on screen to be generated from real photography, shot on film, and preferably IMAX. The process involved shooting an extensive library of elements. The final shots ranged from using the raw elements as shot, through to complex composites of multiple filmed elements. This process of constraining the creative process forces you to dig deeper to find solutions that are often more interesting than if there were no limits.”

Mineo concludes, “One of the highlights was the opportunity to learn a different way of creating VFX. Understanding how Chris Nolan thinks and approaches his movies allowed us to bring his unique philosophy into our way of working. Initially, it presented some challenges and limitations, but the result is unquestionably real, 100% believable, and ultimately more satisfying.”





PROFILE

NO TASK IS TOO BIG OR TOO SMALL FOR VFX PRODUCER MIKE CHAMBERS, VES By TREVOR HOGG

Images courtesy of Mike Chambers and the VES, except where noted. TOP: Mike Chambers, VES. OPPOSITE LEFT TO RIGHT: Mike Chambers, VES receives the VES Founders Award in 2021. (Photo courtesy of the VES) Mike Chambers reunites with former employer and mentor Roger Corman backstage at the 17th VES Awards. (Photo courtesy of the VES) In front of a helicopter on the set of The Day After Tomorrow. IMAX is part of the workflow for Chambers when working on a Christopher Nolan production, as was the case for Oppenheimer. (Image courtesy of Universal Pictures) Chambers had a real explosion to deal with when recreating the nuclear bomb tests for Oppenheimer. (Image courtesy of Universal Pictures)

Not many people can take it in stride when a movie director requests to physically recreate a nuclear explosion, but such was the case with Christopher Nolan and his biopic Oppenheimer, which had frequent collaborator Mike Chambers, VES, serving as the visual effects producer. When asked whether Nolan ran the risk of destroying the world in an effort to make his creative vision a cinematic reality, Chambers laughs, “We did some big practical explosions, for sure! Oppenheimer was imagining not just explosions but all of the atomic theory he was dealing with, and being only concepts at that time in history, they didn’t know what these things really looked like. We used a lot of elements, and while there is certainly a digital aspect to some of what we’ve done, there is no CG explosion. A big part of the pyrotechnics were shot in a New Mexico location that was similar geographically to White Sands where they actually did the ‘Trinity’ nuclear test. It sounds and looks huge. That’s all that matters!”

While studying at UCLA, Chambers started his career in the mailroom and was a projectionist for Roger Corman at New World Pictures. “I moved on to become production assistant on a bunch of films, a couple of which were heavy sci-fi, like Alien and Star Wars rip-offs. That was my first taste of it. After that, I went to the San Francisco Art Institute so I could learn to make movies of my own.”

His next job was as a projectionist and production assistant at Showscan Entertainment, Doug Trumbull’s large-format film production company. “The first guy who I saw as a real mentor was Robert Hippard, a production manager/line producer working for Doug at the time. Bob later got a job running production for Richard Edlund at Boss Film Studios, and he brought me along. That was one of the few visual effects studios at the time where all of the departments were there, such as the stage, modelmakers, optical printers, editorial, roto artists and motion control.
There was so much to learn there, and I realize even more so now that some of the people I worked with at the time were true legends and remain so today. When Jim [Cameron] was setting up The Abyss, a lot of the VFX crew, like the supervisor, John Bruno, came from Boss, and they brought me along to be a visual effects coordinator. Working at Boss and the move to work on the production side for a film like The Abyss were both huge opportunities for me.”

Whereas Christopher Nolan emphasizes the use of practical effects, James Cameron openly admits that digital augmentation is an essential aspect of his filmic language. “Obviously, Jim is big on pushing the boundaries, finding new technologies and helping others to develop new techniques,” Chambers observes. “Along with assisting Jim on some of his films, I helped to set up Digital Domain. My job in visual effects in general is dealing with the different attitudes and knowledge of various filmmakers who come from a variety of places and adapting to what they need. Chris definitely is a practical filmmaker, and his films have won a number of visual effects awards. He doesn’t denigrate visual effects but does think that the use of too much CG can take you out of it. There is some photographic element to most of what we’re doing. I can think of very few things over the many films that I’ve done with him that are full CG fantasy; whereas, with Jim and those kinds of films, it’s all about that.”



Movies like Avatar: The Way of Water (which Chambers didn’t work on) are blurring the line between the visual effects and animation industries. “The definition is changing because the technology is changing,” Chambers notes. “If we’re looking back in filmmaking history at what were visual effects and animation, they were definitely different pipelines, workflows and techniques. These days those worlds have come together. The technology and skills are the same. There is so much overlap. Avatar is a great example, but there are plenty of other examples of fully animated characters within live-action films, and it works. Then it’s just a matter of style. Avatar was blending fantasy




TOP: Inception features the iconic shot of a city folding over on itself. (Image courtesy of Warner Bros.) MIDDLE LEFT: Chambers hosting the VES Honors Celebration. (Photo courtesy of the VES) MIDDLE RIGHT: As VES Board Chair in 2017, Chambers spearheaded the establishment of the VES Hall of Fame. Pictured here are some members of the inaugural class of the VES Hall of Fame. From left to right: Brian Henson (on behalf of his father Jim Henson), Dennis Muren, VES, Bob Burns III, Phil Tippett, VES, John Knoll, Doug Trumbull, VES, Ed Catmull, VES, Syd Mead and Mike Chambers, VES. (Photo courtesy of the VES) BOTTOM: Chambers believes that his job as a VFX producer is to support the vision of the director, whether it be Jordan Peele for Nope or Francis Lawrence for I Am Legend. (Image courtesy of Warner Bros.)

characters with a photoreal presentation. It’s animation based on motion capture, as opposed to keyframe animation, although I’m sure throughout the film there is evidence of both techniques.”

A blend is also taking place between visual effects and the other areas of filmmaking. “One of the things that I’ve loved about working in visual effects is the true cooperation and need to work with all of these other departments because we all affect what each other is doing. A cohesion in a crew – cinematography, art department, editing and visual effects – is essential.”

Like many roles in production, the visual effects producer has had to adapt to what is required today. “The roles are generally the same, certainly at a production level,” Chambers states. “I don’t feel overshadowed by a visual effects supervisor. We work in partnership but have different functions to serve. When I first got started, I was working for supervisors who were much more experienced than I was, and I was learning about it. As I’ve gotten further along in my career, I’m sometimes working with younger supervisors who need to understand how things work, and I’m there to help them to do that. In some cases, the producer needs to be the grownup in the room, as would an executive producer or line producer. It doesn’t mean that I’m not involved in creative decisions and how we’re going to do things. And it doesn’t mean that the supervisor is not conscious of how the budget works and what’s going on with the schedule. Every show is unique. Every grouping of producer, supervisor and crew has unique dynamics.”

Visual effects are no longer created just in California but around the world. “When the Visual Effects Society formed 25-plus years
ago there were a dozen or so companies in the business, mostly in California,” remarks Chambers, who has been a member of the VES for over two decades. He served as the Board of Directors’ Chair for six years and is on the publication advisory committee for VFX Voice. “That whole model has changed completely. Now, the VES has become a truly global organization with over 4,500 members in 45 countries and 15 sections in major visual effects centers around the world. Visual effects aren’t only in movies anymore, but also television, streaming – and have uses in other areas like science, medical and forensics. We’re trying to bring all of those folks into the fold. The VES is not a union or guild, but we’re the only major organization that has made a family of this industry.” Even with streaming services cutting back on content spending, demand for visual effects work is growing. “This is the challenge of the global business model that we have now,” Chambers says. “It’s changed a lot and is going to continue to change, who the big and little players are, where they fit into the greater scheme of things, and that balance will keep changing. When we came out of the pandemic every shop was full to the brim and overflowing with work. You couldn’t find a place anywhere because everyone was so busy. Now, it’s scaling back. Does this give these companies an opportunity to better be able to approach the work, or are we going to get back to the same thing when it gets busy again? The fact of the matter is all of this work needs to be done. All of these shows do want stuff. Will some become smaller shows? The days of the majority of visual effects shows being 2,000, 3,000 or 4,000

TOP LEFT TO RIGHT: Chambers is standing center as a VistaVision camera captures footage for the natural disaster movie Volcano. The crew photo is on the set of the miniature causeway explosion with the miniature truck from True Lies. Chambers is in the upper left corner along with VFX legends John Bruno, Mat Beck, Wayne Baker, Robert Spurlock, Les Ekker, Ron Gress, Joe Viskocil, Richard Stutsman and John Stirber. Chambers joins with the VES Chairs to honor Eric Roth with the VES Board of Directors Award at the 21st Annual VES Awards. Left to right: Jeffrey A. Okun, VES, Jeff Barnes, Chambers, Eric Roth, Lisa Cooke, Jim Morris, VES and Carl Rosendahl, VES. (Photo courtesy of the VES) Early in his career, Chambers worked for Showscan, founded by Doug Trumbull. (Photo courtesy of the VES) Chambers inducted Douglas Trumbull into the inaugural class of the VES Hall of Fame in 2017. (Photo courtesy of the VES)



PROFILE

“I realize even more so now that some of the people I worked with at the time were true legends and remain so today.” —Mike Chambers

TOP LEFT TO RIGHT: Chambers had to deal with the topic of AI when making Wally Pfister’s feature film directorial debut, Transcendence. (Image courtesy of Warner Bros.) Chambers forged a partnership with Jordan Peele to make the alien creature that lives amongst the clouds a reality for Nope. (Image courtesy of Universal Pictures) Chambers has become a frequent collaborator with Christopher Nolan, with the mantra being to capture as much practically as possible, as was the case with Tenet. (Image courtesy of Warner Bros.) Chambers has worked on a wide range of projects, from visual-effects-heavy blockbusters to invisible augmentation, as was the case for Dunkirk. (Image courtesy of Warner Bros.)

shots compared to more restrained use of visual effects, we might see that.” AI and machine learning are here to stay. “The question of the day is what does it all mean? I have a basic understanding of some of the applications for us as visual effects and filmmakers. I don’t know the answer. But the genie is out of the bottle. We’re going to have to learn how to deal with it. I worked on Transcendence with Wally Pfister where AI ran amok. Not that I will hold that up as our future!” Keeping track of advances in technology is important. “True, although personally, I will usually focus more on what I need to do. And if my next project is going to start utilizing AI, I’ll learn a lot more about it.” Along with Christopher Nolan and James Cameron, Chambers has also collaborated with George Miller, Roland Emmerich, John Woo, Kathryn Bigelow and Jordan Peele, among others. “I’m there to help fulfill someone’s vision. Jim’s vision is different from Chris’ and Jordan’s visions. How can we bring our knowledge and technology together to serve any of them and others?” Communication is the best way to prevent getting lost in the iterative process. “You have to talk to them. Some of their minds are easier to get into than others! I have had an opportunity with a few different directors to do multiple shows with them, and they’re so much easier. Oppenheimer is my fifth film with Chris, and so many of his crew are the same. With our shared knowledge of what Chris is after and how he likes to work, there is a lot of shorthand going on.” The methodology does not completely change depending on the size of the project. “I scale it,” Chambers explains. “On a big show you have a big team, and it’s all about delegation. Smaller shows have a smaller team. On Oppenheimer, the visual effects production crew was me, the supervisor and production manager, and that was it, and it was fine.” Preparation is critical. 
“You have to have a plan and roll with the punches when things don’t go to plan, which will always be the case. In fact, part of the plan is that the plan is going to break at some point. What are you going to do about it? It’s all about solving problems. If everything always went according to the way it’s supposed to, it wouldn’t be much of a job!”





FILM

CAPTURING LOVE OF A DEADLY KIND FOR KILLERS OF THE FLOWER MOON By TREVOR HOGG

Images courtesy of Sikelia Productions and Apple Studios. TOP: For Martin Scorsese, the face, eyes and body language of Lily Gladstone were intrinsically correct for the role of Mollie Burkhart. (Photo: Melinda Sue Gordon) OPPOSITE TOP TO BOTTOM: The oil geyser was achieved practically and inspired by an iconic cinematic moment in Giant, starring James Dean, Rock Hudson and Elizabeth Taylor. It was important to eliminate any indication of modernization, especially for the wide shots. It was essential for Martin Scorsese to shoot in the locations where the actual historical events took place. (Photo: Melinda Sue Gordon)

Ill fortune may be the best way to describe the fate of the Osage Nation when oil was discovered on their Oklahoma reservation and, subsequently, they began to die under mysterious circumstances. The newly-formed Bureau of Investigation (predecessor to the FBI) discovered a murderous conspiracy masterminded by cattleman William Hale to gain control of the Osage headrights and seize the wealth generated by the resources extracted from the land. The nefarious true story was the subject of the book Killers of the Flower Moon: The Osage Murders and the Birth of the FBI by David Grann and has been adapted into an epic historical crime drama by renowned filmmaker Martin Scorsese (Goodfellas) on behalf of Appian Way, Sikelia Productions, Imperative Entertainment and Apple Studios. “It’s a picture that explores what love is, what it could be and what all of us are capable of,” Scorsese states. “One can become complicit without even realizing, and when do you realize, do you change?” Characters and scenes drove the editorial process. “Whole scenes are interwoven, so ultimately it creates a whirlpool midway through or maybe an hour into the film,” Scorsese explains. “If the film is speaking to you, then you’re stuck in it, like the characters are stuck and can’t get out.” Even with the rising body count, there is restraint in depicting the violence. “Marty was conscious of the continuing pain that the Osage feel about this horrible time,” notes Thelma Schoonmaker, who established a life-long creative partnership with the filmmaker when she restored his student film while attending a six-week film course at NYU and has subsequently won Oscars for editing Raging Bull, The Aviator



and The Departed. “Almost all of the killings are in a wide shot, and they worked incredibly well.” The unique narrative structure of Killers of the Flower Moon was partially caused by the script originally being much longer. “You’re being jerked from one scene or time frame to another, and that was the result of the cutting down, but also deliberate after a while,” Schoonmaker reveals. “We realized that we could do that, and it pulls you along in the film.” Visual effects have become a more significant cinematic tool for Scorsese ever since he depicted the Potala Palace in Kundun, and they were central to de-aging the characters in The Irishman. “We rely heavily on the visual effects editor because I will go into the



TOP TO BOTTOM: The mandate to capture as much as possible in-camera led to a real train being brought in for principal photography. Principal photography mainly took place in Pawhuska instead of Fairfax, Oklahoma, where only three to four blocks were art-directed and the rest were completely CG. A practical oil derrick was built, scanned and replicated digitally to create a massive oil field encroaching on nature.

room and say, ‘Can you take this person out of the shot?’ Or, ‘Can you remove or change the color of this?’” Schoonmaker remarks. “It’s wonderful to be able to do that and have it done for us quickly. Marty required a certain double image in one frame of two children’s faces in the first scene of the film and [VFX Editor] Red Charyszyn was able to create that beautifully, and that’s actually in the film now.” Even though an effort was made to shoot in the actual locations such as the doctor’s office of the Shoun brothers and the Masonic lodge where the meetings take place, visual effects were still needed. “The art department did a great job of restoring the actual towns that we were shooting in, but they could only go so far,” Charyszyn states. “We were going to be more involved because we were talking about putting oil derricks everywhere, making the town properly period, and so many cows!” “What I was trying for mainly was giving a real impression of being out there on those prairies,” Scorsese remarks. “And yet I found that we still needed visual effects to give a sense of scale and place that one can see in a shot where the car is going down the road at the beginning of the film. The camera booms up and you see the prairie on left and right, but then slowly oil rigs start to appear, and visual effects helped us to create a real sense of the oil encroaching on nature.” ILM was the sole vendor and responsible for approximately 700 shots, which were supervised by Pablo Helman, who previously worked on Silence, Rolling Thunder Revue and The Irishman. “Generally, we did the oil rigs, enhanced the train, and there were shots where we had hundreds and hundreds of cows. We tried to shoot cows, but the temperature was like 96 degrees, so at 8:30 a.m. you have 350 cows and by 11 a.m. you have 11. They all go to the shade! You can’t move them. That’s it. We shot mainly in Pawhuska for Fairfax. 
There were only three or four blocks that were art-directed, and the rest was completely CG.”



A practical oil derrick was built and scanned. “We went in doing the previs, shot the shot, then I took it in and did postvis on it,” Helman states. “I said, ‘What if at the beginning we see something but don’t know what it is, but it’s an oil rig.’ Then we start seeing all of the oil rigs. Marty said, ‘That’s great. Let’s do that.’ We went back and forth with placing a certain number of rigs because he wanted to reveal it little by little until the end.” A hat and blood were added digitally for the scene when the car crashes into the tree. “He didn’t have a hat and blood when we shot it. They edited the movie, and Marty said, ‘I don’t have any way to know who this character is, but he does wear a hat all of the time.’” The shot of the car was actually constructed from two different takes. “The first time, the hood broke. We did it again and it did the right thing, but Marty liked the performance of the first one, so we had to split it.” Black-and-white newsreel footage transitions into color when Ernest Burkhart (Leonardo DiCaprio) arrives on the train at Osage Nation for the first time. “The last shot of an Osage pilot standing in front of a plane is actual newsreel footage from the family of the current Chief of the Osage, Chief Geoffrey Standing Bear,” Schoonmaker reveals. “We had to decide how much grain to then use as we go into the train from that actual footage.” Adjustments were made incrementally with the final version sent to ILM to copy exactly. “Marty was particular about how the saturation had to come up,” Charyszyn recalls. “But then also I thought it was going to be mathematically diametrical to the grain lessening, but he wanted the impact of the color coming in to reach you and be so subtle.” While driving home from the set in Oklahoma, Scorsese witnessed local farmers burning off their fields, which led to some surreal, hellish imagery appearing when it is revealed that Ernest Burkhart is poisoning his Osage wife, Mollie (Lily Gladstone),

TOP: The inclination of cattle to wander off in search of better pastures before the camera began rolling led to the herds being expanded digitally. MIDDLE AND BOTTOM: Plate photography of the house explosion by Cinematographer Rodrigo Prieto and the final image produced by ILM. (Photo: Melinda Sue Gordon)



“This is using visual effects as a tool to tell the story in a completely invisible way. The whole movie is a piece of art. There are no compromises.” —Pablo Helman, VFX Supervisor

TOP: ILM was the sole vendor and responsible for approximately 700 shots. SECOND FROM TOP: The Bureau of Investigation gathers in an oil field to discuss the progress of their Osage murder investigation. BOTTOM TWO: Some of the scenes take place in Washington, D.C., which was added via bluescreen.

under the orders of his uncle, William Hale (Robert De Niro). “We began to encounter areas that were burning all around us, and at a certain point it became like we were in the middle of a volcano,” Scorsese notes. “It’s almost like when you hear the term ‘fever dream’. What does that feel like when you’re having a fever of that kind? How do you see things? Mollie is in the fever, and so is Ernest.” Dancers were hired to create weird background figures. “The footage [by Cinematographer Rodrigo Prieto] was stunning already,” Schoonmaker states. “One guy almost looks like someone on the cross.” The fire was practically achieved by a newcomer to Scorsese’s inner circle. “We created a foreground fire, which was the lower one with the guys walking around it, and there are a bunch of little spot fires,” reveals Brandon K. McLaughlin, Special Effects Coordinator. “We produced that by burying 60 to 70 bars in the ground. It was 250 feet long. Then the background ones were bars laying on the ground because we were never going to go back that far. I was blown away by how awesome it came out.” It was a learning curve for McLaughlin. “The opening sequence of the movie with the pool of bubbling oil – that happened in reality,” explains McLaughlin, who used a product from Blair Adhesives that is a combination of Methocel, water and food coloring as a nontoxic substitute for oil. “But Marty kept saying that he wanted this geyser to come up from the ground. Oil doesn’t do that. Everyone kept bumping on that because Marty had said he wanted to keep it as true to the story as possible from the get-go. 
I finally asked him, ‘Is it an artistic piece that you’re putting together or is this something you’re going to want to shoot and have physically come out of the ground, like we dig a hole, put a nozzle in, bury our line, and you see it come from the ground?’ When Marty told me that’s what he wanted, then it made perfect sense.” Giant, starring James Dean, Rock Hudson and Elizabeth Taylor, was a cinematic reference. “There is this wonderful scene where oil erupts from the ground and covers him [James Dean] with oil,” Scorsese states. “I start with what’s there, and if it isn’t there, what would I like to be there which will give it a natural appearance; then, if the scene calls for it, to introduce elements that can be somewhat unrealistic but appear to the mind as an image that you would observe.” Executing the bank vault explosion was a blast for McLaughlin. “I got giddy because the script simply said, ‘Asa Kirby comes in, puts too much pyro in this vault door and it blows up.’ We had the liberty to run with it. The door was built out of steel and I used a rapid accelerator, which is a 4:1 pulley system with a pneumatic ram. You pull the sheaves apart and multiply how much pressure you need to pull ‘x’ amount at
what weight. I wanted it to dance across the floor. We made some changes because there were a couple of things on the door that would break, so we made those materials stiffer and heavier. Then, behind the door we put all of the pyro, the fire, sparks and money. We also had a shotgun mortar with pyro and sand in it. You use it as a fist for the most part. You can’t see it because it happens so fast. For the money, we had drop baskets overhead out of camera. We hit the pyro events on the latches, the doors would open and the money would fall down.” Originally, the owl, which is a symbol of impending death, was going to be digital. “But we found an owl that did what it was supposed to do,” Helman remarks. “Another thing that we did that was interesting and invisible was people getting sick. How do you do that? You have a three-and-a-half-hour movie, and you’re telling the story of somebody getting sicker and sicker and dying. It’s difficult to do with makeup because you’re always shooting out of continuity. You have to be careful. We’re doing takes and takes of stuff. It’s difficult to keep track of it. It is a lot easier to go with a base makeup and then go in post.” A combination of things was done digitally. “It’s getting under the eyes darker, being gaunt there, doing some warping or some 3D work where you take some swelling, or sometimes you make them swell more or sweat. We did some research as to what cyanide and poison does.” The book on Osage culture had to be altered digitally in order to be narratively relevant. “We didn’t have the right book, so a lot of the pages were replaced. There was a piece of art that we produced that was researched, period correct and had to bend appropriately. 
De Niro says to DiCaprio, ‘Can you spot the wolves?’ The drawing in the book didn’t have the wolves visible and Marty needed to see them, so we had to replace it digitally.” With much being made of Scorsese’s two muses, Robert De Niro and Leonardo DiCaprio, sharing the screen together, the real star is Lily Gladstone, who plays Mollie. “She represents all of the Native American Nations in her phenomenal performance and dignity,” Schoonmaker declares. “I would say that the death of the mother and the ancestors coming to take her away is one of the most beautiful things that Marty has ever done. It’s so simple.” Helman points out that the general complaint about visual effects does not apply to Killers of the Flower Moon. “This is using visual effects as a tool to tell the story in a completely invisible way. The whole movie is a piece of art. There are no compromises.” McLaughlin was pleased with the end result. “I thought Marty did a great job. I hope that people walk away from it going, ‘Oh my god! I can’t believe that this truly happened.’” Whenever Scorsese was in doubt, he went back to the relationship between Mollie and Ernest and stayed with them as long as possible. McLaughlin notes, “I love the scene at the table at the beginning when Leo and Lily have their first dinner together, which ends with the rainstorm. There is something in their faces, an electricity and warmth at the same time that is so sweet and moving. And then her teaching him how to sit still. ‘Just sit still and let the power of Wah-Kon-Tah’s storm float over us.’ That’s like saying, ‘Let’s live here in life.’”

TOP TO BOTTOM: The visual effects were instrumental in transporting viewers to the Oklahoma of the early 1920s.

A favorite moment of Martin Scorsese’s occurs at the dinner table when Mollie teaches Ernest to be quiet and still during a thunderstorm.

Killers of the Flower Moon brings together the two major muses for Martin Scorsese, Robert De Niro as William Hale and Leonardo DiCaprio as Ernest Burkhart. (Photo: Melinda Sue Gordon) Jesse Plemons portrays Tom White, who leads the Bureau of Investigation in its effort to solve the Osage murders, which were orchestrated by cattleman William Hale, played by Robert De Niro.



ANIMATION

CIRCLING THE GLOBE TO CAPTURE THE WORLD OF ANIMATION TODAY By TREVOR HOGG

TOP: Blue Eye Samurai is a Netflix series animated by Blue Spirit that further pushes the boundaries of 2D and 3D animation techniques. (Image courtesy of Netflix) OPPOSITE TOP TO BOTTOM: Folivari is branching out into computer animation with the Netflix series The Seven Bears. (Image courtesy of Folivari and Netflix) Sun Creature views animation as a vast canvas for storytelling and is responsible for the Cartoon Network series The Heroic Quest of the Valiant Prince Ivandoe. (Image courtesy of Hanna-Barbera Studios Europe Ltd.) Cartoon Saloon has distinguished itself by producing hand-drawn feature films such as My Father’s Dragon. (Image courtesy of Cartoon Saloon)

U.K & EUROPE

Following behind America and Japan as the third-largest animation producer is France, which is home to the Annecy International Animation Festival and public broadcaster France Télévisions, the biggest commissioner of animated content in France and Europe. Amongst this thriving industry that employs 7,790 people and generates over 500 graduates and six features annually is Blue Spirit Studio, which has studios in Paris, Angoulême and Montreal, and is responsible for My Life as a Zucchini and Arthur and the Children of the Round Table. “We have a lot of schools, producers and studios in a small country because the animation industry is under the Ministry of Culture,” remarks Olivier Lelardoux, CEO of Blue Spirit. “It’s not considered to be a pure private industry.” Not everything is Euro-centric when it comes to clients and partnerships, as Blue Spirit was responsible for animating Gremlins: Secrets of Mogwai for Max and Blue Eye Samurai for Netflix. “The big U.S. platforms are clearly leading the way of a new type of content, especially for adults. It will be interesting to see directors starting in the animation industry, directors from live action and directors from live events working with studios in order to invent new types of storytelling for adult content.” Game engines and AI are going to have a dramatic impact on the animation pipeline and workflow, Lelardoux says. “Soon you will have the possibility to keyframe directly in the game engines. It means that the animators will be in a totally immersive way to do animation in the future. Maybe AI will be the solution for the studio and producers to create content faster and cheaper. Artists and creators have to be able to trust studios, so that’s why we have to be careful with AI.”



Also located in Paris is the production company Folivari, which is known for the Ernest and Célestine movies and television series as well as The Big Bad Fox and Other Tales and The Summit of the Gods. “When talking about French animation, it is important to talk about co-productions, as it’s an ecosystem that goes beyond France,” states Thibaut Ruby, Managing Director of Fost Studio and Executive Producer for Folivari. “Co-productions make financing easier even though it takes longer because you need to make things in the right order and apply at a specific date. It also requires coordination across different studios; sometimes that is not easy because you have a director who needs to see people, so we have to have him traveling a bit.” Adult animation remains a niche market. Explains Ruby, “That’s why it’s so hard to raise a lot of money to make an [adult animated] movie the traditional way. It’s great that the streamers with their global audience can amalgamate all of those niche markets and hopefully make them profitable, as we’re pushing animation not as a genre but as a technique.” Technology is not static, nor are viewing habits. “Real-time CG is going to be the main game-changer for producing animation, and it seems to be improving in quality and cost. How it’s going to evolve is the blending of traditional narrative in feature film, with the viewer having the ability to change the story without making it like a video game,” Ruby states. Situated in Copenhagen, Sun Creature partnered with Monica Hellström at Final Cut for Real and director Jonas Poher Rasmussen to create the multi-Oscar-nominated Flee, which documents the struggles of a gay Afghan refugee on the verge of getting married in Denmark. “It took us five years to put the financing together and about two years to produce the film,” states Charlotte de La Gournerie, Co-Founder and Executive Producer



TOP TO BOTTOM: Lloyd of the Flies is the first computer-animated series produced entirely at the Bristol headquarters of Aardman. (Image courtesy of Aardman) Matchbox Mountain was involved with conceptualizing the “wolfvision” sequences in Wolfwalkers, with a pencil render being generated. (Image courtesy of Matchbox Mountain and Cartoon Saloon) El Guiri Studios was established during the pandemic, so it started off with everyone working remotely to create “Sith” for Star Wars: Visions Volume 2. (Image courtesy of Lucasfilm Ltd.)

at Sun Creature. “It has been an amazing experience at so many levels. We never thought that the film would be so successful.” Denmark is home to six million people, and the small population size is reflected in the Danish Film Institute supporting one feature film per year. “We don’t have a tax rebate compared to all the other countries in Europe. That is one of the reasons why we opened a second studio in France in 2020. It allowed us to strengthen our collaboration with French talents and to seek other sources of financing to stay competitive.” Animation provides a vast canvas for storytelling. Says de La Gournerie, “We really love that the medium allows us to explore different art directions and are keen on creating stories and worlds that appeal to an adult audience. But I also want to produce films that propose a point of view or tell a story that we don’t hear often. At the moment I am co-producing Julián [based on the book Julián is a Mermaid by Jessica Love] with Cartoon Saloon and Aircraft Pictures. It tells the story of a little boy who wants to be a mermaid and is a new take on Hans Christian Andersen’s fairy tale.” Cartoon Saloon has distinguished itself by producing hand-drawn features and has received Academy Award nominations for The Secret of Kells, Song of the Sea, The Breadwinner, Wolfwalkers and the short, Late Afternoon. “The design identity that is so strong, that you see in all of the films, is tied to Ireland and our mythology here,” states Fergal Brennan, Technical Director of Cartoon Saloon. “It’s good to have an identity, but an identity that allows you to progress, express, experiment and do new things, and not trap you in doing the same thing over and over again.” The remote workflow caused by the pandemic has not entirely provided open access to talent from around the world. Explains Brennan, “Potentially, but it’s not necessarily that straightforward



that you can now pick everyone from wherever. Often funding structures would be related to people having to be in the country or potentially be in-house.” Cartoon Saloon set up shop in Kilkenny instead of Dublin where most of the animated studios are located. He adds, “That allowed them to differentiate themselves, become more independent and take these risks like creating a hand-drawn feature film. At the time, The Secret of Kells was a big mountain to climb. They had to FedEx animation paper over from Brazil to their co-production partners.” Brennan loves the idea of mixing 2D and 3D techniques. “Certainly, a lot of possibilities are opening up, particularly because of how good Blender has gotten for 2D animation. Blender is free, open source, and has these grease pencil tools that are good for 2D as well as 3D.” Matchbox Mountain, which collaborated on the “wolfvision” sequences in Wolfwalkers, is based in the Irish town of Athlone. “It’s never been easier to find work by international teams,” notes Eimhin McNamara, Co-Founder/Director of Matchbox Mountain. “You can see how larger companies like Disney have capitalized on an interest in diverse styles and types of storytelling with their Star Wars: Visions series, casting numerous smaller independent studios to do their interpretations within that world.” Technology and remote workflows have lowered the point of entry for productions, McNamara says, “but there are still issues of resourcing and costs, which are more biased towards American-centric productions, especially when it comes to outsourced animation work to sub-contractor companies.” Ireland is in the midst of a housing shortage and cost-of-living crisis, he says. “As animation wages have yet to reflect the labor and value contributed, there is a huge issue of retention within the industry.” There is no in-house style, just a willingness to experiment. “We have

TOP TO BOTTOM: The Pack Studio developed a real-time pipeline for animation, VR and XR called SYNK when working on the feature Journey to Yourland with Plutoon Animation Studio and animation production company Bfilm. (Image courtesy of The Pack Studio and SYNK) Triggerfish aims to produce family entertainment such as the feature Seal Team and takes an agnostic approach when it comes to style or format. (Image courtesy of Triggerfish) 88 Pictures has set up a studio in Toronto that will help fill the talent gap. From “The Bandits of Golak” for Star Wars: Visions Volume 2. (Image courtesy of Lucasfilm Ltd.)

FALL 2023 VFXVOICE.COM • 43

PG 40-48 ANIMATION.indd 43

8/23/23 6:07 PM


ANIMATION

TOP TO BOTTOM: Studio Mir has developed a reputation for choreographing dramatic action sequences as found in the acclaimed series The Legend of Korra. (Image courtesy of Studio Mir) From “In the Stars” for Star Wars: Visions Volume 2. A concern for Gabriel Osorio at Punkrobot is that there are not enough jobs in the Chilean animation industry to support the number of animation graduates, leading to a talent drain. (Image courtesy of Lucasfilm Ltd.) The largest producer of animation in Latin America is Brazil, and situated in São Paulo is Split Studio, which is responsible for the family adventure series WeeBoom. (Image courtesy of Split Studio)

produced black-and-white period horror, cartoony animal fables and a whole mix of other stories using stop-motion methods, like oil paint on glass animation, cutout, sand animation and 3D puppetry, alongside 2D and 3D digital pipelines, as in my work with Cartoon Saloon’s Wolfwalkers and my recent collaboration with Dreamfeel on their next 3D video game.” Across the Irish Sea in Bristol, England, Aardman has excelled in stop-motion animation with the Wallace and Gromit franchise while branching into the realm of CG to create Lloyd of the Flies. “The underlying theme of anything with our name attached is that it should have a strong emotional story that has all the comedy and British quirkiness that you would expect from the studio,” remarks Sean Clarke, Managing Director at Aardman. “The U.K. continues to be a strong player in the European animation industry, with a recent BFI report in 2019 estimating a value of £1.6 billion generated and approximately 20,000 jobs. This is primarily down to a number of reasons around a rich, successful heritage in telling stories in the medium and strong infrastructure and expertise. There is an increasing challenge for the U.K. to remain on a par with other European countries with more competitive local tax incentives, and Brexit has meant that the European Media funding is no longer available to U.K. producers, which historically has been a key form of development and production financing.” The impact of streamers has been huge over the past five years. Notes Clarke, “The expansion of digital platforms and streaming services that look to acquire global rights has driven a more diverse global audience with a growing demand for different types of stories that can reflect different cultures, perspectives and experiences. 
Interestingly, our biggest global brand is Shaun the Sheep, which has been sold to over 170 territories worldwide, with two of our biggest markets being in China and Japan where we even have a theme-park experience and café.” A newcomer is two-year-old El Guiri Studios in Madrid, which was established during the COVID-19 pandemic. “It was almost like this mirage feeling of creating the short film [Sith for Star Wars: Visions Volume 2] and a studio at the same time, but not have everyone in the same location,” states Rodrigo Blaas, Co-Founder/Creative Director at El Guiri Studios. “What I learned from working at Pixar and DreamWorks is how important it is to bring to life an inanimate object or even a character that actually looks more human-like. The performance and the way that thought process and behavior changes are at the core of everything that I’m interested in animation.” Technology is not the critical aspect, says Blaas. “Being clear, having a strong script from the beginning and a core idea that is strong are more important than how to make it.” Animation is still in its infancy in Spain. “I don’t know if I would call it an industry yet,” he says. “The talent in Spain is resourceful because they always have to work with so little, but have ways to find the solution or at least have some initiative. The environment is growing in terms of what is happening in Madrid, but also how the tax rebates are now working better than ever and are getting more competitive.” Interesting technological developments have occurred in Europe, in particular when The Pack Studio in Belgium partnered



with Plutoon in Slovakia and Bfilm in the Czech Republic to make the family animated adventure Journey to Yourland, and in the process created a real-time pipeline for animation, XR and video games. “We had a script, but it became clear that it was over the budget that we could put together,” explains Tom Verbeek, Business Developer for SYNK and Content & Marketing Manager for The Pack Studio. “For example, there is a 15-minute water sequence, which usually becomes expensive. We decided to make the film in a game engine, and as we began to use Unity we realized this technology allows us to do a lot more within our budget. But there is not a playbook on how to make a feature animated film in a game engine. We started developing a workflow, made the film and were happy with the end result. That’s how we developed the SYNK product.”

AFRICA

In Africa, one finds Triggerfish Animation, which was founded in 1996 in Cape Town, South Africa. “It’s still a very young industry [in Africa] but growing very quickly,” remarks Stuart Forrest, CEO of Triggerfish. “By 2050 it’s estimated that there will be two billion people on the continent, of which the majority will be young and tech-savvy – and perfectly positioned to take advantage of the democratization of technology-driven animation. Triggerfish has enjoyed a great run as the leading platform for developing, producing, and bringing to market animated storytelling from African creators, and we’ll be continuing to search for fresh emerging talent from the continent, supporting their creative vision and packaging them for the global market. We’ve evolved from producing our own stories to becoming a company that supports other creators to tell their stories.” Projects include Disney+ anthologies Star Wars: Visions and Kizazi Moto: Generation Fire, as well as Disney Junior television series Kiya & the Kimoja Heroes and feature film Seal Team. “We work across kids and family and are agnostic to style or format. We tend to find a creator with a strong vision and then back that vision, whether it’s 2D, CG or any other medium. Our in-house specialty is currently CG, but we’re looking at ramping up 2D shortly.”

INDIA

In India, 88 Pictures has studios in Mumbai, Bangalore and Hyderabad, and most recently expanded into Canada to establish operations in Toronto. “The talent in India comes to us with below-average ability, and we train them and make them average creatively and technically,” states Milind D. Shinde, Founder/CEO of 88 Pictures. “For people like me who have worked at a big studio on $100 million-plus movies, the question is how do we make them feature-ready? That talent gap exists in North America and certain parts of Europe. What we decided as a studio is to be in Toronto to fill that 20% gap of creative and technical expertise, which will help to develop and empower Indian talent.” A readjustment is occurring in the marketplace, Shinde observes. “A year ago, things were booming with Netflix spending $10 billion [on content] while last week Disney laid off 600 people and MPC laid off 800 people, 400 in India and 300 in Montreal. The demand is not shrinking, but

TOP TO BOTTOM: My Life as a Zucchini, animated by Blue Spirit, became an international sensation, receiving Oscar and BAFTA nominations as well as dominating the 40th Annecy International Animated Film Festival. (Image courtesy of Blue Spirit) Ernest et Célestine: Le Voyage en Charabie, produced by Folivari, is one of the six animated features France releases each year. (Image courtesy of Folivari) Flee, animated by Sun Creature, made Oscar history by being nominated for Best Documentary Feature, Best Animated Feature Film and Best International Feature Film in 2022. (Image courtesy of Sun Creature) Cartoon Saloon has a distinct design identity tied to Irish mythology, which is reflected in its first feature The Secret of Kells. (Image courtesy of Cartoon Saloon)




everybody is pulling back and thinking, ‘What exactly do we want?’ This is because when I’m on Netflix my eyes don’t get stuck onto anything. There’s a lot of content that is not being watched.” 88 Pictures focuses on computer animation and has worked on Star Wars: Visions, Fast & Furious Spy Racers and Trollhunters: Tales of Arcadia. “I don’t have any experience in 2D, so I stick to creating a business that I know the best. 3D animation can have different styles for every project that we do.”

SOUTH KOREA

In East Asia, one country that has emerged from the shadow of Japan to become an animation powerhouse is South Korea. A major reason for this is the acclaimed work by Studio Mir, located in Seoul, on The Legend of Korra and Big Fish & Begonia. “When the streaming services came to the scene, it drastically changed the animation landscape,” remarks Studio Mir in a joint statement from the directors and vice president. “For Japan, their anime was able to reach broader audiences with improved budgets. Korean animation also saw opportunities and expanded its market worldwide.” A major struggle was transitioning from pencil and paper to a fully digital pipeline. The statement continues, “Until mid-2010, most artists drew on paper by hand. We were one of the first studios in Korea that went fully digital. There were a lot of learning curves. It was most challenging for the veteran artists, so as a studio we invested in the training system to help their transition.” Interestingly, a current trend is retro-style animation. Explains Studio Mir, “Traditional hand-drawn aesthetics are getting the limelight again because they represent the fundamental skills and values, while the visualization techniques are reaching their peak, with CGI creating almost any requested image. But we are not sure how long this trend will last. Ironically, the studios are leaning more toward technical developments with the limited pool of experienced artists. Speaking of technological trends, we should mention AI. It is a groundbreaking tool for visualization work, and we are also investing resources to experiment with the tool. However, we only see AI as a tool to assist with the needs of artists and retro aesthetics. It will not replace humans.”

TOP TO BOTTOM: Aardman has gained worldwide recognition for its clever use of stop-motion animation in shorts and features such as Chicken Run: Dawn of the Nugget. (Image courtesy of Aardman and Netflix) Character animation is a fundamental aspect of storytelling, believes Rodrigo Blaas of El Guiri Studios, making it a priority for projects such as “Sith” for Star Wars: Visions Volume 2. (Image courtesy of Lucasfilm Ltd.) A driving force for animating using the Unity game engine and the creation of the SYNK pipeline was the amount of simulation work in Journey to Yourland. (Image courtesy of The Pack Studio and SYNK) “Enkai” is one of the 10 sci-fi and fantasy stories told from an African perspective found in the Disney+ and Triggerfish anthology series Kizazi Moto: Generation Fire. (Image courtesy of Disney+)

LATIN AMERICA

“The Oscar for Bear Story [for Best Animated Short Film, 2016] was a great thing not just for me but for the whole Latin American animation community, as we realized that it was possible to create animation that can go outside of Latin America,” observes Gabriel Osorio Vargas, Co-Founder, Director and CG Supervisor of Punkrobot Animation Studio in Santiago, Chile. “We have our own way, but at the same time influenced by Hollywood, and we were colonized by Europeans, so we have that culture too. It’s an interesting mix and that’s one of the things that makes our identity. I try to embrace that.” Punkrobot is a family-oriented animation studio responsible for Wow Lisa, which is about a curious little mouse that believes that everything can be a treasure when you look close enough. “We always try to give something back to society with our stories; that’s something which defines us.” Government funding is critical, Vargas says. “It’s the main or maybe the only






TOP TO BOTTOM: 88 Pictures mixed Indian culture, Star Wars mythology and universal themes to create “The Bandits of Golak” for Star Wars: Visions Volume 2. (Image courtesy of Lucasfilm Ltd.) The animation landscape dramatically changed in Japan and Korea upon the arrival of streaming services, which provided greater exposure and commissioned content such as Dota: Dragon’s Blood from Studio Mir. (Image courtesy of Studio Mir and Netflix) Punkrobot hopes that high-profile projects like “In the Stars” for Star Wars: Visions Volume 2 will encourage private investment in the Chilean animation industry. (Image courtesy of Lucasfilm Ltd.) Government support through funding and quotas led to a boom in the Brazilian animation industry, which in turn enabled Split Studio to produce the 3D platform game Wizavior. (Image courtesy of Split Studio)

way you have to create your own content, because it’s difficult to tell your stories through advertising, as you have to manage what the client wants to do. Also, private investors in Chile are more interested in copper and wine, industries that are more established. I hope this will change as they realize there are good studios making animation; that is what makes us happy about the Star Wars: Visions experience, because we were asked to do good animation and we delivered.” The largest producer of animation in Latin America is Brazil, and situated in São Paulo is Split Studio, which created the feature Tito and the Birds and TV series WeeBoom. “In 2011, Brazil developed a quota for independent Brazilian content in cable TV and the means to get them financed, which allowed a substantial amount of original Brazilian content to see the light of day,” states Jonas Brandão, Co-Founder and Business Development for Split Studio. “Therefore, that also promoted a boom in the local industry, with the sector’s rise in employment and analog ecosystems [such as schools]. The high demand also brought many studios to the market, and some of them became known globally, including Split Studio. We started in 2009 with four people and nowadays we’re one of the leading studios in the country, in operation with about 150 people currently, and with productions nominated for awards such as the Annies, Annecy and the Oscars in our portfolio.” Brazilian culture is seen as an aspect of the storytelling. States Brandão, “Split Studio, as a Brazilian-originated studio, has a set of upcoming projects with a Brazilian feel. However, that’s all in the background, as the core of these stories relies on stories with relatable universal human conflicts. An example of this is Among the Stars, an original IP we’re developing in video game and feature film formats. 
It tells the story of two young indigenous sisters who get separated from each other after a violent attack by land grabbers on their village. While one sister ends up in the spiritual world and is trying to return to the human world, the other fights the land grabbers on her own, as she thinks the criminals have her sister. It’s a very exciting journey, set in Brazil, with many native Brazilian cultural layers. Still, in the end, it comes down to a beautiful journey of two sisters trying to find each other, which many people can relate to.” What has become clear by talking to animation studios from Europe, Africa, Asia and Latin America is that while streamers have had a huge impact on the increased demand for animated content, government grants and tax incentives remain critical in supporting the industry domestically and internationally. Co-productions have become a necessary practice to be able to pool together the required financial and artistic resources and have made Europe a cultural mosaic of storytelling. The technological future will see the proliferation of real-time animation and the adoption of AI and machine learning to fill in the efficiency gaps. This will lay the foundation for the next platform that enables individuals to be participants rather than viewers in an organic manner. Something that will remain constant is the desire to communicate with each other, tap into universal emotions and explore the vastness of the human imagination.





PROFILE

LIZ BERNARD FINDS FASCINATION IN THE DETAILS OF CHARACTER ANIMATION By TREVOR HOGG

Images courtesy of Liz Bernard. TOP: Liz Bernard, Senior Animation Supervisor, Digital Domain OPPOSITE LEFT TO RIGHT: A precocious three-year-old living in Goochland, Virginia, where her parents were graphic designers. A memorable trip for the eight-year-old was visiting the Who Framed Roger Rabbit attraction at MGM Studios in Orlando, Florida. Lemony Snicket’s A Series of Unfortunate Events was an example of where Bernard used origami to illustrate what she needed in a rig for a character. (Image courtesy of Digital Domain and Paramount Pictures) Power Rangers provided numerous places for Bernard to be creative, in particular with the Zords. (Image courtesy of Digital Domain and Lionsgate)

All one has to do is look at the tattoo that encompasses the upper half of her left arm to realize how much love Digital Domain Senior Animation Supervisor Liz Bernard has for character animation. “I’m nerdy and the tattoo is from Ender’s Game. The Alien Queen was a big opportunity for me to design the movement of a character on my own for the first time. In the book and movie, she represents this cross-cultural understanding and empathy in the way she communicates telepathically with Ender across this great expanse of space. Getting her to emote was quite challenging. You had to add a lot in the eyes and have subtle facial details that indicated she wasn’t threatening while still looking quite alien and different, so that she would have this internal struggle to understand why Ender reacted the way he did to her.” Bernard was born and raised in Goochland, Virginia, where her parents were graphic designers who ran their own company. “I had access to all of the good art supplies from the beginning of my life, which was funny because I would go over to other kids’ houses and ask, ‘Can you show me where the X-Acto knife is?’ The parents would be like, ‘X-Acto knife? But you’re nine. We can’t give you that!’” Art lessons involved being exposed to practical and digital techniques. “My parents were doing this in the 1980s when typesetting was still done. You would print it out, cut out the block of text and then spray glue it onto a big board; that’s how you would create a newspaper layout or ad. Then you would take that whole thing and photograph it with this giant camera that I was fascinated by as a child. All of that stuff was interesting to be around. 
My dad was at the forefront of all of this computer stuff and was teaching me Photoshop as he was learning it too.” Much of her free time in high school was spent taking part in theatrical projects such as building sets, doing lighting design and running spotlights, an interest that carried on into post-secondary education, with Bernard earning a Bachelor of Arts in Drama from the University of Virginia, as her goal was to become a lighting designer. However, the lack of job opportunities caused a rethink of her chosen career path. “Theatre is such a good way to understand things from a bigger picture because you’re telling an entire story every night, so you can’t get too bogged down in minutiae. You have to look at things holistically. When you’re given a visual effects sequence in film, there are criteria that have to be hit: These are the characters, this is what they’re supposed to be doing; we know what the script is, that is the location, and you have a series of puzzles that need to be solved. I like that so much partly because I started solving those same kinds of puzzles in theatre at the beginning of my career.” While running her own photography business, Bernard decided to get diplomas in Advanced Character Animation and Animal and Creature Animation from Animation Mentor. “There are other schools out there now that have emulated the same approach where they find someone actually in the industry to teach you the real way things are done, not just the principles of animation in a vacuum. You also get instant networking, which is critical for any field. Nicole Herr [then]



at Sony Imageworks [now Animation Supervisor at Halon Entertainment] taught the ‘Intro to Acting’ class and gave me good practical advice, which is, you have to go where the work is.” According to Bernard, when it comes to demo reels, a particular element should be addressed. “I see a lot of stuff that is more cartoony style, or a character doing an acting performance on a CG background. One thing that we’re always looking for is, do you know how to work with a plate that was shot on location, has a tracked camera, and be able to put the character in that environment? Because that’s most of what we do in visual effects,” she says.




TOP: Ender’s Game was a major opportunity for Bernard to get involved in designing the motion of characters. (Image courtesy of Lionsgate and Digital Domain) MIDDLE LEFT: Motion capture was helpful to the animation process in quickly shaping and blocking out the performance in She-Hulk, while two head-mounted cameras provided facial capture footage. (Image courtesy of Marvel Studios) MIDDLE RIGHT AND BOTTOM: Maintaining the animation style for She-Hulk was made easier by the performance provided by Tatiana Maslany. (Images courtesy of Marvel Studios)

Bernard broke into the visual effects industry when she was hired as an animator by Arconyx Animation, where she remotely created creature animation on plates for Season 2 of Finding Bigfoot. “It was run by Kenny Roy, who taught me the Animal and Creature course that I took at Animation Mentor.” A major turning point for the up-and-coming animator was being hired by Digital Domain in Vancouver, and over the past 12 years she has thrived, going from doing background animation on Jack the Giant Slayer to Senior Animation Supervisor on She-Hulk: Attorney at Law. “The job of a supervisor involves a lot of translating. You’re translating between different teams, departments and individuals within a show. Much of that is figuring out what are the tools that can be made for any given show, or for the studio as a whole, that would make animation more productive, faster and less frustrating, and how to communicate that to the coders who write these tools for us. The foundation is breaking everything down to the fundamentals so that we all speak the same language.” Maintaining continuity in the animation style is essential. “Sometimes we’ll all have dailies together so everyone can hear everyone else’s notes,” Bernard remarks. “She-Hulk was more



straightforward because we had this golden reference and an amazing performance from Tatiana Maslany that we were trying to follow, if not literally then at least the spirit of, in most cases.” Motion capture can get you to 80% of the performance quickly and makes blocking straightforward. “Mocap has a tendency to clarify what the shot should be, so the client is less likely to explore other options because they’ve selected the performance already. When you have motion capture you have to get things to a high level of polish, and there is a lot of stuff that gets left behind. Fingers are often not motion captured, so that’s all done by hand. Facial animation is its own whole thing that is some of the most challenging and tricky animation to nail down because as humans we are so attuned to the human face.” There are certain emotions that are considered to be universal. “One of the things I love about animation is that it’s not just about one field,” Bernard states. “It’s multi-disciplinary. You’re focused on physics, acting, shot composition and all sorts of specialized things. You might work on a show about how sailing boats work, so you need to understand the physics of that. With faces we spend a long time looking at the character’s face. On She-Hulk, it took a

TOP LEFT, TOP RIGHT AND BOTTOM LEFT: Origami is a hobby that has proven useful to Bernard in solving animation problems. (Mammoth design by Satoshi Kamiya) MIDDLE RIGHT: Bernard conducted personal research to learn more about animating the wolves for Disney’s 2017 Beauty and the Beast. BOTTOM RIGHT: A Bernard photograph of an iguana living on the Pacific Coast of Costa Rica.




“One of the things I love about animation is that it’s not just about one field. It’s multi-disciplinary. You’re focused on physics, acting, shot composition and all sorts of specialized things. You might work on a show about how sailing boats work, so you need to understand the physics of that.” —Liz Bernard, Senior Animation Supervisor, Digital Domain

TOP: While living in Ouagadougou, Burkina Faso in West Africa, Bernard photographed a girl clapping to the music at a local festival. BOTTOM: While in Mali, Bernard captured the performance of Dogon dancers.

while before we understood what she was supposed to look like, which is important for being able to give notes to other people. It takes a while to internalize the bone structure and different facial shapes; those are so specific to that actress that it doesn’t happen overnight. You do have to spend some time immersing yourself in that character, looking at all of the available footage and working with it.” Deepfakes are more prevalent as AI and machine learning technology advances in sophistication. “It’s so much in the news right now,” Bernard notes. “I have no doubt that deepfakes are going to change our industry dramatically in the next five years. It doesn’t feel like it has made a huge impact quite yet, and part of the reason is that it’s hard to art direct deepfakes. If there are little tweaks or performance changes requested, it doesn’t always turn out the way you expected because the deepfakes are a black box scenario. You don’t always know what you’re going to get out of it. There is still a lot of artistry and ‘eye’ that you have to apply to deepfakes, too. We did a bit of work with Charlatan, which is our in-house proprietary system for deepfakes for facial replacements at Digital Domain. We did some of that on She-Hulk, especially on a character called Titania played by Jameela Jamil, and noticed after the first couple of rounds of Charlatan takes that something was wrong – her whole face was too small on her head!” Creative problem-solving is something that Bernard enjoys. “A hobby that I kept from my childhood is origami. In Lemony Snicket’s A Series of Unfortunate Events, there was this character who had mechanical dragonfly wings that were supposed to deploy from a backpack. Because my task was to figure out how they come out of the backpack, I put the wings down from a front view and cut out paper that was in the exact same shape and proportions of the final model. 
Then I figured out through much trial and error how to fold them up with the actual paper. I demonstrated deploying them to my visual effects supervisor and he said, ‘That’s awesome! Let’s show the client.’ The client loved it. When I had my briefing with the rigger, I gave her this little crumpled up piece of paper and said, ‘Make it do this!’” For Bernard, no project is exactly the same. “The reason why I’m still interested in animation is because every show is an opportunity to learn about something new,” Bernard observes. “One of my favorites was Power Rangers because oftentimes you come onto a project, all of the decisions have been made and you’re there to execute the plan. But on Power Rangers, there were numerous places to still be creative and come up with things to pitch to the client. It seemed like everything that we pitched they were like, ‘That’s awesome.’” Bernard concludes, “It’s all about finding balances between things. Storytelling and physical accuracy – that’s often something where we have to split the difference and figure out how we make this look real. If it looks too real it might be boring, so it needs to be interesting and also tell the story and maybe emote. All of that stuff together is what forms the restraints on each one of these shots that is a puzzle to solve. I like that process.”





FILM

TAPPING INTO THE TRUE GRIT OF THE EQUALIZER 3 By TREVOR HOGG

Images courtesy of Columbia Pictures/Sony. TOP: Denzel Washington and director Antoine Fuqua enjoy a moment of levity in-between action-packed set pieces. (Photo: Stefano Montesi) OPPOSITE TOP: The crew at work in Minori, Italy for the driving shot of villain Vincent (Andrea Scarduzio) in the SUV. (Photo: Stefano Montesi) OPPOSITE BOTTOM: Stunt Coordinator Liang Yang and his team bring up the stunt person on set. (Photo: Stefano Montesi)

For those wishing to return to the days when effects were predominantly done practically, The Equalizer 3 will be a joy to watch, as the mandate of filmmaker Antoine Fuqua (Training Day) and Cinematographer Robert Richardson (Snow Falling on Cedars) was to capture as much in camera as possible. “The most dramatic thing is the explosion, when one of our leads is impacted by that, and the whole environment she is walking in is essentially digitally created,” states Editor Conrad Buff IV, who assembled the original outing of Denzel Washington as Robert McCall, a former government agent turned vigilante. “Most of it is sweetening existing shots. In some cases, adding backgrounds to moving vehicles and dialogue going on, or cleanup in certain scenes in Italy where we would want to remove some contemporary pieces of equipment on a beautiful old building, like a satellite dish or a communication device. It’s not a strong visual effects film and yet we still have several hundred effects shots.” Principal photography took place in Italy with Atrani standing in for the fictional Italian town that McCall decides to call home. “Fortunately, the town in the story is not a postcard place that everyone knows about,” notes Production Designer Naomi Shonan. “It’s a place where lives are lived on a small scale, and what we wanted to produce is that feeling of community and intimacy. Atrani has 700 inhabitants and 90% of them are related. You’re trying to make sure that everything in the script is there. That store is such a distance from this. At the same time, you’re being instructed and limited by the real place. I always find that limitations are a source of creativity. Open-ended is not good. If



you pretend that you’re infinitely creative, I don’t think you end up with anything that is grounded, and what’s the point?” Some scenes were shot at Cinecittà Studios in Rome. Notes Shonan, “Plates were taken on the location that the sets were supposedly in. The sets were heavily referenced on the real locations because I always find you learn so much from wherever you are. It’s like an anthropology of Italy. You don’t want to pretend that it’s something else. You want to mine it for all it’s worth.” Clarity in action sequences is essential. “I’m not a fan of things that are so absurdly overcut that are frame, frame, frame,” Buff remarks. “We’re not really watching, but receiving an impression. I’m more old school. I prefer seeing as much as possible what the action is and not trying to mask it with energetic cuts.” A blueprint was already ingrained into the footage. “There was a stunt and action unit that worked with Antoine and choreographed some of the sequences. They were choreographed tightly, so that when I started to receive them for real, I was obliged to essentially follow the pattern that had been established by Liang Yang [Stunt Coordinator and Action Unit Director]. In this film, a lot of it was predetermined,” Buff says. Stuntvis was captured onstage based on the script. “There are always challenges when we find out what the location is going to be,” Yang states. “For example, a fast-traveling van hits the villain and crashes onto the church wall. I was really stressed to get the rig I needed and to think of the solution to make the van crash work, because in Rome every single wall and building is historical, so you cannot damage them, and yet I still had to make everything look realistic.”

Only three sequences required stunt wirework. “One was the van crash where we needed a wire to pull the person into the wall,” Yang reveals. “Another one was the car explosion and the stunt double flying backwards. The third is the person being thrown out of the window from the fourth floor. When we do those things, we make sure that the stunt looks realistic but is safe at the same time.”

Each project requires different action and fight choreography. “My trick is to analyze the script inside out, find the key element for that particular action sequence and understand the character’s motivation. Antoine wanted McCall’s fight movements to be very brutal, fast and intended to kill. This is because the villains in this story are truly evil, and McCall wants them to receive the punishment they deserve,” Yang says.

Between 550 and 600 visual effects shots were created by Blackbird VFX, Frame by Frame, Future Associate, Luma Pictures, ModelFarm, slatevfx, Supervixen Studios and an in-house team. “The significance that visual effects are playing in this movie is probably more than I anticipated, but that makes for more fun,” remarks James McQuaide, VFX Producer and VFX Supervisor. “An interesting challenge was that [Cinematographer] Robert Richardson has custom lenses, which have focal lengths that aren’t necessarily accurate. Several times we were trying to shoot these elements and construct things practically where the focal length between his lenses and the other lenses didn’t match; that required a certain figure-it-out-as-you-go process, which I hadn’t run into previously.”

Details were incorporated into the imagery, McQuaide adds. “A small example is when we added clothes lines in the small

Amalfi town to take what Naomi had done and make it prettier,” he says. Theatrical muzzle flashes and blood hits were not an option. “They had to underscore what was going on, but Antoine was most interested in accuracy. All of the muzzle flashes that you see in the pictures were shot with those guns. There is little blood in the movie that is practical,” McQuaide notes.

Evercast was utilized to discuss visual effects shots with Buff and Fuqua, while vendor notes were sent via PIX. “PIX is handy because vendors can clearly see what James is talking about, and it is frame accurate,” remarks Visual Effects Editor Steven Cuellar. “We shot an actor in a car on a grey screen, so I was able to put in background plates to make it look like he is being driven

OPPOSITE TOP: A man (stunt double Gabriele Scilla) in a wheelchair throws himself out the window to protest Marco’s eviction. (Photo: Stefano Montesi) OPPOSITE BOTTOM: Denzel Washington reprises his role as Robert McCall for the final installment of The Equalizer trilogy. (Photo: Stefano Montesi) TOP: The greenscreen shoot for the car explosion that takes place outside a police station in Naples. (Photo: Stefano Montesi) BOTTOM: VFX Producer/VFX Supervisor James McQuaide shoots background plates with the ACHTEL camera for the Naples car explosion scene.

throughout Italy. I would do my own test comp inside the Avid and present it to Conrad; he would give me directions, so I would refine it and give it back to him. When it gets to something like bullet hits or blood splatters, that goes above me, and we send those over to postvis!”

Postvis was a critical part of the editorial and visual effects process, Cuellar says. “James has a great team of postvis artists who are able to create close to final-quality visual effects for us and has been instrumental in shaping and designing, not just one particular shot but certain sequences. It has definitely been great to have that tool at our disposal to help shape and fix things as Conrad thinks of something. These postvis artists are able to do it relatively quickly and get it back to him so he can review and cut it in.”

The high-quality postvis was used for studio screenings and was extremely cost-effective, McQuaide explains. “It allows you to wait until the cut begins to solidify, and therefore not throw away a lot of shots. We’re using Company3’s Portal. Everything is online and sent to the vendor within 24 hours or less. It turns over in three or four days.”

TOP: The stunt team carefully choreographed the fight sequences by shooting stuntvis to inform the final cinematic result. (Photo: Stefano Montesi) SECOND FROM TOP: The van crash was tricky to accomplish, as the necessary equipment had to be improvised and the walls of the church were considered to be historical, so they could not be damaged. (Photo: Stefano Montesi) BOTTOM THREE: Amongst the invisible visual effects work is the addition of CG vehicles to the plate photography.

Stable Diffusion was integrated into Luma Pictures’ Maya and Houdini pipeline to achieve the desired face replacements. “We are using the Stable Diffusion method to rapidly prototype how a face could look in a plate lighting context,” remarks Brendan Seals, VFX Supervisor at Luma Pictures. “We are still sculpting expressions and certain blend shapes for the base mesh that is being fed into Stable Diffusion via a normal map. We are boxing the AI into certain boundaries that conform to the proportions of Denzel Washington’s face. That’s important because it’s all too easy now to generate Denzel circa 10 to 15 years ago, but we had to be faithful to Denzel of today and the performance that he’s put into this film. A MetaHuman version of Denzel was also built using Epic Games’ Unreal Engine. We were trying to come up with a few different techniques, put them all together and then give it to compositing as a component of the overall workflow.” The Stable Diffusion methodology was applied when the camera zooms into the eye of the protagonist for the McCall Vision sequence. “It’s a mixed bag of high-resolution photography

provided to us of Denzel in the scanning booth and AI to up-res the photography of him lying down.”

A key visual effects moment is when a car bomb goes off in a piazza outside of a police station. “That was the big job that we had to do because everything was shot in many different locations in Rome and Naples,” remarks Fabio Cerrito, Head of VFX at Frame by Frame. “We built a rough environment based on the LiDAR scans that had been given in order to use it to cast the shadows on top of everything. We had some debris coming from the real explosion, but they didn’t give us the proper brutal force that type of explosion would have. We had to recreate and choreograph them in CG. There are a lot of cars that react to the explosion by shaking and the tempered glass cracks into a typical spider web. We had to add tons of different smoke details. The main plate was shot in normal daily life where people are walking the dog or buying a newspaper. The thing that we had to be careful of was having people in the background reacting to the explosion. It is another small touch, but makes the effect look believable.”

190-degree 9.3K plate photography of the piazza in Naples was captured using the ACHTEL 9x7 Digital Cinema Camera and Entaniya HAL 220 LF fisheye lens. “One of the things that the camera can do is, I can login to the camera from anywhere in the world and completely take over,” remarks Pawel Achtel, Director and Producer with ACHTEL Pty Limited. “James was telling me when to roll and cut the camera. I was ensuring that the camera exposure and settings were correct. Then, at the end of the day, I was able to back up the footage and transcode it to what they needed remotely from Sydney. James joked that DPs work from home these days!”

A major concern had to be addressed, Achtel adds. “There was a massive difference between the shadowy areas and this white billboard that the sun was hitting. James wasn’t sure whether that stuff would be clipped.
I intentionally shot it with slightly clipped highlights so we could get better exposure in the shadows, and reconstructed the highlights 100% with all of the detail and color matching, which was quite extraordinary.”

Five weeks before locking the picture, ideas were still being tried out. “The bad guys are involved with some bombs going off in Rome, but we only see the aftermath,” McQuaide states. “We made a temp shot from stock footage of a train station in Rome for Antoine to take a look at, where we actually see the explosion from a surveillance camera. By putting that in the movie you are increasing the stakes. You now really hate the villain because you’re seeing innocent people being killed by these bomb explosions. Sometimes, it’s putting people physically in danger, but in this case it’s something more abstract than that but still important. You’re actually seeing why the bad guy gets the punishment he gets in the end.”

A music analogy accurately describes the visual effects work in The Equalizer 3. “We’re Charlie Watts of the Rolling Stones, not Mick Jagger,” McQuaide observes. “When you’re doing a creature-driven movie, then you’re front and center; however, here we’re supporting the practical photography. The biggest challenge was trying to dust off that part of your brain that developed when CG wasn’t an option and you had to rely on practical. It has come together quite well.”
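Achtel's approach, deliberately allowing slight highlight clipping to protect shadow detail and rebuilding the clipped values afterwards, can be illustrated with a toy example. The heuristic below is our own simplification, not ACHTEL's actual reconstruction: where exactly one channel clips, it extrapolates that channel from the surviving ones.

```python
import numpy as np

def reconstruct_clipped(rgb, clip=1.0):
    """Toy highlight recovery: where exactly one channel has clipped,
    extrapolate it from the headroom of the unclipped channels."""
    out = rgb.astype(float).copy()
    clipped = out >= clip
    one_clipped = clipped.sum(axis=-1) == 1
    # The brightest surviving channel hints at how hot the pixel really was.
    max_unclipped = np.where(clipped, 0.0, out).max(axis=-1)
    est = clip * (1.0 + np.clip(max_unclipped / clip, 0.0, 1.0))
    for c in range(3):
        sel = one_clipped & clipped[..., c]
        out[..., c][sel] = est[sel]
    return out
```

A real pipeline would work in linear camera RGB and match color against neighboring unclipped pixels; this sketch only shows the shape of the idea.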

TOP TO BOTTOM: Rising Sun Pictures utilized AI to get the necessary detail for the eye of Denzel Washington in the McCall Vision scene.
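Luma's description of feeding a sculpted base mesh into Stable Diffusion "via a normal map" implies a mesh-to-normal-map baking step. Below is a minimal, hypothetical sketch of the geometry side: per-vertex normals encoded into the usual 0-to-1 RGB range. An actual pipeline would rasterize this from Maya or Houdini at render resolution.

```python
import numpy as np

def vertex_normals(verts, faces):
    """Area-weighted per-vertex normals for a triangle mesh."""
    v0, v1, v2 = (verts[faces[:, i]] for i in range(3))
    face_n = np.cross(v1 - v0, v2 - v0)      # length is ~2x triangle area
    vn = np.zeros_like(verts, dtype=float)
    for i in range(3):
        np.add.at(vn, faces[:, i], face_n)   # accumulate onto shared vertices
    return vn / np.linalg.norm(vn, axis=1, keepdims=True)

def encode_normal_map(normals):
    """Pack unit normals into the familiar 0..1 'normal map' RGB encoding."""
    return normals * 0.5 + 0.5
```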

TECH & TOOLS

REALIZING THE SKY-HIGH POTENTIAL OF THE CLOUD By TREVOR HOGG

TOP AND BOTTOM: Major productions that were rendered on Amazon Web Services included Avatar: The Way of Water and Lord of the Rings: The Rings of Power. (Avatar 2 image courtesy of Weta FX and 20th Century Studios. LOTR: The Rings of Power courtesy of Amazon Studios) OPPOSITE TOP TO BOTTOM: RebusFarm Farminizer Software is an advanced plugin for 3D software designed to make it easy to send error-corrected renders. (Image courtesy of RebusFarm)

Establishing a network of mainframe computers that could be accessed by multiple users dates back to the 1950s, when schools and corporations were provided access by technology companies such as IBM. These concepts of “server rooms” and “dumb terminals” are the central components of what has become known as cloud services. As usage costs have become more affordable, cloud services have seen widespread adoption and integration, paving the way for a virtual highway that will enable life-like digital interaction.

Dominating the marketplace, which is still far from achieving its full potential, are Amazon Web Services, Microsoft Azure and Google Cloud Platform, with three models of service on offer: software, infrastructure and platform. “All three models are important and part of the same ecosystem,” notes Gretchen Libby, Director of Visual Computing for Amazon Web Services. “Platform providers are enabling virtual workstations, while software providers facilitate pixel streaming, and we provide infrastructure at scale. They’re all key components to cloud-based workflows. We’re excited to continue partnering with independent software vendors and solutions integrators to evolve and adapt to our customers’ needs.”

Two significant areas of concern are cost and security. “We have always countered the cost spiral by operating our own data center,” explains Ralph Huchtemann, CEO of RebusFarm. “This means that we do not have to pass on the high fees of the large billion-dollar companies to our customers. In addition, having our own data center prevents third-party access. This makes a big difference compared to rendering providers who only rent the necessary hardware from the big players like AWS, GCP or Azure. We have full control over the data and can guarantee that there is no third-party access. Secondly, we have ensured complete redundancy of hardware resources. All components in the data center are designed to be fail-safe.
The power supply, Internet connection of

the data center, the cooling, the central servers for monitoring the processes, all HDD- and NVME HDD storage and all other components are redundant and guarantee a smooth operation 24/7.”

Typically, data is stored in a particular region for the selected cloud provider. “If a studio needs to move that data, either to a different region or another cloud provider, it will likely incur egress fees, and they can be significant,” states Mac Moore, Head of Media & Entertainment at CoreWeave. “CoreWeave is the exception here and allows users to move data out to other cloud providers at no cost. Hefty egress charges break down the business model for using the cloud in VFX and animation. Part of MovieLabs’ 2030 Vision [white paper] for new technologies in content production is an emphasis on interoperability and open standards. I think when there’s improved interoperability and limited egress between cloud providers and regions so that data can flow indeterminately, it will greatly benefit cloud users from a cost and flexibility perspective.”

Scalability and elasticity of resources for computation and rendering are a major benefit. “It’s important to note that the small margins in visual effects are a challenge, and this has been the case for as long as I can remember due to the unpredictable nature of those workloads,” Libby states. “Delayed decision-making makes it tough for visual effects companies to predict render capacity needs. The cloud has helped mitigate some of those historical bottlenecks by enabling studios to work in a more elastic way. It also provides studios with the option to expand and contract when needed and removes barriers to entry. A big capex investment isn’t required to get started; more work can be done and at a faster pace when needed with the cloud.”

The core technology behind cloud computing in visual effects and animation has been in place for a decade. “The true evolution is how studios are accessing the technology,” Moore says.
“Early on, people looked to the cloud as they needed more machines, and

TOP TO BOTTOM: RebusFarm counters the spiraling costs of rendering by operating its own data center. (Images courtesy of RebusFarm)

they treated them like on-premises machines in that they’d spin up cloud resources and leave them going. This approach was incredibly inefficient and led to cost overruns. Visual effects companies already have such limited margins that burning machines like that was highly problematic. What’s helped mitigate some of these early inefficiencies is the rise of services to help automate the scaling process, both up and down; in parallel, more studios are building out staff experienced with using the cloud. There are also solutions providers dedicated to helping studios set up cloud-enabled workflows.”

Globalization of the workforce is driving the adoption of cloud computing. “Maybe the term digital transformation has been used too much, but people are gradually moving towards cloud computing and services because there are more distributed teams working with offices around the world and clients around the world; everything is connected. It’s global, not so domestic,” observes Vladimir Dragoev, Product Manager at Chaos. “It’s not just the pandemic that contributed to this digital transformation. When working with a local render farm, teams have to consider a lot of additional work around it such as maintenance, provisioning, updates and installation. We have clients who comprehend that and want to offload this load and maintenance cost, and cloud computing is good at that.”

Cloud services are democratizing technology. “The visual effects and animation industries have long been dominated by giant studios with access to expensive hardware and software,” states Le Quang Hieu, CEO of iRender. “But that is no longer the case. The introduction of online render farms/cloud rendering services has made those resources accessible to everyone.”

“Moving to the cloud will help unify workflows and data across the production pipeline, enabling teams to access and manage their assets wherever and whenever they’re needed,” notes Paolo Tamburrino, Senior Industry Strategy Manager for Autodesk.
“At Autodesk, we envision a future where data flows seamlessly across teams, and film/animation studios as well as visual effects facilities can collaborate better, so artists stay in the creative zone, focused on creating high-quality content rather than rebuilding assets from project to project. Autodesk is working on Autodesk Flow, our industry cloud for the media & entertainment industry. It will further connect workflows, data and teams across the entire production pipeline to a single secure source of truth for all assets, versions and feedback. With a foundation built on open standards, studios will also be able to customize Flow for their unique production needs and workflows.”

Client input has been essential. “To ensure that we are always aware of and able to support our clients through these rapid market changes, Autodesk M&E business is investing in cloud. Flow will connect existing content creation and production management capabilities with new platform-native capabilities to unify and extend customers’ workflows on the desktop and in the cloud. Maya, ShotGrid and Moxion will be the first to be integrated to create new extended workflows across the entire production life cycle. Enabled by open standards, Autodesk Flow will allow our existing tools, as well as third-party tools and APIs, to plug into an open ecosystem to speed innovation,” Tamburrino says.

“We manage the workflow from submitting the scene to the cloud computers,” Dragoev explains. “We start a virtual machine, provision

it, load the scene, do other pre-steps, start V-Ray, which after rendering the scene generates image outputs. After that, we store the images, and users can download them in several different ways [including integration with cloud storage providers like Google Drive, OneDrive and Dropbox]. Chaos Cloud also offers traditional services like sharing VR previews with end clients. There are all sorts of things that we do. It’s not strictly for rendering.”

Ease of use is a priority. “The most important thing is to provide a simple and smooth but powerful process,” Huchtemann observes. “We already achieve this through our advanced interface. There is beauty in simplicity. The customers basically just want to press one button and get what they need, which is fast rendering. But we will soon release our new interface, which will simplify and empower the workflow even more. We are already good at that, but the new version will have some additional features that people will like, and many performance optimizations. There will be a new feature that allows collaboration with teams of any size and at the same time gives full cost control to the budget manager.”

It is also important to keep the UI unobtrusive. “In terms of workflow, users can render their images without too much hassle or having to go through a lot of cumbersome steps,” Dragoev remarks. “We’re trying to have fewer steps and settings that users have to customize. That’s why, for example, we don’t

have job prioritization. We treat everyone with the same highest priority and start each job as soon as possible on its own dedicated machine. Every user benefits and there’s no wait time for them. Because we strive to have a UX that helps, we have only one button in the Chaos Cloud Rendering integrations in each supported DCC or CAD application. We also have a live preview of the rendering so you can pause or stop it before completion. Chaos Cloud Rendering also offers a no-UI Submit interface, if users are confident enough to start the rendering directly without customizing any settings. But the majority of the workflows still involve going through the submission page and adjusting the available settings.”

Real-time, AI and machine learning have a role to play. “Real-time is being adopted more than ever, driven by tools like Epic

TOP TWO: Conductor is a secure cloud-based platform that enables the offloading of rendering and simulation workloads for VFX, VR/AR, architecture visualization and animation studios. (Images courtesy of CoreWeave) BOTTOM TWO: The introduction of online render farms/cloud rendering services like iRender has made those resources accessible to everyone. (Images courtesy of iRender)

Games’ Unreal Engine,” Libby remarks. “We’re seeing a lot of virtual production technology coming together and real-time rendering improving. Still, we’re just scratching the surface of how real-time tech can accelerate the ways digital artists create. AWS has been developing AI and ML technologies for over 20 years, but we’re still evolving how we apply these technologies to M&E applications. What’s happening across the industry with generative AI is super interesting. Generative AI will change the entertainment business in a way that we haven’t seen in a long time.”

Machine learning is pivotal in creating an automated system. “This will include concepts like up-resing footage, denoising images and black frame or partial render detection,” Moore notes. “Historical data can train ML models to help identify patterns and inform problem detection and correction. Just as AI/ML helps artists focus on the end-product, the emerging technology will help studios and wranglers render scenes in a more cost-effective way and reduce the need to manually analyze and/or intervene at each phase of the project. In summary, the mundane tasks of cloud computing have been automated, and now intelligence is being built to optimize the workflow.”
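Moore's examples, black-frame and partial-render detection, can be approximated even without ML; the rules below are a hand-rolled stand-in for the learned checks he describes, with thresholds invented for illustration.

```python
import numpy as np

def frame_issues(frame, black_thresh=0.002, dead_row_frac=0.2):
    """Flag frames that are (nearly) all black, or where a large run of
    rows never got rendered -- simple stand-ins for learned QC checks."""
    issues = []
    if frame.mean() < black_thresh:
        issues.append("black_frame")
    # Per-row peak value: rows that stay at ~zero were never rendered.
    row_energy = frame.reshape(frame.shape[0], -1).max(axis=1)
    dead = int((row_energy < black_thresh).sum())
    if "black_frame" not in issues and dead > dead_row_frac * frame.shape[0]:
        issues.append("partial_render")
    return issues
```

In practice these flags would feed a render wrangler's queue, so a bad frame is resubmitted before anyone reviews the sequence by hand.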

TOP: A major benefit of cloud computing is the ability for clients to eliminate the cost of maintaining a render farm. (Images courtesy of Chaos) BOTTOM TWO: Examples of the Chaos Cloud Rendering UI where scenes in Maya are being rendered by V-Ray. (Images courtesy of Chaos)
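The submit-provision-render-store-download flow Dragoev outlines maps naturally onto a small state machine. A hypothetical sketch (the state names are ours, not Chaos's):

```python
from enum import Enum, auto

class JobState(Enum):
    SUBMITTED = auto()
    PROVISIONING = auto()   # virtual machine started and configured
    RENDERING = auto()      # renderer producing image outputs
    STORING = auto()        # results persisted for download
    DONE = auto()

# Legal transitions: each state advances to exactly one successor.
TRANSITIONS = {
    JobState.SUBMITTED: JobState.PROVISIONING,
    JobState.PROVISIONING: JobState.RENDERING,
    JobState.RENDERING: JobState.STORING,
    JobState.STORING: JobState.DONE,
}

def advance(state: JobState) -> JobState:
    """Move a render job one step through its lifecycle."""
    if state is JobState.DONE:
        raise ValueError("job already complete")
    return TRANSITIONS[state]
```

Modeling the lifecycle explicitly is what makes the "one button" experience possible: the service, not the artist, is responsible for driving every job through each step.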

Partnerships are being forged. “We are a diamond member of the Blender Development Fund to support community development of this open-source 3D software,” Hieu states. “We are also a long-term user of AWS service in terms of cloud storage. Currently, we are working closely with Maxon to provide licenses for Cinema 4D and Redshift as part of our plan to provide PaaS-based cloud rendering services. In the future, we continue looking for further cooperation with hardware and software vendors to further enhance our services.”

A lot can be learned from clients. “Because of the characteristics of the IaaS render farm model, we are a bit more technology and infrastructure-oriented. By interacting with clients when supporting them using the service, we have gained more insight into 3D software and tools for content creation. iRender provides a rating and commenting system after each session, so we can get more direct feedback from clients. What we are really grateful for is our clients constantly providing feedback and useful suggestions. Thanks to them, we know where to focus on developing so that we can deliver a better user experience,” Hieu adds.

Workflows are becoming smoother. “You could go back to the beginning of the film industry to see we’ve always had hard stops,” notes Tim Moore, CEO of Vu Technologies. “When you were shooting on film before, you had to stop production, develop the

film and then go into editing. Once we made it in digital form where you capture it digitally and edit it digitally, there is a uniformity across that exchange because it’s in the same format. Now when you look across all of the creative exchanges in the life cycle of a video production, we’re hitting that point where almost all is going to be uniformly digital.”

There is an ongoing evolution occurring in cloud services. “Once people stop trying to replicate on-premises environments in the cloud and embrace the cloud for its true potential, we’ll see innovation in different ways,” states Libby. “Studios won’t have to worry about IT management and can instead put all their effort into creating amazing visuals. We’re working to help our customers achieve that and helping to spur the growth of the industry at large.”

The market is growing. “Every year, the cloud computing and rendering market grows by 25%, thus doubling the market in four years,” Hieu notes. “According to our assessment, cloud computing and rendering is a must-have choice in the next 10 years when client standards and graphics standards don’t stop at HD, 2K or 4K. The higher resolution is increasingly popular and becoming mandatory for the end-product. And to meet such a high standard, studios are required to use cloud computing and rendering.”

Despite being built on brick-and-mortar studios, Vu Technologies acknowledges that the future is headed towards virtual studios. “AWS has been able to help connect all of our studios for co-creation and remote control, and a lot of the platform is built on their cloud computing network,” Tim Moore remarks. “We’ve created some core applications that it runs on. One of them is an AI orchestration layer that manages all of the other apps in the platform. This is the biggest paradigm shift since the advent of the digital camera. For the past 100 years, studios have not been connected to the process or the workflow of a cloud-based production. The tools and workflows that we’re building are to help that digital transformation, and our vision is that in the future there are no physical studios. It’s an interesting time in the creative marketplace. Fortnite has opened up its creator tool and is saying that it will pay creators to build worlds for them. You can build an island now on Fortnite, charge people to come onto that island and then play a multiplayer game. Now imagine in the future if you and I want to go shoot a movie on an island; they will get royalties for the movie that is being made, or there’s a permit to actually shoot in there.”

Photorealism is being achieved with generative AI such as Midjourney 5.1. “Is it going to take the fun away of actually blowing up the car?” Tim Moore reflects. “Yes. Anytime you make technology this accessible, it takes away the creative constraint that people are put under to figure these things out. But I will say this: Whenever you democratize or make a technology more accessible, the participation rate increases so dramatically that it pushes the creative work further. Practical will still be an option, but in five to 10 years it’s all going virtual.”

This digital transformation would not be possible without cloud computing and rendering, Tim Moore believes. “It has to be, because what other way can we co-create and connect with other creators? It’s all going to be a cloud-based environment.”
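The elasticity Libby and Moore describe, expanding when a show spikes and contracting to zero instead of leaving machines burning, ultimately comes down to a scaling policy. A toy decision function, with every parameter invented for illustration:

```python
import math

def scale_decision(queued_frames, running_nodes,
                   frames_per_node_hour, target_hours=4, max_nodes=500):
    """Return how many render nodes to add (+) or release (-):
    size the pool so the queue drains in roughly target_hours,
    and scale to zero when idle rather than leaving machines running."""
    if queued_frames == 0:
        return -running_nodes
    needed = math.ceil(queued_frames / (frames_per_node_hour * target_hours))
    return min(needed, max_nodes) - running_nodes
```

With 4,000 frames queued and each node finishing 10 frames an hour, the policy asks for 100 nodes; once the queue empties, the same policy hands them all back, which is exactly the scale-down step early cloud adopters skipped.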

TOP: Autodesk envisions a future where data flows seamlessly across film/animation studios as well as visual effects facilities with the help of cloud services like Autodesk Flow. (Image courtesy of Autodesk) BOTTOM TWO: Despite being originally established on brick-and-mortar studios, Vū Technologies acknowledges that the future is headed towards virtual studios. (Images courtesy of Vū Technologies)

FILM

DELIVERING THE EFFECTS FOR A SCI-FI WORLD AT WAR IN THE CREATOR By CHRIS McGOWAN

Images courtesy of 20th Century Studios. TOP: John David Washington portrays Joshua, a hardened ex-Special Forces agent recruited to hunt down and destroy the Creator, the elusive architect of advanced AI who has supposedly developed a mysterious world-ending weapon.

The sci-fi action thriller The Creator is set amidst a future war between the human race and the forces of artificial intelligence after a nuclear warhead has devastated Los Angeles. Director Gareth Edwards (Rogue One: A Star Wars Story) and his team based their elaborate world-building on a meticulous refinement of their visual ideas and extensive real-world photography; the latter included shots at 32 locations in Southeast Asia and thousands of reference photos. ILM Visual Effects Supervisor Jay Cooper comments, “I love the way Gareth chose to start with real locations when possible and build our sci-fi world around them. There is an authenticity to this approach that grounds the film.” ILM Associate Visual Effects Supervisor Trevor Hazel adds, “The main challenge in creating a post-apocalyptic new world is doing something that hasn’t been done. Gareth wanted to create a world that was based in reality – there were no greenscreen shoots [for] this film. Gareth and a small team shot at numerous locations and incorporated many real people, their wardrobe and their community. He wanted the audience to watch the film and never be sure what was really captured in-camera.”

“What we created together is a mix of live-action anime, old school sci-fi and Baraka [the 1992 documentary with shots of people and settings around the world],” Hazel comments. The Creator, distributed by 20th Century Studios, reunited director Edwards, co-screenwriter Chris Weitz, Cinematographer Greig Fraser and ILM, all veterans of Rogue One.

In The Creator, Joshua (John David Washington) is a hardened ex-Special Forces agent mourning the disappearance

of his wife (Gemma Chan). He is recruited to hunt down and destroy the Creator, the elusive architect of advanced AI who has supposedly developed a mysterious weapon with the power to end the war, as well as all mankind. Joshua and his team of elite soldiers venture across enemy lines into the heart of AI-occupied territory. There, they discover that the world-ending weapon Joshua has been ordered to destroy is in the form of Alphie, a robotic child (Madeleine Yuna Voyles).

Edwards’s past experience as a visual effects artist and his previous collaboration with ILM gave The Creator a head start with the project’s VFX. “Gareth had a close relationship with ILM already, due to working together on Rogue One. When we began work on The Creator, he came to our San Francisco studio for a virtual camera shoot on our stage,” Hazel comments. “Right off the bat, he seemed at home and comfortable back at ILM to begin his new project.”

ILM was the lead VFX studio and handled just shy of 900 visual effects shots, with other work carried out by vendors including MARZ, Crafty Apes, Folks VFX, Outpost VFX, Jellyfish Pictures, Wētā Workshop, Supreme Studio, Territory Studio, Clear Angle Studios, Fin Design + Effects and Proof, Inc. Edwards made it a point to try and collaborate directly with ILM, and he visited both San Francisco and London multiple times, whether it be for reviews, virtual cam sessions or virtual production shoots. “He wanted to be integrated with the whole team,” notes Charmaine Chan, ILM Visual Effects Supervisor. Hazel adds, “It was wonderful working so closely with Gareth in

TOP: Director Edwards, co-screenwriter Chris Weitz, Cinematographer Greig Fraser and ILM were all veterans of Rogue One: A Star Wars Story, which gave The Creator a head start with the VFX. (Photo: Oren Soffer) BOTTOM TWO: The world-ending weapon Joshua (John David Washington) has been ordered to destroy is in the form of Alphie, a robotic child (Madeleine Yuna Voyles). It was important to make Alphie accessible, yet not read as human.



FILM

TOP: Final destruction of the Nomad space station was a complex beat where the VFX team had to destroy the ship externally and internally. Scouting the digital sets in Unreal Engine prior to shooting allowed the team to experiment with Nomad’s scale, location and orientation. MIDDLE: The film’s elaborate world-building was based on a meticulous refinement of visual ideas and extensive real-world photography. BOTTOM: In a future war between the human race and the forces of artificial intelligence, Los Angeles has been devastated by a nuclear strike.

creating his new world. Gareth is part storyteller, part cameraman and part graphic/VFX artist. He’s a great partner who knows our VFX world well, and it really helped our global team of artists dream big. We spent a lot of time getting the details right, and it paid off when everything came together.” Edwards often showed that he is deeply conversant in visual effects techniques and challenges, according to Andrew Roberts, ILM On-Set Visual Effects Supervisor. “During one scene, a grenade explosion launches some robot police through the air. The ratchet pull rig wasn’t working as intended on the stuntmen, and after a few failed attempts I approached Gareth to discuss digi-double options, and he turned to me and asked, ‘Can ILM do a ragdoll sim to throw them around more violently?’ I gave him the thumbs up, and Gareth was happy to move on to the next setup, confident our team would deliver the effect he needed.” Cooper explains, “Gareth has an extraordinary knack for visual storytelling, and this especially shone through in a science fiction film with extensive world-building. His understanding of the genre and his background in visual effects meant that he was very hands-on and had a deep comprehension of what was technically feasible. What stood out was his commitment to the characters and the worlds we were creating. He pushed the team to think outside the box and encouraged an environment of open communication and collaboration. It was refreshing to see a director so invested in the details, as Gareth often provided invaluable insights that elevated both the character development and the environmental design.” From the beginning of The Creator, the ILM VFX team also worked closely with Production Designer James Clyne and the art department. “Gareth and James had a shorthand for the design language of the film, and we would go to that well of knowledge again and again for huge environments as well as small asset details,” Hazel notes.



“The main challenge in creating a post-apocalyptic new world is doing something that hasn’t been done. Gareth [Director Edwards] wanted to create a world that was based in reality – there were no greenscreen shoots [for] this film. ... He wanted the audience to watch the film and never be sure what was really captured in-camera. What we created together is a mix of live-action anime, old school sci-fi and Baraka [a 1992 world-conscious documentary].” —Trevor Hazel, Associate Visual Effects Supervisor, ILM Edwards and Clyne constructed the rich design framework. “They had been looking at concepts and living within the world of The Creator for such a long time that when we showed Gareth one particular shot of a hero CG pyrotechnic effect, he instead homed straight in on the font used for a logo on a background wall, which did not fit within the design language of the film,” recalls ILM Visual Effects Supervisor Ian Comley. “It’s so rare to be on a project where the Production Designer is on from the start to the very end. The art department was absolutely vital for Gareth’s vision, and he couldn’t find a better partner than James Clyne and the art department. We were on daily calls with James where he would provide such detailed visual guides for our environments and assets. You could see how much Gareth relied on James, so having such a direct and close relationship with James proved to be a huge advantage,” Chan comments. The visuals in The Creator were subject to a great many changes. “One aspect of this film that is unique is that we didn’t shoot plates on set knowing exactly what the final result was going to be,” Cooper says. “Gareth’s idea was that we would find a set location that could serve as the core of a set piece and that we would figure out the rest in post. The checkpoint is a good example of that.
We shot in a location that was a toll booth station, and then many months later James Clyne would sink his teeth into it with his team, with the intention of turning it into a super-cool sci-fi

TOP TO BOTTOM: Director Gareth Edwards and his team built the world of The Creator around real locations to ground the sci-fi film in an environmental/cultural authenticity.




“[For the destruction of the Nomad space station] we referenced historic Bikini Atoll footage, looking at the rapid formation and vaporization of clouds, as well as, of course, the classic mushroom. The simulation, at high altitude and with the Nomad creating an axis down the middle, led to a vast double-mushroom-cloud effect.” —Ian Comley, Visual Effects Supervisor, ILM TOP TWO: The biggest and most enjoyable challenge for the VFX team was creating from scratch a post-apocalyptic world that didn’t exist before. BOTTOM TWO: Real-world photography included shots at 32 locations in Southeast Asia and thousands of reference photos.

building. They would do multiple versions of concept art, using the plate as a starting point. We would then pick up his initial designs and get to work. Somewhere along the way, we would shuttle a version back to James and he would refine the idea, maybe adding serpentine channels of light, negative space, or refining the visual language.” It was important to Edwards that the art wasn’t the end of the design journey. Cooper comments, “It would evolve as he saw it with a moving camera, or with realistic textures. This was especially true when it came to the Nomad space station and our robot characters. We would look at something about the Nomad in art and like it, then flesh it out in 3D, and fall out of love, improve it in 3D and then move forward. It was a partnership with James and the production design team. One that started before principal photography and ended really only in the last weeks.” “Gareth, James and Thai Art Director Lek Chaiyan Chunsuttiwat poured an incredible amount of design and attention into physical props, weapons, background robots, right down to everyday objects like phone booths and picture frames that made up the world of The Creator,” recalls Roberts. “Visiting the art warehouse in Bangkok during pre-production, walking down aisles packed with simulants and utility robots was like being in an amazing sci-fi museum.” He continues, “We shot thousands of bracketed reference photos, took LiDAR scans and captured as much of the detail as possible so we could use it as a foundation for our visual effects work.” The visual effects complemented the locations. For the Floating Village, a location in “New Asia” built into a riverbank, Gareth and his team found a beautiful location in Thailand where there was a village by the river. “What we did to make it look more post-apocalyptic was add these industrial abandoned buildings sitting in the background hills,” Chan says. 
“They weren’t the main focus but helped ground that vision – that was a common theme throughout the film.” Hazel adds, “ILM would replace some of the filmed people with robots and simulants, sometimes where you’d least expect it, and we would augment those real locations for our new world.” The simulants (the most human-like, sophisticated AI models) posed technical challenges. Cooper explains, “The technical challenges were mostly around incorporating the mechanical head gear into a number of human characters. Humans are obviously non-rigid and their faces stretch and squash as they move and talk. We had to find ways to lock off parts of a human face and interface it with mechanical CG bits. We would animate the jaw of our skeletal rig to make it look like the simulants were operating their mechanical components when talking. We had hero geometry for our principal actors, but not for some of the extras who were elevated to become simulants. Every head is different, and our layout team and compositing teams did a spectacular job fitting these parts together.” For the fatal blow to the Nomad, Edwards wanted something that had the scale and finality of a nuclear blast. “We referenced historic Bikini Atoll footage, looking at the rapid formation and vaporization of clouds, as well as, of course, the classic mushroom. The simulation, at high altitude and with the Nomad creating an axis down the



middle, led to a vast double-mushroom-cloud effect,” Comley says. Chan remarks, “The final destruction of the Nomad was a complex beat where we had to look at destroying the ship both externally and internally. As Ian mentioned, there was lots of research on the Bikini Atoll footage, but we also wanted to make sure that rippled into the interior sections of the ship. We had mini-sub explosions happening both inside and out, and destruction and debris falling dangerously close to our heroes in their final moments. We choreographed the buildup of more and more destruction until we had that one final explosion.” ILM’s StageCraft was used for two locations on the space station Nomad – The Biosphere and an Escape Pod Bay. Comley comments, “We used our ILM StageCraft volume at Pinewood which was 55m x 43m [wxl] and 15m high. We had Vicon camera tracking and Roe Black Pearl BP2’s, which were 2.8mm pitch LEDs. We also used our in-house proprietary renderer at ILM called Helios to run out the content.” Long before pre-production, Fraser, Edwards and Cooper discussed what environments they could target for StageCraft. “Greig is an expert when it comes to shooting with StageCraft, having served as a cinematographer on The Mandalorian and The Batman, both of which had large StageCraft elements, as well as Rogue One which had a proto ILM StageCraft volume. We chose two sequences [Escape Pod Bay/Airlock and Biosphere] that would work well with StageCraft. They’re both large and complicated sets that you couldn’t build, and ideally you would want to frame your shots, knowing what you’re looking at,” Cooper explains. Roberts notes, “Prior to shooting, ILM went through an extensive design phase with Gareth and James Clyne. Our Virtual Art Department Supervisor, Billy Brooks, worked closely with Gareth scouting the digital sets in Unreal Engine. 
This allowed them to experiment with Nomad’s scale, location and orientation.” Cooper adds, “Along the way, we would bring Gareth into our virtual world and have reviews using an Oculus headset. He could fly around our virtual sets, place cameras and inspect our build while it was in progress.” Coming up with a look for the simulants and robots was an aesthetic challenge. Cooper recalls, “Finding ways to make Alphie [Alpha O] both accessible and robotic was extremely important. You don’t want her to be off-putting, and you want the audience to enjoy Madeleine’s wonderful performance, but at the same time she shouldn’t read as human. We did several rounds of models to try to thread that needle and then gave her head mech some purpose and intent. Her ‘wheels’ spin in keeping with her performance.” Continues Cooper, “On the aesthetic front, we were trying to design a world that was both huge in scope but also had uniting elements of design. Gareth wanted an aesthetic that had a fit and finish of polished industrial design. The design language had to feel intentional and refined. This was true for characters as well as environments.” Cooper concludes, “It was exciting to work on a project that was original and wasn’t part of a larger story universe. There was a freedom and delight that came with starting with a clean slate.”

TOP THREE: A sci-fi-rich art warehouse in Bangkok provided thousands of reference photos, LiDAR scans and important details that served as a foundation for the visual effects work.

BOTTOM TWO: The Floating Village in “New Asia” was built into a riverbank at an idyllic location in Thailand where there was a village by the river. To make it look more post-apocalyptic, abandoned industrial buildings were added to the background hills.



VIDEO GAMES

AI GETS AN UPGRADE IN VIDEO GAMES By TREVOR HOGG

TOP: Unity developed a demo, featuring the alien known as Orb, showing the effectiveness of large language models in producing characters that have more natural responses. (Image courtesy of Unity) OPPOSITE TOP: An example of synthetically-balanced generated data. (Image courtesy of Julian Togelius and modl.ai)

For the longest time, AI has been associated with non-player characters (NPCs) in the video game industry and divided between control structures and path-finding. “If you are in the chasing state, you’re using a path-finding algorithm to try to find the shortest path to the player character,” explains Julian Togelius, Associate Professor of Computer Science and Engineering at NYU and Co-Founder and Research Director, modl.ai. “If you are in the patrolling state, you are basically executing a fixed set of moves. Traditionally, you have all of these simple, symbolic AI techniques that have been there in one form or another since the 1980s. This is because when most modern game genres were developed, like role-playing games, first-person shooters and strategy games, we didn’t have modern AI algorithms, and we certainly didn’t have computers that could run them.” Togelius believes that AI is going to cause even further fragmentation in game development. “There will be designers who look at the new AI methods and say, ‘This doesn’t help me with the kind of games that I do.’ Other designers will go, ‘Hey, let’s start from the new AI methods, design around that and see what new things we get.’ Basically, what we call a game will expand even more in different ways.” Open research and development initiative Ubisoft La Forge has been exploring applications of machine learning in video game productions for over a decade. “AI works best when there is a lot of data available that represent the tasks you’re targeting,” states Yves Jacquier, Executive Director of Ubisoft La Forge. “As such, repetitive tasks are particularly suited to be generalized and automated.” By using motion capture data to train AI models, realistic



and fluid character animation will be generated. Jacquier explains, “AI also has the potential to assist in automating the animation workflow, reducing the time and effort required by animators. A good example for that is Choreograph, a technology developed by La Forge and used in Far Cry 6, which helped create the animations in the game through an AI-driven technique known as ‘motion matching.’ Another prototype we developed at La Forge is Zoobuilder, which helps create 3D animations for animals from videos using machine learning. We’ve been using generative AI in voice synthesis, DeepMotion, text-to-speech and now Ghostwriter, our in-house tool created to assist scriptwriters. We’re seeing fast progress in terms of adoption internally in areas like ideation or concept art. We expect a similar trend in programming with assistance for writing code or finding the potential source of a bug and proposing a fix. We can expect the use of AI assistants to rise and progressively become an integral part of our everyday routine, going beyond basic functions to handle intricate tasks, such as assisting with coding and executing complex commands.” Unity views AI tools as a means to further the democratization of the video game industry. “Muse Chat has helped the creation process that starts with natural language which is a great place for all humans because that’s how we communicate,” notes Ralph Hauwert, Senior Vice President/GM, Unity Editor, Runtime, Ecosystems & AI/ML at Unity. “Being able to ask Unity, ‘How do I make a Match 3 game?’ It lists it out in steps for you. Getting that type of help accelerates creators on whatever part of their journey, whether they’re a starter, intermediate or expert. Sentis is quite an innovation from the perspective of what it intends to do. Building a

“With games you spend 10% to 20% of the game development budget testing it. It’s a huge weight limiter because testing games is actually boring because you do the same thing again and again. The beautiful thing is, this is the kind of stuff that everybody is okay with being automated – giving AI instructions on how to test the game, what to look for and what specifically to explore.” —Julian Togelius, Associate Professor of Computer Science and Engineering at NYU and Co-Founder and Research Director, modl.ai.
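The “instructions on how to test the game” that Togelius describes can be surprisingly simple in practice. Below is a minimal, hypothetical sketch of an automated QA bot on a toy tile level: one instruction verifies the exit is actually reachable, and a “monkey” instruction mashes random moves while asserting an invariant on every step. The level, names and rules are invented for illustration; real playtest bots drive an actual game build.

```python
import random

# Toy "level" for the QA bot: '#' wall, '.' floor, 'S' start, 'E' exit.
LEVEL = [
    "#######",
    "#S..#.#",
    "#.#...#",
    "#...#E#",
    "#######",
]

def find(ch):
    for y, row in enumerate(LEVEL):
        x = row.find(ch)
        if x != -1:
            return (x, y)
    raise ValueError(f"{ch!r} not in level")

def walkable(pos):
    x, y = pos
    return 0 <= y < len(LEVEL) and 0 <= x < len(LEVEL[0]) and LEVEL[y][x] != "#"

def exit_reachable():
    """Instruction 1: verify the exit can actually be reached
    (a flood fill outward from the start tile)."""
    frontier, seen = [find("S")], {find("S")}
    while frontier:
        pos = frontier.pop()
        if pos == find("E"):
            return True
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (pos[0] + dx, pos[1] + dy)
            if walkable(nxt) and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

def monkey_test(steps=5000, seed=1):
    """Instruction 2: mash random moves and check an invariant every
    step -- the player must never end up inside a wall."""
    rng = random.Random(seed)
    pos = find("S")
    for _ in range(steps):
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        nxt = (pos[0] + dx, pos[1] + dy)
        if walkable(nxt):  # the game's collision rule
            pos = nxt
        assert walkable(pos), f"bot clipped into a wall at {pos}"
    return True

print(exit_reachable(), monkey_test())  # True True
```

The appeal, as Togelius notes, is that this kind of repetition is exactly what people are happy to hand off: the bot runs the same checks thousands of times without getting bored.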




TOP: $8.4M was raised in 2022 by modl.ai to develop QA and playtest bots that mimic human players via AI and machine learning. (Image courtesy of Julian Togelius and modl.ai) BOTTOM TWO: Automatic game testing is the one area where AI can have an immediate and less controversial impact on the video game industry. (Images courtesy of Julian Togelius and modl.ai)

neural network is already hard enough, but then deploying them to all the different devices with all different kinds of accelerators and APIs, that gets complicated and expensive for a small team of game developers. What Sentis does is be the runtime on the device that allows you to deploy one of your neural networks. Essentially, it’s a cost reduction and enables you to use the silicon that’s inside all of these devices to the best of their abilities. Right now, if you want to run a neural network on an iOS or Android device you’re already needing to make two different versions. Sentis takes care of that problem for you.” In 2019, NVIDIA introduced Deep Learning Super Sampling (DLSS) which is an AI model trained on NVIDIA supercomputers that fills in detail and improves performance. “Since then, over 300 games and applications have adopted this technology, and we’ve continued to improve our AI rendering capabilities with the introduction of DLSS 3, which generates entire frames and multiplies performance,” remarks Ike Nnoli, Senior Product Marketing Manager at NVIDIA. “This success has inspired others in the gaming industry to create their own performance-enhancing solutions. By offloading rendering work to AI, DLSS has allowed developers to do more in their games, from pushing graphical fidelity to increasing geometric density. Beyond rendering, we’re spearheading generative AI technologies in gaming production pipelines. The launch of ChatGPT has created a surge in applications powered by Stable Diffusion and large language models [LLMs]. We’ve built NVIDIA Audio2Face for voice-to-facial animation, Picasso for text-to-3D assets and Avatar Cloud Engine [ACE] for games for intelligent NPCs. This is an area of significant R&D as generative AI will change how games are made.” Automatic game testing is the most logical place for AI. “With games, you spend 10% to 20% of the game development budget



testing it,” Togelius states. “It’s a huge weight limiter because testing games is actually boring because you do the same thing again and again. The beautiful thing is, this is the kind of stuff that everybody is okay with being automated – giving AI instructions on how to test the game, what to look for and what specifically to explore. Also, you can use deep learning when it comes to upscaling a game or making a different visual style, like anime or a 1960s Disney movie. Then there is improved player modeling and capturing playing styles. This gives you a lot of examination for how to tune your game.” Continues Togelius, “We’re going to get better building environment-generation, which we’ve had since the 1980s with Rogue and more recently with No Man’s Sky. Generative AI provides a wealth of generating game levels that are controllable, fit particular players or player types and have specific challenges. It used to be 10 to 15 years ago that all of the animation was simulated and used every time. As we’ve gotten better at procedural animation, you get more fluidity. Finally, what everybody is talking about right now is not so easy – the dialogue tree through large language models, where we can have real dynamic dialogue. We are going to have technical advancements that will make large language models more controllable and less likely to say random words. We are also going to need to design around the fact that large language models are fundamentally unreliable.” One should not forget that AI is a tool. “The challenge is to make sure that we learn how to integrate this potential to create meaningful experiences beyond being a mere gadget,” Jacquier states. “As a player, you want to explore a world and feel that each character and each situation is unique, and that involves a vast variety of characters in different moods and with different backgrounds. As such, there is a need to create many variations to any

TOP AND BOTTOM: Testing the accuracy of text-generated imagery. (Images courtesy of Julian Togelius and modl.ai)

“There will be designers who look at the new AI methods and say, ‘This doesn’t help me with the kind of games that I do.’ Other designers will go, ‘Hey, let’s start from the new AI methods, design around that and see what new things we get.’ Basically, what we call a game will expand even more in different ways.” —Julian Togelius, Associate Professor of Computer Science and Engineering at NYU and Co-Founder and Research Director, modl.ai.
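Ubisoft has not published Choreograph’s internals, but the “motion matching” technique Jacquier names boils down to a nearest-neighbor search: at each frame, pick the captured animation clip whose pose features best match the character’s current state. A toy sketch, with feature dimensions and clip data invented purely for illustration (production systems match against dozens of dimensions across millions of mocap frames):

```python
import math

# A pose "feature" here is just (hip_speed, stride_phase); the database
# entries below are made up for illustration.
MOCAP_DB = [
    ((0.0, 0.00), "idle_loop"),
    ((1.4, 0.25), "walk_cycle"),
    ((3.1, 0.10), "run_cycle"),
    ((3.0, 0.80), "run_turn_left"),
]

def match(query):
    """Motion matching in one line of math: return the captured clip
    whose feature vector is closest (Euclidean) to the current state."""
    return min(MOCAP_DB, key=lambda entry: math.dist(entry[0], query))[1]

print(match((1.5, 0.2)))   # walk_cycle
print(match((2.9, 0.75)))  # run_turn_left
```

Because the selection is driven by data rather than hand-authored transition graphs, animation gets more fluid as more capture data is added, which is why the technique pairs so naturally with machine learning pipelines.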




TOP TO BOTTOM: At COMPUTEX 2023, NVIDIA took center stage to demonstrate its vision of how AI can change the video game experience for players. (Image courtesy of NVIDIA) NVIDIA partnered with Convai to help optimize and integrate ACE modules into their platform. (Image courtesy of NVIDIA) Jin is a character that utilizes NVIDIA’s Avatar Cloud Engine (ACE) to create intelligent NPCs for games. (Image courtesy of NVIDIA)

mundane situation, such as one character buying fish from another in a market. A writer tasked with writing 20 variations for each of those situations might come up with a handful of examples before the task might become tedious. This is where Ghostwriter kicks in: proposing NPC dialogues and their variations to a writer gives the writer more variations to work with and more time to polish the most important narrative elements. This can help create even more believable worlds for players to explore, with the potential for unlocking more complex procedural narratives for all NPCs, no matter how ancillary they are to the main story. This is a great example of how generative AI can assist without sacrificing the narrative integrity of our games.” “NVIDIA ACE for games is our custom AI model foundry, which brings a level of intelligence to game characters previously not seen before,” Nnoli remarks. “Traditionally, player interactions with NPCs have been transactional, scripted and short-lived. Now, generative AI can make NPCs more conversational with persistent personalities that evolve over time and that are unique to the individual player. The process to deploy intelligent NPCs is very different from the NPCs today with their preset dialogue choices. Developers will need to run LLMs, customized with game-specific lore, through a cloud service or PC in real-time. And, they’ll need to have facial animation that adapts to dynamic language in realtime and speech that sounds lifelike – all while ensuring conversations are on topic.” The creation of NPCs has become more sophisticated, Nnoli says. “The number of prerecorded lines has grown. The number of options a player has to interact with NPCs has increased, and facial animations have become more realistic. Gamers’ expectations when it comes to narrative decision-making continue to rise, and AI can significantly multiply the number of interactions that a user has with a story and allow for quicker



injections of new narratives [through LLMs like NVIDIA NeMo and ChatGPT].” Creating aesthetic variations is part of the heritage of Unity. “If one of the things is that you need to have 100 variations of the best sword,” Hauwert explains, “you have already trained against your own art style, and we allow you to easily generate another 100 more. You can start building variations based on that depth of how the game ecosystem works and being able to control that as well. Then we think about gameplay and how you can learn from others. How cool is it if you could play against the AI of one of your favorite streamers on Twitch? Or instead of ghost racers you could play against an interactive version of that player that has already been. We’re still at the beginning of that journey.” Unity has established partnerships with Leonardo.ai, Replica Studios and Inworld AI. “It is all about building up this ecosystem of AI tools, but also classifying them clearly for our community as opportunities of engagement.” The technology alone is not enough, Hauwert declares. “What we’re learning is how to make the optimal UX that sits within a workflow when it comes to using it. Then, there are all of these fun thresholds that creators need to cross to be able to even target this or be able to use it. We are learning a lot from the creators that now have access to our data and are using Muse Chat, for example, and telling us how it is or is not useful.” Whereas technology is generally designed to allow individuals to achieve, there is also a sense that more checks and balances need to be put into place. “By lowering the barrier to entry for content creation, there is going to be an increasing influx of content that automatically calls for stronger moderation and safeguards to prevent the creation of things we don’t want to see in games,” Jacquier observes.
“If implemented right, AI has the potential to significantly change the way games are developed and played, with obvious positives for both game developers and players: more immersive game worlds and diverse and rich characters; more engaging or challenging gameplay; personalization of the experience and AI-based user-generated content; more believable world components like fluids, fire or smoke simulations; and overall

TOP TO BOTTOM: Choreograph is technology developed by La Forge that creates animation for games through an AI-driven technique known as ‘motion matching.’ (Image courtesy of Ubisoft La Forge) Predicting the movement of a character through Choreograph. (Image courtesy of Ubisoft La Forge) Choreograph was designed to simplify the decision-making process. (Image courtesy of Ubisoft La Forge)

production efficiency at the service of faster creative workflows.” Fear of AI becoming sentient and making a human workforce obsolete is misguided. “AI is a tool,” Togelius states. “But as with every tool, it gives you new performances. If you just try to fit a new tool into an old design and workflows, you’re going to be limited.” Unity starts with the creator rather than the technology. “We will always be on the side of creators as we deploy this,” Hauwert remarks. “For me, AI will accelerate the entire game industry to be more creative.”
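For readers curious what the “simple, symbolic AI techniques” Togelius describes at the top of the piece actually look like, a patrolling/chasing NPC still fits in a few lines of state-machine code: a patrol state executes a fixed set of moves, and a chase state defers to a path-finder. All names, coordinates and numbers below are illustrative, and the greedy step function stands in for a real path-finding algorithm such as A*.

```python
def next_step_toward(npc, player):
    # Stand-in for a real path-finder (A*, etc.): greedy one-axis step.
    dx = (player[0] > npc[0]) - (player[0] < npc[0])
    dy = (player[1] > npc[1]) - (player[1] < npc[1])
    return (npc[0] + dx, npc[1]) if dx else (npc[0], npc[1] + dy)

def update_npc(npc, player, patrol_route, tick, sight=4):
    """One AI tick: pick a state from the player's distance, then act.
    Chasing uses path-finding; patrolling replays a fixed set of moves."""
    dist = abs(npc[0] - player[0]) + abs(npc[1] - player[1])
    if dist <= sight:
        state = "chase"
        npc = next_step_toward(npc, player)
    else:
        state = "patrol"
        npc = patrol_route[tick % len(patrol_route)]
    return npc, state

route = [(0, 0), (1, 0), (2, 0), (1, 0)]
npc = (0, 0)
for tick in range(3):                       # player far away: patrol
    npc, state = update_npc(npc, (9, 9), route, tick)
print(npc, state)                           # (2, 0) patrol
npc, state = update_npc(npc, (3, 0), route, 0)  # player close by
print(state)                                # chase
```

Small as it is, this is essentially the architecture that has shipped in mainstream games since the 1980s, which is why the newer learning-based methods in this article represent such a break from tradition.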



VIRTUAL PRODUCTION

HOW WRITING FOR VIRTUAL PRODUCTION HAS FLIPPED THE SCRIPT By OLIVER WEBB

TOP LEFT: For 1899, scripts had to be as informative as possible, not just blueprints, because locations were already locked in and the world-building was well in progress before the scripts were finalized. (Image courtesy of Netflix) TOP RIGHT: 1899 relied on virtual production to create its vivid landscapes. Much of the rewriting in post was accomplished during pre-production or during production as the locations were already set and images projected onto ready-designed LED screens. (Image courtesy of Netflix)

“The sky is the limit [with virtual production], to the kinds of worlds we can create for any given story, to the places on this planet or this galaxy we want to explore without ever having to set foot on a plane, or even to the worlds that only exist in our minds. Especially when writing sci-fi – which was my experience – or any genre in period; you can go back to the Ice Age if you want to.” —Juliana Lima Dehne, Writer, 1899

Combining CGI, real-time visual effects and live-action filming, virtual production technology has boomed in the last few years. With The Mandalorian utilizing the technology, other productions have since followed suit, including 1899, Star Trek: Discovery, House of the Dragon and Avengers: Endgame, among others. The cutting-edge technology has not only become an alternative to on-location filming, but it has opened up endless possibilities for filmmakers, as well as offering a more immersive world for actors to surround themselves in. While there are many benefits to virtual production technology, when it comes to the creative coverage of a scene, it also has its limitations. Michael Shelton, Creative Director at Pixomondo, oversees projects from prep through final delivery. “Depending on Pixomondo’s role on a given project, this would include on-set supervision as well,” Shelton notes. “I end up wearing many hats and sometimes find myself on the box lighting shots or doing a bit of character animation or matte-painting work, which I love. Anytime you can walk onto a set and see what the end result will look like while still in principal photography is an advantage to everyone involved. Aside from the wow factor, it gives both the VFX team and the production team the best opportunity to blend the live action into the created environments. Actors especially benefit from seeing the world built around them, in a way it removes the burden of needing to imagine what their surroundings look like.”



Virtual production technology has altered the writing process when it comes to writing for films and series that use the technology. “The difference – or at least in my singular experience working on 1899 – was that much of the rewriting in post is happening in pre-production or during production latest,” says Juliana Lima Dehne, who penned eight episodes of Netflix series 1899, a show that relied on virtual production to create its vivid landscapes. “One of the reasons for it is that locations – for example – are set. You can’t change your mind; nor will the controlled environment change once you start production. It’s all definitive,” Dehne adds. “The sets and the images projected onto the LED screens have already been designed, and the soundstage has already been dressed to match it. It’s like a game. You have to work with what’s there. You can’t decide to go off script in the middle of a game because you want to enter a door that doesn’t open because nobody designed what’s behind that door. I remember having these discussions in the writers’ room about locations, how the technology worked in terms of where to place characters in any given location we were working with. There was a rule of three, where it was like here’s the wide shot we’re working with, the ‘bigger picture,’ and here are the two points we can move towards, zoom in or enter within this wider shot. Scenes have to happen and make sense within that. So, I think you can think of it as doing post in pre-production, and the scripts have to be as informative as possible. More than just blueprints, they kind of had to be the framing of the house already, and locations have to be locked in, in that sense. In case you’re wondering, pre-production is extensive and intense. As a staff writer, that was the biggest difference. It was like, don’t start pitching a scene in a burning skyscraper when they’ve already designed the abandoned cottage in the woods. 
Even if for maybe the character in that moment, a burning skyscraper would’ve conveyed the character’s inner struggle/emotional state/

TOP: Forever vistas, lush landscapes and a dedication to detail showcased the volumetric creativity that shaped the world and narrative of The Lord of the Rings: The Rings of Power. (Image courtesy of Amazon Studios) MIDDLE AND BOTTOM: The Mandalorian demonstrated the benefits of utilizing VP on a TV series. At the same time, VP has altered the writing process when it comes to writing for films and series that use the technology. (Image courtesy of Disney+)



VIRTUAL PRODUCTION

TOP: Series such as House of the Dragon have followed the virtual production path of The Mandalorian. (Photo: Ollie Upton. Courtesy of HBO) MIDDLE AND BOTTOM: Having the scripts essentially written in advance assisted in deciding which sets needed to be built practically or digitally for House of the Dragon. (Image courtesy of HBO)

turning point much better, it’s literally too late. Granted, there was still a window to change locations during the writers’ room period, because the designing of the world was happening simultaneously, but once they were done and we were still writing, it was over. It definitely forces you to be more diligent with those early drafts.” For Emma Needell, writer and director of the short film Life Rendered, virtual production greatly affected the writing process. “This was because I embraced the tools of virtual production for story development specifically. I first learned how to use Unreal Engine – which is free to download and use – then I created an animatic of my short film in the program, experimenting with pacing, cinematography and music all before we began pre-production in earnest,” Needell remarks. “As a filmmaker, the ability to iterate through ‘drafts’ of your film via an animatic is extremely advantageous. The process gave me the ability to visualize and communicate the intent of each scene with key members of my production team, because I workshopped it on my own, like a draft of a script. The flip side, of course, is ensuring that you don’t get too attached to the animatic. You must allow for happy accidents and your team’s expertise to help shape the final film.” Despite its many benefits, virtual production technology can ultimately be time-consuming. “In a live-action production, what you shoot is what you get, and the film is then fully crafted in editing. With virtual production, there are extra steps in both pre and post to finesse the image in Unreal Engine, which can take a long time, especially if you’re working on an indie project like I was,” Needell explains.
“It’s pivotal to work with a producer who understands the nuances of virtual production, because it’s very different from traditional filmmaking.” According to Dehne, one of the advantages of virtual production, when it comes to creative coverage of a scene, is in the types of stories you can tell that wouldn’t be possible if you had to shoot on location, or would be too costly to do so. “The sky is the limit, to the kinds of worlds we can create for any given story, to the places on this planet or this galaxy we want to explore without ever having to set foot on a plane, or even to the worlds that only exist in our minds,” Dehne details. “Especially when writing sci-fi – which was my experience – or any genre in period; you can go back to the Ice Age if you want to.” Dehne also notes the impact virtual production has on actors. “Unlike with a greenscreen, it helps them focus on the emotion and intensity of scenes, because nobody has to waste brain cells trying to imagine the set while saying lines. The set, the landscape, the world, it’s already there in front of your eyes,” Dehne observes. “On set, you’re watching monitors of a scene being shot on a ship in the middle of the ocean, some people even got seasick if I remember correctly, but you’re in a studio somewhere in Berlin. It’s quite impressive. What would take extensive time to marry in post-production – the real and the digital, VFX – is literally happening in real-time as you shoot. You can basically adjust as you go. Plus, for all your exterior shots you can control lighting, the weather, unlike shooting on location somewhere. In Germany, for example, I remember shooting this short film when I first moved over from New York City, and we shot it in April. We literally had all four



“I embraced the tools of virtual production for story development specifically. I first learned how to use Unreal Engine... then I created an animatic of my short film in the program, experimenting with pacing, cinematography and music all before we began pre-production in earnest.”
—Emma Needell, Writer/Director, Life Rendered

seasons in one day, and then we were stuck trying to make it work in post and overspending to reshoot, because snow, rain, sun and clouds in an important character-establishing two-minute scene is an absolute nightmare. Having complete control of the environment is definitely one less problem to worry about as a creative.” When it comes to limitations of the creative coverage of a scene, Needell argues that there are trade-offs to every tool and technology in film production. “The limitations to getting coverage in virtual production can be offset by the ability to be on a climate-controlled stage where you control every aspect of the set. Magic hour lighting from the first shot to the martini shot? No problem. You also don’t have to worry about location changes, since you can just load up your next scene onto the stage with a few clicks of a button. However, bugs in the technology can slow down your days and cut into the time you need to maximize coverage. For instance, during our production, the AC in the building went out one day, and the computer servers got so overheated we had to shut down production for an afternoon. But that’s film production! It never goes according to plan, and, as a filmmaker, you need to be ready to adjust and pivot given what the conditions of the day are,” Needell adds. Dehne stresses the importance of being well-prepared, as both knowledge and experience of the technology are vital. “In terms of angles and shots, lighting and mood, within a defined space it can actually be creatively freeing since you’re seeing the result in real-time and can adjust accordingly,” Dehne explains.
“Beyond that defined space, it’s challenging. You can’t suddenly decide to peek in through a window projected on your LED screen if nobody designed the reverse of that.” “Ironically, in this case, virtual production was the only possibility to make it happen. I believe they made the show they wanted because of it. I do remember the excitement in the writers’ room when writers would pitch crazy ideas like: Could there be a graveyard of ships from previous failed simulations? Is there a world where there isn’t just one trapdoor but a million of them all over the ship? Or how many character ‘memories’ could interact with each other in a single episode? The technology itself was a beast everyone had to tame. It’s complex, and requires time to understand, to learn, but what it brought and helped the creators achieve was well worth the initial trepidation,” Dehne concludes.

TOP: In Life Rendered, a young man finds his true self in virtual reality. Writer/director Emma Needell embraced the tools of virtual production for story development, greatly affecting the writing process for her short indie film. (Image courtesy of Emma Needell) MIDDLE: Needell found it advantageous to iterate through ‘drafts’ of her film via an animatic. Workshopping it on her own, like a draft of a script, gave her the ability to visualize and communicate the intent of each scene with her production team. (Image courtesy of Emma Needell) BOTTOM: Ant-Man and the Wasp: Quantumania relied on cutting-edge technologies to create immersive worlds. While VP technology offers many benefits, it also has its creative limitations and extra steps versus live-action filming and editing. (Image courtesy of Marvel Studios and Walt Disney Studios)



INDUSTRY

VFX MARKETING AND PRODUCTION STRATEGIES IN AN ERA OF HIGH PRESSURE, HIGH STAKES

What are some of the successful marketing and production strategies for VFX studios at a time of tremendous pressure to deliver visual effects of ever-increasing quality and in ever-greater volume? How do you differentiate your company and showcase your brand? How do you achieve and maintain profitability in a fiercely competitive and price-driven environment? And how can you meet all those standards while fostering good relationships with clients? Here, several studios weigh in with their opinions about the current state of the VFX business.

By CHRIS McGOWAN

MARKETING AND PROMOTION

TOP: Digital Domain, which worked on visual effects for the HBO series The Last of Us, seeks to actively listen to client needs, be transparent about capabilities and limitations, deliver on promises, and maintain clear, consistent communications. (Image courtesy of Digital Domain and HBO) OPPOSITE TOP: FutureWorks, which worked on this cloaking car effect for The Peripheral, puts their talent front and center when the firm talks about its work so visual artists get the recognition they deserve. (Image courtesy of FutureWorks and Amazon Prime)

“A successful marketing and promotion mix for me would revolve around being able to show and talk about our work soon after a project releases,” says Hayley Miller, Pixomondo Group Head of Marketing and Communications. “Catching the buzz of a release is great for our audience engagement and recognition of the teams’ hard work. Fans are increasingly calling out to see VFX breakdowns and behind-the-scenes insights, too; they want to see all the cool and innovative ways we help create the magic they see on screen.” For VFX vendor studios, effectively marketing and showcasing your capabilities, services and talent roster is crucial, according to Lala Gavgavian, President and COO at Digital Domain. It not only brings visibility to the firm’s offerings, but also highlights its commitment “to collaborating with filmmakers to create visually stunning effects that enhance and serve storytelling,” she notes. The Third Floor finds success through sharing high quality marketing materials that showcase its storytelling visuals and embrace the latest branding trends, according to Lauren Moore,



The Third Floor Chief of Staff. “From press pieces, social media posts, project/technology case studies, newsletters and events, we’re able to create more engagement, which is key to promoting.” “Most of the time it’s about building our reputation and showcasing the quality of work we do,” comments Managing Director Laura Usaite of Vine FX. “The nature of the work is that you have to be trusted by clients to deliver what you promise, so we’re always trying to show what we’re capable of.” Usaite explains, “Word-of-mouth is still a central part of our process, and we’re working with a marketing and PR agency now to help build our external presence. Our approach isn’t to ‘sell’ what we offer, but instead to show the people we work with the talent we have, and the projects we work on. Let the work speak for itself – industry colleagues don’t need to be told that something is good, they can see it.” Lux Aeterna prioritizes going to the places its potential clients frequent, from online to industry events. “It gives us the chance to interact with them in the space they play in. Most recently, our Founder and CEO Rob Hifle spoke at the MPTS alongside other key players from the VFX world, which was a great opportunity for us to join in the industry conversation,” says Laura Ashley, Operations & Marketing Manager of Lux Aeterna. “A successful marketing and promotional campaign for VFX and animation studios in this digital era should be designed around your creative talent, culture and passion for both industry and community,” explains Neishaw Ali, Co-Founder, President and Executive Producer of Spin VFX. “It’s about the personal stories you share with everyone globally that bring them closer to understanding who you are and what motivates you as a company.”

MIDDLE AND BOTTOM: Lux Aeterna, which generated galaxies for Netflix’s Our Universe, is based in Bristol, England, and has focused on generating brand awareness since its rebrand at the end of 2022. (Images courtesy of Lux Aeterna and Netflix)



SHOWCASE YOUR BRAND

TOP TO BOTTOM: Important Looking Pirates (ILP), located in Stockholm, worked on this demogorgon for Stranger Things Season 4. As part of its strategy, ILP seeks out the most creative shows to work on and targets VES and Emmy award nominations. (Image courtesy of ILP and Netflix) A successful marketing and promotion mix for Pixomondo, which worked on this fire-breathing creature from House of the Dragon, revolves around being able to show and talk about its work soon after a project’s release. (Image courtesy of Pixomondo and HBO) Catching the buzz of a release is great for audience engagement and recognition of the Pixomondo team’s hard work. Pixomondo contributed VFX to the Apple TV+ series See. (Image courtesy of Pixomondo and Apple TV+) For Spin VFX, which worked on the boxing biopic Big George Foreman, a successful marketing and promotional campaign should be designed around the firm’s creative talent, culture, and a passion for both industry and community. (Image courtesy of Spin VFX and Sony Pictures)

“We mainly showcase our work on our social media channels like Facebook, LinkedIn and Twitter and try to spend some time creating nice breakdowns of our projects. We also try to be a part of the most creative shows and create the best work to get nominated to the VES [Awards] and Emmys,” comments Måns Björklund, Executive Producer for Important Looking Pirates (ILP). It is important to maintain a strong online presence, notes Gavgavian. “Maintaining a professionally designed and regularly updated website is essential. It serves as a platform to showcase the studio’s brand, capabilities and portfolio. Additionally, active social media channels can be leveraged to effectively market and promote the studio’s work and talent base.” Gavgavian continues, “It is important to showcase completed projects. Distributing company reels that highlight the diversity and quality of visual effects services offered is an effective way to impress potential clients. Including ‘behind the scenes’ breakdowns of specific projects can provide insight into the studio’s approach, the complexity of the work and the final results.” Usaite adds, “We love to celebrate the work we’ve completed together with the artists that were involved. We always do reels, case studies and behind-the-scenes breakdowns – everything that demonstrates the skills, creativity and technical expertise we have in-house. It’s exciting to be working on big-name projects, and we share that excitement when we promote the successes we’ve had. Industry and local press are crucial to our marketing and PR strategy – getting all coverage we possibly can from print to social media. We want potential clients to see what we’re doing, and future team members as well.” Since Lux Aeterna’s rebrand (from Burrell Durrant Hifle or BDH) at the end of 2022, the firm’s focus has been generating brand awareness, so it has tailored its promotional activities accordingly. 
“Our goal currently is to get the new brand out there so that people understand who we are, what we’re great at and why they should work with us. We’re confident that our work does the talking for us, so showcasing our projects has been at the heart of our activities. Social media is an obvious but essential tool, as we can speak to all of our audiences instantly and easily. But as a relatively new brand, we’re focusing on building up our online presence, and PR is a huge driver for this,” Ashley says. To stand out in a crowd of VFX studios, “ILP is and has always been about the quality of the final image. We always strive to deliver the best possible result, not only in the final product, but also it is important for us to deliver a great service during the production process, making it easy and enjoyable for all involved to work with us,” Björklund says. “One of our goals is to put the talent front and center when we talk about our work so they get the recognition they deserve and can share with their families and colleagues. Going beyond just showing before and after comparisons, we seek to highlight the underlying technology and expertise that enables it all,” says Gaurav Gupta, CEO and Founder of FutureWorks.



PRODUCTION STRATEGIES

Spin VFX has a standard production strategy template that all its productions follow. Firstly, Ali says, “Casting of the VFX supervisor and producer is important to ensure that they have previous experience on a similar-type project and that their creative mindsets are likely to mesh with the director and production team.” For artists, it is important that they love the work and are inspired to work on the show, “so we would send out a call to ask who wants to work on the show, or if they know about it.” Secondly, it is important to establish a development team to create the project’s initial setup immediately upon project award and start breaking down any development tasks to be created and scheduled. Thirdly, Ali notes, “Resourcing is our biggest constraint and needs the most focus. Most times, the hours bid for a project and the actual time spent don’t mesh. When scheduling, it is important to have transparency with the team creating the work.” She adds, “This level requires direct talks, clear understanding of the vision and deadlines to properly schedule based on the artist skillsets and complexity of work, understanding that there will be revisions both internal and external.” Lastly, comments Ali, “One of the best production strategies that should be considered is the automation of repetitive tasks. Pipeline and rendering are two important departments that are underestimated until the last moment. This creates unnecessary stress on all teams when they should have been considered in the project planning.” “Production strategies will differ depending on how spread out a studio is – a single office, multiple regional offices or a global organization,” notes Gupta. “Different strategies will suit those setups, and it’s rapidly changing as the technology is becoming increasingly real-time, adopting the use of game engines for in-camera VFX and previsualization. At the same time the various project management tools on the market enable a dynamic review system.
You can send large amounts of data and reach anyone on the other side of the world very quickly, all of which makes productions more agile and iterative.”

THE WORK ENVIRONMENT

When work is underway, “we do everything we can to create a healthy and creative environment where people share their ideas. Everyone at Vine FX has a voice,” comments Usaite. “Problem-solving becomes easier when you have a talented team that feels empowered to get involved. This not only creates good quality work but also eliminates the stress factor.” The best production strategies are fueled by connection and communication, according to Liz Montes, The Third Floor Global Head of Production. “At The Third Floor, we listen to our clients’ needs while focusing on our people and our pipeline. We have to enable everyone on the team to collaborate and shine, while removing blockers so that they can be successful.” Usaite comments, “Listening to your staff is essential, and [so is] knowing what changes or systems you have to implement to help them to get their jobs done. This will further grow the achievable objectives that you set. We have a development system in place to

TOP: For the Netflix film True Spirit, Spin VFX created the boat, water and environmental effects. (Image courtesy of Spin VFX and Netflix) SECOND FROM TOP: Spin VFX, which contributed VFX to Ant-Man and the Wasp: Quantumania, focuses on the casting of the VFX supervisor and producer to ensure that they have previous experience on a similar project and that their creative mindsets mesh with the director and production team. (Image courtesy of Spin VFX and Marvel Studios) BOTTOM TWO: The Third Floor, which worked on effects for Guardians of the Galaxy Vol. 3, prides itself on its unique ability to handle unexpected events, which it refers to as “real-time production calibration.” (Image courtesy of The Third Floor and Marvel Studios)



“From press pieces, social media posts, project/technology case studies, newsletters and events, we’re able to create more engagement, which is key to promoting.” —Lauren Moore, Chief of Staff, The Third Floor

grow our team and support them in becoming more independent. Ownership of a role and a career path is something that’s very important to us, and that will naturally help us to build a stable business with longevity.”

EMBRACING CHANGE

Constant learning and innovation are essential to sustain growth. “It’s normal for things to change, so you have to be flexible,” Usaite says. “The COVID-19 pandemic was a challenge, and it was one we rose to. We’d already had a flexible technology infrastructure, so we could adapt quickly. The quicker you adapt, the better the outcomes for everyone. We embrace change and challenge. It’s not something to be frightened of, but [to be] inspired by. Our goal is to continue a steady pace of growth and just deliver good work with a happy, healthy team behind it.” The Third Floor prides itself on its ability to handle unexpected events with ease. Notes Dane Allan Smith, Chief Strategy Officer at The Third Floor, “I refer to this as real-time production calibration, and it’s something that we take very seriously. Transparency is key in communication, and everyone within our organization is aware of the current and projected volume of business. We also place a great deal of value on company culture, and regularly develop new training programs to ensure that our team is made up of top-tier talent and exemplary leaders.” “Ours is a dynamic industry, so we make it a point to revisit our strategy regularly and see where we can further optimize our processes and adapt to stay ahead of the curve,” Ashley notes. “We constantly experiment with new ways of working and new pieces of software. Anything that can enable us to push further and produce better results is worth trying in our eyes. At the same time, we make sure that no matter what we’re testing, it never compromises the quality of our work.”

ACHIEVING PROFITABILITY

TOP TO BOTTOM: The Third Floor, which contributed VFX to She-Hulk: Attorney at Law, listens to clients’ needs to align them with its own staff and pipeline to enable everyone on the team to collaborate and shine. (Image courtesy of The Third Floor and Marvel Studios) Cambridge, England-based Vine FX worked on the No Escape series. Their approach isn’t to “sell” what they offer, but instead to showcase the talent they have and the projects they work on. (Image courtesy of Vine FX and Paramount Plus) Digital Domain, which contributed VFX to Black Panther: Wakanda Forever, believes vendors must showcase their capabilities, services and talent roster, and highlight their commitment to collaborate with filmmakers. (Image courtesy of Digital Domain and Marvel Studios)

The VFX industry is known for challenging budgets, intense competition and global adjustments, notes Gavgavian. “While profitability can be challenging, we have evaluated our business plan and made adjustments in various ways,” she says. Another important factor, she continues, is that “investing in more velocity towards achieving highest possible efficiency for workflows also allows the ability to strive [for] and achieve profitability.” “We believe in transparency and ensure that we’re always being realistic with what we can achieve with the client’s budget and have those honest conversations early on. This in turn builds trust and sets us up for a positive working relationship. Having a production team with years of experience under their belt enables us to run a tight ship, tracking the project throughout so we can proactively course-correct when necessary to avoid problems down the line,” comments Ashley. “Profitability is not only driven by price, but also solid client relationships, production processes and efficiencies both in talent and innovation that can help you to price competitively,” Ali explains. “We strive not to compete on price, but on creativity, as there will always be someone ready to ‘outbid’. Therefore, if we can compete



on creativity and innovation, this is not easily achievable if you’re not already established.” “The first thing that comes to my mind is honesty. Communication and mutual respect with clients are incredibly important,” Usaite remarks. “Of course, the real key is to make sure that your costs don’t exceed your incomings, which is pretty simple. Don’t overpromise, don’t underdeliver. Don’t lose money on a project. We try to be open with clients, including in development stages, making sure their expectations are met. We use production management software to help keep us on track and maintain a clear billing cycle, too.” Gupta says, “It’s always a balancing act with multiple scales involved, from bidding projects to your workflows and team management. Having multiple strategies in place to manage the cost of talent, from acquisition to internal training frameworks and mentorship, is one of the pillars for maintaining profitability. At the same time, a robust technological framework is just as necessary to stay efficient with your pipeline. You want to factor in as many variables as possible to keep the production pipeline on track. The more we repeat the same type of work, the more efficient we become, both on the systems level and on the talent side.”

“Our approach isn’t to ‘sell’ what we offer, but instead to show the people we work with the talent we have, and the projects we work on. Let the work speak for itself – industry colleagues don’t need to be told that something is good, they can see it.” —Laura Usaite, Managing Director, Vine FX

FOSTER CLIENT RELATIONSHIPS

“The most important pillar of any service business is building strong client relationships,” comments Gavgavian. Developing and maintaining a strong foundation of trust and open communication with studio clients is essential. “Actively listening to client needs, being transparent about capabilities and limitations, delivering on promises and maintaining consistent and clear communication throughout the relationship and project journey are crucial aspects of building successful partnerships,” she notes. “We started to become well-known for our communication skills,” notes Usaite. “Being nice to everyone involved in the process is really important, as well as the ability to translate information about deadlines and budgets. Reacting in a timely manner to client questions and simply being there for them helps to foster long-term partnerships, which has resulted in a lot of repeat clients for us.” She adds, “Being open, present and kind are the guiding principles we adhere to.” “We try where possible to help people understand our industry so they can get the best possible results from a VFX team. This helps to get more of the budget on-screen and makes the process a lot smoother for all parties. At Lux Aeterna, we pride ourselves on fostering a creative partnership with our clients – we’re an extension of their team, going on the journey with them from the earliest stages of a project right up to the finish line,” Ashley notes. “When you have a base level of honesty with clients, staff and industry partners, you can be trusted. It’s building that trust that gives people confidence in what we do, which all leads to ongoing and future success. Be honest and everything else will fall into place,” Usaite advises.

TOP THREE: Vine FX, which contributed visual effects to War of the Worlds Season 3, aims to set an active example of how constant learning and innovation are essential to sustain growth. (Image courtesy of Vine FX and Disney+) BOTTOM: FutureWorks provided all of the color work for the second season of Rocket Boys, the historical drama from Indian streaming platform Sony LIV. (Image courtesy of FutureWorks and Sony LIV)



[ THE VES HANDBOOK ]

Coming in October 2023! The VES Handbook of Virtual Production is the most comprehensive guide to virtual production techniques and best practices available. The Visual Effects Society and the editors of The VES Handbook of Visual Effects, Susan Zwerman, VES, and Jeffrey A. Okun, VES, compiled the latest industry-standard technologies and workflows for the ever-evolving, fast-paced world of virtual production from the experts in the field. The editors tasked the authors with sharing their knowledge in the various areas of virtual production, including VR, AR, MR and XR technology, as well as providing detailed sections on interactive games and full animation. Additionally, the 80 industry contributors share their best methods, tips, tricks and shortcuts developed using real-world technology for In-Camera VFX (ICVFX). This book is for everyone from novices to working professionals. Some of the topics included are:
• Visualization
• VAD (Virtual Art Department)
• Volumetric Capture
• How to Capture Environments
• LED Stage Setup
• LED Display
• Software/Hardware for VP
• Cameras and Camera Tracking
• Color Management
• External Lighting for the Volume
• Challenges and Limitations of Shooting in a Volume
• Virtual Production Glossary
The VES Handbook of Virtual Production covers essential techniques and solutions for all VP artists, VP producers and VP supervisors, from pre-production through filming in LED volumes to post-production. It demystifies virtual production so that more producers and filmmakers can more easily navigate this new technology. Currently, there is an explosion of LED volumes being built in the U.S. and around the world, and until now there was no informational handbook covering what teachers, students and artists need to learn. The pandemic has transformed the entertainment industry, forcing studios to rethink the way productions are planned.

The VES Handbook of Virtual Production is incredibly timely as there has been a seismic shift in how visual effects are being created. Using LED walls to create environments, self-lighting, and using a camera that is not locked into position has opened a new range of abilities to capture in-camera what before was only possible to add in post. This allows the actors, crew and camera to see and react in real time – merging the physical world with the digital world – bringing higher levels of performance, believability and technical perfection to the craft. About the Editors: Susan Zwerman, VES, is an experienced Visual Effects Producer with a passion for cutting-edge film production. She is highly respected for her expertise in visual effects and virtual production budgeting and scheduling. As chair of the DGA UPM/AD VFX Digital Technology Committee, Zwerman organizes virtual production seminars to introduce members to this exciting and evolving new technology. Zwerman received the Frank Capra Achievement Award in recognition of career achievement and service to the industry and the Directors Guild of America in 2013. She is a member of the Academy of Motion Picture Arts and Sciences, the Producers Guild of America, the Directors Guild of America and a member and Fellow of the VES. Jeffrey A. Okun, VES, is an award-winning Visual Effects Supervisor who is more than conversant with virtual production. He is a member and Fellow of the VES and a member of the Academy of Motion Picture Arts and Sciences, the American Society of Cinematographers, the Television Academy and the Editors Guild. Okun created visual effects tracking and bidding software in 1992 that is still the basis of what is in wide use within the industry today, as well as the revolutionary visual effects techniques dubbed the “PeriWinkle Effect” and the “Pencil Effect” – a predictive budgeting tool. He is also a noted rock ‘n’ roll photographer of bands from the ‘70s and ‘80s.
Members receive a 30% discount with the code ADC20. Order yours today: https://bit.ly/43JVf1h

90 • VFXVOICE.COM FALL 2023



[ VES SECTION SPOTLIGHT: INDIA ]

India Revisited: Celebrating the Dynamic VFX Community By NAOMI GOLDMAN

TOP TWO: VES India members hold their 2023 VES Awards nominations event in Mumbai. BOTTOM TWO: VES India members and guests mingle at the Fall 2022 membership drive mixer.

The VES' international presence gets stronger every year, largely because of our regional visual effects communities and all that they do to advance the Society and bring people together. While most VES Sections are concentrated in one main city center, the VES India Section encompasses the country's four VFX industry hubs: Mumbai, Chennai, Bangalore and Hyderabad. Now six years strong, VES India is thriving with 75+ members drawn from across the Asian subcontinent, with backgrounds in VFX studios, cinematography, gaming and production.

Abhishek Krishnan, VES India Chair since its inception in 2020 and Head of Studio & General Manager at Mihira AI, offers perspective on the state of the regional industry: "India is a growing market for animation and VFX, driven largely by increasing demand for top-quality effects in film, television and gaming. Indian and international studios have elevated their standing in the last few years by providing services as well as their own IP content, and an increasing number of studios have established hubs in Mumbai, most recently ILM and Framestore. Given our thriving industry and a great talent pool of highly skilled artists and technicians, we are proud that our country has become a global center point for visual effects."

"The Indian film industry is embracing visual effects to create visually stunning content," said Rutul Patel, VES India Co-Chair and CEO of Digital District India. "Indian VFX artists – many of whom are now among our VES Section members – are driving this trend, elevating the profile of our local talent on the global stage and showcasing our country's role and potential as a true hub for VFX and animation."

Roughly 60% of the membership is based in Mumbai, with the remainder dispersed about equally across the other three hubs.
While most in-person events have been held in Mumbai, the Section has worked to hold simultaneous events with Bangalore and hopes to hold more events in Hyderabad in the coming year – including potential involvement in the annual VFX Summit in Hyderabad, one of the pioneering VFX conclaves in South Asia. The VES India Section aims to reach 100+ members by the end of the year and recruits VFX professionals through meet-and-greets and events, where prospective members can interact with the close-knit group of artists who form this vibrant community. Word of mouth and personal networking to identify and welcome potential members are also strong ongoing threads in the Section's expansion.

Co-Chair Patel shared what it's like to be a woman in visual effects in India and how she is using her platform: "When I joined the VES in 2017, I was one of only two women, and now we comprise closer to 10% of our Section members, which is aligned with the ratio in the industry. Most of the bigger studios, even if they have more women in their ranks, are in coordination or



production, with just about 10% in creative roles. It's tough because of the culture, and it's harder for women to convince their families that this could be their job. More education needs to be done to explain the importance of persevering in this hard industry, which yields so many great things if you stick with it. I like being part of the 10%; it feels special. I view my role with pride as I work to reach out to women and encourage them to join the VES – and I am teaching my friend's 13-year-old daughter to use Blender and lean in to her love of stop-motion and graphics!"

When it comes to the Section's programming: "COVID was tough on everyone, and we adapted by going online, which was the safest and quickest way to stay connected," said Krishnan. "We organized training sessions on the latest and greatest in Unreal, and we had sessions from various regional and international studios who provided key insights into the making of their films."

One of the virtual events was a "behind the scenes" program with Co-Chair Patel, who shared her experience in Epic Games' 'Women Creators Program' fellowship, serving as both an educational event and a membership drive. "In our cohort, there was a woman from Afghanistan who barely had internet, and we all took turns helping her with connectivity, as she could really only get online for minutes at a time. This was an amazing community of women, and I was proud to represent India and the VES. Bringing this transformational experience back to the VES was very cool."

Returning to live events, the Section held a nominating event for the 2023 VES Awards at Netflix India – its first in-person gathering back and a great success – giving members an opportunity to come together, use their expertise and build community.
Krishnan continued, "We have restarted physical screenings, which were shut down for a long time due to COVID. We now have two new screening venues in Mumbai and plan to explore more venues to support our regional hubs. We hold regular physical and online meet-and-greets to discuss current topics and how we can improve engagement with the rest of the members and draw in new ones. We are also planning to collaborate with companies on educational events and to restart the physical 'The Making of' series, in which studios across India show and tell how they achieved the VFX and animation shots in a showcased film."

The Section Board of Managers is focused on keeping engagement and spirit high. They believe every member is of high caliber, and while each is occupied with their professional and personal lives, the goal is to find the right balance to ensure everyone moves forward and stays connected.

Remarking on his tenure in the VES, Krishnan noted, "As a member, it is so easy to approach anyone in the industry, as they always make you feel welcome and at home. It's all one big connected family." Patel shared, "Keeping our professional competition aside, we come together under the name of the VES, and it makes all of our interactions more human and personal. All of the people we get to know and understand – that is the real gift and value of being a part of the VES."

TOP: VES India members and guests at the Fall 2022 membership drive mixer. SECOND FROM TOP: VES India members and guests gather for an exclusive film screening. BOTTOM TWO: VES India members and guests gather for their Spring 2023 membership drive.



[ VES NEWS ]

VES Celebrates SIGGRAPH 2023 By NAOMI GOLDMAN

TOP LEFT: The Los Angeles Section Board of Managers hosted the successful VES LA SIGGRAPH Party. TOP RIGHT TO BOTTOM: VES members and guests enjoying the festive party at SIGGRAPH.

The VES Los Angeles Section celebrated our global community with VES members from around the world at a high-energy VES SIGGRAPH Party on August 9th in downtown Los Angeles. SIGGRAPH is the premier conference and exhibition on computer graphics and interactive techniques worldwide. This year, ACM SIGGRAPH celebrated its milestone 50th conference, reflecting on a half century of discovery and advancement while charting a course for a bold and limitless future and the next 50 years of innovation. The annual conference has ushered in new breakthroughs, bolstered a community of perpetual dreamers, and mapped a future where exponential innovation isn't just possible – it's inevitable.

VES members from around the globe and their guests gathered at the festive mixer during the conference to meet fellow members and colleagues at the forefront of visual effects, computer graphics, digital art, animation, new realities, artificial intelligence, research and more. The hugely popular, capacity-crowd event, which was free to VES members, gave SIGGRAPH attendees the opportunity to enjoy one of the many benefits the VES offers its global membership.



[ FINAL FRAME ]

There Will Be Mushroom Clouds

Christopher Nolan's Oppenheimer is not the first film to tackle the subject of the atom bomb and the harrowing potential of world-ending technology. Some 150 films and documentaries have dealt with this subject, including The Beginning or the End in 1947, not long after the U.S. dropped two nuclear bombs on Japan to end World War II. Among the more notable nuclear-themed titles over the years are 1954's Godzilla, 1964's Fail Safe, 1964's Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, 1979's The China Syndrome, 1983's Testament, 1983's The Day After, 1983's WarGames, director Shôhei Imamura's 1989 Black Rain, 1989's Fat Man and Little Boy and the 2019 miniseries Chernobyl. And who could forget the nuclear destruction in James Cameron's Terminator films? Terminator 2: Judgment Day won the Oscar for Best Visual Effects in 1992 with the team of Dennis Muren, VES, Stan Winston, Gene Warren Jr. and Robert Skotak.

Image from Terminator 2: Judgment Day courtesy of TriStar/Sony.




