VFX Voice Spring 2024

Page 1

EYE ON THE LAS VEGAS SPHERE

22ND ANNUAL VES AWARDS & WINNERS • VFX’S UNSUNG HEROES

VFX IN CANADA

• PROFILES: MARIANNE SPEIGHT & SEARIT HULUF

VFXVOICE.COM SPRING 2024

Welcome to the Spring 2024 issue of VFX Voice!

Coming out of a much-anticipated awards season, this issue of VFX Voice shines a light on the 22nd Annual VES Awards gala. Congratulations again to all of our outstanding nominees, winners and honorees!

Immersive experiences at the Las Vegas Sphere take center stage in our cover story on this epic VFX venue. This issue delves into the state of TV/streaming and highlights sci-fi series 3 Body Problem, post-apocalyptic juggernaut Fallout and historical drama Shōgun. We explore trends in non-entertainment use of virtual tools, the VFX industry’s approach to environmental sustainability and the latest advances in mocap. VFX Voice also sits down with “Self” writer-director Searit Huluf and Milk Visual Effects Chief Business Development Officer Marianne Speight for up-close-and-personal profiles.

Continuing our expanded coverage of global VFX, we deliver a special focus on the booming VFX industry in Canada. We also spotlight our Oregon Section on its first anniversary. And, of course, we share the joy with our VES Awards photo gallery, capturing the festive celebration that awarded outstanding artistry and inventiveness in 25 categories and honored venerated Producer and Visual Effects Producer Joyce Cox, VES with the VES Lifetime Achievement Award and acclaimed actor/director William Shatner with the VES Award for Creative Excellence.

Dive in and meet the innovators and risk-takers who push the boundaries of what’s possible and advance the field of visual effects.

Cheers!

Kim Davidson, Chair, VES Board of Directors

Nancy Ward, VES Executive Director

P.S. You can continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on X (Twitter) at @VFXSociety.

CONTENTS

FEATURES

8 COVER: LAS VEGAS SPHERE – World’s largest high-resolution LED screen immerses audiences.
14 TECH & TOOLS: MOCAP – Fast-advancing mocap has a bright future in a variety of markets.
22 TELEVISION/STREAMING: 3 BODY PROBLEM – Major visual effects establish epic scope of ‘novel’ sci-fi series.
28 PROFILE: SEARIT HULUF – “Self” writer/director steps out on stop-motion Pixar SparkShort.
34 TELEVISION/STREAMING: SHŌGUN – Using VFX to close the gap between accuracy and authenticity.
40 VFX TRENDS: UNSUNG HEROES – Recognizing ‘hidden’ talent pivotal to making final shots a reality.
48 THE 22ND ANNUAL VES AWARDS – Celebrating excellence in visual effects.
56 VES AWARD WINNERS – Photo Gallery.
62 SPECIAL FOCUS: VFX IN CANADA – Frontline studios in Vancouver, Toronto and Montreal power growth.
68 PROFILE: MARIANNE SPEIGHT – Providing filmmakers the tools they need to succeed with VFX.
72 TELEVISION/STREAMING: FALLOUT – Westworld team brings Megaton Power to game adaptation.
78 VFX TRENDS: NON-ENTERTAINMENT VFX – Virtual tools have become indispensable for multiple industries.
86 VFX TRENDS: SUSTAINABILITY – How the VFX industry is tackling its large carbon footprint.

ON THE COVER: A 366-foot-tall eyeball projected from spectacular live-entertainment venue Sphere lights up the Las Vegas evening skyline. (Image courtesy of Sphere Entertainment)

DEPARTMENTS

2 EXECUTIVE NOTE
90 VIRTUAL PRODUCTION HANDBOOK
92 VES SECTION SPOTLIGHT: OREGON
94 VES NEWS
96 FINAL FRAME – MATTE PAINTING

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

PUBLISHER

Jim McCullaugh publisher@vfxvoice.com

EDITOR

Ed Ochs editor@vfxvoice.com

CREATIVE

Alpanian Design Group alan@alpanian.com

ADVERTISING

Arlene Hansen

Arlene-VFX@outlook.com

SUPERVISOR

Nancy Ward

CONTRIBUTING WRITERS

Naomi Goldman

Trevor Hogg

Chris McGowan

Oliver Webb

ADVISORY COMMITTEE

David Bloom

Andrew Bly

Rob Bredow

Mike Chambers, VES

Lisa Cooke

Neil Corbould, VES

Irena Cronin

Kim Davidson

Paul Debevec, VES

Debbie Denise

Karen Dufilho

Paul Franklin

David Johnson, VES

Jim Morris, VES

Dennis Muren, ASC, VES

Sam Nicholson, ASC

Lori H. Schwartz

Eric Roth

Tom Atkin, Founder

Allen Battino, VES Logo Design

VISUAL EFFECTS SOCIETY

Nancy Ward, Executive Director

VES BOARD OF DIRECTORS

OFFICERS

Kim Davidson, Chair

Susan O’Neal, 1st Vice Chair

Janet Muswell Hamilton, VES, 2nd Vice Chair

Rita Cahill, Secretary

Jeffrey A. Okun, VES, Treasurer

DIRECTORS

Neishaw Ali, Alan Boucek, Kathryn Brillhart

Colin Campbell, Nicolas Casanova

Mike Chambers, VES, Emma Clifton Perry

Rose Duignan, Gavin Graham

Dennis Hoffman, Brooke Lyndon-Stanford

Quentin Martin, Maggie Oh, Robin Prybil

Jim Rygiel, Suhit Saha, Lopsie Schwartz

Lisa Sepp-Wilson, David Tanaka, VES

David Valentin, Bill Villarreal

Rebecca West, Sam Winkler

Philipp Wolf, Susan Zwerman, VES

ALTERNATES

Andrew Bly, Fred Chapman

John Decker, Tim McLaughlin

Ariele Podreider Lenzi, Daniel Rosen

Visual Effects Society

5805 Sepulveda Blvd., Suite 620

Sherman Oaks, CA 91411

Phone: (818) 981-7861

vesglobal.org

VES STAFF

Jim Sullivan, Director of Operations

Ben Schneider, Director of Membership Services

Charles Mesa, Media & Content Manager

Ross Auerbach, Program Manager

Colleen Kelly, Office Manager

Brynn Hinnant, Administrative Assistant

Shannon Cassidy, Global Manager

P.J. Schumacher, Controller

Naomi Goldman, Public Relations

Visit us online at vfxvoice.com • VOL. 8, NO. 2 • SPRING 2024

VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com. Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com. Comments: Write us at comments@vfxvoice.com. Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Copyright © 2024 The Visual Effects Society. Printed in the U.S.A. Follow us on social media.

LAS VEGAS’ SPHERE: WORLD’S LARGEST HIGH-RES LED SCREEN FOR LIVE ACTION AND VFX

On the outskirts of the Las Vegas Strip, a 366-foot-tall eyeball gazes out at the urban landscape. The traffic-stopping orb, simply named Sphere, has an exosphere of 580,000 square feet of LED panels that morph into the moon, an immense pumpkin, vast fireworks and much more.

While the exterior of Sphere is now an imposing part of the Greater Vegas skyline, its interior is an immersive, scaled-up entertainment destination with seats for 17,600+. Films, concerts and events are displayed on the largest high-resolution LED screen in the world, an arena-sized canvas for live action and visual effects.

The wraparound 16K x 16K resolution interior display is 240 feet tall, covers 160,000 square feet and comprises 64,000 LED tiles manufactured by Montreal-based SACO Technologies. The audio system, powered by Berlin’s Holoplot, uses 3D audio beam-forming technology and wave-field synthesis. Sphere Entertainment’s $2.3 billion project was designed by global architectural design firm Populous.

THIS PAGE AND OPPOSITE BOTTOM THREE: The newest addition to the Greater Las Vegas skyline is the 366-foot-tall Sphere. Its exosphere, the exterior shell of Sphere, has 580,000 square feet of LED panels that morph into all types of images. Sphere’s images range from a giant eyeball and leaf-like color bursts to an architectural lattice and a vivid moon. The Rockettes’ kicking and dancing also fill the Sphere and seem particularly well-suited to light up a Las Vegas night.

(Photos courtesy of Sphere Entertainment)

OPPOSITE TOP:

Nevada’s most endangered species crowd Sphere’s interior in Es Devlin’s “Nevada Ark” for U2’s show.

(Photo: Es Devlin. Courtesy of disguise and U2)

Sphere Entertainment developed bespoke technology for the outsized format, including its Big Sky 18K x 18K, 120 fps camera system. The Sphere Studios division’s main Burbank campus is dedicated to production and post-production of visuals and mixing of immersive audio for Sphere and houses Big Dome, a 28,000-square-foot, 100-foot-high geodesic dome that is a quarter-sized version of Sphere, for content screening.

The rock band U2 inaugurated Sphere with a five-month-plus residency for “U2: UV Achtung Baby Live at Sphere,” and showed off the venue’s vast creative possibilities for live shows. Director Darren Aronofsky’s immersive 50-minute film Postcard from Earth, which debuted soon after U2’s launch, tells the story of our planet seen from the future. Postcard used the Big Sky camera as well as Sphere’s 4D technologies, including an infrasound haptic system to simulate the rumbles of thunder or a rocket launch and sensory effects like breezes and scents.

“At its best, cinema is an immersive medium that transports the audience out of their regular life, whether that’s into fantasy and escapism, another place and time or another person’s subjective experience. The Sphere is an attempt to dial up that immersion,” Aronofsky wrote in a press release.

Soon after Sphere’s opening, Autodesk and Marvel Studios teamed up to create an ad celebrating the former’s software and The Marvels film for an Autodesk customer event in Las Vegas. The Mill helped with the VFX, utilizing the Autodesk tools Maya and Arnold. The segment featured a gigantic Goose the flerken (a cat-like creature that transforms into a monstrous alien) on the exterior of Sphere, another massive visual certain to draw attention for miles around.

7thSense provides Sphere’s in-house media servers, processing and distribution systems, which were used fully on Postcard from Earth; they form the venue’s main playback system. For “U2:UV,” the visuals were coordinated by Treatment Studio and powered at Sphere by a disguise playback system.

U2 AT SPHERE

Brandon Kraemer served as a Technical Director for Treatment Studio on the “U2:UV” residency at Sphere. He comments, “The unique thing that Sphere brings to the concert experience is a sense of immersion. The fact that it’s a spherical image format and covers much of your field of view – and it’s taller than the Statue of Liberty on the inside – means it becomes an instant spectacle, and if you leverage that for all its uniqueness, you can’t help but blow audiences’ minds.”

Kraemer recalls, “Willie Williams [U2 Creative Director and Co-Founder of London-based Treatment Studio] contacted me in September of 2022 about the project. That was very early on in the process. Early creative was being discussed then, but just as importantly we started to embark on just how we were going to technically pull this off.”

TOP TWO: To capture large-scale, ultra-high-resolution imagery, Sphere Entertainment’s Burbank-based unit, Sphere Studios, developed the 18K x 18K, 120fps Big Sky camera system, used in spectacular fashion by Darren Aronofsky’s Postcard from Earth.

BOTTOM: A massive cross of light is a simple but powerful visual at this scale, part of the band’s “U2: UV Achtung Baby Live at Sphere” residency. (Photo: Kevin Mazur. Courtesy of disguise and U2)

Kraemer continues, “The majority of the visuals were designed by the artists at Treatment under the creative direction of Williams and Producer Lizzie Pocock. However, there were other collaborators on key pieces as well. Khatsho Orfali, David Isetta and their team from Industrial Light & Magic created an amazing cityscape that deconstructs itself for U2’s new song ‘Atomic City.’” And, he adds, “Marco Brambilla and his team at The Mill in Paris created a unique world for ‘Even Better Than the Real Thing,’ a dense psychedelic collage.”

There were numerous technical challenges and quite a few diplomatic challenges as well, and these two areas often overlapped. Kraemer explains, “Opening a building and working in a construction site while stepping through rehearsal programming is quite a feat. My hat’s off to U2’s legendary Production Manager, Jake Berry, for keeping the whole operation moving forward in the face of what were, at times, some serious headwinds. Getting content rendered on that screen has lots of challenges along the way, and we were also very fortunate to have the support of disguise and their [GX 3] servers as the backbone of the playback system. We couldn’t have produced the show we did without their support.” In addition, the show utilized a custom stage, based on a turntable design by Brian Eno, and covered by Yes Tech and ROE panels.

U2’s reaction was very positive, according to Kraemer. “The band put a lot of trust in the teams that Willie Williams put together, and they were pretty blown away by it all.”

DISGUISE

Peter Kirkup, disguise’s Solutions and Innovation Director, recalls, “We first became involved in Sphere through [U2’s Technical Director and Video Director] Stefaan ‘Smasher’ Desmedt. Together with Smasher, disguise has been working on U2 shows for decades, so it was a perfect fit.”

Kirkup adds, “Disguise’s software and hardware powered the visuals that were displayed on Sphere’s wraparound LED screen during the U2 show. First, our Designer software was used to help previsualize and edit the visual content – all brought together by the creative minds at Treatment Studio, including Brandon Kraemer and Lizzie Pocock as well as Willie Williams.”

Disguise’s Designer software allowed the creative team to previs their visuals on a computer with the help of a 3D digital twin of the Sphere stage. “This real-time 3D stage simulator meant ideas could be communicated more clearly and quickly to get everyone on the same page,” Kirkup notes. “Designer also helped the team to sequence the visuals into a timeline of beats and bars – and import audio to lock visuals to the beat. This helped create snappy, rhythmic edits and some extra looping segments that could be pulled in on the fly in case the band decided to do an extra riff on the day of the show.”
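As a back-of-the-envelope illustration of the beat-locking Kirkup describes, musical time maps to playback frames with simple arithmetic: a beat lasts 60/BPM seconds, so a cue’s bar-and-beat position converts directly to a frame count. The short Python sketch below is purely illustrative; the function name, default tempo and frame rate are invented for the example and are not disguise Designer’s actual API.

# Illustrative only: mapping musical time (bars/beats) to playback frames,
# the arithmetic behind beat-locked sequencing. Hypothetical names, not
# disguise Designer's API.

def beat_to_frame(bar, beat, bpm=120.0, beats_per_bar=4, fps=60.0):
    """Return the playback frame on which a given bar/beat lands (1-indexed)."""
    total_beats = (bar - 1) * beats_per_bar + (beat - 1)
    seconds = total_beats * 60.0 / bpm
    return round(seconds * fps)

# A cue on the downbeat of bar 17 at 120 BPM, 60fps playback:
print(beat_to_frame(17, 1))  # 1920 frames, i.e. 32.0 seconds in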

Kirkup continues, “Once the visuals were complete, our software split and distributed the 16K video into sections. We were working with one contiguous LED screen but still needed to split the video into sections because of the sheer volume of content involved. We were playing real-time Notch effects and pre-rendered NotchLC content at 60fps across the Sphere’s 256,000,000-pixel, 16K x 16K interior canvas.

“Finally, our GX 3 media servers enabled all individual pieces to be perfectly in sync throughout the show,” Kirkup says. “This technology also allowed us to composite layers of video together in real time. For example, the video feed of the band that cinematic cameras were capturing during the show could be composited into our LED visuals from the Designer software. Each server was also upgraded with a 30-terabyte hard drive, so we had local storage machines for playout and 100GB networking back to the content store for file transfers and media management.”

Kirkup adds, “We furthered our Single Large Canvas workflows, which enable content to be broken up into pieces and distributed across a cluster of machines – essential work to make a project like this come to life. We also introduced some custom color pipeline work for Sphere, adapting our standard color pipeline to match the unique characteristics of the in-house LED system.” He continues, “A big challenge was handling such a large volume of content across 256,000,000 pixels – in real time. There were 18,000 people watching the show, and they all had their camera phones ready to broadcast to even more people, so we really had to make sure the show went well.”
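Kirkup’s Single Large Canvas description can be made concrete with a little bookkeeping: divide one huge frame into rectangular tiles and assign each tile to a machine in the cluster. The Python sketch below is a hypothetical illustration, not disguise’s implementation; the tile counts and function are invented, and it uses 16,000 x 16,000 pixels so the totals match the 256,000,000-pixel figure quoted above.

# Hypothetical "single large canvas" bookkeeping: split one 16,000 x 16,000
# frame (256,000,000 pixels, per the figures above) into rectangular tiles,
# one per playback server. Not disguise's code.

CANVAS = 16000  # pixels per side

def make_tiles(canvas=CANVAS, cols=4, rows=4):
    """Return (server_id, x, y, width, height) for each server's tile."""
    tw, th = canvas // cols, canvas // rows
    return [(r * cols + c, c * tw, r * th, tw, th)
            for r in range(rows) for c in range(cols)]

for server_id, x, y, w, h in make_tiles():
    print(f"server {server_id:2d}: origin=({x},{y}) size={w}x{h}")

# Each server plays back only its own tile; a shared clock keeps every tile
# advancing on the same frame number so the seams never become visible.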

Kirkup remarks, “Bono mentioned this during the show, but I believe the most important thing about Sphere is that for the first time, a venue of this scale is being created with musicians in mind. In the past, musicians needed to squeeze into sporting arenas or stadiums that weren’t created for music – they may have had tiny screens or the wrong acoustics. With Sphere, that’s all changed. For real-time graphics and VFX artists, that’s a big trend to watch for in 2024 and beyond. I expect to see more venues designed specifically to highlight 3D visuals. With that, more VFX artists and studios will be pulled in to develop not only movie and TV effects – but incredible visuals for live events, too. The two industries will start to blur.”

7THSENSE

7thSense – a creative software and technology company based in Sussex, England – put together the Sphere in-house playback system and provides hardware for media serving, pixel processing and show control. “Building a first-of-its-kind venue like Sphere brought with it a significant number of challenges that the 7thSense team was keen to dig their collective fingers into,” explains Richard Brown, CTO of 7thSense.

Brown notes, “Managing exceptionally large canvases of playback, generative and live media as a single harmonious system is of utmost importance in a venue of this scale, and it is a workflow and underpinning technology we have been working on for quite some time. With a 16K x 16K canvas size, Sphere placed a priority on accelerating the development of the tools for media playback, multi-node rendering of generative assets and live compositing from multiple ST 2110 streams, as well as for pre-visualizing the show without having access to the full system. Because time in the venue is an incredibly rare commodity, anything that can be done ‘offline’ helps to make the time in the venue more productive.”

TOP TO BOTTOM: The visuals for U2’s “Atomic City,” with VFX work by ILM, include a stunning deconstruction of Las Vegas going back in time. (Photo: Rich Fury. Courtesy of disguise and U2)

The desert landscape around Las Vegas became a backdrop for U2’s “Atomic City.” (Photo: Rich Fury. Courtesy of disguise and U2)

Marco Brambilla’s dense psychedelic collage “King Size,” put together with the help of The Mill in Paris, is an ode to Elvis Presley that accompanies the U2 song “Even Better Than the Real Thing.” (Photo: Rich Fury. Courtesy of disguise and U2)

Brown adds, “High-speed streaming of uncompressed media from Network Attached Storage (NAS) is something we have been wanting to do for a long time, but the technology was not sufficiently advanced to support the bandwidth and timely delivery of data until very recently. Fortunately, the use case for this technology aligned very much with the desired workflow at Sphere, giving us the chance to really dig into what could be an industry-changing technology for media production and presentation systems.”

Brown continues, “Managing synchronized media playback across dozens of servers is one thing, but making it straightforward for a show programmer to build the show that spans dozens of servers is quite another. 7thSense developed an Asset Logistics workflow that simplifies determining which actual movie frames each server streams from the NAS, based on representative meta-media used for programming the show timeline.”

Brown explains, “Each server is configured with what section of the dome it is responsible for playing back, and this information, coupled with the name of the movie from the timeline, is used to determine the file path on the NAS that each media server uses to access the appropriate movie frames. This workflow reduces user error and makes timeline programming significantly faster than managing individual movies per server.”
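Brown’s description of the lookup (a server’s configured dome section plus the movie name from the timeline resolving to a NAS path) amounts to a deterministic naming convention. A hedged Python sketch of the idea, with an invented directory scheme and function name rather than 7thSense’s real one:

# Hedged sketch of the path-resolution idea described above: dome section +
# movie name -> NAS path for that server's frames. Naming scheme is invented.

from pathlib import PurePosixPath

def frame_path(nas_root, movie, section, frame):
    """Resolve the NAS path for one frame of one dome section."""
    return PurePosixPath(nas_root) / movie / section / f"frame_{frame:07d}.exr"

# A server configured for the "dome_north_upper" section, playing "postcard":
print(frame_path("/mnt/nas/shows", "postcard", "dome_north_upper", 1442))
# -> /mnt/nas/shows/postcard/dome_north_upper/frame_0001442.exr

# The show programmer works against one lightweight proxy timeline; each
# server substitutes its own section name to fetch the frames it plays.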

Brown comments that Sphere is the first entertainment venue of its kind when it comes to the size and resolution of the media being presented to an audience. He says, “It is imperative that all media players, generative engines and pixel processors are working in absolute synchronization, or the illusion of immersion is lost for the audience. Worse than that, image tearing or jitter could cause the audience to become ill because of the immersive nature of the media plane. Everywhere you look, you are surrounded by the media.”

In addition, Brown notes, “Not only is it our first major application of ST 2110, it just happens to be the largest ST 2110 network in an entertainment venue on the planet!” 7thSense has been in the world of immersive presentations in planetaria, domed theaters, museums and theme park attractions since going into business nearly 20 years ago. But what has been created at Sphere is something new, a destination live-event venue, and the technology far surpasses what has been built to date. “This hybrid type of entertainment has the potential to create its own category of immersive live show experience,” Brown adds. “It’s exciting to be part of the team building it from the ground up.”

The interior display of Sphere can create huge individual displays for any performer, and the venue uses 3D audio beam-forming technology and wave field synthesis for an appropriately big and precise sound.

The huge $2.3 billion Sphere has altered the Greater Las Vegas skyline and become an entertainment destination, celebrating its launch in September 2023 with the “U2: UV Achtung Baby Live at Sphere” residency. (Photo courtesy of Sphere Entertainment)

“I think it’s an experience like no other,” Treatment Studio’s Kraemer says about Sphere. “It was a thrilling experience to be part of the first creative team to produce an amazing show there. I think ‘U2:UV’ will be a very tough act to follow, but I think there is a tremendous opportunity to give an audience something that is impossible in a stadium or arena show, and I look forward to seeing how this all evolves.”

TOP TO BOTTOM: The interior display of Sphere is 240 feet tall and covers 160,000 square feet with LED panels from SACO Technologies. (Photo: Rich Fury/Ross Andrew Stewart. Courtesy of disguise and U2) (Photo courtesy of disguise and U2)

THE EXPANDING HORIZONS OF MOTION CAPTURE

TOP: Snoop Dogg at Astro Project motion capture studio in Santa Monica for his “Crip Ya Enthusiasm” music video utilizing the Vicon system and StretchSense gloves. (Image courtesy of Vicon and Astro Project, LLC)

OPPOSITE TOP: On the mocap set of She-Hulk: Attorney at Law with diode suit and Digital Domain’s Charlatan “face-swapping” system.

(Photo: Chuck Zlotnick. Courtesy of Marvel Studios)

OPPOSITE BOTTOM: Rokoko’s Coil Pro is the company’s recent innovation in motion capture hardware, featuring no drift and no occlusion through a fusion of EMF and IMU capture.

(Image courtesy of Rokoko)

Motion capture, performance capture and volumetric video technologies are rapidly advancing, incorporating AI and ML to a greater extent and focusing on enhancing realism, precision and accessibility. Peter Rabel, Technical Product Manager at Digital Domain, comments, “The trend towards real-time capabilities has become prominent, allowing for immediate feedback and integration into virtual environments, video games and live events. As we integrate artificial intelligence and machine learning as tools to enhance these functions’ capabilities further, it will enable automated analysis and capture of movements in real-time, which will help save time on the process, leading to cost savings. It’s essential for us to stay updated on recent developments and industry trends to understand the current trajectory of these capture technologies as technology continues to evolve so we can better serve our clients.”

VICON: MARKERLESS

Vicon made a splash in 2023 with its Los Angeles SIGGRAPH announcement of the debut of its machine learning (ML)-powered markerless mocap. The news came after some three years of research and development focusing on the integration of ML and AI into markerless motion capture at Vicon’s R&D facility in Oxford, U.K. Vicon collaborated on the technology with Artanim, the Swiss research institute that specializes in motion capture, and Dreamscape Immersive, the VR experience and tech company.

“The ability to capture motion without markers while maintaining industry-leading accuracy and precision is an incredibly complex feat,” says Mark Finch, Vicon’s Chief Technology Officer. “After an initial research phase, we have focused on developing the world-class markerless capture algorithms, robust real-time tracking, labeling and solving needed to make this innovation a reality. It was our first step towards future product launches, which will culminate in a first-of-its-kind platform for markerless motion capture.”

Finch continues, “What we demonstrated at SIGGRAPH was markerless recognition of the human form – using prototype cameras, software and algorithms – to track six people, with their full body solved in real-time, in a VR experience. This completely eliminates the need for participants to wear heavy gear with motion capture markers. As a result, the VR experience is more seamless and believable as the motion capture technology is largely invisible and non-invasive.” Finch adds, “Of the technology we showcased, Sylvain Chagué, Co-Founder and CTO of Artanim and Dreamscape, said, ‘Achieving best-in-class virtual body ownership and immersion in VR requires both accurate tracking and very low latency. We spent substantial R&D effort evaluating the computational performance of ML-based tracking algorithms, implementing and fine-tuning the multi-modal tracking solution, as well as taking the best from the full-body markerless motion capture and VR headset tracking capabilities.’ ”

ROKOKO VISION

Based in Copenhagen, Rokoko had two major announcements on the product front in the last year: “First, with Rokoko Vision, our vision AI solution that allows for suit-less motion capture from any camera. We released the first iteration mainly to get to know the space and gather insights from early use of the product,” CEO and Founder Jakob Balslev comments. “It’s becoming increasingly clear to us what the users need, and we are excited to release more updates on that front.”

He adds, “Second, we unveiled our Coil Pro – the biggest innovation we’ve ever done on the hardware side – and, in my eyes, probably the biggest innovation ever in motion capture. Through a fusion of EMF and IMU capture, the Coil Pro unlocks the holy grail of motion capture: No drift and no occlusion. With drift-free global position over time and no need for line of sight from optical solutions, the Coil Pro is the best of both worlds of mocap [IMU and optical]. The underlying platform, named Volta Tracking Technology, fuses EMF and IMU and will be at the core of all our motion capture hardware solutions going forward.”

TOP: OptiTrack’s PrimeX 120 and PrimeX 120W cameras offer the company’s longest camera-to-marker range for Passive and Active markers. OptiTrack accuracy with more range enables very large tracking volumes for a wide variety of training and simulation scenarios, extreme ground or aerial robotic facilities and larger cinematic virtual production studios. (Image courtesy of OptiTrack)

BOTTOM: OptiTrack’s PrimeX cameras quickly identify and track Passive and Active markers. (Image courtesy of OptiTrack)

DIGITAL DOMAIN: CHARLATAN

Digital Domain is further developing its machine learning neural rendering software Charlatan (sometimes referred to as a face-swapping tool). “Acknowledging the expense and time associated with traditional methods, including our top-tier Masquerade [facial capture] system, we developed Charlatan to introduce efficiency and affordability,” Rabel comments. “Several years ago, Charlatan was created using machine learning techniques. This innovative approach involves utilizing real photography of an individual’s face and applying enhancements, seamlessly transferring it to another person’s face, or even manipulating discrete aspects such as aging or de-aging. Recently, we have been developing Charlatan 3D, which evolves this technology to produce full 3D geometry from this process but at a lower cost and simpler capture conditions than Masquerade. In essence, Charlatan represents a significant stride towards streamlining the creation of lifelike digital humans with unparalleled realism.”

OPTITRACK: NEW CAMERAS

OptiTrack provides tracking solutions that vary in use, including AAA game studios, medical labs, and consumer and prosumer budget solutions. In November the firm announced its three most advanced motion capture cameras: the PrimeX 120, PrimeX 120W and SlimX 120. “With higher resolution and increased field of view, these new additions enable larger tracking areas for a wider variety of training and simulation scenarios and larger cinematic virtual production studios,” says Anthony Lazzaro, Senior Director of Software at OptiTrack. All three cameras, which are designed and manufactured at OptiTrack’s headquarters in Corvallis, Oregon, feature their highest-yet resolution, 12 megapixels. “With the PrimeX 120, customers benefit from a standard 24mm lens while the PrimeX 120W comes with an 18mm lens with a wider field of view. [And] we have 24mm or 18mm wide lens options available with the SlimX 120.”
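For readers wondering why the 18mm option is described as wider: field of view follows from focal length via FOV = 2 * atan(sensor width / (2 * focal length)). A quick Python illustration; the 24mm sensor width is an assumed example value, not a published OptiTrack spec.

# Why a shorter focal length means a wider field of view. The sensor width
# here is a made-up example value, not an OptiTrack specification.
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal FOV in degrees: 2 * atan(w / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

for f in (24, 18):
    print(f"{f}mm lens: {horizontal_fov_deg(24.0, f):.1f} deg")
# 24mm lens: 53.1 deg
# 18mm lens: 67.4 deg  (shorter focal length -> wider view)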

Lazzaro continues, “We also released a more informative and intuitive version of our mocap software, which is now compatible with all OptiTrack mocap cameras. Motive 3.1 is aimed at simplifying high-quality, low-latency performance motion tracking, offering users easy-to-use presets and labeling for tracked items that deliver the best possible motion data while saving time and eliminating extra steps. Customers also have greater visibility into possible issues and can automatically resolve them, even in the harshest of tracking environments.”


STRETCHSENSE: MOCAP GLOVES

Founded in Auckland in 2012, StretchSense took on the mission to build the world’s best stretchable sensors for comfortably measuring the human body. “Building on top of our sensor technology, in 2019 we pivoted the business to focus on motion capture gloves for AAA studios, indie studios, streamers, VR/AR, live shows and more,” explains StretchSense Co-Founder and VP Partnerships & New Markets Benjamin O’Brien.

“Our Studio Gloves are incredibly unobtrusive, with a less than 1mm thick sensor layer on top of breathable athletic fabric, and a small transmitting module,” O’Brien says. “This is more than just a comfort and style thing though; it means that our gloves don’t get in your way, and you can continue to type, use a mouse, hold a prop, use your phone or just get a pizza from the door. Once you start to think about mixed-reality applications, this becomes even more critical, as our gloves allow you to switch seamlessly between interacting with virtual spaces and the real world.”

O’Brien adds, “Our mission is to democratize motion capture, allowing independent content creators and streamers to create incredible and immersive stories and experiences. To achieve this, we have a long-term goal of getting our gloves down to a true consumer price point, which will really open up the space. At $795, we think our latest StretchSense Studio Glove is the biggest step the industry has ever taken towards this goal; less than two years ago, something with similar performance would have cost well over $5,000.”

ARCTURUS AND VOLUMETRIC VIDEO

Based in Beverly Hills, Arcturus Studios was founded in 2016 by veterans of DreamWorks, YouTube, Autodesk, Netflix and other notable companies. “Together, they saw the potential for volumetric video and decided to work together to steer its development,” recalls Piotr Uzarowicz, Head of Partnerships and Marketing at Arcturus. “That led to the creation of the HoloSuite tools, consisting of HoloEdit – a tool that can edit the 3D performances of performers recorded with volumetric video – and HoloStream, software that can compress a completed volumetric video file and stream it to any 2D or 3D device, even if the broadband signal is unstable. Together, HoloSuite has helped make it possible to use volumetric video for everything from e-commerce to AR projects to virtual production and more.”

Uzarowicz continues, “Arcturus took over Microsoft’s Mixed Reality Capture Studios (MRCS) business [in 2023], including the development of that capture system – the most sophisticated in the world – as well as the rights to maintain and supply MRCS licenses to studios around the world. That has put Arcturus in a unique position where it is now developing for all stages of volumetric video, from the capture and editing all the way to the final distribution.”

TOP TO BOTTOM: OptiTrack’s Motive 3.1 advanced motion capture software can be paired with any of OptiTrack’s motion capture cameras, including the premium PrimeX, Slim or low-cost Flex series. Motive 3.1 also offers trained markersets, enhanced sensor fusion and pre-defined settings. (Image courtesy of OptiTrack)

StretchSense makes motion capture gloves for major and indie studios, streamers, VR/AR and live shows. (Image courtesy of StretchSense)

StretchSense’s mocap gloves are unobtrusive, with a less than 1mm-thick sensor layer on top of breathable athletic fabric and a small transmitting module. StretchSense’s $795 Studio Glove is a step toward the company’s goal of getting its gloves down to a true consumer price point. (Image courtesy of StretchSense)

“One of our goals has always been to make volumetric video more accessible. We’re looking at new ways to make it easier to capture volumetric videos using fewer cameras, including the use of AI and machine learning. With the MRCS technology and our licensees, we are working with some of the best and most creative content creators in the world to find where the technology can evolve and improve the production experience,” comments Uzarowicz. “We just released a new video codec called Accelerated Volumetric Video (AVV) that makes it possible to add more volumetric characters to a digital environment. With the MRCS technology, the quality of a captured performance is better than ever. Volumetric video is constantly evolving,” he adds.

MOVE AI

Move AI announced the official release of its single-camera motion capture app, Move One, in late November. “The app is now available to animators and creators looking to bring realistic human motion to their 3D characters,” said the company. “Move AI makes it easy to capture and create 3D animations.”

AI/ML

“Arcturus is currently experimenting with AI and machine learning in several ways. From the moment we were founded, one of our main goals has always been to make volumetric video more accessible, and AI can help us do that in a few different ways,” Uzarowicz comments. “Among other things, one of the areas we are currently focusing on in our R&D is using AI to help us capture the same level of quality – or better – we can currently capture but use fewer cameras. One of the things that makes our MRCS technology the best in the world is the software that converts the multiple captured recordings into a single 3D file. With AI, we hope to improve that process.” Regarding AI/ML, O’Brien says, “We are seeing many companies using motion capture to create their own proprietary databases for training or tuning generative AI models, and we are looking at how we can lean into this. Finally, we are ourselves constantly investing in machine learning to improve the data quality [of] our products.”

“Given our experience with machine learning, we see Gen AI as a tool like any other in our toolbox, enabling us to create artistically pleasing results efficiently in support of the story,” Digital Domain’s Rabel says. “We have found that the combination of powerful tools, such as machine learning and AI, with our artists’ creative talent produces the photorealistic, relatable, believable and lifelike performances we are striving for. We feel the nuances of an actor’s performance in combination with our AI and machine learning toolsets are critical to achieving photorealistic results that can captivate an audience and cross the uncanny valley.”

BOTTOM: Arcturus’s HoloSuite tools consist of HoloEdit – a tool that can edit the 3D performances of performers recorded with volumetric video – and HoloStream, software that can compress a completed volumetric video file and stream it to any 2D or 3D device, even if the broadband signal is unstable. With HoloSuite it’s possible to use volumetric video for e-commerce, AR projects and virtual production.

Lazzaro comments, “OptiTrack already uses ML algorithms to derive optimal solutions for things like continuous calibration and trained markersets. Continuous calibration takes existing visible objects in a scene, i.e. markers, and uses that data to determine how to make small adjustments to fix calibration issues related to bumps, heat or human error. Trained markersets allow you to feed marker data into an algorithm to make a model that can track objects that were previously not trackable, such as trampolines, jump ropes and other non-rigid objects.” Lazzaro adds, “Advances in AI and ML will continue to shape the way that objects are tracked in the future.” Rokoko’s Balslev notes, “AI/ML will fundamentally change the motion capture space. Text-to-motion tools are emerging and maturing and will eventually completely disrupt the stock space for online marketplaces and libraries. These tools will, however, not be able to replace any custom mocap that requires acting and specific timing.”

TOP: Arcturus took over Microsoft’s Mixed Reality Capture Studios (MRCS) business in 2023, including development of the capture system, as well as rights to maintain and supply MRCS licenses to studios worldwide. Arcturus also now develops for all stages of volumetric video. (Image courtesy of Arcturus)

VR AND MOCAP

“We [Vicon and Dreamscape Immersive] are together mapping out just how far markerless mocap can go in providing a more true-to-life adventure than any other immersive VR experience by allowing for more free-flowing movement and exploration with even less user gear,” Vicon’s Finch comments. “Dreamscape has said it has long awaited the time when markerless could break from concept and into product, where the technology could support the precision required to realize its amazing potential. We’re testing that potential together now.” Finch adds, “Seeing people’s initial reactions to VR when they’re fully immersed is remarkable. The fantasy-reality line blurs the more freedom you have in a VR space, and that freedom is reduced when a user is tethered and they feel the pull of the cable or know they’re wearing a backpack.” He continues, “There’s also the customer experience element that’s a central driver in all of this. People’s experience with markerless is a big wow moment. Markerless is going to lead to more magic – more wow.”

Lazzaro explains, “Mocap is used in all sorts of VR and AR applications. Typically, home systems use what is called inside-out tracking to have a head-mounted display [HMD] track the world around a user. This works great for HMD and controller tracking, but can’t be used to see other people wearing HMDs. OptiTrack uses an approach called outside-in tracking where we track the HMD, controllers and props using external cameras. This allows users to build location-based VR experiences in which multiple people can go through an experience together or engineers can work on designs in VR as a group.”

OUTLOOK

“We think these markets [motion capture, performance capture and volumetric video] will all be changed with the continued increase in accessibility,” comments StretchSense’s O’Brien. “You can now do full-body mocap for less than the cost of a new iPhone, and basic volumetric capture can now be had for free on that same iPhone. This means different things for different markets: At a major AAA studio, you are going to see mocap happening on all of the people all of the time, and also on more ambitious projects that have more animated content than ever before. For independent creators, the financial costs of getting into mocap are dropping away so more people can join the space. Finally, there are millions of streamers worldwide who are getting new ways to connect with their community and make money while doing so by stepping into virtual worlds.”

“Mocap has a bright future in a variety of markets,” OptiTrack’s Lazzaro says. “This includes but is not limited to movies, video games, medical applications, robotics, measurement and VR. Mocap techniques are also becoming more commonplace with V-Tubers and other prosumer applications.”

TOP AND BOTTOM: Move AI offers a single-camera motion capture app, Move One, for animators looking to bring realistic human motion to their 3D characters, making it easy to capture and create 3D animations. (Images courtesy of Move AI)

SEIZING THE OPPORTUNITY TO VISUALIZE THE 3 BODY PROBLEM

A computational conundrum arises when three celestial bodies in motion mutually influence one another’s gravitational pull. This serves as the premise for the science fiction series 3 Body Problem, based on the novels by Liu Cixin, in which an alien race living on an environmentally unstable planet caught between a trio of suns sets in motion a plan to invade Earth with the assistance of human conspirators. Adapting the novels for Netflix is the Game of Thrones duo of David Benioff and D.B. Weiss, along with True Blood veteran Alexander Woo. The first season of 3 Body Problem encompasses eight episodes that feature major visual effects spanning environment builds, a multi-dimensional supercomputer compressed into a proton, a sliced and diced oil tanker, characters being rehydrated/dehydrated and a virtual reality game that literally feels real. The epic scope of the project required the creation of 2,000 shots by Scanline VFX, Pixomondo, BUF, Image Engine, Screen Scene and El Ranchito. An in-house team took care of additional cleanups, which ranged from a character blinking too much to having to paint out an unwanted background element.
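The “conundrum” is easy to demonstrate: three mutually gravitating bodies have no general closed-form orbit, so their motion must be stepped forward numerically, and small differences in starting positions can grow over time. A toy Python sketch, using plain Euler integration in arbitrary units; the starting values are invented for illustration.

# Toy illustration of the three-body problem: pairwise Newtonian gravity,
# stepped forward numerically. No closed-form solution exists in general.
# Plain Euler integration in arbitrary units; illustrative only.

G = 1.0  # gravitational constant in toy units

def step(bodies, dt=0.001):
    """Advance a list of (mass, [x, y], [vx, vy]) bodies by one Euler step."""
    accels = []
    for i, (_, pi, _) in enumerate(bodies):
        ax = ay = 0.0
        for j, (mj, pj, _) in enumerate(bodies):
            if i == j:
                continue
            dx, dy = pj[0] - pi[0], pj[1] - pi[1]
            r3 = (dx * dx + dy * dy) ** 1.5
            ax += G * mj * dx / r3  # a = G * m * r_vec / |r|^3
            ay += G * mj * dy / r3
        accels.append((ax, ay))
    for (_, p, v), (ax, ay) in zip(bodies, accels):
        v[0] += ax * dt
        v[1] += ay * dt
        p[0] += v[0] * dt
        p[1] += v[1] * dt

# Three equal "suns" in toy units; run and watch the orbits wander.
bodies = [(1.0, [0.0, 0.0], [0.0, -0.2]),
          (1.0, [1.0, 0.0], [0.0, 0.5]),
          (1.0, [-1.0, 0.3], [0.0, -0.3])]
for _ in range(10000):
    step(bodies)
print([p for _, p, _ in bodies])  # positions after 10 time units

Nudging one starting coordinate by a tiny amount and re-running typically yields a visibly different end state, the sensitivity that leaves the fictional Trisolarans unable to predict their suns.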

Images courtesy of Netflix.

TOP: A major visual effects undertaking was constructing the environment and crowd at Tsinghua University watching the torture of intellectuals during the Chinese Cultural Revolution.

Previs was an indispensable tool. “It’s a complete game-changer being able to do everything in Unreal Engine,” Visual Effects Supervisor Stefen Fangmeier states. “We did nearly no storyboarding. It was essentially camerawork. The funny thing was they were trying to get me to use a camera controller, and I said, ‘No. I’m a curve guy.’ I set a keyframe here and a keyframe there and interpolate. I even reanimated characters, which you can do in Unreal Engine in the most elegant way. You can take a couple of big performances and mix them together; it’s a fantastic tool. We worked with NVIZ in London who would prep all of these scenes, do the animation, then I would go shoot and light it; that was a great joy for me, being interactive. What was so interesting about 3 Body Problem was there is an incredible variety of work.”


A unique cinematic moment involves an oil tanker being sliced by nanowires as part of an elaborate trap to capture a hard drive belonging to a cult that supports the San-Ti invading Earth.

“People get sliced every 50 cm, which we did mostly with digital doubles and a few practically built hallways and interior buildings. When you slice something that heavy vertically at 50 cm increments, the weight of what’s above it keeps it in place until the bow hits the shoreline. The dish on top of it collapses into the Panama Canal, which we created as a full CG environment,” Fangmeier states.

Opening the series is a massive crowd gathering at Tsinghua University during the Chinese Cultural Revolution to watch the torture of intellectuals, and because of the controversial nature of the subject matter shooting in Beijing was not an option. “Ultimately, we built the environment from photography and then took some liberties,” Visual Effects Producer Steve Kullback describes. “We wanted it to be realistic, but how big is the quad? What did the buildings actually look like? I don’t think anybody is tracking it quite that precisely, but what we ended up with is having 100,000 screaming students in front of us, and that was all shot quite virtually with a stage set that was built out and extended. It was an array of bluescreens on Manitous that were set up to move around and reposition behind 150 extras.” Crowd tiling was minimal. “We did one shot, which was a poor artist’s motion control. The director wanted a shot where the camera is pushing out towards the stage over the crowd, so what we did was start in the foreground pushing over it, repeat the move pushing over it and move everyone up. We put the pieces together, and it worked quite well. We didn’t have a motion control crane, just a 50-foot Technocrane and a good team that was able to repeat their moves nicely,” Kullback says.

TOP TO BOTTOM:

Vedette Lim as Vera Ye in one of the many environments given the desired scope and vastness through digital set extensions.

Bai Mulin (Yang Hewen) sits alongside Young Ye Wenjie (Zine Tseng), who makes a fateful first contact with the San-Ti, which sets their invasion plans in motion.

A radar dish test at Red Coast Base kills a flock of birds that were entirely CG.

TOP TO BOTTOM:

The reflective quality of the VR headset meant that extensive photogrammetry had to be taken so each set piece could be reconstructed digitally.

One of the major environments simulated in the VR game is the observation deck of the Pleasure Dome constructed by Kublai Khan.

Another key environment build was the Red Coast Base where astrophysics prodigy Ye Wenjie makes first contact with the San-Ti in the 1960s, which sparks an invasion conspiracy. “For Red Coast Base, we had part of an observation base in Spain that was on a mountaintop, and it was a windy day with no rain, so we had some nice sunsets and great clouds,” Visual Effects Supervisor Rainer Gombos remarks. “Some of the buildings didn’t match what we wanted, and the main building was missing the large radar dish. We only had the base built for that. We had some concepts from the art department for how the extensions should work, and then we did additional concept work once we had the specific shots and knew how the sequence would play out.” The years leading up to the present day have not been kind to the Chinese national defense facility. “The roofs have collapsed, so we had to design that. It had to look like winter and cold when it was actually a hot spring day with lots of insects flying around, which had to be painted out. There is a sequence where the radar dish is being used for some test, and birds are flying from the forest and get confused by what is happening, fly close to the dish and die. There were a lot of full CG shots there and CG birds that had to be added. Also, one of the characters revisits the base to commit suicide, so we had to introduce a digital cliff that allowed her to walk up to the side of the dish and look over,” Gombos adds.

Simulating what life is like on Trisolaris is a virtual reality experience developed by the San-Ti that demonstrates the global catastrophes caused by living in close proximity to three suns.

“It was described as a simple arid desert landscape,” Fangmeier explains. “The more unique aspect of that was a certain lighting change. One sun, small and in the distance, was rising, and then suddenly that goes away and it’s night again. Having the light on the actors move that quickly was tricky to achieve on set. We decided along with Jonathan Freeman, the DP for Episodes 101 and 102, to shoot that in an LED stage with a bunch of sand on the ground where we could animate hot spots and the colors of the panels even though we were going to replace all of that in CG.”

BOTTOM: Sophon (Sea Shimooka) is an avatar in a VR game created by the San-Ti to illustrate the destructive environmental impact of living next to three suns.

Being in the realm of VR meant that the destruction could be fantastical, such as 30 million Mongol soldiers being lifted in the air because gravity no longer exists, or witnessing the entire landscape engulfed by a sea of lava. Fangmeier explains, “Then, we have some pseudoscience, like going inside of a particle accelerator. The San-Ti have sent these two supercomputers the size of a proton to stop the progress of human technology, so when they arrive 400 years later [Trisolaris is over three light years from Earth], we won’t be able to easily destroy their fleet. The proton [referred to as a sophon] unfolds into this giant two-dimensional sphere that then gets etched with computer circuitry. We talked a lot about going from 10 dimensions down to two and then going back to a 10-dimensional object. It’s stuff where you go, ‘That’s what it said in the book and script. But how do you visualize that?’”

To preserve their species until the chaotic era gives way to a stable one, the San-Ti have a specific methodology that involves dehydrating and rehydrating their bodies. “It happens in two places and provided us with unique challenges and creative opportunities,” Kullback observes. “The first time we see it is when the rolled-up dehydrated bodies are being tossed into the water by the army to bring our characters back to life. The rolled-up bodies that get rehydrated were a prop that was designed by the prosthetics artists and looked quite beautiful. We go underwater and see the roll land and begin to unfold. The camera is below it and the sun is above the water, so you have these beautiful caustics and an opportunity for all kinds of subsurface scattering and light effects that make the image magical and ethereal and support the birthing process that it’s meant to represent. At the end of the experience, you have a beautiful nude woman who comes to the surface. Then, you find there are other nude folks who have been rebirthed. We shot in a tank at Pinewood to have the underwater shots and the shots of the woman, who is the final realization of this rebirthing. For the elements of the roll landing in the water, we did shoot one for real, but ultimately that was CG. Then the environment above the surface was fully CG. But then you go to the virtual reality game where Jin Cheng is walking with the Emperor and the Follower, and a chaotic era suddenly comes upon us, and there is no room to hide behind a rock from the immense forces of the sun getting ready to melt everybody. The Follower lies down on the ground in a vast desert with the pyramid off in the distance and has to dehydrate. That one presented a bit more of a challenge because you didn’t have the opportunity to travel around her and have these beautiful caustics. We heavily researched the footage of things dehydrating, like fruit left in the sun rotting, to try to get a look that was like how the body would deflate when it was completely sapped of water.”

TOP TO BOTTOM: 30 million Mongol soldiers appear in front of the Pleasure Dome before being lifted into the air because of the gravitational pull of the three suns.

The VR game created by the San-Ti is so sophisticated that it stimulates the five senses of users such as Jin Cheng (Jess Hong).

The VR game setting allowed for a more hyper-real visual language and the ability to defy physics, like when Sophon (Sea Shimooka) talks with Jin Cheng (Jess Hong) and Jack Rooney (John Bradley) in Episode 103.

Being able to digitally reconstruct sets and locations was made even more important by having a highly reflective VR headset. “The reflective headset required some photogrammetry-type work while you were shooting because it was often in smaller places, and there’s some crew, all of the lighting equipment, and everything is dressed in one direction,” Gombos remarks. “You had to capture that three-dimensionally because as production turned around, you needed it for the paint-out from the other direction. We had HDRI panorama photography of that, but then we also had good spatial information about the room and how that would connect to the shot lighting we would do. We wanted to be precise, and on top of that, we often did a special reconstruction shoot after we were done. I would come in for a few hours and do the photography and LiDAR required for locations. These assets were created on the fly, so we had them to review our work but also to send off to the vendors, and they were using them in post. The 3D assets were helpful in quality-controlling the work and a good tool for orienting our teams. I could have this little 3D representation of the set and share and discuss that with the DP or director. I would say, ‘If they are here, it’s going to look like this.’ It wasn’t theoretical but quite precise.”

Eiza González portrays Auggie Salazar, a member of the Oxford Five, which attempts to foil the invasion plans of the San-Ti.

Cinematographer Jonathan Freeman made use of complex and specific lighting panels for the VR setting shots to emulate what it would be like surrounded by three suns.

“One thing that was a bit different for me was that I did a lot of the concept work,” Gombos observes. “I enjoyed doing that for set extensions that then Stefen and the visual effects vendor working with him would execute.” Fangmeier is intrigued by what the viewer reaction will be beyond hardcore sci-fi fans of the books. “It’s not your typical sci-fi where you spend a lot of time in outer space or meet aliens, and it’s not an alien invasion per se. It’s the first season, so it’s fairly mellow and highbrow. It deals with concepts other than the stuff that people are usually used to when they watch sci-fi. I’m curious what the mainstream viewer will think about that.”

There is a core mandate no matter the project for Kullback. “If we are able to help tell the story visually in areas where you can’t photograph something, then that’s our dimension. We’re never creating eye candy for the sake of eye candy. We work hard to have everything that we do fit into the greater whole and to do it in a seamless and attractive way. And, most importantly, in a way that communicates and moves the story forward and realizes the vision of the filmmakers.”

TOP TO BOTTOM: The Follower (Eve Ridley) and Sophon (Sea Shimooka) are San-Ti appearing in human form to make it easier for VR users from Earth to relate to them.

SEARIT HULUF BRINGS TOGETHER LIVE-ACTION AND ANIMATION

With the release of “Self,” a cautionary tale about the desire to please and be accepted by others, Searit Huluf got an opportunity to showcase her filmmaking talents as part of the Pixar SparkShort program. The project was partly inspired by her parents trying to adjust to life in America after immigrating from Ethiopia, which, at the time, was ravaged by civil war. “My mom and dad separated, so it was just my mom looking after me. I had a lot more independence because she was working a lot. I mainly stayed in the east side of Los Angeles, which became my playground. It wasn’t until I got to UCLA that I started to explore more of Los Angeles, in particular the west side, which felt like being in a different country because everything is so clean, and there were a lot more shops.”

An opportunity presented itself to visit Ethiopia right before the coronavirus pandemic paralyzed international travel. “It was our first mother/daughter trip, and I had forgotten what it was like to be under my mom again,” Huluf recalls. “While in Ethiopia, my mother was cautious because the capital of Addis Ababa is not where my people are from, which is the Tigray region. It wasn’t until we got to Mekelle, where my mom’s side of the family lives, that we got to relax and meet people.” Huluf watched her aunts make coffee called ‘buna’ from scratch. “After roasting the coffee, they take it to everyone to smell to say thanks before grinding. Then you have to hand-grind the roasted coffee with a mortar and pestle. My friends and I made it every day. It was so much fun.”

Participating in sports was not an affordable option growing up, so Huluf consumed a heavy dose of anime consisting of Sailor Moon, Naruto, One Piece and Bleach. What was available to her in high school was the chance to take community college classes on computer coding and engineering through a STEM [Science, Technology, Engineering and Mathematics] program. “I did a website competition inside of which there was a film competition, so I did a live-action short with all of the seniors in my group, and afterward I was like, ‘I want to go to art school.’” The art school in question was the UCLA School of Theater, Film and Television, where she studied screenwriting and stop-motion animation. “I was trying to figure out what is the closest I could get to animation but not have to draw, and it was stop-motion; that was the happy medium because I do love live-action and animation. My schooling was live-action, but a lot of my internships were animation; that’s how I divided it up.”

TOP: Searit Huluf, Writer and Director of “Self.”

OPPOSITE TOP TO BOTTOM: Soul was the first high-profile project at Pixar for Searit Huluf.

“Self” was inspired by Searit Huluf desiring to gain social acceptance as well as by the struggles her parents faced immigrating to America from Ethiopia.

BOTTOM RIGHT: “Self” marks the first time since WALL-E that live-action elements have been integrated with computer animation by Pixar.

Internships included Cartoon Network and DreamWorks Animation, then Pixar came to UCLA. “I kept in contact with the recruiter and started at Pixar as an intern in production management while making films on the side,” Huluf remarks. “I am also big in the employee resource groups within Pixar. I spearheaded the first celebration of Black History Month at Pixar and decided to make a documentary where Black Pixar employees talk about what it is like to be Black in America. The 19th Amendment documentary came about because I cared about people voting for the 2020 elections. It was a way to encourage Pixar fans to go out and vote by having Pixar women talk about why they should do it and the complicated history of the 19th Amendment. Documentaries are scary because you go in with what’s there and make the story in the editing room. That was a lot of fun, and I gained more confidence to be a filmmaker, and I switched back to making narrative films.”

Images courtesy of Pixar Animation Studios.

“I got to work with Tippett Studio, which I love! ... There’s that Pixar comfort where everybody knows each other or someone adjacent. But these were complete strangers, and there was a big age gap between us. A little bit of me was going, ‘Are they not going to respect me?’ And it was the exact opposite. They were so loving and caring.”

Critiquing, not writing, is where Huluf excels. “I went to a talk where a writer said that you have to wear different hats when you’re writing. When you’re wearing the writing hat, you’re writing all of your thoughts and ideas. Once you’re done writing, you put on the critique hat, and that’s where you start editing what you wrote. Is this actually good? Is it going to help your story? Is your structure right? You can’t wear both hats at the same time. I think a lot about that when I write. What is also great is that I went to UCLA and did screenwriting. I’m still in touch with all my screenwriting friends, and everyone is still writing. It’s nice to write something and the next week we do a writing session together and talk about the things that we’re writing.” Two individuals stand out for their guidance, she says. “I still keep in touch with my UCLA professor, Kris Young, and am part of the Women in Animation mentorship program; [director] Mark Osborne is my mentor. It’s nice talking with him. He did Kung Fu Panda and The Little Prince. Mark is doing everything I want to do with my life! He’s doing live-action and animation. In this mentorship program, other women are working on their own projects. One Saturday we have it with him and the other Saturday is just us. That has been great.”

Huluf has a support network at Pixar. “Luckily for me, I’m not the first Black shorts director at Pixar. Aphton Corbin made “Twenty Something,” so it’s nice to be able to talk to her about it.

“[Director] Mark [Osborne] is doing everything I want to do with my life! He’s doing live-action and animation. In this mentorship program, other women are working on their own projects. One Saturday we have it with him and the other Saturday is just us. That has been great.”
—Searit Huluf, Writer and Director of “Self”

TOP: Soul afforded Huluf the opportunity to work with one of her role models, writer/director Kemp Powers, who co-directed Soul.

BOTTOM LEFT: Spearheading the first celebration of Black History Month at Pixar, Huluf went on to serve as a cultural consultant on Soul.

BOTTOM RIGHT: Searit Huluf helped to facilitate brainstorming sessions to make sure that there was cultural authenticity to the story, character designs and animation for Soul.

Michael Yates did the Win or Lose streaming [series for Disney+], and I keep regular contact with Kemp Powers. It’s nice to talk to people who are in your arena. Personally, too, that’s why I do both live-action and animation, because there’s something about both mediums that gives me motivation and hope.” Like Mark Osborne with The Little Prince, Huluf was able to combine computer animation and stop-motion to make “Self,” where the protagonist is a wooden puppet surrounded by environments and metallic characters created digitally. “I got to work with Tippett Studio, which I love! I studied stop-motion at UCLA, so I know what the process looks like, but I have never done it in a professional setting, and I’m not the animator; other people are doing this who have worked on James and the Giant Peach and The Nightmare Before Christmas. There’s that Pixar comfort where everybody knows each other or someone adjacent. But these were complete strangers, and there was a big age gap between us. A little bit of me was going, ‘Are they not going to respect me?’ And it was the exact opposite. They were so loving and caring. I still text with them.”

A significant lesson was learned when making “Self.” “I did a lot of my independent films by myself, and this time I had people who are paid and wanted to be involved,” Huluf notes. “Working with the animators was one of the most insightful moments for me. I would film myself and say, ‘How about we do this?’ They would be like, ‘We could do that, but how about this?’ And it was so much better. In the beginning, I was very precious about it and slowly realized, ‘They know what this film is and what needs to be told, too.’ It was a learning curve for me.” The transition to feature directing is more likely to occur first in live-action than in animation. “That’s primarily because the stakes are higher in animation than in a live-action film. This is purely based on budgets.”

“I spearheaded the first celebration of Black History Month at Pixar and decided to make a documentary where Black Pixar employees talk about what it is like to be Black in America. The 19th Amendment documentary came about because I cared about people voting for the 2020 elections. It was a way to encourage Pixar fans to go out and vote by having Pixar women talk about why they should do it and the complicated history of the 19th Amendment.”

“When I think about filmmakers I look up to, I see that they start with smaller indie features. Barry Jenkins is a perfect example. Moonlight was only a couple of million dollars, and then he made a higher-ground film, If Beale Street Could Talk. I want to start small and slowly build myself up. The big jump for me now is to do a feature. Luckily for me, I’m not too intimidated to do it. It’s more about when someone will give me the chance. I do believe in my ideas and storytelling capabilities. Right now, I’m writing and seeing how things go. I look forward to people watching ‘Self’ and being able to talk to them about it because that’s something new for me.”

CLOCKWISE: Going through various character designs for the character of Self.

A comparison of Self with one of the female Goldies.

A personal joy for Huluf was being able to design the costume for Self.


Pixar SparkShorts Build “Self” Esteem for Emerging Filmmakers

Treading a path blazed by WALL-E where live-action footage was incorporated into the storytelling, the Pixar SparkShort “Self,” conceived by Searit Huluf, revolves around a wooden stop-motion puppet desperate to be accepted into a society of metallic beings.

“For me, it was, ‘I really want to do stop-motion. I want to visually see something alive onscreen that you can see the handprint of a human touching it,’” Huluf states. “I wanted the story to be the reason it had to be stop-motion.”

A central theme is the personal cost of gaining social acceptance. “I will play this game in my head of hiding parts of myself so I can conform and be part of the group,” Huluf explains. “That’s how I visualized Self as she literally rips herself apart to be like everyone else. The other aspect is my mom immigrated to America from Ethiopia, and I wanted to talk about how immigrants are usually not seen or heard. I wanted Self to feel like she is Ethiopian, so she has natural wood that has been carved by a masterful craftsman. There is something nice about her being so natural herself but wanting to be something so shiny, plastic and fake. There is something visually beautiful about that. Another layer on top is that she is even animated differently. Self is stop-motion, so she’s animated on 2s and 3s versus the CG Goldies, which are on 1s and are so slick when they move. Self is poppy and jumpy at points when she tries to talk and interact with them.”
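For readers outside animation, working “on 2s” or “on 3s” means holding each pose for two or three frames at 24fps instead of changing it every frame, which is part of what gives Self her poppy, jumpy contrast with the slick Goldies. A toy sketch of the idea, with illustrative values only:

```python
# A toy illustration of animating "on 1s" versus "on 2s" and "on 3s":
# holding each pose for several frames quantizes the motion curve.
import numpy as np

frames = np.arange(24)                    # one second at 24 fps
smooth = np.sin(frames / 24 * np.pi)      # a motion curve sampled on 1s

def on_ns(curve, n):
    """Hold every n-th sample for n frames (animating 'on n's')."""
    return np.repeat(curve[::n], n)[:len(curve)]

for n in (1, 2, 3):
    print(f"on {n}s:", np.round(on_ns(smooth, n), 2))
```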

Excitement and fear were felt when working out the logistics for the project. “I was excited about doing something so different and unique, but at the same time I had no idea of how you properly schedule out and manage a stop-motion film,” remarks Eric Rosales, Producer of “Self.” “I was like, ‘Alright, let’s learn this on the fly.’ You’re taking this whole new element and trying to fit pieces into our puzzle and take their puzzle pieces and put them all together.” The other puzzle pieces belonged to Tippett Studio, which constructed, animated and shot the stop-motion puppet. Rosales says, “It was a breath of fresh air in the sense that you get to see how other studios approach their scheduling, decision-making and problem-solving. It was exciting for us to learn from them as much as they were learning from us, and to learn how to take the different aspects of the stop-motion process and incorporate them into our pipeline. And vice versa, how we would handle something and transfer that information back over to Tippett. We did a lot of back and forth with them and shared a lot of thoughts.”

Complementing and informing the design of the physical puppet was the digital version. “We had a digital puppet that Pixar was able to move around in the computer and act out what they wanted the puppet to do. That informed us in terms of how we needed to build the puppet to be able to effectively move in those ways,” states Mark Dubeau, Senior Art Director and Lead Puppeteer at Tippett Studio. “There is a lot you can do digitally that you can’t do with a puppet, and so we knew we would probably have to build about three or four puppets to be able to do that number of shots.” Nine different faces were constructed to express panic, sadness, happiness and anger.

For a long time, the digital double of Self was a placeholder for 19 shots that utilized stop-motion animation. “But as things progressed, we turned off our character as she is now being added in the comp,” states Nathan Fariss, Visual Effects Supervisor of “Self.” “The amount of color tweaking and general polish that was happening in comp, and even the color grading steps in post, were much more than any of our other projects because we needed to match a photographic element to our CG world and vice versa.”
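The matching Fariss describes is a bespoke grading effort, but the core idea of pulling a photographed element toward a CG plate’s palette can be illustrated with a simple statistical color transfer. The sketch below assumes plain linear RGB arrays and is in no way Pixar’s actual comp pipeline.

```python
# Minimal sketch of statistical color matching (Reinhard-style mean/std
# transfer) between a photographed element and a CG plate. A real comp
# works in a managed color space with artist-driven grading on top.
import numpy as np

def match_color(element, reference):
    """Shift each channel of `element` toward `reference`'s statistics."""
    out = element.astype(np.float64).copy()
    for c in range(3):
        e_mean, e_std = out[..., c].mean(), out[..., c].std()
        r_mean, r_std = reference[..., c].mean(), reference[..., c].std()
        out[..., c] = (out[..., c] - e_mean) / max(e_std, 1e-6) * r_std + r_mean
    return np.clip(out, 0.0, None)

# Toy usage with random stand-ins for real frames.
plate = np.random.rand(64, 64, 3) * 0.8           # CG plate
element = np.random.rand(64, 64, 3) * 0.4 + 0.3   # photographed puppet element
matched = match_color(element, plate)
```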

Previs and layout dictated the shot design for the stop-motion scenes. “We had a first lighting pass that was already done, even before Tippett started lighting everything up,” Rosales remarks. “We sent members of our lighting team over there to do the last bits of tweaking. Searit acted out every single shot that Tippett was going to do. She did it in her living room by herself. To sell the foot contact, Tippett ended up building a concrete slab out of Styrofoam so we were able to see Self physically walking on top of something.”

Self makes a wish upon a falling star that enables her to exchange wooden body parts with metallic ones. “I usually talk about what the character is feeling at the moment,” Huluf states. “The way we talked about that scene of her jumping off of the roof, I wanted to show how she goes from, ‘Oh, cool, these body pieces are falling from the sky,’ to slowly becoming more obsessive in finding them. That face is the last piece for her. ‘I’m going to finally belong.’ A lot of people do a lot of crazy things to belong. In Self’s case she’ll rip herself apart to be like everyone. Self jumping off of the roof is the climax of the film because it’s her craziness and obsessiveness all wrapped into one as she falls into darkness. We had a lot of conversations about how she snaps out of it, and for me, your face is who you are. As she steps on her own face, it snaps her back into reality and makes her realize and go, ‘Oh, my God! Why did I do this?’”

The cityscape did not have to be heavily detailed. “We ended up settling on a look that was a flat color or a gradient so it felt like there was a little bit of life in the city and things were lit up,” Fariss reveals. “There were other people present in the buildings, but it didn’t necessarily draw the audience into the lives that are going on in the buildings around there. The cities were mostly hand-built. There wasn’t enough scope to warrant going a procedural route to put the cities together, so they were hand-dressed, and there was a lot of shot-by-shot scooting some buildings around to get a more pleasing composition.”
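For contrast with that hand-dressed approach, the procedural route the team decided against is, at its simplest, a seeded scatter with per-shot nudges. This generic sketch is purely illustrative and reflects nothing about the actual production setup.

```python
# Generic illustration of a procedural city scatter: place buildings on a
# jittered grid from a seed, then nudge individual buildings per shot
# (the "scooting" becomes a per-shot offset).
import random

def scatter_city(rows, cols, spacing=10.0, jitter=2.0, seed=7):
    rng = random.Random(seed)
    return [{"x": i * spacing + rng.uniform(-jitter, jitter),
             "y": j * spacing + rng.uniform(-jitter, jitter),
             "height": rng.uniform(10.0, 60.0)}
            for i in range(rows) for j in range(cols)]

def scoot(buildings, index, dx=0.0, dy=0.0):
    """Per-shot composition tweak: nudge one building for a nicer frame."""
    buildings[index]["x"] += dx
    buildings[index]["y"] += dy

city = scatter_city(rows=6, cols=6)
scoot(city, index=14, dx=-3.0)   # move one building out of the silhouette
```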

More problematic was getting the hair right for the puppet.

OPPOSITE PAGE, LEFT TO RIGHT: Tippett Studio Senior Art Director and Lead Puppeteer Mark Dubeau explains the puppet design to Searit Huluf.

A dream come true for Huluf was being able to collaborate with Tippett Studio on “Self.”

The hair of Self was the hardest aspect to get right. It was inspired by the hairstyle of Searit Huluf.

Showcasing the detailed eyeballs for the stop-motion puppet crafted by Tippett Studio.

THIS PAGE, TOP TO BOTTOM: “Self” Producer Eric Rosales and Huluf examine the various pieces that go into making a stop-motion puppet.

Various body parts and variations had to be created by Tippett Studio to give the stop-motion puppet the correct range of physicality and emotion.

States Dubeau, “Once we figured out what urethane to use, then we did all of the hair. However, we found out it was too heavy for the head. We had to go back and make two pieces of hair that go down and frame either side of her face. Those were made out of that material and painted. We hollow-cast the ones on the back, which had a wire that went into the head, and then you could move those pieces around, but you couldn’t bend them. The ones in front could swing and twist. It totally worked. Now you got the sense of this light, fluffy hair that was bouncing around on her head.”

“Self” was an educational experience. “One of the things that we learned from Lisa Cooke [Stop-Motion Producer] at Tippett is you end up saving your time in your shot production,” Rosales notes. “It’s all of the pre-production and building where you’re going to spend the bulk of your money. There was a lesson in patience for us because with CG we can take everything up to the last minute and say, ‘I want to make this or that change.’ But here we needed to zero in and know what we’ve got going on. Once the animators get their hands on the puppet and start doing the shots, the first couple of shots take a little bit of time. After that handful of shots, they get a feel for the character, movement and puppet, and it starts moving quickly. Then we were able to get our team on, and they were able to start learning their cadence as well. It started becoming a nice little machine that we were putting together.”

Huluf appreciated the collaborative spirit that made the stop-motion short possible. “I’m approving things at Tippett and going back to Pixar to approve all of the CG shots multiple times a week. We had a lot of people who were big fans of ‘Self’ and helped us while they were on other shows or even on vacation or working on the weekend because they were so passionate. I’m grateful that Jim Morris [President of Pixar] let me have this opportunity to make a stop-motion film, which has never been done before at Pixar.”


BRIDGING THE GAP BETWEEN ACCURACY AND AUTHENTICITY FOR SHOGUN

Inspired by the power struggle in feudal Japan that led to the rise of Tokugawa Ieyasu to shōgun, and by his relationship with English sailor William Adams, who became a key advisor, James Clavell authored the seminal historical-fiction novel Shōgun, which was adapted into a classic NBC miniseries starring Richard Chamberlain and Toshiro Mifune. Forty-four years later, the story has been revisited by FX and Hulu as a limited 10-episode production under the creative guidance of Justin Marks and Rachel Kondo.

“What we felt made Shōgun interesting today would be to tell more of an E.T. the Extra-Terrestrial story of an outsider who has shown up in the world that we let the audience inhabit,” states Justin Marks, Creator, Executive Producer and Showrunner. “We worked with our producers and star, Hiroyuki Sanada, as well as Eriko Miyagawa [Producer], to use their expertise to craft the dialogue in the right kind of Japanese.”

TOP: Actor Hiroyuki Sanada had a key role in making sure that period-accurate Japanese was spoken by the characters.

OPPOSITE TOP TO BOTTOM: Anna Sawai felt fully in the role of Toda Mariko when the Christian cross was hung around her neck.

Kashigi Yabushige attempting to rescue Vasco Rodrigues from drowning was a challenge to assemble for Editor Maria Gonzales.

When it came to depicting the Sengoku period, compromises had to be made. “There will always be a gap between accuracy and authenticity, which means negotiating which spaces are necessary to keep distance and which ones you need to close the gap,” states Creator and Executive Producer Rachel Kondo. “We were constantly defining and redefining what we’re trying to be authentic to. Are we trying to be authentic to a time or people or specific place?” Originally, the plan was to shoot in Japan, but the [COVID-19] pandemic caused British Columbia to become the stand-in for the island nation. “Very little cleanup was required relative to what it would be in Japan, where you would be removing power lines all day just to get something to look natural, and then you want to plus it to the premise of the story,” Marks says. “With Michael Cliett [Visual Effects Supervisor and Visual Effects Producer], we worked out a system that would keep us flexible in postproduction with what we would call a high and low appetite version of a shot; that element of protection was for storytelling reasons but largely for budget reasons. Then, what it allowed us to do was to say, ‘This is a show about landscapes,’ and on some level, we have broad landscapes and what we called ‘landscapes of detail,’ such as close-ups of tatami mats because they were shot with macro lenses.”

Images courtesy of FX.

Osaka was the most straightforward of the three cities to develop because extensive reference material exists from 1600 and the general topography has not changed. “Ajiro was a gorgeous little cove on the waterfront, but the area itself wasn’t quite large enough to create a whole village. So, we had to make a mental jump to say that the upper village is where the samurai class live and the lower village was where the much poorer fishing folk live,” Production Designer Helen Jarvis explains. “We ended up using two different locations and then knit them together in a few establishing shots. Edo [modern-day Tokyo] was the city that Yoshii Toranaga [cinematic persona of Tokugawa] was actually in the process of developing and building at the time. We saved a portion of the waterfront Osaka set and didn’t fully develop it until much later in the series, knowing that we had to create two city blocks that were in the process of being built. One of our set designers did a preliminary model of the shape of the city and how the castle might relate to the city; that ended up being much more in Michael Cliett’s hands. He had people scanning the buildings that we had, and we had various other 3D files of buildings that we would like to see represented, like temples.”

Exterior garden shots of Osaka Palace were captured inside of Mammoth Studios, requiring soundstage ceilings to be turned into CG skies. “There was a lot of fake bounce light as if the set was lit by the sky rather than sunshine,” reveals Christopher Ross, Cinematographer, Episodes 101 and 102. “We would light the garden as if it was an exterior, and then each of the sets would not only have direct light from whatever lighting rig, but also borrowed light from the gardens themselves. The way to create chaos in the imagery was to allow the sun to splash across the garden at times, then let that borrowed light from the splash of sun push itself into the environment. Thanks to the art department, all of the ceilings were painted wood paneling, and we could raise and open each of them. Each ceiling had a soft box, so for the interiors there was a soft-colored bounce fill light that we could utilize should we need to.” A complicated cinematic moment was executed onboard the galleon, which gets hit by a huge wave. “You start on deck, end up below deck, then return to the top deck, all within the space of a two-and-a-half-minute sequence. It required a lot of pre-planning and collaboration between the departments and in total unison with the performers on the day, getting the camera to track with one character, change allegiance and track with a different character, and track with yet another. It forced everybody to be very collaborative. It was great that we could pull that sequence off, and it looks epic.”

Contributing to the collaborative spirit was Special Effects Supervisor Cameron Waldbauer. “You take the boat sequence, for example. We’re dumping water on a ship that is on a gimbal, and Lauro David Chartrand-DelValle [Stunt Coordinator] has guys going off the side of the boat, and we’re rehearsing that and putting that together. Then, Michael Cliett takes that, puts it out into an open ocean, and it looks seamless in the end,” Waldbauer says. Storyboards were provided early on. “We would do tests of things and make things that we wanted to do. We would almost go backward so they would get the information from those tests and put that into the storyboards that were presented to everybody else,” Waldbauer adds. Shōgun offered an opportunity to return to old-school special effects. “I’ve done several superhero movies with lots of greenscreen and stage work, and that wasn’t what this was. This was interesting for me and the crew to work outside for the next seven months. Now you’re dealing with all of the weather and elements, and you’re working on a show that doesn’t have the time to come back to do it later. You deal with what’s happening on the day. We did get the weather that we wanted for the most part. The desire to get everything in-camera meant incorporating effects rigs into sets and hiding them on location. We have tried to match what would actually happen on the day and what would happen at the time. A sword hits a person in 2024 the same way as it did in 1600. However, you need to make sure to get the look that the director wants out of it dramatically, instead of having to adhere to what it used to look like,” Waldbauer explains.

TOP TO BOTTOM: Originally, the plan was to shoot in Japan, but the COVID-19 pandemic caused principal photography to take place throughout British Columbia.

Serving as a translator between Yoshii Toranaga and John Blackthorne is Toda Mariko (based on Akechi Tama), portrayed by Anna Sawai. “For Shōgun, there wasn’t that much acting with visual effects,” Sawai notes. “It was more, we have an amazing set, and on top of that, when they go in on a wider shot, they’ll be able to see through visual effects what Japan looks like. There is an ambush scene, which was supposed to be arrows flying and, obviously, they weren’t going to do that, so we had to pretend they were coming at us. For the ship scenes, I would have to look out into blackness because we were shooting that at night and visualize it being a beautiful ocean. It’s difficult when they zoom into my face, and you’re thinking about, ‘I’m visualizing this, but I’m actually seeing a camera thrown right in my face!’ Those things are hard, but it’s part of our job that we use our imagination.” Two years were spent training at Takase Dojo prior to production. “Then on Shōgun,” Sawai continues, “I found out that I had to do the naginata fighting, which is a completely different thing because now you’re working with something that is super long and hard to control because it’s heavy, which it should be, because if it’s light it’s not going to show that you’re actually fighting.” Performing stunts is not a problem for Sawai. “I love it! I love it so much! I feel lucky that when Lash [Lauro David Chartrand-DelValle] saw me fighting, he was like, ‘Let’s try to use as much of you as we can, and other times we will go with Darlene Pineda [who did an amazing job as my stunt double].’”

TOP: Hiroyuki Sanada portrays Yoshii Toranaga, whom author James Clavell based on Tokugawa Ieyasu, founder of the last shogunate in Japan. BOTTOM TWO: The opening of the series was altered to have the Erasmus appear like a ghost ship during a vicious storm.

“We didn’t have a lot of previs for this show, which is unusual considering the scope of it,” observes Maria Gonzales, Editor, Episodes 101, 104, 107, 110. “We did have some storyboards and used those when we could. I stayed in touch with Michael Cliett as much as possible because he was my go-to in terms of understanding the potential for some of these shots. You try to put the thing together in the way that makes the most sense, and some of it we had to pick up later on once we met with the directors and talked with Michael. Sometimes, he was able to send me artwork that helped guide us in a certain direction.” Temp visual effects were created within the Avid Media Composer by the editorial team. Gonzales adds, “I did the pilot episode where there was a huge storm and some of those big reveals of Osaka. Our guys decided to pull in as many shots as they could to give an idea what the real scope of the scene was going to be.” The cliffside rescue of a drowning Vasco Rodrigues was a mindbender to assemble, Gonzales explains. “I had some of the close-ups and wider shots. I had no idea of what this was going to look like and what the height of the cliff really was. My first assembly was very different from what you saw in the final. Once Michael and Justin came to the cutting room, we were able to finesse it and get it to what you see today. But it was with Michael’s help that I was able to finally see what this was supposed to be. It’s like, ‘No. No. No. These guys are supposed to be way up, and Kashigi Yabushige is supposed to be falling way down.’”

TOP TO BOTTOM: Osaka was the most straightforward city to construct because extensive reference material exists from 1600.

Three different locations were involved in creating the scene mentioned above. “We were on a six-foot-high set piece in a field of grass in Coquitlam, B.C.,” reveals Michael Cliett, Visual Effects Supervisor and Visual Effects Producer. “Everything on the top of the cliff was shot on that set piece. Every time you looked over the top, that was all CG water, coastline and Rodrigues. We did another set piece that was on the side of the cliff when Yabushige was rappelling down. We shot all of the profile shots and him hanging from the top down on a vertical cliff piece in our backlot over where we had the Osaka set ready as well. Then we had the gulch where the water was out on a 60-foot tank special effects setup with the rocks. We were praying for the right weather and light at all three locations because each of them was outside.” Another dramatic water moment is when the Portuguese carrack known as the Black Ship attempts to prevent Toranaga from leaving the harbor of Osaka. “The galley was stationary, but we did put the Black Ship on 150 feet of track. We got the Black Ship from Peter Pan & Wendy that had just finished shooting here, chopped it up and made our own design. It’s roughly one-eighth of the ship. We did have some motion where it appeared that the ships were jostling for position. We shot a bunch of footage, but at the end of the day we weren’t quite sure how we were going to fill in the gaps, what the ships would be doing, what shots we needed of the ships that were going to be all visual effects and how that story was going to come together. ILP and I cut things together differently and tried to fill in those gaps. Over two months in the summer of 2022, we finally had it working with a bunch of greyshade postvis.”

Over the 10 episodes, 2,900 shots were created by SSVFX, Important Looking Pirates, Goodbye Kansas Studios, Refuge VFX, Barnstorm VFX, Pixelloid Studios and Render Imagination, while Melody Mead joined the project as a Visual Effects Associate Producer, allowing Cliett to focus more on the supervision side of the visual effects work. “At the beginning of Episode 105, Toranaga is arriving with his 100,000-person army, which was 99% digital, as we rise up and move past him,” Cliett remarks. “The Japanese have a way of moving and walking, so we did do a number of motion capture shoots with Japanese soldiers and instilled a lot of that into the digital versions of them.” Toranaga’s army establishes a base camp that subsequently gets destroyed by a massive earthquake. “This is why we had to put mountains surrounding the training fields, because there are huge landslides that come down which bury the army, and we had to make it on the magnitude where we could sell that 75,000 people died,” Cliett notes. When trying to lock the pilot episode, FX Networks Chairman John Landgraf raised a narrative question about how the Erasmus, the Dutch ship piloted by Blackthorne, gets to Ajiro. Cliett explains, “I said to Justin, ‘Why don’t we look into having the ship being towed in? The samurai are running about 50 skiffs, but the villagers are doing all of the work. Then, we can fly past the ship into Ajiro, which you get to see for the first time.’ Justin loved it. Then, John Landgraf loved it. I ended up taking a second unit out, directing that plate and doing that whole shot. It’s one of my favorite shots of the series.”

TOP TO BOTTOM: Three different locations were assembled together for when Kashigi Yabushige descends a cliff to rescue a shipwrecked Vasco Rodrigues.

SINGING PRAISES FOR UNSUNG HEROES

TOP: The prevailing question for Aaron Eaton in regard to holograms is how to make something that does not exist look like something that could be captured by a camera, such as this one featured in Avengers: Endgame. (Image courtesy of Cantina Creative and Marvel Studios)

OPPOSITE TOP: Understanding composition and cinematography was important to Cameron Widen when doing the layout of the exterior train shots for Season 3 of Snowpiercer. (Image courtesy of Image Engine Design and TNT)

Cameron Ward produced previs for the Black Panther: Wakanda Forever sequence of Namora leading a squad of Talokanil to take out the sonic emitter on the Royal Sea Leopard. (Image courtesy of Digital Domain and Marvel Studios)

Technical animation had to be created by Jason Martin for Lost in Space Season 2 to support the effects needed for destruction shots. (Image courtesy of ILP and Netflix)

When the final credits roll, it becomes quite clear that you literally need an army of talented individuals spanning a wide variety of professions to make a film or television production a reality. To take a more micro perspective, one can look at the visual effects section where hundreds upon hundreds of names are listed for each of the vendors, and then it truly sinks in – the number of unsung heroes who have contributed their time and talents far from the public spotlight. This lack of awareness also happens within the visual effects industry as generalists have given way to specialists who are more insulated from the contributions of their colleagues in other departments. In an effort to rectify the situation, a number of visual effects companies were asked to put forward candidates deserving of recognition for their exemplary professionalism and skillset. Think of those listed below as just a small sampling of people and occupations that are pivotal in making the visual spectacle and invisible transformation possible.

Aaron Eaton, VFX Supervisor, Cantina Creative

I like that I’m not specialized because I would hate to be doing one single thing all day long! I’m happy to have found Cantina Creative where I can still be a generalist even today. You don’t just work on a shot for an hour, send it off to somebody and never see it again. I’m able to work on something, and it can be very much my own, and you’re involved with it through all of the stages; that has been cool. Compositing is definitely my favorite. It’s that final creative push of bringing something extra to a shot that makes it sit in there and look awesome.

Holograms are a lot trickier than they seem because you’re working on something that doesn’t exist. How do you make the hologram absolutely believable, as if it’s something you could film with a camera? There are numerous things that it takes to make the hologram feel integrated into the shot. It has a lot to do with mimicking everything that the camera is doing, with lots of depth in the element, textural elements, noise, grain and glitches. All kinds of subtle features that could come with all of these holograms, because a hologram may not be perfection. You have to think about the technology that is projecting or creating the hologram and all of the aspects of how it would actually work.

“As workflows and techniques are ever-evolving, for me, it is more important to be on top of the questions that often do not change. How do I stay efficient so that I can be creative? How do I continue to be inspired and to inspire? How do I stay proud of my work?”
—Jason Martin, Layout Artist, ILP

Alan Puah, Head of Systems, Territory Studio

Systems is responsible for some of the most critical parts of the pipeline, things like the storage, network and render farm, which form the backbone of the infrastructure in a visual effects studio. Sometimes the existing infrastructure will dictate how the pipeline works, but often it works the other way around, and we’ll need to upgrade and adapt things to support how a project pipeline is structured.

Creating CGI places some of the highest demands on the technology used, so it’s important to make sure that you’re keeping up with new technology. There is probably more happening now than at any other time, with advancements in machine learning and the exponential growth in computing power impacting our industry. But there has also been some reversal in trends. For example, in some cases utilizing the cloud hasn’t been the best fit, so there’s been a migration back to on-premise for various reasons that include saving costs or maintaining more control over data and security.


TOP TO BOTTOM: Jeremie Lodomez believes that compositing is vital to seamlessly blend CG, animation and live-action footage, which was the case for the Heroes of the Horn reveal in Season 2 of The Wheel of Time. (Image courtesy of Framestore and Prime Video)

It was the book The Art of The Lord of the Rings that made Jeremy Melton want to be involved with world-building for shows such as The Orville: New Horizons. (Image courtesy of FuseFX and Hulu)

For Maike Fiene, it’s important not to be too precious about visualization, as the needs of a production like Jingle Jangle: A Christmas Journey will evolve over time. (Image courtesy of Framestore and Netflix)

Alicia Carvalho, Senior Rigger, DNEG

I broadly describe my job to people who aren’t in VFX as “putting in the control structures so that animators can animate.” Coming mostly from feature animation, TV and game cinematics, working in a visual effects pipeline has been a really interesting experience, especially when you’re working on rigs where the end result has to match a plate. You have another layer of restrictions on what can move and in what way, compared to the relatively free rein you have in feature animation, where the bounds of what you can do are based on the needs/imagination of an animator.

With machine learning and the move towards game engine integration, it’s going to be more important for artists to hold onto their foundational skills. In general, I’ve noticed a promising trend among companies discussing and wanting to move more female colleagues into supervisory or lead roles, but there doesn’t seem to be enough mentorship support once those positions are filled. There’s definitely always room to improve.

Cameron Ward, Previsualization Artist, Digital Domain

I was on Black Panther: Wakanda Forever, and there were some beautiful renders of the boat as the hydro bombs kept coming and exploding beneath it. We had a little time so we could dial it in and make it look great before delivering it to the client, but that’s not always the case. It depends on the project and what the client requires because sometimes they’re only looking for rough. However, sometimes lighting and composition can sell a shot.

Years ago, I was on The Fate of the Furious, and we went to get scans of the city. We were laying out the streets and the heights of the buildings. We got a Dodge Challenger and mounted a camera on its hood. When the day came for shooting, they weren’t paying craft services for four days’ worth of shoots, but for one, because they got it all in a day. There’s that aspect as well. You’re cutting down the cost of an actual day of production because you already know your camera angle, focal length, how high you want the camera off the ground and how fast it will be going.
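Those pre-answered camera questions come down to concrete numbers a crew can act on. As one hedged example, horizontal field of view follows directly from focal length and sensor width; the lens and sensor values below are illustrative only.

```python
# Horizontal field of view from focal length and sensor width, the kind of
# camera number previs can hand to production. Values are illustrative.
import math

def horizontal_fov(focal_mm, sensor_width_mm):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

print(f"{horizontal_fov(35.0, 24.89):.1f} degrees")  # ~39.2 for a Super 35-width sensor
```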

Cameron Widen, Layout, Image Engine

The word ‘layout’ means a different thing at every studio – and often to every person you speak with in a studio. Layout in feature animation is wrapped up a whole lot more in previs-type tasks, like figuring out camera angles and composition. In visual effects, most of the time we’re working with plates that have been shot, so there are not a lot of choices to be made by us in that regard. That said, in almost every project there will be some full CG shots that don’t have associated photography with them, and that’s where we get to flex our creative muscles and use our composition and cinematography skills. Recently, we’ve been getting a push to give the layout versions and presentations that we send for review a much nicer look than what I’m typically used to delivering. My preference is to send grayshaded renders for review because then people will comment on composition, speed of the camera and camera framing. If our layout versions look too nice and polished, then we will start getting visual effects supervisors or other people who will see an issue with a texture map or some shading that we have no control over, and they will fixate on that and won’t make any comment on the layout part.

George Sears, Head of Real-Time, The Imaginarium Studios

Essentially, my job is to look after all of the real-time technologies on our stage, and that’s everything from basic characters to in-camera effects to LED walls. We also tend to get involved with pre-production, looking at assets, the things that we will be driving live and what the director wants to achieve. Then we put together a bunch of real-time technologies that we have at our disposal for that project. I essentially see the job as a tie-in with the animation, mocap and visual effects for films, video games, television, AR and the web. We use Unreal Engine 5 to stream all of our live motion-capture data into, and that’s where we’ll do the live characters and virtual cameras to support the director. The main reason we do this is that the client can go away on the day, have signed-off shots, know exactly what they’re doing and bringing into post-production and, depending on the workflow, sometimes walk away with a real-time edit. They can go into post-production confident that they’ve got everything, and generally it saves a bunch of money and time in the decision-making process. Also, I oversee our pipeline and head an R&D team.
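The specifics of that live-streaming setup are the studio’s own, but the general shape, stage hardware broadcasting per-joint data that an engine consumes every tick, can be sketched generically. The port, message format and engine hook below are invented for illustration and reflect no studio’s actual pipeline.

```python
# Generic toy sketch of streaming mocap data: receive per-joint transforms
# over UDP as JSON and keep the latest pose for an engine to consume.
# Port number and message format are invented for illustration.
import json
import socket

def poses(port=9000):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    latest = {}
    while True:
        data, _ = sock.recvfrom(65536)
        latest.update(json.loads(data))   # e.g. {"hips": [x, y, z, rx, ry, rz]}
        yield latest                      # engine-side code applies this each tick

# for pose in poses():
#     apply_to_character(pose)           # hypothetical engine hook
```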

Jason Martin, Layout Artist, ILP

As workflows and techniques are ever-evolving, for me, it is more important to be on top of the questions that often do not change. How do I stay efficient so that I can be creative? How do I continue to be inspired and to inspire? How do I stay proud of my work? [One of the most complex tasks] would be something we call “technical animation,” which I did on Lost in Space Seasons 1 and 2 to support effects on the destruction tasks where large environments or spacecraft collapse or get destroyed. I would supply a semi-detailed version of the event to effects, made with various methods in Maya, like keyframe animation, rigid simulation, cloth simulation or deformation, that talented effects artists would enhance, develop or add to. This workflow enabled us to maintain a high level of artistic control on small teams, often consisting of me plus one or two people, but the sheer amount of work made it complex.
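As a hedged sketch of what such a keyframed stand-in might look like in Maya’s Python API, the block below roughs in a staggered collapse that effects artists could then enhance; object names and values are invented, and it only runs inside Maya, where maya.cmds is available.

```python
# Hedged sketch of "technical animation" blocking in Maya: keyframe a
# semi-detailed stand-in for a collapsing structure so effects artists can
# layer simulation on top. Object names and values are invented.
import maya.cmds as cmds

def block_in_collapse(pieces, start=1001, drop=12.0):
    """Rough keyframed collapse: each piece falls and tips, staggered."""
    for i, piece in enumerate(pieces):
        t0 = start + i * 3    # stagger the failure piece by piece
        for attr, end_value in (("translateY", -drop), ("rotateZ", 25.0 + 5.0 * i)):
            cmds.setKeyframe(piece, attribute=attr, time=t0, value=0.0)
            cmds.setKeyframe(piece, attribute=attr, time=t0 + 15, value=end_value)

block_in_collapse(["tower_piece_%02d" % i for i in range(1, 6)])
```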

As a workflow supervisor, Michael Billette spends time informing the various departments at Image Engine how to best utilize the pipeline when working on projects like Bloodshot. (Image courtesy of Image Engine Design and Columbia Pictures/Sony)

TOP TO BOTTOM: Meliza Fermin created a futuristic Brooklyn Bridge for The Orville. (Image courtesy of FuseFX and Hulu)

Concept art of Lucifer’s Palace door by Niklas Wallén for Season 1 of The Sandman. (Image courtesy of ILP and Netflix)

Jeremie Lodomez, Global Head of 2D Film & Episodic, Framestore

Compositing plays a vital role in the visual effects pipeline, seamlessly blending elements such as CG, animation and live-action footage into the final output. It enhances realism and supports storytelling by ensuring all elements are consistent in lighting, perspective and color. My aspiration is for compositors to perform rapid iterations within their software. For instance, tweaking a CG environment without getting entangled in lengthy interdepartmental revisions. This approach would enable swift creative iterations, with the potential to integrate these fixes into later stages of the pipeline. The rise of technologies like USD and Unreal Engine heralds a future where compositors could emerge as more dynamic players in the field, evolving into Image Composition Artists. The fact that audiences are unable to discern our visual effects work speaks volumes about the quality and realism we achieve.

Jeremy Melton, Art Department Supervisor/DMP/Concept Artist, FuseFX

I saw The Art of The Lord of the Rings when it came out, and my mind was blown. I said, ‘That’s what I want to do.’ As an art department supervisor, I try to encourage everyone to be an artist, be the best that they can, and encourage them to go in the direction that they want to go, not pigeonhole someone or make them do something that they don’t want to do. But at the same time there is the corporate side of making sure that the budgets and all of the rules are being followed, that we’re doing everything that we’re supposed to do. It’s wild. I was never trained in it. I worked into the position by experience. You have to be open, especially with the advent of AI, to using Blender or ZBrush, whatever helps the artist to get to where they need to be to create the best possible image. That’s one thing I want to encourage. Instead of, ‘This is how it’s done,’ let’s open it up.

Katie Corr, Lead Facial Animator and Facial Capture Technical Director, The Imaginarium Studios

A large part of the job for Sam Keehan is providing the necessary support so that artists can concentrate on their job and produce the best results for clients like Marvel Studios on Ant-Man and the Wasp: Quantumania. (Image courtesy of Territory Studio and Marvel Studios)

Turntables are indispensable when submitting textures for review to make sure that the final image has the right reflectivity and surface deformation, such as when working on Shang-Chi and the Legend of the Ten Rings. (Image courtesy of Digital Domain and Marvel Studios)

It’s quite fun working with a lot of different clients because you’ve got some realistic projects that use metahumans, and that’s one of our pipelines. Then you have stylized projects that are cartoony, and I get to have a bit more freedom with that. My job begins onstage with capture, and that means taking care of the actors, making sure that the client is happy, and capturing the data so that the post team can make sure that they get a good result on their tracking. Then, we move on and start tracking through one of our pipelines. From there we take it onto the provided rig and do final cleanup to the specs that have been asked for, depending on the game, movie, TV show or ad. Anything you can think of, we’ve attempted! The more time you spend on it, the higher quality it becomes. It’s quite a subjective area. You try to nail down little nuances like nose flares or when someone is breathing or little eye twitches. The fun part of the job is you get to hear the request from the client, then challenge yourself, find ways around it and maintain their expectations.

TOP TO BOTTOM: During pre-production and through post on The Marvels, Patrick Haskew provided visualization that was used to help convey a whole crew being swallowed up by Flerkens. (Image courtesy of The Third Floor and Marvel Studios)

Maike Fiene, Visualization Supervisor, Framestore

Visualization being the first step to showing the interaction with the CG elements, it is essential to accept that there will be changes to the work as it develops, and you need to be able to adapt and cannot be too precious about it. It is also rewarding as you get to shape the interaction of fun and sweet character moments. This is a very fast-paced environment and requires a general skillset of: understanding of practical filming techniques; being able to interpret storyboards and scripts; general overview and intention of the sequence you are working on (tone, timing, what purpose does this sequence have in the film? What is the director trying to communicate?); general understanding of cinematography (staging, lighting, composition); and all-round technical troubleshooting skills.

In postvis, we’re often collaborating with the finals teams as they might have developed assets further or are developing character animations, and we try to incorporate as much of that as possible to stay true to the final look of the project. This gives the director a chance to shape his vision of the edits at an early stage and test out ideas, and it gives the finals teams a solid foundation to start from.

Meliza Fermin, Matte Painter, FuseFX

As a matte painter, you’re in the beginning of the process, and I prefer that because you have more time, it’s a lot more creative, and you’re choosing more of the elements that are going to be used. Our clients say, ‘I want New York in the 1960s.’ You have to create that, but I’m the one who chooses all of the photographic elements to put together so it works in that environment. You have some creative input. Some studios have me do the matte painting and comp it, or I have worked where I was strictly a matte painter and handed it off to the compositor. The nice thing about having both is I know the problems compositors are going to run into; I have already prepped the matte painting so it does work, and they don’t have to come back to me. Compositing is more technical than creative. Sometimes there’s no time to go back to CG or matte painting, so you have to find fast ways to fix it. You’re a problem-solver.

Michael Billette, Workflow Supervisor, Image Engine

We’re constantly talking to every department about the challenges of their day-to-day job, and we think about how things can be improved and how they can utilize the parts of the pipeline we have already developed that they might not be aware of or understand how to apply. We do a lot of teaching people how to use the tools. Then we also think about how we can improve upon our processes and keep things in sync. We can only do so much as a support department, and it’s not necessarily fast enough for what people need on the floor. Oftentimes, if artists are developing their own workflows or tools, things get very fragmented very fast. Each person tries to do their own solution and can go down different roads. We try to keep things in line because it’s a lot easier for the technical teams to develop when they don’t have all of these parallel processes to work on. They can create some core features and make sure they can support the other workflows that are needed.

TOP TO BOTTOM: Thomas Mouraille believes that the term ‘Environment/Generalist’ is better than ‘Matte Painter’ as it more accurately describes the work done for shots of the gulag in Black Widow. (Image courtesy of Wētā FX and Marvel Studios)

Thomas Mouraille makes use of 3D software, such as Maya, ZBrush and Substance, combined with 2D elements created in Photoshop, to produce matte paintings for The Eternals. (Image courtesy of Wētā FX and Marvel Studios)

When creating technology for the big screen, one has to keep in mind how it would actually work, which was the case for Aaron Eaton when working on Black Adam. (Image courtesy of Cantina Creative and Warner Bros. Pictures)

TOP AND BOTTOM: When working with plate photography, Cameron Widen has a lot less creative freedom for layout than when dealing with full CG shots, as reflected in The Book of Boba Fett. (Image courtesy of Image Engine Design and Lucasfilm Ltd.)

There are constant questions that Jason Martin is always trying to answer, such as how to stay efficient in order to be creative when working on an image of Sundari in Season 3 of The Mandalorian. (Image courtesy of ILP and Lucasfilm Ltd.)

Being an art department supervisor means that Jeremy Melton also has to be conscious of budgetary restrictions when working on The Orville: New Horizons. (Image courtesy of FuseFX and Hulu)

Niklas Wallén, Concept Artist, ILP

You need to have your fundamentals as a concept artist and some design rules that you can always apply that will make it look better. But sometimes I get given things and go, ‘I don’t see the problem here.’ When I began at ILP, my mentor, Martin Bergquist, told me, ‘Your job is to check out the art direction and documents from the client and what they had in mind from the start, be good with that and create a design rule book. But then you take those design rules and have fun with them.’ If you always have these design rules in yourself, it’s easy to spot when something is wrong with an image. Whether it’s The Mandalorian or The Sandman, which have totally different shape languages, I can tell quickly if something is out of line because there’s usually someone who has built the stuff, has been with it for a long while, and maybe they have put themselves into a rabbit hole and forgotten what the shape language is. It’s my job to go in and say, ‘That will work better if I did this.’

Patrick Haskew, Sr. Visualization Supervisor, The Third Floor

Visualization helps build the foundation of what you see on the silver screen. Because you can iterate quickly and work closely with many collaborators – including the VFX Supervisor – from day one on the production, the process is invaluable in helping develop the look of the visuals as it relates to telling a believable [shootable] story. We are also able to provide and use technical visualization and virtual production tools that help connect what’s visualized to shots and equipment on set, and ensure that work and plans from pre-production carry through and can be built upon through post. The industry is always trying to figure out how to make film and television cheaper and faster, and we are at the forefront of that endeavor. But, at the end of the day, the relationships built with the directors, producers and VFX supervisors are at the heart of the process. Technology will always change, but it all starts from an idea to tell a story audiences will love. We are in that room and help represent that storytelling vision.

Patrick Smith, Head of Visualization, MPC

Visualization has grown out of its infancy and is starting to get into its teenage punk-rock years. With the advent of all the tools and real-time technology that is coming to the forefront with virtual production, visualization is certainly a key component of that evolving filmmaking pipeline, and it shines a spotlight on everything that we’re doing. It’s taken off like wildfire. Everybody and their brother have a visualization studio now, and every visual effects house is folding a visualization department into the front-end of their pipeline. The easiest way of understanding visualization is likening it to sculpture. Imagine starting with a giant slab of marble and saying, ‘We’re going to sculpt the Statue of David.’ And everybody is wondering, ‘What does that look like?’ What you’re doing is helping to develop and shape what the visual aesthetic of that actually looks like. You can consider previs to be your rough draft, go and shoot, and then finalize what that draft is so you are setting up your finals team for success on the back-end of that.


Sam Keehan, Creative Director, Territory Studio

The most important thing about being a creative director has always been to try as often as I can – whether that be when we’re resourcing projects or hiring people – to surround myself with people who are better than me. There will be particular skills that people will be way better at than me. We can talk, and they can go and enjoy the thing that they’re really good at and come up with interesting stuff. I will be able to sit back and say, ‘Yes. That’s exactly what I was thinking.’ The inherent difficulty is if you’ve got multiple people across multiple jobs. You want to make sure that everyone is getting the best work out, but the only people getting the best work out are the ones getting enough support. For my job, in particular, it feels like a lot of it is facilitation and making sure that people have the support they need to just concentrate on the job that has to get finished.

Stuart Ansley, Lead Texture Artist, Digital Domain

The way I describe being a texture painter is to imagine if you went to a toy model shop, got a figurine or a car, and had to paint colors and details onto it. Sometimes there is metallic stuff or shiny things, and you have to decide what color something is going to be, how reflective it is going to be and how dirty it is going to be. We put in all of those little details. In order to be good at the job, you have to have an eye for color, composition and detail. You have to see the little things. I get into trouble when sometimes I have conversations with people and I zone out and my eyes glaze over. It’s because I’m looking at their forehead pores or the way their eyes wrinkle. My wife will always call me out! Whenever submitting our work for review, we always view it in a turntable so the object itself is turning and the lights are turning around it as well, because it’s so important the way the light scrapes across the surface so that you’re getting the right reflectivity and surface deformation.
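The turntable Ansley describes is simple to reason about: each review frame advances the object and the light by a fraction of a full turn, in opposite directions, so light rakes across every part of the surface over the loop. A minimal sketch, with frame counts and turn ratios as assumptions:

```python
# Minimal sketch of turntable review angles: rotate the object one way and
# the key light the other so light rakes across the whole surface.
def turntable_angles(num_frames=120, object_turns=1, light_turns=-1):
    """Yield (object_yaw, light_yaw) in degrees for each frame."""
    for f in range(num_frames):
        t = f / num_frames
        yield (360.0 * object_turns * t) % 360.0, (360.0 * light_turns * t) % 360.0

for frame, (obj_yaw, light_yaw) in enumerate(turntable_angles(num_frames=8)):
    print(f"frame {frame}: object {obj_yaw:.1f} deg, light {light_yaw:.1f} deg")
```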

Thomas Mouraille, Lead Matte Painter, Wētā FX

In a nutshell, we could group the software we use in three categories. We use the first group for creating 3D assets, the second for assembling scenes and the third for creating and adjusting 2D content. We create 3D elements using software such as Maya, ZBrush, Substance and Mari, as well as Houdini and Gaea for terrain. The scene assembly process is done within Clarisse iFX and Houdini. The 2D elements are created and adjusted using Photoshop and Nuke. When required, we use specific software such as Terragen, Reality Capture or Unreal Engine for bespoke tasks.

The matte painting step represents around 10% to 20% of the entire work we actually produce on a show. The bulk of the work is now done using 3D packages. “Environment/Generalist” would be a more correct name to define what we do. The tools evolve quickly, and the current AI breakthrough will likely bring new tools to our toolbox soon. It is already happening with software like Photoshop and Nuke, which have some AI-driven tools. The real-time engines are also being incorporated into the visual effects industry as a solution to render final pixel. It is something we keep a close eye on, and are slowly integrating in our pipeline.

TOP, LEFT TO RIGHT: Aaron Eaton, Alicia Carvalho, Cameron Ward, Cameron Widen, George Sears, Jason Martin, Jeremie Lodomez, Jeremy Melton, Katie Corr, Maike Fiene, Meliza Fermin, Michael Billette, Niklas Wallén, Patrick Haskew, Sam Keehan, Stuart Ansley, Thomas Mouraille, Alan Puah, Patrick Smith

VISUAL EFFECTS ARTISTRY IN THE SPOTLIGHT

Captions list all members of each Award-winning team even if some members were not present or out of frame. For more Show photos and a complete list of nominees and winners of the 22nd Annual VES Awards, visit vesglobal.org.

All photos by Danny Moloshok, Al Seib and Josh Lefkowitz.

1. Nearly 1,200 guests from around the globe gathered at The Beverly Hilton for the 22nd Annual VES Awards.

2. Actor-comedian Jay Pharoah led the evening as the VES Awards show host.

3. VES Executive Director Nancy Ward welcomed guests and nominees.

4. VES Chair Kim Davidson kicked off the evening by presenting several VES Awards categories.

The Visual Effects Society held the 22nd Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues, with generous support from our premier sponsor AMD.

Comedian and master impressionist Jay Pharoah served as host of the capacity-crowd gala as nearly 1,200 guests gathered at The Beverly Hilton hotel in Los Angeles on February 21st to celebrate VFX talent in 25 awards categories.

The Creator was named the photoreal feature winner, garnering five awards. Spider-Man: Across the Spider-Verse was named top animated film, winning four awards. The Last of Us was named best photoreal episode, winning four awards. Coca-Cola topped the commercial field. There was a historic tie in the Outstanding Visual Effects in a Special Venue Project category, with honors going to both Rembrandt Immersive Artwork and Postcard From Earth.

Award-winning actor-producer Seth MacFarlane presented the VES Award for Creative Excellence to legendary actor-director William Shatner. Award-winning VFX Supervisor Richard Hollander, VES presented the VES Lifetime Achievement Award to pioneering VFX Producer Joyce Cox, VES. Award presenters included The Creator director Gareth Edwards and actors Ernie Hudson, Fortune Feimster, Katee Sackhoff, Andrea Savage and Kiersey Clemons; Leona Frank, Autodesk’s Director of Media & Entertainment Marketing, presented the VES-Autodesk Student Award.


5. The Award for Outstanding Visual Effects in a Photoreal Feature went to The Creator and the team of Jay Cooper, Julian Levi, Ian Comley, Charmaine Chan and Neil Corbould, VES.

6. The Award for Outstanding Visual Effects in an Animated Feature went to Spider-Man: Across the Spider-Verse and the team of Alan Hawkins, Christian Hejnal, Michael Lasker and Matt Hausman.

7. The Award for Outstanding Visual Effects in a Photoreal Episode went to The Last of Us; Season 1; Infected and the team of Alex Wang, Sean Nowlan, Stephen James, Simon Jung and Joel Whist.

8. The Award for Outstanding Supporting Visual Effects in a Photoreal Feature went to Nyad and the team of Jake Braver, Fiona Campbell Westgate, R. Christopher White and Mohsen Mousavi.

9. The Award for Outstanding Supporting Visual Effects in a Photoreal Episode went to Winning Time: The Rise of the Lakers Dynasty; Season 2; BEAT LA and the team of Raymond McIntyre Jr., Victor DiMichina, Javier Menéndez Platas and Damien Stantina.

10. Actor Ernie Hudson accepted the Award for Outstanding Visual Effects in a Real-Time Project on behalf of Alan Wake 2 and the team of Janne Pulkkinen, Johannes Richter, Daniel Konczyk and Damian Olechowski.


11. The Award for Outstanding Visual Effects in a Commercial went to Coca-Cola; Masterpiece and the team of Ryan Knowles, Antonia Vlasto, Greg McKneally and Dan Yargici.

12. The Award for Outstanding Visual Effects in a Special Venue Project was a TIE and was awarded to both Postcard From Earth and the team of Aruna Inversin, Eric Wilson, Corey Turner and William George (pictured); as well as Rembrandt Immersive Artwork and the team of Andrew McNamara, Sebastian Read, Andrew Kinnear and Sam Matthews (not pictured).

13. The Creator director Gareth Edwards cheered on the nominees.

14. The all-volunteer VES Awards Committee celebrated the success of the 22nd Annual VES Awards Show.

15. Guests enjoyed the festive cocktail reception, thanks to generous support from Premier Sponsor AMD.

16. The Award for Outstanding Animated Character in a Photoreal Feature went to Guardians of the Galaxy Vol. 3; Rocket and the team of Nathan McConnel, Andrea De Martis, Antony Magdalinidis and Rachel Williams.


17. The Award for Outstanding Animated Character in an Animated Feature went to Spider-Man: Across the Spider-Verse; Spot and the team of Christopher Mangnall, Craig Feifarek, Humberto Rosa and Nideep Varghese.

18. The Award for Outstanding Animated Character in an Episode, Commercial, Game Cinematic or Real-Time Project went to The Last of Us; Endure and Survive; Bloater and the team of Gino Acevedo, Max Telfer, Dennis Yoo and Fabio Leporelli.

19. The Award for Outstanding Created Environment in a Photoreal Feature went to The Creator; Floating Village and the team of John Seru, Guy Williams, Vincent Techer and Timothée Maron.

20. The Award for Outstanding Created Environment in an Animated Feature went to Spider-Man: Across the Spider-Verse; Mumbattan City and the team of Taehyun Park, YJ Lee, Pepe Orozco and Kelly Han.

21. Leona Frank, Director of Media & Entertainment Marketing, Autodesk, presented the VES-Autodesk Student Award.

22. Comedian/Actress Fortune Feimster brought the laughs to The Beverly Hilton.


23. Actress Andrea Savage (Tulsa King) presented several Award categories.

24. The Award for Outstanding Created Environment in an Episode, Commercial, Game Cinematic or Real-Time Project went to The Last of Us; Post-Outbreak Boston and the team of Melaina Mace, Adrien Lambert, Juan Carlos Barquet and Christopher Anciaume.

25. Academy Award-winning Senior VFX Producer Richard Hollander, VES, introduced VES Lifetime Achievement Award recipient Joyce Cox, VES.

26. Joyce Cox, VES received the VES Lifetime Achievement Award.

27. The Award for Outstanding Virtual Cinematography in a CG Project went to Guardians of the Galaxy Vol. 3 and the team of Joanna Davison, Cheyana Wilkinson, Michael Cozens and Jason Desjarlais.

28. The Award for Outstanding Model in a Photoreal or Animated Project went to The Creator; Nomad and the team of Oliver Kane, Mat Monro, Florence Green and Serban Ungureanu.


29. The Award for Outstanding Effects Simulations in a Photoreal Feature went to The Creator and the team of Ludovic Ramisandraina, Raul Essig, Mathieu Chardonnet and Lewis Taylor.

30. The Award for Outstanding Effects Simulations in an Animated Feature went to Spider-Man: Across the Spider-Verse and the team of Pav Grochola, Filippo Maccari, Naoki Kato and Nicola Finizio.

31. The Award for Outstanding Effects Simulations in an Episode, Commercial, Game Cinematic or Real-Time Project went to The Mandalorian; Season 3; Lake Monster Attack Water and the team of Travis Harkleroad, Florian Witzel, Rick Hankins and Aron Bonar.

32. The Award for Outstanding Compositing & Lighting in a Feature went to The Creator; Bar and the team of Phil Prates, Min Kim, Nisarg Suthar and Toshiko Miura.

33. The Award for Outstanding Compositing & Lighting in an Episode went to The Last of Us; Endure and Survive; Infected Horde Battle and the team of Matthew Lumb, Ben Roberts, Ben Campbell and Quentin Hema.

34. The Award for Outstanding Compositing & Lighting in a Commercial went to Coca-Cola; Masterpiece and the team of Ryan Knowles, Greg McKneally, Taran Spear and Jordan Dunstall.


35. The Award for Outstanding Special (Practical) Effects in a Photoreal Project went to Oppenheimer and the team of Scott Fisher, James Rollins and Mario Vanillo.

36. The VES Emerging Technology Award was awarded to The Flash; Volumetric Capture and the team of Stephan Trojansky, Thomas Ganshorn, Oliver Pilarski and Lukas Lepicovsky.

37. The Award for Outstanding Visual Effects in a Student Project (Award Sponsored by Autodesk) was awarded to Silhouette and the team of Alexis Lafuente, Antoni Nicolaï, Chloé Stricher, Elliot Dreuille (with Baptiste Gueusguin).

38. Seth MacFarlane, award-winning Actor and Creator of Family Guy and The Orville, prepared to present William Shatner with the VES Award for Creative Excellence.

39. Actress Kiersey Clemons (Monarch: Legacy of Monsters) joined the show as a presenter.

40. Board Chair Kim Davidson with Lifetime Achievement Award recipient Joyce Cox, VES, VFX Producer Richard Hollander, VES and Executive Director Nancy Ward.

41. The Creator director Gareth Edwards met up on the red carpet with Takashi Yamazaki, Godzilla Minus One director and VFX Supervisor.

42. Friends William Shatner and Seth MacFarlane enjoyed a moment together backstage.

43. James Knight, left, Global Director, Media & Entertainment/Visual Effects, AMD, with director Gareth Edwards.

44. Acclaimed Actor, Director and Producer William Shatner received the VES Award for Creative Excellence.

45. Actress Katee Sackhoff (The Mandalorian) congratulated all the nominees and winners.

VES AWARD WINNERS

THE CREATOR

The VES Award for Outstanding Visual Effects in a Photoreal Feature went to The Creator, which garnered five VES Awards including Outstanding Created Environment in a Photoreal Feature (Floating Village), Outstanding Model in a Photoreal or Animated Project (Nomad), Outstanding Effects Simulations in a Photoreal Feature and Outstanding Compositing & Lighting in a Feature (Bar). (Photos courtesy of Walt Disney Studios)


SPIDER-MAN: ACROSS THE SPIDER-VERSE


Outstanding Visual Effects in an Animated Feature went to Spider-Man: Across the Spider-Verse, which won four VES Awards including Outstanding Animated Character in an Animated Feature (Spot), Outstanding Created Environment in an Animated Feature (Mumbattan City) and Outstanding Effects Simulations in an Animated Feature. (Photos courtesy of Columbia Pictures/Sony)


THE LAST OF US


Outstanding Visual Effects in a Photoreal Episode went to The Last of Us; Season 1; Infected, which won four VES Awards including Outstanding Animated Character in an Episode, Commercial, Game Cinematic or Real-Time Project (Endure and Survive; Bloater), Outstanding Created Environment in an Episode, Commercial, Game Cinematic or Real-Time Project (Post-Outbreak Boston) and Outstanding Compositing & Lighting in an Episode (Endure and Survive; Infected Horde Battle). (Photos courtesy of HBO)


VFX IN CANADA: A GLOBAL LEADER CONTINUES TO EVOLVE

Powered by studios in Vancouver, Toronto and Montreal, Canada continues to grow and evolve as a major hub of the global VFX industry. Tax incentives, immigration policy, quality of life, excellent VFX and animation schools and beneficial time zones have all contributed to Canada’s prominence in global VFX. Moreover, a steady stream of Hollywood productions is lensed there, confirming Canada’s reputation as a world-class source of talent and innovation.

OPPOSITE TOP TO BOTTOM: Image Engine Design in Vancouver has been a key contributor to the Netflix series 3 Body Problem. (Image courtesy of Netflix)

Recent projects for Montreal-based Raynault VFX include Secret Invasion Season 2 for Disney+ as well as Percy Jackson and the Olympians, All the Light We Cannot See, White Noise and Fantastic Beasts: The Secrets of Dumbledore. (Image courtesy of Raynault VFX and Disney+)

MARZ in Toronto contributed VFX to Moon Knight as well as WandaVision, Ant-Man and the Wasp: Quantumania, Spider-Man: No Way Home, Wednesday, Stranger Things and the Percy Jackson series. (Photo: Gabor Kotschy. Courtesy of Marvel Studios)

Canada’s early animation and VFX software legacy and long history with Hollywood have helped Canadian film and VFX schools, like the Think Tank Training Centre in Vancouver, consistently address the need for talent to fill available positions at studios. (Image courtesy of Think Tank Training Centre)

“Canada has always been a leader in the industry from the early days of the National Film Board to companies like Alias (creator of what is now Maya) and SideFX, who have defined the founding principles of VFX and animation,” says Dave Sauro, Partner and Executive Producer at Herne Hill Media, a Toronto studio founded in 2021. “Canada has become a center of excellence not just for VFX, but also for film production in its entirety. Whatever is scripted, we can bring it to life.” Herne Hill worked on Guillermo del Toro’s Cabinet of Curiosities soon after the firm opened its doors. Sauro notes, “We are currently in various states of post-production on a few different projects, including In the Lost Lands, which is an adaptation of a George R.R. Martin short story directed by Paul W.S. Anderson.” The firm is also finalizing Lee Daniels’ The Deliverance and The First Omen.

On the other hand, perhaps Canada’s VFX success “has got something to do with the long, cold, dark winters?” asks Shawn Walsh, Visual Effects Executive Producer & General Manager at Image Engine and Chief Operating Officer, VFX, of the Cinesite Group. Founded in Vancouver in 1995, Image Engine merged with Cinesite in 2015. “When you spend a good deal of time indoors following your passions, that creates a kind of fertile ground for the focus, creativity, technical knowledge and innovation that high-end visual effects require. It seems to me that Canadians have never been shy [about] a little hard work either! Canadians have had a strong presence in Hollywood, animation and visual effects for a very long time.”

TOP: Toronto-based Herne Hill Media worked on Guillermo del Toro’s Cabinet of Curiosities soon after the firm opened its doors. (Image courtesy of Herne Hill Media and Netflix)

In recent years, Image Engine has worked on a healthy mixture of high-end series like Game of Thrones, The Mandalorian, 3 Body Problem and Lost in Space as well as the Fantastic Beasts films, Mulan and District 9, Elysium and CHAPPiE for director Neill Blomkamp, and Zero Dark Thirty for director Kathryn Bigelow.

COMPUTER ANIMATION

“Canada has always been a front-runner when it comes to computer animation. Maya, and Softimage before Maya, began in Canada. Canadians filled many of the earliest positions because we were more familiar with the software and skills needed for those early films and TV shows,” says Scott Thompson, CEO and Co-Founder of Think Tank Training Centre in Vancouver. “That legacy helped Canadian schools better address the positions made available at studios that have settled north of the border.”

SideFX, co-founded in Toronto in 1987 by current President Kim Davidson, used its software, PRISMS, to lay the groundwork for Houdini. SideFX technology has been recognized by the Academy of Motion Picture Arts and Sciences five times for Houdini and its breakthrough procedural-based technology. Numerous VFX studios working on Oscar-winning and/or blockbuster films have used the software.

“With its flagship product Houdini, SideFX has been a key driver in the growth and innovation of the Canadian VFX industry, particularly in Toronto,” comments Christopher Hebert, SideFX Senior Director of Marketing. The company’s work “has led to significant advancements in VFX and animation, making Houdini a staple in many studios and pushing the boundaries of visual effects capabilities. This influence extends to job creation and talent development, with SideFX employing a significant portion of its workforce in Canada. Their contribution to the software development ecosystem – as well as initiatives like the Houdini Internship Program – not only supports the local economy but also ensures a high level of VFX expertise within the country, fostering a robust and skilled VFX workforce.”

Also in Canada, Alias Research launched in Toronto in 1983 and Softimage in Montreal in 1986, eventually resulting, after various acquisitions and mergers, in (Autodesk) Maya, the award-winning, widely-used 3D modeling and animation software. In 2003, Alias received an Academy Award for Technical Achievement for the development of Maya software.

When new computer animation tools arrived, Canada’s art schools began “modifying their classical curriculums by adopting tech,” explains Lon Molnar, Co-Founder of MARZ, a VFX studio that launched in Toronto in 2018. “In the early days, Sheridan College outside of Toronto became a world-renowned leader for animation due to its talented faculty. In the ’90s, schools like Vancouver Film School built a reputation for training traditional and computer animation while cranking out amazing talent – and still do. The Canadian Government along with certain territories jumped on board and supported the industry with various incentives. Filmmaking in Canada as a stand-in for various locations grew while solid investment in soundstages in centers like Vancouver, Toronto and Montreal followed the demand. Add all this up and eventually you have a vibrant industry with a reputation to deliver.” MARZ contributed VFX to projects such as WandaVision, Moon Knight, Ant-Man and the Wasp: Quantumania, Spider-Man: No Way Home, Wednesday, Stranger Things and the Percy Jackson series.

CONTINUED TRAJECTORY

The acceleration of the visual effects industry in Canada over the last 25 years or so can also be attributed to “gradual factors such as the introduction of tax incentives in the late 1990s, strategic investments in education and the establishment of high-quality studios in cities like Vancouver and Montreal,” says Valérie Clément, VFX Producer for Raynault Visual Effects. In addition, Clément observes, “The boom in film and TV production in Canada has significantly boosted the exposure and vigor of the VFX and animation industry at large by creating increased demand for services, ensuring a steady flow of projects, fostering collaboration opportunities, contributing to the economy, driving talent development, gaining global recognition and spurring technological advancements.”

Clément points out, “Of course, the government incentives – generous tax credits, for example – played a huge role. There is also the skilled workforce, the strong infrastructure with more and more visual effects studios located mainly in Montreal, Vancouver and Toronto.” Clément’s firm, Raynault VFX, was founded in Montreal in 2011 by industry legend Mathieu Raynault. He surrounded himself with a small, select team of artists and grew the studio into a full-service VFX facility. Some 2022-2023 projects have included Percy Jackson and the Olympians, All the Light We Cannot See, White Noise, His Dark Materials, Season 3, Thor: Love & Thunder, Fantastic Beasts: The Secrets of Dumbledore, The Old Man and Invasion. Clément also emphasizes the advantage of Canada’s time zones compared to those of Hollywood. There is a limited or non-existent time difference – Vancouver, for example, shares a time zone with California, whereas London is eight hours ahead.

Ryan Stasyshyn, Managing Director of Toronto-based Mavericks VFX, cites these reasons for Canada’s VFX success: generous tax incentives and rebates offered by various provinces, a skilled workforce, strong government support, a favorable exchange rate (vs. USD) and, as Clément notes, a significant amount of physical production taking place in Canada. Recent high-profile projects for Mavericks VFX include John Wick: Chapter 4, The Handmaid’s Tale, Fargo, Fellow Travelers, The Offer, Don’t Worry Darling, The Boys, The Expanse and Halo.

TIPPING POINT – VANCOUVER

Walsh notes, “I think a key turning point in Vancouver’s history as a hub for high-end visual effects work was when Image Engine completed our work for Neill Blomkamp’s District 9 at more or less the same time that [Vancouver-based] The Embassy did some stunning work for Marvel’s first Iron Man and MPC created some solid work for Watchmen. I think this was around 2008-2009. This was the first time that Vancouver visual effects studios, broadly speaking, were really producing work that was the equal of any location, any company in the world. In fact, District 9 was the only project to beat Avatar in any category at the VES Awards that year! The town really grew from that point on. We were on the map, as they say. Since then, Vancouver has gone from strength to strength and has continued to lead the Canadian scene.”

TOP: Recent high-profile projects for Toronto-based Mavericks VFX include John Wick: Chapter 4 as well as Fargo, Fellow Travelers, The Offer, Don’t Worry Darling, The Boys, The Expanse and Halo. (Image courtesy of Lionsgate)

BOTTOM: Toronto-based Mavericks VFX contributed VFX to the Hulu series The Handmaid’s Tale. (Image courtesy of Hulu)

LIFE PERKS

There are many advantages to living in the three major Canadian cities. In Toronto, Stasyshyn notes, “We have a vibrant cultural scene that’s extremely diverse. It’s also a very welcoming city with lots to explore and do.” Clément notes, “[Artists] also have access to all the perks of working and living in Canada: good quality of life, high living standards and a safe environment with the emphasis on a healthy work-life balance.”

Sauro also points to the opportunity to live and work in different parts of Canada. “Vancouver, Toronto and Montreal are all important VFX hubs in the Canadian market, and each offers something different from a lifestyle perspective. Whether it’s the great outdoors of Vancouver, the big city living of Toronto or the European feel of Montreal with its excellent restaurants, each can allow you to not only earn a living doing what you love, but also live in a city that best fits your interests outside of work,” he explains.

BENEFITS & SUPPORT

In addition, “Employment standards in visual effects companies across Canada are generally very high,” Walsh notes. “Wages are generally on par with anywhere in the world that’s doing equivalent levels of quality of execution. Aspects like healthcare and benefits again are on par or above those offered in other visual effects hubs around the world. And there’s a broadly diversified industry with many different companies in terms of shapes, sizes and focuses. [For artists], Canada represents a prime location to consider plying your trade.”

“Canada has a government that understands the importance of supporting the arts,” Sauro says. Part of that help comes in support for non-Canadians studying and working in VFX. Walsh notes that Canada “is a relatively open country that supports companies towards their immigration needs.” Aline Ngo, Image Engine recruiter, notes the importance of “facilitating the retention of skilled talent in Canada.” She says that government visa help enabling visual effects graduates to stay in the country is key. One path is “the possibility of getting a three-year post-graduate work permit after graduating.”

INCENTIVES

“Government support has played a pivotal role in Montreal’s VFX industry success,” Clément comments. “Generous tax incentives and subsidies attract studios, fostering growth. Investment in infrastructure and education ensures top-notch facilities and a skilled talent pool. In essence, government backing has been instrumental in shaping Montreal into a VFX pole.”

Walsh comments, “There’s great local support for the industry in the three main cities where the majority of the visual effects work transpires – Vancouver, Montreal and Toronto. Labor-based tax credit regimes certainly don’t hurt when attracting the client base, but without the talent to execute the work, no amount of tax credit will matter.” Likewise, Sauro affirms, “We can’t ignore the obvious benefits tax credits play, both at a provincial and federal level, in attracting studios and producers to Canada, but that alone is not enough.”

TOP TO BOTTOM: In addition to Guillermo del Toro’s Cabinet of Curiosities, Toronto-based Herne Hill Media is involved in In the Lost Lands, which is an adaptation of a George R.R. Martin short story directed by Paul W.S. Anderson, as well as Lee Daniels’ The Deliverance and The First Omen. (Image courtesy of Herne Hill Media and Netflix)

MARZ in Toronto provided VFX for Spider-Man: No Way Home. (Photo: Matt Kennedy. Courtesy of Marvel Studios)

Image Engine Design in Vancouver has worked on the Fantastic Beasts films, including Secrets of Dumbledore, and high-end series such as Game of Thrones, The Mandalorian, 3 Body Problem and Lost in Space. (Image courtesy of Warner Bros. Pictures)

TOP TO BOTTOM: Image Engine Design in Vancouver provided VFX for Venom: Let There Be Carnage. (Image courtesy of Columbia Pictures/Sony and Marvel Studios)

Image Engine Design provided VFX for 3 Body Problem. (Image courtesy of Netflix)

Montreal-based Raynault VFX contributed to FX Network series The Old Man. (Image courtesy of Raynault VFX and FX Network)

SCHOOLS

VFX and animation schools have also helped build the industry. Sauro comments, “When you look at the post-secondary institutions in this country, we’re fortunate to have some of the best for VFX, animation and design: OCAD University, Sheridan College, Humber College, Vancouver Film School, etc. It’s an embarrassment of riches.”

Stasyshyn points to Seneca College, Sheridan College and Vancouver Film School (VFS) as some of the top VFX and animation schools in Canada. “These schools have played a crucial role in shaping the skills of the Canadian VFX workforce. Their programs often include hands-on training, industry connections, networking events and exposure to some of the latest technologies.”

For SideFX, “The proximity to top-tier educational institutions – like the University of Waterloo and the University of Toronto, renowned for their software development and graphics labs, and colleges like Sheridan College, known for its CG education – ensures a steady stream of skilled graduates and potential innovations in VFX technology,” says Hebert. Clément adds, “In Montreal, the VFX talent pool has expanded through esteemed educational institutions like NAD [The School of Digital Arts, Animation and Design] providing industry-relevant education. The establishment of major VFX studios has also significantly expanded career opportunities for local artists.”

Noteworthy Canadian VFX/animation schools also include public schools Emily Carr University of Art and Design, Langara College, British Columbia Institute of Technology (BCIT) and Capilano University and private institutions like Lost Boys School of Visual Effects, Think Tank and Vancouver Film School (VFS), according to Ngo.

“With a focus on story and problem-solving, we have been serving our industry with students who are prepared for an adaptable career in VFX,” says Colin Giles, Head of the School for Animation & VFX at Vancouver Film School. “Given the rapid changes in techniques and technology, we continue to upgrade our facilities and curriculum to not only stay in tune with current methodologies, but also prepare our students for how these changes can enhance their storytelling and help them find their artistic voice.”

Thompson remarks, “Canadian-based studios have a substantial piece of the VFX pie, so students are very close geographically to their first VFX job. These studios also give Canadian schools access to instructors and mentors that are working on the biggest VFX films and TV shows being made. At Think Tank, we are continually polling the industry to better understand the software, workflows and demands of the FX industry.”

Canadian VFX/animation schools work hard to stay in demand and stay in touch with the VFX companies. Lost Boys Co-Founder and Director Ria Ambrose Benard comments, “We were the first school to teach Houdini for FX and Katana for lighting. This helped our students stay ahead of the curve and in demand when the industry was growing. The FX program and the lighting program were designed at the request of the studios in the industry years ago.” Giles notes, “The growing talent pool is being fueled by high school interest, and international students are attracted to animation schools across Canada. This has allowed the VFX industry to tap into deep tax credits and build a sustainable nationwide industry. In addition, like VFS, we are able to bring in top instructors and mentors from the expanding VFX footprint.”

Walsh continues, “Bringing visual effects work north seems to have been a natural progression. Initially, many of us left home to work abroad because that’s what you had to do to experience working on the visual effects shots that captivated our attention. However, there was a turning point around the late 2000s when many of us returned home and brought our new-found friends from around the world with us! Since then, it’s hard to imagine a more multicultural, multinational visual effects company than a Canadian one. Canada has always been a tremendous draw for immigration, and the visual effects industry has been a strong contributor to that story. It seems like Canada has found the right soupy mixture of various factors that have created an environment that supports the talent that’s so crucial to visual effects work. The job now is to continue to grow that talent base through continuing immigration, creative and technical development opportunities and strong industry leadership.”

VFX STUDIOS

Other notable Canadian VFX/animation firms include Spin VFX (Toronto), Rocket Science VFX (Toronto), Rodeo FX (Montréal, Québec City, Toronto), Soho VFX (Toronto), Guru Studio (Toronto), Folks VFX (Toronto), Zoic Studios (Vancouver), The Embassy (Vancouver), Hybride, a Ubisoft company (Montréal), Artifex Animation Studios (Montréal) and Alchemy 24 (Montréal). Branches of foreign VFX and animation companies have also contributed to Canada’s growth in visual effects. Vancouver has outposts of Wētā FX, ILM, Framestore, DNEG, Pixomondo, Sony ImageWorks, Walt Disney Animation Studios, Digital Domain, Crafty Apes VFX, FuseFX, Scanline VFX (owned by Netflix), Ghost VFX, Luma Pictures, Clear Angle Studios, Animal Logic, CoSA VFX, Barnstorm VFX and Ingenuity Studios, among others. Montréal branches include Framestore, DNEG, Pixomondo, Sony ImageWorks, MPC (Technicolor), Mikros Animation (Technicolor), Digital Domain, Mathematic Studio, Crafty Apes, Folks VFX (Fuse Group), Outpost VFX and Scanline VFX. Toronto facilities include DNEG, Tippett Studio and Ghost VFX, among others.


TOP: Vancouver Film School continues to upgrade its facilities and curriculum to stay in tune with current methodologies and better prepare students for the evolving industry. (Image courtesy of Vancouver Film School)

BOTTOM: Think Tank Training Centre in Vancouver is among the wealth of animation and VFX schools in Canada developing a steady stream of creative and technical talent that keeps the industry growing. (Image courtesy of Think Tank Training Centre)


MARIANNE SPEIGHT: ACHIEVING THE FILMMAKER’S VISION BY EMBRACING VFX

Marianne Speight was born in Stockton-on-Tees in the North East of England but lived in New Zealand for a few years. Aside from specific university courses, Speight didn’t receive any formal VFX training. “I wasn’t aware of any training availability when I started,” Speight says. “Mainly, it was on the job with artists/supervisors and producers passing on their knowledge to me on a daily basis. The only course I did take was compositing basics at Escape Studios as I wanted to learn what the compers were talking about with regard to various lighting and comp passes they needed when I was a coordinator.”

Speight broke into the visual effects industry after joining Peerless Camera Company. “Terry Gilliam had just started his film The Brothers Grimm at that time, so it was an exciting time to start my VFX journey – and a steep learning curve! I loved visual effects in Star Wars but had little concept of how visual effects were actually made,” Speight notes. “It was really exciting having direct access to the filmmaker and learning how creative feedback would affect asset builds or shot composition and in turn what the impact on the schedule would be. Mostly, though, it felt like we were very connected to the creative process and the director’s vision, which was inspiring. It was also intriguing to work with a lot of very experienced artists who were happy to share their knowledge and also some amazing stories from the days of optical printers.”

Speight was very interested in the production budget/scheduling side of the effects from the beginning, and Peerless owner Kent Houston encouraged her to go in that direction. Speight served as Visual Effects Coordinator on the 2005 film Racing Stripes. “Racing Stripes was fascinating to me as a new coordinator because it had so many parts of the VFX process involved. So, it was a good learning experience to get me used to many parts of the process for muzzle replacements and full CG animals,” she explains. “I learned a lot about dependencies on different areas such as prep, animation, lighting and comp. I especially enjoyed watching the animators pull faces into a mirror and recreate those on the zebra! It gave me a good grounding to know how long different parts take and what can speed up or slow down the schedule.”

OPPOSITE TOP TO BOTTOM: Speight was looking to expand her experience on challenging projects in terms of volume and complexity, and found both working on The Chronicles of Narnia: The Voyage of the Dawn Treader (2010). (Image courtesy of 20th Century Fox and Walden Media, LLC)

As a new coordinator on Racing Stripes, Speight was introduced to many parts of the VFX process, including muzzle replacements and full CG animals, as well as to factors that impact a production schedule. (Photo: Blid Alsbirk. Courtesy of Warner Bros. Pictures)

Speight began her visual effects career on Terry Gilliam’s The Brothers Grimm (2005). (Image courtesy of Dimension Films and MGM)

When Speight first joined MPC in 2009, she was already established as a visual effects producer, but was looking to take her career in a new direction. “I was looking to expand my experience on challenging projects in terms of volume and complexity, and I enjoyed The Chronicles of Narnia: The Voyage of the Dawn Treader, which had both. It was a fun and technically challenging show. It was my first experience working with CG water simulation and rendering. Back then, Flowline was the software of choice, and that presented its own set of challenges since simulation and render times were evolving during production. It was also a heavy creature and animation show, and I loved being involved in the development process of the creatures from initial concept to final shots. The show itself was quite an eclectic mix of challenges, but it was a really fun film to work on as there was a great team across the board and a collaborative client.”

Among Speight’s other credits as a Visual Effects Producer is X-Men: First Class. “I loved working on X-Men: First Class. It wasn’t a huge volume of shots, 200 max, but it was complex work with a host of separate technical and artistic problems to solve,” Speight remarks. “I particularly liked the Hank [McCoy] beast transformations involving muscle deformations and fur appearance/disappearance. It was a real challenge for rigging and groom, and it felt like we were learning a lot throughout the process in terms of how we developed our approach to using software. It was fun working with the creative team led by [MPC Visual Effects Supervisor] Nicolas Aithadi on this one as everyone loved the franchise. I also got to work with [Visual Effects Designer] John Dykstra. That was a huge honor to work with a VFX legend.”

TOP: Marianne Speight, Chief Business Development Officer and Executive Producer, Milk VFX. (Photo: Simon Wicker)

Speight currently serves as Chief Business Development Officer and Executive Producer for Milk VFX. “I always admired Milk while working at other companies. They always had a talented team, and I was a fan of their creature work and their ability to be involved in a wide range of projects. Milk has always had a strong client base for repeat business, and when I came in, my goal was to expand that client base and reinforce our connections in the industry and with filmmakers and showrunners. I enjoy working with clients to break down scripts and work out methodologies that are going to give the look they want but also within their budget. We always want to be with clients, working to help them develop their ideas and inform their creative process. We are working to expand our client base to a more global reach while still offering a very personal interaction with our clients. As part of this, Milk now has studios in Bordeaux, Barcelona and Dublin, and it’s great to be able to access the range of talented artists in those areas. We are a full-service VFX house with very strong FX and environment teams, but our continuing specialization is creatures, so we are looking to develop that even further. In my role as Executive Producer, I also oversee projects from the bid stage through award through shot delivery. It’s great to have that continuity and to see how the ideas, assets and shots have evolved and all the various creative twists and turns it may have taken. It’s always very satisfying to see your work on the big screen!”

It was a career highlight for Speight to work with Ridley Scott on Prometheus (2012). (Image courtesy of Twentieth Century Fox)

Speight’s favorite shot from her career was the sequence of the Juggernaut crash from Prometheus as it was quite an epic build and a landmark part of the film. (Image courtesy of Twentieth Century Fox)

When it comes to selecting a favorite visual effect shot from her career, Speight is quick to point to one shot in particular. “My favorite shot from my career is probably actually a group of shots, which I guess is cheating, but it was the sequence of the juggernaut crash from Prometheus as it was quite an epic build of the asset and a landmark part of the film,” Speight says. “It was a huge asset, and the textures needed to be very high-res as they were coming very close to the camera. The animation and FX also had to be spot on to make the weight of the ships and crash believable so the audience felt how vast it was. It was a great sequence to develop as I’m a big fan of Ridley Scott. It was pleasing to develop that with him and see how happy he was with it.”

Choosing an overall project that she is most proud of is more of a daunting task for Speight. “It’s a tough choice on which project I’m the most proud of,” Speight notes. “I think in terms of my early career I was most proud of Casino Royale because I always wanted to work on a Bond film. I also was just very proud of getting that one delivered in a short space of time, comparatively speaking. It involved a range of visual effects from environment extensions to face replacements. It was a great project to work on. I particularly liked the crane sequence we did with Daniel Craig.”

TOP TO BOTTOM: X-Men: First Class didn’t involve a large volume of shots, but Speight found it was complex work with a host of separate technical and artistic problems to solve. (Image courtesy of Marvel Studios)

“However, I think my favorite project I worked on was Guardians of the Galaxy because it was just such a big film with just so many different sequences requiring different elements, different characters that we needed to do,” Speight continues. “MPC and Framestore were building their own assets for Rocket and Groot, and they had to match exactly. Both used some proprietary software, so it was kind of ‘how do we make assets match while we are both building at the same time with our own setups?’ But I think everyone worked together really well. It was a good example of studios working together to get the best outcome for the film. I loved the process of building Groot as that was complicated from a rigging perspective, and groom was a challenge for Rocket, but the biggest one was making both characters ‘real,’ and I think the animators did a fantastic job. The spaceship fights were cool to work on both in terms of animation and in terms of having massive ships with an immense polygon count to render. There were just so many different stand-alone sequences over the project; it wasn’t like there was much repeating in terms of effects requirements, so that’s really challenging but satisfying to do. A lot of that fun came down to the team – they were brilliant, and I loved how well everyone worked together.”

Technology-wise, the VFX industry is ever-evolving and has developed since the beginning of Speight’s career. “Certainly, when I first started my career it felt as though there was a new development monthly in terms of software, pipeline and hardware that could be used,” Speight details. “It has always been a very fascinating industry to be in. When faced with technical and creative challenges that haven’t been solved before, it is always incredible to see technical and artistic talent within our industry take on those challenges so readily and design and build something that works. During my career, we went from shooting on film to digital to LED walls, virtual production and virtual scouting, and I think those technological updates have been great for showrunners and filmmakers to give them all the tools they need to achieve their vision in a way that embraces VFX and how it can work for them. Having a facility that can consult early on to get the best possible outcome for their project is crucial, and I think we want to always try and advise on what tools could help them.”

In terms of inclusivity within the industry, Speight explains that it is an area that can always be improved, but the industry is heading in the right direction. “There is a lot more to be done to achieve a more diverse workplace through targeted recruiting and outreach to individuals and groups that wouldn’t necessarily have considered VFX as an option open to them due to its previously atypical demographic. For me, flexibility has led to inclusivity as a Mum, and it’s been great to be able to carry on with my career progression and have a family. It’s crucial for me to have a level of flexibility to be able to perform at the highest level, and there’s been a move towards flex and hybrid working in recent years, so I can juggle the needs of my family and the needs of the company. I feel that the industry is now more and more recognizing the benefits of that as it is retaining very experienced and talented crew who also want to have balance in their lives while working on some cool projects!”

Guardians of the Galaxy was a “big film” in Speight’s career for its vast scope of demands because it was filled with many different sequences requiring different elements and different characters that needed to be realized. (Image courtesy of Marvel Studios and Walt Disney Studios)

Speight cites Guardians of the Galaxy as a good example of studios working together to get the best outcome for the film. (Image courtesy of Marvel Studios and Walt Disney Studios)

TOP TO BOTTOM: Casino Royale involved a range of visual effects from environment extensions to face replacements, and Speight was proud of delivering a complex project in a comparatively short period of time. (Image courtesy of Columbia Pictures/Sony)

THE CONSEQUENCES OF FALLOUT

Images courtesy of Amazon Prime Video.

TOP: VFX Supervisor Jay Worth and the team at RISE FX were responsible for creating the nuclear explosion in Los Angeles. CG buildings were added between real buildings to create a retro-futuristic L.A.

OPPOSITE TOP TO BOTTOM: RISE FX handled all the power armor work.

Power Armor Suits on the move.

Director Jonathan Nolan and Ella Purnell (Lucy). The series story is set in the Fallout world, but it’s not a story that has already been told or seen in the games, which gave the director, producers and VFX team the freedom to shape the story and tone of the show.

Based on the popular video game franchise, Fallout is set in post-apocalyptic Los Angeles. After a devastating nuclear explosion, citizens are forced to live in underground vaults. Jay Worth served as Visual Effects Supervisor on the Amazon Studios show and has worked alongside director Jonathan Nolan and Executive Producer Lisa Joy since Person of Interest (2011). “We also worked together for the entire run on Westworld and The Peripheral, and I helped out with Lisa’s movie, Reminiscence,” Worth adds. “It has been a great partnership for a long time. I’ve had the privilege of working with them for 13 years. So I was able to come onto Fallout very early into the project – it has been an amazing one.”

When it came to his initial conversations with Jonah [Jonathan Nolan] and Lisa about the look of Fallout, Worth notes that the conversations always begin the same way when working on a new project. “It’s first getting the script, then there are the huge ideas behind it and then figuring out how to take all these pieces and figure out a visual language that will work. This one started out with immersing myself in the game because the specifics were so vital for this project. Fallout has a very specific patina that we wanted to honor and be faithful to, but that is also what makes it a fun playground for VFX. We always start with real locations as our canvas, and we build from there. We did some amazing scouts to Namibia and Utah, which helped ground the world we were trying to create. I’ve enjoyed working with Howard Cummings, the Production Designer, since Westworld, and it has been really fun partnering with him to bring this world to fruition. Working with the same people for years brings a measure of trust and collaboration that makes a daunting project like this much more enjoyable.”

Andrea Knoll was Visual Effects Producer on the show. “A big part of the initial conversations, specifically in visual effects, involved Jay and me breaking down the scripts together, planning how we would want to approach the day-to-day with our On-Set Visual Effects Supervisor, Grant Everett, then really digging into early discussions with key vendors we both loved working with and knew would make a good fit for the type of work we would be doing on Fallout, from environment to hard surface to creature work. I said it from the beginning, this is a show where we truly got to do it all. And we wanted the best teams to join us on this journey. To ensure we’d have the best, we started specific planning very early on,” Knoll says.

“We knew we would be shooting primarily in New York, and we knew block one would be big, with Jonah at the helm,” Knoll continues. “Building a strong on-set team in New York was key to the success of the shoot. The team was composed of our amazing on-set supervisor, Grant [Everett], and three data wranglers as well as an on-set production assistant. They all brought a lot to the table and did a fantastic job. When we moved into the later blocks, we expanded the team a bit to accommodate our shooting plan; we hired a second on-set supervisor, Sam O’Hare. He and Grant had the support of the three wranglers, an assistant wrangler and a production assistant. This way they could handle multiple units that were running simultaneously. Jay and I were very hands-on when it came to the day-to-day during the shoot and made sure the right teams were covering the right units and ensuring we would have what we would need in post to execute the best work.

“In L.A., we built our VFX team as we were gearing up to shoot in the summer of 2022. We had a VFX Editor, Jill Paget, and VFX Assistant Editor, Stephanie Huerta-Martinez, who had worked together previously and came on as a very strong team. We also hired our VFX Production Manager, Jackie VandenBussche, and two coordinators, with Kacey Phegley leading the coordinators.”


TOP: FutureWorks in India was responsible for the Ghoul noses and character. Over 500 nose replacements were required for Walton Goggins (The Ghoul) for the entire season.

BOTTOM TWO: Creating the desolate, post-apocalyptic L.A. landscape was a big VFX challenge, as multiple shooting locations were combined to make it feel real while staying true to the tone and look of the Fallout world.

Worth usually takes a more categorical framework when approaching the workload with his team, and Fallout hit at a time when all of his favorite vendors had bandwidth. “In years past, I’ve gotten stuck not getting the teams I wanted on certain shows, which made it a little hard when I was trying to say yes to certain creative choices. So, this situation where I got to have my dream vendors work on all the things I wanted was truly rewarding. I was able to partner with Framestore out of Montreal to do our creature work, RISE FX in Germany to work on all of the Vertibird shots, power armor work and numerous environments, and Important Looking Pirates out of Sweden did the flawless work on the Cyclops and our robot, Snip Snip. Additionally, FutureWorks in India is doing all our Ghoul noses; they have stepped up to the challenge of creating the character and having that look amazing. We also have the privilege of working with Refuge, CoSA, Mavericks, One of Us, Studio 8 and Deep Water FX. I was able to lean into each of their strengths or the things that I love about the work they do.”

“I was impressed by Framestore’s work on the polar bears in His Dark Materials. I felt confident working with them to create the Yao Gui,” Worth continues. “Grant had worked on that with them, and we leaned on his experience for our shooting methodology. We have this amazing battle between the power armor and the Yao Gui. These two titans are fighting, so figuring out how to get the weight right and all technical aspects accurate is key to making it feel real. We try to make sure there is always something real in the frame, which gives us something to work from and build on.”

The team at Legacy FX was responsible for building the incredible power armor suit. “The suit is just amazing,” Worth says. “For the season, we only had to do a few power armor shots, and it was just for when it was flying through the air and we weren’t able to rig it for safety reasons. We were also able to build a massive puppet head for the Gulper. The VFX team worked with Framestore early in the process to get the proportions and look right, and, most importantly, to nail down the skin tone. The team at Quantum FX built this beautiful puppet head that was integral to our shooting methodology. We had this real object for our actors to interact with, for water to interact with, and it was even rigged so someone could get swallowed, which was vital to pull off the climactic moment between Max and Thad in Episode 3. Whenever it’s interacting with somebody, it is really interacting with them. So, you get all of the contact points, the shadow and lighting interactions, and that makes it far easier to build off of and end up with something that looks real.”

For the Vertibirds, Howard and the production team built the entire interior buck of the Vertibird. “So, we have all these real practical surfaces, and then we are able to take that, put it on a gimbal and shoot it on a huge LED volume,” Worth explains. “We went to our locations and shot numerous three-camera helicopter and drone plates to play back on the LED stage. So, all of our flying footage is all captured in-camera, which was critical to make it feel real. There’s no other way to do it and make it look like you are really flying.”

“We were able to partner with the team at Magnopus to create three unique and vital environments in Unreal for use on the LED stage,” Worth continues. “The story lent itself to the technology of the LED stage. Our story is that the vault used a Telesonic projector to recreate a Nebraska landscape inside the vault, so we were able to lean into and use that to our advantage to create one side of every vault interior. However, the interior of the vault door room before Lucy leaves for the outside world is my favorite. The set Howard and the art department built, combined with the assets that Ben Grossmann, AJ Sciutto and the team at Magnopus put together, created a flawless environment. We got the footage back after the first day, and we couldn’t tell where the practical set ended and the virtual one began.”

TOP: The VFX team worked with Framestore to get the proportions, look and skin tone, and Quantum FX built the puppet head that provided a real object for the actors and water to interact with. It was even rigged so someone could get swallowed. BOTTOM TWO: The Brotherhood of Steel and Vertibirds.

Overall, there were approximately 3,300 visual effects shots for the season. “Every time we said yes to something, we thought it would be the most challenging thing, and then our vendors kept knocking it out of the park,” Worth notes. “The Cyclops, played by Chris Parnell, was one of the biggest challenges of the season because he’s a character we see over multiple episodes. We wanted to retain his amazing performance while figuring out how to replace the majority of his face. We didn’t want to create the traditional Cyclops that we had seen before. It needed to feel more real and organic. One of the things that we felt cracked part of the code on that realism was we ended up giving him two eyebrows. It sounds like a subtle thing, but that for me makes it feel like it’s a real person with a real single eye. Even though all of us watched it through iteration after iteration, his entire performance made us laugh every single time. The team at Important Looking Pirates delivered magnificently on that one. We also knew that replacing the Ghoul’s nose was going to be rough, having to do over 500 nose replacements for one of our leads for the entire season. However, the team at FutureWorks created an amazing pipeline for those and stepped up to the challenge.”

Creating the Wasteland proved to be particularly daunting for Worth and his team. “The biggest challenge we had was in combining multiple shooting locations to make it feel like a real place, all while making sure we are true to the tone and look of the Fallout world,” Worth says. “It was a really fun puzzle to put together. The game has this wonderful tone and look to it, and we were lucky enough to shoot in amazing locations like Namibia, Utah and New York. However, to bring all of these together and make it look like a unique version of Fallout Los Angeles took a lot of collaboration between Jonah, [Executive Producer] Geneva Robertson-Dworet, [Executive Producer] Graham Wagner, Howard and myself. Usually, I try to have a singular vendor do all of our environmental work, but we had the opportunity to work with a number of teams, which brought a unique richness to the world. There’s a lot of freedom in how desolate the wasteland is, but our specific locations had such richness with what Howard was able to build and find that tying it all together was the biggest challenge for us throughout the season. The other challenge was telling the layered story of a bomb that went off hundreds of years ago along with the story of the people that have survived on the surface.”

For the iconic shot of Los Angeles when the nukes are dropped at the end of the teaser in Episode 1, the first step for Worth and the team at RISE FX was to recreate all the buildings you see in the source plate with enough detail so that they could use them for the interaction and integration with the bomb shockwave. “This modeling step was done in Maya. We also used geographic height data to get a closer representation of the hills in the shot,” Worth details. “We added several CG buildings in between the real buildings to create the retro-futuristic L.A. to match the establishing shot at the beginning of the sequence. We did the same for the monorails and thousands of trees. Those additional buildings and vegetation were all prepared to be used in simulations so that they could be affected by the bomb. As the plate was based on stock footage that did not provide the best quality, we also replaced the complete mountain on the right side of the frame. This helped with the integration of the street, horse and billboard as well. To make the plants as realistic as possible, we created several types of trees and bushes that are common in Los Angeles.

TOP TWO: Production Designer Howard Cummings and the production team built the interior buck of the Vertibird. RISE FX worked on all the Vertibird shots. BOTTOM TWO: Important Looking Pirates was responsible for creating the robot Snip Snip.

“Because every nuke explosion, especially the shockwave, interacts with the city differently, it was not possible to re-use just one simulation,” Worth continues. “So, we had to run a custom sim for every nuke, consisting of the mushroom, the rolling wave around it, the fast-traveling shockwave, and a debris and glass pass emitted from the buildings. Another challenge was the fact that the whole shot runs in slow motion, and there are not many references out there showing this type of explosion in slow motion. But we could use reference from normal-speed explosions to transfer the look to the individual layers of the atomic blast. The whole shot has more than 20 effects and CG layers that were composited in Nuke.”
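That layered approach maps directly onto Nuke’s node graph. As a minimal illustrative sketch, not RISE FX’s actual script, the hypothetical file paths and pass names below stand in for the kind of per-element renders Worth describes, merged back to front over the plate with Nuke’s Python API:

```python
# Minimal sketch: stacking per-element FX renders over a plate in Nuke.
# All file paths and pass names are hypothetical placeholders.
import nuke  # available inside Nuke's script editor or a Nuke terminal session

plate = nuke.nodes.Read(file="plates/la_teaser.####.exr")

# Hypothetical render passes, ordered back to front.
passes = ["mushroom", "rolling_wave", "shockwave", "debris", "glass"]

comp = plate
for name in passes:
    layer = nuke.nodes.Read(file="renders/{}.####.exr".format(name))
    # 'over' composites each premultiplied layer onto the running result.
    comp = nuke.nodes.Merge2(operation="over", inputs=[comp, layer])

nuke.nodes.Write(file="comp/blast_v001.####.exr", inputs=[comp])
```

A production comp adds grades, defocus and retimes between those merges, but a back-to-front chain of ‘over’ operations is the backbone of this kind of multi-layer shot.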

“I’m really happy with how everything has come together in terms of all of the effects for the season, especially the environments and making it feel like a version of Los Angeles. We took a bit of creative license in terms of geography throughout the season. We were not worried about an exact documentary path in terms of a route through Los Angeles. We do have a map tracing an approximate path which helped us ground certain things like the distance toward downtown L.A., and it does all make sense. However, it still had to be beautiful and impactful. One of the big things was not necessarily worrying about how much it looked and felt exactly like Los Angeles, but how much it always needs to look like Fallout. That always felt like our goal, and that is what we kept coming back to more than anything else. It has been a fun thing to have that as our North Star throughout the season and making sure that’s where we try to land each time. This is a story in the Fallout world, but not a story that has already been told or seen in the games, which gave a lot of freedom to Jonah, Geneva and Graham with how to shape the story and tone, which in turn gave us all the freedom in the world with how to make that come to life,” Worth says.

“Working on a show like Fallout is a dream come true for several reasons: working with visionary filmmakers like Jonah, working with an immensely talented, passionate team and, of course, the work itself. We set out to produce stunning, photorealistic visual effects and achieved that goal. The show looks incredible, and I’m certain many viewers won’t be able to tell what’s practical and what’s visual effects. We worked hand in hand with production and created this beautiful, interesting, textured world. Geneva and Graham are just a joy; two very talented writers and showrunners, they also have a clear vision and a great deal of respect for those they’ve entrusted to work with them to create the show. On a personal note, I’ve always wanted to work with a female showrunner and after many years in this industry, this has been my first opportunity to do so. Geneva is brilliant,” concludes Knoll.

TOP TWO: Important Looking Pirates worked on the Cyclops. Played by Chris Parnell, the Cyclops was given two eyebrows to highlight the single eye. (Photo: JoJo Whilden) BOTTOM TWO: A Power Armor Suit and Aaron Moten (Maximus). Legacy FX was responsible for building the power armor suits. (Photo: JoJo Whilden)

BEYOND ENTERTAINMENT: THE CYCLE OF LIFE AND ART THROUGH VISUAL EFFECTS

The dictionary definition of visual effects as the sole domain of film and entertainment does not remotely offer a complete picture of how widely visual effects are used today. Hollywood has popularized and refined the digital technology that began life in laboratories situated in universities, military organizations and corporations such as IBM and Xerox. Therefore, it is not surprising to learn that actors and filmmakers, like Tom Cruise and Christopher Nolan, are not the only beneficiaries of making the impossible possible through virtual means as there are innovative practitioners in a vast range of areas such as education, policing, healthcare, automotive design and space exploration.

TOP: Real-time computer graphics is not so much about the speed of producing the image as about changing how people collaborate. (Image courtesy of CG Pro)

OPPOSITE TOP: The important thing for the Territory Group is developing a design language that is distinct for Audi. (Image courtesy of Territory Group)

OPPOSITE BOTTOM: Virtual reality has streamlined the design process for the interior and exterior of vehicles such as the new Volkswagen Transporter. (Image courtesy of Volkswagen)

“The accuracy obtained with the latest generation of simulators emulates real flying in an amazing way, with a precision not seen before, giving the pilot great confidence in the capabilities of the equipment he is using,” retired pilot Ed Acosta observes. “Also, all kinds of emergency situations can be introduced that prepare a crew to confront them with a lot of confidence should they happen in the real world. This is the essence of flight training in a simulator. And, with no risk, these situations can be taken to the very limit.”

Computer animation has not changed how cars are developed. “But it brought an emotional side that engineering software couldn’t deliver,” notes Albert Kirzinger, Head of Design for Volkswagen Commercial Vehicles. “The designer takes the ‘key sketch’ that defines the design idea best and transforms it into 3D using CAD, like Autodesk Alias or Blender 3D. The latter has become the go-to product after the program’s complete redesign back in 2018. We’re all about 3D after that, both digitally and physically [scale and full-size models].” The reliance on clay models remains. Kirzinger says, “Clay models are irreplaceable because you can only add or remove material with clay. But we build the data digitally for milling clay models. Once milled, they’re manually altered. After the design is locked, it’s reverse engineered with 3D printing for foam or materials like metal.” Virtual reality has streamlined the design process. “VR simplifies prototyping for both interior and exterior designs, slashing costs and speeding up the review of concepts before committing to a physical prototype,” Kirzinger adds. The evolution of digital tools in automotive design is hard to predict. “One trend has been on the block for the past two years: generative AI tools speeding up design processes and making tedious tasks disappear,” Kirzinger notes.

Digital tools have become more sophisticated in allowing complex workflows to be completed in shorter periods of time. “Before the introduction of web mapping tools, web map development required significant time and effort from programmers,” explains Gary Wheeler, Ministry Spokesperson, Ministry of the Environment, Conservation and Parks, Ontario, Canada. “As more tools that allow the creation of maps and dashboards without the need for programming knowledge are introduced, the ability to incorporate these elements into our work becomes available to more staff. Geomatics uses graphics and visualizations to provide data analytics services to manage environmental issues across the ministry. The ministry uses tools including Canva for creating graphics [using predeveloped elements] to add visual interest to digital products and/or summarize complex concepts/data in an easy-to-understand way. We also use a variety of tools for data visualization, including maps. Examples include Power BI and Esri ArcGIS. Graphics and visualization allow complex data and information to be communicated in a way that is easily understood by the end user.”

For students wanting to visit different places around the world or to teach life skills to those with learning disorders, RobotLAB has created a series of VR experiences. “There is VR software being used to help students on the autism spectrum to understand things like, if you’re in a high school hallway, how do you look at someone in the eye and have small talk with them, or how do you engage with the police in a way that they can understand that you need a different sort of interaction than another person,” remarks Amy George, K-12 Account Manager at RobotLAB. “It is a cool and interesting software that comes with an autism VR pack.” A classroom endeavor is VR Expeditions in partnership with Encyclopaedia Britannica. “When Google Expeditions was canceled, we decided to create our own VR platform,” remarks Maria Galvis, Marketing Communications Manager for Education at RobotLAB. “We have three types of questions with our expeditions. What do you see? What did you learn? What if? Teachers also have pointers so they can tell students, ‘Check here. Learn this. Let’s listen to what they are trying to say.’ That’s another thing that helps education change in the sense that we’re not just learning from a book but we’re actually living it. Students are seeing it through their eyes.”

TOP AND BOTTOM: Virtual reality is a key component for enabling NASA to explore various challenges the Artemis crew might experience while on the Moon. (Image courtesy of NASA)

CG Pro School is the first premier Unreal Engine authorized training center in North America and specializes in real-time and virtual production. “We do public training, which tends to be related more to media and entertainment, and private training, such as teaching virtual production to NASA. There were simulation engineers working on the Artemis mission, which is getting ready to go back to the South Pole of the Moon, who were simulating the landing in Unreal Engine so they can understand what it looks like before going there,” explains Edward Dawson-Taylor, Co-Founder and Head of School, CG Pro. “Where advances in technology should be focused is where they’re useful and make something better,” Dawson-Taylor advises. Technology is agnostic as to where it can be applied. “Films have taken all of the industrial use cases and improved the visual quality and user experience. At this point, what we’re unlocking is real-time computer graphics, and that’s not only about the speed of producing the image; it’s the workflow itself. It’s changing the way that people work together and collaborate. But it is predominately visualization and simulation, being able to see something that otherwise would be hard to see.”

TOP: Final Pixel offers the technology to suit the needs of clients ranging from advertising to corporate events. (Image courtesy of Final Pixel)

BOTTOM: Simulations have been put together to understand the potential impact that military probes might have on whale pods and behaviors. (Image courtesy of Canadian Armed Forces)

Catering technology to suit the needs of clients, whether it be in the entertainment industry, advertising or corporate events, is the mandate for global virtual production studio Final Pixel. “We had to figure out how to meet fairly large client needs that had pressing deadlines, so one of the models that we built through the production approach was a popup stage,” remarks Michael McKenna, CEO and Co-Founder of Final Pixel. “We would put up a stage, film on it and bring it back down again. We’ve done that multiple times in Los Angeles, New York and London, and we’re doing a lot more internationally now as well. We are also building a trusted stage network and have developed a way of interfacing, because how you get the best out of virtual production is to have a seamless way through pre-production, the stage and post.” Democratizing the knowledge and expertise associated with the latest virtual production techniques and skills through online and in-person training is the Final Pixel Academy. “We deliver it for four major groups: corporate, professional crew, general public and clients. What we saw from the beginning was the skill shortage, and I was happy to start an academy to share everything that we’ve been learning in order to build a knowledge base for the industry. People can go to an academy and work for other companies as well.”

Commencing in 2009, simulation technology for driving, marksmanship and judgment has been incorporated into the Cadet Training Program at the RCMP Academy. “When the driving simulators were first purchased, it was because we needed to address a gap in our training program,” remarks Dr. Gregory P. Krätzig, Ph.D. (Psyc), Director Research and Strategic Partnerships, Depot Division for the RCMP. “Teaching emergency vehicle operations [driving with emergency equipment activated] was not possible in a civilian environment. What the simulators allow is to expose cadets to civilian traffic and pedestrians in a realistic setting. If mistakes are made in the simulated environment, they are made in a safe setting where those errors are addressed and teaching can be reinforced. This can also be applied to firearms simulation training where multiple cadets can be trained in situations previously limited to one or two cadets at a time.” The marksmanship simulation has shown the most progress. “New developments have resulted in the simulated pistols functioning close to a real pistol while maintaining a safe, risk-free environment. It is expected that the technology will mature in the coming months and will provide opportunities to be adopted in the field for experienced officers. The driving simulation is also advancing and has been used to teach decision-making and situational awareness,” Dr. Krätzig says.

As for the expected proliferation of simulators, part-task trainers and extended reality devices in Air Force schools, contracted training centers and operational training units, the Royal Canadian Air Force states, “With advancements in artificial intelligence, enhanced digital animation, graphics and effects incorporated in these modern training aids, the training realism afforded to students continues to improve their learning while at the same time reducing costs and operational impacts of using real-life aircraft or facilities for training. For example, students at the Canadian Forces School of Aerospace Control in Cornwall, Ontario, make use of control tower simulators with sophisticated graphics allowing them increased depth perception and detail to spot and control aircraft and ground vehicles, thereby improving the fidelity of the training experience in a way not achievable to the same degree with earlier generation graphics/simulators.”

TOP AND MIDDLE: Emperia has worked with Bloomingdale’s to bridge the gap between e-commerce and the in-store experience. (Image courtesy of Emperia) BOTTOM: Stereo imagery uses the anaglyph method to display the height of objects within the frame. (Image courtesy of Ontario Ministry of the Environment, Conservation and Parks)

When it comes to simulating experiments, evaluations, staff training and rehearsals before operational deployment, the Canadian Joint Warfare Centre explains its reasons for doing so. “There are many different types of digital tools, from flight and driving simulators to learning to operate equipment in a combat zone. We can teach CAF members basic familiarization with their environment and an aircraft maintenance task without leaving a classroom. These are immersive or virtual environments that you can interact with. This technology is probably the hardest to develop but will be the most productive and rewarding in the future, especially as we start to include machine learning as part of the system.”

Aiming to make the invisible visible and visualize the impossible is the Science Visual Documentation team at Defence Research and Development Canada, which released the following response about the use of videography, photography, 3D animation, graphic design and illustration. “In support of scientists seeking to explain submarine phenomena, we have created a model of Canadian oceans and seabeds. This modeling features a number of whale pods and studies the potential impact of military probes deployed in our waters on whale behavior. This digital picture becomes indispensable when it is either impossible or too costly to capture and represent these phenomena. We made use of the unparalleled real-time rendering capabilities of Unreal Engine, lit the environment with Lumen and rendered millions of polygons using Nanite, a new technology.”

TOP AND MIDDLE: Osso VR has created a VR training and assessment platform in an effort to standardize medical procedures and communicate the latest information about technological advances to healthcare professionals. (Images courtesy of Osso VR)

BOTTOM: Talespin makes use of AI-enabled CG characters in virtual workplace scenarios that are designed to develop the skills of employees. (Image courtesy of Talespin)

TOP TO BOTTOM: Simulations allow RCMP cadets to experience civilian traffic and pedestrians in a safe environment. (Image courtesy of the Royal Canadian Mounted Police)

“VR Expeditions 2.0” is the latest version RobotLAB has released, where teachers and students can tour the world together without leaving the classroom. (Image courtesy of RobotLAB)

Among the world field trips on “VR Expeditions 2.0” is Patagonia. (Image courtesy of RobotLAB)

After a 50-year absence, NASA plans to return to the Moon in preparation for human missions to Mars via the Artemis Program, which is responsible for the engineering, planning and training. “Digital tools [virtual reality, ray tracing, virtual production] have enabled us to explore the various challenges that a crew may encounter while on a mission in a more immersive environment than was possible before,” states Lee Bingham, Aerospace Engineer, Simulation and Graphics Branch of the Software, Robotics, and Simulation Division, NASA. “This allows us to create realistic simulations and tools that are used in assessments and studies that can influence requirements in engineering and mission planning. For example, the areas around the Lunar South Pole that the Artemis missions plan to explore have unique lighting challenges that were not experienced in the Apollo missions. By creating a realistic and immersive simulation with visualization products, we can get a first-order approximation of how those lighting conditions impact design and mission operations. Additionally, we can perform end-to-end human-in-the-loop simulated missions with crew to determine how all the elements and systems interact with each other. Game engines are becoming more suitable for engineering and simulation development by supporting real-world physics models, integrating features that support the engineering design process and interoperability standards with other analytical and simulation tools.”
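To make the polar lighting problem concrete: the Moon’s axis is tilted only about 1.5 degrees, so near the poles the Sun never climbs far above the horizon and shadows stretch enormously. A back-of-the-envelope illustration (the strut height is a made-up example, not a NASA figure):

```python
import math

# The Moon's roughly 1.5-degree axial tilt caps solar elevation near the
# poles, producing the grazing light the Artemis simulations explore.
max_sun_elevation_deg = 1.5   # approximate lunar axial tilt to the ecliptic
strut_height_m = 2.0          # hypothetical lander strut, for illustration only

shadow_m = strut_height_m / math.tan(math.radians(max_sun_elevation_deg))
print("shadow length: {:.0f} m".format(shadow_m))  # about 76 m, nearly 40x the
# object's height, versus roughly 2x under a 30-degree sun at temperate latitudes
```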

Not everything is environmentally oriented. Talespin makes use of AI-enabled CG characters in virtual workplace scenarios that are designed to develop the skills of employees in 15 different industries, including healthcare, insurance, hospitality and financial services. “We have made it drag-and-drop simple for an enterprise user who has no experience with game engines, has never had a 3D file on their computer, doesn’t know the first thing about optimization or IK rigs or animation,” states Kyle Jackson, CEO and Co-Founder of Talespin. “We built a toolkit for them where they can come in and focus on the simulation that they’re trying to achieve because of a business outcome. In the healthcare space, there can be difficult conversations between doctors and patients or doctors and staff.” In most use cases, it is beneficial not to have photorealistic CG characters. “It’s the same way with filmmaking, where you’ve done your job well if, by the second act, people are not thinking about all the false rules that are making your world unbelievable,” Jackson comments. “In learning, we have the same kind of barriers, but they’re more about getting people to unblock their biases and distractions or focus. If you cross the uncanny valley, people still know it’s software, and at this point in time, we’re more fascinated with the idea that we’re talking to a virtual human rather than focusing on the goals of what the conversation is about. The focus is in the wrong place, so it actually helps to not be photoreal because then people focus on the engagement rather than the science of it, or the fascination with the technology.”

Online shopping has taken consumers beyond brick-and-mortar stores. Emperia, aiming to bridge the gap between e-commerce and the in-store experience, provides a virtual store platform for retail and fashion enterprises. “The construction of the environment is important, and how you position the product in it to make sure the product drives the focus,” remarks Olga Dogadkina, Co-Founder & CEO at Emperia. “Even though you’re [virtually] on the Moon, you can still see what you’re shopping for clearly. The layout of virtual stores versus physical ones is quite different. The first thing we found out early on is that no one wants to see the replica of a physical store in virtual space, because you can simply go to the store and explore it yourself, whereas having a digital world brings a whole other aspect of interaction into how you can shop. Rules of merchandising are a bit different in the virtual world. Placing best-selling products closer to the start of the experience helps users to ground themselves and explore further, while in a physical store, the best-sellers always go in the back; otherwise, you create a crowd at the entrance and no one goes through. The other part of this is that we have launched a product for 3D creators, focused on enabling others to create these environments targeted towards e-commerce.”

In an effort to standardize medical procedures and communicate the latest information about technological advances to healthcare professionals, Osso VR has created a VR training and assessment platform. “There is a lot that we’re pulling from gaming and visual effects, but also from the simulation work done by aviation and the military,” remarks Dr. Justin Barad, CEO and Founder of Osso VR. “We’re not at the final period of the so-called augmented surgeon. There are a lot of different components that will come into play, but certainly, the pieces are starting to come into focus. A lot of things happen outside of the operating room that are critical. I saw four problems firsthand in my surgical residency. First, there are many procedures that are expanding all of the time, and it’s gotten beyond the ability of any human to know how to do all of these different procedures. Second, most of these procedures are more complicated than what came before, so learning them takes a lot longer and many more repetitions, and those repetitions are on people. Third, we don’t have a way to assess the ability to perform these procedures. Fourth, surgery is a highly coordinated team activity, whereas in the past it was just the surgeon who did it. Not everybody has the ability to train together. Training is such an important part of this whole ecosystem because your ability to perform a procedure has a major impact on patient outcomes. Even more important than training is deciding to do surgery. Generative AI has incredible potential to enhance our ability to figure out when we do and don’t want to do surgery.”

Inhabiting entertainment and non-entertainment industries as a graphics design studio is the Territory Group, which has clients ranging from Marvel Studios to Audi. The brief is similar whether the client is a Hollywood studio or an automotive manufacturer. “In a movie, I need that visual to read in three seconds, and our graphic is probably not the most important thing in the shot,” remarks David Sheldon-Hicks, Founder of the Territory Group. “It has to be intentional and direct, but also needs a visual sophistication that still feels cinematic to an entertainment-level production value. The way I translate that in the real world is that people still have short attention spans; it still needs to translate, be immediate, transmit to a mass audience and have brand sophistication. When we’re doing those digital interfaces for Audi cars or General Motors or Porsche, there has to be a point of difference in the interface; otherwise, how do you know why you choose one brand over another one? You’re always communicating functionality and purpose, both in terms of storytelling and utility. And you have a second layer of that emotional resonance, whether that is at a brand or storytelling level.” A co-dependency exists between the two worlds. Sheldon-Hicks concludes, “Life imitates art and art imitates life. It’s this beautiful cycle.”

TOP TO BOTTOM: Territory Group worked with Medivis to create 3D medical applications. (Image courtesy of Territory Group)

An example of work imagery created by Edward Dawson-Taylor, Co-Founder and Head of School, CG Pro. (Image courtesy of CG Pro)

DesignCar is a mobile game created by the Territory Group that allows players to customize officially-licensed vehicles in real-time 3D. (Image courtesy of Territory Group)

The Canadian Joint Warfare Centre simulates experiments, evaluations, staff training and rehearsals prior to operational deployment. (Image courtesy of Canadian Armed Forces)

VFX AND SUSTAINABILITY: REDUCING CARBON FOOTPRINT, ITS IMPORTANCE AND MORE

The demand for visual effects has soared considerably in the last few years. While the VFX industry no doubt contributes to the film industry’s large carbon footprint, little data exists on VFX’s exact share of emissions compared with other areas of the industry. A 2016 BAFTA report, however, found that the average animation production produced a staggering 5.5 tons of CO2 emissions per hour, with the majority of these emissions caused by production offices and post-production. Dupe VFX Chief Operations Officer Robin Chowdhury notes that it’s nearly impossible to find an exact statistic on what the industry average is, and he isn’t aware of any other VFX company that has at least two years of data to compare with.

TOP: One of Dupe VFX’s most recognizable projects is Peaky Blinders. Jonathan Harris, CEO and VFX Supervisor at Dupe, set up the company with a focus on sustainable practice, environmentally and ethically, for staff well-being and diversity. (Image courtesy of BBC Studios and Tiger Aspect Productions)

OPPOSITE TOP TO BOTTOM: Hello Tomorrow! was one of Brainstorm Digital’s 2023 projects. The company has gone fully remote since 2020, eliminating the need for a server room and the air conditioning to cool it, resulting in a massive reduction in energy usage. (Image courtesy of Brainstorm Digital)

Framestore contributed to Rebel Moon – Part One: A Child of Fire. The multinational is carbon neutral in the U.K. and India and aims to be carbon neutral in its U.S. and Canada offices this year. (Photo: Clay Enos. Courtesy of Netflix)

Outpost VFX worked on Foundation Season 2. All of Outpost’s 2023 projects were delivered using AWS, having become entirely cloud-based through AWS in 2022. (Image courtesy of Outpost VFX and Apple TV+)

Recent studies have indicated that Hollywood blockbuster films with budgets in the realm of $70 million produce on average 2,840 tons of CO2 per production, a devastating figure roughly equivalent to the amount absorbed by 3,700 acres of forest in a year. Around 51% of those emissions were related to transport, with 30% of that from air travel and 70% from land. Energy consumption makes up the rest of the figures, with 34% of the average blockbuster’s CO2 emissions going to mains electricity and gas and around 15% to diesel generators. In many cases, carbon emissions and resource consumption aren’t reported at all, with many productions failing to acknowledge the damaging impact. Reports have shown that emissions from London’s screen production industry are also staggeringly high, at approximately 125,000 tons of carbon per year, with 15% directly attributable to post-production.
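Applying those splits to the 2,840-ton total puts the categories in absolute terms. The tonnages below are derived arithmetic from the figures quoted above, not separately reported data:

```python
# Back-of-the-envelope breakdown of the blockbuster figures quoted above.
# Only the 2,840-ton total and the percentage splits come from the text;
# the per-category tonnages are derived arithmetic, not reported data.
total_t = 2840

transport_t = 0.51 * total_t   # ~1,448 t for transport overall
air_t = 0.30 * transport_t     # ~434 t of that from air travel
land_t = 0.70 * transport_t    # ~1,014 t from travel by land
mains_t = 0.34 * total_t       # ~966 t for mains electricity and gas
diesel_t = 0.15 * total_t      # ~426 t for diesel generators

print("transport {:.0f} t (air {:.0f}, land {:.0f})".format(transport_t, air_t, land_t))
print("mains {:.0f} t, diesel {:.0f} t".format(mains_t, diesel_t))
```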

Renowned for its commitment to tackling the industry’s carbon footprint, Dupe VFX was founded in 2018 by Jonathan Harris, who also serves as CEO and VFX Supervisor. “We have grown organically into a mid-sized company offering high-quality visual effects services for episodic TV and film, where we do a lot of ‘invisible’ VFX. I would say that we punch above our weight in previs, on-set supervision and complex, technical comp,” Harris says. Dupe’s notable projects include leading VFX on Sex Education and Gangs of London, which established them as a serious studio. “More recently, our work on the political thriller Liaison showed our 3D hard asset capabilities. And we’re looking forward to the release of our first feature film Havoc for Netflix and the action-packed TV series Renegade Nell for Disney+.”

As well as having a passion for visual effects, Harris wanted to set up a company with a focus on sustainable practice, both environmentally and ethically, for staff well-being and diversity. “When we set up Dupe, we discovered B Corp, an organization that promotes and supports business as a force for good, for the environment, people and communities. That felt right to me, and we pursued certification, spearheaded by our Chief Operations Officer Robin Chowdhury. Our mission statement is ‘excellence in visual effects while inspiring change,’ so this topic is as dear to our hearts as visual effects,” Harris explains. “We have been tracking and reporting our carbon usage since 2020 and became carbon neutral in 2021. We’re currently carbon neutral by offsetting, but the next stage is to reduce and aim for net zero. The pandemic and global strikes have slowed us down, diverting our focus to keeping the business alive, but the foundation and intention to move forward is very much part of our business model.”

“One of the obstacles that companies in the VFX industry face in developing sustainable practice is finding ‘how to’ information,” Chowdhury adds. “And it’s complex, including things like taking waste into consideration, gas, AC, even working-from-home emissions. Some info is held behind paywalls, and there are consultancy firms that specialize in this area. We talked to five or six companies, but the consultancy fees were too expensive, so we found our own way through the process, and that journey took us a few years. We have had a third party audit two years of our data and look into our numbers and methods. We do want to go deeper into the visual effects process, and a lot of that is around hardware, power usage and the processing power that we have, especially rendering. Internally, we raise awareness of our B Corp commitment through onboarding, environmental policy, data transparency and many other ways big and small. We invite organizations like Greenpeace in to talk about impact, encourage conversation and collective decisions about impact, and support community volunteering, 1% for the Planet and Access VFX. Externally, we share our tracking and recording data, offer clients carbon emission reports for each project we work on, and do what we can to inspire and support others to do the same.”

Brainstorm is a leading visual effects company based in New York. Since 2020, Brainstorm has gone fully remote, which has eliminated the need for a server room and the air conditioning to keep it cool. “The server room would normally reach 100 degrees, which meant that our A/C unit was constantly running day and night. This became a massive reduction in energy usage,” Brainstorm Co-Founder and VFX Supervisor Richard Friedlander says. “The choice to continue to work remotely is one way. We also look for efficiencies in computing power. Our cloud server and virtual machines are upgraded periodically. The newer technology is more efficient, thus saving energy. Also, as I mentioned, by working remotely, the employees and owners of the company who needed to drive vehicles to get to work no longer need to.” Some of Brainstorm’s recent projects have included No Hard Feelings, Breathe and Love Me. There have been a number of green productions that Brainstorm has also worked on. “Some that come to mind are The Dilemma, The Lost City of Z and Armageddon Time. By ‘green’ production, that means that the film crew was required to carry their own refillable water bottles and coffee mugs, thus significantly reducing plastic waste.”

TOP: Dupe VFX worked on Renegade Nell for Disney+. The company is well-known for its commitment to tackling carbon footprint within the industry. It has been tracking and reporting its carbon usage since 2020 and became carbon neutral in 2021. (Image courtesy of Disney+)

BOTTOM: Framestore has worked on a variety of features recently, including Barbie. Framestore has been introducing methods and policies to combat carbon emissions. (Image courtesy of Warner Bros. Pictures)

Framestore is an Academy Award-winning visual effects company based in London. Some of their recent projects include Barbie, The Little Mermaid, Guardians of the Galaxy Vol. 3 and Rebel Moon – Part One: A Child of Fire. Framestore has also been introducing methods and policies to combat carbon emissions. “Last year, we completed our baseline measurement of our carbon emissions in London, and we have completed the same for our other offices globally this year,” says Nicola Miller, Global Marketing Director, Film at Framestore. “We are carbon neutral in the U.K. and India [Scope 1 and 2], and we will be offsetting our Scope 1 and 2 emissions in the U.S. and Canada to become carbon neutral in those markets next year. We are currently evaluating a number of globally recognized organizations that might help us to address our Scope 3 emissions in all countries. We continue to work on reducing our emissions by working with our landlords to encourage take-up of green energy suppliers and, where feasible, to install solar panels on roofs. We recycle 100% of our waste in the U.K. and India, and we are again working with landlords to see how recycling can be improved. We are in the process of appointing a Waste Champion in each site to encourage more and better recycling, both in the office and at home.”

With more productions relying on virtual production technologies, the need for extensive post-production has been significantly reduced. Virtual production has also been beneficial in reducing carbon emissions, notably by eliminating the need to travel to remote filming locations. Another positive aspect is that background characters can be more easily created within the game engine, allowing productions to cut back on the need for numerous background extras. Physical props and sets have previously had a damaging impact on the environment, and the advancement of new technologies means they can be digitally substituted.

Cloud computing and virtual computers have also contributed to reducing the need for extensive post-production and rendering work. “Of course, ‘virtual’ computers are stationed in large banks of computer processors, but there are efficiencies built into that,” Friedlander details. “For example, we now rely on these machines on an ‘as-needed’ basis as opposed to computers spinning all day long whether they’re used or not. Real-time rendering also saves energy; real-time renderers such as Unreal Engine simply take less time and energy. What might have taken hours to render can now take minutes, thus saving the power needed to run those computers. AI technology can also contribute to saving energy by helping visual effects artists be more efficient.”
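The ‘as-needed’ point is easy to quantify. Here is a rough sketch with an assumed per-node power draw and job window, not Brainstorm’s actual numbers:

```python
# Rough sketch of the "as-needed" saving Friedlander describes.
# The 0.6 kW node draw and six-hour job window are assumptions.
node_kw = 0.6                  # hypothetical power draw of one render node

always_on_kwh = node_kw * 24   # 14.4 kWh/day if the node spins regardless of load
on_demand_kwh = node_kw * 6    # 3.6 kWh/day if it runs only while jobs are active

saved_kwh = always_on_kwh - on_demand_kwh
print("saved per node per day: {:.1f} kWh".format(saved_kwh))  # 10.8 kWh,
# before counting the cooling overhead that idle hardware also incurs
```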

Outpost VFX is a global visual effects company founded in 2013. All of Outpost’s 2023 projects were delivered using AWS, having become entirely cloud-based through AWS in 2022. They have worked on numerous projects, including The Hunger Games: The Ballad of Songbirds & Snakes, Napoleon, Wonka, The Creator, Foundation Season 2, The Wheel of Time and Monarch: Legacy of Monsters. “This has given us access to more efficient servers and data centers powered almost entirely by renewable energy. This has helped us to significantly reduce our carbon footprint,” Outpost Marketing Lead Eloise Lacey says. “It was important that we went with a provider that was committed to continually improving their sustainability year on year, too, with AWS committing to using 100% renewable energy in 2025. We are also in the process of calculating our global carbon footprint so we can identify the areas where we can improve and continue to reduce our impact on the planet.”

Last year, Outpost started on a long-term project to roll out several initiatives to make a positive impact on the environment around them. “In the U.K., we’ve partnered with GreenTheUK on a number of ongoing environmental initiatives, including protecting kelp forests along the Dorset coast, an act that is imperative for storing carbon, improving water quality and helping the local sea life thrive; protecting native oysters and their habitat in the Solent, which sequesters carbon and provides a home for hundreds of other species; planting wildflower meadows to conserve native pollinators and a wide range of other wildlife, and others. Looking forward, we have plans to adopt similar initiatives in our other sites to have a positive impact on both our teams’ immediate environment and to fight the effects of climate change on a wider scale. Outside of this, in the rare case that our team has to use air travel for work, we offset our carbon emissions through an environmental travel agency to limit our impact when flying is unavoidable,” Lacey says.

A number of major companies have also introduced policies and targets to help tackle the ongoing climate crisis. Netflix, whose carbon footprint in 2021 was roughly 1.5 million metric tons, has recently set a target to cut its emissions in half by 2030. Amazon Studios, HBO Green and Sony Pictures are three major studios also aiming to prioritize sustainable practices on productions by limiting waste, transitioning to clean energy batteries, planting trees and through other methods. While little data exists surrounding the exact figures of the carbon footprint in VFX, there are signs that companies are striving towards going green and adopting more sustainable methods. Furthermore, organizations such as the Environmental Media Association and Green Screen promote green production and were formed to help reduce the carbon emissions and environmental impacts of the film, TV and advertising industry.

Harris acknowledges that sustainability and being green is more topical than it was three years ago. “It’s a big, complex topic, ultimately about the sustainability of the industry, how we work, the pricing modeling, how you treat your staff and the impact you have on your community. We were the first visual effects company to get a B Corp certification, and with the work we’ve done to develop methods to track and report, I want Dupe to be leading on this topic. We find a lot of interest from clients and have shared our approach and methods with Amazon AWS and Netflix and this year spoke to our peers at FMX. But there is more that we can do, and we’re working on ambitious plans to support other VFX companies who want to adopt more sustainable, carbon-neutral practices,” Harris concludes.

TOP TO BOTTOM: Outpost VFX contributed to Foundation Season 2. Becoming cloud-based has given Outpost access to more efficient servers and data centers powered almost entirely by renewable energy, helping to significantly reduce its carbon footprint. (Image courtesy of Outpost VFX and Apple TV+)

Outpost VFX worked on The Wheel of Time Season 2. Outpost is in the process of calculating its global carbon footprint to identify areas for improvement. (Image courtesy of Outpost VFX and Amazon Prime Studios)

Dupe VFX joined forces with major vendors to help deliver the opening sequence to the Netflix show The School for Good and Evil. (Image courtesy of Dupe VFX and Netflix)

How to Shoot and Edit Animation Using Live-Action Virtual Production

Abstracted from The VES Handbook of Virtual Production for this publication

Introduction: The Creative Story Process

At its heart, this is a conversation about the creative story process itself. The use of real-time tools in animation enables the convergence – or intersection – of animation and live-action techniques. Many trailblazing films have led the way, from Beowulf (2007) to The Adventures of Tintin (2011), all before the advent of real-time technology as it exists now. Over the last decade, there have been significant advances in game engines’ readiness for real-time (live) visualization on set and in camera and, in some cases, “final pixel” applications. Stylized hand-key animation can also be produced with this approach.

As real-time tools improve, their artistic and workflow benefits become impossible to ignore. The development of Virtual Production for animation, or “real-time animation,” can be viewed as a new and complementary toolset for the animation professional. The framework of understanding also needs to integrate the impact of cinema’s live-action heritage suddenly made available to the animation professional. This live-action mindset will not be right for all animated projects, and great content will continue to be made in the traditional way. Whatever the approach, the benefits of real-time are compelling enough for a deep investigation. Ultimately, the goal of animation is telling a good story with nice pictures. Real-time animation offers a new road to that end, with some surprising benefits.

As the name implies, real-time animation accelerates the production process, which can lead to creative and/or cost savings. This acceleration does not equally impact every phase of production, but it does affect most and ripples through the entire production. There are three primary means by which this acceleration occurs:

• Immediacy of Visualization

• Increased Creative Iterations

• Making Creative Decisions in Context

The first (and the catalyst for the rest) is the immediacy of visualizing and/or rendering using a real-time renderer that renders frames in milliseconds, allowing artists to work without waiting, saving time and money. Final renders for an entire 22-minute episode of Super Giant Robot Brothers! (Reel FX/Netflix 2022) finished overnight on five workstations, as opposed to the traditional 500-node render farm. It is worth noting, however, that final frames can still be rendered with a traditional renderer. While real-time renderers cannot yet achieve the quality of an offline renderer, they inch closer every day. Visualizing in real-time and translating those decisions to an offline renderer can become quite complex, and later departments lose the speed advantage once real-time rendering is left behind. Take all factors into consideration when choosing the final renderer.
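For a sense of scale on that overnight-render claim, the sketch below assumes a 24 fps frame rate and a 12-hour window; only the 22-minute runtime and the five workstations come from the excerpt:

```python
# Rough framing of the overnight-render claim for a 22-minute episode.
# The 24 fps rate and 12-hour window are assumptions for illustration.
frames = 22 * 60 * 24              # 31,680 frames in the episode
machine_seconds = 5 * 12 * 3600    # five workstations running for twelve hours

budget = machine_seconds / frames
print("about {:.1f} machine-seconds per final frame".format(budget))  # ~6.8 s:
# slower than interactive preview, since final-quality settings cost more, yet
# far below typical offline render-farm budgets of minutes or hours per frame
```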

The second acceleration is the reduction of the time between creative iterations. The underlying assumption is that more iteration will lead to a better story/product. It can be said that this rapid visual ideation provides a creative team with “more bites at the apple.”

The third acceleration is improved context when making collaborative creative decisions. Teams can collaborate live and parallelize tasks with a real-time workflow. This saves time. Because renders are instantaneous, live “real-time” reviews are possible. Creative work that was previously accomplished asynchronously can now be realized simultaneously, and artists can react to one another. For example, camera work can interplay with performance during motion capture, or lighting and camera can interplay during the virtual camera phase. Decisions are made with more context up front, leading to fewer changes and surprises downstream.

Members receive a 20% discount with code: HVP23

Order yours today!

https://bit.ly/VES_VPHandbook

Figure 4.2 Super Giant Robot Brothers! (2022). (Image courtesy of Reel FX Animation)

Oregon Rising: Celebrating a Milestone First Anniversary

The Visual Effects Society’s worldwide presence gets stronger every year, and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together. The VES Oregon Section was established as our 15th VES Section in 2023 and has quickly grown into a vibrant community of artists and innovators in its inaugural year. VES Oregon boasts a dynamic membership that reflects the talented makeup of the area’s visual effects community, with about 85 members primarily working in film and television, and drawing from studios and VFX houses including LAIKA, Shadow Machine, Refuge VFX, Hinge, Crafty Apes and ILM.

The regional visual effects industry is experiencing a big growth spurt, creating new opportunities to work on film, episodic and commercial projects. “We are seeing a massive influx of VFX artists, especially from California, migrating up here,” said Allan McKay, VES Oregon Co-Chair and Visual Effects Supervisor at Refuge VFX. “That inflow is vastly expanding the skills and abilities that Supervisors can tap into, which is vital as we have more demanding work and an escalated need for more diversified and senior talent.”

VES Section leadership is very proud of what they accomplished in their first year, rich with events and a smart community-building strategy. “We are so fortunate to have a great group of volunteer leaders on our Section Board of Managers,” said Eric Wachtman, VES Oregon Co-Chair and CG Supervisor at LAIKA Studios. “Right out of the gate in Year One, we’ve held a BBQ, holiday party, screenings, recruitment events, and our whole team has carried the weight. Our members are invested, and we are getting good turnout from the community. Everyone has stepped up, and it’s a stellar collaborative team.”

“I’m so blown away watching everyone come together and work with such passion and focus,” said McKay. “2023 was such a rough year, so having a sense of community, especially here, where we’re a bit off the grid, has been vital as a place for us to come together as a community.”

VES Oregon Section events in 2023 included a robust screening program, hosted in Portland venues including McMenamins Bagdad Theater, McMenamins Kennedy School and the Oregon Museum of Science and Industry (OMSI) theater; a family-friendly Summer BBQ at Camp Rivendale in Beaverton; and a festive first annual Holiday Party, hosted in concert with Autodesk. And 2024 kicked off with a VES Awards nominations event, a galvanizing gathering for the new Section.

McKay continued, “Reviewing all that we achieved as a new Section, we want to share our special gratitude with some of our Board of Managers colleagues: Thank you to Maria Fery for your hard work and creativity in planning our parties and events; to Holly Webster for taking on our screenings series with such dedication; and to Michael Cordova for planning our first VES Awards nominations event with such enthusiasm.”

In doing the work to establish the Section locally, Wachtman, McKay and others were a strong presence at industry events around the region.

TOP: VES Oregon members judge submissions for the 22nd Annual VES Awards. BOTTOM TWO: VES Oregon members and guests gather in Portland for pub nights.

VES Elects 2024 Board of Directors Officers

TOP TO BOTTOM:

Kim Davidson

Susan O’Neal

Janet Muswell Hamilton, VES

Rita Cahill

Jeffrey A. Okun, VES

The 2024 VES Board of Directors Officers, who comprise the VES Board Executive Committee, were elected at the January 2024 Board meeting. The Officers include Kim Davidson, who was elected Board Chair, and is the first Board member from outside the United States to hold this role since the Society’s inception.

“It is my privilege to serve as Chair of our worldwide community of visual effects artists and innovators,” said VES Chair Kim Davidson. “Since I joined the Society 18 years ago, I have seen the VES grow from a California-based organization to a global society with 16 regional Sections and members in 48 countries. As the first Chair elected from outside the U.S., I am representative of our thriving globalization, and I look forward to further championing our expansion. The leadership on our Board bring enormous commitment, expertise and enthusiasm to their volunteer service, and I’m excited about what we can achieve.”

“The Society is fortunate to have exceptional leadership on our Executive Committee,” said Nancy Ward, VES Executive Director. “This group of impassioned and venerated professionals has a strong vision for the Society’s future, especially amidst this time of dynamic change. We appreciate their commitment to serve the VES, and I look forward to working with the Committee to evolve the Society.”

The 2024 officers of the VES Board of Directors are:

• Chair: Kim Davidson

• 1st Vice Chair: Susan O’Neal

• 2nd Vice Chair: Janet Muswell Hamilton, VES

• Secretary: Rita Cahill

• Treasurer: Jeffrey A. Okun, VES

Kim Davidson is the President and CEO of SideFX®, a company he co-founded in 1987. SideFX is a world-leading innovator and the maker of the advanced 3D animation and special effects software Houdini®. He has received three Scientific and Technical Awards from the Academy of Motion Picture Arts and Sciences. Davidson was the first Chair of the VES Toronto Section and has served on the Toronto Section board for eight years. He has served on the global VES Board of Directors for seven years and on the Membership Committee for seven years, as well as on a number of VES ad hoc committees.

Susan O’Neal, a VES Founders Award recipient, has served on the global Board of Directors as Treasurer and 2nd Vice Chair. For many years, she served as the Chair for the legacy global Education Committee and currently co-chairs the Membership Committee, playing an instrumental role in the Society’s growth. She is currently a recruiter for BLT Recruiting, Inc., and has worked as an Operations Manager at The Mill, Operations Director at Escape Studios in Los Angeles and as an Account Manager at SideFX.

Janet Muswell Hamilton, VES, is the Senior Vice President of Visual Effects for HBO. Janet is a VES Fellow and member of the Visual Effects Society global Board of Directors and Education Committee. With a career spanning several decades, she established her reputation as a VFX Producer and Supervisor for cutting-edge visual effects on a wide range of groundbreaking television series and theatrical features, as well as animation, IMAX, commercials and Stereoscopic Special Venue projects.

Rita Cahill is an international business and marketing/PR consultant and has worked with a number of U.S., Canadian, U.K., EU and Chinese companies on visual effects and animation projects. Previously, Cahill was the Vice President of Marketing for Cinesite and a founding Board member of the Mill Valley Film Festival/California Film Institute. A VES Lifetime Member and Founders Award recipient, she chaired or co-chaired the VES Summit for eight years; this is Cahill’s ninth term as Secretary.

Jeffrey A. Okun, VES, is known for creating ‘organic’ and invisible effects, as well as spectacular ‘tent-pole’ visual effects that blend seamlessly into the storytelling aspect of the project. Okun is a VES Fellow and recipient of the VES Founders Award. He co-edited The VES Handbook of Visual Effects, an award-winning reference book on visual effects techniques, and the acclaimed The VES Handbook of Virtual Production. He founded the VES Awards, now in their 22nd year, served as VES Chair for seven years and as Chair of the Los Angeles Section of the VES for two years.


Visual Effects Society Announces 16th and Newest Section in Texas

The Visual Effects Society is proud to announce the establishment of its 16th and newest regional Section in the State of Texas, authorized by the VES Board of Directors in January.

“We are thrilled to welcome our newest Visual Effects Society Section in the State of Texas,” said Nancy Ward, VES Executive Director. “The Society’s presence gets stronger every year – due in large part to our local leadership who gather and connect our members around the world. Our determination to reach all corners of the globe and all disciplines across the VFX spectrum has yielded us a rich, talented membership, and that commitment to diversity and inclusion will continue to be a driving force of the organization.”

“VES members in Texas are excited to be recognized as an official Section of the VES,” said Colin Campbell, VES Board member who was instrumental in forming the new Section. “Texas has a long history of filmmaking distinction. Continuing that legacy, our membership comprises professionals from some of the biggest names in the industry, including ILM, Sony Pictures Imageworks, Digital Domain and Blizzard/Activision. They represent a diverse cross section of moving image artistry: feature film and television visual effects, game development, cinematic production, animation, architectural visualization and educators. It’s an exciting time to be a Texan in the VFX-related moving image industry!”

The Austin (top) and Dallas (bottom) VES recruitment pub nights.

The Wizard of Matte

It would be difficult to single out one “most valuable player” for the classic, groundbreaking 1939 movie The Wizard of Oz. After all, cinema is a collaborative medium. Yet Scotland-born George Gibson is a major contender for that designation. He is one of filmdom’s major unsung heroes (see the article in this issue on contemporary unsung VFX heroes, who now number in the thousands on VFX-infused projects). Gibson created all the matte paintings for this iconic MGM film while also creating its backdrops and sets. Gibson was the head of MGM’s scenic design department for 30 years. He also created matte paintings for such films as An American in Paris and Brigadoon. Born in 1904 in Edinburgh, he attended the Edinburgh College of Art and the Glasgow School of Art. He migrated to America, and his talent allowed him to become head of the scene painting workshop at MGM. One of his innovations there was to paint backdrops on movable frames rather than on fixed scaffolds. There were no sophisticated VFX in those days, so he supervised three months of secret work on the backgrounds and hand-painted scenery for Oz. He claimed the Emerald City was born from his imagination. He lived to 96 and painted every day until his death. Back then, studios did not want the public to know that the actors were standing in front of paintings. Like many unsung heroes in movie history, he was not credited for his work.

Image courtesy of MGM and Silver Screen Collection.