VFX Voice Winter 2024

Page 1

VFXVOICE.COM WINTER 2024

THE RICHES OF POOR THINGS

STATE OF THE INDUSTRY • NAPOLEON • BLUE EYE SAMURAI UK VFX • PROFILES: GAIA BUSSOLATI & ROBERT STROMBERG







[ EXECUTIVE NOTE ]

Welcome to the Winter issue of VFX Voice! As we enter 2024, we wish you a New Year filled with good health and happiness. Thank you for being a part of the global VFX Voice community. We’re proud to be the definitive authority on all things VFX, and we value the opportunity to stay connected through stories that celebrate hard-working, visionary visual effects talent across the world.

In this issue, our cover story goes inside director Yorgos Lanthimos’ black comedy fantasy film Poor Things. We preview the films anticipated to compete for VFX Oscar gold, go behind the scenes of filmmaker Sir Ridley Scott’s latest epic Napoleon and the animated TV/streaming series Blue Eye Samurai, and highlight the latest wave of short films. We showcase a compelling State of the VFX Industry, offer an outlook on powerful new hardware and software for VFX artists, and share a variety of perspectives on AI & animation. It’s London Calling... with a special report on VFX in the U.K. and a spotlight on our VES London Section. VFX Supervisor Gaia Bussolati and creative visionary Robert Stromberg take center stage in this issue’s profiles, and we salute the legendary Stan Winston and his lasting impact, and more.

Dive in and meet the innovators and risk-takers who push the boundaries of what’s possible and advance the field of visual effects. Cheers!

Lisa Cooke, Chair, VES Board of Directors

Nancy Ward, VES Executive Director

P.S. You can continue to catch exclusive stories between issues, available only at VFXVoice.com. You can also get VFX Voice and VES updates by following us on X (Twitter) at @VFXSociety.





[ CONTENTS ]

FEATURES
8 FILM: THE VFX OSCAR – Previewing top prospects for this year’s prize.


DEPARTMENTS
2 EXECUTIVE NOTE
104 VES NEWS

18 FILM: SHORT FILMS – Testing new tech, giving aspiring filmmakers a chance.
28 FILM: NAPOLEON – Ridley Scott’s epic paints a broad canvas of war and history.
36 ANIMATION: AI & ANIMATION – Animators’ attitude about AI is optimistic, not apocalyptic.
42 PROFILE: GAIA BUSSOLATI – VFX Supervisor bridges Italian cinema, Hollywood blockbusters.
48 COVER: POOR THINGS – A fresh, flowing surrealism inspires Yorgos Lanthimos’ vision.

107 ON THE RISE
108 VES HANDBOOK
110 VES SECTION SPOTLIGHT – LONDON
112 FINAL FRAME – REANIMATION

ON THE COVER: Emma Stone as Bella experiences rebirth, self-discovery and liberation in Poor Things. (Image courtesy of 20th Century Studios)

56 INDUSTRY: STATE OF THE VFX INDUSTRY – Despite production lulls, innovations continue to forward the craft.
64 INDUSTRY: HARDWARE/SOFTWARE OUTLOOK – Advanced tech provides powerful new tools for VFX artists.
74 SPECIAL FOCUS: VFX IN THE U.K. – Demand drives growth as industry seeks stronger tax relief.
84 PROFILE: ROBERT STROMBERG – Evolution of a multi-talented, Oscar-winning creative visionary.
90 TV/STREAMING: BLUE EYE SAMURAI – Blue Spirit’s bold animation inspired by Japanese historical art.
98 LEGEND: STAN WINSTON – The late effects master’s impact still resonates in the industry.





WINTER 2024 • VOL. 8, NO. 1

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER Jim McCullaugh publisher@vfxvoice.com

VISUAL EFFECTS SOCIETY
Nancy Ward, Executive Director

VES BOARD OF DIRECTORS

EDITOR Ed Ochs editor@vfxvoice.com
CREATIVE Alpanian Design Group alan@alpanian.com
ADVERTISING Arlene Hansen Arlene-VFX@outlook.com
SUPERVISOR Nancy Ward
CONTRIBUTING WRITERS Naomi Goldman, Trevor Hogg, Chris McGowan, Oliver Webb
ADVISORY COMMITTEE David Bloom, Andrew Bly, Rob Bredow, Mike Chambers, VES, Lisa Cooke, Neil Corbould, VES, Irena Cronin, Paul Debevec, VES, Debbie Denise, Karen Dufilho, Paul Franklin, David Johnson, VES, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz, Eric Roth

OFFICERS Lisa Cooke, Chair; Susan O’Neal, 1st Vice Chair; David Tanaka, VES, 2nd Vice Chair; Rita Cahill, Secretary; Jeffrey A. Okun, VES, Treasurer
DIRECTORS Neishaw Ali, Laurie Blavin, Kathryn Brillhart, Colin Campbell, Nicolas Casanova, Mike Chambers, VES, Kim Davidson, Michael Fink, VES, Gavin Graham, Dennis Hoffman, Brooke Lyndon-Stanford, Arnon Manor, Andres Martinez, Karen Murphy, Maggie Oh, Jim Rygiel, Suhit Saha, Lisa Sepp-Wilson, Richard Winn Taylor II, VES, David Valentin, Bill Villarreal, Joe Weidenbach, Rebecca West, Philipp Wolf, Susan Zwerman, VES
ALTERNATES Andrew Bly, Johnny Han, Adam Howard, Tim McLaughlin, Robin Prybil, Daniel Rosen, Dane Smith

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
vesglobal.org

VES STAFF
Jim Sullivan, Director of Operations
Ben Schneider, Director of Membership Services
Charles Mesa, Media & Content Manager
Ross Auerbach, Program Manager
Colleen Kelly, Office Manager
Brynn Hinnant, Administrative Assistant
Shannon Cassidy, Global Manager
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Tom Atkin, Founder
Allen Battino, VES Logo Design

Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com. Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com. Comments: Write us at comments@vfxvoice.com. Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Copyright © 2024 The Visual Effects Society. Printed in the U.S.A.





VFX OSCAR

HITTING THE HIGH NOTES IN A YEAR OF RECOVERY AND RESURGENCE By OLIVER WEBB

TOP: Preferring to shoot practically, Christopher Nolan aimed to avoid CG effects when making Oppenheimer. SFX and filmed elements were used to capture the power of the Trinity nuclear test. (Image courtesy of DNEG and Universal Pictures) OPPOSITE TOP TO BOTTOM: Driven by more than 3,000 visual effects shots with many fully CG sequences, Guardians of the Galaxy Vol. 3 remains a serious standalone contender, even though the first two films were nominated but missed out in 2014 and 2017. (Image courtesy of Marvel Studios) With a tip of the visor to Blade Runner and the Lucasfilm legacy, Gareth Edwards’ The Creator introduces a brave new sci-fi world, both compelling and grounded in real-world locations that ring true. (Image courtesy of 20th Century Studios) With 2,350 visual effects shots, including a prologue and a portal-through-time visitation to the Siege of Syracuse in 200 BC, Indiana Jones and the Dial of Destiny shines with a keen attention to a treasure chest of detail. (Image courtesy of Lucasfilm Ltd)

Last year, Avatar: The Way of Water scooped the Academy Award for Best Visual Effects, beating Top Gun: Maverick, Black Panther: Wakanda Forever, All Quiet on the Western Front and The Batman. When the nominees for Best Visual Effects at the 96th Academy Awards are announced on Tuesday, January 23, it will be another competitive year, with outstanding films in contention for the VFX Oscar.

Nominated for Best Visual Effects in 2014, the first Guardians of the Galaxy installment narrowly missed out to Interstellar. In 2017, Guardians of the Galaxy Vol. 2 was nominated for the award, this time losing out to Blade Runner 2049. A third-time’s-the-charm win would be fitting for the third and final installment, although it should be noted that, with the extraordinary craftsmanship of the film’s Visual Effects Production Supervisor, Stephane Ceretti, and all of the VFX vendors involved, it would be anything but luck, as Vol. 3 remains a serious front-running contender. “We had 3,066 visual effects shots in the film, a huge number,” says Ceretti, who also worked on the first film. “On top of that, we had to do the Christmas special at the same time, which was an additional 560 shots.” Ceretti credits Guardians series director James Gunn for constantly evolving and challenging his VFX team to break new ground. “James Gunn always wants things to look as real as possible,” Ceretti explains. “He’s got his filming style that’s very specific to him, and that has evolved from the first Guardians film. He was really challenging us in terms of how he is now filming things with smaller cameras that are really portable and moving all the time. We knew that we had potentially a lot of full CG sequences, especially with the flashbacks. We had discussions about the different worlds that we would either revisit from previous films, like Knowhere, or the new places that we would introduce like Counter Earth or The Orgoscope. We worked alongside Production Designer Beth Mickle, who is fantastic and had previously done The Suicide Squad with James.”

Rocket proved to be the most challenging CG character to create for Ceretti and his team. “It was great to come back and finish the story of Rocket,” Ceretti says. “We knew we had to build Rocket across his different ages throughout the film. There was also a sense of an animalistic feeling that we had to get from Rocket that, in my opinion, we had lost a little bit across the different films.” Ceretti adds, “The emphasis on the high level of detail in terms of both modeling, grooming and keyframe animation for all of these animals was really key for the success of the movie and being able to tell that story.”

Indiana Jones and the Dial of Destiny marked the titular character’s return to the big screen. With Andrew Whitehurst serving as Visual Effects Supervisor, the film consisted of 2,350 visual effects shots. “Because the movie is a journey, most scenes take place in different times and locations,” Whitehurst says. “It’s a story that has many components with little overlap between them. We knew that the scope of the work would mean that we would need several vendors on the show. By the time we were starting to think about that, Kathy Siegel, the VFX Producer, was onboard, so we were talking about the various folks that we might approach to work on it and who we’d worked with before, and who we knew specialized in certain types of work. It was an exercise in logistics really.” Along with the prologue, the Siege of Syracuse sequence was one of the main focus points in the pre-production of Dial of Destiny.



FILM

TOP: Gritty and grounded, Transformers: Rise of the Beasts stars robots that reflect the different environments they inhabit, and features complex character animation and FX simulations in almost every shot. (Image courtesy of Paramount Pictures) MIDDLE AND BOTTOM: Ridley Scott’s Napoleon conjures a broad historical canvas of 18th century France with battlefields and thousands of soldiers, requiring a seamless weave of practical, CG and invisible effects that all contribute to the film’s unique look. (Image courtesy of Apple Studios and Columbia Pictures/Sony)

“We started blocking it out with Previs Supervisor Clint Reagan,” Whitehurst notes. “[Editor] Mike McCusker, who was cutting that sequence, was on the project by then. We were exploring the sequence through previs and storyboards because a screenplay often isn’t the best place to explore an action sequence like that. The other thing we needed to figure out was how to film it and how the narrative would drive the layout of the environment. In my office in L.A., I had a map on my whiteboard that I was forever updating with what ancient Syracuse looked like based on the cut. I had all the various story beats as thumbnail sketches drawn on Post-it notes that I was laying out along the planes’ trajectories as I drew them out on the board. From that we could work out how the coastline and the city had to be laid out to accommodate that action. Previs was also vital in working out how we might stage action in the plane set pieces that we had at Pinewood. We were able to figure out ahead of shooting how we could get our actors, crew and cameras into such a confined space. It saved a lot of time on the shoot.” “Everyone working on the film was very conscious of the previous films and the way that they were made,” Whitehurst explains. “The thing you will hear more from Jim [director James Mangold] when you are on set or talking about previs shots, is ‘nouns and verbs.’ Every shot needs a noun and every shot needs a verb. That’s a very classical filmmaking way of working, and I think that attention to detail really shines through in the finished film.” Following Rogue One: A Star Wars Story, Gareth Edwards was at the directing helm again with The Creator. FIN VFX contributed over 100 shots for the film. “By the time we joined the project, The Creator had been in Gareth’s head for years,” says FIN VFX Supervisor Stuart White. “When we received our first shots to begin work on, ILM had already finalized a large number of their shots, having had about a two-year head start. 
So, the expectations were high! I’ve long been a fan of concept artist [Production Designer] James Clyne. He and his team at ILM had been working on the most amazing body of concept art for this movie that I think I’ve ever seen, all handed to us as a PDF that numbered hundreds of pages.” Discussing the most challenging visual effects shot to create, White notes a shot where two (CG) police transport ships land in an alleyway, and about nine (partially CG) robots jump out of it and run into a building. “In the postvis, they had replaced the whole top half of the frame with a still that basically implied ‘insert Blade Runner awesome future city alleyway and skyscrapers here.’”

Joe DiValerio of Outpost VFX served as VFX Supervisor and interacted with Charley Henley, Production Visual Effects Supervisor, for Ridley Scott’s Napoleon. “Our work was just under 30 shots, which we ultimately hope are invisible. The film has a unique look and style we needed to adapt to,” DiValerio says. “We had a sequence, fleshed-out shot design, and a solid plan of shot plates and elements that needed to be assembled. There was some amazing roto and paint work to preserve the essence of the practical element photography. We had some very clever uses of mixing up the photography in ways it wasn’t originally intended – mixing up takes, augmenting stunt work, mixing angles and the standard logistics of hiding practical effects. The biggest challenge we anticipated was the CG crowd work. The production team had motion capture performances with specific actions for our sequence. We were able to build a library of just the right parts and then art-direct them into the film. Our sequence had a little bit of everything: matte painting, CG FX, crowds, element work and stunt enhancement.”

Rodeo FX completed 237 VFX shots for The Little Mermaid, with Graeme Marshall and Ashley Bellm both serving as Rodeo FX Visual Effects Producers on the film. “We were only one of multiple vendors,” say Marshall and Bellm.
“As on most projects, we interacted mainly with the client-side VFX team, helmed by VFX Supervisor Tim Burke, and VFX Producer Leslie Lerman. Their team was an absolute delight to work with. Although we did receive notes from Rob [director Rob Marshall],

TOP: The Little Mermaid surfaces from the deep blue for live-action on land, but not without blending in merfolk and classic Disney animated undersea characters: Scuttle, Sebastian and Flounder. Characters built by Framestore. (Image courtesy of Walt Disney Studios) MIDDLE AND BOTTOM: Rodeo FX recreated Place de L’Étoile in Paris, one of the busiest locations in the world, for John Wick: Chapter 4. The John Wick franchise is known for its stylish, choreographed action sequences and gunfights. (Photo: Murray Close. Courtesy of Lionsgate)




TOP: Wētā FX designed, built and animated Cokie and her two cubs for Cocaine Bear. (Image courtesy of Universal Pictures) MIDDLE: ILM’s VFX team worked closely with Production Designer James Clyne and the art department to develop the rich design framework of The Creator. (Images courtesy of 20th Century Studios) BOTTOM: Indiana Jones and the Dial of Destiny. (Image courtesy of Lucasfilm Ltd)

our main point of contact was the studio VFX team. Rodeo was initially brought on board to take on some overflow character work from another studio. All of the merfolk shots were pretty challenging; there was a lot of plate reconstruction and integration to be done. Our most difficult shot was in the closing scene of the film, when Ariel’s family and community join her on the beach to send her off on her adventures with Prince Eric. The shots with Scuttle, Sebastian and Flounder were also challenging, as the character assets – built by our friends at Framestore – were shared across multiple vendors, and we had to ensure continuity not only in their appearance, but in their overall demeanor and performance.”

One of the most highly anticipated films of the year was Christopher Nolan’s Oppenheimer. The combined effects departments created over 200 shots, and only half of those needed post work in VFX. Production VFX Supervisor Andrew Jackson was the first person to read the Oppenheimer script after Producer Emma Thomas. “Chris told me before I read the script that he wanted to avoid computer-generated effects,” Jackson says. “He thought if we could shoot it practically, it would fit better with the language and feel of the film. His approach to effects is very similar to mine in that we don’t see a clear divide between VFX and SFX, and believe that if something can be based on filmed elements it will always bring more richness and depth to the work. After my initial discussions with Chris, I spent the first three months of the project in SFX Supervisor Scott Fisher’s workshop, developing and testing dozens of simulations and effects. For the duration of the shoot, we ran a small VFX IMAX film unit, which compiled an extensive






TOP AND MIDDLE: Barbie will likely be in contention for Best Production Design, but the film’s visual effects should not be overlooked. (Images courtesy of Warner Bros.) BOTTOM: Guardians of the Galaxy Vol. 3. (Images courtesy of Marvel Studios)

library of filmed elements. The final shots ranged from using the raw elements as shot, through to complex composites of multiple filmed elements led by VFX Supervisor Giacomo Mineo and the team at DNEG, the film’s sole VFX partner. “Each shot presented its own set of challenges as the script describes thoughts and ideas rather than specific visual images,” Jackson continues. “This was both exciting and challenging as we searched for solutions we could build and shoot that were both driven by the story and visually engaging. Combined with the set of creative rules like only using real elements shot on film, this meant that we had to dig deeper to find solutions that were often more interesting than if there had been no limits. The one sequence that was the most challenging was the Trinity test itself. Unlike the other effects scenes, this one needed to replicate the original test but still be made only using filmed elements. It was a huge design and compositing exercise, involving retiming and combining multiple high-speed explosion and ground detail elements. The result is a truly unique cinematic experience.” MPC and Wētā FX were the lead studios on Transformers: Rise of the Beasts, and the number of visual effects shots ended up in the 1,800-1,830 range. “[Director] Steven Caple Jr. and I met in pre-production when there was only a script. He’d been gathering mood material and came in with a good idea of the world he wanted. It was a more grounded, gritty and distressed look he was after,” says MPC VFX Supervisor Gary Brozenich. “We discussed a heavier patina for the robots in general and wanted to link the look of their aging to the places they inhabited. The Autobots are city dwellers and the Maximals are jungle-dwelling – we should feel the environment in their shell. Both should have a different quality. Unicron and its interior carried the weight of introducing the franchise’s ultimate villain, one that is ingrained in the lore






TOP: Real locations and a preference for practical effects ground Blue Beetle in a family-friendly everyday sense of reality that offers superhero VFX on a down-to-earth scale. (Image courtesy of Warner Bros.) BOTTOM: Oppenheimer. (Image courtesy of DNEG and Universal Pictures)

and the minds of a huge fan base. We wanted to make sure we were true to the original structural design in its planetary form.”

Matt Aitken was Wētā FX Visual Effects Supervisor on the film. “Our work on the show was all complex, with character animation and FX simulations in almost every shot. The transformation shots were particularly challenging; we set up a dedicated transformation team comprising specialists from animation, rigging and models to handle the specifics of those, and some of the transformation shots were in progress throughout our time working on the movie. The most difficult transformation sequence was probably Mirage transforming into an exosuit around Noah as he slowly stands up. We kept Noah’s face, but most of the time his body, clothing and hair have been replaced to allow the articulating suit pieces to neatly form around him. That was the last shot we delivered on the show!”

Wētā FX also provided the bulk of effects for Cocaine Bear. “Our work on Cocaine Bear focused on the furry, drug-fueled lead, Cokie, a CG black bear that goes on a rampage after stumbling on abandoned cocaine in the wilderness,” notes VFX Supervisor Robin Hollander.

This year, audiences also saw Brie Larson reprise her role as Captain Marvel in The Marvels, which undoubtedly will be a contender at the Academy Awards, while DC’s Blue Beetle could be an outside pick. This year also marked the return of John Wick with John Wick: Chapter 4, an excellent addition to the franchise. Barbie will certainly be in contention for Best Production Design, and the film’s visual effects are also worth special mention. Whichever film wins the award for Best Visual Effects, it has been an incredible year for film VFX.





VFX TRENDS

TESTING TECH WITH SHORT FILMS By BARBARA ROBERTSON

TOP: The 1997 Oscar-winning short “Geri’s Game” gave Pixar’s R&D team a vehicle in which to investigate subdivision surfaces and explore using the technology for cloth simulation and creased cloth. It became a standard part of the toolset, and Pixar released the technology as OpenSubdiv. (Image courtesy of Disney/Pixar) OPPOSITE TOP TO BOTTOM: For the 2014 short, “Lava,” Pixar’s R&D team incorporated the Foundry’s Katana, explored the use of USD (Universal Scene Description) and their new RenderMan RIS path tracer, tools that would become important for the 2016 feature film Finding Dory. (Image courtesy of Disney/Pixar) Visual Effects Supervisor Adam Valdez received support from an Epic MegaGrant and MPC to create the short film “Brave Creatures,” which he wrote and directed. He set the story in the heartlands of “Elysia,” a world on the brink of war. (Image courtesy of Adam Valdez) Senior VFX Producer Doug Oddy assembled a team of 35, including 20 artists, that cut the cord from Sony Pictures Imageworks’ pipeline and instead worked in Unreal Engine running on Windows to create the “In Vaulted Halls Entombed” episode for Netflix’s Love, Death + Robots. Senior VFX Supervisor Jerome Chen directed. (Image courtesy of Sony Pictures Imageworks and Netflix) Artists working on “FLITE” created and tweaked shots in parallel; there was concurrent development and refinement of assets through the process. FUSE gave them the benefits of Unreal’s real-time engine within a pipeline developed for large-scale projects. (Image courtesy of Framestore and Inflammable Films)

Feeding new technology into a pipeline while a project is in production is a high-wire adventure, whether the production is visual effects for a live-action film, an animated feature or a streaming episode. By using short films as test beds instead, some studios try out new technology and give aspiring directors a less stressful playing field. In fact, before there were animated features or many CG effects in live-action films, Pixar proved the technology in short films, some of which received Oscars. Those films led the way to Toy Story, the first full-length CG feature. Similarly, short films at Blue Sky Studios and PDI paved the way for their first features. Some of the technology in Pixar’s short films then and later became – and is still becoming – vital for the industry at large. Recently, several visual effects studios have begun creating short films to experiment with real-time engines. Doug Oddy at Sony Pictures Imageworks and Adam Valdez at MPC led crews set on small islands apart from the main pipeline to create their short films almost entirely in Epic’s Unreal Engine. Tim Webber and a crew at Framestore integrated Unreal Engine into the studio’s pipeline to create the short film “FLITE.”

PIXAR TECH

Before Pixar split from Lucasfilm, the graphics group had created the short film “The Adventures of André and Wally B.” (1984) with animation tools developed by Tom Duff and Eben Ostby and rendering with Reyes from Loren Carpenter and Rob Cook. Once established as a studio, Pixar used those two pieces of software to create “Luxo Jr.” and “Red’s Dream.” The next film, the Oscar-winning “Tin Toy” (1988), earned Academy Awards for Bill Reeves



and John Lasseter. The first completely CG-animated film to win an Oscar, “Tin Toy” showcased new software that evolved from the early tools: Marionette for animation and RenderMan for rendering. Ostby and Reeves, the two primary people writing Marionette, called the animation software MenV, but Steve Jobs named it Marionette. Pixar made RenderMan available to the public. “We offered RenderMan as a sort of API, an interface that other renderers could use,” says Reeves, who is now Head of Animation Research and Development at Pixar. “Many did, but then people started using RenderMan.” And they still do. Pixar’s first feature, Toy Story, used evolutions of both tools developed for “Tin Toy,” but it wasn’t until 1997 that the studio again explored significant new tech in a short film. “We had jumped off the cliff for ‘Tin Toy’ using the two new pieces of software,” Reeves adds. “We burned out for a while.” The 1997 short film “Geri’s Game,” also an Oscar winner, used new subdivision surfaces. “Everyone says we introduced subdivision surfaces in ‘Geri’s Game,’ but we also used them in A Bug’s Life,” Reeves says. “What we did for ‘Geri’s Game’ was to work out more details, to use subdivision surfaces for cloth simulations and to make creases for Geri’s collar on his shirt. The big point was to investigate subdivision surfaces and make them prime tools. Now we use them every day.” Pixar has since released the technology to the public as OpenSubdiv. Three of Pixar’s later shorts became technology test beds as well.
In 2009’s “Partly Cloudy,” Pixar explored the use of volumetric clouds as characters, which started the R&D group down a road leading to the studio’s latest feature, Elemental. Then, the 2013-2014 shorts “The Blue Umbrella” and “Lava” became the last shorts used as test beds, this time for the 2016 feature Finding Dory. For those two short films, Pixar integrated the Foundry’s Katana into its pipeline for the first time, explored the use of USD (Universal Scene Description) and introduced the new path tracer RenderMan RIS, which is now widely available. Pixar has also open-sourced USD. Today, many studios and software companies have integrated USD into their pipelines and toolkits, and its relevance keeps growing.




After Finding Dory, Pixar used shorts, including its largely 2D SparkShorts, to explore looks and introduce new directors more than to prove new technology. “We’ve gotten better at introducing new technology as we go,” Reeves says. “We no longer have to jump over a cliff and hope it will work as we did way back on ‘Tin Toy,’ where we had to throw out the old, try the new and see what would break. We don’t need the shorts to help us get ahead and keep going ahead. We have a cushion in that we have more great people who can bail us out of trouble, and we can bend and mold the pipeline to do lots of different looks. The other thing is that shorts cost a lot of money.”

VFX REAL-TIME ISLANDS

TOP: Pixar first used their Universal Scene Description (USD) framework for the 2013 short film “The Blue Umbrella.” OpenUSD is now a widely used CG ecosystem that allows, for example, the collaborative construction of CG-animated scenes within and among studios. Pixar first used the path tracer RenderMan RIS in “The Blue Umbrella.” The software continues to be used in many studios and firms as well as Pixar. A new version was released in October. (Image courtesy of Disney/Pixar) BOTTOM: Pixar’s 2009 short “Partly Cloudy” explored the use of volumetric clouds as characters, which started the R&D group down a road leading to the studio’s 2023 feature Elemental. (Image courtesy of Disney/Pixar)

Throwing out the old and trying the new is what happened at Sony Pictures Imageworks and MPC – or, better said, sidestepping the old and trying the new: both studios wanted to immerse a crew within Epic’s Unreal Engine to make a short film. They are among many studios now experimenting with real-time tools for visual effects and animated films. To create the 15-minute episode titled “In Vaulted Halls Entombed” for Netflix’s Love, Death + Robots, Senior VFX Producer Doug Oddy assembled a team of 35, including 20 artists, who elected to cut the cord from Sony Pictures Imageworks’ pipeline and instead work in Unreal Engine running on Windows. Senior VFX Supervisor Jerome Chen directed the episode. “We recognized that if we looked to integrate our existing pipeline for real-time workflows, it would require years of development that the broadcast schedule did not accommodate,” Chen says. “So, we used Unreal Engine 4.27 out of the box – version 5 was still in beta. Instead of writing tools to make it conform, we would conform the way we work. We aspired to have six digital photorealistic characters, all capable of dialogue, on screen in a single shot. Our partners at Epic helped us reach that goal by introducing us to their Metahuman development team that were nearing production

launch. We ended up relying on the MetaHuman system for three of the six main characters. To model and animate the characters, the team worked in Maya, Marvelous Designer, ZBrush, Houdini and other packages.” Chen continues, “At that time, some of the Unreal Engine toolsets worked much differently than our own. For some of our disciplines, it made more sense to work in our Imageworks pipeline. Our teams of artists working on assets and animation relied on our existing pipeline, as it would have been too disruptive to the production schedule to switch. Otherwise, the team worked entirely within Unreal Engine. Within a week, the team had built a rough version of the entire 15-minute film, and they delivered the film in eight months even though they were working the entire time during COVID. The people working on the production never met face to face.”

“It was a layout version,” Oddy says of the rough 15 minutes. “That scene file was the scene file we finaled eight months later. Everyone plugged into the scene file and moved laterally back and forth. Everyone could join that in real-time and slowly raise the water level. The quality just kept getting better. We brought in the mocap data, the performance scans and plussed the quality. We were making decisions live all the time; we didn’t have to wait for updates made elsewhere. We were in one environment. We were not taking notes and handing them to other remote teams. We discovered things live as a group and made changes in parallel. We worked in final quality onscreen in real-time. We could go into any part of that 15-minute film. A small finishing team can address an enormous number of adjustments in real-time and at final quality on a daily basis.”

That isn’t to say it was easy. Oddy says the biggest challenge was the learning curve. “The steepest learning curve was breaking from an offline mindset and adapting to a real-time workflow when we didn’t really even understand what that was,” he says. 
“Much of our established technology had to be abandoned; for example, I

TOP: A cluster of people at MPC worked in Unreal Engine using a small network of Windows machines linked together to create Nells for “Brave Creatures.” (The digital character Nells was voiced by Nish Kumar.) (Image courtesy of Adam Valdez) BOTTOM: Working collaboratively in FUSE (Framestore-Unreal Shot Engine) gave the “FLITE” team the ability to have everything live all the time and to see shots in context. FUSE handled concurrent versioning, asset management and tracking within, to and from Unreal Engine and Framestore’s traditional pipeline. (Image courtesy of Framestore and Inflammable Films)

VFX TRENDS

hadn’t worked in Windows in 15 years. We knew what we wanted to achieve, what the look was, and knew Unreal Engine could get there, but we didn’t know each step required. Once we embraced that we had to work differently, everything improved.” Oddy is continuing to work in Unreal Engine, version 5 now, but as part of the pipeline development team, which supports both visual effects and animated feature production. The team has started incorporating Unreal Engine into the film pipeline. “Can a feature film be achieved entirely in Unreal Engine? That would be a massive challenge, but knowing what we know now, it’s certainly possible, perhaps not today, but very soon,” Oddy concludes.

BRAVE CREATURES

TOP TO BOTTOM: The Oscar-winning 1988 short film “Tin Toy” was the first to use Pixar’s new animation tool MenV/Marionette and new rendering software RenderMan. (Image courtesy of Disney/Pixar) The MPC team used Unreal Engine 5 (UE5) and its Path Tracer rendering capabilities, Megascans and MetaHumans, to create “Brave Creatures.” (The digital character “Ferris” was played by Robert James-Collier.) (Image courtesy of Adam Valdez) A team of 35 built a rough version of the 15-minute “In Vaulted Halls Entombed” within a week and finaled the scene file eight months later. As the artists moved laterally back and forth within the scene file, accessing any part of the film in real time, they improved the quality collaboratively. (Image courtesy of Sony Pictures Imageworks and Netflix)

MPC has been working with Unreal Engine for several years for feature film virtual production, but hadn’t used the real-time engine to create a short film until Adam Valdez received an Epic MegaGrant to do Brave Creatures, which was released in March 2023. “At MPC, we start working in Unreal Engine in our virtual production toolset and then migrate shot by shot for high-detail work in our established pipeline,” Valdez says. (Valdez received Academy and BAFTA nominations for his work as a visual effects supervisor on The Lion King.) “For my project, we did the inverse.”

As with Oddy’s team, the Brave Creatures crew did rigging and animation in MPC’s traditional pipeline, and a cluster of people did everything else in Unreal Engine using a small network of Windows machines linked together. “You can use Unreal on Linux, but it’s more powerful on Windows,” Valdez says. “You’re adopting a game company model about how work is combined and brought together in Unreal. The established pipeline using Maya and RenderMan has large-scale tools to manage high graphics complexity. That’s a robust and battle-hardened system that still has a place. A game engine is a different technology platform, and it takes a lot of surrounding enabling technology to fully exploit it. Integrating it into a large facility is non-trivial.”

As did the Imageworks crew, the artists at MPC built their film in phases. Valdez likens the work to live-action. “You can move lights, move a set, move camera gear and see the effect, just like on a live-action set,” he says. “You can do something quickly and get quick turnaround. The CG world has accepted a slow turnaround time to process the work of specialists. What we’ve been missing all these years is instant graphics. The game engine increases interactivity. So, working in Unreal feels more like you’re on a virtual set. You make choices more familiar to live-action. 
If you can creep up on it slowly, work on where the good camera angles are, where you want to work, it can be a combination of what it feels like on live-action but with more plasticity.” The downside of all that plasticity is the potential to drive the crew crazy while giving the director the ability to bring more emotion into a story. “As the director, I could become a nightmare,” Valdez says. “I could keep changing my mind. When you have plates, they anchor you for good or ill. It might be easy to think of real-time as an indulgence. But, when I could see the set and lighting and animation, I could adjust the camera and make it feel

like the camera responds to what was in front of the lens. Being able to go back in and change a response could be unnerving for the team. But having the camera respond to what’s happening in front of the lens improves the storytelling for the audience.”

“When you’re talking about technology that’s going to change the way we work, though,” Valdez adds, “we really need to talk about the elephant in the room: AI. To talk only about real-time ignores that huge train barreling down toward us. We’ve already seen AI ‘co-pilots’ in the form of machine learning used for de-aging. We won’t be able to type text and see a movie come out the other end, but there’s room for more co-piloting assistance within the steps of production we already understand.”

FULL REAL-TIME INTEGRATION

TOP TO BOTTOM: The team putting together “In Vaulted Halls Entombed” decided that rather than writing tools to make Unreal Engine conform to their traditional way of working, they would conform the way they worked to create the episode for Love, Death + Robots. The Imageworks team worked in Maya, Marvelous Designer, ZBrush, Houdini and other packages to model and animate characters; otherwise, they stayed entirely within Unreal Engine, bringing in motion-captured performances for designated characters and plussing the quality in real time. VFX Producer Doug Oddy said that the steepest learning curve was breaking from an offline mindset and adapting to a real-time workflow. (Images courtesy of Sony Pictures Imageworks and Netflix)

Unlike some Unreal-based projects, the crew at Framestore took on the hard job of integrating Unreal Engine into the studio’s VFX pipeline for Tim Webber’s 14-minute film “FLITE.” The studio released a trailer for the film in May 2023, and it’s showing in film festivals. The resulting system, called FUSE (Framestore-Unreal Shot Engine), handles concurrent asset management and tracking within, to and from Unreal Engine and the traditional pipeline. “One of the phrases we like to say is that we enabled people to work with Unreal, not in Unreal,” Webber says. “A lot of people have used Unreal to make short films, but they tend to have a small team of people working closely together, generalists who learn to use Unreal. We wanted the benefits of Unreal with a more large-scale department, to build it so we could work on features at that quality and scale.”

The Framestore crew liked the quick turnaround with FUSE. “I could say, ‘Let’s move that building or change that light,’ and we’d all see it immediately,” Webber says. “The rendering and compositing happened quicker. And if we didn’t have immediate turnaround, rather than two or three days, we could do something in 20 minutes. That speed brings a lot of advantages.” There were other important advantages as well. “An important power of real-time is that everyone was looking at something in context and everyone was looking at the same thing,” Webber says. “We had parallel workflows and concurrent development of all sorts of assets. Even the writing could work in parallel because we leaned into the virtual production. Everything was live and in context throughout the whole process.”

Some animation studios have developed systems that give their crews the ability to look at a shot they’re working on in context of other shots. That has rarely been as true for visual effects studios working on live-action films. FUSE made that possible. “This is a live-action film, although not photoreal,” Webber says. 
“We had human characters with human-nuanced performances, and we could judge a shot in context of other shots. That’s a key part of what we were doing.” For those performances, Webber and his crew filmed actors on a stage with an LED wall. Environments and animation projected onto the wall gave the actors an immersive experience but were not used for in-camera visual effects. Later, the assets were tweaked to final quality. “On the whole, we just continued working on them,” Webber says. “But we had the flexibility to change them.”
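FUSE itself is proprietary, and its real design is not described in detail here, so the following is only a rough, hypothetical sketch of the general idea the article attributes to it: tracking concurrent versions of shared assets as both the real-time engine and the traditional pipeline publish and pull them, so everyone is looking at the same thing in context. All class, method and asset names below are invented for illustration.

```python
# Toy model of concurrent asset versioning between two pipeline "sides".
# This is NOT Framestore's FUSE -- just an illustrative sketch.

class ShotTracker:
    def __init__(self):
        self.versions = {}   # asset name -> list of published payloads
        self.synced = {"unreal": {}, "traditional": {}}

    def publish(self, name, payload):
        """Any department can publish a new asset version at any time."""
        self.versions.setdefault(name, []).append(payload)
        return len(self.versions[name])  # the new version number

    def sync(self, consumer, name):
        """Pull the newest version into a consumer's view, so both sides
        of the split pipeline see the shot in the same, current context."""
        latest = len(self.versions[name])
        self.synced[consumer][name] = latest
        return latest

tracker = ShotTracker()
tracker.publish("bridge_env", "v1 blockout geometry")
tracker.publish("bridge_env", "v2 geometry with lighting tweak")
assert tracker.sync("unreal", "bridge_env") == 2
assert tracker.sync("traditional", "bridge_env") == 2
```

The design point the sketch tries to capture is that publishing and syncing are decoupled: artists keep working in parallel, and any consumer can pull the latest state whenever it needs context, rather than waiting on a linear hand-off.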

TOP TO BOTTOM: Framestore used “FLITE” to test new filmmaking techniques and a VFX pipeline that has Unreal Engine at its heart from the initial planning and previsualisation through to final pixel. The 15-minute “FLITE” marked the directorial and writing debut of Framestore Chief Creative Officer Tim Webber. (Image courtesy of Framestore and Inflammable Films) Although the actors saw visual sets and animated objects flying on the LED wall for context, Webber used the LED stage for immersive lighting, not in-camera visual effects. By working in real-time in a continuous process in which everything is live and nothing is thrown away, from previs to post, lighting changes became possible after filming. (Image courtesy of Framestore and Inflammable Films) “FLITE” is a hybrid live-action/digital film that stars digital humans and is based on the emotional performances of actors Alba Baptista, Gethin Anthony and Daniel Lawrence Taylor, filmed over the course of five days on an LED stage. (Image courtesy of Framestore and Inflammable Films)

The LEDs provided the in-camera lighting. “Because we had a continuous process, we used pre-lighting in the previs to design the lighting,” Webber says. “We made decisions at the right time in the right context all at the same time, even the lighting opportunities. Everything was live and in context through the whole process. We could layer on top. We could choose when to make decisions at different times.”

The process they used to create the digital humans drew on Webber’s experience as Visual Effects Supervisor for Gravity, for which he received Academy and BAFTA awards. But this time he says it was quicker and easier. “We could get live renders of a scene on set as the actor was performing,” Webber continues. “We could see a rough approximation. During the three-minute chase sequence on Tower Bridge, we could cheat the lighting here and there. Because it was a continuous process, we didn’t end up with bits of live action that didn’t fit into the CG. It’s hard to describe, hard to get to the heart of what makes this different. We did everything all together. It enables you to achieve more. There’s no way we could have done a three-minute continuous-action sequence for a short film otherwise.”

Webber expects the tools in FUSE to be more widely adopted in the studio. “Already, some of the tools are being used,” he says. “And we’re talking to other directors. It doesn’t suit every project. It takes a certain style to use all the methods involved; it isn’t appropriate for something gritty-real. Only a certain style of director will want it, but when it is appropriate, people are excited about it.”

There were challenges, of course. Unreal Engine Version 5 with its path tracer wasn’t available when they started the film, so that made things more difficult than they would have been later. “We rendered it in Unreal and did a lot of work to make that possible,” Webber says. “Even with 5.1, we would still need to do compositing to fix renders, but less of it. 
Working with Unreal is totally transformative; the process is fluid. When it comes to the final render, getting quality images out, even with 5, to get the quality we are after is still a challenge. We would love to have all the benefits of Unreal until the final render and then switch over, to have a split pipeline that we can divide up at the end to solve the tricky problems. USD, which is being embraced by most people now, could make that easier and more powerful.”

For Webber, it was important to develop the new FUSE tools on a project. Although it was harder to make a film with unfinished tools, and harder for the toolmakers, he believes the end result was worth it. “Visual effects studios don’t have a lot of spare money, so it’s tricky to get the opportunity,” he says. “Framestore invested time and effort, and we had the Epic MegaGrant. So, we were able to do it. It was a really great thing.”

That wraps this story back around to the beginning. As Bill Reeves said, shorts cost a lot of money. But sometimes the return is worth it. After all, USD, a tool Pixar first used on a short animated film in 2013, might help make it possible to fully integrate a real-time engine with the most sophisticated graphics tools, give real-time advantages to animation studios and turn visual effects for live-action films into a more collaborative, immersive process.

FILM

CHARGING INTO BATTLE WITH NAPOLEON By TREVOR HOGG

Images courtesy of Apple Studios and Columbia Pictures. TOP: Hundreds of extras had to be turned into thousands of troops for the battlefield scenes. OPPOSITE TOP TO BOTTOM: BlueBolt added CG ships around a practical one for the Battle of Toulon. It was important during the battle sequences to clearly depict the strategic genius of Napoleon Bonaparte. When Ridley Scott initially saw footage of the exploding air mortars, he actually thought that the stunt performer had been blown up.

There are no signs of Ridley Scott slowing down, as he released The Last Duel and House of Gucci in 2021, and upon completing Napoleon immediately went into pre-production for Gladiator 2. “Normally, Napoleon would be an 18-week shoot, and we did it in something like 10 weeks,” marvels Neil Corbould, Special Effects Supervisor. “It was incredibly fast, and it’s the pace that Ridley works at.” No one is better prepared than the octogenarian filmmaker. “Ridley does all of his own storyboards, and they’re so precise; he’ll draw little explosions or cannons or rifles,” Corbould observes.

Napoleon explores the life of Napoleon Bonaparte, from his becoming Emperor of the French to his tumultuous love affair with Joséphine de Beauharnais and his military campaigns in Europe, Russia and Egypt. “There is a myth that at the Battle of Toulon Napoleon’s horse was hit by a cannonball, and there isn’t a massive amount of reference about that,” Corbould says. “Ridley heard it from one source and wanted to do that as much practically as he could with an animatronic horse, which we did with a rider on it. It’s a big mix of special effects and visual effects to get the shot.” Air blasts were buried on the battlefield for the Battle of Waterloo. “The first time we showed Ridley, he thought that we had blown the stuntman up! It was that realistic. The guy came out with a bit of peat on his head. Ridley loved it after that. We used those all of the time. It was so safe,” Corbould adds.

“It’s 18th century France, which is different from some of the futuristic projects that I’ve done with Ridley and earlier periods, like Gladiator,” explains Arthur Max, Production Designer. “Each

historical period has its own visual vocabulary. I got interested in the challenge of the practical limits. We didn’t go to France. It was England and Malta. You get your architectural reference book out and start to look at what the architectural language is of the French, 18th century and earlier.”

A partnership was forged between the visual effects and art departments. Max explains, “This is a big canvas with battlefields, with hundreds of thousands of troops involved historically, and we’re just dealing with hundreds of extras. From my point of view, you have to design what’s not there as well as what is there. We built a lot of physical sets, and relatively large ones in some cases. The ones in Malta were very big constructions. For the Battle of Toulon, we couldn’t build everything. It’s a combination of the set extension that Charley Henley [Visual Effects Supervisor] did, but we tried to indicate as much as possible what they are, with the same level of design drawings that we do for the built sets, so that the visual effects department knows how to join them together so they match seamlessly.”

Electricity was nonexistent, which meant a heavy reliance on candlelight with some digital assistance. “It was not always possible to shoot with real candlelight,” describes Dariusz Wolski, Cinematographer. “Because we were shooting in real estates with 18th century paintings on the wall, we were restricted in using real candles. That was a big deal which was slowly creeping up on me during prep. I found this little LED filament and created fake candles that were on dimmers. The poor set decoration department; they did tons of chandeliers. What I did with Charley was to photograph a lot of real candles to put the flames on the fake ones.

TOP: The shadows in the Battle of Toulon cause audience members to fill in the visual gaps with their imagination. MIDDLE AND BOTTOM: A village was constructed on the airfield at RAF Abingdon in England for the Battle of Austerlitz.

That was quite remarkable to me. You get a few scenes where some of them being completely out of focus in the background were real because they looked fine. Otherwise, they are all replaced. We always tried to have real candles close by.”

Visual effects were involved from the first location scout. “From the beginning, Charley was part of the creative process, and he created his own previs that Ridley looked at and corrected. That’s the prep. We’re always discussing what should be done for real or needs his help.” Bluescreen was avoided as much as possible. “A lot of times in those big battles, we had 30 x 20 frames and ended up using grey because once you set everything up it disappears and gives you another dimension,” Wolski adds.

In the middle of principal photography was Richard Bain, On-Set Visual Effects Supervisor. “Charley, Sarah Tulloch [VFX Producer] and I talked about how to represent scale and perspective on the battlefield. We felt that having markers in the shape of soldiers dotted around the field would assist where we would otherwise have no reference in an empty landscape. A similar idea was used on Sergei Bondarchuk’s version of Waterloo [1970], where he used groups of mannequins in battle dress mounted in a line on a pole and carried by two soldiers at either end, to help swell the numbers in the distance.” Various approaches were used to capture the necessary crowds for scenes. “On some of the battle scenes, we would peel off one or two SAs [Supporting Artists] away from the main group to assist in scale and perspective for our digital doubles. We had to limit the number of SAs due to the COVID-19 restrictions. We would shoot stunt fight sprites on set between setups so we had the same lighting conditions. However, when we had so many cameras shooting at different angles in disparate positions, it’s difficult to know how to quantify the use this had for each camera setup. 
Hence, we needed to wait for an edit to then set up shot-specific groups to match camera angles, which became a visual effects and pick-up shoot at a later date.” In all, 1,046 visual effects shots are in the final cut and were

created over a period of 30 weeks by MPC, ILM, BlueBolt, Outpost VFX, One of Us, Light VFX, an in-house team, PFX, Ghost VFX, The Magic Camera Company, Argon and The Third Floor. The biggest visual effects challenge was the crowds; the second was the architecture, and the third was nature, which includes animals and weather. “We brought in architect Inigo Minns as a consultant and researched the buildings that we shot, the real buildings where a historical event happened, and how the architecture married up,” remarks Charley Henley, Visual Effects Supervisor. English roofs had to be digitally altered to reflect the French mansard style; windows had to be changed, and in some cases the color of the stonework had to be shifted. “We also added buildings on skylines, but a lot of it was working off the buildings that were there already,” Henley notes.

There were some unexpected visual effects. “One of these surprise ideas was, ‘We need to have dogs in this interior scene,’ the scene where Napoleon goes into the Kremlin and sits on a throne. We went off auditioning all of our friends’ dogs. Ridley commented, ‘I need a lot of pigeons in here.’ We never had enough pigeons! Then it’s like, ‘We need the pigeon poop all over the place,’ so we’d go shoot more elements,” Henley explains. Atmospherics were another critical component of nature effects. “Neil Corbould did massive amounts of practical effects which we augmented. It was always more work on wide shots, but with the 14 cameras, some of them get the smoke and others miss it; there’s a whole layer of continuity to deal with,” Henley says.

Looking after the Battle of Waterloo, Napoleon’s horse getting hit by a cannonball and the Kremlin was MPC. “The costume designer [Janty Yates] did a tremendous job of making sure that every single military uniform was unique,” notes Luc-Ewen Martin-Fenouillet, VFX Supervisor at MPC. “They had their own backpacks and rifles with different levels of grime and dust. 
The first thing we talked about with Charley was that our CG army needs to look at least as good as the on-set one. Then, we had to

TOP: Shooting with multiple cameras enabled filmmaker Ridley Scott and Cinematographer Dariusz Wolski to get the necessary coverage without having to do numerous setups that would slow down the production. MIDDLE AND BOTTOM: The hillside footage of Napoleon looking out at the frozen lake during the Battle of Austerlitz was captured at Bourne Wood.

TOP: Miniature ships created by The Magic Camera Company for the Battle of Toulon were subsequently shot catching fire and having cannonballs rip through them. MIDDLE AND BOTTOM: The practical and digital snow effects were extensive for the Battle of Austerlitz as the sequences were shot during the summer.

make sure that the army behaved and moved the same way the plates were behaving and moving.” Horses proved to be more difficult than the soldiers. Martin-Fenouillet remarks, “Having a biped on top of a quadruped and getting it simulated so that the cloth and groom flow when the horse runs or stops or falls was painful. Very quickly, we learned that a horse only looks as good as the mane and tail that comes with it. Those pieces always had to have movement and be interlocking, whether it’s from the wind or action or the motion of running horses. Getting all of those dynamics to the next level added to the believability.”

A decision was made early on to tone down the smoke on the battlefield. “The story had to be told first. We had to be able to read the strategies and the different beats of which army goes where and does what. Whenever the crowd agent was physically pulling the trigger, it would generate a muzzle-flash cache of smoke that would drift with the wind it was set to. We had a way of previsualizing it in Houdini so you could get the timing right. For the cannons, it was more complex because Ridley had a specific choreography in mind. We decided on the speed of the cannonball, then figured out that it only worked for some specific angles. In the end, it was mostly hand-animated to tell the story,” Martin-Fenouillet says.

ILM handled the Battle of Austerlitz and March to Moscow, with the former requiring three different locations to be blended seamlessly together to create an environment surrounding a frozen lake, which subsequently gets blown apart by French cannon fire. “They had to create previs for all of these locations to make sure that everything was working right,” states Simone Coco, VFX Supervisor at ILM. “An amazing, massive real set was built for the village on the airfield. It was nice being on the battlefield back in the 18th century. That was the main part of where the battle happened. 
They shot in Bourne Wood for the hillside where Napoleon is watching the battle with hidden cannons. Then we shot the water tank at Pinewood Studios for the end of the sequence where the Russian soldiers get killed running away

TOP TWO: The hardest part of the crowd simulations for the Battle of Austerlitz was making sure that the digital soldiers were indistinguishable from those on set. BOTTOM TWO: ILM created 63,539 soldier variations, along with a huge library of mocap motion clips, for the Battle of Austerlitz.

from the French infantry and get shot by the cannons from the hill.” Despite being set in the winter, the footage was actually captured during the summer and required lots of practical snow. Coco notes, “It was quite cloudy sometimes, so we were lucky with the weather.” Fog is a major part of the Battle of Austerlitz because it hid the French Army from the Russians and Austrians. He adds, “When there’s a lot of wind, sometimes you get sun spill even if it’s snowing, so we played with that idea. Our job was mostly extending what they shot.” About 63,539 variations were created for the soldiers, along with a huge library of mocap motion clips. “There is no one who looks the same,” Coco states. To get the correct interaction, actors were captured with ice made out of polystyrene. “The layout artists had to track every single piece that they shot, replace it with a proper ice sheet, and then the effects artists would create water and snow explosions and debris,” he explains. The Battle of Toulon and Paris riots were the responsibility of BlueBolt. “The Battle of Toulon is a long sequence,” remarks Henry Badgett, VFX Supervisor at BlueBolt. “There was a lot of to-and-fro in the editing to get all of the beats working and not overcook it. It’s an introduction to what a genius Napoleon is as a strategist and how that wins battles for him. From our point of view, we were trying to make visual sense of this onscreen without being too obviously descriptive. Working on night shots of battles is quite nice because of the chaos, and you want to leave some to the imagination due to the shadows. That in itself can be quite hard to do when you’ve got nighttime and smoke lit by fires that are coming and going. When you get the grayscale CG renders in, you can see clearly that this building is here and that one is over there and a ship over there. Then, it’s about completely losing that image when comping it, and you’re trying to persuade the compositor to do less. 
You just want to see a hint of whatever is back there.” Miniature ships created by The Magic Camera Company were subsequently shot catching fire and having cannonballs rip through them. Badgett adds, “When you shoot elements like that, you always get something unexpected for free, so you pick that and put it in. That’s your hero piece, and you can surround it with CG stuff.” The land environment was slightly easier than looking out at sea. He notes, “When out at sea, we’re making more of it from scratch. For the land, 75% would already be there, and we would be adding the extra 25% of chaos, atmosphere and gunfire.”

“I’d be interested in how people react to the Battle of Waterloo,” Henley says. “That was crazy and a bit terrifying, mainly because we had the most cameras, the biggest number of real crew, and the weather was amazing but unpredictable. In one day, we had rain, hail, sun, clouds, moody skies, wind and then nothing. It was a mad weather pattern that we were shooting in. However tough, what was amazing was that it gave us reference for everything Ridley wanted for the scene.” The most impressive moment for Max is the coronation of Napoleon. “We referred to the Jacques-Louis David painting, which is well-known as historical paintings go, and tried to bring that to life.” Napoleon was a terrific experience for Corbould. “They’re not going to be making movies like this much longer because the directors who can direct them are getting fewer and farther between. We have to make the most of it while we can.”

ANIMATION

PUTTING THE AI IN THE ANIMATION INDUSTRY By TREVOR HOGG

TOP: Wētā FX built a new facial system for Avatar: The Way of Water using a series of neural networks. (Image courtesy of Wētā FX and 20th Century Studios) OPPOSITE TOP TO BOTTOM: The cost-effectiveness of AI will enable more animated movies like Nimona to be made that are not beholden to the traditional studio four-quadrant marketing strategy. (Image courtesy of Netflix) Nimona is an example of the diversification occurring in animated storytelling that Hank Driskill, Head of CG for Feature Animation at Cinesite, finds exciting. (Image courtesy of Netflix) The major theme for Big Hero 6 is that technology is not inherently bad, which was demonstrated by the different ways the microbots got used by Hiro and villain Yokai aka Professor Robert Callaghan, who stole Hiro’s mind-controllable microbot technology. (Image courtesy of Disney) Machine learning was utilized by Framestore to help create a digital double of Yara Shahidi, who portrays Tinkerbell in Peter Pan & Wendy. (Image courtesy of Framestore and Disney)

On the positive side, AI promises to automate laborious tasks, leaving more time to be creative, while opening new revenue streams and more immersive virtual experiences; on the negative side, it threatens to restrict creativity to an algorithm, plunder the efforts of others and eliminate jobs. Where does the truth lie for the animation industry? To find the answer, the expertise of Framestore, Technicolor Creative Studios, Wētā FX and Cinesite was consulted, and the general attitude about AI is not apocalyptic but optimistic. “AI is baked into many of the tools that we use at Framestore. In our own research and development, we are guided by the principle of using it to get to the creative faster,” states Tim Webber, Chief Creative Officer at Framestore. “Animation is a craft, and it’s the subtleties, mannerisms, nature and idiosyncrasies that an animator brings to a character and the performance that makes it great. Many animators consider themselves to be actors, bringing invisible performances to the characters they create. This is inspired by the animator’s own life experiences, watching people, animals or the study of reference material. They bring sympathy, empathy, emotional accuracy and deep creativity to their work, often facilitated and sped up by AI, but something it can’t replicate.” AI and machine learning are fundamentally changing every type of technology. “Users are expecting everything to be smarter, responding to simpler inputs and helping users stay directed towards the users’ goals,” remarks Michael O’Brien, Senior Vice President of Research and Development at Technicolor Creative Studios. “Animation is no different. The tools we use need to become more intuitive and intelligent. AI will augment and assist artists so they can accomplish their goals in a shorter time by leveraging the learned intelligence of how artists worked in the past. Animators will still animate. They will still be creative. They will be helped along the way by algorithms providing assistance to common tasks, assurances that the approach is correct and communication of how the work is going.” “AI does not replace artistry,” O’Brien observes. “It is another tool for us to make better pictures faster. New ideas will be connected and generated from algorithms, but artistry will always be the artist. The field of AI is growing at such a speed that we need to make sure we continue to assure the technology is used with full visibility to what the algorithms do and how they are trained, but the opportunities are amazingly vast. Animators starting today will benefit from decades of artistry, and standing on those shoulders will create new worlds, iterate into new directions and create an industry unlike the one we know today.” A shift in focus is needed to better understand AI. “First off, it’s helpful to forget about the idea of ‘Intelligence,’ artificial or otherwise, and think about the function, which is a neural network,” notes Joe Letteri, Senior VFX Supervisor at Wētā FX. “What a network allows you to do is control multiple channels of information at the same time. For example, if you put a spline curve on an elbow joint rotation, you may want the shoulder to have some degree of sympathetic rotation. Then you go up the skeleton and add whatever joint rotations you need to make the movement look natural. A neural network allows you to encapsulate and manage those relationships, so that adjusting one joint will also adjust related joints in a plausible way.
In this example, you could also use an IK [Inverse Kinematics] change to do something similar. It is not clear that a neural network would be better, but it is another tool that animators could use.” Wētā FX built a new facial system for Avatar: The Way of Water using a series of neural networks to manage the relationship between facial muscles and expressions. “The tools help capture sympathetic secondary motion in the face, so animators are able to connect their creative decisions more intuitively,” Letteri states. “It was slow going at first, but has proven to be a powerful tool for animators.” Special attention is paid to the training dataset. “When we train a neural network, we are careful to only use data that we own or have the right to use for a particular show,” Letteri notes. “We’re fortunate that we have a partnership with Unity where they are looking into how our technology can reach a wider group of artists outside of Wētā FX. They have announced AI projects in beta and are already doing interesting things in this space.” Cinesite has been talking a lot about various generative and machine learning applications and what they mean for the company. “Almost across the board it’s efficiency gains,” remarks Hank Driskill, Head of CG for Feature Animation at Cinesite. “There are certain tasks like rotoscoping, which is extreme manual labor and time-consuming. I admire the people who have the patience to do it, but that will go away in the coming years and be replaced with tools that can do it automatically. There is a whole layer of art design that is busy work, such as ‘Give me the artwork for what the trashcan in the corner looks like.’ You’re going to use generative AI tools to generate a thousand different trash cans and pick one.” The term “AI” is a misnomer. “AI isn’t intelligent,” Driskill notes. “It is a useful tool that is broadly applicable. There are all kinds of aspects of how we work that modern machine learning and AI tools are going to have a big impact on, but right now they’re not going to replace the minds that are at work there.” A prevailing question is whether AI will cause everyone to get stuck in the algorithm of creativity. “It runs that risk, especially because generative AI is only as good as the dataset that you trained it on. What is absolutely coming is an in-between program in the style of, let’s say, Glen Keane. But then you’re stuck in the style of Glen Keane. That isn’t the same as what you get today where artists can create any style of motion.” Generative AI tools have not been incorporated into the current workflow. “The concern with generative AI is that you don’t know where the reference came from or how it is being interpreted,” Driskill notes. “There are all of these legal challenges that are cropping up because like a lot of times with new technology, the law hasn’t caught up yet. The more that you’re building your own AI models and training it on material that you’ve created, then you’re golden.” Another legally sensitive issue is AI being utilized for vocal performances. “It is when they use AI to generate whole new lines, you get into the grey area fast. You’re now putting words in the actor’s mouth quite literally. That’s where it gets dangerous. Retooling a line is easy. I also see background characters being replaced by AI in the coming years, for sure.” AI will provide suggestions.

TOP TO BOTTOM: AI is confined to the dataset it has been trained on, which is not the same as artists who can create any style of motion for the characters in Teenage Mutant Ninja Turtles: Mutant Mayhem. (Image courtesy of Cinesite and Paramount Pictures) AI will assist in producing the in-between frames for characters that will speed up productions like Teenage Mutant Ninja Turtles: Mutant Mayhem. (Image courtesy of Cinesite and Paramount Pictures) AI will assist productions such as the Disney+ series Iwájú to create generic background props that are not central to the storytelling. (Image courtesy of Cinesite and Disney)
“Often when designing a character, you produce a series of model sheets about what this character looks like with different facial expressions and poses,” Driskill
states. “The artist’s job is often to hit and dial-in those shapes, a lot of keeping it on model and point. AI is going to help clean it up.” An area of animation that could be majorly improved is cinematography. “I suspect that it will move towards the live-action paradigm where you set up the lights, cameras, have your environment in place, then you bring your actors to perform. You can have AI tools saying that you’re moving the camera too fast through this section, a real camera couldn’t do that. AI is going to give you more ways to check your work and make sure the thing you’re doing is within the bounds you’re setting, except when you don’t want them to,” Driskill says. The quest for expanding revenue streams is constant. “People are always looking for the next revenue stream because theatrical has been struggling in recent years,” Driskill observes. “With the advent of streaming, a lot of folks aren’t going to movie theaters. However, people said the same thing when television arrived. They’re always looking for new revenue streams and new ways to produce content and put that content in front of people. When you generate a film, it’s partly about telling a story that means something to the world, but you also want to create experiences that people come out of wanting more. AI will help with producing more supplemental content. We have already talked about producing 60-second TikTok bits.” Technological efficiencies, such as AI, are helping to lower production costs. “With the advent of streaming and their appetite for content, we’re able to create content that can be meaningful to a subset of the population,” Driskill remarks. “That excites the hell out of me, for somebody who has been doing this for a long time. We’ve gotten to the point now where we can deliver content to a broad audience without it having to be sanitized for being four quadrants across the planet. 
There are a lot of things being produced now that are absolutely not for everyone, and that’s okay because there are other people where a movie like Nimona will change their lives.”
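The joint-coupling idea Letteri describes earlier in the piece (learning from example poses how related joints move together, so that keying one joint also adjusts its neighbors plausibly) can be illustrated with a toy sketch. Everything below is an assumption for illustration only: the 0.25 coupling ratio, the synthetic poses and the pose_from_elbow helper are invented, and Wētā's actual system uses neural networks over many facial and body channels, not a single least-squares coefficient.

```python
import numpy as np

# Toy sketch of "sympathetic" joint coupling: learn from example
# poses how much the shoulder tends to rotate when the elbow does,
# then reuse that mapping so keying the elbow also poses the shoulder.
rng = np.random.default_rng(0)

# Synthetic example poses (hypothetical data, in degrees):
# the shoulder follows the elbow at roughly a 0.25 ratio, plus noise.
elbow = rng.uniform(0.0, 120.0, size=200)
shoulder = 0.25 * elbow + rng.normal(0.0, 1.0, size=200)

# Fit the coupling coefficient with least squares.
coeff, *_ = np.linalg.lstsq(elbow[:, None], shoulder, rcond=None)
couple = float(coeff[0])

def pose_from_elbow(elbow_deg: float) -> dict:
    """Animator keys the elbow; the learned coupling fills in the shoulder."""
    return {"elbow": elbow_deg, "shoulder": couple * elbow_deg}

pose = pose_from_elbow(90.0)
```

A production rig would replace the single coefficient with a network trained on captured motion, covering many joints at once, but the principle is the same: adjusting one channel produces plausible motion in the related ones.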

TOP TO BOTTOM: AI tools will help to make sure the facial expressions and poses stay on model, like with Blazing Sam in Paws of Fury: The Legend of Hank. (Image courtesy of Cinesite and Paramount Pictures) AI can be utilized to provide design suggestions. Cinesite created the dragon for Season 3, Episode 4 of The Witcher. (Image courtesy of Cinesite and Netflix) AI and machine learning are fundamentally changing every type of technology, which will prove to be useful when constructing massive environments such as those found in Dungeons & Dragons: Honor Among Thieves. (Image courtesy of MPC and Paramount Pictures)



PROFILE

GAIA BUSSOLATI: CONNECTING HOLLYWOOD AND ITALIAN CINEMA – AND STAYING CURIOUS By OLIVER WEBB

Images courtesy of Gaia Bussolati and EDI. TOP: Gaia Bussolati, Partner and Visual Effects Supervisor, EDI Effetti Digitali Italiani. (Image courtesy of Anica Academy) OPPOSITE TOP: The Nevers is the biggest TV project Bussolati has worked on. OPPOSITE MIDDLE, LEFT TO RIGHT: Attending an Academy New Members event in Rome. Bussolati has been a member of the Academy of Motion Picture Arts & Sciences since 2019 and the Television Academy since 2021. Bussolati was a special guest of the 30th Edition of Romics, the international festival dedicated to comics, animation, cinema and games, at Fiera Roma in October 2023. At La Biennale di Venezia in 2021 speaking on gender equality and inclusivity in the film industry. Bussolati has worked on a number of acclaimed series, including American Gods, which aired on Starz.

Gaia Bussolati was born in Turin, Italy and followed an academic path focusing on ethology, art and mathematics. She spent her university years between Turin and Hannover, Germany, eventually graduating with honors in architectural design while showing strong interests in design, modeling, photography and scenography for theatre and cinema. “My journey was quite unusual,” Bussolati remarks. “My university studies lasted quite a while. In the meantime, I started working on the set of an Italian movie as a production design assistant, and I found the days were long and the hours very long. I was preparing the set, working on set and managing the next day’s set. That process was quite boring for me. One day, some VFX people came on set with a greenscreen. That seemed fun, and I realized that it looked more interesting than production design.” Bussolati continues, “I never studied 3D modeling, since in university they had the motto: ‘First you sail with sails, then you add a motor.’ So, after meeting VFX guys, I thought I could finish my university and at the same time have some experience in the field. I changed the title of my thesis to something more fun than ‘Use of Corrugated Sheets,’ and magically I found a professor who accepted my new project, ‘How to Design Unfeasible Spaces.’” Bussolati started collecting materials – books, interviews, movies – trying software and finding herself to be quite good with computers. In 2001, she got in contact with a new company, EDI Effetti Digitali Italiani. “I had a one-month stage [internship] – at the time we worked a lot – and it was extended to a six-month stage. After this I was a quick modeler, using a software called bStudio from Buf company. The two founders, Francesco Grisi and Pasquale Croce, had worked for Buf and had a partnership,” Bussolati notes. After working as a production design assistant and modeler, she steered into the role of VFX supervisor. Blade Runner was an early influence.
“The first time I saw it I was a child, and it was really touching and something new. I started digging into more science fiction. After that, a really interesting point was Fight Club. The founder of EDI worked on Fight Club, and that really made me question how it was done. The Matrix was also very influential. You think of ways of doing something and you can’t find any. At the beginning of my career, they were all projects where I’d love to have done that kind of magic.” Having worked extensively in the Italian market, Bussolati worked on The Proposal in 2009, which marked a shift towards working in Hollywood cinema. “The Proposal came through friends of my two bosses, the co-founders of EDI. Some friends of theirs said they had a shot left and they didn’t know how to do it. I had very little time to do it. It was really the first time I was measuring myself against the high quality of American movies, compared to the low quality and low-res of advertising at that time (we were used to working with PAL format in advertising). It was quite challenging. It was a really small scene, but it was our first official movie in the U.S.,” Bussolati says. Although Bussolati and the team at EDI provided work for The Proposal, their first substantial movie for the U.S. market came in 2015 with Fathers and Daughters. “The director, Gabriele Muccino, is a friend of ours. They needed the Twin Towers in the movie, so we reconstructed New York at day and night. We added the New York
skyline to Pittsburgh, as it was shot in Pittsburgh,” Bussolati notes. From there, Bussolati worked on a number of acclaimed series, including Mozart in the Jungle, A Series of Unfortunate Events and American Gods. Bussolati reflects, “Creatively speaking, one of our best experiences was for one scene of one episode of American Gods. It was for the final season. We were working long hours every day because we had to figure out in one month both creative and technical aspects. Despite the challenges, it was a great team that brought everybody very close together. We’re quite famous as a company for impossible missions.” Bussolati also worked on blockbusters Black Adam and Jungle Cruise. “For Black Adam, the shots were very well done. When I watched it, I realized how technically and aesthetically wonderful they are. Compared to other projects, I don’t have the same affection because that was really a team job. It wasn’t something I did on my own with my hands; it was more making people do them, giving technical and artistic indications and saying what I wanted done. However, it was a great result,” Bussolati observes. “Jungle Cruise was a nice project for us. It was the first time that we worked with Disney. It was a lot of work and a long process for us due to COVID. They stopped production because Jungle Cruise was made to advertise the [Disneyland] park for Jungle Cruise [the ride], and they couldn’t release the movie as the park hadn’t opened due to COVID. That was such an incredible job. It was bluescreens and CG swords, and a good example of our invisible work. That’s the downside of well-done VFX in a photorealistic context. Obviously, with science fiction movies this doesn’t happen because the work you do is more evident since you need to create things that don’t actually exist.” Another large-scale Hollywood production Bussolati worked on was James Mangold’s Ford v Ferrari. “We made a few shots and collaborated with [Production VFX Supervisor] Olivier Dumont,” she says. “The fun fact here is that we had the chance to be on set, myself and [VFX] Producer Rosario Barbera. We were in L.A. for some appointments with production companies; we called Olivier and he was on set and told us to join him. It was a great experience since the set was impressive for its size and technological equipment.” Despite the shift towards mainstream Hollywood cinema, Bussolati frequently continues to work in Italian cinema and goes back and forth between the two. “We continue to work in the Italian market because we are based in Italy and have lots of contacts here, so that comes naturally. We enjoy working on our best directors’ movies. We also have good ideas, so it’s important to have a base.

TOP TO BOTTOM: Although Bussolati did more delegating than crafting on Black Adam, she was extremely impressed with the overall high technical and aesthetic level of filmmaking. Bussolati holds particularly fond memories of her work on Terry Gilliam’s short film “The Wholly Family” (2011). Bussolati spent one week in Naples with Terry Gilliam working on the short film “The Wholly Family.” She describes that time as a “life-changing experience.”
Essentially, there are three good reasons for Italian projects: firstly, local network; secondly, if it’s a good project then
that’s a bonus; and lastly, it can lead to something else, such as working with [Italian director Gabriele] Muccino on international films, for example. We also like to work with new talents and others who are looking to help the Italian market grow. There is a younger generation that is trying to use VFX as an instrument of communication instead of managing just contingencies, plus the percentage of budget dedicated to VFX is rising also in Italy,” Bussolati explains. “The international market raises the stakes for sure because you have to be very precise,” Bussolati continues. “The quality is higher and the results are higher. I like the way they collaborate, which is more typical of international movies. I absolutely enjoy the variety of work, as I get bored very easily. Mainly, I try to work on one big project at a time, but many inputs coming from different movies give me the chance of cross-thinking, which is ideal to optimize results or simply to ignite new ideas.” Bussolati was nominated for an Emmy in 2021 for her work on the episode “Ignition” of The Nevers and was nominated again in 2023 for the episode “It’s a Good Day.” “The Nevers is the biggest project I worked on,” she adds. “It was six-plus-six episodes during COVID. They shut down the production, and we kept going over the effects. Johnny Han, the main HBO Visual Effects Supervisor, came over to Italy to get to know us, see the offices and how we work. At one point, Johnny called us and asked us to make the set extension, and it was a lot of work. It was mainly to reconstruct London in the Victorian era, but they shot in London and on stage. During COVID, they shot mainly in a theater and didn’t have enough people or distancing, so there was a lot of crowd replication. It was really complicated and took longer than we thought. We worked for one and a half years on that project.
We saw the work of the people that were working there as Johnny was finding the right people for every sort of work that had to be done in the series. Explosions for one company, set extension was down to us at EDI,

TOP LEFT TO RIGHT: Bussolati was nominated for an Emmy in 2021 for her work on the episode “Ignition” of The Nevers and was nominated again in 2023 for The Nevers episode “It’s a Good Day.” The first six episodes of The Nevers were more work for Bussolati and her team than the final six. For the last episode, they had to construct a bird’s-eye view of all of London and its suburbs. Bussolati also worked on the acclaimed Netflix show, A Series of Unfortunate Events. Although Bussolati and the EDI team provided work for The Proposal, their first substantial movie for the U.S. market came in 2015 with Fathers and Daughters. Black Adam was one of the biggest Hollywood blockbusters Bussolati and her team have worked on. Other large-scale productions were Jungle Cruise (2021) and Ford v Ferrari (2019).


TOP TO BOTTOM: Terry Gilliam’s short film “The Wholly Family” was shot in Naples. Bussolati was Visual Effects Supervisor for EDI on the 2016 film Gold starring Matthew McConaughey. Bussolati worked on 16 episodes of the Italian series A casa tutti bene (There’s No Place Like Home) (2016). Bussolati served as Visual Effects Supervisor on The Leisure Seeker starring Donald Sutherland and Helen Mirren as a runaway couple traveling in an old RV they call the “Leisure Seeker.”

and creatures for another one. Finally, we connected all of our work and there was a really big show. The first six episodes were a lot more work for us, in comparison to the final six. It was simpler afterwards. At that point we had many assets of London. For the last episode, we had to construct a bird’s-eye view of all London and suburbs, which was very tough. We didn’t really collaborate with other companies, but at the end we were sharing shots with other companies. So, we were crossing links and elements, which was quite interesting and made us feel part of a bigger world. That was fun.” Looking back over her career, Bussolati holds particularly fond memories of her work on Terry Gilliam’s short film “The Wholly Family” (2011). “We were in Naples and I was there with him and the production. Pastificio Garofalo was the main production company. I spent one week with Terry Gilliam, which was adorable. That experience in particular was life-changing for me because at that time I understood what type of person I wanted to be professionally and humanly. That means staying away from a super-achieving attitude and unfair aggressive behaviors in order to reach power and success.” Since starting her career at EDI, Bussolati notes how VFX tools and technology have developed. “The situation has changed a lot. In terms of time and effort, to achieve the same result there is a huge ‘zero time’ and ‘just a button’ change. But we don’t want the same result – ED [EasyDraw] files are bigger and heavier. Reality perception is also so improved in the viewer that many good movies now appear old or grotesque. There are now so many ways to reach the result, so many tools and software that I would say the effort is now increased, even though the computers are now very powerful and not that expensive anymore.
The new era of artificial intelligence is a window of new opportunities, and we are very curious – and also so scared – about the possibilities that AI brings.” Regarding being a woman in a male-dominated industry, Bussolati comments that her feeling is that equality was a clearer concept in the ’70s; nowadays, every effort to be equal is overcomplicated, over-named. “The industry surely remains male-dominated, but my perception is that everybody is now realizing that there are wider conversations that need to be tackled. I spent 23 years working in the VFX field, actually in the same company, EDI. Now I’m a partner of the company with Francesco Pepe, Head of VFX, Stefano Leoni, Head of Supervisors, and CEO Francesco Grisi. I’m a member of the Academy of Motion Picture Arts and Sciences, the Academy of Television Arts and Sciences and the Visual Effects Society. I also vote for European Film Awards and David di Donatello [Italian industry awards]. “I spend a lot of time watching movies, but I also stay curious on what are the quality levels artistically, aesthetically, technically, etc. With this work, I need to see, watch, observe a lot, come up with questions about everything and stimulate my fantasy in every way possible. I look for beauty in the world. I find enthusiasm in things big and small. To this point I really like a quote from Dorothy Parker: ‘The cure for boredom is curiosity. There is no cure for curiosity,’” Bussolati concludes.



COVER

LIBERATING THE WEALTH OF SURREAL VISUAL ARTISTRY OF POOR THINGS By CHRIS McGOWAN

Images courtesy of Searchlight Pictures. TOP: The VFX-enhanced reanimation of Bella Baxter (Emma Stone). OPPOSITE TOP: Cameras capture Bella performing a wild dance. Bella is a totally unconventional young woman whose experiences inspire her to stand for liberation and equality. (Photo: Atsushi Nishijima) OPPOSITE MIDDLE AND BOTTOM: An intricate level of detail was applied to the buildings, combining sets, color palette and VFX to create uniquely dramatic environments.

When Union VFX started talking to the Poor Things team in March 2021, they were instantly intrigued. “Projects like this don’t come around often,” says Simon Hughes, Creative Director/VFX Supervisor. Indeed, Greek director Yorgos Lanthimos’ award-winning film inhabits its own category. Poor Things is both a surrealist, sci-fi-edged fantasy and a darkly comic adventure. It starts out in a stylized Victorian-era London where Bella Baxter (Emma Stone) has been brought back to life by the brilliant, eccentric scientist Dr. Godwin Baxter (Willem Dafoe). In doing so, he equips her with an embryo’s brain in another of his mad experiments (he had already stitched together animals to create a strange menagerie of hybrids like a pig-dog and a duck-goat), and becomes her guardian and father figure. Bella evolves from a naïve woman-child to a totally unconventional young woman eager to learn about the world and explore her budding sexuality. She runs off with rakish lawyer Duncan Wedderburn (Mark Ruffalo) on a picaresque tour of Lisbon, Alexandria and Paris. Bella is free from the constraining prejudices of her time, and her experiences inspire her to stand for liberation and equality and to make the world a better place. The wildly imaginative plot is reflected in the audacious, striking visuals. “We really got to flex our creative muscles collaborating with the amazing creative team to produce such a stunning film. We had to up our game to meet the extremely high level of detail on the set designs and the expectations of a true master director at the peak of his artistic vision. Everyone involved is at the top of their game and pushed even further to create the truly unique world that is Poor Things,” Hughes comments.



“VFX are a key piece of the puzzle in making almost all films, and this project is a great example of how they can be part of getting such an imaginative vision onto the screen.”
—Simon Hughes, Creative Director/VFX Supervisor

The movie, which won the Golden Lion award at the Venice Film Festival, has affinities with Pygmalion (the education of an innocent, unrefined young woman) and The Island of Dr. Moreau (a mad scientist creating hybrid creatures), not to mention Terry Gilliam’s fantastical, dystopic black comedies. Searchlight Pictures is the distributor and Stone the co-producer of Poor Things, which has a Tony McNamara screenplay based on Alasdair Gray’s 1992 novel. The crew included Robbie Ryan (cinematography), Shona Heath and James Price (production design), Holly Waddington (costume design) and Gabor Kiszelly (Special Effects Supervisor). Union VFX handled the visual effects. Before Union’s first meeting with the filmmakers, Hughes recalls, “We were already coming up with ideas for the design, such as the use of inks and paints in water tanks and shooting slow-motion to create surrealistic liquid-sky effects that had cloud-like properties. We also looked to the art world for further inspiration and uncovered artists using these techniques in practice. This led to the production designers’ discovery of Chris Parks, whose liquid photography experiments were then incorporated into the skies.”


TOP: Bella (Emma Stone) gazes out at the port of Lisbon with zeppelins and colorful sky effects. The big reveal from the balcony was a 2.5D matte-painted extension of the model set. MIDDLE AND BOTTOM: The surreal ferry boat was a miniature build that was augmented and integrated with live-action plates and CG miniature-scale water and backed by a compelling sky on an LED wall.

He adds, “From the get-go, it was a very creatively collaborative project, and we were heavily involved throughout pre-production.” The shoot happened in the midst of the COVID pandemic restrictions, which required additional planning and logistics, but in August 2021 the visual effects team went to Origo Studios in Budapest for five weeks of pre-production and test shoots. “Principal photography started with the filming of the ship LED sequences at Origo and then moved to OneScrn Studio in Dunakeszi, Hungary, to shoot the miniature sequences. The shoot wrapped in December with Union providing on-set supervision throughout as required,” says Tallulah Baker, VFX Producer. The movie’s look was rooted in Surrealism and elaborate period dressing, with hyper-realistic embellishment. “We worked closely with Production Designers Shona Heath and James Price. Shona was at the center of the overall look and design. She has a particular inclination towards Surrealism, [and went] into a very intricate level of detail with particular attention to color palette and how it is used to create worlds. From the early stages of development, we collaborated with Shona to develop her ideas and help envisage how they could be translated into achievable VFX,” Hughes says. VFX played a part throughout in finishing the details. Dean Koonjul, DFX Supervisor, comments, “Each set had a significant VFX component, and merging our work into the overall design and palette of the sets was integral to building the worlds [of the film] – we had to maintain consistency with the original photography, as well as emulating and integrating the various film stocks used.” Some filming involved custom-made Ektachrome 35mm. Koonjul notes, “Yorgos and Robbie [Cinematographer Robbie Ryan] both come from the world of experimental photography and are keen explorers of different film stocks, formats and their usage in storytelling, art and design. 
The Ekta footage had a high contrast and saturation feel in comparison to the rest of the stocks used, which added to the look development of scenes such as the reanimation sequence and London Bridge.” Black-and-white and color film were both used, which posed another challenge. “We had to be mindful of our work – in particular around Baxter’s residence and the hybrid animals we created that feature in both black-and-white and color sections of the film,” says Koonjul. “We had to design the VFX to work across both formats.” Koonjul adds, “Working with and combining the various color stocks used on the project did present some challenges, and we needed to adjust our internal color pipeline for the more bespoke requirements of this film.” Ryan also used a number of extremely wide-angle fish-eye lenses, including 8mm and 4mm lenses, which helped create the surrealistic world of Poor Things. Hughes notes, “The use of these lenses can have both an immersive ‘goldfish bowl’ effect and a disorientating or alienating effect, as the perspective becomes so extreme and distorted from what is usually considered normal or real.” He adds, “These elements mirror Bella’s experience of the world in which she is contained and play into the overall feeling of almost living in a bubble. [This adds] to the almost psychedelic visual language used across the film.” Scenes of Baxter bringing Bella back to life “were shot with dynamic lighting and projections on set. We added ethereal,
fluid-like chromatic aberration and lighting effects across the glass above Bella’s face,” Hughes explains. This was intended to mirror the fluid, liquid language used across the production design. “We also added the electrical currents and Tesla arc effects seen traveling from the mechanics within the room to Bella and randomly throughout the room to create a sense of chaos,” Hughes says. Dr. Baxter’s residence involved an extensive set-build for the location. “So, most of what you see here was in-camera,” Koonjul notes. “However, we did enhance the environment where required with VFX, including sky replacements and background set extensions, not to mention the addition of some furry and feathery residents at the property.” To create those residents, “Yorgos was keen to try and find as much of an in-camera and 2D solution-based approach as possible to avoid the potential scrutiny and flaws of using full CGI – he wanted to embrace the random physical nuances of animal movements that are inherently difficult to capture in CG,” Hughes says. “We felt that if we embraced the unpredictable nature of the animals, we could find a way to literally stitch live-action elements of different animals together.” Tim Barter, On-Set VFX Supervisor, explains, “Our solution was to overshoot and come back with multiple takes and multiple animals, then test different combinations to see which animals and moments worked well when combined together. This started with a series of test shoots with an animal trainer.” When it came to creating hybrids, some proved more difficult than others due to a combination of their independent movements, the camera moves and distorted lenses. “There was a significant degree of rebuild, and some CG was used to help with the joins. 3D scans of the animals were used to help us align the different elements and create the textures and scarring where they join together. 
The scar designs were based on paint-over concepts that we showed Yorgos for approval beforehand,” Hughes says. “In this way, we were able to preserve the naturalistic movement of the real animals while still creating a more fantastical layer of ‘strangeness’ to them in keeping with the film’s tone. The animals presented some of the most significant challenges in the show, but we all felt that using real animals as much as possible produced a far more effective end result which was incredibly surreal and, at times, hilarious.” Union supplied LED screen footage of the skies and oceans for all the scenes on board the ship, which were developed in-house based on art department concepts, according to Barter. “For other scenes in Paris, Lisbon, London and Alexandria, static printed backdrops were used on set, which were later modified and enhanced in VFX to incorporate additional layers of detail and movement to the skies.” Hughes notes, “The choice to shoot LED proved a good one as the sets inherited their ‘natural’ lighting, but this was also challenging in some places as the LED footage needed [to be] recomposed to further augment ocean and sky movement – especially when sailing. We also had to address the inevitable fixes that come up when shooting this way: moiré, seams between screens, shooting offscreen due to the sizes achievable, etc.” There were various alterations of London, according to Koonjul. “We added CG zeppelins based on art department designs, as well as miniature scale firework displays that were generated using

TOP TWO: A phantasmagorical building with staircase, bathed in an orange hue, with the LED sea behind. Each set had a significant VFX component that was merged into the overall design and palette of the sets to create the worlds of the film. BOTTOM TWO: Combining a wind-swept sky of snowy clouds with fantasy architecture and a matte-painted backdrop to create a winter scene in the town square.


TOP TO BOTTOM: Dr. Godwin Baxter (Willem Dafoe), his deep scars, plated face and essential fluids. (Photo: Atsushi Nishijima) Dr. Godwin Baxter (Willem Dafoe) conducting a human experiment in his lab with Max (Ramy Youssef), Bella (Emma Stone) and Mrs. Prim (Vicki Pepperdine). (Photo: Atsushi Nishijima) Bella (Emma Stone) on deck of a ship with an LED wall displaying sea and sky. The use of inks and paints in water tanks and shooting slow motion created surrealistic liquid-sky effects with cloud-like properties.

Nuke particles and comp effects to maintain the feel and blend into the surreal miniature set builds used on the shoot.” The main bulk of the London work centered around Bella’s jump from the bridge, which involved combining miniature sets and miniature scale matte paintings with miniature scale water effects and smog. “We also added boats, chimney stacks and lights into the miniature set-builds used on the shoot, all of which had to maintain the scale and feel as though they were real effects used on set,” says Koonjul. “We worked on a number of sky replacements adding a stylized and surreal, textural movement to them.” The Lisbon and Alexandria scenes were filmed as studio shoots on interior sound stages. “The set-builds were designed around the idea of landing at their ports; they also needed to have very different feels to demonstrate the journey between them,” Hughes says. “The Lisbon architecture revolved around a central square leading to a dock and ultimately onto a balcony with a view onto the entirety of the port. For all these areas there were significant set-builds at Origo that we needed to extend up from, [such as] a miniature-scale cable car system, liquid skies, greenery and floral details, ocean extensions and CG ship masts rocking on the water and birds in the sky, all ultimately leading to the big reveal from the balcony which was a 2.5D matte-painted extension of the model set.” Hughes continues, “Alexandria and Paris had very similar challenges but very different sets. Alexandria involved a much heavier CG build component to achieve the big pull-back reveal of the combined set-build, miniature model and CG worlds as one, as we follow Bella down the stairs after witnessing the horrors of the slum below. 
There are many similarities in the challenges of Lisbon, Alexandria and Paris, but each location had several extra layers of variation to help differentiate them.” Hughes adds, “Paris involved set extensions up and out from the buildings around the brothel. We also needed to take the matte-painted backdrops out and bring them back with additional signs of life and a more defined Eiffel Tower under construction in the distance.” Another VFX highlight is Alfie’s mansion, which is “one of the standout examples of combining miniature set-builds with CG, matte painting, plates of the cast, birds, trees, colorful smokestacks, flags from masts, all of which needed to remain in our miniature surreal colorful world,” Hughes describes, adding, “There is a series of shots showing the surreal ferry boat traveling between locations. This again was a miniature build that we needed to augment and integrate with the live-action plates and CG miniature-scale water.” For Hughes, meeting the challenges on Poor Things created rewards beyond the project. “There were two fundamental challenges on the show: creating the hybrid animals – combining real animals and dynamic camera movements – and building a world that maintained a miniature surrealistic feel while also feeling photoreal. The latter is a contradiction in a way, making it incredibly difficult to know when it is working well and how far to go, but a really fun challenge that pushed us to set a new bar for the work we do at Union,” Hughes offers. “VFX are a key piece of the puzzle in making almost all films, and this project is a great example of how they can be part of getting such an imaginative vision onto the screen.”

INDUSTRY

2024 STATE OF THE VFX/ANIMATION INDUSTRY: FULL SPEED AHEAD By CHRIS McGOWAN

TOP: Denis Villeneuve’s Dune (2021) was nominated for 10 awards at the 94th Academy Awards, including Best Picture, winning six, including Best Visual Effects, and received numerous other accolades. Dune: Part Two carries that pedigree forward. DNEG was in charge of effects with an assist from Wētā Workshop. (Image courtesy of Warner Bros.) OPPOSITE TOP TO BOTTOM: 18th century France, the French Revolution and Revolutionary Wars are history brought to life in Ridley Scott’s Napoleon. ILM, MPC, BlueBolt, Freefolk, One of Us and Outpost VFX were among the VFX creators. (Image courtesy of Apple Studios and Columbia Pictures/Sony) Visual effects were instrumental in transporting audiences to Oklahoma of the early 1920s for Martin Scorsese’s Killers of the Flower Moon. ILM handled the visual effects. (Image courtesy of Paramount Pictures) A family of ducks takes the vacation of a lifetime in Migration, the latest attraction from Illumination Studios Paris, featuring the voices of Awkwafina, Danny DeVito, Keegan-Michael Key and Elizabeth Banks. (Image courtesy of Illumination Entertainment and Universal Pictures) Timothée Chalamet as Willy Wonka and Hugh Grant as an Oompa Loompa in the early adventures of Wonka, served up with a variety of visual delights by Framestore. (Image courtesy of Warner Bros.)

Technological advances in the VFX/animation industry show no sign of slowing down anytime soon, despite difficult-to-predict economic adjustments up or down in 2024. “We saw huge growth pre-2020 with a decade of Marvel and large-scale VFX productions,” according to Gaurav Gupta, FutureWorks CEO. Then came the rise of streaming services and a post-pandemic boom in content. “The arms race for content is real and has had an impact on the VFX industry too,” Gupta adds. However, with the strikes halting productions, “There was a contraction in this expansion,” Gupta notes. There was also belt-tightening among studios and streamers looking for greater profits. Irrespective of lulls and/or recoveries in film and TV production, the VFX/animation industry should continue to dazzle in 2024 with new developments in tools and techniques. Real-time technology, LED volumes, virtual production, facial recognition, volumetric capture, technical collaborations like OpenUSD and other innovations continue to move the craft forward. In addition, globalization, outsourcing, collaboration and remote workflows are also buoying the industry. And AI, nearly everyone’s favorite topic for debate, promises new efficiencies and creative possibilities for visual effects. Here are some thoughts from top executives about trends, breakthroughs and the business. Grant Miller, Partner & Executive VFX Supervisor, Ingenuity Studios AI really pushed into the mainstream VFX space [in 2023]. While there’s been apprehension around how the technology will be used, we’ve seen a lot of positives so far. For example, clients are using Midjourney to clearly articulate their vision. Software like Cascadeur is using AI to add believable secondary motion and physics to animations. AI-generated Neural Radiance Fields (NeRFs) offer novel ways to represent 3D scenes.

While we’re not quite there yet, AI also has the potential to remove many tedious tasks in the pipeline, affording artists more time to focus on quality and creativity. It will also enable ideas and workflows that would be nearly impossible without the help of machine learning. In the near future, we’ll be able to create robust depth and segmentation maps for moving footage, allowing artists to add fog to a forest or change the leaves to autumn. The industry has shifted from chemical compositing to digital, from stop-motion to computer animation and from matte paintings on glass to fully 3D set extensions. Technology has driven each of these transitions, and each has increased the use of VFX in the content we consume. We’re excited about what the future will hold as AI improves the tools we use in our craft, and I’m thankful to be working in the industry during such a transformative time. Fiona Walkinshaw, CEO, Film & Episodic, Framestore AI and machine learning are obviously topics that have everyone talking at the moment – in reality, in our industry we have been exploring the use of AI/ML for quite some time. We will continue to explore their use and application and how they might speed up compute time and processes that enable our artists to do what they do best: create immersive worlds and believable CG characters and creatures that require human artistry and experience to achieve. I think we’ll see the continued expansion of FPS, our pre-production services division, as more and more directors, showrunners and execs get to grips with how previs, techvis, virtual production tools and postvis can help them plan their shows and achieve their vision. Alongside this, we’re constantly improving and enhancing our in-house tech, like FUSE and our work with Unreal and FarSight, our proprietary VP ecosystem which encompasses virtual scouting, motion capture, virtual camera and on-set visualization. All of this speaks to the same broad aim: to deliver the

TOP TO BOTTOM: The underwater world of Atlantis is teeming with new life above and beneath the waves in Aquaman and the Lost Kingdom, thanks to the VFX of ILM, DNEG, Cinesite, MPC, Scanline VFX and Fractured FX, among other companies. (Image courtesy of Warner Bros.) Effects reflect the classy, glassy, ultra-modernistic underworld of John Wick: Chapter 4. Rodeo FX, One of Us, The Yard VFX, Pixomondo, Mavericks VFX and Crafty Apes VFX contributed to the effects. (Image courtesy of Lionsgate) The Exorcist: Believer proves there is still horror gold to be mined from the original 50-year-old, all-time head-turner, via horror connoisseurs Blumhouse Productions. (Image courtesy of Universal Pictures)

absolute best in VFX and animation, breathe life into everything we work on and reframe what directors and audiences alike can expect from the stories they see on screen. Jennie Zeiher, President, Rising Sun Pictures Early in 2023, we started seeing hype around Generative AI. The “sudden emergence” of Artificial Intelligence (I prefer machine learning) was years, decades in the making. Gen AI created widespread mania, as many of us were not prepared for it. VFX started to see new applications being promoted, from concept apps such as Midjourney and Stable Diffusion, through to animation, including Wonder Dynamics’ Wonder Studio that animates and composites a library of bipedal CG characters into live-action material. These applications allow us to dream of what can be achieved next using machine learning. We should see ML as adding more tools into our VFX toolkit and amplifying jobs; pushing the boundaries of what we can achieve in VFX with more complex challenges. The sudden emergence has been building slowly over time. The term “machine learning” was first popularized in 1959 by [computer scientist] Arthur Samuel, who created one of the first self-learning programs, a simple game of Checkers. RSP has been working away in ML for the past five years. We have made huge inroads in research and development, as well as integrating our ML team into production. Our REVIZE toolkit is evolving, having started with ML applications around face swapping and manipulation, through to the body and now performance. It is RSP’s mission to integrate our traditional VFX pipeline using ML to address our clients’ technical challenges, alongside the know-how of experienced VFX creative and technical teams, all while producing shots at scale. I look forward to what 2024 will bring in this space. Hitesh Shah, CEO and Founder, BOT VFX Fundamental VFX demand still looks strong as it has for the last few years based on three trends: (1) continuing growth in the
amount of content produced, (2) continuing growth of the share of overall content needing VFX services (i.e. a larger percentage of shots need some digital effects), and (3) pent-up demand from the 2023 strikes. While the streaming platform strategies evolve relative to subscriber growth versus profitability and paid-only versus ad-supported, generally speaking, the amount of content being produced is still fairly strong in the foreseeable future. [Key trend for 2024]: Greater application of “human in the loop” AI tools in VFX production processes. Joanne Smithies, Visualization Supervisor, The Third Floor Many would mention the continued advancement of real-time rendering technology and rightly so. However, as a previs supervisor, I have to highlight the advancement of virtual production tools. These tools are not only becoming more powerful but also more intuitive and portable. Directors, DoPs and VFX supervisors can now bring the CG elements of their film on a location scout or on set, giving them more creative control by enabling them to plan and visualize their shots with all the elements in hand. Addy Ghani, Vice President of Virtual Production, disguise In the next 12 months, virtual production workflows will become more streamlined and simplified than ever before. The stages themselves will be more flexible – that is, they will be easier to dismantle and rebuild in various shapes and sizes as the shot requires. There’ll also be an increase in content flexibility – creatives will be able to choose exactly what type of content to display on their LED volumes, based on parameters like their budget and scheduling restrictions. This can come from many sources: video plates to 2.5D (multiple video plates), AI generated content such as NeRFs, or real-time rendered 3D environments. For the most elaborate and demanding productions, fully realized 3D real-time environments will be built and customized on the fly. 
For more budget restricted shots and sequences, video plates, 2.5D and AI generated plates will be used frequently. For VFX artists, that means there will be more flexibility to use virtual production technology on a much wider variety of projects, whether that’s a Hollywood feature or an indie short. Douglas Green, Co-Founder and COO, DI4D Real-time game engines have broadened the creative possibilities available to directors. Not least, in the production of high-fidelity, realistic animated films such as those produced for the Netflix series Love, Death + Robots. The challenge that faces directors and studios is matching the visual quality that is now possible with equally realistic character animation. Performance capture meets this challenge head-on, driving facial and body movements with the skill and craft of a real actor’s performance. Through the use of digital doubles, characters can remain faithful to the exact likeness and performance of an actor. With the digital-double approach, it’s possible to create the immersion of cinema or theatre entirely in CG, all while preserving the artform of acting. Because each performance is always unique, it ensures a greater level of authenticity and audience engagement, which will be vital to studios and filmmakers in 2024.

TOP TO BOTTOM: Clarence is captivated by the power of the rising Messiah in comedy/adventure/drama The Book of Clarence and vows to carve his own divine path. (Photo: Moris Puccio. Courtesy of Legendary Entertainment and TriStar/Columbia/Sony Pictures) A huge scoreboard tracks the victors and losers of the 10th Hunger Games in The Hunger Games: The Ballad of Songbirds and Snakes. ILP, ILM, Outpost VFX and RISE Visual Effects led the VFX effort. (Image courtesy of Lionsgate) The biggest challenge for the VFX team on The Creator was creating a reality-based world that didn’t exist before but was still believable. ILM, Crafty Apes, Frontier VFX, Jellyfish Pictures, MARZ, Outpost VFX and Wētā Workshop paced the VFX world-building. (Image courtesy of 20th Century Studios)


TOP TO BOTTOM: Christopher Nolan’s Oppenheimer will resonate through the new year and beyond. DNEG masterminded the stunning effects for the Trinity nuclear test and Oppenheimer’s thought process. (Image courtesy of Universal Pictures) Major global VFX collaborators fuel Foundation. Season 2 of Apple TV+’s effects-heavy sci-fi series drew DNEG, Rodeo FX, Cinesite, BOT VFX, Mackevision, Scanline VFX, Outpost VFX, Framestore, ILP, MR. X and Image Engine, among others. (Image courtesy of BOT VFX and Apple TV+) Francis Lawrence’s Slumberland unleashes a flurry of top-line visual effects to capture a vast, shifting dreamscape traversed by outlaw dream spirit Jason Momoa. Leading the VFX charge were DNEG, BOT VFX, Rodeo FX, Incessant Rain Studios, Scanline VFX, Outpost VFX, Ghost VFX, Framestore and ILP. (Image courtesy of Netflix)

Lala Gavgavian, President & COO, Digital Domain [Digital Domain’s] journey has led us to create multiple proprietary tools to support and streamline the filmmaking process. These innovative tools and processes have propelled efficiency in various components of our pipeline. We strategically integrate cutting-edge AI and ML tools, allowing talent to spend more time on the creative and technological components toward the final pixel of a shot. This collaborative ecosystem fosters growth, enriches our business and increases productivity. With our unique resources, we drive progress, support collaboration, and unify teams across global geographic locations, advancing the company and industry. Eamonn Butler, Head of Animation, Cinesite Recently, the combination of 2D and 3D animation has become one of the most popular animation trends; this will continue in 2024. We’ve seen this at Cinesite with 2D and 3D being combined in Space Jam: A New Legacy, Teenage Mutant Ninja Turtles: Mutant Mayhem and the upcoming Smurfs movie. Blending a variety of artistic mediums seamlessly into one animated video is a steadily increasing trend and allows us to reflect on how different styles of animation can challenge and complement each other creatively to produce innovative animation trends in the future. Animation has been unable to escape the era of the revival, with nostalgic and retro animation creeping its way into popular animation trends in recent years. When working on Teenage Mutant Ninja Turtles: Mutant Mayhem, we applied film grain, noise, lens flares and dust on the lens to a lot of our shots and sequences. A perfected blend of retro-futuristic visuals and a collective longing for yesteryear will almost certainly maintain retro animation as one of the most popular animation trends for 2024. [Regarding NPR], most of the filmmakers we work with are looking for something “different” with regard to the look of CG-animated films. 
There’s been a huge interest in projects utilizing artistic looks with films such as TMNT and Spider-Verse. It’s an exciting time for artists looking to push the boundaries and capabilities of CGI. AI will continue to be a hot topic in 2024. In broad terms, we can say that innovation is happening at breakneck speed in these areas, and now more than ever it is difficult to predict what the long-term future will look like. I think it is fair to say that for the foreseeable future, high-end filmmakers will continue to depend on services like the ones Cinesite provides to produce high-quality content. On the very low end of the spectrum (YouTube, TikTok, Twitter, etc.), generative AI tools will make VFX and animation accessible to a much broader audience, but at a quality bar that will not be suited for our clients. At the moment, there are a few issues around using AI. In particular, there is the potential for disputes over rights and ownership. This will need to get ironed out before any studio takes a risk with tools that effectively mimic other artworks. I suspect that AI technology will take many years of development before it becomes a cheaper and better alternative to current production techniques. I believe you will always need artists to drive it, and it will only be as good as the vision set by a director.

Gaurav Gupta, CEO, FutureWorks Together, we need to create better working conditions and a more sustainable model for both artists and studios. I feel it’s time not just to look at growth but understand how the VFX industry can work more collaboratively to achieve a healthier and more profitable way of working. New tools and technology like AI will only help us to create more compelling visuals. Whether in terms of creating visuals or writing code, it will simply democratize and offer greater access to technology. This creates a more competitive and healthier environment. 2024 will be a very exciting year for the use of AI across the entire content production chain. It impacts everything from script to visual ideation, to VFX production, editing, sound. We’re going to see AI being used everywhere. This is going to be an exciting shift that will democratize tools and access to storytelling, as well as creating a healthier and more competitive environment for content creators at all levels. It will also be a big year for virtual production – just look at how this has developed since 2020 with The Mandalorian. 2024 will see the adoption at a much larger scale as the entire ecosystem, from camera manufacturers to other technology purveyors like Epic and Unity, has made virtual production more accessible. Another exciting new development will be the launch of Apple’s Vision Pro. I can’t wait to see how storytelling and new content experiences evolve in this new spatial computing format. The wealth of content will only continue to grow. Rob Hifle, CEO and Creative Director, Lux Aeterna VFX 2023 [was] an incredibly exciting time in the VFX industry. At Lux Aeterna, we have seen continued demand for unique visual experiences in documentary, drama and episodic, requiring high-end environmental and abstract simulations. The most significant VFX industry trend in 2023 [was] the increasing integration of artificial intelligence and machine learning into the creative process. 
This year, we received a grant from Digital Catapult to pursue R&D in AI toolsets for VFX pipelines with the support of NVIDIA. AI-powered tools are revolutionizing how we work with regard to our VFX pipelines. There will be a huge surge in usage demand for these technological advancements as commercially licensable options become available. The latest advancements in technology are opening up new creative possibilities. VFX artists are exploring novel aesthetic directions and experimental approaches as tools and techniques evolve. These unique approaches are challenging traditional pipelines, and 2024 will be an exciting time in reapproaching how we work with clients. VFX studios will be more involved and integrated at the inception stage, using their insights into new technologies and ways of working in order to maximize ambition and reduce risk. Technological advancements have been progressing at a phenomenal rate. The adoption of real-time technologies like Unreal Engine for virtual production applications and beyond is something we expect to see continue into 2024. Despite some of the challenges faced in this area, both the offerings available and

TOP TO BOTTOM: Striking visuals are one of the hallmarks of The Witcher series, which has deployed the VFX talents of Framestore, Cinesite, One of Us, Nviz, NetFX, ILM, MR. X, Rodeo FX and Clear Angle Studios. (Image courtesy of Cinesite and Netflix) Teenage Mutant Ninja Turtles: Mutant Mayhem combines 2D and 3D animation to revitalize the iconic shell-based characters. Cinesite Vancouver and Montreal contributed to the effects, which were overseen by Mikros Animation. (Image courtesy of Cinesite and Paramount) The car-cloaking in The Peripheral forecasts how in-camera effects will stimulate a further expansion of virtual production in 2024. BlueBolt handled effects. (Image courtesy of FutureWorks and Amazon Prime Video)

TOP TO BOTTOM: Two worlds collide in Netflix’s Our Universe. Contributing to the VFX was Lux Aeterna VFX. (Image courtesy of Lux Aeterna VFX and Netflix) Pixar, who created Elemental, made its USD (Universal Scene Description) open source to simplify working on 3D assets across content creation tools and multiple technology platforms. (Image courtesy of Disney/Pixar) DreamWorks Animation’s Puss in Boots: The Last Wish won Outstanding Effects Simulations in an Animated Feature at the 2023 VES Awards and was nominated for Best Animated Feature Film at the 95th Academy Awards and 2023 BAFTA Awards. (Image courtesy of DreamWorks Animation and Universal Pictures)

potential rewards for studios are very enticing. This accessibility and democratization of the tools has hugely contributed to the widespread adoption of these techniques, and I can only see this expanding into 2024. Continued advancements in real-time rendering technology will lead to a significant shift in how VFX are created and integrated into projects. Real-time rendering will become more prevalent for both previsualization and final output, reducing the need for lengthy rendering times. These more sustainable practices will have significant reductions in energy costs leading to a greener VFX industry. Neishaw Ali, Founding Partner, President and Executive Producer, Spin VFX As the ‘hybrid’ work environment settles in 2024, more studios will either keep their current office footprint or reduce it. Therefore, instead of purchasing more computers, they may look to spinning up workstations in the cloud when they need it, as they can match the specific needs of the department and production without the delivery lead times and work with the latest and greatest systems. Cloud render – this service will be widely used, but is still costly. The hope is that more providers will come on-line and provide competitive pricing, which will allow studios to utilize the cloud render to service the work on a “just in time” basis. Cloud storage is going to become even more important as studios work with teams around the world in this hybrid environment. The need for speed in data accessibility whether online or offline could make a difference in delivery of a show. [Regarding AI/ML,] VFX studios are looking at smart ways to become more efficient, and I believe that they will look at automating 70% to 80% of certain tasks, such as tracking and roto using commercially available machine learning tools. This will be a great benefit to artists as they don’t need to spend time on such tedious work, but can now focus on refining and beautifying the shot. 
Advancements in machine learning-based denoising will allow for faster and cheaper rendering with no loss in quality. This technology has been evolving for years but is now at the point where the quality and fine detail are easily preserved. Machine learning-based up-resing algorithms are opening up the possibility of rendering at lower resolutions and then seamlessly up-resing the imagery to increase speed and efficiency. Generative AI has the potential to be a powerful creative tool for helping artists and designers quickly create high-quality concept art. What the future holds with technology is difficult to predict, but it will certainly be fast-moving and consequential. Real-time rendering – the breakthroughs in real-time raytracing continue. Hardware and software have matured to the point where the large datasets required for some photoreal VFX work can be raytraced in real-time on a single GPU. Michael Ford, CTO, Sony Pictures Imageworks In terms of the state of the craft, it’s an amazing time to be in VFX and animation. The creative aspirations of the projects are increasingly ambitious. Filmmakers and artists are collaborating to create remarkable films. Audiences are responding, as evidenced in both the box office and streaming success of VFX-driven and animated content. On the technology side, we have seen advances in filmmaking and collaboration across the industry. Open source technologies continue to help shape a future where both studios and software vendors can depend



INDUSTRY

TOP TO BOTTOM: Rising Sun Pictures, which helped create Doctor Quadpaw in Andor, worked on the series’ VFX alongside ILM, Hybride, Scanline VFX, Midas VFX, The Third Floor, Whiskeytree, Blind, Clear Angle Studios and Soho VFX, among others. (Image courtesy of Rising Sun Pictures, Lucasfilm Ltd. and Disney+) Rising Sun Pictures helped develop the Quantum Realm in Ant-Man and the Wasp: Quantumania, which also enlisted the VFX talents of ILM, Digital Domain, Sony Pictures Imageworks, MPC, Atomic Arts, Spin VFX, MARZ and several others. (Image courtesy of Rising Sun Pictures, Marvel Studios and Walt Disney Studios) Sony Pictures Imageworks was one of many studios contributing VFX to Guardians of the Galaxy Vol. 3, along with Framestore, Wētā FX, ILM, RISE Visual Effects Studios, The Third Floor and a galaxy of others. (Image courtesy of Marvel Studios)

on open standards that work easily and effectively across software platforms and studio pipelines. With USD, MaterialX, OpenColorIO, Open Shading Language and countless other tools, we are moving closer and closer to having a unified toolkit of fundamental building blocks that will benefit the entire VFX and animation ecosystem. There are also a lot of exciting developments in the area of the Open Review Initiative, led by Autodesk, DNEG and our team at Imageworks. The idea of creating an open source platform for reviewing media and material for our industry is a significant development that would allow creatives to see their amazing work across compatible viewing platforms. The Academy Software Foundation (ASWF) and the newly-formed Alliance for OpenUSD (AOUSD) are leading these important and industry-changing developments. It is an exciting time.

AI and machine learning will also continue to play a significant role in the filmmaking process. With the rate of change in this area of technology at its highest level ever, we are going to start seeing studios implement toolsets that allow artists to accelerate workflows and deliver step-function improvements in certain areas of the production pipeline.

Steve May, Chief Technology Officer, Pixar Animation Studios and Chairperson of the Alliance for OpenUSD

Universal Scene Description (USD) was invented at Pixar and is the technological foundation of our state-of-the-art animation pipeline. OpenUSD is based on years of research and application in Pixar filmmaking. We open-sourced the project in 2016, and the influence of OpenUSD now expands beyond film, visual effects and animation into other industries that increasingly rely on 3D data for media interchange. Pixar created USD to make animated movies that have incredibly complex 3D scenes comprising millions or even billions of individual objects.
Creating such scenes involves complex artist workflows and many 3D content software tools. Historically, those 3D tools all used different data and file formats. Pixar wanted to enable more powerful creative expression for its artists by streamlining their workflows and allowing the same data to be interchanged by all of the content creation tools. So, in 2016, Pixar decided to share USD and make it open source, allowing it to be incorporated into software across the industry. Fast forward to today, and USD now simplifies working on 3D assets across multiple technology platforms; where this used to be very complex, USD makes it simple.

Recently, Pixar, alongside NVIDIA, Apple, Adobe and Autodesk, announced the Alliance for OpenUSD (AOUSD) (www.aousd.org). Many companies consider a standard for 3D data and content critical to their business operations and growth, and AOUSD is open for new members to join. AOUSD is a big deal. We are on a journey with the evolution of 3D technology, and OpenUSD is at the heart of it. If we’re successful, it will become the fundamental building block on which all 3D content will be created. Whether it’s immersive 3D content, new spatial computing platforms, or scientific and industrial applications – they will all rely on OpenUSD, and the Alliance for OpenUSD is the next step on the path to realizing that potential.
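The interchange idea May describes is visible in USD’s human-readable .usda format. As a minimal, hand-written sketch (the file and prim names here are hypothetical), one layer can reference geometry authored in a different tool and layer its own overrides on top non-destructively:

```usda
#usda 1.0
(
    defaultPrim = "World"
)

def Xform "World"
{
    # Bring in an asset authored elsewhere, e.g. in another DCC tool.
    def "Robot" (
        references = @robot_model.usda@</Robot>
    )
    {
        # Opinions in this layer override the referenced asset.
        double3 xformOp:translate = (0, 0, 5)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because each application composes shared layers and contributes only its own “opinions” on top, the same scene data can move between tools without lossy format conversions.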




EXPLORING HARDWARE AND SOFTWARE TRENDS IN 2024 By JIM McCULLAUGH

TOP: NVIDIA Real-Time Ray Tracing enables production-quality rendering and cinematic frame rates. With NVIDIA Turing, artists can use ray-traced rendering to efficiently create photorealistic objects and environments with physically accurate lighting. (Image courtesy of NVIDIA) OPPOSITE TOP: NVIDIA Picasso is a cloud service for building and deploying generative AI-powered image, video and 3D applications with advanced text-to-image, text-to-video and text-to-3D capabilities to enhance productivity for creativity, design and digital simulation through simple cloud APIs. (Images courtesy of NVIDIA) OPPOSITE MIDDLE AND BOTTOM: At last year’s SIGGRAPH, Vicon showcased a six-person markerless and multi-modal real-time solve set against Dreamscape’s location-based virtual reality adventure “The Clockwork Forest,” created in partnership with Audemars Piguet, the Swiss haute horology manufacturer. (Images courtesy of Vicon)

The VFX industry in 2024 will be shaped by a convergence of advanced hardware and software innovations. From quantum computing and AI-accelerated hardware to real-time ray tracing and AI-driven automation, the tools at the disposal of VFX artists and studios are poised to create visuals that were once deemed impossible. With these trends, we can expect a new era of creativity and realism, redefining the boundaries of what can be achieved in the world of visual effects. Here, a cross-section of technology company thought leaders forecast the new year’s progress.

Rick Champagne, Director, Global Media & Entertainment Industry Marketing and Strategy, NVIDIA

The explosion of AI has taken the media and entertainment industry by storm with a myriad of tools to accelerate production and enable new creative options. From initial concepts and content creation to final production and distribution, AI plays a crucial role every step of the way. Artists are tapping into AI to bring their early concept drawings to life in 3D – or even in fully animated sequences. AI and advanced technology like NVIDIA GPUs can help virtual art departments create camera-ready environments in a fraction of the time, while enabling more creative options live on set. Capabilities like removing objects, changing backgrounds or foregrounds, or simply up-resing content in real-time for LED volumes are simpler than ever.

AI is also helping studios dynamically curate and customize content tailored for individual consumers. This new era of personalization will open up new revenue streams to fuel the industry’s future growth. Customizable generative AI will enable artists, filmmakers



and content creators to scale and monetize their unique styles and produce more diverse outputs for broader consumption. For example, speech AI gives people the ability to converse with devices, machines and computers to simplify and augment their lives. It makes consuming content in any language much easier, providing advanced captioning, translation and dubbing while also enabling the use of voice for more natural interaction in immersive experiences. Enterprises and creators can also take their generative AI to the next level with NVIDIA Picasso to run optimized inference on their models, train state-of-the-art generative models on proprietary data or start from pretrained models to generate images, video and 3D content from text or image prompts. Large language models, or LLMs, can be trained with NVIDIA NeMo using data from game lore, comics, books, television shows and cinematic universes to create new opportunities for audiences to engage with content and characters.

Nearly three-fourths of M&E leaders believe AI will be crucial to staying competitive over the next five years. The new era of AI has inspired studios to create new departments to develop game-changing tools using computer vision, machine learning and foundation models. With NVIDIA AI Enterprise, businesses can streamline the development and deployment of production-ready generative AI, computer vision, speech AI and more. To overcome rising production costs and decreasing budgets, studios are now planning their AI strategies. The upcoming year will reveal the steps they’re considering to move faster and more efficiently.

Mac Moore, Head of Media & Entertainment, CoreWeave

2023 will certainly go down as the year of gen AI across industries; however, in 2024 we’ll start to see its implications.
Through experimentation, it has become clear that training foundational models and running gen AI applications require a ton of compute, increasing competition for hardware that’s already relatively scarce. Looking forward, I also expect real-time game engines will continue gaining traction in production, as they allow studios to leverage the same environment for previz as for their production renders. Like real-time, AI/ML promises to increase efficiency through automation, reducing redundant and time-consuming tasks in the VFX pipeline. Cloud computing will be the backbone of the AI/ML expansion, as it can be used to train and run models without the financial and logistical challenges of purchasing infrastructure.

David “Ed” Edwards, VFX Product Manager, Vicon

We’ll continue to see AI impact motion capture and VFX in new and improved ways in 2024 and beyond. One of the most notable of those is the extent to which it’s going to broaden both the application and the core user base. Every approach to motion capture has its respective strengths and shortcomings. What we’ve seen from VFX to date – and certainly during the proliferation of virtual production – is that technical accessibility and suitability to collaboration are driving forces in adoption. AI solutions are showing a great deal of promise in this respect.

Matthew Allard, Director, Alliances & Solutions, Dell Technologies

More and more commercial notebook users are transitioning to




mobile and fixed workstations to leverage the scalability, performance and reliability of workstations. Specifically, those transitioning are looking for more performance and advanced features from their hardware to run specialized business software and applications, handle heavy office usage, and execute complex tasks and workloads. As a result of this increased need for higher-performance systems, the technology behind workstations and workstation performance continues to advance as well. This is leading to two major changes: workstations offering CPUs with increasing core counts, and more options for multiple GPUs in one workstation (up to four GPUs). In addition, workstations offer a very efficient and cost-effective option to run workloads locally. Particularly with the growth of AI, workstations are a valuable option for running AI workloads before you need to deploy at scale.

Addy Ghani, Vice President of Virtual Production, disguise

The hardware and software trends in media and entertainment may surprise us in the coming years. Going forward, we are seeing evolving trends in areas of decentralization and an ever-increasing reliance on high-speed, low-latency networking

TOP AND BOTTOM: Zero Density’s Traxis Talent Tracking and Camera Tracking systems. The Traxis platform is a turnkey solution for high-precision tracking on virtual productions, offering photoreal virtual graphics while eliminating long calibration times, distracting markers and latency errors. (Images courtesy of Zero Density)

infrastructure. One of the areas where we may see a huge uptick is in real-time graphics performance in near-cloud infrastructure. As more AI SaaS services emerge, the rise of powerful GPUs, and even specialized AI cores, will need to be accessed almost instantaneously – or as close to it as possible.

Jeremy Smith, Chief Technologist, HP Inc.

The amount of data that VFX professionals have to work with these days is staggering. When it comes to looking into the crystal ball, it’s figuring out how artists and studios are going to work with massive volumes of data, the compute power required to process that data, and how all of that data is going to be managed and stored. All of the exciting creative trends in television and film, like virtual production, AI-augmented workflows and the continuous demand for higher-quality content, are driving this data boom. So even as computation time and power are shooting up dramatically, the time to create content hasn’t actually changed that much, because the quality bar keeps increasing at an almost exponential rate. In order to maximize processing power, we’re seeing developers embrace both the GPU and the CPU in workstations like the HP Z8 Fury, enabling artists to iterate much faster than ever before. We’re also seeing that organizations in VFX continue to operate with teams in a remote environment, leveraging talent pools around the world. All of these forces are driving a massive digital transformation across the industry, where studios are looking to technology developers to help devise smarter ways to handle workflow infrastructure, distributed teams, a massive influx of data – and innovative ways to store and manage that data.

Tram Le-Jones, Vice President of Solutions, Backlight

2023 has been a tumultuous year, but there are many reasons to look forward to an exciting and innovation-filled 2024.
This year won’t just be about new cutting-edge tools that will redefine what’s possible in VFX; I’m also anticipating a transformation in how we consider the way in which we work. Realizing a more efficient and interconnected future won’t hinge on a single change. Perhaps the most significant objective the industry can have in 2024 will be fostering more collaboration and bringing people together both in VFX and across the entire production pipeline. Workers want to feel connected in the production process – they want to know what’s happening upstream, how that information translates downstream, and how we keep it all connected so teams are referencing the same information efficiently. As we brace for a deluge of work in 2024, the industry will need to optimize for both efficiency and impact. That will start with taking a hard look at our processes and revisiting workflows, production management and especially asset management. Given the sheer volume of data and media we expect to handle, the demand for intuitive, scalable solutions will skyrocket. I believe this need will catalyze a surge in collaboration across the whole production ecosystem, with more software providers partnering up to provide solutions that address entire workflows. Just as we’ve witnessed with virtual production, I anticipate more departments will want to go from script to screen






TOP TWO: Arcturus is the go-to-market partner for Microsoft’s Mixed Reality Capture Studios (MRCS) technology, offering a complete photorealistic end-to-end solution, from capture to distribution, featuring its advanced 3D capture and reconstruction systems paired with its Holosuite editing and streaming tool. (Images courtesy of Arcturus) BOTTOM TWO: HP Z8 Fury G5 Workstation Desktop PC features single-socket technology that delivers high-level performance with up to 56 cores in a single CPU and up to 4 high-end GPUs to handle complex deep learning, virtual production and VFX. (Images courtesy of HP)

on one production in a more collaborative, flexible way than ever before.

James Knight, Global Director, Media & Entertainment/Visual Effects, AMD

Those mocap volumes are not going anywhere – they’re here to stay! Television is really starting to embrace virtual production en masse, and motion capture falls under that. The use of virtual cameras by directors, directors of photography and artists to walk around, lens and film environments as if they were really there – I see that becoming even more ubiquitous than it is. I see more pros in film and TV being more curious and discovering that process. I anticipate more professional discovery and greater understanding of the power of utilizing the volume along with motion capture and virtual production, and what that can do for pipelines that didn’t previously involve VP – streamlining the traditional pillars of post-production, production and pre-production into one iterative process.

Eric Bourque, Vice President of Content Creation, Media & Entertainment, Autodesk

While it’s impossible to predict exactly what’s around the corner, we know that our VFX and animation customers are constantly looking for ways to work more efficiently. This has been a driving force behind so much innovation in media and entertainment at Autodesk, including a push toward cloud-driven production, collaborating cross-industry on open source initiatives and exploring the promise of AI for accelerating artist workflows. Data interoperability poses massive bottlenecks for VFX studios working with distributed teams around the world, and we are investing in open source efforts to tackle these complex challenges. Most recently we came together with Pixar, Adobe, Apple and NVIDIA in the formation of the Alliance for OpenUSD. We are also working with Adobe to help drive a new open shading model with OpenPBR, moving toward a reality where files can move seamlessly from one digital content creation system to another.
AI tools in some form or another have played a role in boosting VFX workflows for many years now, and we also see generative AI taking things one step further. For instance, we are integrating AI services developed using NVIDIA Picasso, a foundry for building generative AI models, into Maya, and are also teaming up with Wonder Dynamics to deliver an integration between Maya and Wonder Studio – a tool that harnesses the power of AI for character-driven VFX workflows.

Shawn Frayne, CEO & Co-Founder, Looking Glass Factory

Over the next months – not years or decades – I believe folks will find themselves chatting with AI-powered holograms on a daily basis in stores, theaters, places like the airport or stadium, our offices and eventually in our homes. We see this happening already with a handful of brands we work with, and I think that’s the beginning of something much bigger that’s about to sweep the globe. Not long after that happens, I think a lot of us will find we’re also chatting with each other (by which I mean, fellow humans) in 3D through holographic terminals like the Looking Glass – first in hybrid office setups, but eventually also in our homes – just like what was demonstrated at the recent NVIDIA SIGGRAPH demo.



So, my hunch is the shift from the 2D phones and laptops we use today to the spatial interfaces of tomorrow will accelerate dramatically over the next few months, powered by conversational AI characters and AI-powered holographic communication.

Christopher Nichols, Director, Chaos Group Labs

Advances in general-purpose GPU computing (GPGPU) and AI tools: The demand for AI tools will continue to explode, not only externally but internally within organizations. This desire is also going to raise demand for GPUs that can handle the AI training models that organizations will be creating. However, in terms of M&E/VFX, the vast majority of tools won’t necessarily be based around image generation. Instead, we’re likely going to see things like smarter denoising, animation accelerants (auto-rigging, auto-blendshapes) and motion capture workflows that start with a smartphone.

Andrew Sinagra, Co-Founder, NIM Labs

As companies look for ways to create efficiencies across global productions, migrating more services and systems to a cloud-based infrastructure is key. We’re seeing businesses and solutions that facilitate the hybrid (on-prem and cloud) model, as well as the full studio-in-the-cloud model, rise to the surface of conversations. Over the past several years, we’ve recognized the challenges of a decentralized business model, and solutions to these issues will be a top priority in 2024.

Dade Orgeron, Vice President of 3D Innovation, Shutterstock

We’re already seeing impressive AI applications for motion capture, character replacement and environment generation, but what’s most exciting are the tools that will break down the barriers to entry into the 3D industry, opening the doors for more creators while simultaneously reducing creation timelines from days to minutes.

Ofir Benovici, CEO, Zero Density

After helping to produce two million hours’ worth of content, we are seeing increasing pressure to adopt open standards.
3D platforms need to work harmoniously together and be compatible with industry protocols. Anything else just makes workflows unnecessarily slow and painful to set up.

Kamal Mistry, CEO, Arcturus

In the next year, we expect to see virtual production continue to accelerate, and hopefully solve a fairly significant problem: adding real people to virtual backgrounds. One of the best aspects of virtual production is that it gives performers the opportunity to fully immerse themselves in the world where their story takes place. Some of the most impressive examples are sci-fi hits like The Mandalorian and Star Trek: Strange New Worlds, which feature imaginative alien worlds. In most cases, though, those worlds are depicted as deserted landscapes, devoid of actual people. Right now, the best way to solve that is for artists to create CG digidoubles, a costly and time-consuming process. There are also limits to how realistic the CG characters are – which is why you often see them in the far background. In the coming year, volumetric video will play a prominent role in the creation of virtual humans for virtual production. Productions can simply record a real performer in costume

TOP TO BOTTOM: ftrack Review’s all-in-one, free-flowing media review interface accelerates the project review and approval process, whether offline or in full interactive sync. (Image courtesy of ftrack) ftrack Studio VFX production management software with cloud-based interface enhances speed and efficiency in planning projects, planning shots, tracking work, creating reports, reviewing work in sync and managing a team. (Image courtesy of ftrack) New-generation cineSync 5 streamlines the review process, supporting more collaborative reviews, both local and remote, while still meeting the highest requirements across security and performance. (Onscreen is a scene from Season 5 of The Expanse.) (Image courtesy of Expanding Universe Productions, LLC and ftrack)




TOP: Virtual production technology and services provider, 209 group, relies on Brompton Technology for custom virtual production LED volume builds for studios and production companies. BOTTOM: ZEISS CinCraft Scenario provides camera tracking data for use in real-time rendering engines for virtual production or live compositing. It benefits productions that include VFX by recording the tracking data on set to then be used in post-production to optimize an artist’s workflow. (Image courtesy of ZEISS)

carrying out an assigned task, then add them to a virtual production scene. We’ve seen that in films like the Whitney Houston biopic, the latest Matrix, comic book movies and more, but there have been limits due to how much data is needed to display each volumetric character. That is changing, though, and volumetric video developers – including Arcturus – have found new ways to make volumetric recordings lightweight, meaning content creators can add hundreds (if not more) of real performances into virtual production backgrounds as needed. For creators, it means they have a new way to assign relightable 3D assets in a traditional 2D pipeline, leading to better crowd scenes, new VFX options and a better way to create 3D compositions. For audiences, that means a better viewing experience, and that’s really what it’s all about.

Gretchen Libby, Director, Visual Computing, Amazon Web Services (AWS)

In 2024, cloud computing will continue to serve a vital role in VFX and animation workflows, supporting the increasing pace and scale as well as a growing global workforce. As more studios enable their artists to work in the cloud, they will need robust cost reporting and management tools, as well as data management across multiple locations. The cloud provides tremendous scalability and flexibility, but as demand increases in 2024, studios will need to optimize their costs to match their budgets. Interoperability also remains a top priority in content creation workflows.

Christy Anzelmo, Chief Product Officer, Foundry

As the industry continues to evolve in response to new challenges, the drive to deliver high-quality VFX with even greater efficiency continues, and building efficient and flexible pipelines where skilled artists are empowered by technology has never been more essential for any VFX project or business.
It’s exciting to see the industry’s continued progression toward leveraging USD to enable interop between applications and departments without compromising on performance or scale. As we continue to embed USD further within Nuke, Katana, Mari and other products, Foundry is looking forward to participating in the recently announced Alliance for OpenUSD and supporting the path towards standardization.

Adrian Jeakins, Director of Engineering, Brompton Technology

LED products designed specifically for virtual production are what we’ll see emerging more and more as manufacturers better understand the unique requirements of LED for in-camera visual effects (ICVFX). One of the first and most obvious of these is LED panels that give better color rendering by using extra emitters to improve spectral quality. A challenge here for panel manufacturers is sourcing LED packages with extra emitters, which are not widely available. As a result, we’ll see coarse panels first, which will be really good for ceilings in a volume, and then finer panels. Processing also faces some new challenges, because the algorithms for calibrated control of these extra emitters are a lot more complex than those in standard LED panels.
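A rough sketch of why those extra emitters complicate calibration (all emitter chromaticities below are made-up illustrative values, not measurements from any real panel): with more than three emitters, a target color no longer has a unique drive mix, so the processor must choose among infinitely many solutions while also respecting per-emitter limits.

```python
import numpy as np

# Hypothetical CIE XYZ tristimulus contributions of each emitter at
# full drive. A standard panel has three (R, G, B); a spectrally
# enhanced panel adds extras, e.g. a lime emitter.
emitters = np.array([
    [0.41, 0.21, 0.02],  # red
    [0.36, 0.72, 0.12],  # green
    [0.18, 0.07, 0.95],  # blue
    [0.30, 0.60, 0.10],  # extra "lime" emitter (assumed values)
]).T                     # shape (3, 4): XYZ rows x emitter columns

target = np.array([0.9505, 1.0, 1.089])  # D65 white point

# Three equations, four unknowns: the system is under-determined.
# Least-squares returns the minimum-norm drive mix; a real calibration
# must additionally keep every drive inside [0, 1] and may optimize
# for spectral quality, which is where the added complexity comes in.
drive, *_ = np.linalg.lstsq(emitters, target, rcond=None)
achieved = emitters @ drive
```

With three emitters the matrix would be square and invertible, giving one answer; the fourth column is what turns color calibration into an optimization problem rather than a matrix inverse.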





SPECIAL FOCUS

VFX IN THE U.K.: MORE DEMAND THAN EVER By OLIVER WEBB

TOP: Framestore worked on Zack Snyder’s Rebel Moon. (Image courtesy of Netflix) OPPOSITE TOP TO BOTTOM: Cinesite provided VFX for Teenage Mutant Ninja Turtles: Mutant Mayhem as well as for A Haunting in Venice and The Witcher Season 3. (Image courtesy of Paramount Pictures) Rocket is one of many creatures and characters developed by Framestore, including Groot, Dobby, Paddington and Iorek Byrnison, but Rocket (with the voice of Bradley Cooper) continues to take on a full life of his own. (Image courtesy of Marvel Studios and Walt Disney Studios) The notorious Daleks were first introduced in 1963 in the Doctor Who episode “The Daleks” and made into a movie, Dr. Who and the Daleks, in 1965. (Image courtesy of BBC Studios and Zodiak VFX)

The U.K. visual effects industry has a rich talent pool and is one of the world’s leading visual effects hubs. It is home to world-leading companies, including Framestore, DNEG, One of Us, BlueBolt, Union, MPC, Milk VFX and many more. There has also been an increase in new, emerging VFX companies as demand has surged in recent years. VFX companies in the U.K. have won the Academy Award for Best Visual Effects on several occasions, with notable wins for Gravity, The Jungle Book and Inception. The Harry Potter films were a contributing factor in putting U.K. VFX on the map. In 2014, Industrial Light & Magic created 200 jobs with a new studio in London, and there has been continued growth within the British industry ever since. Despite setbacks from COVID and the implications of Brexit, there is more demand than ever, and the U.K. visual effects industry is thriving.

The U.K. government supports the VFX sector and has been very successful in attracting inward investment filming with a combination of tax relief, excellent crews, superb facilities and iconic locations. In September, there was a parliamentary review to examine the current challenges faced by the British film and television industry. Martin Perkins, New Business and Bidding Manager at Cinesite London (Teenage Mutant Ninja Turtles: Mutant Mayhem, A Haunting in Venice, The Witcher Season 3), argues that the structure of the tax reliefs inadvertently disadvantages the U.K.’s VFX companies, which has led to a stagnation of investment in this high-tech, high-productivity sector. “When our tax reliefs were designed in 2007, the U.K. was bound by the EU Cinema Communication, which stipulated that tax relief should have a territorial cap,” Perkins notes. “This was written into our Corporation Tax Act such that productions receive 25% relief on their U.K. production expenditure, but once that exceeds 80% of their global production budget, there is no further relief in the U.K.
VFX and post-production usually sit within that final 20% of expenditure, meaning that productions that shoot in the U.K. typically will ‘cap out’ on their tax relief by the time they’re ready



for VFX; therefore, they need to place this work in other territories in order to claim a rebate on the work.”

“This means that, unfortunately, a lot of U.K. VFX houses are not able to capitalize on the large number of films and high-end TV series that shoot in the U.K., as they end up taking their VFX work to locations [such as Canada, Australia and France] where the tax credit isn’t capped, and they can also offer more than the U.K.’s 25% rebate on their spend,” Perkins continues. “The most regular thing I hear from production-side VFX producers when discussing a new project that’s shooting here is, ‘We’d love for you to do the work, but we are chasing the highest rebates we can get.’”

“While [the end of the writers and actors strikes] will be a great relief after several months of uncertainty, it will also throw up some of the challenges facing our industry. Will we see the same shortage of skills and talent, which stymied the marketplace post-COVID? There are lessons learned from that period which we can all apply to avoid the same level of saturation and burn-out the industry saw.”
—Philip Greenlow, Managing Director of VFX, Jellyfish Pictures

London VES Chairman Nicolas Casanova, Stereo Supervisor at MPC London, argues that the cost of living in the U.K. has also contributed significantly to the problem. “Other countries have tax incentives that are equal to or better than the ones offered in the U.K., which drives the studios to have offices in multiple countries. That is a complication that we are facing, combined with Brexit. We had a lot of junior artists who were really good, but they didn’t have the chance to stay in the U.K. as they hadn’t been here for five years or longer. They had to relocate to countries where getting a work visa was easier. It’s really uncertain at the moment.”

In 2017, U.K. Screen Alliance reported that 60% of the U.K. VFX workforce are U.K. workers and a staggering one in three of the workforce are European workers.
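The cap Perkins describes can be sketched as simple arithmetic. In this hypothetical model (the function and figures are illustrative only, not tax advice), relief is 25% of U.K. qualifying expenditure, but qualifying expenditure stops accruing once it reaches 80% of the global production budget:

```python
def uk_tax_relief(uk_spend, global_budget, rate=0.25, cap=0.80):
    """Illustrative model of the capped relief described above: relief
    at `rate` on U.K. spend, with qualifying spend limited to `cap`
    times the global production budget."""
    qualifying = min(uk_spend, cap * global_budget)
    return rate * qualifying

# A hypothetical production with a £100m global budget:
print(uk_tax_relief(80_000_000, 100_000_000))  # 20000000.0 (at the cap)
print(uk_tax_relief(95_000_000, 100_000_000))  # 20000000.0 (the extra £15m of
                                               # VFX/post spend earns nothing)
```

Under these assumptions, the last tranche of spend, which is usually the VFX and post work, generates no additional U.K. relief, which is why that work migrates to territories whose credits are uncapped.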
SPECIAL FOCUS

TOP TO BOTTOM: Framestore is famous for its work in bringing Paddington Bear to life. (Image courtesy of StudioCanal) Jellyfish Pictures has worked on The Creator as well as Asteroid City, Stranger Things, Black Mirror, Loki Season 2 and The Nevers, among others. (Image courtesy of 20th Century Studios) For Framestore, The Little Mermaid was a “creatively challenging and technically complex” live-action adaptation of the 1989 animated film, with Framestore developing the underwater look. (Image courtesy of Framestore and Walt Disney Studios)

Of course, Brexit will have a significant impact on these statistics, and this number is in
steep decline. U.K. Screen Alliance also found in its 2017 report that only 27% of the workforce are women, so this remains an important area; encouragement is needed to break down these gender barriers. “I think one of the issues we had during lockdown was that there was so much demand from streaming services to start producing. There weren’t enough students graduating for the number of people we had to hire. In the U.K. a lot of young people have never considered visual effects or animation as a career path, and so I think we need to reach out to them so that they understand that this is an option,” Casanova says. Despite this, several U.K. universities and film schools offer visual effects as a course. The University of South Wales is one of the most renowned, with all staff having industry experience. “The core staff have experience working in film, TV series and commercials, with credits in releases such as the Fantastic Beasts series, Doctor Who and His Dark Materials, to name a few,” says Course Leader Geraint Thomas. “The course prides itself on allowing students to find their own path within the moving arts, from creating 3D motion design for high-end commercials and title sequence design for television series to digital matte painting environments within the latest blockbusters. Essentially, if it’s digital and moving, we’re interested. While we cover the technicalities of heavy VFX in the course – 3D compositing, modeling and lighting, environment building, etc. – we predominantly specialize in the contextuality of what the students are using these skills for. In this aspect, the course not only gives them the technical abilities to create industry-level VFX and motion design pieces, but also to think critically when problem-solving, and to look at the bigger picture of the field that they’re stepping into.” Graduates have gone on to work for leading companies, including ILM, DNEG, Moving Picture Company (MPC), Framestore and Goodbye Kansas.
TOP: Framestore is proud of its work on Barbie and its collaboration with director/writer Greta Gerwig. (Image courtesy of Warner Bros. Pictures) BOTTOM: Peter Pan & Wendy is one of Framestore’s recent collaborations. (Image courtesy of Disney)

They’ve worked on productions such as Dune, Avengers, Guardians of the Galaxy, Spider-Man, No Time To Die and Andor. “Come to think of it, I struggle to think of a
blockbuster film and/or high-end TV series that a student from our course hasn’t worked on in recent years. Regarding motion design, we’ve got students working for the likes of Buck, The Mill, Carbon, ManvsMachine and more. We often invite students back for guest talks and to speak with the students on a one-to-one basis, as we very much have a sense of belonging here on the course,” says Thomas. One of the U.K.’s most notable VFX companies is Framestore, an Oscar, VES and BAFTA award-winning visual effects and animation studio, home to over 3,000 artists, producers and technologists. It spans eight locations on four continents, working on films, TV shows, ads, theme park rides and immersive experiences. “We are perhaps best known for our creatures and characters like Rocket, Groot, Dobby, Paddington and Iorek Byrnison, who share screens with the best acting talent, delivering performances that audiences believe wholeheartedly. But we also produce equally seamless environment and effects work across shows such as Top Gun: Maverick, Barbie, The Little Mermaid, Guardians of the Galaxy, The Martian and Blade Runner 2049, for which we won an Academy Award,” says Fiona Walkinshaw, CEO of Film & Episodic at Framestore. One of Framestore’s most recent collaborations is Barbie. “To have collaborated on Greta Gerwig’s Barbie is really special, and we played our part in creating a seamless Barbie Land,” Walkinshaw adds. “The Little Mermaid has been one of the most creatively challenging and technically complex pieces of work we’ve ever done. I am also incredibly proud of the characters that we have created and that audiences have emotionally engaged with to such an extent that they have been integral to the success of the films they have featured in. Paddington Bear is a huge achievement for Framestore. He is truly part of Framestore lore.
TOP TO BOTTOM: Cinesite helped bring Strays to life. (Image courtesy of Universal Pictures) Framestore was a leading vendor on Guardians of the Galaxy Vol. 3. Working on the film was a career highlight for many at the company. (Image courtesy of Marvel Studios and Walt Disney Studios) Mutant Mayhem is the seventh Teenage Mutant Ninja Turtles film in the franchise. (Image courtesy of Paramount Pictures)

And seeing the character arc of Rocket Raccoon evolve over the three Guardians films – working to create a character for James Gunn that would be as important to Marvel as any of their live-action superheroes – has been really satisfying. The Guardians of the Galaxy trilogy has been the highlight of many careers at Framestore, and James Gunn’s vociferous support of our work, alongside VFX Supervisor Stephane Ceretti, has been truly appreciated and is something we’ll all continue to reflect on with great pride. Following Army of the Dead, Rebel Moon sees us working with Zack Snyder again. He is an amazing collaborator, and our teams get great personal and professional pleasure out of working with him and his team. On the episodic side, with the His Dark Materials trilogy and Foundation Season 2, we are bringing the same level of quality as we would to any feature project. I’m so proud of the BAFTAs we won for His Dark Materials.” Myfanwy Harris is Executive Director at Cardiff-based studio Zodiak VFX, known for their work on Doctor Who. “There’s a lot of work we’re very proud of; however, as some of it is created while in our outsource role for larger studios, we choose not to shine a light on it, to respect our relationships with them,” Harris says. Zodiak decided to remain at boutique size to be an asset to productions and larger studios alike, working respectfully and collaboratively to provide a reliable resource to the
TOP TO BOTTOM: Doctor Who returned to screens in November 2023. (Image courtesy of BBC Studios and Zodiak VFX) Zodiak VFX is known for its work on Doctor Who. (Images courtesy of BBC Studios and Zodiak VFX) Sebastian was one of the many creatures recreated for the live-action version of The Little Mermaid. (Image courtesy of Framestore and Walt Disney Studios)

industry without compromising on quality. Over the last four years, their industry presence has grown. “We saw the benefits of remaining dynamic as a company by keeping our overhead low and scaling up and down outside of our core team for productions, drawing on our trusted pool of senior freelancers and the network of similar-sized studios we have built to tackle larger projects,” Harris details. “This not only enables us to offer competitive prices to our clients, but it also protected us during the pandemic: when the lull in workload eventually hit, we could financially maneuver around it relatively easily. Also, having already established ourselves as an outsourcing house through our supportive work with larger studios, we already had a lot of the remote working setups in place, so we didn’t have to restructure things too much when our artists shifted to working from home during that period.” Founded in 2001, Jellyfish Pictures is a global BAFTA and Emmy award-winning animation and VFX studio. In 2023, Jellyfish Pictures reached its biggest VFX division headcount to date, with over 220 artists worldwide. “We’re anticipating that at some point in the coming months the strikes will end, and we will probably see a steep incline in demand for VFX services into 2024. While this will be a great relief after several months of uncertainty, it will also throw up some of the challenges facing our industry. Will we see the same shortage of skills and talent, which stymied the marketplace post-COVID? There are lessons learned from that period which we can all apply to avoid the same level of saturation and burn-out the industry saw. Our priority for the coming year remains focused on the quality of our work and the diverse, talented community of artists in our extended remote network and across our studios in the U.K. and India, supporting the delivery of high-end output and building on the growth and development we’ve been able to sustain this year,” says Jellyfish Pictures’ Managing Director of VFX, Philip Greenlow. VFX Supervisor and Compositor Gianluca Dentici (Avengers: Infinity War) notes that, from the visual effects artists’ perspective, it has been a very complex time between the pandemic and the recent writers and actors strikes. “I compare the latter to a second pandemic due to the almost total production immobility in the States which, being the most important market, heavily affects the British territory as well, since it is one of the major VFX hubs in the world,” Dentici explains. “In this sense, I wish for 2024 to be a game-changer and to bring a real awareness of the difficulties of our industry, in the hope we can finally sit together with other guilds’ members to try and resolve the problems that are making our work every day more difficult. We need more dialogue, more interaction with other industry professionals. 2023 was undoubtedly the breakout year for artificial intelligence, whose success has been accompanied by many conflicting opinions on its licit use, although I personally believe excessive alarmism and outsized demonization have been set off.” Walkinshaw also acknowledges the impact of new technologies. “Of course, technologies such as real-time and AI will undoubtedly bring benefits to the tools our artists use and to the art of filmmaking. The shifts we are seeing in Hollywood today
will undoubtedly shape the next five years, potentially driving adoption of these technologies and putting greater emphasis on preparedness and readiness before the hard costs of filming. Visual effects used at their best enable filmmakers to tell a story that otherwise might not be told, create worlds that we believe to be real, and engage with characters that we as an audience feel something for. I think that the U.K. VFX industry and the wider global industry will continue to develop talent and technology that support creative storytelling and will be integral to that process,” Walkinshaw observes. Zodiak remains optimistic about the rise in virtual production. “We’re sure there’s going to be a shift to incorporate it more and more. As we see some larger VFX studios fall, there’s a security we feel in remaining at a dynamic boutique size for survival reasons, so we can see perhaps more studios like us forming as a result; however, there will always be a need for the big London houses, and we very much see them retaining the monopoly of the industry, as there’s such an ever-increasing demand for work that needs the manpower of hundreds of artists to get it done,” Harris explains. The VFX industry is the fastest-growing part of the U.K. film industry. The latest figure from the U.K. Screen Alliance puts the U.K. VFX workforce at 10,680 people. It is also reported that in 2019 the sector contributed £1.68 billion in GVA to the U.K. economy. This number has undoubtedly been impacted by numerous factors, including COVID, Brexit and the recent strikes. Perkins stresses the importance of being ready for the sudden shift into production after the strikes. “The proposals to the U.K. tax credit will take time to push through the government, so action is needed now so the industry is best placed to continue contributing billions of pounds to the economy and hundreds of thousands of jobs across the U.K.’s nations and regions,” he argues.
Vicon’s VFX Product Manager, David ‘Ed’ Edwards, offers a positive outlook for the coming years. “We have a great deal to be excited about. While the global financial issues will have an effect, I do think that the increasing demand for products will bring with it an increased demand for talent. Studios want competent people they can trust to join their departments and deliver what’s required to get these products market-ready at speed and scale. The U.K. has done some fantastic work in this respect to ensure its education system is producing the next era of VFX specialists. I also anticipate a rise in indie productions. The cost of entry-level equipment for many aspects of VFX continues to come down, and there are so many options for people to ‘get started’ on a game or film project in their own back room. There’s a tremendous amount of talent and resources in the U.K., and we have a cultural history of DIY, so it wouldn’t surprise me if the next group of successful indie creators were to start here.”





PROFILE

ACHIEVING THE PROPER BALANCE WITH ROBERT STROMBERG By TREVOR HOGG

Images courtesy of Robert Stromberg. TOP: Robert Stromberg, Director, Production Designer and Visual Effects Specialist. OPPOSITE TOP: Sketches by Marcel Delgado of Skull Island for the original King Kong became the inspiration for Stromberg’s jungle of Pandora. OPPOSITE MIDDLE, LEFT TO RIGHT: A documentary on matte artist Albert Whitlock, who was brought over to America by Alfred Hitchcock, made Stromberg want to pursue the same profession. Stromberg, left, was greatly influenced by his father William Stromberg, center. Stromberg gets to spend some quality time with Peter Jackson, Steven Spielberg and James Cameron. Stromberg joins Johnny Depp for a feast during the shoot of Alice in Wonderland. Stromberg and Rick Carter partnered together to bring Pandora to cinematic life for James Cameron. Illusion Arts co-founders Bill Taylor and Syd Dutton talk to Stromberg about matte painting for a Michael Jackson music video.

Described as a creative visionary, Robert Stromberg has traveled to Pandora with James Cameron, gone down the rabbit hole to Wonderland with Tim Burton, sailed to the far side of the world with Peter Weir and had his own personal encounter with villainess Maleficent. The journey has led to two Oscars, five Emmy Awards and the establishment of The Virtual Reality Company, which produced the Jurassic World VR Expedition. Life in Carlsbad, California was a mixture of surfing and sketching for Stromberg who had a great admiration for his father. “My dad was always sketching out new and exciting fantasy worlds filled with strange creatures; he wanted to be the next Cecil B. DeMille.” Partnering with the elder Stromberg was an 18-year-old neighbor. “Sometime in the late ‘60s, my dad and Phil Tippett began creating low-budget monster films in our garage. One of these films was a version of a Ray Bradbury short story called “The Sound of Thunder.” Shot in 16mm, they built small sets, made costumes and filmed in the woods in and around Carlsbad. Then, back in our garage they created miniature sets with painted plaster cliffs, model railroad trees and bushes. They also built fully articulated dinosaur puppets for stop-motion animation, a la Ray Harryhausen and Willis O’Brien. As a kid, I learned so much about the process just hanging around them. Another stroke of luck for me was when retired Disney artist Bruce McIntyre, who had once created pencil drawings for Pinocchio, Snow White and a lot of the Disney animated films, relocated and became my elementary school art teacher; he would roll in a reel-to-reel videotape player and place it in front of the class. At some random point, Bruce would pause the show leaving just one frame frozen on screen and then ask us all to draw that image. 
It was a great lesson for me in learning how to find and draw emotion into simple characters.” A documentary on matte artist Albert Whitlock, who was brought over to America by Alfred Hitchcock, made Stromberg want to pursue the same profession, and a treasured gift from his father was a book on the making of the original King Kong. “Those images made a big impression and became an inspiration for the jungle of Pandora in Avatar, which I co-produced and designed with Rick Carter,” Stromberg says. “When it came to what Ray Harryhausen was doing, we just thought that was amazing. To this day, my brother and I still talk about the skeleton fight in Jason and the Argonauts.” At the age of 19, Stromberg moved to Los Angeles and worked in the tollbooth in the parking lot at Universal Studios. “I was fired for drawing on the job, which turned out to be fortunate because it allowed me to get a production assistant job on a sitcom called Family Ties with Michael J. Fox. My first ‘illustrious’ PA job was to sweep out an entire auditorium that was filled with a layer of dust, which I’ll never forget! One of the producers on the project realized that I had worked all night and said, ‘Hey, kid, I like your work ethic. What do you want to do in this business?’ I told him that I was going to be a great matte artist in film one day. He asked me to show him what I had and was blown away. A few weeks later I found myself creating matte paintings for Journey to the Center of the Earth. After that, I went on to work on other films like A Nightmare on Elm Street and many other low- budget horror films. Mostly, I stumbled into everything along the way
and simply had to figure it out. One day when I was working on a very complex shot for a movie, Syd Dutton and Bill Taylor came to see me. Little did I know that they had a company called Illusion Arts and were once the artistic and technical team at Universal Studios under Albert Whitlock. Syd and Bill saw what I was doing and instantly offered me a full-time job at their studio. Syd took me under his wing and gave me great opportunities. I was fortunate enough to win my first Emmy for the TV series Star Trek: The Next Generation. Dutton and I created all the cities for not only Star Trek: The Next Generation but also Deep Space Nine and Voyager.” After a year of dealing with a life-threatening diagnosis, Stromberg decided to establish his own company, Digital Backlot. “I was bidding on work that Industrial Light & Magic or Digital Domain were bidding on and winning those jobs. I ended up becoming a full-time visual effects supervisor and traveled the world on many projects, including Game of Thrones, Walk the Line, Memoirs of a Geisha, John Adams and Boardwalk Empire, but I was still looking for something more. I wanted to be a director. Along the way, I met many directors, including Martin Scorsese, Steven Spielberg, Peter Jackson and Steven Soderbergh. In 2003, I met Peter Weir, and we worked together on a film called Master and Commander: The Far Side of the World. I watched how Peter directed not just the actors but the crew and how he handled himself under pressure. I remember telling myself that was the type of director I wanted to be.” Continues Stromberg, “A few months later I received a call out of the blue from James Cameron. At that time, it was called Project 880, not Avatar. After the call, I created an image of a small character looking over a vista near these giant trees with plants and floating mountains.” The concept art was enthusiastically received by Cameron. “The first year, we decided to only do images in black and white, and James understood why I wanted to do this. We needed to see and feel the graphic depth of this world before we added color. This all goes back to my early fascination with black-and-white films and directors of photography like James Wong Howe. It was important to understand the concept because we were making a stereo 3D film, and depth was a crucial part of the viewing experience. Ultimately, I spent nearly five years as a co-production designer with Rick Carter, assembling a team of artists where we created every plant, tree and creature on Pandora. What a ride!” Avatar also provided an opportunity for Stromberg to collaborate with producer Brooke Breton, who has been a major influence. Transitioning to directing was not a big leap. “Simply put, I always felt that I was already directing, no matter what position I held,” Stromberg notes. “I worked on several films with producer Joe Roth, who recognized while I was production designing the film Oz the Great and Powerful that I wasn’t just designing the film, I was also guiding the Disney executives through the process; he could see that I had a great rapport and dialogue with the director, Sam Raimi, the crew and actors. One day, Joe came to me and handed me a script that Tim Burton had recently turned down.

TOP: Stromberg shared his first Oscar with Rick Carter and Kim Sinclair for their art direction on Avatar. BOTTOM: Maleficent provided Stromberg with the opportunity to fulfill his ambition of becoming a feature film director.
We filmed Maleficent at Pinewood Studios in London, and
immediately I realized that all my years of experience, whether it be visual effects, matte painting, supervising or production design, made it easy for me to communicate on all aspects of the filmmaking process. James Cameron taught me discipline, Steven Spielberg showed me that things could evolve or adapt in the moment, Tim Burton showed me that you shouldn’t be afraid to push the creative wall, and most importantly, Peter Weir showed me that you could direct with respect and humanity.” The lessons learned from being a matte artist still resonate today. “One was to have the feeling of complete control over creative design for at least several shots in a movie. What I mean by this is that since I was the specialist hired to do the job of creating an illusion that fit like a glove into one of the scenes, it taught me to understand the totality of a sequence and how the shots that I created needed to fit seamlessly into that scene. It also allowed me to understand what I needed to do to create something that fit the emotional narrative of that moment in the film.” In many ways, visual effects can be viewed as a magic trick. “The end goal is to create something that fools or entertains the audience,” Stromberg states. “Filmmaking in general is all an illusion. Sometimes I still wonder how it all comes together with so many moving parts. It’s a lot like a symphony, with all the different musical instruments coming together, hopefully without a few bad notes.” Creativity is born out of curiosity. “All artists want to know how things work. They pay attention to the small things in the world, such as how light reacts to objects, weathering, how plants and trees grow, and human behavior. When making a film or a painting, it’s never really finished; you just simply run out of questions to ask yourself.” Currently, Stromberg is the co-founder of The Virtual Reality Company, which is a content studio and production company for VR experiences.
TOP LEFT AND BOTTOM LEFT: After completing principal photography at Pinewood Studios, over 100 matte paintings were created by Stromberg for Maleficent in post-production. Stromberg on set with Angelina Jolie. TOP RIGHT TO BOTTOM: Alice in Wonderland led Stromberg to win his second Oscar for Achievement in Art Direction along with Karen O’Hara. With Alice in Wonderland, Stromberg learned quickly that the designs needed to be pushed to almost comical proportions. Stromberg and Syd Dutton created all the cities for not only Star Trek: The Next Generation but also Deep Space Nine and Voyager. Along with designing Oz the Great and Powerful, Stromberg guided Disney executives through the process.

TOP TO BOTTOM: Stromberg believes that filmmaking is an illusion with visual effects being a part of the magic trick, as demonstrated by Oz the Great and Powerful. Matte painting taught Stromberg to understand what is needed to create something that fits the emotional narrative of each moment in a film like Life of Pi. Upon establishing Digital Backlot, Stromberg got an opportunity to work on Game of Thrones. Stromberg celebrates his first Oscar for Achievement in Art Direction on Avatar with Sigourney Weaver, Rick Carter and Kim Sinclair in 2010. (Photo: Dan MacMedan)

“During my years of design and creating the world of Pandora for Avatar, we created and invented many new
technological enhancements, including for the first time a virtual camera that could do many of the same things that traditional cameras could do. For every scene in Avatar, we created a digital environment that could be explored, scouted and filmed, all virtually.” A major turning point was when Facebook purchased Oculus for $2 billion. “Oculus was nice enough to show me the tests that they had been working on that were compelling enough for Facebook to take notice. Once I saw them, I immediately started The Virtual Reality Company. At that time, everyone was talking about how it could be used for gaming, but I thought this could be a great narrative storytelling tool, and I then proceeded to create a four-minute test that I called ‘There.’ ‘There’ was a fantastical world where we envision a young girl who takes us through this surreal environment. I added dialogue and a symphonic score and suddenly realized that this was a completely new form of potential storytelling. Over several years, we created many different narrative experiences including The Martian, Raising a Rukus and The Jurassic World experience. In the AR world, we created a successful app called Follow Me Dragon, which became a top download in the app store.” Unreal Engine and Unity are powerful tools. “As technology has improved over the years, I can see where an entire film could be created using gaming engine technology and AI, so like analog-to-digital, you either embrace it or get eaten by it. Currently, AI is being used in visual effects to enhance or replace prosthetics. As far as creating artwork, art influences art, and AI is a whole new generation of this concept. In the future, it’s the individual artists that will be burdened with copyrighting their work to lay claim to the unique design. 
It’s interesting that in the mid-’90s when computers first came onto the scene, people like myself were actually afraid that they would take all of our jobs as artists away, and what actually happened was that, for a while, the technicians tried to make art using Photoshop. But they soon found that you still need all of the life experiences and a keen understanding of how things work to make the technology really sing. I can only hope that AI is just another paintbrush in the artists’ toolkit.” Emphasizes Stromberg, “Adaptation is a crucial element in deciding a career path. One thing that we can always be certain of is that technology will continue to race ahead. We just need to make sure that creativity, human behavior and emotions keep a steady pace alongside advancement in technology.” The biggest career challenge has nothing to do with creativity. “Finding balance in life while still pursuing your dreams is a difficult juggling act. And, in film, where time is your most valuable resource, it doesn’t allow for much else. Commercials and television series have all of the elements of a film with a shorter timeline, and that allows me the time to live my life in-between. I’m still interested in being a feature director and will take on another project, but it has to be the right story at the right time. In the inevitable end, everyone’s life is a movie, so make sure to direct yours properly, and when your life film is over, I hope it ends with a standing ovation! My advice for any challenge is to find balance.”





TV/STREAMING

BLUE EYE SAMURAI SLICES THROUGH SOCIAL INJUSTICE By TREVOR HOGG

Images courtesy of Netflix. TOP: The round glasses are historically accurate, with an artistic choice made to give them amber-colored lenses that hide Mizu’s blue eyes from others. OPPOSITE TOP TO BOTTOM: The desire to have a 2D aesthetic for Blue Eye Samurai meant avoiding excessive volumetric lighting on the characters. Ringo is defined by honor and compassion and, unlike Mizu, sees the good in people. Animation enabled the fight sequences to be pushed beyond the laws of physics while the camerawork, choreography and lighting gave them a live-action maturity.

In an effort to curb European influence, Japan expelled foreigners in 1639 and carried on its isolationist policy for another two centuries; it is against this historical backdrop that Blue Eye Samurai Co-Creators and Executive Producers Amber Noizumi and Michael Green placed a renegade female warrior seeking to murder the four men responsible for her mixed heritage, which has resulted in a life filled with racial hatred and discrimination. The Netflix project, which consists of eight episodes, was animated by Blue Spirit in a 2D/3D hybrid style that unflinchingly depicts violence, sex and societal exploitation with a dose of dry humor. Even though the project was sold to Netflix in 2019, the idea was born 16 years ago. “The story we wanted to tell could only have taken place in the Edo period,” Green states. “Plus, has there ever been a more beautiful era for design?” The decision to venture away from live-action was not a hard one. “Animation was the only way we could tell this character’s story without compromise – and with maximal beauty,” Noizumi notes. “The animation style came from conversation with all our partners. But it didn’t really come together until Jane Wu [our Producer and Supervising Director] came on board and showed us her inspirations.” Visual inspiration came from the art of Hokusai and other ukiyo-e prints from the Edo period. “Jane Wu brought the idea of bunraku puppetry as a touchstone for our character design,” Noizumi explains. “Our Costume Designer, Suttirat Anne Larlarb, did deep, extensive research on every aspect of the clothing of the era – the colors, the patterns, even the stitching.”



“For Blue Eye Samurai, we were targeting a 2D aesthetic inspired by Japanese woodblock print,” states Toby Wilson, Production Designer. “Of course, we need to light our scenes, but we wanted to avoid excessive volume lighting on our characters that would break the 2D look, and we wanted to avoid high-contrast photorealistic light in the environments. In order to achieve this aesthetic, I was pushing the Japanese principles of Notan: designing compositions with clear light over dark or dark over light in harmony with the visual story of the shots. Couple that with designed grouping of textures. Like woodblock prints, we would design our shapes and group textures within them. We would organize those textures so they are most concentrated and had the most contrast around the area of focus. And the shapes these textures were housed within, we would try to always design them inspired by our creative source materials from Japanese historical art.” “Production took about 3.8 years to complete from start to finish,” Wu remarks. “The eight episodes were treated episodically, except for some of the action pieces that required martial arts choreography. We shot some of that ahead of the episodic launches due to the stunt artists’ availability.” Directing alongside Wu were Ryan O’Loughlin, Alan Wan, Michael Green, Earl A. Hibbert, Sunny Sun and Alan Taylor. “I tried to understand what each episode needed and ‘cast’ each director to their strong suit,” Wu states. “From there, I encouraged our directors to tell this story with our overall philosophical idea of ‘grounding the storytelling’ through live-action cameras and lensing. We even designed our camera package to the specs that Game of Thrones had, down to



TV/STREAMING

TOP TO BOTTOM: The cameras moved (with few exceptions) as a real camera would, and focal lengths were carefully chosen for each shot. The goal for animating the cliff fight was to give all of the characters realistic weight and balance to make these “impossible moves” seem plausible. The Four Fangs are bounty hunter thugs hired to kill Mizu, so they were designed to be imposing and strong.

the film back to try to reproduce the same feel in each shot. In the forest canopy, I wanted the camera to follow Mizu and Ringo as they talked so you felt like you were walking with them and seeing the story unfold in real-time. The camera doesn’t always need to be straight front on characters to reveal emotions; body language and camera placement can do the trick and elevate storytelling at the same time.” “Our challenges were similar to what you would find on many shows,” Wilson observes. “We had a story that traveled across Japan with many new locations and characters. We knew it was going to be an issue to stay on budget and schedule, so we devised systems in order to make this work. For locations, it was extremely helpful and authentic that Japanese architecture has an inherent modular nature to it. This allowed us to design new spaces while utilizing the shoji screens, tatami mats, window designs and plaster walls from previous sets. We were also able to build out the entire city of Edo using a modular building system. We used a similar modular system for our crowd characters. Fashion changes based on region in Japan, so [Character Design Lead] Brian Kesinger and his team devised a menu to create background characters so we didn’t have to build hundreds of unique characters. By using the menu system, we could dial different heads, kimonos, shoes and props to populate our streets and interior spaces.” Intriguing discoveries were made during the historical research. “I remember when we first pulled up images of Edo period glasses and saw the ones that would eventually be Mizu’s,” Green recalls. “We could suddenly ‘see’ her, even before the character was designed. They just made her feel real to us. Here was her mask! Straight out of history – and now on our poster. 
And then came the idea that she would wear amber-tinted lenses to hide her eye color.” The mandate for the character designs, world-building and shot designs was to have viewers forget that they are watching animation. “We wanted them to sink into the story and our characters,” Noizumi remarks. “We wanted them to be as attached to




TOP TO BOTTOM: For locations, it was extremely helpful and authentic that there is an inherent modular nature to Japanese architecture. The ukiyo-e prints of Hiroshi Yoshida were a major influence on the look and composition of Blue Eye Samurai because they have a distinct East/West artistic harmony. Key words that describe the Swordfather are mentor, strength and stoic. His blindness prevents him from seeing Mizu’s blue eyes but doesn’t prevent him from seeing who she really is.

Mizu in animation as they would be if she was played by a famous actor. All our choices were made to support that.” Restraint and realism were the guiding principles for the character designs. “Designing a ‘bad guy’ for Blue Eye Samurai means having to design an imposing character without the usual tropes of giving him a scar on his face or an eye patch [to oversimplify, of course],” Kesinger states. “Rather, I had to play with more subtle cues of how this particular ruffian wears his kimono or how he carried himself in a pose or expression. Really subtle stuff. But it was all in service to the tone of the story.” Almost every character wears a kimono. “We did a deep dive into how kimonos were built, how they were worn by different genders and social classes. Thanks to Jane, who would model them for us, we would study how they move so that we could work with our partner studio, Blue Spirit, to create a rig that would allow for realistic movement,” Kesinger says. Silhouettes and poses provided an opportunity to explore the different social classes in Japan during the Edo period. “We took great care in making sure all our characters held themselves in an appropriate manner when it came to their social station and gender so that Mizu could stand out defiantly from that,” Kesinger remarks. “Details like how perpendicular a samurai wears their katana to their body or how long a kimono sleeve an older woman would wear versus a younger woman are just a couple of examples of the level of detail we explored in coming up with poses. When we first meet Mizu, she is disguised as a man, so we made sure she walked, sat and even ate with more masculine mannerisms.” Outside of a few dream sequences, Blue Eye Samurai is not fantastical. Observes Kesinger, “These characters endure a lot, but they are not superheroes. When Mizu gets hit, we have to design a bruise for her and how that bruise develops over the course of the rest of the sequence.
With a character like Ringo, it was important to show a character with a limb difference who was not defined by it.” The previs was extensive. “The main objective of the previs was to create a smooth transition through the pipeline of all departments all the way to the vendor,” notes Earl A. Hibbert, Head of Previs and Director. “The very complex scenes took an extensive amount of time. We implemented a scouting process with the directors early in the sequence construction. This would give the directors the opportunity to look through a set, sometimes before the storyboard stage, to explore ideas in the set and aid in the pitch process with the board artists so they had all the information they needed.” A large amount of previs was required for the dojo fight in Episode 101. Hibbert explains, “We had to work with the stunt and production design teams to design that very complex fight to fit within the boundaries of the set. In addition to the action scenes, many dramatic scenes were complex, too: in Episode 101, Mizu’s intro in the soba house, and the wonderful and traumatic scenes in Episode 102 with Mizu and Swordfather. In all of the dramatic scenes, we worked on compressing and expanding the space between characters to create and elevate tension.” “While we aren’t trying to fool anyone into thinking this show was done in 2D, we all loved the idea of making it look like


TOP TO BOTTOM: The animation of the snow in the first few episodes reflects moments of serenity and emotional turmoil within the characters from scene to scene. For action sequences with big camera moves, full 3D builds were constructed, and look development was done to ensure that the sets looked like the painted backgrounds. The pivotal personality traits that Character Design Lead Brian Kesinger wanted to convey with Akemi were defiance and individuality. Because almost every character wears a kimono, a deep dive was done into how they were built and worn by different genders and social classes.

a drawing or an illustration and chose to do the animation ‘on 2s,’ as they used to call it back in the days of classic animation,” reveals Michael Greenholt, Animation Director. “If the film is at 24 frames per second, you are holding each drawing for two frames [essentially 12 frames per second]. Working on 2s in a 3D environment with moving cameras is a challenge, and we had to do lots of testing and experimenting to make it look right. If it’s done wrong, characters’ feet might slide on the ground as they walk, or they might seem to jitter as they move.” Having to animate blind and handless characters meant that familiar techniques were not enough. Greenholt comments, “You need to find people with similar challenges and see how they do it. How would Ringo hold the reins of a horse? If Swordfather is blind, you have to show him listening or feeling things. He can’t just be a magical man who happens to know where everything is around him. These sorts of characters keep a scene from being ‘just any old scene.’” Nuances were essential in making the characters and situations believable. “There was a simple moment where Mizu and Ringo were walking along a narrow, snow-covered ledge,” Greenholt remarks. “We decided to have Mizu’s foot slip a little. It was just a small opportunity to show that she isn’t flawless. And it felt real in the moment and added a little peril to the situation.” Environments were adapted to better integrate the cast. “The characters were animated first, giving the animator freedom to have them walk however they needed to, then footprints were painted into the background later to match where the feet landed,” Greenholt says. Atmospherics have been incorporated into every shot whether it be falling snow or the steam coming off of hot food. “If there was a comprehensive attitude it was, ‘More, please,’” Green adds. “Everything about Blue Eye Samurai trades in elements. We wanted rain or wind or snow in every exterior shot. 
Smoke or flickering candles for interiors. Texture is everything.” There is no shortage of sword fighting. “Every action set piece started with a concept on the page,” Green explains. “Even when you’re working with genius stunt directors [like Sunny Sun], it is up to the writer to make sure there is a concept for the action and that the action has a direct effect on character. Not just on story but on character. That’s what keeps action integral and a scene uncuttable. Once we’ve communicated those needs to our team, we stand back and let them be brilliant.” Mizu is driven to the edge of a cliff by the Four Fangs in Episode 102. “The cliff fight was a challenge because we didn’t have fight choreography to use,” Greenholt adds. “Our goal was to give all the characters realistic weight and balance to make these ‘impossible moves’ seem plausible.” In Episode 106, Mizu battles 20 guards at once. “In terms of camera and layout, it was complex because it was such a long shot without cutting,” Wu notes. “Then adding on to the complexity, we threw in the most difficult fight choreography. In the end, we got something that was spectacular!” There are also quiet moments. “In Episode 107, Mizu confides with Swordfather on the cliff,” Wu states. “It’s such a delicately directed scene, and the performances were so beautiful; it really was moving. I want to show the audience that animation isn’t a genre but a medium to tell deep complex characters and rich stories. I can’t wait for people to see that.”



LEGENDS

ACTING UPON HIS SCREEN AMBITIONS, STAN WINSTON LEAVES A LIVING LEGACY By TREVOR HOGG

Images courtesy of Stan Winston Studio Archives. TOP:Stan Winston poses with the Chip Hazard and Archer puppets created by SWS for Joe Dante’s Small Soldiers. OPPOSITE TOP: Stan Winston sculpts a bust of his friend and frequent collaborator, Arnold Schwarzenegger. OPPOSITE MIDDLE, LEFT TO RIGHT: Stan Winston with Danny DeVito during a Penguin makeup test at Stan Winston Studio for Batman Returns. Aliens writer/director James Cameron and Stan Winston review the progress of the clay Alien Queen design maquette at Stan Winston Studio. Stan Winston applies his Emmy-winning old age makeup to Cicely Tyson for The Autobiography of Miss Jane Pittman. Stan Winston and the SWS crew prep the full-size animatronic T. Rex for action on the set of Jurassic Park. Johnny Depp transforms into Edward Scissorhands during a test makeup application and scissorhands fitting at Stan Winston Studio. Stan Winston adjusts one of the full-size animatronic T-800 endoskeletons on the set of Terminator 2: Judgment Day.

Like many before and after him, Stan Winston wanted to become a movie star, and the ambition was realized indirectly: by infusing personalities into his creature effects, he earned four Oscars, three BAFTA Awards and two Primetime Emmy Awards, and was honored as an inaugural inductee of the Visual Effects Society Hall of Fame in 2017. As a teenager growing up in Arlington, Virginia, he would transform himself into a werewolf using rudimentary makeup and scare the neighborhood children. “He loved the classic Universal monsters, Lon Chaney and Jack Pierce,” recalls Matt Winston, son of Stan Winston. “The Wizard of Oz was his favorite movie.” After his first year of studying dentistry at the University of Virginia in Charlottesville, the undergraduate decided to major in Fine Arts and minor in Drama. “Dad was in the Virginia Players at the University of Virginia, and not only did he act but he was essentially the makeup department. Through his Fine Arts major, he was learning all of the fundamentals, like perspective, highlights and shadows.” Upon arriving in Los Angeles, Stan Winston couldn’t find any jobs as an actor and had an epiphany while watching Planet of the Apes. “He thought, ‘Maybe my other passion of makeup effects could be my way into Hollywood until I have my big break as an actor,’” Matt Winston remarks. “Dad found out about the Walt Disney Studios makeup apprenticeship program under Robert J. (Bob) Schiffer. This was the other foundational experience of his artistic career. He worked on Disney specials and some of those great Disney films from the late 1960s with Kurt Russell. Then he founded Stan Winston Studio in 1972 at the age of 26. Stan Winston Studio was the garage and kitchen table of our tiny two-bedroom house in Encino, California!” An early career breakthrough for Stan was the television movie Gargoyles. “He came on to handle background gargoyles, which were pullover latex masks,” Matt Winston states.
“In the early part of the shoot he was told by the producers that there was going to be no credit for him and the makeup leads Del Armstrong and Ellis Burman, Jr. He said, ‘This show is called Gargoyles, and since we’re essentially creating the stars of the movie for you, unless we get proper credit, I’m walking off this project.’ The producers acquiesced and gave them all credit, and because of that they all won Emmys.” Another area of significant impact was worker benefits. “Dad was instrumental in getting his on-set crew designated as puppeteers, which meant their work fell under the umbrella of Screen Actors Guild, allowing them to qualify for SAG residuals and health benefits; he could be tough but his crew knew that he had their back. The fact that Dad kept a core team for so many years speaks highly of the company culture of Stan Winston Studio and the man who led it.” For two and a half years Alec Gillis was part of Stan Winston Studio before departing to form Amalgamated Dynamics, Inc. with co-worker Tom Woodruff, Jr. “I did not learn techniques from Stan but a creative philosophy,” Gillis notes. “His approach to running a business and being a creative artist was most like my personality. Stan was a father figure and mentor to a lot of people. He was not an easy-going guy because he was pushing hard to get the best
quality and was committed to the journey and craft. Stan would toss you into the deep end, and I was grateful for the opportunities that he gave me. Stan wasn’t a niche character creator. He would take on anything from a rubber W.C. Fields nose to a 40-foot-tall Queen Alien. It’s all character. I’ve always respected that because I got into the industry to do interesting things, not to pursue a single technique.” Winston never forgot about what he learned from his acting classes. “There was a time on Aliens where he had a metronome going so there was a frame of reference for the puppeteers to lock their brains around for their performance,” Gillis states. “Stan was looking for organic ways to make these rubber, metal and plastic creations come to life, and he saw it as performance rather than as robotics. When you were puppeteering for him, he wanted you to be in the state of mind of an actor even though you’re only creating the movements for one portion of this character. You had to also think of it as an orchestra that comes together to create an emotion. He would talk in those terms rather than get bogged down with servomotors. Stan showed me that most successful performance characters are done by people who are artists and


TOP TO BOTTOM: Stan Winston gives feedback on the progress of creating Archer’s head for Small Soldiers. Kevin Peter Hall and Stan Winston take a break at the makeup effects trailer on the set of Predator while SWS artist Matt Rose works on the Predator bio-helmet in the background. Stan Winston, Producer Richard Weinman and the SWS crew on the set of Winston’s directorial debut Pumpkinhead. From left to right: Alec Gillis, Stan Winston, Tom Woodruff, Jr. (in Pumpkinhead suit), Richard Weinman, John Rosengrant and Richard Landon.

performers, and the support people are the engineers.” “People magazine did an article about special effects makeup, and Stan’s quote was, ‘Some of my best work has been in some of the worst movies,’” remarks Tom Woodruff, Jr. “I remember reading that and thinking it’s interesting because what he’s basically saying is, ‘I love to work and love what I do.’ He didn’t care if it was a big or small movie that would never perform because it gave him an opportunity to express himself as an artist.” Overtime was avoided as much as possible, as Stan Winston wanted his team to be mentally fresh and eager to work the next day. “It was a big find that Stan was such a family man and realized that there had to be a balance.” The Monster Squad was a great experience. “Stan was in the drawing room doing sketches of a new creature from the Black Lagoon, but not from the Black Lagoon because that was licensed,” Woodruff, Jr. states. “I was watching him, and he was showing me all of this fine pencilwork where you go in with more pencilwork to make things shaded, and it had great volume to it. I still draw that way.” A fond memory of Woodruff, Jr.’s was Winston painting some appliances for The Monster Squad with him. “Stan came in, and we were talking. I asked him, ‘Do you mind helping me paint one of these?’ I just wanted to paint with Stan Winston. He said, ‘Mind? I would love to.’ We sat there painting Frankenstein appliances for a couple of hours; that was always such a big thing for me to be able to have that time with him.” Considered to be Stan Winston’s right hand was John Rosengrant, who went on to establish Legacy Effects with colleagues Lindsay MacGowan, Shane Mahan and Alan Scott after their mentor’s death in 2008. “My 25 years with Stan helped to propel me in this business, and I learned so much from him,” Rosengrant notes. “Being his right hand became easy because it
felt right. He ran Stan Winston Studio as a true business. There was a time in and out. We had a lunch hour. And his philosophy wasn’t that we were the be-all and end-all. We are one important part of the filmmaking process. Stan would say, ‘What we’re doing is creating characters.’ We did create some iconic characters, and at Legacy Effects we continued doing that. When you think about it, the movie is called Aliens or The Terminator or Predator or Iron Man, and we are creating that character. The Queen Alien in Aliens is totally a character, as is the T. Rex or the raptors in Jurassic Park or Edward Scissorhands – they are all etched in our brain.” “I would say that The Terminator put Stan on the map for animatronics, and it’s an extension of puppeteering,” Rosengrant observes. “When Stan saw what they were doing with The Dark Crystal, he wanted to expand upon that idea and to take it into a more expansive role because you can do things with a puppet or animatronic that you can’t do with makeup. How are you going to do the Queen Alien unless it’s a giant animatronic puppet? The other thing is that Stan was willing to embrace digital. You could try to compete with it, but why? We’re all trying to do the same thing. Stan and I are of the mindset that whatever is the best way to get the shot is the way it should be. Stan opened Digital Domain [with James Cameron and Scott Ross] after our Jurassic Park experience because he saw the value in it. What it did for us with our puppeteering was that suddenly some complicated mechanism to make an arm move could be simplified down to a rod because they could paint that out. It was always figuring out how to evolve and stay with the times, not get caught in the past.” James Cameron was not familiar with Stan Winston until special effects makeup artist Rob Bottin recommended him for The Terminator. “Stan wanted to empower his art team because that’s what they sell in addition to the technique and physicality of

TOP LEFT: Stan Winston paints the full-size animatronic and rod-puppeteered T-800 endoskeleton for The Terminator. TOP RIGHT TO BOTTOM: Stan Winston surrounded by just a small portion of Stan Winston Studio’s groundbreaking puppets and practical effects created for Terminator 2: Judgment Day. SWS artist Alec Gillis sculpts the head of Pumpkinhead at Stan Winston Studio. SWS artist and Pumpkinhead suit performer Tom Woodruff, Jr. paints the head of the newborn Pumpkinhead.




TOP TO BOTTOM: Stan Winston puts the full-size animatronic T. Rex through its paces on the set of Jurassic Park. Stan Winston, SWS mechanic Al Sousa and SWS Art Department Supervisor John Rosengrant rehearse the movements of a Chip Hazard rod puppet for Small Soldiers. Stan Winston, Kevin Peter Hall (in Predator suit) and the SWS Predator crew on location in Palenque, Mexico. From left to right: Shane Mahan, Steve Wang, Stan Winston (kneeling), Brian Simpson, Kevin Peter Hall, Shannon Shea, Richard Landon and Matt Rose.

making molds and doing puppeteering,” Cameron states. “I was a unique customer for him because I was coming in with my own drawings. I expected there to be tension around that and there never was. Stan celebrated art and design first and foremost, and he was excited by that. Stan put his designers on cleaning up, sculpting and getting on with the designs that I brought in, and where designs were missing, his guys went crazy. We meshed because we were all artists. We respected each other and the creative process. If I had one thing to say about Stan it would be that he always celebrated artists and the moment of creation whether that was a pencil sketch or a sculpture.” A personal regret lingers from the making of Avatar. “We had an internal design team that was working closely with Stan’s team,” Cameron states. “Three and a half years into a four-and-a-half-year project, we finally had a short sequence of five or six shots of Neytiri, and she had basically come to life. You just see Neytiri’s eyes in the forest. She hops up on a branch, goes to shoot Jake and stops herself because this little Tinkerbell thing lands on her arrow. Those were the first shots completed by Weta Digital [now Wētā FX]. I knew that Stan was sick, so when I got those shots, I called him up and said, ‘Stan, you’ve got to see this. We’ve cracked the code.’ That turned out to be the last time I talked to him because when I went over the next day Matt Winston said, ‘He’s not having such a good day.’ I came back the following day and the housekeeper told me that he had passed away during the night. Stan never got to see the culmination of the dream that he and I set out to do when we founded Digital Domain.” Sixteen years after his death, the impact of Stan Winston still resonates in the movie industry, with Legacy Effects carrying on the special effects wizardry of Stan Winston Studio and the Winston family embarking on an academic endeavor.
“Before Dad passed away [in 2008], he dreamed of stepping away from the day-to-day of the business, traveling the world and sharing how he and his team did it all,” explains Matt. “That’s how the idea of the school was born. We started working in earnest on the Stan Winston School of Character Arts in 2010, and that’s been the focus of the family ever since. It’s online educational videos by the leaders in the industry.” The global shutdown caused by COVID-19 had a major impact. “We were perfectly positioned for a pandemic, which accelerated the adoption and acceptance of online education by a decade.” Reflecting on the influence of his frequent collaborator and business partner, Cameron states, “We were aligned philosophically on how important it was to have details and nuance. Stan thought like a director, and I thought like a makeup or visual effects guy. He and I met in the middle. His legacy is that every CG character that you see now, he was at the cutting edge of proving that technology.” Rosengrant has a fond memory of working on Amazing Stories. “Stan took me aside and said to me, ‘You’re a really good artist.’ There are a lot of people who have some artistic gifts and need a place to develop them, and Stan had that atmosphere that allowed me to grow and develop. By him recognizing that, I felt like I had finally arrived at that point in time, and then it kept going up.”



[ VES NEWS ]

Visual Effects Society Recognizes Special 2023 Honorees By NAOMI GOLDMAN

TOP LEFT: 2023 Founders Award recipient Tim McGovern with Board Chair Lisa Cooke at VES Honors. TOP RIGHT: New VES Fellows Bob Coleman, Jeff Barnes, Chuck Finance, Shannon Gans and Tim McGovern.

VES celebrated a distinguished group of VFX practitioners at its festive VES Honors event this past October, held at the Skirball Cultural Center in Los Angeles. VFX Supervisor and founding VES member Tim McGovern was named recipient of the 2023 VES Founders Award. The Society conferred Lifetime VES Memberships on Archivist and Curator Sandra Joy Aguilar; Producer and AMPAS Governor Brooke Breton, VES; VFX Artist Agent and Executive Bob Coleman; and Tim McGovern. This year’s venerated VES Fellows bestowed with the post-nominal letters “VES” were Jeff Barnes, Toni Pace Carstensen, Bob Coleman, Chuck Finance, Shannon Gans, Tim McGovern and Ray McMillan. The 2023 VES Hall of Fame inductees included Samuel Z. Arkoff, Lawrence W. Butler, Wah Chang, Norman Dawn, ASC, Walter Percy Day, OBE, Marcel Delgado, Farciot Edouart, ASC and Edward D. Wood, Jr. Founders Award recipient, Lifetime Member and VES Fellow Tim McGovern, VES is an Oscar-winning VFX Supervisor and Creative Director known for groundbreaking work in VFX and Computer Animation. He has played key leadership roles with the VES over the last 20 years, including global Board of Directors Vice Chair, founding Co-Chair of the VES Awards Committee and founding Chair of the VES Committee for Outreach to Developing Regions, contributing greatly to the Society’s global expansion. Lifetime Member Sandra Joy Aguilar is a renowned Archivist and Curator who oversees metadata and indexing at USC’s Shoah Foundation – The Institute for Visual History and Education, an online resource on the Holocaust and other genocides. She has played a key role in developing the VES library, digital museum and archive projects for the past 20 years as Co-Chair of the VES Archives Committee and consultant on the VES Roth Museum of Visual Effects.

Lifetime Member Brooke Breton, VES has been involved in a wide variety of live-action films, animated films, TV series and theme park projects. She played an instrumental role in launching Digital Domain and was Senior Production Executive for Illumination Entertainment. Breton was a producer for the Academy Museum of Motion Pictures, is an AMPAS governor representing the VFX Branch and a three-term member on the VES global Board of Directors. Lifetime Member and VES Fellow Bob Coleman, VES is the President and Founder of Digital Artists Agency with a roster of VFX talent that includes Academy Award, VES Award and Emmy Award-winning artisans. He has a tenure working in film production, visual effects and post-production that spans over 40 years. Coleman has served on the VES Awards Committee for 23 years, including nine years as Chair, and a long tenure on the VES global Board of Directors. VES Fellow Jeff Barnes, VES is an entertainment and technology creative executive who continues to make marked impacts across Silicon Valley and Hollywood and currently serves as the Executive Vice President of Creative Development at Light Field Lab. He has contributed to the development, support and advancement of the creative community, serving as Chair and Vice Chair of the VES global Board of Directors, and was previously honored as a VES Lifetime Member. VES Fellow Toni Pace Carstensen, VES is a founding member of the VES and its Executive Committee and served on the global Board of Directors for many years, as well as Co-Chair of the Global Education Committee and Co-Editor of the first edition of the VES Handbook of Visual Effects. Carstensen serves as Chair of the Vision Committee and was previously honored as a VES Lifetime Member and recipient of the VES Founders Award. VES Fellow Chuck Finance, VES has been a valuable longstanding member of the VES following a career in educational, information and theatrical filmmaking. 
Early in his career, he gained a reputation for producing and directing science films, then went on
to co-found Perpetual Motion Pictures and work as a VFX Producer and VFX Consultant. Finance played an instrumental role on the VES Awards Committee for almost two decades, including service as the Chair and Co-Chair. VES Fellow Shannon Gans, VES is a producer and builder of studios and teams at the intersection of creativity, technology and business. She is part of the Apple team on its mixed-reality headset and is best known as the CEO and Co-Founder of New Deal Studios, a 21st-century creative VFX, design and production studio known for award-winning film and commercial sequences. Gans has played a leadership role serving several terms on the global VES Board of Directors. VES Fellow Ray McMillan, VES is a venerated VFX Supervisor/Producer, Director and Director of Photography known for his work on films, including The Day After Tomorrow, The Incredible Hulk and X-Men. His career has been punctuated with more than 40 awards for his outstanding body of work. McMillan is deeply involved with the VES Toronto Section, serving multiple terms as the Section Treasurer and Secretary and chairing their studio screenings. Hall of Fame inductee Samuel Z. Arkoff was an American producer of more than 200 low-budget exploitation films and is credited with starting the beach party and outlaw biker movie genres. Hall of Fame inductee Lawrence W. Butler was an Academy Award-winning American special effects artist best known as the inventor of the bluescreen process. Hall of Fame inductee Wah Chang was an American designer, sculptor, animator and artist who, in 1939, became the youngest member of Disney’s effects and model department.

TOP LEFT: New VES Lifetime Members Sandra Joy Aguilar, Tim McGovern, Bob Coleman and Brooke Breton. TOP RIGHT: Quentin Martin, VES France and Carol Madrigal, VES Georgia at the VES Honors celebration with Board Chair Lisa Cooke (left) and Executive Director Nancy Ward (right). BOTTOM TWO: It was a beautiful night celebrating the VFX community at the VES Honors celebration at the Skirball Cultural Center in Los Angeles.

Hall of Fame inductee Norman Dawn, ASC, was an early American film director and artist. He was the first person to use the glass shot in a motion picture and the first director to use rear projection in film production.

Hall of Fame inductee Walter Percy Day, OBE, was a British painter best remembered for his work as a matte artist, special effects technician and master of illusionist techniques.

Hall of Fame inductee Marcel Delgado was a Mexican-American sculptor and model-maker whose technique revolutionized stop-motion animation. His most famous work was on the original King Kong, for which he sculpted one of the most iconic characters in cinematic history.

Hall of Fame inductee Farciot Edouart, ASC, was a 10-time Academy Award-winning American motion picture special effects artist of French descent. He was a recognized specialist and innovator in process photography, also known as rear projection.

Hall of Fame inductee Edward D. Wood, Jr. was an American filmmaker, actor and pulp novel author. In the 1950s, he directed several low-budget science fiction, crime and horror films that later became cult classics.



[ VES NEWS ]

VES Bay Area Hosts Farewell Tribute to 32Ten Studios

TOP TO BOTTOM: VES Bay Area Section members, guests and the VFX community gathered for the Fall BBQ and farewell tribute to iconic 32Ten Studios. VES Bay Area Section Manager and Tribute Event Producer Rose Duignan and VES Board Chair Lisa Cooke honor 32Ten Studios with fellow guests. Beloved Star Wars friends joined in the festivities for 32Ten Studios at the VES Bay Area BBQ and VFX gathering.

Marking the end of an era, the VES Bay Area Section held an emotional final event at the 3210 Kerner Main Stage and Courtyard in San Rafael in tribute to the iconic 32Ten Studios, which has now shuttered operations. The gathering and VES Bay Area Fall BBQ was held on the revered grounds that were once home to Industrial Light & Magic. For more than four decades, thousands of the most exemplary blockbuster movie effects to hit the screen were created in this legendary studio, including Star Wars, Men in Black, Ghostbusters, Poltergeist, Terminator and more. The tribute event brought together hundreds of visual effects artists, former employees and VFX colleagues to give a heartfelt sendoff to the acknowledged birthplace of cutting-edge visual effects and hub of creativity and innovation. Rose Duignan, VES Bay Area Section Manager and Event Producer for the epic event, who once worked as a production supervisor at Industrial Light & Magic and Kerner Optical, echoed the sentiments of so many. “It was always about the people who brought their talent and passion into this special place,” Duignan said. “While buildings may rise and fall, the legacy of everyone who inhabited this space, the enduring friendships, the work we created together – that is what will live on and surely shape generations to come. I am forever grateful to have been a part of the history this space holds and to have the honor to help produce this final goodbye.” VES Board Chair Lisa Cooke and a lineup of impassioned speakers, including Imagine Documentary filmmaker Joe Johnston, ILM Chief Creative Officer John Knoll, Pixar President Jim Morris, VES, pioneering visual effects artist Dennis Muren, VES and more – along with special guests R2-D2, C-3PO and a squad of stormtroopers – were all on hand to honor the historic event and enduring relationships made along San Rafael’s canal. They all joined in for a momentous last group photo in front of the 32Ten soundstage.
For their support and hard work in coordinating the “party of the season,” the VES Bay Area Section wants to thank the great team who brought it to life, including its Section Board of Managers and VES headquarters, fantastic student volunteers, the founders of 32Ten and generous sponsors Advance Systems Group (ASG), Industrial Light & Magic, Barbier Security and local favorite Big Jim’s BBQ.



[ ON THE RISE ]

Rutul Patel Nair CEO & VFX Producer, Digital District India & Co-Chair, VES India VES Member: 6 years VES benefit most enjoyed: Networking events and VES nominations event.

Natalie A. Palumbo Motion Graphics Designer & VFX Compositor VES Member: 1 year VES benefit most enjoyed: Professional discussions/interviews and VFX breakdowns

Born and raised in Gujarat, India, VFX Director Rutul Patel has always seized opportunities and worked to reinvent herself throughout her professional journey. Her impressive portfolio and multiple Addy Awards helped her land a job in the VFX industry. “After completing my MFA at the Academy of Art University in San Francisco, I joined a leading VFX studio in Los Angeles as an intern. I hadn’t really trained to be a matte painter, but my then-supervisor, Ken Nakada, felt I had a natural flair for it.” Rutul soon found herself part of big Hollywood projects like Avatar, Tree of Life and Sucker Punch. “I was in school when Titanic was released, and I fell in love with the movie. Never in my wildest dreams had I imagined that in a few years I would not only get to work on Avatar, but also meet the genius behind it all, James Cameron.” When Rutul returned to India, she joined Oscar-winning VFX studio Rhythm & Hues in Mumbai as a texture painter and worked on projects like Life of Pi, Snow White and the Huntsman and R.I.P.D. In 2012, Rutul started her own production house specializing in visual effects for commercials, having previously worked for a couple of ad agencies in India and interned for one in San Francisco. Rutul has worked on a number of VFX projects for clients like Unilever. Films, however, remain her first love. Since 2017, Rutul has been CEO of French VFX house Digital District in Mumbai. Her work includes Irma Vep, My Brilliant Friend (HBO), Mortel, Mortel 2, The Kissing Booth (Netflix), Coeurs Noirs, Too Old To Die Young (Amazon), Snowfall (FX) and movies such as Apache, Eau Fourte, The Deep House, Germinal, Last Journey of Paul W.R., Hellboy, Extinction, Playmobil and At Eternity’s Gate. Her advice for newcomers to the VFX industry: “Join the VFX industry only if it’s your obsession. It’s a tough business, and artists need stamina to make it. And stamina comes with an irrational love for the work.
And generously absorb all kinds of new knowledge, and don’t restrict yourself only to your specialization.” In her spare time, Rutul likes to paint and grow her own vegetable garden.

Natalie is a Motion Designer and VFX Compositor based in New York City. She is an Adobe Certified Professional in Visual Design & Video Design and a member of the VES New York Section. Born outside of Baltimore, Maryland, Natalie developed an interest in visual arts at an early age. Her older brother, Anthony, has low-verbal autism. As children, conversation was missing from their sibling relationship. After trying and failing with words, Natalie found she could bond with her brother by drawing his favorite cartoon characters from Animaniacs (1993). She realized art was an effective form of nonverbal communication and decided to pursue it as a career. Natalie became fascinated by VFX and animation in motion pictures. She was particularly struck by the iconic “rose petals” scene in American Beauty (1999) and how animation and complex editing were used to convey intense emotion throughout the film. From there, Natalie decided to study VFX compositing and attended Ringling College of Art & Design as a motion design major. After graduating with honors in 2017, she went to New York City to intern at Marvel Entertainment. She has since worked with AOL, Yahoo, Netflix, Adobe, Behance, Nielsen and independent filmmakers in Manhattan and L.A. doing VFX and post-production. Natalie is most proud of the visual presentation she created for an uncut audio interview, hosted by Tom Frangione with Sir Paul McCartney, discussing the 40th anniversary of his Tug of War album, a special for SiriusXM’s The Beatles Channel. The visual presentation is included in the Paul McCartney Archives. “To have my name associated with anything Paul McCartney is a dream come true. I thank Tom Frangione for allowing me to lend my creative skills to such an iconic interview.” Natalie advises creatives to always do their research. “Motion design is all about creative problem-solving. I research technique until I get the result I want. I never want to say, ‘I can’t.’” In her free time, Natalie enjoys filming N.J.
classic rock band the Black Ties on location, visiting art installations in New York City and spending time with her brother.



[ THE VES HANDBOOK ]

Advantages and Disadvantages of Using LED Volumes for Production
By GLENN DERRY
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Virtual Production (Chapter 2), edited by Susan Zwerman, VES and Jeffrey A. Okun, VES

Is an LED Volume the Right Tool?

The decision to commit to an LED Volume style of production needs to happen well before the budget and schedule are determined. If the LED volume is the primary methodology for principal photography, its use will influence every aspect of the production pipeline: key creative hires, the types of scenes written, production design, as well as the practical concerns of crew members and visual effects vendor choices. Key creatives will need to embrace not putting off decisions until post-production. Ample time should be allocated to the pre-production period for shot planning and visualization, digital asset construction and creative approvals to be finalized before principal photography begins. When using an LED volume, most digital production design and pipeline tasks, such as modeling, texturing and lighting, are moved from post-production into pre-production. Additionally, there is the added overhead of optimizing the digital environment for the game engine used to render and display content. The physical art department and virtual art department (VAD) work simultaneously with the production designer to complete a cohesive design. Physical set pieces that may also appear on the LED wall need to be digitized and used as a reference for the digital content team. All of this happens at the pace of a production’s shoot schedule rather than driven by an editorial request in post-production. When deciding to use or not use an LED volume, there are many factors to consider:

Advantages of LED Volume Production

• Has the potential to reduce principal photography and post-production schedules.
• Creative decisions are made in pre-production, establishing a blueprint for principal photography while working out the technical logistics.
• Distant or exotic locations can be reproduced on the LED volume stage, minimizing crew travel and associated overhead.
• LED volumes can reduce the overall number of physical stages and associated support equipment required for a production while maintaining the script’s scope and scale.
• The LED wall provides environmental illumination and reflections on the subject(s) and set surfaces.
• Shooting in an LED volume allows for as much time as required to shoot sunrise, nighttime, “golden hour” or other typically time-sensitive scenes, and in a comfortable environment.
• The LED volume offers better continuity for reshoots and pick-ups. The shooting team can load a previously photographed scene’s background into the wall for newly written dialogue or use plate photography captured on location as the background source material in the volume.
• The actors can see the world they are performing in, resulting in more authentic performances and sight lines.
• The LED volume is schedule-friendly for actors, especially children or other talent that may have contractual time or travel restrictions.

The Ten Commandments of LED Volume Production

1. Key creatives, such as the Director, Production Designer and Director of Photography, must be available during project development and for the entirety of pre-production.
2. Creative decisions must be made early and acted upon with urgency.
3. The pre-production schedule will be longer.
4. Money will be spent sooner.
5. Creative transparency and open collaboration are required across all departments.
6. Every physical set piece or prop appearing on the LED wall must be scanned and reconstructed digitally.
7. The VAD (Virtual Art Department) is an extension of the art department creating the set.
8. Virtual scouts and pre-light days in the LED volume are required for every set.
9. Final look decisions must be made in the LED volume through the camera lens.
10. Creative drives the technology.

Members receive a 20% discount with code: HVP23 Order yours today! https://bit.ly/VES_VPHandbook





[ VES SECTION SPOTLIGHT: LONDON ]

London Calling: Resilient VES London Section Revisited By NAOMI GOLDMAN

TOP TWO: VES London members and guests partake in their fantastic mini-golf Summer Party, sponsored by Chaos, creators of V-Ray. BOTTOM: The “Art & AI” panel discussion was hosted by DNEG and organized by Board member Jahirul Amin from CAVE Academy.

The Visual Effects Society’s international presence gets stronger every year – and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together. The VES London Section continues to be a vibrant international community of artists. It boasts upwards of 280 members, primarily working in film and television, with a significant representation of artists working in commercials and games. “The VES London Section is both thriving and resilient amidst a host of successive large-scale challenges that have impacted our industry and our members,” said Nicolas Casanova, Chair of the VES London Section. “On the heels of Brexit, then the COVID pandemic and shifts to work-from-home and back to hybrid models, and most recently the production slowdown from the dual strikes, the London VFX community continues to adapt to new ways of working and prove itself immensely creative and strong.” “There’s some fantastic work coming out of the U.K. – it’s world-leading,” said Helen Moody, Secretary of the VES London Section. “With films like Oppenheimer, Barbie, Mission: Impossible – Dead Reckoning and The Little Mermaid and episodics including The Witcher, Good Omens, The Crown and Loki, we have so much to be proud of. While many VFX companies in the region have been facing tough times amidst the economic uncertainties, I am confident that our market is positioned for growth and that we will bounce back stronger than ever once projects shift back into production.” VES Sections stepped up during the pandemic and created a host of rich virtual programming to continue to provide member benefits and offer support and social connections, including a supportive wellness series featuring a whiskey-tasting session, cooking class and essential oils demo – all online.
As restrictions opened up to allow for in-person events, the Section has hosted a strong lineup of screenings and panel conversations – with huge thanks to MPC for providing their screening room for many years, and to former VES London Board member Greg Keech for shepherding the robust screening program. VES London created a series of well-attended educational events with CAVE Academy on topics including an “Art and AI Panel” sponsored by DNEG London; “Shooting HDRIs for VFX,” also featuring CineArk; and “Playing with Polarisation” at Cinesite’s new studios. And the Section held a festive Summer Party at Birdies mini-golf in Angel, London, sponsored by Chaos (creators of V-Ray) – complete with the inaugural Chaos Cup competition. “Our mini-golf party really felt like we came together as a very connected industry, a community convening of friends and colleagues,” said Casanova. “About half the attendees were members and the other half senior representatives from visual effects companies, which we organized both to foster networking and enhance our relationships with potential supporters and sponsors. Since COVID, there’s still a high number of people who work remotely or hybrid, and this has affected the number of people who travel into London



and attend in-person events. So, we’re working on fewer, better in-person events. Our Summer Party was a shining example.” As part of the Section’s “Life in VFX” Series, VES London hosted a successful “Career Change in VFX” panel featuring artists who had transitioned both into and out of the visual effects field, highlighting the value of transferable skills and relationships. The Section is also investing in programs that touch on other life skills and experiences to benefit its members, such as navigating parenthood and the pursuit of work-life balance. This dovetails with the Section’s efforts to highlight the VES Member Assistance Program, which provides free round-the-clock resources to all VES members on a wide range of professional and personal issues. In 2023, VES London held a well-attended online VES Awards nominating panel. As Moody pointed out, “The London Section was the first outside of the U.S. to host our own VES awards judging event about 20 years ago. That was important to us because so much of the award-winning work was coming out of the U.K. – we wanted to stand alongside our U.S. colleagues to participate in judging the world’s best work.” The Section is poised to host an in-person VES Awards panel in January to amp up opportunities for personal interaction and networking. The Section Board of Managers is working on a number of goals and initiatives, including: developing events and educational sessions that appeal to its members’ needs; celebrating the work coming out of the U.K.; and providing more opportunities to foster relationships. The Board is also committed to diversifying its membership to keep the Section “evolving and fresh,” with a special focus on attracting younger artists and technicians on the cusp of VES eligibility (with five or six years of VFX experience), to provide a forum for inter-generational knowledge-sharing and a richer platform for diverse perspectives and skill sets.
VES London is a big source of pride for Section leaders Casanova and Moody. Said Casanova, “What I find in the VES is that the people you meet are all trying to improve the industry in one way or another. The quality of our members and the talent they possess are inspiring. Often people who become fellow members or friends, I discover that I already know their work as a fan – and VES closes the gap. I feel that the VES Awards is a crowning achievement that we produce year after year, and that our awards carry so much weight and meaning for the winners and nominees because of the knowledge of our craft that we bring to the process.” Moody shared some closing thoughts, “It’s an honor to represent the talented VFX professionals who are VES members in the U.K. I’m proud to be a Board member because all of my colleagues give their time and talents and energy for free. We do it because we want to make a difference and, working together, I know that we are poised to bring about change that will benefit our members and our global community.”

TOP AND MIDDLE LEFT: A dynamic panel led the “Career Change in VFX” discussion, part of the VES London “Life in VFX” Series. MIDDLE RIGHT: VES London members and guests enjoy an exclusive film screening. BOTTOM LEFT: Section members enjoyed a “Playing with Polarisation” educational session at Cinesite, coordinated with CAVE Academy. BOTTOM RIGHT: Section members participate in an educational session on “Shooting HDRIs for VFX,” organized by CAVE Academy and CineArk.



[ FINAL FRAME ]

“It’s Alive!”

Resurrection or reanimation has long been a province of lore and literature. As a literary device it has its roots in the Mahabharata and the Bible. We’ve also seen it in more contemporary works such as Mary Shelley’s Frankenstein, J.R.R. Tolkien’s The Lord of the Rings, J.K. Rowling’s Harry Potter and many others. Director James Whale brought Frankenstein to the silver screen in 1931 in what is now considered one of cinema’s greatest classics. More than 80 films have been inspired by the tale of the Doctor and his Monster. These days, resurrection and reanimation are increasingly explored by artists and scientists. This issue’s cover story is about a novel twist on the concept. In Poor Things, director Yorgos Lanthimos weaves a highly visual and VFX-infused tale of the evolution of a young woman brought back to life by an off-center genius of a scientist named Dr. Godwin Baxter. In a recent interview, Lanthimos said that three films were his reference for the tone and feel of the film: And the Ship Sails On, Belle de Jour and Young Frankenstein. But the film is about more than just bringing someone back to life and a showcase for VFX. It explores themes of feminism, equality, liberation, sturdiness and aspiration in signature Lanthimos style.

Image from House of Frankenstein (1944) courtesy of Universal Pictures.


