

TRON: ARES - FUTURE VISION


Welcome to the Fall 2025 issue of VFX Voice!


Thank you for being a part of the global VFX Voice community. Our cover story ventures inside the highly sophisticated program, Ares, at the center of the sci-fi action film Tron: Ares. We go behind the scenes of the twins-fueled horror film Sinners, trending otherworldly streamers Star Trek: Strange New Worlds and Alien: Earth, and immersive gaming hit Star Wars Outlaws. We delve into trends around AI, VFX and advertising, amplify the latest content-generating tools for music videos, celebrate the artistry of matte painters and highlight the expanding alliance between virtual production and VFX.
VFX Voice goes up close and personal with profiles on MPC Paris leader Béatrice Bauwens and Mission: Impossible VFX Producer Robin Saxen. We get immersive with advances in VR-based mental health therapies with USC’s Dr. “Skip” Rizzo, showcase the state of VFX in Ireland in our special report, go down under with our VES Section spotlight on VES Australia, and much more.
Dive in and meet the innovators and risk-takers who push the boundaries of what’s possible and advance the field of visual effects.
Cheers!
Kim Davidson, Chair, VES Board of Directors
Nancy Ward, VES Executive Director
P.S. You can continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on X at @VFXSociety.

FEATURES
8 MUSIC VIDEO: CONTENT TSUNAMI
AI, VP and smartphones multiply options for frontline creatives.
14 VFX TRENDS: AI, VFX & ADVERTISING
AI, VFX software advances reshape short-form creation.
20 COVER: TRON: ARES
Bringing the Grid into the real world – with believability.
28 PROFILE: BÉATRICE BAUWENS
MPC Paris leader is a passionate promoter of French VFX.
32 TV/STREAMING: STAR TREK: STRANGE NEW WORLDS
Expanding the unexplored cosmos with virtual production.
38 UNSUNG HEROES: MATTE PAINTERS
Honoring traditional artistry while staying agile with new tech.
44 PROFILE: ROBIN SAXEN
Mission: Impossible VFX producer links efficiency and creativity.
50 SPECIAL FOCUS: IRELAND
Incentives and innovation signal a bright green future for VFX.
56 VIRTUAL PRODUCTION: WELCOMING THE SHIFT
The growing integration of VP into the traditional VFX pipeline.
62 FILM: SINNERS
A new approach was developed to achieve the twinning effects.
68 HEALTH & WELLNESS: VR & MENTAL HEALTH
Dr. “Skip” Rizzo on innovations in the field of VR-based therapy.
72 TV/STREAMING: ALIEN: EARTH
New alien creatures join the Xenomorph wreaking havoc on TV.
82 VIDEO GAMES: STAR WARS OUTLAWS
Massive Entertainment utilized VFX to create an immersive event.
DEPARTMENTS
2 EXECUTIVE NOTE
90 THE VES HANDBOOK
92 VES SECTION SPOTLIGHT – AUSTRALIA
94 VES NEWS
96 FINAL FRAME – THE SECRET OF IRELAND
ON THE COVER: Light-emitting vehicles, weapons and suits are part of the visual language for Tron: Ares (Image courtesy of Disney Enterprises)


VFXVOICE
Visit us online at vfxvoice.com
PUBLISHER
Jim McCullaugh publisher@vfxvoice.com
EDITOR
Ed Ochs editor@vfxvoice.com
CREATIVE
Alpanian Design Group alan@alpanian.com
ADVERTISING
Arlene Hansen arlene.hansen@vfxvoice.com
SUPERVISOR
Ross Auerbach
CONTRIBUTING WRITERS
Gemma Creagh
Naomi Goldman
Trevor Hogg
Chris McGowan
ADVISORY COMMITTEE
David Bloom
Andrew Bly
Rob Bredow
Mike Chambers, VES
Lisa Cooke, VES
Neil Corbould, VES
Irena Cronin
Kim Davidson
Paul Debevec, VES
Debbie Denise
Karen Dufilho
Paul Franklin
Barbara Ford Grant
David Johnson, VES
Jim Morris, VES
Dennis Muren, ASC, VES
Sam Nicholson, ASC
Lori H. Schwartz
Eric Roth
Tom Atkin, Founder
Allen Battino, VES Logo Design
VISUAL EFFECTS SOCIETY
Nancy Ward, Executive Director
VES BOARD OF DIRECTORS
OFFICERS
Kim Davidson, Chair
Susan O’Neal, 1st Vice Chair
David Tanaka, VES, 2nd Vice Chair
Rita Cahill, Secretary
Jeffrey A. Okun, VES, Treasurer
DIRECTORS
Neishaw Ali, Fatima Anes, Laura Barbera
Alan Boucek, Kathryn Brillhart, Mike Chambers, VES
Emma Clifton Perry, Rose Duignan
Dave Gouge, Kay Hoddy, Thomas Knop, VES
Brooke Lyndon-Stanford, Quentin Martin
Julie McDonald, Karen Murphy
Janet Muswell Hamilton, VES, Maggie Oh
Robin Prybil, Lopsie Schwartz
David Valentin, Sean Varney, Bill Villarreal
Sam Winkler, Philipp Wolf, Susan Zwerman, VES
ALTERNATES
Fred Chapman, Dayne Cowan, Aladino Debert, John Decker, William Mesa, Ariele Podreider Lenzi
Visual Effects Society
5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Phone: (818) 981-7861. vesglobal.org
VES STAFF
Elvia Gonzalez, Associate Director
Jim Sullivan, Director of Operations
Ben Schneider, Director of Membership Services
Charles Mesa, Media & Content Manager
Eric Bass, MarCom Manager
Ross Auerbach, Program Manager
Colleen Kelly, Office Manager
Mark Mulkerron, Administrative Assistant
Shannon Cassidy, Global Manager
Adrienne Morse, Operations Coordinator
P.J. Schumacher, Controller
Naomi Goldman, Public Relations


MUSIC VIDEOS AND VFX KEEP IN TUNE WITH EACH OTHER AND THE TIMES
By TREVOR HOGG
OPPOSITE TOP AND MIDDLE: Bluescreen was utilized for the aerial balloon shots in the music video for “Contact” featuring Wiz Khalifa and Tyga. (Images courtesy of Frender)
OPPOSITE BOTTOM: Door G makes use of a 56-foot-wide by 14-foot-tall virtual production wall in their studio located in East Providence, Rhode Island. (Photo courtesy of Door G)
Seen as a breeding ground for celebrated filmmakers such as David Fincher, Spike Jonze and the Daniels (Daniel Kwan and Daniel Scheinert), music videos no longer require millions of dollars from a record company to produce or a major television network to reach the masses. The media landscape continues to evolve with the emergence of virtual production, AI and Unreal Engine, as well as what is considered to be acceptable cinematically. Whether an actual renaissance is taking place is open to debate, as “Take On Me” by a-ha, “Sledgehammer” by Peter Gabriel and “Thriller” by Michael Jackson remain hallmarks of innovation despite the ease of access to technology and distribution that exists nowadays for independent-minded content creators and musicians.
Constructing a 56-foot-wide by 14-foot-tall virtual production wall in East Providence, Rhode Island for commercials, movies and music videos is Door G, a studio that also does real-world shoots, makes use of AI-enhanced workflows and handles post-production. “Independent artists can put their work out on YouTube and social media, and get attention that way,” states Jenna Rezendes, Executive Producer at Door G. “People may not have heard the song but have seen the video. The visual is important.” One has to be agnostic when it comes to methodology and technology. She says, “We can work with clients or directors and come up with a plan that suits their creative. Does it make sense to do it virtually or lean into some AI to help create some backgrounds? All of these methods can be combined to create the best product. There are numerous tools accessible to a lot of different people. People are making music videos on their own using iPhones and putting them on TikTok. It’s a matter of how the approach fits the end result you’re looking for, within your budget.”
TOP: Ingenuity Studios provided and coordinated the visual effects work for the music video for “My Universe” by Coldplay + BTS, with contributions from AMGI, BUF, Rodeo VFX and Territory Studio. (Image courtesy of Ingenuity Studios)

AI will become a standardized tool. “It allows you to be more iterative in the creative development phase. You can churn through ideas and not feel like you’re wasting a lot of valuable time,” remarks Joe Torino, Creative Director at Door G. “AI is a quick pressure test of ideas versus spending weeks building things out only to realize it’s not going to work. Like any other new technological development, AI will become a choice of the creator and the person who is buying the creative. Do they want something that is AI or human-created or a combination of both? It will become clear in the future what’s what, and people will have a choice to make what they want to invest in.”
Unreal Engine is another important tool. “Unreal Engine allows you to create worlds from scratch, but, that said, it still requires a lot of processing power not so widely available yet,” Torino states. “We did a music video that was this post-apocalyptic world entirely created in Unreal Engine. This is something you would struggle to do as video plates. This allowed them, on a reasonable budget, to create this world that otherwise would have been impossible.” Content is not being created for the long term. “Things are more disposable,” Torino believes. “If you do a music video that is innovative and creative, you get a chance to have a moment, but I don’t know how long that moment will last. You can make a great music video with one filmmaker and artist, and that probably wasn’t the case in the 1960s, 1990s and early 2000s,” Torino states. “The biggest factor for change is through AI. Soon, you will be able to make an entire music video using AI. We’re probably six months to a year out from someone who can recreate the aesthetic of the music videos for ‘Take On Me’ or ‘Sledgehammer’ with a few clicks and some dollars to spend on AI video rendering platforms. It will be polarizing like everything else, but polarization ultimately gets eyeballs, so people are going to do that.”
Leveraging his experience as a visual effects producer, Max Colt established Los Angeles-based Frender, which has worked on award-winning music videos for the likes of Harry Styles, Marc Anthony, Thomas Rhett and Nicki Minaj. “For me, it’s the music, the camera operator, the idea and a bunch of tools to play with,” observes Colt. “Right now, we work with AI technology because, for the past three years, it has become a trend. It’s also easy to jump into the industry because you can shoot a music video on an iPhone like Beyoncé – it’s simple to create content, but it’s not easy to create good content. I’ve been in this industry for 15 years and have done 500 or 600 music videos. I am still doing them because I personally like music as well as the freedom, and getting to work with the creatives and musicians.” Virtual production cannot be treated in the same way as a greenscreen shoot. Colt notes, “You need to approve everything before the shoot, and if you want to change something, it’s hard but still possible. This is a new tool, and for a big production like a movie, it’s time and cost-effective. For a music video, virtual production is okay; maybe in the next five years it will become cheaper than shooting regular greenscreen.”
“You can shoot a nice story without any post,” Colt observes. “It is a trend to create nonexistent things and visual effects help you to do that. You can produce some cool stuff with After Effects. But for simulations you need to use Nuke or Houdini. If you aesthetically understand the difference, you can play with these tools. Music videos are a great platform to play with new tools. Movies are a risky business, and no one spends a lot of money on new stuff.”
In America, music videos, commercials and films are treated as different industries. Colt remarks, “Sometimes it’s hard to change industries. If you start on music videos, it’s hard to jump onto big movies. Europe is more flexible. However, music videos are the best platform to start your career, understand the process and see the final result.”
TOP TWO: From Normani & 6LACK’s “Waves” music video. Software and technology have become readily available, along with tutorials via YouTube, which means that record labels and artists no longer have to hire major visual effects companies for their music videos. (Images courtesy of Frender)
BOTTOM: Diljit Dosanjh’s music video of his song “Luna” from the album MoonChild Era. Music videos provide the opportunity to experiment with new tools. (Image courtesy of Frender)

Projects vary in scope when it comes to enjoyment, he adds. “Some projects I love and have a lot of stories. Others are copy-and-paste projects. Just like being a visual effects artist, sometimes it’s technical work such as clean-ups, while other times it’s creative, like producing nonexistent things. It really depends.”
Ingenuity Studios was established to create visual effects for music videos, which continue to be part of a corporate portfolio that has expanded into television and feature films. Noteworthy collaborations have been with musicians such as Shawn Mendes, Selena Gomez, Taylor Swift, Billie Eilish and BTS. “Music videos have been the leading example of how viewing and creation habits have changed as well as the purpose of these things,” observes David Lebensfeld, President & VFX Supervisor for Ghost VFX and Ingenuity Studios. “Starting with the MTV era, music videos at their core were promotion to sell albums and took on a larger role in society when they became a place where creative people could go and stretch their legs and do stuff that wasn’t possible in other mediums. Record labels were willing to pay for it because it actually coincided with how music gets promoted. Certainly, more recently, TikTok and Instagram, probably more than YouTube, are where music gets promoted. On a production scale, you approach it differently for TikTok than you do for the MTV era because you would have $2 million to $3 million videos, whereas now it’s about viral trends, which are completely different in how those things get exposed and become popular.”
“Music videos are more divorced from cinema than they were in the A Hard Day’s Night or MTV era where you would have the same people making those things,” Lebensfeld remarks. “Now it’s more about capturing the moment and zeitgeist quicker, dirtier and less polished but maybe equally as clever. For that, you don’t need a six-week timeline and a $1 million budget to make a music video. The other benefit, particularly around the economics of record labels, is they used to put together a plan


TOP: Boston-based independent artist Llynks filmed a music video for her song “Don’t Deserve” in Door G’s East Providence, Rhode Island studio. (Photo courtesy of Door G)
BOTTOM TWO: Frender produced the music video for Coldplay’s “Up&Up”. (Images courtesy of Frender)



where you would come out with a big artist’s album and say, ‘These are the three singles that we’re going to put a bunch of money behind and promote.’ That’s when music video production would begin. It was before the albums came out. Over time, even in the MTV era, the record labels started waiting and seeing what hit in order to better deploy their capital to promote the album. With this current trend, it’s even more so, where they’ll make a music video for something they had no idea was going to be a hit, go with the moment and be able to produce something and put it on TikTok, or let the audience make it user-generated and incentivize that; that is the best use of their capital to promote an album.” Some things will continue to stand out. “But the way that content is created and consumed is going to be more about reaching niche, localized audiences versus making something big and expensive for everybody.”
Music videos themselves are not revenue generators. “AI, Unreal Engine and cellphone cameras – these tools that allow you to make content or user-generated content at a larger scale for a lot less money is where this is heading and arguably is already,” Lebensfeld observes. “The biggest piece of technology that is changing the way that music videos get made isn’t actually production related. It’s social media and the technologies behind that, which will continue to evolve with AI or hardware that gets better, easier or more ubiquitous. If you start having people wearing AR glasses all day long, then you’re going to need content for that. It’s not going to be about how technology is changing what gets produced. It’s more about how technology is changing how things get produced [for YouTube, TikTok and Instagram] and the need for content. Arguably, right now, video cameras in your phone are a big piece of the production puzzle, and will become more popular as people consume more content on the go, in a virtual environment or at a Fortnite concert. The AI piece allows for a lazier production process, so you could have somebody generating prompts of something they think is funny and setting it to a Lady Gaga song, and that’s what everybody finds hilarious and ends up blowing up a single.”
TOP AND MIDDLE: Frender collaborated with Chris Brown and Nicki Minaj on the music video for “Wobble Up.” (Images courtesy of Frender)
BOTTOM: Door G has an agnostic attitude towards methodologies and technologies as there is not one solution for everything. (Photo courtesy of Door G)


NEW TECH AND VFX ADVANCES SPARK COMMERCIALS AND SHORT-FORM
By CHRIS McGOWAN
It’s not just advances in traditional VFX software that are transforming advertising; it’s the rise of emerging technologies like generative AI that is reshaping the approach to short-form content creation, according to Piotr Karwas, VFX Supervisor at Digital Domain. “From early stages like storyboarding, concept art and previs, to sometimes even contributing to the final image, this hybrid method of blending generative AI with classic VFX techniques marks an entirely new chapter in effects production,” Karwas states. “Real-time rendering with game engine technology is becoming an increasingly integral part of our VFX workflow. It plays a particularly significant role in previsualization and motion capture, enhancing both speed and creative flexibility.”
Karwas notes, “Digital Domain has a remarkable history of producing high-impact visual effects for advertising and short-form content, consistently pushing the limits of what is achievable in the medium. Since its founding, the studio has delivered cinematic quality and compelling storytelling in commercials, collaborating with some of the industry’s most visionary directors, including David Fincher, Mark Romanek, Michael Bay and Joe Pytka.”
“Whether it’s crafting hyper-real environments, photoreal digital humans or seamless compositing for global campaigns, Digital Domain has played a pivotal role in redefining audience expectations for brand storytelling,” Karwas says.
TOP: Daredevil, Colossus and other Marvel characters in the Electric Theatre Coke commercial bridged high-end visual effects and classic comic book art. (Image courtesy of Coca-Cola and Marvel Comics)
OPPOSITE TOP: Rodeo FX crafted 3D billboards in Times Square and Piccadilly Circus to advertise the survival-horror video game The Callisto Protocol. (Image courtesy of Rodeo FX and Striking Distance Studios)

From Super Bowl commercials to global product launches, he comments, “the studio’s ability to combine cutting-edge visual effects with engaging narratives has made it a preferred partner for major brands aiming to make a bold and immersive statement in a competitive advertising landscape.”
“Visual effects and advanced digital technologies are transforming how brands engage with audiences,” adds Karwas. This includes the growth of interactive and immersive advertising. He explains, “Digital Domain is at the forefront of this change, utilizing its innovative digital human technology to create experiences that go beyond traditional media. A powerful example is The March 360, a real-time virtual reality experience that not only tells a story but also places viewers inside a pivotal moment in history. By authentically recreating the 1963 March on Washington and digitally resurrecting Dr. Martin Luther King Jr. to deliver his ‘I Have a Dream’ speech, Digital Domain has shown how VFX can produce emotionally resonant and immersive content that is both historically significant and technologically innovative. This goes beyond mere advertising; it is experiential storytelling that blurs the line between reality and digital creation, engaging audiences on a deeply human level.”
Electric Theatre Collective demonstrated cutting-edge visual effects in “Coca-Cola – The Heroes,” for which it won the Outstanding VFX in a Commercial prize at the 2025 VES Awards, as well as the best “Lighting & Compositing” award. Directed by Antoine Bardou-Jacquet, the project merged two iconic brands: Coca-Cola and Marvel. The visual effects and animation were integral to the storytelling. “From the outset, the campaign needed three things: carefully designed interpretations of key Marvel characters, CG development and animation of those characters, and vibrant color grading,” says Ryan Knowles, Electric Theatre Collective Creative Director.
“After some early engagement where we created some test animations for Antoine and the agency, GUT [Buenos Aires], we were brought in as a key creative partner for the campaign,” comments Greg McKneally, CG VFX Supervisor at Electric Theatre. “The brief for this project was to bring illustrated and sculpted Marvel heroes to life, seamlessly embedding them in the real world of a comic shop. We needed to tell a dramatic story while retaining a unique look for each character based on the medium they originated from: comic book, t-shirt or collectible.” McKneally notes, “With dynamic camera work and physical interactions, we decided the base of each character needed to be CGI. To maintain authenticity, we used traditional 2D animators, illustrators and painters throughout the process, adding hand-made elements into scenes. This approach allowed us to bridge the worlds of comic book art and high-end CGI, allowing for complex scenes in the physical world of the set, with a hand-crafted 2D feel.”



TOP TO BOTTOM: For the “Malaria Must Die” TV spot campaign, Digital Domain utilized Charlatan technology to age David Beckham nearly 30 years to deliver a powerful speech and message. (Image courtesy of “Malaria Must Die” campaign and Digital Domain)
Rodeo FX has delivered visual effects for commercials for many notable clients, including McDonald’s. (Images courtesy of Rodeo FX and McDonald’s)
Electric Theatre brought Juggernaut and other Marvel superheroes to life in the real world of a comic-book shop, and Marvel together with Coca-Cola. (Image courtesy of Coca-Cola and Marvel Comics)
McKneally continues, “Our primary software pipeline involves Maya for modeling, rigging and animation; Houdini for shading, scene assembly, lighting, FX simulation and rendering; and Nuke for compositing. Houdini, and a number of different custom tools built within it for the project, formed the backbone of the stylized character creation; these included Paintify 2.0, which allowed us to create Daredevil as a moving watercolor and pencil illustration.”
To achieve the look of Black Widow as a 2D illustrated character moving in three dimensions, Electric Theatre worked with a traditional comic book concept artist to develop a set of consistent illustrations. Knowles explains, “These were used as a guide for the development of the CG character, and we also trained a bespoke machine learning model on this original concept art. This machine learning model was used to process and filter some of the CG renders of the character. It was a key ingredient in the recipe of the final look of Black Widow, which was crafted in compositing.”
“Simple real-time rendering was used extensively in the previs phase of the project, during which we carefully mapped out each camera move and blocked out the animation for the scenes in Maya,” McKneally says. “We were able to rough out the placement of the lights and key shadow elements, as well as the characters themselves. To create the unique shading style of Juggernaut, we built a custom shading system which generated the shadows as geometry in the scene, meaning that the artists working in Houdini had a real-time preview that was very close to the final look of the character.”
“The biggest challenge in this project was that each character required a different technical and creative approach,” McKneally says. “For example, while Colossus is a resin maquette and could use a more conventional CG approach, Juggernaut is inspired by 1980s-1990s comics. He is a 3D animated character, but has a limited color palette and uses a bespoke shading system, hatch line geometry, mid-tone shadow mattes which reveal classic printed half-tone shading, and hand-painted 2D animated expressions. All of this was composited together with mis-printing filters, bleed, grain and irregular line work to achieve the final look.” McKneally adds, “Meanwhile, Captain America and the comic book city feature custom toon shaded CG, hand-painted buildings carefully recreated and textured in CGI, hand-painted skies, a custom linework system, 2D FX animation and crowd simulations.”
Daredevil was one of the most complex challenges. “His entire look is an evolving watercolor painting with pencil details,” Knowles comments. “The paint strokes that define his form had to feel 2D but still react in a realistic way to the lighting of the environment he was moving through. This was achieved in a combination of 3D lighting and an evolution of our bespoke 3D paint system in Houdini – originally developed on ‘Coca-Cola Masterpiece’ – and then assembled and finessed in the hands of the compositors.” According to McKneally, “3D animation and compositing tools are progressing at a rapid rate. This, combined with the incredible pace of development for machine learning tools, means that the tech landscape for visual effects in advertising is evolving at speed. At Electric, we have created a workflow that allows artists to harness the latest developments in image generation, animation, rendering and compositing, staying nimble and flexible enough to swap new tools into our workflow as the technical or creative demands evolve. We’re finding many projects now benefit from a bespoke combination of traditional and emerging tools.”
Knowles observes, “One of the biggest challenges we have seen comes from rapid AI image generation. Polished images generated by all parties – agencies, clients, directors – help to sell a concept or idea, but risk shortcutting the creative development. They offer a high level of finish in a very short time, but can often lack the nuance, depth and personality that multiple iterations in a traditional concept art development process can achieve. We strive to strike a balance.” Knowles adds, “To bring this nuance back into the fold, our creative team uses a mix of traditional design and animation workflows to iterate ideas and then engage custom machine learning workflows to add polish and variation. This allows for rounded creative development with the high level of ‘finish’ now expected by agencies, directors and clients. As machine learning image and video generation becomes more and more available – and quality improves – having the ability to guide the artistry that underlies the final image becomes even more valuable for creating original, breathtaking work,” McKneally explains.
“As an end-to-end production house, VFX is at the heart of what Alongside does,” says Glen Taylor, Founder of Alongside Global. “VFX is central to our efforts in the advertising industry, and this trend isn’t slowing down. We use it all the time to drive storytelling, realism and creative innovation. From compositing and CG to AI-enhanced workflows, VFX helps shape our final output across our projects. We just created a new website for the healthcare company Calcium, where we built an asset in 3D. VFX spans mediums.”
The pace of change is intense in VFX software and tech and greatly affects advertising. “New tools are released every day – face swaps, language transcription and translation, and AI scene building, to name a few,” Taylor says. “Unreal Engine is already transforming how we work. It allows for live client feedback, instant changes and scalable outputs from AR and VR to social and film. Once we hit photorealism in real-time, it will become the core production tool.” He adds, “The challenge is knowing what to use and when. We’re experimenting constantly, especially with AI-generated content that’s licensing-safe and creatively sound. The goal isn’t to replace creativity – it’s to move faster and smarter.”
Rodeo FX has delivered visual effects for many advertising clients, including McDonald’s, Nissan, Red Bull and Tiffany & Co. Ryan Stasyshyn, VFX Supervisor and head of the Toronto office for Rodeo FX, remarks, “Content creation has become remarkably accessible, especially with the integration of AI. Simultaneously, our clients are increasingly seeking greater levels of collaboration throughout the creative process.” Real-time rendering is coming into play in production and VFX for advertising. “That said, I think it’s worth noting that real-time has a number of hurdles that need to be addressed before it becomes the mainstay of the industry. In regard to real-time rendering to output final frames in a traditional VFX approach, I believe the industry is at that point and, depending on the studio, is either there or going to be there shortly.



Rodeo FX Ads & Experiences helped create the immersive event Stranger Things: The Experience, working with Netflix, Fever, Mycotoo and Liminal Space. (Image courtesy of Rodeo FX and Netflix)
Rodeo FX worked on visual effects for Tiffany & Co. as part of the global “About Love” brand campaign. (Image courtesy of Rodeo FX and Tiffany & Co.)
TOP TO BOTTOM: Deadpool enjoys popcorn and a Coke in the comic-book shop created by Electric Theatre. (Image courtesy of Coca-Cola and Marvel)


TOP: The Electric Theatre ad, uniting 2D and 3D animation with live-action in a comic-book shop, won VES awards given in 2025 for Outstanding VFX in a Commercial and Outstanding Lighting and Compositing in a Commercial. (Image courtesy of Coca-Cola and Marvel)
BOTTOM: The TIME magazine cover image of Dr. Martin Luther King Jr. was created using a historically accurate 3D rendering of King drawn from “The March,” a virtual reality experience that depicts the 1963 March on Washington. (Image courtesy of Digital Domain, Hank Willis Thomas and the Martin Luther King Jr. Estate)
As real-time rendering pertains to virtual production, I think its use has shown real promise but also a large barrier, with education of the users being the biggest hurdle. The technology is there, and the talent in the industry can certainly support it. I think we may have done ourselves a disservice by overhyping it a little too much out of the gate and sometimes setting unrealistic expectations.”
Erik Gagnon, VFX Supervisor and Flame artist for Rodeo FX, says, “Rodeo FX’s work in advertising is decisively oriented toward the future through the proactive integration of artificial intelligence into our creative process. This approach enables us to provide our clients with a highly personalized and innovative partnership, elevating their original concepts into stunning visuals and unique storytelling experiences. Advancements in VFX tools and technologies are profoundly transforming our approach to advertising. Today, we focus less on isolated software and more on comprehensive ecosystems of specialized tools, allowing us to push creative boundaries. These developments make our production pipelines more adaptive and support artists in becoming increasingly versatile and agile.”
Gagnon continues, “We are witnessing the rise of interactive and immersive advertising powered by VFX. Rodeo FX has crafted visually compelling experiences, such as 3D billboards in iconic locations like Times Square and Piccadilly Circus for The Callisto Protocol. Additionally, during the Cannes Film Festival, Rodeo FX Ads & Experiences created an ingenious activation for Stranger Things, captivating attendees and sparking their imaginations. The synergy between VFX, advertising, and experiential marketing represents an ideal combination to genuinely bring brands to life. By integrating cutting-edge visual effects into interactive environments, we engage audiences more deeply, stimulating their senses and creating memorable brand connections. This powerful blend not only elevates brand awareness but also ensures impactful and lasting impressions.”
Franck Lambertz, VFX Supervisor and head of Rodeo FX’s Paris office, sees AI-powered image and video generation as “an exciting new playground.” He notes, “While the technology is still developing and can be unstable, we see it as another valuable tool in our creative arsenal. It still requires passion and expertise to avoid clichés and elevate the final result. Our roles may become less technical and more focused on creative leadership.” Lambertz continues, “Increasingly, clients arrive with well-developed reference decks. Thanks to tools like Midjourney, some directors have already visualized their concepts. This makes the creative process more efficient and enables us to work together more effectively to finalize the vision.” Lambertz adds, “Sometimes, clients challenge us with real-time projects. Depending on the brief, we assemble the right pipeline and talent to deliver top-tier visuals. Recently, we completed a VR experience for a live event in Paris – our first time producing a 7K, 60fps stereoscopic render, fully synchronized with fan and lighting effects. Watching the team immerse themselves in the experience with VR headsets and exchange feedback was a memorable moment.” Concludes Karwas, “Advertising is rapidly entering a new era. The integration of generative technologies and VFX is unlocking entirely new possibilities for connecting with audiences, enabling brands to engage and interact with customers in ways that were previously unimaginable.”


PROGRAMMING THE REAL WORLD INTO TRON: ARES
By TREVOR HOGG
Just as Tron was prescient about the technological and societal impact of computers, the growing debate over the applications and ramifications of artificial intelligence has worked its way into the video game-inspired universe conceived by Steven Lisberger with the release of Tron: Ares. Whereas Tron and Tron: Legacy essentially took place within the digital realm, the third installment, directed by Joachim Rønning, goes beyond the motherboard as an advanced AI program known as Ares (Jared Leto) embarks on a dangerous mission in the real world.
“The core of our film is this contrast between the Real World and Grid World, which has been established within the first two Tron films,” notes VFX Supervisor David Seager. “The big challenge for us was, how do you bring the Grid into the real world in a believable fashion? There is a lot of embellishment that could be made when you’re inside the Grid because you can say, ‘This is inside of the computer, so Grid things can happen.’ But if you did some of those exact same things in the photography you shot in the real world, it would begin to break the illusion. Joachim Rønning drove it from the start by embracing, ‘What does a Light Cycle look like in the real world? How does it behave?’ There were all of these different things we had to embrace and answer for ourselves to find a language that first and foremost supports the story, but, also, the more believable the visual is, the more people are going to buy into the story.”
Images courtesy of Disney Enterprises.
TOP: Practical Light Cycles were built with LED light strips, which allowed for scenes in the real world to have the proper reflections and refractions.
OPPOSITE TOP: Director Joachim Rønning talks to Jared Leto about his vision for the Tron franchise. (Photo: Leah Gallo)
OPPOSITE BOTTOM: Light Ribbons surpass lightsabers in complexity.
Light-emitting vehicles, weapons and suits are part of the visual language for Tron. “With the majority of our film being in the real world, we embraced shooting as much as possible in the real world,” Seager states. “The Light Cycles are the best example of that. You can sit there and go, ‘You want to shoot in the real world. Then go with a camera, shoot an empty road and imagine what’s happening.’ Or, what we decided to do, which is to say, ‘We’re going to have proxy bikes that represent Light Cycles.’ Our entire team worked together to find these electric Harley-Davidson bikes that were modified to have the same wheelbase as a Light Cycle and had the stunt team driving through the streets doing the action. I find you get that added reality of the real physics of the bike having to change lanes, pulling in front of a car and the camera car following. The speed is different, so the camera operator has to correct and keep his eye on the camera. All these million decisions go together to make a shot that looks like a motorcycle driving down a street. Then, our job is to go back, and over the top of that proxy we put our Light Cycle. We worked with the lighting department and put a bunch of LED bricks on bikes to have them emit light. Even the stunt riders’ safety suits had light lines. Our hope there is you might get some happy accidents. The fact that we had a real bike reflecting where it needed to reflect and casting that light where it needed to go was a guide. Were there times when we wanted to deviate from that motion? Absolutely! However, the number of wins we have far outweighed those times we had to go in there and paint some light out.”
Among over 2,000 visual effects shots produced by primary vendor ILM, as well as Distillery VFX, Image Engine, GMUNK, Lola VFX and Opsis, is the signature action sequence where a pursuing police car gets sliced in half by a Light Ribbon generated by a Light Cycle. “You see what they filmed practically and notice what you need to do,” remarks Vincent Papaix, VFX Supervisor at ILM. “There is postvis happening. We need to make it look photoreal, and for the way we see a Light Ribbon, it’s in its name. It’s light and Ribbon, so we make it an indestructible piece of glass emitting energy that can cut things. The special effects team had a precut police car, pulled it with two wires and then separated it. What we enhanced in CG was the Light Ribbon, but also the sparks, the piece of the engine falling down, dust and smoke. We also had to have a tight match to the police car because of all the interactive light and reflections that go in there. We could have done full CG, however, because of my compositing background, I am a big fan of trying to preserve the photography. It was meticulous work from the 3D and 2D departments to blend the plate and CG that was going on top to make it seamless.”
Appearing in the sky is the Recognizer. “What we did in order to make the Recognizer believable in the real world was to add a lot of details, like screws and an engine that was referenced from a rocket ship,” Papaix states. “It’s still the design of Tron, but with the technology of our real world to make sure this can fly in a believable way.” A larger issue was the environmental integration.
TOP: Ares utilizes 3D printing technology to enter the real world.
BOTTOM: Complicating the Light Ribbons is the history of where they were previously included in shots.

“We used Burrard Street, one of the biggest avenues in Vancouver, but when we put the Recognizer in the scene, it would be in-between buildings when it’s supposed to move through the street. That’s because in the story the Recognizer is chasing a character, so it needs to be not just over the city but be able to go through the city. In order to do this, sometimes we had to widen the street, which was quite a challenge.” Vancouver is known for rain. Papaix says, “All of our photography work had wet ground, and wet means it reflects a lot. On set, they put a lot of LED tubes where the action was supposed to be. Most of the time it worked great. We built an entire Vancouver in CG and had a couple of blocks at the ground level that were a photoreal full CG asset. We put in the effort to have a CG replica of the real photography, so we were able to render our CG version of the city to patch things as needed or integrate things because the action got moved.”
Animators had to act like cinematographers. “At the end of the third act, there is a dogfight between some Tron Light Jets and a F-35, and besides animating the vehicles, we also animated all of the cameras,” states Mike Beaulieu, Animation Supervisor at ILM. “We looked at aerial footage of different films or documentaries, and made a list of what kinds of lenses we usually use when we’re shooting stuff like that. It also gets creative because our director has a sensibility for what kind of lenses he prefers and what types of shots he likes to have. We can make adjustments there.

TOP: A dramatic aerial dogfight takes place between Light Jets and a F-35.
BOTTOM: The big challenge was how to bring the Grid into the real world in a believable fashion.


TOP: Principal photography took place in Vancouver with the majority of the scenes shot at night.
BOTTOM: Joachim Rønning has a conversation with Greta Lee while surrounded by a massive and dramatic practical red set constructed for scenes that take place within the Dillinger Grid. (Photo: Leah Gallo)
We would animate everything in 3D space first and then we would go in with an eye of, ‘How would we film all of this stuff?’ We would cover it with different camera angles and see what angles looked most interesting, which ones tell the story the best and still have an energy to it without getting too flat; 200mm to 400mm lenses tend to flatten everything, so to have a sense of traveling through 3D space with lenses like that is tricky. You can have a jet flying at you going 300 miles per hour, but with a 400mm lens, when you’re that far away, it doesn’t look like it’s changing that much. We had to get creative by covering it with a 100mm lens so we can actually read some of that travel of the jet flying around.”
A new Identity Disc is introduced. “You get to read a little bit more of the animation of the triangle disc that Ares has as it’s flying through the air,” Beaulieu remarks. “Because of its shape, you don’t always register the revolutions that the circular disc is doing as it’s getting thrown. The triangular disc also had these little blades that come out on the ends, which adds to the silhouette. We had conversations about how the triangular discs should move and behave. Do they behave as you would expect in the computer program where they go on a perfectly straight line and hit things and rebound, like Captain America’s shield where it deflects off surfaces? Or do we play it more [as if] they have to arc like a frisbee or boomerang? We ended up falling into a realm where we did a bit of both. Whatever felt best for the shot. If it needed to be something fast and have a big impact then something that was straight would feel better. But in the wider shots we had to put an arc on some of the discs to differentiate them. There is a sequence when we have these ENCOM soldiers flooding the plaza and they’re throwing discs. There are 15, 20 or 35 discs flying around in the shot, so to make them all read, we had to vary what they were doing. Some of them are flying straight and others are on a bit of an arc.”
Light Ribbons surpass lightsabers in complexity. “There’s something that nobody realizes, which is the history of Light Ribbons is tricky,” notes Jhon Alvarado, Animation Supervisor at ILM. “Basically, for every shot, you have to animate Light Ribbons four shots before it.” Performances had to be altered because of the Light Ribbons. “If you play that shot a little longer, they’ll be going straight through a Light Ribbon. We had to change the motion to make them get past it. There are happy accidents that come about where an actor leans over a tiny bit, and there’s nothing there to tell her to do that, but it works perfectly with her getting out of the way of the Light Ribbons. Then Jeff [Capogreco, VFX Supervisor] adds a reflection of the Light Ribbons on the environment and face, and all of a sudden you go, ‘Those Light Ribbons were there the whole time. How did they do that?’” The Light Ribbons had to be produced quickly and efficiently. Alvarado says, “At the beginning it was a hard, slow process to get the Light Ribbons generated, so we took a lot of technology and worked with our pipeline and R&D to determine how we get this in Maya so the animators are able to generate Light Ribbons in real-time or quickly whenever there is a change. For example, if we have to do 300 or 500 frames of history, we don’t want to be there waiting for half an hour as the Ribbons generate. We want to see it straightaway. There was a lot of work that was done to accommodate for these changes that happen in the filmmaking process.”




TOP TO BOTTOM: Light Ribbons had to be adjusted to suit the performance of the actors.
Over 2,000 visual effects shots were produced by ILM, Distillery VFX, Image Engine, GMUNK, Lola VFX and Opsis.
The special effects team led by Cameron Waldbauer created a practical police car that could be split in half, while the Light Ribbon was inserted later in post-production.
Ares originates in the Dillinger Grid, which has a red color palette and resembles an industrial wasteland. “The environment itself is loosely interpreted as a giant motherboard surrounded by voxelated water,” states Jeff Capogreco, VFX Supervisor at ILM. “There is a whole sequence where Ares and Eve are riding a Skimmer, a floating motorcycle that causes the water to create a giant rooster tail of bioluminescence and water spray. We did this whole deep dive of finding a bunch of clips of rooster tails shot at different times of day and angles, and we had the shading folks go in and light it, match it and validate it. Once we had that in the can, we started to interject what we were going to be doing. The other neat thing is, as you’re going along the water you start to realize that the water is actually a bunch of grids. Imagine you’re in fairly calm water where every 20-by-20 meters, there is a red border that outlines an area. One neat component is, as the Skimmer passes through these gates, it charges the rooster tail. You get these bursts of beautiful bioluminescence energy that illuminates the environment. We wanted to give it a visual feast.”
“Another department that we collaborated with was environments because you can imagine the Dillinger Grid being huge,” Alvarado remarks. “The Skimmer is traveling 150 miles an hour. We had to find ways for us to change and add things in animation that environments then used to Tron-ify the world and still feel in that environment. Another thing that we [addressed] was how do we create life in this Dillinger Grid? We have people and objects flying, and to go to every shot in animation and add that to this huge world would have been time-consuming. I spent time working with the environments artists to create a Dillinger Grid library of life. Then, environments procedurally put it within this grid, and we see it in shot. If something calls out in the shot then we can go in and change it. But most of the time we were like, ‘That’s enough.’ Things like that allow us to use the departments, collaboration and technology to help us get the best out of it rather than brute force our way into it all the time.”
When it comes to figuring out how Ares makes the transition from the Grid World into the Real World, a current technology was utilized. “They didn’t want that sharp contrast, like computer and real,” Capogreco reveals. “You can imagine these conversations. There was at least a year’s worth of R&D and trying to figure out, how do you bring a program into the real world? We leveraged the idea of 3D printing technology. What you will see are things that are born out of Jig. If you know anything about 3D printing, you get these pieces that support the model inside, and what you will see visually are lasers effectively creating these structures, and inside the structures are printed human beings or programs or vehicles that are then born into the real world. It’s beautiful. Lots of deep contrast, sparks and lasers. It’s science fiction at its best.”
It is not always about being revolutionary. Capogreco observes, “It’s finding ways to get the most out of what we have as opposed to reinventing something. With present-day computer technology, we’re not having to rewrite renderers. We’ve got Pixar’s RenderMan, which is phenomenal. It’s more about how can we maximize for performances as well as working with Jhon and animation to make sure we’re obeying physics – and not changing things too quickly or doing things that are going to make simulations go off the rails. It has been a great process on this show.”


BÉATRICE BAUWENS: A CHAMPION FOR GROWING FRENCH VFX
By OLIVER WEBB
Béatrice Bauwens was born and raised in Tourcoing near Lille in the North of France where her parents were teachers. “I studied first at Lille University of Science and Technology,” Bauwens begins. “At that time, I wanted to be a sound engineer, and I always loved science. Music was also in my background. Back then, there were very few places offering a dedicated education in the subject [of sound engineering]. I studied electronics, then with a specialization in Image and Sound at Université de Bretagne Occidentale in Brittany, and the world of images opened up to me. I discovered editing and computer graphics. I had a few courses in POV-Ray, Autodesk 3ds Max and coding during those university years. My career as a sound engineer ended there, as I was discovering these new fields with images and collaborative projects. Then I took a final Master’s year in production back in the north of France at Université Polytechnique Hauts-de-France in Valenciennes.”
Bauwens’ first break came during her studies where a documentary project required generating a car in CGI, with the compositing done directly in the editorial software Media 100. “It was really a self-discovery with no real teaching about the art at all, and that’s it,” Bauwens says. “When I began working, my first experience was as a technician/production assistant for a documentary producer, more on the edit side. VFX was not something that was of interest. I was involved with the editorial and tape dubbing. Then, I moved to Mikros Animation in 2002, and it’s really where I discovered the ‘real’ visual effects. I was a technical assistant in the CGI department of Mikros; CGI to video transfer, render watching, digital archiving, IO and linking with editorial. I have never been a graphic artist, but I’ve always worked with them side by side.” The visual effects part of Mikros for film and episodic changed its name in 2020.
“After technical years in Mikros,” Bauwens continues, “I moved to the production department as VFX coordinator, then VFX producer. Those roles are very project-oriented.” Bauwens served as Visual Effects Producer on the 2012 film Rust and Bone, “an amazing experience with Cédric Fayolle, who was the VFX Supervisor and my colleague at the time. I was a VFX producer for Mikros then. It was a very challenging project, working for the first time for such a talented filmmaker, Jacques Audiard. It involved a lot of shots, the Cannes deadline, a diversity of techniques – 3D, 2D fix and practical – and the liberty with camera movement that was given to the director. Creatively, one challenging part was the animation of Marion Cotillard’s prosthesis when she arrives at the nightclub. The way she walks is telling something about her mindset, and the way to animate the prosthesis that was replacing her real legs needed to express that. Unfortunately, the film didn’t get the Palme d’Or at Cannes, but we were nominated for the VES Award for Outstanding Supporting Visual Effects in a Feature Motion Picture. We didn’t win, but it was quite an experience being in Los Angeles for our work on a feature film.”
OPPOSITE TOP AND BOTTOM: MPC Paris was the primary vendor, led by Bauwens, for multi-Academy Award-nominated Emilia Pérez, directed by Jacques Audiard. (Top photo: Shanna Besson. Images courtesy of Netflix)
In 2022, Mikros merged with MPC and Bauwens became Head of Studio for MPC Paris. In February, the Technicolor Group collapsed, sinking MPC – but not everywhere. “Depending on our location, the outcome is very different,” Bauwens remarks. In April, New York-based TransPerfect, a leading global provider of language and AI solutions for business, acquired France-based MPC and The Mill.
Images courtesy of Béatrice Bauwens, except where noted.
TOP: Béatrice Bauwens, Head of Studio, MPC Paris. (Image courtesy of PIDS-Enghien)

“It’s true that the French film industry may look a little bit shy about the use of VFX. However, it’s definitely changing. Some very ambitious projects in terms of VFX were recently made. … The ongoing evolution of the tools allows new possibilities to be offered to the filmmaker and producer, pushing the limits of our craft again. At the end of the day, it is about how to make the trick help the story.”
—Béatrice Bauwens, Head of Studio, MPC Paris
Bauwens continues in her leadership role at MPC Paris and joins the senior management team of TransPerfect Media. “I’m still in charge of MPC’s activities in France – post and VFX – and in Belgium,” Bauwens states. “The MPC banner is still used for our operations in France.”
When it comes to MPC selecting projects, Bauwens explains it’s mostly the other way around; it comes down to which projects select MPC. “Especially in Paris, thanks to our history with the French cinema, there is definitely not one type of project that we are working on. The diversity of our projects reflects this. For example, we’ve been involved with Justine Triet’s Anatomy of a Fall, which is definitely not highly intensive with VFX, and Under Paris, directed by Xavier Gens. It’s really often a question of trust –trust that MPC will bring its expertise and solutions to help deliver the project to its ultimate goal: telling a story.”
When it comes to choosing a favorite VFX shot from her career, Bauwens doesn’t single one out in particular.




TOP AND MIDDLE: Bauwens is proud of being a part of The Last Duel (2021) as Head of Visual Effects for Mikros VFX. (Images courtesy of 20th Century Studios)
BOTTOM: MPC Paris was honored with the 2025 César & Techniques Innovation prize at the César Academy in France in January. (Image courtesy of César Academy)
“One thing I know that is unique to VFX is, per show, there is always one shot that is a challenge; it may not be the most difficult one, but it will last until the end, bringing a specific difficulty, and you know when delivered, the rest of the shots are going to go well. It’s the shot whose name you may remember, like ‘070_020’, for years. And when you see the movie at the end, it goes so fast. All the work, all the worries for shots that last only a few seconds. I guess it is very specific to our VFX world where days, even months can be spent with one open shot. It’s such a different experience than being on set where, at the end of the day, you know you have filmed.”
One project, however, that Bauwens is particularly proud of is Ridley Scott’s The Last Duel. “Ridley Scott was a big step,” Bauwens acknowledges. “I wasn’t directly on the film. The VFX Producer was Kristina Prilukova. I was Head of Studio and this film would determine our future in Paris in the international landscape. Part of the work was done in MPC Montreal, and in Paris we did environments, FX and crowd work. It was one of the very first projects that triggered the 40% tax rebate bonus in France by having more than €2m of VFX in Paris. We weren’t allowed to fail. Working with Ridley Scott for our team in France was like a life goal. I think about Christophe ‘Tchook’ Courgeau, the wonderful [MPC] Head of Environment in Paris, building the Notre-Dame Cathedral in the Middle Ages for the one director who inspired him to work in this industry and visual effects. As a team in the studio, we were very proud of being part of the movie.”
Bauwens also serves as Co-Chair of France VFX. “France VFX is a trade association with 24 companies in France that are related to VFX. I co-chair this organization with Olivier Emery [Founder, CEO and VFX Producer at Trimaran]. France VFX aims to support and promote the French VFX industry. The diversity of the group that constitutes France VFX represents the diversity of our industry in France. Through this association, we can better defend the interests of the industry at national and international levels.

“[W]hen you see the movie at the end, it goes so fast. All the work, all the worries for shots that last a few seconds. I guess it is very specific to our VFX world where days, even months can be spent with one open shot. It’s such a different experience than being on set where, at the end of the day, you know you have filmed.”
—Béatrice Bauwens, Head of Studio, MPC Paris
the interests of the industry at national and international levels. It also allows us to promote our savoir faire to other industry associations that represent DOPs and producers. It shows that there is a strong ecosystem for VFX in France.”
Bauwens is optimistic about the future of visual effects in the French film industry. “It’s true that the French film industry may look a little bit shy about the use of VFX. However, it’s definitely changing. Some very ambitious projects in terms of VFX were recently made. Le Comte de Monte-Cristo is one good example; it was received by audiences with great success. There is also an interesting evolution of the ‘film de genre’ with films like The Substance and The Animal Kingdom [Le règne animal] where the visual effects make the story possible. The ongoing evolution of the tools allows new possibilities to be offered to the filmmaker and producer, pushing the limits of our craft again. At the end of the day, it is about how to make the trick help the story.”
Concludes Bauwens, “Georges Méliès was one of the first pioneers of visual effects. There are many factors in the French film environment that are beneficial to reinventing our craft again – the schools, the intermittent social status for artists, the ecosystem of companies that have existed for a long time and the new ones, too. France is now more a part of the global industry. These are all positive signs for the future of VFX in France.”

BOTTOM: Bauwens, left, is Co-President of France VFX, an association of French VFX vendors, and a passionate spokesperson for the benefits of utilizing French VFX in international as well as local productions.
TOP: Bauwens served as Visual Effects Producer on Rust and Bone (2012), directed by Jacques Audiard. (Photo: Roger Arpajou. Image courtesy of Why Not Productions and Sony Pictures Classics)

GOING BEYOND WHAT WAS DONE BEFORE FOR STAR TREK: STRANGE NEW WORLDS S3
By TREVOR HOGG
Captain Christopher Pike helmed the USS Enterprise in the original pilot for Star Trek, but NBC rejected it, and the lead role passed to Captain James T. Kirk, who went beyond the realm of television and film to become a pop icon. Despite being replaced, Pike did not become a footnote but appeared in an episode of the original series and on the big screen over half a century later. Things went into hyperdrive for the Starfleet officer when a guest appearance on Star Trek: Discovery paved the way for him to reclaim leading-man status with the production of Star Trek: Strange New Worlds, which revolves around his galactic expeditions on behalf of the United Federation of Planets. The third season, consisting of 10 episodes, does not shy away from exploring different genres with an irreverent wit, and it expands the scope of the unexplored cosmos by utilizing virtual production.
Images courtesy of CBS Studios, Inc.
TOP: Starfleet Captain Christopher Pike (Anson Mount) views the outcome of his space battle against the Gorn from the bridge of the Enterprise.
OPPOSITE TOP TO BOTTOM: Virtual production was utilized to expand the scope of the practical sets constructed for the Gorn hive environment. Practical pods were constructed with actors placed inside.
“There is a variety of different stories and approaches to things in Star Trek: Strange New Worlds,” states Jason Zimmerman, Supervising Executive & Lead Visual Effects Supervisor on the show. “They’re not afraid to try out different ideas and tell a variety of stories. It’s always cool because when you read the scripts, you may not have the bigger picture of things, and when you finish the season, you look back and go, ‘That worked great.’” The goal was to capture the spirit that Gene Roddenberry infused into his innovative space western that aired during the late 1960s. “This show is taking those classic moments and themes of the original series and bringing them into the present,” remarks Brian Tatosky, Visual Effects Supervisor. “The original series had different tones to episodes, like more serious or lighthearted. What Strange New Worlds has done is push that even farther to where we have genre

episodes within our episodes. Visual effects is about supporting those scripts. If it’s a funny episode, we need to make sure that our stuff fits. If it’s a scary episode, the sets and effects have to be as scary as possible.”
Virtual production has become a significant component of world-building. “The main difference for us on this show and on the shows where we do virtual production is we’re building something for the day we shoot,” Zimmerman states. “We’re not prepping for something that we know is coming in post. It requires more urgency and interaction with the different departments to make sure we’re aligning with what they’re planning. If you’re talking about something like an Engineering set, a lot of that is practical in terms of what they’re interacting with. The graphics, to [First Assistant Art Director: Motion Graphics] Timothy Peel and his team’s credit, are mostly practical, so we’re not planning for a lot of monitor comps later on. The most successful assets have always been those that start with something practical and iterate it into a virtual asset. In that way, you’re able to hide the effect. No one knows where it starts and ends. Some of those assets are organic in nature, like the Gorn hive, which was a virtual production asset, and yet we had practical pods. They could sit in the foreground, which pushed everything away in depth and made it feel like it was all one piece, as opposed to something like Engineering, which is all hard surfaces but requires interactive lighting. It’s interesting because each one is different and custom to the story.”
Not every alien encounter is friendly. “Luckily, through our tenure on Star Trek, we have done a few space battles,” Zimmerman notes. “We have learned a lot about what to do and

To create an impression that the interior of the ancient prison was influenced by quantum physics, a constantly moving virtual background was created that goes off into the infinite distance.
not to do along the way. Fortunately, in Episode 301 with the Gorn, we have Chris Fisher directing, who has been with us since the start. He’s our EP and PD [Producing Director]. Chris loves to storyboard and interact with everybody. It’s about the prep. It’s different from a virtual production where everybody is working towards a common thing, although there was some virtual production playback hidden in those sequences. Primarily, it’s about the choreography of things and establishing a geography. We start to previs the minute we get his boards. We might even be doing previs while he’s beginning to shoot. If something comes up on set where we go, ‘They’re looking more this way,’ we can adjust our animation to match. What Chris plans typically holds, and that’s helpful for us because we’re not scrambling to try a new direction or change something in post.” Virtual environments also pay off well before cameras roll. “Before even the storyboards, they can be mulling around in the creative process,” Tatosky remarks. “We can show them potential shots and layouts of where things need to go in a sequence if characters need to get from Set A to Set B. When shooting begins, they have a better idea of how the pieces fit together from day one, which means fewer shots and retakes.”
Among the threats that Pike and his crew encounter are Klingon zombies. “There are a few things in visual effects that you want to play with, and zombies are at the top of the list,” Zimmerman laughs. “Other than the shots that are obvious visual effects, which we’re either keeping out or in with a containment field or bursting into particles, all the zombies you see are special effects makeup. It was done on a virtual production asset, when they’re on top of that plateau where the shuttle lands. We had our moments where we got to play in the sandbox. You would log into QTAKE to see what they were doing on set on the day, and they were on a different planet with zombies. There was nothing we had to do later for that. The table was set for us.” The Gorns had practical and digital versions. “There was a guy in a [mocap] suit, as well as CG, for
TOP TO BOTTOM: Practical Gorns interacted with the cast while digital ones were produced for moments where they defy physics, like crawling on walls.
The exterior shots of the Enterprise in outer space were fully CG.

when they’re hanging upside down on the ceiling,” Zimmerman explains. “I’m sure the actors are so much happier having something that is live in the scene to interact with. It helps us in terms of our lighting and textures to have them all marry together. We might add a blink or some drool, but there is a nice division of labor between what we do practically and digitally that helps to sell the creatures in their environment. When you see somebody being held up by a practical Gorn, it feels so much more visceral and real. We like working that way.”
First appearing in Star Trek: The Animated Series and transitioning to live-action with Star Trek: The Next Generation, the Holodeck has a near-disastrous test run that almost kills its user and cripples the Enterprise. “The fun part is figuring out how thick you want the yellow lines to be on the Holodeck because it has changed over the show’s seasons even in Star Trek: The Next Generation,” Tatosky notes. “Then looking at the original effects of the transitions into Holodeck sets in TNG and going, ‘We want to honor that, but there has been enough time that we should upgrade these things a little bit and make them more visually complex because the viewer of today expects more than essentially a cross dissolve, which was what a lot of these effects were.’ I loved the scripting of it all, including why they’re using the Holodeck and what happens as a result. It fits into the Universe, and they get to tell a funny story with a lot of neat twists and turns that the audience will enjoy.”
Among the new environments visited by the Enterprise is an ancient prison that uses quantum physics to entrap a malevolent life force that resembles glowing fireflies. “The background was meant to move and feel infinite in the space,” Zimmerman states. “That was a virtual production asset, so it’s a perfect example of something that we started designing the minute we knew it was an idea. There are technical challenges because you’re creating a massive environment that has to move. We have to make sure


TOP TO BOTTOM: Star Trek: Strange New Worlds provided the opportunity for Starfleet Captain Christopher Pike to take center stage after the character was replaced by Captain James T. Kirk in the original series.
Storm Studios was responsible for the digital set extension of the decommissioned research facility that produced the Klingon zombies.
Christina Chong as La’an, Darius Zadeh as Metron and Melissa Navia as Ortegas in Episode 309.


that it performs well on the wall. You have to make sure that it is lit properly because it has this bright backlighting, but it’s also supposed to be this moody, mysterious space as well. It was a fun one to construct because of the scope, and in our world, where we build so much in the future, this was meant to feel more ancient but still part of the Star Trek Universe.” The imprisoned life force is in a container that explodes like a flash grenade and scorches the eye sockets of a medical officer. “It’s supposed to be a horrifying moment,” Tatosky notes. “We’re not an R-rated film, but we still have to emphasize how awful it is.” Prosthetic makeup was an important component of the effect. “Blending with what was done practically was helpful to us,” Zimmerman remarks. “It saves you some wide shots and allows you to see what the light is doing near the eye. When the head turns a certain direction and it starts to fill with light, you can see that cavity more clearly.”
The 10 episodes feature between 2,000 and 2,500 visual effects shots by Ghost VFX, Storm VFX, Crafty Apes, Pixomondo, Zoic Studios, AB VFX, Cause and FX and an in-house team. “Nowadays, it’s hard to technically count because where does the work on the virtual environment and wall fit?” Tatosky observes. “Those would have been visual effects shots, but now they can point the camera almost anywhere once the environment is finished and get essentially beautiful visual effects shots in camera. It has upped the count of visual effects but in a completely different way.” Solar flares have a significant role to play in thwarting the Gorn. “I have to give credit to one of our DPs, Glen Keenan,” Zimmerman states. “I’ve probably known Glen for 15 years, and what’s fun with him is he understands visual effects, the camera and how things will end
TOP: Most of the Engineering set was practical to capture the proper interaction with cast members such as Martin Quinn as Scotty.
BOTTOM: The mantra for Star Trek: Strange New Worlds was to capture as much as possible in-camera, which meant producing visual effects elements such as motion graphics ahead of time for principal photography.
up in post so well. I lean into having good cinematography and DPs. If our work doesn’t match the lighting or if it’s not bright enough or too bright, all of those things are going to result in something that the viewer is going to catch. It’s an educated audience we have in Star Trek, and above and beyond that, in television in general. They know if it’s good visual effects or not. Having good cinematography and lighting to start with sets us on a path that is so much better than if you’re trying to fix something.”
“It’s important to give credit to Alex Wood, our Production Supervisor,” Zimmerman observes. “He’s our boots on the ground in Toronto and has been since Season 1 of Star Trek: Discovery. Alex has a great rapport with the different departments and helps us to walk through all of the meetings, shot listing, and interacting with directors in addition to Brian and me. Alex has built such a relationship with special effects and stunts that they know what we’re thinking, and we get what they’re thinking. Most of the time, we are in the ballpark when we start, in terms of the expectation for how somebody will approach something.” Another unsung hero is the in-house visual effects team. “I’m not sure we would be here without the in-house team,” Zimmerman admits. “They’re like the Navy SEALs of visual effects. Our team involves comp and CG, and has grown not only in the number of shots but also in the scope of what they’re able to take on now, finishing shots and beautiful CG things that we probably didn’t have the capacity to take on early on. They also help us with look development and setting the tone for things. When we go to the vendor, we’re not saying, ‘R&D something for us.’ We’re saying, ‘Here’s the new script and the idea. This is it in a shot. Take that and roll it into 25 shots.’”
Season 3 was about refining the virtual environment process. “We knew there was interest in doing more with that, to go to new places that we haven’t seen before, to go to more complex places that we’ve seen before and to push the wall to as close to failure as possible but still working so we can get the most detail, animation and lights in there,” Tatosky states. “We want to do better and be more creative than the season before, and do less of the fixing of the mistakes by thinking ahead.” There is no shortage of complex shots. “Just in Episode 301 alone, there are some massively long fly-throughs of the Gorn cruiser coming from outside into space, flying down essentially into the set,” Tatosky notes. “That is a mammoth undertaking to make that all feel nice and to get the lighting, speed and feeling of it all right.”
The production process has become a well-oiled machine due to the cross-pollination of filmmaking talent associated with the various Star Trek series. “Everybody is bringing that collective years of experience to the table every day,” Zimmerman remarks. “In a lot of ways, I don’t think there is anything anyone is going to come up with that will scare anybody because someone is going to go, ‘Remember when we did this?’” There is a broad range of visual effects work. “We’ve always got our Enterprise shots, which never get old,” Zimmerman reflects. “That being said, it’s nice to explore the virtual production and scope of the worlds we’re building, and the creatures have been satisfying. We always get there, and it always works out in a good way. It’s not like we just get it done. It’s always something we’re proud of by the end of the season.”



TOP TO BOTTOM: The design challenge for the ancient prison was to emphasize a sense of history rather than futurism while still fitting within the Star Trek Universe.
Prosthetic makeup and visual effects were combined to create believable scorched eye sockets.
Motion graphics, a major on-set component, are produced by Timothy Peel.

NEW TOOLS KEEP EXPANDING THE CANVAS FOR TODAY’S MATTE PAINTERS
By CHRIS McGOWAN
Matte painting for movies has been around since Norman Dawn’s 1907 documentary Missions of California. Since then, innumerable films have utilized the technique to provide background sets and settings – works such as Ben-Hur, King Kong, The Wizard of Oz, Citizen Kane, Mary Poppins and Raiders of the Lost Ark are just a few spectacular examples. Originally consisting of painting on glass, matte painting began the big shift to DMP (digital matte painting) with 1985’s Young Sherlock Holmes and other films in the 1980s and ’90s, with DMP ruling the 21st century.
Matte painting is still widely used but has evolved. “It’s now often integrated into the 3D workflow – used for paint-overs, environment extensions or background plates. Fully 2D or 2.5D matte paintings are becoming rarer, but they remain a cost-effective solution when applied strategically,” states Charles-Henri Vidaud, Head of Environments at Milk VFX.
TOP: Raynault Founder Mathieu Raynault worked on matte painting for The Matrix Revolutions. (Image courtesy of Warner Bros.)
OPPOSITE TOP TO BOTTOM: ILM’s Timothy Mueller worked with VFX Art Director Yanick Dusseault to create the desert planet Jakku in Star Wars: The Force Awakens. (Image courtesy of ILM and Lucasfilm Ltd.)
Milk VFX’s Charles-Henri Vidaud created the matte painting of the planet Morag for Avengers: Endgame for Cinesite. (Image courtesy of Marvel Studios and Walt Disney Studios)
Frameless Creative teamed with Cinesite and its partner company TRIXTER to create a permanent immersive art exhibition in London. The many artworks include this painting by renowned 18th-century Venetian painter Canaletto, known for his iconic views of the Grand Canal in Venice. (Image courtesy of Antonio Pagano, Cinesite and Frameless Creative)
Vidaud reflects, “I started in the industry 25 years ago doing storyboards, so transitioning into matte painting was a natural next step. It allowed me to combine composition, lighting and storytelling on a broader canvas. One of the most iconic matte paintings I created was for Avengers: Endgame while at Cinesite – the shot where the Benatar spaceship lands on planet Morag. That environment was originally designed by MPC for Guardians of the Galaxy, but I had to recreate it from scratch to fit a different time of day and camera angle. It was both technically demanding and creatively rewarding. Other highlights include Black Widow and Assassin’s Creed.”

“Matte painting remains a vital and evolving art form in the film, television and gaming industries,” remarks Timothy Mueller, Senior Generalist Artist at ILM. “While traditional painting skills continue to provide the foundation – mastering composition, lighting, perspective and storytelling – today’s matte painters must also be proficient with cutting-edge digital tools and workflows.” Mueller adds, “Despite rapid technological progress, the core challenge stays the same: crafting believable worlds that support the narrative and evoke emotion. Matte painters are storytellers, visual architects who help transport audiences to places both fantastical and grounded in reality.”
Like many in the field, Mueller began with a foundation in traditional art. “That path eventually led me to the world of visual effects, which had long fascinated me, especially the groundbreaking work of the artists and technicians behind George Lucas’s original Star Wars films. My first film credit as a matte painter came on the Mark Wahlberg football movie Invincible, where I worked at Matte World Digital alongside the Academy Award-winning visual effects supervisor and author of The Invisible Art, Craig Barron, VES, and Senior Art Director Chris Evans.” Currently, being part of ILM provides Mueller with “the opportunity to work on a wide variety of high-quality projects in the visual effects industry. One of the highlights of my career was working on my first Star Wars film, The Force Awakens, where I helped bring J.J. Abrams’ unique vision of the Star Wars Universe to life. Collaborating closely with VFX Art Director Yanick Dusseault, we created distant worlds across the galaxy, from the desert planet

Jakku to Starkiller Base – an icy world transformed into a superweapon by the First Order.” Time pressure, especially in high-end film and VFX, can be intense for matte painters. “You might have only a day – or even just an hour – to deliver a polished final product. Clients and directors may request changes that require significant rework,” Mueller says. “Matte paintings need to hold up at 4K resolution and sometimes even as high as 32K. Working non-destructively and saving incrementally is essential.”
A key challenge is seamlessly blending hand-painted elements, photographs and 3D renders – especially when the shot is animated with camera movement and parallax, according to Mueller. “Artists often rely on projection mapping and multi-pass renders to achieve the desired result. Making a painting look photorealistic while aligning with the film’s or game’s artistic direction is critical. It must feel like an organic part of the world, with lighting, camera angle and mood all perfectly matched.” Mueller notes, “As visual effects increasingly shift toward 3D asset-driven, procedural systems constrained by budget, the need for 2.5D-style matte painting has become more important than ever. Matte painters can work independently, without relying on other departments, allowing for greater efficiency and significant savings in both time and cost for productions.”
Mathieu Raynault, Founder of Raynault VFX, got into matte painting early in his visual effects career, around 1996 in Montreal, Canada. “I had just started as a CG artist in the early days of computer animation, coming from a background in architecture. I was naturally drawn to set extensions and digital environments, especially anything architectural. But achieving photorealism in CG back then was a serious challenge and took literally days to render! Digital matte painting was still a lesser-known discipline at the time, but I had the chance to work with my friend – and now business partner – Yanick Dusseault, who introduced me to it. Learning how to ‘cheat’ in 2D using Photoshop was a revelation.
TOP TO BOTTOM: Scanline VFX’s Scott R. Anderson provided the matte painting for the Jennifer Lopez sci-fi tale Atlas. (Image courtesy of Scanline VFX and Netflix)
Scanline VFX Lead Matte Painter Alec Geldart worked on the digital matte paintings for Rebel Moon – Part Two: The Scargiver. (Image courtesy of Netflix)
Charles-Henri Vidaud created the matte painting of the planetary surface of Morag for Avengers: Endgame for Cinesite. (Image courtesy of Marvel Studios and Walt Disney Studios)

Inspired by traditional matte painters, we were constantly exploring painting techniques and finding shortcuts to achieve photorealism without relying solely on heavy 3D renders.” Raynault continues, “I remember the relief of being able to paint huge amounts of details on entire backgrounds instead of modeling and texturing everything. Around that time, 3D software like Softimage began introducing camera projection tools, which gave our work new life – allowing us to add parallax to our digital matte paintings. It felt like a revolution.”
Camera projection mapping was a game-changer. “I still remember when Softimage first introduced the camera projection feature,” Raynault recalls. “At first, we didn’t understand the point of it. Then my former CG teacher, Robin Tremblay, showed me a simple test – he projected a photo of a landscape onto basic geometry and animated a camera move. Suddenly, that static photo had parallax and depth. It blew my mind! This aligned perfectly with the digital matte painting experiments I was doing at the time. I realized we could bring this technique into production –combining matte paintings with camera projections to create shots that were visually rich, had depth and movement, and rendered incredibly fast compared to traditional 3D workflows. It quickly became one of my go-to tools.” He continues, “Working on the Star Wars prequels at ILM around 1999 was particularly formative. Being part of the ‘DigiMatte’ department and collaborating with legends like Paul Huston and Yusei Uesugi had a huge impact on me. It was an inspiring environment that really helped shape my skills because most matte painters there had actually been painting traditionally before transitioning to digital.”
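The mechanics Tremblay demonstrated reduce to a few lines of linear algebra: push every vertex of the stand-in geometry through the “painting” camera, and use the resulting screen position as that vertex’s texture coordinate into the matte; re-rendering the textured geometry from a second, moving camera is what produces the parallax. Here is a minimal sketch under assumed conventions – the function name, matrix layout and pinhole setup are illustrative, not any particular package’s API:

```python
import numpy as np

def project_to_uv(points_world, cam_matrix, img_w, img_h):
    """Project 3D points through the 'painting' camera to get texture UVs.

    points_world: (N, 3) vertices of the stand-in geometry.
    cam_matrix:   (3, 4) projection matrix (intrinsics @ extrinsics) of the
                  camera the matte painting was created from.
    """
    n = points_world.shape[0]
    homo = np.hstack([points_world, np.ones((n, 1))])   # homogeneous coords
    proj = (cam_matrix @ homo.T).T                      # (N, 3) image-space
    px = proj[:, :2] / proj[:, 2:3]                     # perspective divide
    uv = np.empty((n, 2))
    uv[:, 0] = px[:, 0] / img_w                         # normalize to 0..1
    uv[:, 1] = 1.0 - px[:, 1] / img_h                   # flip V for image origin
    return uv

# A pinhole camera at the origin: a point straight ahead maps to frame center.
K = np.array([[1000.0, 0.0, 960.0], [0.0, 1000.0, 540.0], [0.0, 0.0, 1.0]])
P = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
print(project_to_uv(np.array([[0.0, 0.0, 5.0]]), P, 1920, 1080))  # ~[[0.5, 0.5]]
```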
The Lord of the Rings trilogy ranks high among Raynault’s work – “not just for the creative work, but for the overall experience. Moving to New Zealand, working at Wētā Digital when it was still fairly new, and meeting talented people from around the world made it a truly special chapter in my life,” he recalls. Raynault went


TOP: Raynault VFX Creative Partner Yanick Dusseault was a Senior Matte Painter for Wētā Digital on The Lord of the Rings trilogy. (Image courtesy of New Line Cinema and Warner Bros.)
BOTTOM TWO: Timothy Mueller, Senior Generalist Artist at ILM, worked with the ILM team on Star Wars: Episode VII – The Force Awakens, including Starkiller Base. (Images courtesy of ILM and Lucasfilm Ltd.)
“When a shot can benefit from a matte painting approach, we’ll always consider it first. We see it as a way to preserve the craft and stay efficient without compromising on quality. … Matte painters are frequently involved during the concept phase, helping to explore visual ideas early on – sometimes even before the shoot. The quick input that DMP provides can help shape the look of a scene or environment well before anything is filmed.”
—Mathieu Raynault, Founder, Raynault VFX


on to co-found Rodeo FX in 2006. He launched the boutique studio Raynault VFX in 2011. Raynault VFX remains deeply attached to the mindset that “when a shot can benefit from a matte painting approach, we’ll always consider it first. We see it as a way to preserve the craft and stay efficient without compromising on quality.” Matte painters are frequently involved during the concept phase, helping to explore visual ideas early on – sometimes even before the shoot, according to Raynault. “The quick input that DMP provides can help shape the look of a scene or environment well before anything is filmed.”
Raynault observes, “There’s definitely an overlap” of matte painting and concept artwork. “Many digital matte painters naturally moved into concept art. I think this is because historically, matte painters often acted as both designers and world-builders, which makes the transition to concept art a natural one. Even today, many of the skills required are similar, so the line between the two disciplines can be quite thin.”
“The matte painter is increasingly being brought into production at an early stage,” notes Alec Geldart, Lead Matte Painter at Scanline. “It’s really beneficial to get in there early to advise as to how we would approach a sequence or shot, determining what needs to be 3D assets and what could be handled with projection setups, scheduling and allocating artists to the show, and so on.” Some of Geldart’s favorite projects have included Rebel Moon: Parts 1 and 2, Stranger Things Season 4 and Star Trek Beyond. “In production, concept art can become indistinguishable from matte painting,” Geldart notes. “If I am asked to provide five versions of a landscape and one is chosen and approved, I would then be asked to hand it off to comp. If I approached concept any differently than I prepare a matte painting, I wouldn’t be able to hand it over that quickly. Of course, all types of concept art are meant to be guides for other departments; in that case, you’re a little freer from a proper DMP workflow.” Geldart adds, “These days, matte painters are all expected to be generalists. It’s great to have as many skills as possible and working knowledge of as many programs as possible. But you can’t lean on 3D; you need to have a solid 2D base and a strong understanding of matte painting fundamentals.”
Thomas Hiebler, DMP & Environments Supervisor for TRIXTER and Cinesite, comments, “Although matte painting is still used in almost every project to efficiently enhance object details, the epic, large-scale works seem to be gradually disappearing.” Hiebler notes, “Clients increasingly demand maximum flexibility and the ability to make changes right up until the final deadline. This makes it challenging to rely on the traditional matte painting workflow. For this reason, at TRIXTER, we merged the Environment and DMP departments early on. Typically, we start with an environment fully constructed in 3D, utilizing various programs such as Gaea, SpeedTree and especially Houdini. If, at a later stage, it proves more efficient, we can always switch to a matte painting workflow. However, we try to avoid this as much as possible since changes to lighting, camera positions, or even entire shots [getting replaced] often occur due to re-edits requested by the client. In traditional matte painting, such decisions must be made early on, which significantly limits flexibility.”
In Hiebler’s opinion, “The future undoubtedly lies in procedural workflows and, looking further ahead, fully dynamic 3D environments with emerging technologies. These tools make it possible to implement even complex changes flexibly and efficiently, opening up entirely new
TOP: Mathieu Raynault was a matte painter for Wētā Digital on The Lord of the Rings: The Two Towers and a Senior Matte Painter on The Lord of the Rings: The Return of the King before founding Raynault VFX. (Image courtesy of New Line Cinema and Warner Bros.)
BOTTOM: Raynault Creative Partner Yanick Dusseault worked on matte paintings for Star Wars: Episode III – Revenge of the Sith. (Image courtesy of ILM and Lucasfilm)
possibilities for creating impressive worlds. New technologies and steadily growing computing power now enable us to integrate millions of objects into a single scene. The emergence of extensive libraries, such as the Quixel library, has also contributed to the rapid creation of complex environments.” Meanwhile, Geldart sees an explosion in resources for DMP. “You have KitBash, Gaea and many online tutorials from working professionals. Matte painting seems much more accessible now than when I started. And, of course, [there’s] AI. I’m not as pessimistic about AI as some people; I do think it will play a larger role in matte painting and concept art in the future, but I don’t think it’s all doom and gloom.”
According to Mueller, “Photoshop and other 2D-based software will always be fundamental tools for the matte painter. Compositing programs like Nuke and 3D modeling software such as 3ds Max are also essential to the workflow. High-end rendering engines like V-Ray are used to recreate realism in computer graphics by accurately simulating global illumination, reflections and refractions, soft shadows, physically accurate materials, lighting and more. Major advancements have also been made in real-time rendering, particularly with Unreal Engine. Additionally, photo-based imaging tools such as Mari, Reality Capture and Agisoft allow for the seamless integration of 2D photographs into 3D models for rendering.”
Photoshop remains the main tool for most digital matte painters, observes Raynault. “Beyond that, any 3D package can be used to generate base geometry or render passes – whatever helps you block out your scene. Blender or Cinema 4D are widely used.” Vidaud concurs, “Photoshop is still the standard for software.” However, “[t]ools like Krita and Affinity Photo are gaining ground, particularly because of their color management features and Linux compatibility.”
Mueller concludes, “Exciting new worlds have opened up with the expansion and creation of new technologies that have made what was impossible in the past a new possibility for the future. The integration of 2D painting with 3D assets, image projection techniques and real-time engines like Unreal has expanded the possibilities for creating immersive, photorealistic environments. These advances allow artists to work faster, create more dynamic shots, and collaborate more closely with VFX teams. In short, matte painting today is a powerful blend of art and technology, demanding both creative vision and technical expertise, and offering exciting opportunities for those passionate about world-building.”
Noémie Cauvin, Head of DMP & Visual Development at The Yard, remarks, “Much like the arrival of Photoshop or the rise of 3D to build environments, matte painting is once again at the edge of a technological revolution, this time driven by AI. AI is a powerful tool, but it remains a tool. It can support our work, but it doesn’t replace the creative decisions, storytelling instincts or visual problem-solving that matte painters bring to the table to create stunning images. At its core, our craft has not changed: we are world-builders and set makers. No matter the tools, our mission is the same – to create immersive, believable environments that bring stories to life on screen.”



TOP TO BOTTOM: The Frameless immersive art exhibit includes work by John Atkinson Grimshaw, an English Victorian-era artist renowned for his nocturnal scenes of urban landscapes. DMP/environment skills are important not just for films and series but also for immersive entertainment.
(Image courtesy of Antonio Pagano, Cinesite and Frameless Creative)
DMPs from Scanline VFX were an important part of Rebel Moon – Part One: A Child of Fire. (Image courtesy of Netflix)
Scott R. Anderson was Lead Digital Matte Painter for Scanline VFX on Atlas and Shazam: Fury of the Gods. (Image courtesy of New Line Cinema and Warner Bros.)

ROBIN SAXEN EMBRACES TECHNOLOGY TO MAKE THE CREATIVE MISSION POSSIBLE
By TREVOR HOGG
The past decade has been literally and figuratively a ‘Mission: Impossible’ for Robin Saxen, the constant visual effects presence throughout the Christopher McQuarrie era of the film franchise. She started off as Second Visual Effects Coordinator on Mission: Impossible – Rogue Nation and became the Visual Effects Producer on Mission: Impossible – Fallout, Mission: Impossible – Dead Reckoning Part One and Mission: Impossible – The Final Reckoning.
Before taking on the perilous global espionage exploits of Ethan Hunt, Saxen had a fateful experience on another Tom Cruise production when she was a visual effects production coordinator at ILM. “The biggest moment was working on Magnolia with Paul Thomas Anderson. We were almost the same age, and he was so excited about working on the project and so infectious that it was a joyous experience.” She also found a mentor in Joe Letteri, VES. Saxen remarks, “Working with Joe all the way through Magnolia led to a lot of the tools that I’ve made and the systems I put in place so we can track things quicker, get through things and deliver. Ultimately, the filmmakers are worried about the picture while I’m worried about all the pieces that are going to make that picture.”
Working in the film industry was not an obvious fit for Saxen, despite her being born and raised in California. “My dad was an insurance broker after leaving the Navy,” Saxen states. “I wanted to be part of something that was always changing. I did not want to sell insurance. I like the energy of working in the kind of world where the technology is always growing and the ask is always something bigger; that makes it quite engaging intellectually.” Technology continues to advance at a rapid rate. “When I first started, somebody called me from USC and asked, ‘How much space would it take to store a whole movie?’ At the time, I said, ‘It’s never going to happen because it costs too much to scan a film.’ Now, that happens all the time. Even how we deliver films. At the end, it’s a file, not these big, bulky prints. It has sped up how quickly we can compile everything but also means that things can be pushed to the very end.”
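The scale of that shift is easy to put rough numbers on. A back-of-envelope calculation – figures assumed for illustration, not Saxen’s math – shows why scanning a whole film once sounded impossible:

```python
# Back-of-envelope: scanning a two-hour feature at 2K, 10-bit DPX --
# roughly the film-scanning pipeline of the era Saxen is recalling.
frames = 2 * 60 * 60 * 24            # 2 hours x 24 fps = 172,800 frames
bytes_per_frame = 2048 * 1556 * 4    # 2K DPX packs 10-bit RGB into 4 bytes/pixel
total_tb = frames * bytes_per_frame / 1e12
print(f"{total_tb:.1f} TB")          # ~2.2 TB: trivial today, unthinkable then
```

Around 2 TB fits on a single consumer drive now; in the mid-1990s it was an absurd amount of storage.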
The resolution divide no longer exists between digital and film. “Digital technology has improved so much that you actually get a much finer resolution than with film, and that was many years in the making. When we started with digital, it was only 1080 res, and film could go up to 6144. We’re able to achieve it now with DI treatments.”
TOP: Visual Effects Producer Robin Saxen at 2025 FMX in Stuttgart where she spoke about the art of producing for visual effects. (Photo courtesy of Robin Saxen)
OPPOSITE TOP TO BOTTOM: Saxen has spent a decade working on the Mission: Impossible franchise, including the final installment where Tom Cruise goes underwater. (Image courtesy of Paramount Pictures)
Visual effects painted out the ramp that enabled Tom Cruise to drive off a cliff on a motorbike in Mission: Impossible – Fallout. (Image courtesy of Paramount Pictures)
Digital augmentation assisted in raising the sense of peril for Tom Cruise in Mission: Impossible – Fallout. (Image courtesy of Paramount Pictures)
Saxen attends the London premiere of Mission: Impossible – The Final Reckoning. (Photo courtesy of Robin Saxen)
An analog enjoyment growing up was reading books. “I was always into books and expanding my knowledge,” Saxen remarks. “I loved John Steinbeck in the early days. I grew up on a little bit of a farm, and remember in high school people talking about The Grapes of Wrath and saying when they had to butcher their pigs how tragic that was and what it meant psychologically to the children. I said, ‘Actually, I grew up on a farm, and we used to watch the chickens get butchered. It was just a thing that you did.’” The quest for knowledge led to Pepperdine University and a BA in Telecommunications and Political Science. “I wanted to do more editorial work, but the industry was hard to get into if you didn’t have a family connection or know somebody. I ended up working at Unitel Video, which did The Fresh Prince of Bel-Air and Star Trek:



“I love the globality of our industry because everything benefits from a more global contribution of the brain trust. It’s that technology and striving to make something even better, and bringing all of these different people together around the world has made our industry grow exponentially.”
—Robin Saxen, Visual Effects Producer
The Next Generation. I was getting nowhere fast, so I sent out 50 resumes and got hired at Pacific Title Digital. There was this great programmer named Bonnie Boyd, and she explained operating systems and Linux to me in a way that made sense. We had to make stuff up as we were going along. There were no GUIs [graphical user interfaces] at that time. Everything was command-line based.”
Breaking new ground with emerging digital technology was exciting. “Several times I’ve run into people saying, ‘We’ve been doing it like this for 20 years. Why should we change?’” Saxen notes. “I’m like, ‘We have computers now. Let’s do something better. Let’s speed it up!’ I’ve taken a lot of pride in automating tools on my side for production and editorial so that we can take care of all the paperwork, communication and tracking easily. This allows us to focus on the picture, the job and making sure that we can deliver on time and as much on budget as we possibly can, and I take a lot of pride in that.”
The willingness to adopt technological advancements extends to AI. Saxen comments, “AI has become a boogeyman that, somehow,

Saxen takes a break while location scouting in Norway for Mission: Impossible – Dead Reckoning. (Photo courtesy of Robin Saxen)
Saxen gets ready for the train action sequence that takes place in Mission: Impossible – Dead Reckoning. (Photo courtesy of Robin Saxen)
they’re trying to beat us up with. But we’ve always worked to make better tools in order to improve how we work. Part of that is physics and math. I’ve even called it liquid math when I was trying to explain it to somebody. Don’t be afraid of the tool. The traditional way to get into visual effects companies back in the day was through roto and tracking, and a lot of that’s been farmed out. Now there are tools making that easier. If you have tools that make things easier, then you can focus on the next thing. What I hope is it will inject a creative freedom that enables you to make something that unleashes your potential.”
There is still a place for traditional methods. “ILM was a huge collaboration of talent and people appreciating each other’s ability, like the modelmakers and the special effects guys,” Saxen recalls. “I still remember being in a mobile home-type of cabin, and people getting on the loudspeakers saying, ‘Explosion off stage one.’ And you would hear this loud boom! That was very cool. ILM was established and had their way of doing things. Then I went to Wētā Digital, which didn’t have any baggage. It was all about, ‘How quickly can we make this happen?’ It was a great place for me to work and actually come up with faster systems for production.”
Saxen also spent time at two other major visual effects heavyweights, Framestore and DNEG, where she made the transition from production manager to senior visual effects producer. “I love the globality of our industry because everything benefits from a more global contribution of the brain trust. It’s that technology and striving to make something even better, and bringing all of these different people together around the world has made our industry grow exponentially. I’ve been in situations where I need to make changes, but my vendor is in Canada and I’m in London on the phone saying, ‘Can you get this?’ We’ll get that overnight, so that actually does help.”
TOP TO BOTTOM: Mission: Impossible – Rogue Nation welcomed director Christopher McQuarrie and Saxen to the espionage franchise headlined by Tom Cruise. (Image courtesy of Paramount Pictures)

“I’ve created a tool that works for all time zones. One of the best things that I can do for my director is to keep chasing the iterations and making sure that he or she has seen the shots. The most important thing is establishing a relationship with editorial and making sure that you’re clear on the pipeline for your vendors and you know the resolution you’re delivering to. Basically, you are buttoned up from the beginning...” —Robin Saxen, Visual Effects Producer
Switching over to the client side was the next step. “The facility experience has been invaluable because I understand how things are done, the technology, and what needs to go into making the shots believable and workable,” Saxen remarks. “My first experience was working on World War Z. That was tough because I didn’t know who was responsible financially for something, and each department is there to protect their own budgets. That was a bit trial by fire, but I learned a lot on that one.” The visual effects work has ranged from creating a fully digital bear protagonist for Studio Canal, to invisible effects that support an actor performing death-defying stunts for Paramount Pictures. “Because Paddington is a CG creature interacting with environments and people, you have to be aware of everything to make him sit in that three-dimensional space. There are different things to consider. You’re tracking the dialogue and having all that for the animation. It’s a little bit different when you’re turning things over and making it all work. I’ve always told people that the organic stuff is harder, like fur and plant life. [Director] Paul King was amazing to work with, and I see him so much in Paddington. When you’re on more of a live-action picture, it is your relationship with editorial and how quickly they assemble sequences. You need to do the clean-up work so you don’t see the stunt mats or the rigs. You

TOP: Helicopters play a major role in the action scenes found in Mission: Impossible – Rogue Nation. (Image courtesy of Paramount Pictures)
BOTTOM: Saxen’s career-changing moment was working for Paul Thomas Anderson and Joe Letteri, VES, on Magnolia. (Photo courtesy of Robin Saxen)



To acquire the necessary scope and scale for environments, visual effects were utilized for Mission: Impossible – The Final Reckoning. (Image courtesy of Paramount Pictures)
Ethan Hunt (Tom Cruise) attempts to escape a falling train in Mission: Impossible – Fallout. (Image courtesy of Paramount Pictures)
“If you have tools that make things easier, then you can focus on the next thing. What I hope is it will inject a creative freedom that enables you to make something that unleashes your potential.”
—Robin Saxen, Visual Effects Producer
have something that’s seated in reality, so you know what it needs to look like and how it has to interact with lighting in the environment. Basically, you have a template.”
Fundamentals do not change. “What filmmakers want from me is, ‘How soon am I going to get it? Am I iterating, and what were my last notes? When was the last time I’ve seen this?’” Saxen states. “I’ve created a tool that works for all time zones. One of the best things I can do for my director is to keep chasing the iterations and making sure that he or she has seen the shots. The most important thing is establishing a relationship with editorial and making sure that you’re clear on the pipeline for your vendors and you know the resolution you’re delivering to. Basically, you are buttoned up from the beginning so that people can do their tests and make sure that when submitting, everything is in line and they’re looking at stuff in the same visual space. If you do that, then when you’re in the push and turning over hundreds and hundreds of shots a week, you know that it’s a well-oiled machine.”
Having the same team on Mission: Impossible – Dead Reckoning Part One and Mission: Impossible – The Final Reckoning was extremely beneficial. “Over the course of five years, we fine-tuned everything,” Saxen remarks. “Everyone could access the paperwork and notes easily, and they all felt comfortable with the tools. I made it so that when picture editorial had to export or import, it was like four commands. They could do their work and not have to worry about the small stuff.” A signature stunt performed by Tom Cruise for Mission: Impossible – The Final Reckoning is the submarine sequence. “We don’t know what a submarine would look like at that distance underwater in the Baltics,” Saxen observes. “But the way they engineered the inside of the submarine, and some parts of the exterior where Tom lands on the sub, gave us the framework that we built around to create something that is quite magical. It took time.”
If you want to be successful in the visual effects industry, you have to be quite thorough. “I created something that is like a digital Post-it, which we could leave each other so when you came in the morning and logged in, you’d see them,” Saxen explains. “They’d be flagging changes, problems and concerns, and then I can resolve everything. I always had a record of those Post-its and would send notes back. It has allowed us to dynamically keep all of the vendors up to date.” Collaboration leads to the best results. “I have people who I am bringing up, and they are doing well. I include people in the decision process because they come up with ideas, and you’re all part of the solution; inclusion is the critical thing, which took me a bit to learn. You don’t always have to work alone. You can bring in the troops, and then you’re all on the same team. That has been hugely helpful.”
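Saxen doesn’t describe her tools’ internals, but the principle behind a tracker that “works for all time zones” and a shared digital Post-it can be sketched in a few lines: store one canonical UTC timestamp per note and convert only for display. Everything below – names, fields, the ShotNote class itself – is a hypothetical illustration, not her actual system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

@dataclass
class ShotNote:
    shot: str                      # e.g., a shot name like "070_020"
    note: str                      # the reviewer's comment
    stamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def local(self, tz_name: str) -> str:
        """Render the note's canonical UTC timestamp in a collaborator's zone."""
        local_time = self.stamp.astimezone(ZoneInfo(tz_name))
        return f"[{local_time:%Y-%m-%d %H:%M %Z}] {self.shot}: {self.note}"

note = ShotNote("070_020", "Director has seen v012; awaiting new iteration")
print(note.local("Europe/London"))     # what the London office sees
print(note.local("America/Toronto"))   # same event, on the vendor's clock
```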
TOP TO BOTTOM: Saxen visits the Brooklyn Bridge while shooting Gangs of New York. (Photo courtesy of Robin Saxen)

GREEN SCREENS: IRELAND’S ROLLING VFX LANDSCAPE
By GEMMA CREAGH
TOP: Screen Scene VFX (SSVFX), an award-winning post and VFX company based in Dublin, was one of the key teams responsible for building the vast world of Shōgun. (Image courtesy of SSVFX)
OPPOSITE TOP TO BOTTOM: Step inside the TARDIS time machine with EGG VFX, putting an Irish stamp on global hit Doctor Who. (Image courtesy of EGG VFX and BBC Studios)
EGG VFX, along with Milk VFX, Powerhouse VFX and Clear Angle Studios, was among the key vendors behind Season 2 of Good Omens, which streams on Prime Video. (Image courtesy of EGG VFX)
While the VFX was completed overseas, limited series Say Nothing is a dramatic retelling of Northern Ireland’s political history created by Joshua Zetumer and produced by FX Productions. (Photo: Rob Youngson. Courtesy of FX)
When FX and Hulu’s Shōgun won Outstanding Visual Effects at the 76th Emmy Awards, Irish eyes were certainly smiling. That’s because the production credits of this American drama, set in 16th-century Japan, include such notable names as VFX Producer Nicolas Murphy and VFX Head of Production Kenneth Coyne, and one of the key teams responsible for building the show’s vast world is Screen Scene VFX (SSVFX), an award-winning post and VFX company based in Dublin. With credits like Spider-Man: No Way Home, 3 Body Problem, Lessons in Chemistry, Game of Thrones, WandaVision and Black Widow, SSVFX is the largest of several Irish companies that sculpt digital realities for international IP.
“The journey for us on Shōgun was incredible,” comments Jake Walshe, CEO of SSVFX and the Screen Scene Post Production Group. “FX and Disney had a strong vision for the show; they wanted it to be historically accurate and visually stunning. To achieve the level of detail required, they gave us 16 months to complete the show. We had 150 people building Osaka.” Walshe and SSVFX celebrated three wins at the 23rd Annual VES Awards for their work on HBO’s The Penguin, starring Colin Farrell, and on Shōgun.
Walshe is also the Chairman of VFX Ireland, a trade association that represents several leading studios. These have all been busy of late, as the field has grown 326% in the past five years and now employs over 300 people. As a tourist destination, Ireland might promise rolling hills, friendly locals and pints of creamy Guinness, but, in terms of screen production, it’s the tax incentives, skilled creatives and government support that have seen this sector flourish.

A HISTORY OF VFX
Long before Trinity College Dublin was the backdrop for Paul Mescal’s heartache in Normal People, it was the birthplace of Havok. In 1998, in the computer science department, Hugh Reynolds and Steven Collins founded the company behind the physics engine that would go on to power chaos in video games worldwide. Animation has also played a major role in Ireland’s screen legacy. Hand-drawn classics like All Dogs Go to Heaven and The Land Before Time paved the way for Oscar-winning and nominated studios like Brown Bag Films and Cartoon Saloon. In turn, they inspired Giant, Lighthouse, Piranha Bar, Moetion, Kavaleer, and now there’s a slew of young talent coming up behind them. From Doc McStuffins to Get Rolling with Otis to Transformers, a surprising number of beloved kids’ characters are rendered on computers across the Emerald Isle.
Over the past two decades, there has been a rapid evolution in the landscape of VFX. Oscar-nominated director Ruairi Robinson’s ambitious VFX-laden shorts earned him exciting opportunities in Hollywood. Irish/Canadian Matthew Talbot-Kelly, a VFX supervisor for Windmill Lane, brought over an episode of Stargate Atlantis in the early 2000s. Sci-fi feature Lockout, co-written by Luc Besson, was helmed by Irish directors James Mather and Steve Saint Leger; for this project, Windmill Lane opened a facility in Sandyford Business District that housed approximately 50 artists, many of whom upskilled on the job. However, Game of Thrones was the biggest catalyst for change that the Irish industry has ever seen. With filming taking place in Northern Ireland, Producer Mark Huffam, CBE, made the strategic decision to also bring Season 1’s post-production, sound and picture to Screen

From battlegrounds to kingdoms, EGG VFX helped build the world of The Woman King with detailed, story-driven effects. (Image courtesy of EGG VFX)
Actor-turned-series-creator Chris O’Dowd’s Small Town, Big Story features sleek, stylized VFX from Outer Limits Post Production. (Image courtesy of Outer Limits)
Scene. That way, he could utilize the tax credits on either side of the border. Game of Thrones’ VFX followed shortly after, and this ultimately led to the inception of SSVFX, which hired over 20 artists to build the iconic digital realm of Westeros. Between 2010 and 2019, the war for the Iron Throne injected an estimated £251 million into Northern Ireland’s economy. The entire island’s industry was impacted; Irish creatives were upskilled, and the next generation of crew were ready to muck in.
Drew Banerjee, Managing Director of EGG VFX and Post Production, enjoyed the uptick when they landed the contract to work on Sky’s epic series Sinbad in 2011. EGG VFX went on to work on The Woman King, Good Omens and Doctor Who, and found themselves a good fit for these kinds of projects. “We’re very reactive, we’re very human. So, if there’s a problem, you can ring me. That’s how we win, and this is something to be said about the Irish in a wider context. The way we do business, negotiate and navigate creative choices works well for Americans – and creatives from other territories. The Irish are open. We’re used to talking things through and building relationships.”
The Irish VFX sector has faced setbacks. The post-COVID production slowdown, followed by the U.S. writers’ and actors’ strikes, took its toll. This led to layoffs, with many houses forced to diversify. Last January, Windmill Lane Pictures closed its doors after 27 years in business. However, 2025 has delivered a welcome resurgence.
As a result, Elephant Goldfish recently entered the market in partnership with Molinare. “Studios and post-production companies that want to stay relevant will need to embrace new tools, new workflows and a new mindset,” notes Managing Director/EP Deborah Doherty, whose credits include projects for Netflix, Disney+, Sky and BBC. “The ability to think creatively about both production and business models and respond quickly to ever-changing demands is critical to survival and growth in this next
TOP TO BOTTOM: Nicolas Cage stars in The Surfer, directed by Irish filmmaker Lorcan Finnegan, with VFX by Outer Limits Post Production turning the beach into a psychological battleground. (Image courtesy of Outer Limits)

phase of the industry, which currently feels busier than ever. I’m so proud to see Elephant Goldfish come to life after what felt like an unachievable idea a few months ago.”
STATE OF THE NATION
For those unfamiliar with local politics, the island of Ireland is divided into two jurisdictions. The Republic of Ireland is made up of the 26 counties known colloquially as the “South,” while the six counties in the north remain part of the U.K. For anyone uninformed but curious, this complex history is dramatically depicted in FX’s Say Nothing. For those with an interest in making content there, this means dealing with two sets of tax incentives and two separate systems.
In the Republic of Ireland, the Section 481 tax rebate returns up to 32% of eligible expenditure on film or TV production, with eligible spend capped at 80% of the total global budget. Recently, the scheme was expanded to include factual content, and just last month Screen Ireland announced an enhanced version for mid-to-low budget feature films: Scéal, an 8% uplift to the existing 32% incentive, available for feature films budgeted under €20m. The Republic is the only English-speaking country in the EU offering potential access to Eurimages or Creative Europe MEDIA funds for co-pros. Depending on the county where post is taking place, there is also potential support from the WRAP fund and in-kind assistance from local county councils.
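To make the cap concrete, here is a minimal, assumed sketch of the headline math – the real qualification rules are more involved, and the function below is illustrative rather than official guidance:

```python
# Illustrative sketch of the Section 481 headline math (not official guidance).
def section_481_rebate(eligible_spend_eur: float, total_budget_eur: float,
                       sceal_uplift: bool = False) -> float:
    """Rebate = rate x min(eligible Irish spend, 80% of the global budget)."""
    qualifying = min(eligible_spend_eur, 0.80 * total_budget_eur)
    rate = 0.32 + (0.08 if sceal_uplift else 0.0)  # Sceal uplift: 32% -> 40%
    return rate * qualifying

# A 15m-euro feature with 10m of eligible Irish spend, under the 20m Sceal cap:
print(section_481_rebate(10_000_000, 15_000_000, sceal_uplift=True))  # 4000000.0
```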
A TALE OF TWO TERRITORIES
On a scenic drive through the Irish landscape, you could cross the border multiple times and never realize it. Many freelancers happily work with clients in both territories, using both currencies. For financiers, there are some major differences. Since Brexit, Northern Ireland is no longer part of the EU.


TOP TO BOTTOM: Studio Ulster launched its world-class virtual production complex in Belfast in June. (Photo: Charles McQuillan. Courtesy of Getty Images and Studio Ulster) Bulls in the Water by Enter Yes™ is written and directed by IFTA-winning, BAFTA-nominated Kris Kelly. (Image courtesy of Enter Yes™)
Brown Bag Films is behind Get Rolling with Otis, an 11-minute, CGI-animated preschool series from Apple TV+ based on Loren Long’s bestselling book series. (Image courtesy of Apple TV+ and Brown Bag Films)
“Now I can see there’s a huge opportunity, and it’s driven by the streamers. Shows like Game of Thrones or Bodkin come to shoot here, then stay to do their post. There’s a growing talent pool. The rebate is getting higher and higher. Even Ireland, as a geographic location, has a lot of advantages – it’s close to America, close to Europe, and it’s English-speaking. There’s a lot going for it.”
—Jeannette Manifold, Head of Studio, Crafty Apes Australia


TOP: Airing five seasons on Disney Jr., animated children’s series Doc McStuffins was produced by Dublin company Brown Bag Films. (Image courtesy of Disney and Brown Bag Films)
BOTTOM: From left: Liam Óg Ó hAnnaidh (Mo Chara), Naoise Ó Cairealláin (Móglaí Bap) and J.J. Ó Dochartaigh (DJ Próvaí) in Kneecap, with VFX by Outer Limits Post Production adding to the distinct aesthetic of the film. Kneecap is a Belfast-based, Irish-language hip-hop group. (Image courtesy of Outer Limits)
The U.K.’s VFX tax rebate, introduced under the new Audio-Visual Expenditure Credit (AVEC) scheme, offers a 39% tax credit on qualifying U.K.-based VFX costs, an enhancement over the standard 34% AVEC rate. This applies to film, high-end TV, animation and children’s content, provided the productions pass the BFI cultural test or qualify as official co-productions and spend at least 10% of their budget in the U.K. Eligible VFX includes CGI, compositing, digital environments and motion capture.
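The Northern Ireland side can be sketched the same way; again, this is a hypothetical illustration only, since actual AVEC qualification (the BFI cultural test, co-production status and the detailed expenditure rules) involves far more than shown here.

# Simplified, hypothetical sketch of the AVEC VFX uplift: 39% on
# qualifying U.K. VFX costs vs. the standard 34% rate, contingent
# on at least 10% of the budget being spent in the U.K.
def avec_credit(uk_vfx_spend, uk_other_spend, total_budget):
    if uk_vfx_spend + uk_other_spend < 0.10 * total_budget:
        return 0.0  # fails the minimum U.K. spend requirement
    return 0.39 * uk_vfx_spend + 0.34 * uk_other_spend

# Hypothetical: £3m of U.K. VFX plus £2m of other qualifying U.K.
# spend on a £12m production:
print(avec_credit(3_000_000, 2_000_000, 12_000_000))  # 1850000.0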
In Northern Ireland, established companies like Yellow Moon (Line of Duty, Vanity Fair) and JAM Media (Jessy and Nessy, Nova Jones) work on projects of both a large and local scale. Meanwhile, the Pixel Mill offers a dedicated space where emerging talent can hone their craft and connect. The recent launch of Studio Ulster marks a significant advancement in infrastructure. This state-of-the-art virtual production facility is funded by a £72 million investment from the Belfast Region City Deal. Among the first to shoot on VP1 (the biggest stage) was Enter Yes™, who filmed a proof of concept for their feature, Wolfking and I. This IFTA-winning, BAFTA-nominated company was founded by the husband-and-wife team of Vicki Rock and Kris Kelly, specializing in high-end visuals for film, TV and games. Kelly notes, “We’re always looking for new and innovative ways of approaching subjects. We invest ourselves in subject matter that’s sensitive and worth telling. Using interesting visual approaches and mediums, there is a real cross-pollination between the service we provide for people like Netflix, Hollywood features or the BBC. We also utilize those skills in our own productions; Bulls in the Water is our latest production with Northern Ireland Screen, and has a lot of visual imagery that we’re proud of. That is a result of years of understanding the form and being at the forefront of incorporating visual art with visual effects. We spend both time and resources exploring stories that go beyond the boundaries of this island.”
GLOBAL ILLUMINATION
While other parts of the world may be more financially competitive, Ireland’s real offering is its creative talent; the local teams work hard and communicate effectively. In an interview for Film Ireland, two-time Academy Award-winning Visual Effects Supervisor Richard Baneham (Avatar) shared his experience working internationally: “Being Irish, people accept you into social circles very quickly. That makes being away from home very easy. You can jump in and connect with people. It can quite often be on a cultural level, the ability to hang out without an agenda.” He says, “There’s something cultural in the way we approach both the work and the subject matter.” Other Irish talent in key roles includes (but is certainly not limited to) Laura Livingstone, VFX Executive at Netflix; Ed Bruce, Senior Vice President at SSVFX; Alison O’Brien, VFX Producer and Founder of Potent Pictures, who has worked with MGM, Netflix Animation and Disney+; VFX Supervisor Niall McEvoy (Foundation, Vikings: Valhalla); and Ciaran Crowley (Inception, Terminal), Creative Head of Milk VFX’s Dublin studio. Strategic public investment has been ongoing from both states, with concerted efforts being made to push that 326% growth figure even further. Dedicated agencies, such as Screen Ireland’s training
department, the National Talent Academy for VFX, Creative Futures Academy, and Creative Skillnet, have been busy training workers. In Northern Ireland, the Department for Communities, through NI Screen and the BFI, supports the Nerve Centre to deliver training in visual effects and virtual production. Meanwhile, third-level institutions such as IADT, TU Dublin, Ballyfermot College, TUS, ATU, Queen’s University’s Capture Lab, and Ulster University’s virtual production and VFX programs produce graduates each year who are encouraged to pursue placements and vocational training. The aim is to eventually fill 2,000 seats, although that may take some time. Baneham shares what he values in recruits. “There are two qualities,” he says. “Talent is a great thing to have, but tenacity – in a lot of hires, I would take that over talent. The want and willingness to be there, to want to do the work.”
PLATE TO PIXEL
With credits spanning Spider-Man: Homecoming, Dune: Part Two and Thor: Love and Thunder, Jeannette Manifold, Head of Studio at Crafty Apes Australia, shared her insights at an industry event last month on how Ireland is starting to attract larger-scale projects. “Now I can see there’s a huge opportunity, and it’s driven by the streamers,” she said. “Shows like Game of Thrones or Bodkin come to shoot here, then stay to do their post. There’s a growing talent pool. The rebate is getting higher and higher. Even Ireland, as a geographic location, has a lot of advantages – it’s close to America, close to Europe, and it’s English-speaking. There’s a lot going for it.”
There’s no shortage of ambition, and telling indigenous stories with imagination is just as important. Smaller VFX studios and boutique post houses, such as Motherland and Outer Limits, are gaining recognition for their artistic voices and innovative thinking. Having worked across bigger productions for the better part of two decades, Outer Limits VFX Supervisor Andy Clarke enjoys the freedom of a smaller team. “Once you go over 20 or 30 people, you have to lock down the pipeline. That can be constricting. I wanted to try new software, new graphics cards.” Several years ago, Clarke joined forces with Eugene McCrystal to form Outer Limits, and the vibrant agency has accrued credits on Small Town, Big Story, The Surfer, Kneecap and the upcoming series How to Get to Heaven from Belfast. On the latter, they will work alongside Enter Yes™ and EGG VFX. Adds Clarke, “That’s what excites me – groundbreaking new technologies. I try to get my hands on them as soon as possible and explore where we can apply them in the storytelling process.”
While the international marketplace continues to ebb and flow, Irish companies are evolving slowly and deliberately. Alongside the tax incentives, it’s technological innovation that is on track to redefine the VFX landscape there. Virtual production – covered under both iterations of the tax credit – along with AI, real-time rendering and immersive technologies, is inevitably going to impact key aspects of the screen industry. It’s a shift that creatives across the pond are not afraid to adapt to – and to have a bit of craic, a good time in Irish parlance, in the process. And yes, they do pour a lovely, creamy pint of Guinness.


TOP: A Women in Film & TV Ireland panel discussion on building sustainable careers took place in Dublin in May, featuring Deborah Doherty of Elephant Goldfish; Dee Collier, post-production specialist; Christina Nowak, Founder of New Chapter Production Ltd. and virtual production innovator; and Jeannette Manifold, Head of Studio for Crafty Apes (Australia). (Photo: Doreen Kennedy. Courtesy of WFT Ireland)
BOTTOM: The Quaternion Plaque on Broom Bridge in Dublin marks the spot where, on October 16, 1843, on a stroll along the Royal Canal in Dublin, mathematician Sir William Rowan Hamilton invented quaternions, carving his equation with a penknife into the side of the bridge This mathematical structure is still used today in computer graphics, robotics and physics to provide a way to represent rotations in three-dimensional space.

WELCOMING THE CULTURAL SHIFT TO VIRTUAL PRODUCTION
By CHRIS McGOWAN
Innovation and integration continue to push virtual production into new realms of possibility, both on and beyond the LED wall. “Virtual production is a large umbrella term, and the more visibility we can shine on the most cost-effective workflows and toolsets within that umbrella, the further we will be able to push it,” comments Connor Ling, Virtual Production Supervisor at Framestore. “At Framestore, we’ve already made big strides in integrating the outputs and workflows of virtual production into the traditional VFX and show pipeline, which will also increase confidence in the creatives to harness it.”
TOP: How to Train Your Dragon, a live-action remake of the 2010 animated film, utilized Framestore’s Farsight VP ecosystem. (Image courtesy of DreamWorks Animation and Universal Pictures)
OPPOSITE TOP TO BOTTOM: For the Star Wars: Skeleton Crew series, ILM utilizes an LED screen. (Image courtesy of ILM and Lucasfilm Ltd.)
Milk VFX led the visual effects for Surviving Earth, an eight-episode series that recreates critical mass extinctions in Earth’s history and reveals how life survived. (Image courtesy of Loud Minds, Milk VFX and Universal Television Alternative Studio)
Dimension Studio created a snowy landscape for the Time Bandits series. (Image courtesy of Dimension Studio and Apple TV+)
The next phase is about simplicity and scale, according to Tim Moore, CEO of Vū Technologies. “We’re focused on making virtual production more accessible, not just to filmmakers and studios, but also to brands and agencies that need to produce smarter. That means streamlining the workflow, building better tools, and connecting the dots between physical and digital in a way that feels intuitive. We’ve seen a lot of evolution since we started on this journey in 2020, and don’t expect that to stop anytime soon as the broader industry continues to change.” Regarding virtual production and LED volumes, Moore adds, “Some of the biggest innovations haven’t been with the LED itself but in what surrounds it. Camera tracking has become more accessible. Scene-building is faster thanks to AI and smarter asset libraries. We’re also seeing tools like Sceneforge that help teams previsualize entire shoots before they step onto a stage. The focus now is on removing friction so the technology disappears and creativity takes the lead.”
Framestore has always had a broad view of what constitutes virtual production, according to Ling. “Yes, shooting against LED walls is virtual production, but virtual production is not shooting against LED walls. LED is one tool in our wider suite [called Farsight, which also encompasses v-cam, simulcam and virtual scouting] and, like any solution, each tool has applications where it shines and others where one of our other tools may be more applicable.

“We continue to see nexus points develop between different flavors of VP tools and techniques, where combining them and the artists that specialize in them pushes everything forward.”
—Justin Talley, Virtual Production Supervisor, ILM
“When we talk about this suite of tools, they’re not just being used at the beginning of production, but at every stage of the production life cycle.”
Ling explains, “Our clients are using Farsight more and more, sometimes as standalone elements, sometimes, as with projects like How to Train Your Dragon, as part of a full ecosystem that starts at concept stage, runs through previs, techvis and postvis, and on into final VFX.” Paddington in Peru, Wicked and Barbie also utilized Farsight. “The biggest innovation has been in terms of making this system scalable, fully integratable and, perhaps most importantly, utterly intuitive – we’ve seen filmmakers who’ve never used the kit go from zero to 60 with it, and their feedback has allowed us to refine the tech and tailor it to their specific needs, almost in real-time.” Ling adds, “In broader terms, one of the key changes we’ve seen is more complete, more relevant data being captured on set, which makes it far better when it comes back to VFX. This is less about innovation, per se, than users becoming more adept at using the technology they have at their disposal.”
He Sun, VFX Supervisor at Milk VFX, notes, “We’re moving toward an emerging creative process where real-time tools are not only used on set, but also play a vital role from early development through to post-production and final pixel VFX delivery. As machine learning continues to advance, it will work more closely with real-time systems, further enhancing creative collaboration, accelerating iteration and unlocking new possibilities for storytelling.” Sun continues, “We’re also seeing a shift toward greater innovation in software development driven by increasingly powerful GPU hardware.





TOP TO BOTTOM: Baby Paddington falls into the river in Paddington in Peru, a film aided by Framestore’s Farsight VP ecosystem. (Image courtesy of Columbia Pictures and StudioCanal)
In Wicked, Glinda (Ariana Grande) and Elphaba (Cynthia Erivo) were enlivened by Framestore’s Farsight VP ecosystem, which offers fast, scalable, collaborative tools for scouting, previs, motion capture and in-camera VFX. (Image courtesy of Universal Pictures)
Working with an LED wall, Milk VFX provided visual effects for the eight-part series Surviving Earth. (Image courtesy of Loud Minds, Milk VFX and Universal Television Alternative Studio)
“Real-time fluid simulation tools, such as EmberGen and LiquidGen, enable the generation of volumetric, photorealistic smoke, fire and clouds, which can then be played back in real-time engines. Innovation is also expanding beyond Unreal Engine. For example, Chaos Vantage brings real-time ray tracing to LED wall rendering.” Other notable advancements include Foundry’s Nuke Stage, which enables real-time playback of photorealistic environments, created in industry-standard visual effects tools like Nuke, onto LED walls, according to the firm. Sun also points to Cuebric, “which leverages generative AI to create dynamic content directly on LED volumes.”
According to Moore, hybrid pipelines are essential. He explains, “Teams need the flexibility to mix practical, virtual and AI-powered workflows. The key is making those handoffs smooth and accessible without requiring a massive team of specialists to operate the system.” Sun remarks, “At Milk VFX, I designed our real-time pipeline using a modular approach to support flexibility and scalability across a range of projects. This pipeline was built and refined during two major factual series. Our real-time VFX workflow primarily supports two types of VFX shots: full CG shots and hybrid shots. In hybrid shots, certain elements, such as characters, environments, or FX, are generated through the Unreal Engine pipeline, while others are rendered using our Houdini USD pipeline. These elements are then seamlessly integrated with live-action plates during compositing.”
Sun continues, “Assets are designed to be dual-format compatible, making them interchangeable between Houdini and Unreal Engine. This allows for efficient asset reuse and smooth transitions across departments. A key part of our system is a proprietary plug-in suite focused on scene import/export and scene building. This enables artists to move assets and scene data fluidly between Unreal, Maya and Houdini. For instance, animators can continue working in Maya using scenes constructed in Unreal, then reintegrate their work back into Unreal for final rendering. We’ve also developed an in-house procedural ecosystem generation tool within Unreal, allowing artists to create expansive landscapes efficiently, such as prehistoric forests and sea floors. For rendering, we use both Lumen and path tracing depending on the creative and technical needs of each shot, with final polish completed by the compositing team to achieve high-end visual fidelity.”
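Milk’s plug-in suite is proprietary, so the following is only an illustrative guess at the shape of such a bridge, using the open USD Python API (pxr), whose layers Houdini, Maya (via plug-ins) and Unreal can all ingest; the file names and function are invented for the example.

# Illustrative sketch only - not Milk VFX's actual code. The open USD
# Python API is used to publish an asset as a layer that Houdini,
# Maya and Unreal can all reference interchangeably.
from pxr import Usd, UsdGeom

def publish_asset(asset_path, geo_file):
    stage = Usd.Stage.CreateNew(asset_path)
    root = UsdGeom.Xform.Define(stage, "/Asset")
    stage.SetDefaultPrim(root.GetPrim())
    # Reference the DCC-exported geometry rather than baking it in,
    # so each department keeps working in its native tool.
    root.GetPrim().GetReferences().AddReference(geo_file)
    stage.GetRootLayer().Save()

# Hypothetical usage: a Houdini-exported forest, republished as a
# layer that an Unreal scene build can consume.
publish_asset("publish/forest_v001.usda", "export/forest_geo.usd")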
Sun adds, “When implementing real-time technology as part of the visual effects pipeline, one of the biggest challenges beyond the technology itself is the necessary shift in mindset from traditional workflows. For example, I bring the compositing team into the process much earlier, involving them directly in Unreal Engine reviews. At the same time, I treat the Unreal team as an extension of the comp team, as they work closely together throughout production. Having both teams collaborate from the start fosters a more integrated and creative environment. This synergy has proven to be a game-changer, enabling us to iterate on shots far more quickly and efficiently than in traditional pipelines.”
Justin Talley, Virtual Production Supervisor at ILM, comments, “One of the areas we’ve focused on over the years is ensuring we can seamlessly move between VP/on-set and post-production VFX contexts. A recent development has been seeing all the work we put into developing custom rendering and real-time workflow tools for on-set now make its way more broadly into traditional VFX work as well.” Talley notes, “We’ve continued to develop our simulcam toolsets and related technologies like real-time matting, markerless mocap and facial capture technologies. Similarly, I think some of the work we’ve done around IBL [image-based lighting] fixture workflows has an interesting future when combined with simulcam to give you some of the benefits of LED volume VP for scenarios that don’t warrant a full LED volume.”


TOP TO BOTTOM: For Sony Pictures Imageworks, virtual production is a vital aspect of feature animation projects such as KPop Demon Hunters, created in partnership with Sony Pictures Animation. Using Unreal Engine for visualization at all stages of production injected traditional cinematography techniques into the workflow. (Images courtesy of Sony Pictures Animation and Netflix)
The “Tunnel of Love” is a colorful sequence in the first season of the Percy Jackson and the Olympians series. (Image courtesy of ILM, 20th Television and Disney+)
At a Vū studio, a hockey player poses in front of an LED wall. (Image courtesy of Vū)


Talley adds, “The latest-gen LED fixtures are shipping with profiles optimized for a pixel mapping implementation – some of them with video color space – that allow VP teams to map content onto these powerful, broad-spectrum light fixtures.”
Regarding AI-driven volumetric capture, Theo Jones, VFX Supervisor at Framestore, observes, “We’ve seen a lot of tests, but nothing that feels like it’s anywhere near production-ready at this stage. As impressive as an AI environment might look once you put it on your volume, the key issue is still the inability to iterate, adjust, or make changes in the way any DP or director is going to ask you to do. It’s an interesting new tool, but they need to add more control for it to become genuinely useful.” Ling notes, “In a funny way, it’s not dissimilar to the LED boom a few years back. Some people are selling it as an immediate ‘magic bullet’ solution, but in reality, you’re not hearing about the time that goes into creating the imagery, let alone the work that goes in afterwards to bring it up to spec.” Talley comments, “Gaussian Splatting is a compelling technique for content creation. The benefits of quick generation and photo realism are hard to ignore. There are still several limitations, but the speed at which these tools are maturing is impressive.”
Greenscreens are still relevant. Ling explains, “As with most things within virtual production and VFX, it’s another tool on the toolbelt. There are still some completely valid use cases where bluescreen is the most cost-efficient route for shooting.
TOP: For The Book of Boba Fett, ILM conjures up a desert landscape with an LED screen. (Image courtesy of ILM and Lucasfilm Ltd.)
BOTTOM: An Amazon Upshift Live shoot at Vū Nashville using a stadium backdrop on the LED wall. (Image courtesy of Amazon Upshift and Vū)
Is it preferred? That might depend on the department you ask, but the use of LED screens in production has by no means removed the validity of the green/bluescreen workflow. I think the more exciting progressions in shooting against green or blue are the progressions in simulcam or the use of screens [that are] not blue or green, like the sand screens used on Dune.”
DNEG has launched DNEG 360, a new division created in partnership with Dimension Studio to provide filmmakers and content creators with a range of services, including visualization, virtual production, content creation and development. According to the firm, it offers clients an end-to-end production partnership and a seamless transition through development, pre-production and virtual production into visual effects and post for feature film and episodic projects, advertising, music videos and more. Steve Griffith, Managing Director of DNEG 360, states in a press release: “DNEG 360 is the only end-to-end, real-time-powered service provider operating at this kind of scale anywhere in the world. Our experienced and talented team has delivered thousands of shots, which has allowed us to refine our approach through hands-on experience. This allows us to provide guidance, experience and continuity for our clients as they navigate the filmmaking process, reducing risk and finding cost efficiencies.” As part of the unveiling of the division, DNEG is debuting two of the world’s largest LED volume stages in London and Rome.
Brian Cohen, Head of CG, Sony Pictures Imageworks, notes, “From SPI’s perspective, virtual production is an exciting aspect of our feature animation projects, particularly in our partnership with Sony Pictures Animation. Using Epic’s Unreal Engine for visualization at all stages of production has opened the door for us to incorporate traditional cinematography techniques into our workflow. I think our biggest challenge [with virtual production] has been the cultural shift for artists and filmmakers to work in a VP capacity. While everyone loves real-time feedback, not everyone is prepared to make creative calls in a fluid, non-linear workflow.” Cohen adds, “On the feature animation side, we are defining the trends and leading our creative partners into a brave new world of interactive filmmaking with our virtual production efforts. It will take time for all filmmakers to truly be comfortable in this looser, more exploratory way of working, but ultimately, we see it as the only way forward as it provides a level of instant feedback and creative control we’ve never been able to provide previously.”
One of the biggest challenges in virtual production is expectation versus reality. “A client sees a beautiful render and assumes we can just hit play on set. But the process still requires creative planning and technical coordination,” Vū’s Moore observes. “We’ve also seen growing pains when traditional teams try to plug into a VP pipeline without the right prep. We’ve learned to lean heavily into pre-production and prototyping. Building scenes early, doing walk-throughs with the director and DP and making sure creative and tech are speaking the same language – that’s what makes the difference.” ILM’s Talley concludes, “We continue to see nexus points develop between different flavors of VP tools and techniques, where combining them and the artists that specialize in them pushes everything forward.”
“It will take time for all filmmakers to truly be comfortable in this looser, more exploratory way of working, but ultimately, we see it as the only way forward as it provides a level of instant feedback and creative control we’ve never been able to provide previously.”
—Brian Cohen, Head of CG, Sony Pictures Imageworks


TOP: ILM has been at the vanguard of virtual production with its StageCraft technology, which was applied to The Book of Boba Fett. (Image courtesy of ILM and Lucasfilm Ltd.)
BOTTOM: Cinecittà Studios in Rome has partnered with Dimension Studio to enhance its facilities and attract international productions. The partnership aims to make Cinecittà a leading destination for virtual production, and includes the use of Cinecittà’s new state-of-the-art LED wall, one of Europe’s largest. (Image courtesy of Dimension Studio)

MIXING THE REAL AND UNREAL TO ILLUMINATE THE UNDERWORLD OF SINNERS
By TREVOR HOGG
Just as Martin Scorsese is closely associated with Robert De Niro, Ryan Coogler has formed a strong creative partnership with Michael B. Jordan. The latter cinematic combo has upped the ante with Sinners, in which Jordan plays twin brothers who return to their hometown hoping to leave their troubled lives behind, only to encounter an even greater danger.
“I wanted Sinners to be able to be a film without vampires,” states Production Designer Hannah Beachler. “It was more about paying homage to a Mississippi when cotton was king, everybody was on plantations and sharecropping wasn’t that different from slavery.” Impacting the set design was the vertical nature of the IMAX format. “Ryan told me to treat [it] as if it’s normal, but then I didn’t! I knew you were going to see higher, so we had to pay attention to ceilings. There was one part in the church where we had these beams that were a design language throughout the film, which was a cross-X section. I talked to Autumn Durald Arkapaw [Cinematographer] about it, and she had her guys in Los Angeles do some measurements for me so I could decide, depending on where the camera was placed, how much we were going to see.”
Lighting was a fun challenge. Beachler notes, “The 1930s is a mix of candlelight and electricity, depending on where you are. Even inside the sawmill, I had a blast creating ways for Autumn to create and mold light, and then not tell her about it! It was magical to watch her find it and see what she would do with it.”
Images courtesy of Warner Bros. Entertainment Inc.
TOP: The most complicated twinning shot to execute involved the passing of a cigarette.
OPPOSITE TOP TO BOTTOM: Director-writer-co-producer Ryan Coogler and Cinematographer Autumn Durald Arkapaw often used the IMAX 15 perf format for heightened emotional moments and action sequences. (Photo: Eli Adé)
Music plays a pivotal role in the narrative of Sinners.
Ryan Coogler and Michael B. Jordan have been inseparable, collaborating on five films together.
A major lesson was learned while making the Black Panther franchise. “When Ryan and I worked with Marvel Studios, we learned how important visual effects are as a storytelling device,” states Editor Michael P. Shawver. “Before that, we thought of them as being the bells and whistles.” Fireflies were inserted digitally when Cornbread (Omar Benson Miller) relieves himself by a tree outside the juke joint. “Sometimes you don’t want to throw things off towards the end with a new idea, but we had such an open, organic collaboration with the visual effects team.

“I’m known for calling Sinners a project that is bookending technologies. We are simultaneously looking back at the beginnings of the moving picture with using large-format film while also utilizing the latest and cutting edge of ML technologies. All for the sake of successful storytelling without letting technology get in the way.”
—Guido Wolter, VFX Supervisor, Rising Sun Pictures
“As a kid in the Northeast, fireflies were a big part of my summer, and I asked Michael Ralla [VFX Supervisor] and James Alexander [VFX Producer], ‘Can we do fireflies?’ We talked about how fast the fireflies would light because at first it appeared as if they were blinking rather than being slow and rhythmic. Then it evolved into having vampire eyes open behind Cornbread, which added an extra element of danger, threat and [raised the] stakes, plus the coolness of the fireflies made it visually interesting.” Shawver adds, “Thanks to visual effects and the childlike enthusiasm of Michael and James for anytime a new idea came up, we had a sandbox to play in.”
“One thing I say all the time when working with visual effects is, ‘We’re not going to make shit up,’” notes Cinematographer Arkapaw. “That statement came out of a conversation I had with Michael Ralla on Black Panther: Wakanda Forever, asking him, ‘Why does visual effects oftentimes make things up?’ I take that seriously on set. I do everything I can to help the visual effects team capture the assets they need so the shot is better in post and not built from scratch.” Footage was primarily captured with Panavision System 65 Studio, Panavision System 65 High Speed, IMAX MSM and MKIV film cameras. “We often used the IMAX 15 perf format for heightened emotional moments and action sequences, and System 65 as our main storytelling workhorse.”
The main lenses were the Panavision Ultra Panatar anamorphics with the System 65 cameras and the Panavision IMAX lenses for IMAX, which come only in 50mm and 80mm. “I asked Dan Sasaki at Panavision to make us an 80mm IMAX Petzval lens because I knew Ryan would love it and we’d find a special sequence to utilize it. The lens has a nostalgic softness and dreamlike quality that helped shape the emotional texture in some of our supernatural scenes.”
There was talk about having 180 visual effects shots, but the workload rose to a total of 1,013 for Storm Studios, Rising Sun Pictures, ILM, Base FX, Light VFX, Outpost VFX, TFX, Baraboom! Studios and an in-house team of six artists. The digital augmentation ranged from a last-minute alteration to a gold ring, Michael B. Jordan passing a cigarette to himself as a twin, a CG train and vultures, a burning roof and expansive cotton fields to turning present-day Louisiana into 1930s Mississippi and adding glowing vampire eyes. A new approach was developed to achieve the twinning effect. “We thought of any possible way to put something on Michael and allow him to redeliver the performance in the same space, location and lighting conditions right after Ryan is happy with his take,” explains VFX Supervisor Ralla. “That’s how we came up with the Halo rig. The ring is made out of carbon fiber, so it’s surprisingly light, and the fact that it’s shoulder-worn allows for the weight to be distributed in a way that it didn’t make a big difference to Michael. The 12 cameras were positioned all around his head and had their firmware modified so we could shoot at LOG [a logarithmic shooting profile].”
Unlike the traditional two little holes in the neck, Coogler wanted the vampire bites to be massive wounds. “I started studying dog and shark bites to see how the flesh moves and what the bleeding looks like,” remarks Special Effects Makeup Designer Mike Fontaine, who also had to show the adverse effect that sunlight has on the skin of vampires.
TOP: Special effects and visual effects worked together to create a fire tornado caused by Remmick (Jack O’Connell) being consumed by flames.
BOTTOM: Special effects supplied plenty of fire for the production, which was digitally enhanced.

“When Remmick initially shows up, his body is covered with burns, which were not meant to appear fantastical. I found these amazing photos on medical websites of second and third-degree burns and came to the conclusion that nothing was going to look as real as them. We printed those photos on temporary tattoo paper, stuck them to the body and peeled them off.” There was a specific visual aesthetic for the vampire eyes. “Ryan referenced this effect called tapetum lucidum, which is the reflective quality that you’ll see in the eyes of nocturnal animals. I reached out to this amazing contact lens artist named Cristina Patterson and discovered she had been secretly developing a contact lens that does this effect. These practical contact lenses were able to not just have a paint design but to reflect light in the actor’s eyes.” There were situations where the contact lenses could not be used because of cost, logistical or safety reasons. Fontaine reveals, “This is where the visual effects team came in. They were able to scan and photograph the contact lenses to replicate them digitally. The end result is practical and visual effects working together.”
The surreal montage where musicians from different eras appear in the same room together was more about having a surreal vibe than abstract imagery. “It was all done on set,” states Espen Nordahl, VFX Supervisor and Head of VFX at Storm Studios. “They could have shot this as a ‘oner,’ but it’s on IMAX film, so you’re limited by the physical roll of film. I believe this is the longest IMAX shot ever because it’s using five rolls of film and is around 4,500 frames. Autumn [Arkapaw] and Michael Ralla had planned out the stitches ahead of time.” One of the most difficult twinning shots to execute was the passing of a cigarette between Smoke and Stack. Nordahl remarks, “Split screens are not rocket science, but [coordinating] the interactions requires a lot of work. We did a test in pre-production where that scene was similarly shot on the lot.



TOP TO BOTTOM: Photos found on medical websites of second and third-degree burns provided important visual reference for the prosthetic makeup.
Full CG crowds were avoided for large parts of the train station arrival shots.
Originally, the plan was to have a real train. In the end, a digital version appears on the big screen.
A significant part of the environmental work centered on creating cotton fields.




TOP TO BOTTOM: Ryan Coogler discusses a scene with Delroy Lindo and Michael B. Jordan.
Muzzle flashes were period-accurate.
Lighting was a fun challenge as the 1930s is a mix of candlelight and electricity.
Michael B. Jordan goes into full vampire mode.
“They had to do so many takes that the sun changed angle, which meant relighting hats and replacing shoulders, or we could use the takes that were closer in time but not rehearsed as well. It becomes a branching thing where every problem we fix has the potential of causing another thing that needs to be tuned because now that looks uncanny. It’s old-school comping, painting, warping and retiming. The hands would never be at the same angle, so we had to recreate a lot of hands and fingers.”
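The core mechanic of such a split screen is simple even if the cleanup is not. As a toy illustration only (a NumPy sketch, not any vendor’s compositing code), two takes of a locked-off plate can be blended across a soft vertical seam; the paint, warp and retime passes Nordahl describes are where the real work lives.

import numpy as np

# Toy split-screen: take_a fills the frame left of the seam, take_b
# the right, with a linear ramp `softness` pixels wide between them.
def split_screen(take_a, take_b, seam_x, softness=40):
    h, w, _ = take_a.shape
    x = np.arange(w, dtype=np.float32)
    mask = np.clip((x - seam_x) / softness + 0.5, 0.0, 1.0)
    mask = mask[None, :, None]  # broadcast over rows and channels
    return take_a * (1.0 - mask) + take_b * mask

# Hypothetical plates: one take per twin on the same locked-off setup.
take_a = np.zeros((1080, 1920, 3), dtype=np.float32)
take_b = np.ones((1080, 1920, 3), dtype=np.float32)
comp = split_screen(take_a, take_b, seam_x=960)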
Part of the twinning effect required inserting the entire head rather than the face of Michael B. Jordan. “REVIZE™ is the name of Rising Sun Pictures’ proprietary ML-based visual effects pipeline,” states Guido Wolter, VFX Supervisor at Rising Sun Pictures. “We have extensively used it for full-body replacements, twinning and stunt face replacements in the past and are always exploring and expanding its capabilities and possibilities. For Sinners, we had to adapt REVIZE™ and tweak it for the low dynamic range of large-format film. I’m known for calling Sinners a project that is bookending technologies. We are simultaneously looking back at the beginnings of the moving picture with using large-format film while also utilizing the latest and cutting edge of ML technologies. All for the sake of successful storytelling without letting technology get in the way. Thanks to the plethora of data coming from the Halo camera rig and the principal shoot, we were able to create temporally stable, full-head ML models for MBJ. Due to our extensive research, we quickly found that simply the head volume, head size and general proportions of MBJ were different from the doubles. Only a full head replacement would give us satisfactory and believable results. We quickly abandoned the often-seen, and not very successful, hockey-mask face-swap approach and went for custom-tailored full-head models of MBJ. These models are painstakingly fine-tuned to singular shots by our ML teams.”
Digitally recreating 65mm film grain and lens aberrations was important to achieving seamless integration. “Ryan and Autumn were keen on making sure that the CG looked like something that’s been photographed on that film format through those lenses,” notes Nick Marshall, Visual Effects Supervisor at ILM. “We received some good lens grids from the production, and then set about making sure we could mimic every little quality of the diffraction spikes, the way the bokeh forms, chromatic aberrations, and near and far defocus.” ILM was responsible for the train station where the train was completely CG. Marshall explains, “Special effects took a steam smoke machine and ran it on a dolly down a track built next to the practical rail tracks, which were there on location. We didn’t want to go full-CG crowd for the large parts of the shots, so having that practical steam interacting with the crowd was a great way to give us something to blend into. When we designed all our train station environment extensions, we slightly repositioned our train tracks to match up with that dolly track they’d built on the location so we could have our train closer to the platform, but running down the same position as the dolly. We carefully integrated some additional steam simulations.”
Light VFX was given the responsibility of producing the ominous vultures that foreshadow the arrival of the vampires. “One of the challenges we had was having birds going from standing on the ground then flying off, and vice versa, in the same shot,” states Antoine Moulineau, Founder, VFX Supervisor & Creative Director at Light VFX. “On a lot of shows, we usually have two assets, one for when they’re walking and another for flying. But on Sinners we couldn’t do that. We had to do it the hard way, which means folding everything correctly, having the right position of each feather, and then, with Houdini, do some cloth simulations on the feathers to make them bend and collide as they should. It was a purely physical thing. It was all edited by hand. We didn’t take any shortcuts. We had real footage of vultures walking around. As we were building our CG asset, it was placed next to the real vultures until we reached a point where no one could tell which one was fake.”
The car shootout at the end was captured for real. “We took the same car from Bonnie and Clyde: Dead and Alive [2013], which had 118 squibs, and used that for R&D to figure out what the bullet holes would look like and to rehearse the camera moves,” remarks Special Effects Coordinator Donnie Dean. “We did 10% of the bullet hits to start getting the camera moves down and gradually ramped up until we had over 80 squibs on the car going through it on each side. You had it so that the squibs move around the car with the camera because the camera can’t see the entrance and exit [of the bullet hits] at the same time; therefore, you had to space them out but not make it look weird.” Fire has a prominent presence. Dean remarks, “We now have access to materials that are more fire resistant than you might have had 10 or 15 years ago. You see the barn door go up in the movie, and we could build that in a way where you can burn it and turn it off and it’s not marred. When you see the ceiling burn away, that’s a different story. That took three months to figure out. We tried different materials because there is a certain amount of time [30 to 45 seconds] you have for that roof to burn away. The fire tornado [caused by the death of Remmick] was practical, and that also took us three months. It was a combination of a mechanical situation with propane adaptors. There were fireballs spinning around that create the swirl and a big metal turbine in the middle of it all. We fed that with big vacuums used for [cleaning] parking lots. We plumbed that in to push the air into the big turbine and then plumbed it with massive inlets of propane. If you shove enough propane in there it will still burn. We went through $1,000 worth of propane for one shot. The fire tornado went 60 feet high and touched the ceiling of the studio!”
With the proliferation of digital cameras, whole generations of digital artists have never worked with celluloid, in particular 65mm. “I gave 90-minute lens seminars on Zoom, and Nick, Espen and Guido had to train their compositors how to work with film because they had never touched scanned celluloid,” Ralla states. “People were going, ‘What is dust-busting? Where are these scratches coming from? Why does one frame pop?’ We had to come back to what was filmed and apply that to all of the CG stuff. It gets tricky because this film has 1,013 shots, which is more than 50% of the runtime.” Dealing with celluloid is not terribly complicated. “It requires an understanding,” Nordahl notes. “We need to get rid of the unwanted artifacts like hairs, keep the grain and halation, but not the gate weave [which is a creative choice]. It was like a re-education of a lost art, and the resolution of those IMAX plates was ridiculous!”
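As a rough illustration of one of those lessons, the kind of grain pass a compositor might add so CG sits into a film plate can be sketched as luminance-weighted noise. This is a toy, assumption-laden example; real grain matching (per-channel response, halation, scanner behavior) is far more sophisticated.

import numpy as np

# Toy film-grain pass: add Gaussian noise weighted toward the
# mid-tones, where film grain reads most strongly. Illustrative only.
def add_grain(img, strength=0.03, seed=0):
    rng = np.random.default_rng(seed)
    luma = img.mean(axis=2, keepdims=True)
    midtone_weight = 4.0 * luma * (1.0 - luma)  # peaks at luma 0.5
    noise = rng.normal(0.0, strength, img.shape)
    return np.clip(img + noise * midtone_weight, 0.0, 1.0)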




One of the digital creatures was a rattlesnake that gets stabbed with a knife.
TOP TWO: Vampire bites were inspired by those of wild animals such as sharks. Scarred-for-life Sammie Moore (Miles Caton) grew up to be a legendary Chicago blues guitarist, played by a real legend, Buddy Guy. Tapetum lucidum, which is the reflective quality found in the eyes of nocturnal animals, inspired the look of the vampire eyes.

VR THERAPY AND IMMERSIVE MENTAL HEALTH HEALING WITH DR. “SKIP” RIZZO
By NAOMI GOLDMAN
Extended reality – the AI, VR and AR technologies pioneered by Albert “Skip” Rizzo, Ph.D. – has revolutionized the delivery of patient-centered mental health care, offering innovative, engaging and effective therapies that are changing the treatment landscape for medical professionals and relief-seekers.
As a clinical psychologist, Director of the Medical VR Lab at the University of Southern California Institute for Creative Technologies (ICT) and Research Professor in USC’s Department of Psychiatry and School of Gerontology, Dr. Rizzo has spent more than three decades at the forefront of clinical virtual reality, conducting research on the design, development and evaluation of VR systems across the domains of psychological, cognitive and motor functioning in healthy and clinical populations. A trailblazer at the intersection of clinical psychology, neuroscience and immersive technology, he has focused on PTSD, autism, ADHD, Alzheimer’s disease, stroke, psychedelic therapy, suicide prevention and other clinical conditions. Dr. Rizzo is best known for his empirically validated use of VR in the treatment of combat and sexual trauma-related PTSD.
OPPOSITE TOP TO BOTTOM: Bravemind allows veterans to process their trauma memories in a safe, controlled setting. Clinicians control the stimulus presentation via a separate interface and are in full audio contact with the patient.
Dr. Albert “Skip” Rizzo demonstrates advancements in the Bravemind therapy.
Exposure therapy for veterans centers on virtual Iraq/Afghanistan environments (Bravemind).
In this one-on-one with VFX Voice, Dr. Rizzo shares his current views on the dynamic field of VR-based therapy, the deepening of his trauma-based work, and harnessing AI and digital humans.
VFX Voice: In the years since you last discussed VR/AR/XR and their use as therapeutic tools with VFX Voice in 2017, how has the field evolved?
Dr. Rizzo: The whole field of clinical applications using VR/XR/AR has evolved dramatically in several ways.
Images courtesy of Dr. “Skip” Rizzo and the USC Institute for Creative Technologies.
TOP: Albert “Skip” Rizzo, Ph.D., Director, Medical Virtual Reality Lab, USC Institute for Creative Technologies.
First, the technology has caught up with the vision, including products developed by Meta, PICO and HTC that have surpassed what we could have dreamed of – low-cost, high-fidelity headsets and tech for developing these experiences. So, we can really deliver at scale for clinical purposes.
Second, the science has advanced to the point that we can confidently say that when applied thoughtfully within a proper clinical perspective, these virtual applications make a difference for people in a scientifically valid way. We have data that confirms that these virtual approaches are as good as, or not inferior to, traditional methods, and are actually more appealing, so people will do more. And since everything is measured in a VR environment, we have an ongoing pipeline of data to continually feed back in and evolve the programs and the user experience.
Third, companies are taking academic findings and turning them into products. While that is happening at an accelerated rate, it’s not mainstream yet. As new programs are developed, it’s difficult to break into the medical field, and there’s a learning curve on what programs work for what headset. We need a convergence that brings it all together – a Consumer Reports-type inventory for both clinicians and lay people to find the best programs and ratings. Folding in virtual humans and virtual environments is the next runway.
VFXV: On that note, your work with virtual humans and support for military veterans has converged with the creation of Battle Buddy, an AI-driven mobile health (mHealth) application tailored exclusively for veterans and developed in response to the U.S. Department of Veterans Affairs’ Mission Daybreak Challenge to end the epidemic of veteran suicide. Walk us through this program.
Dr. Rizzo: Consider Battle Buddy an ‘in your pocket support agent.’ A virtual human that can interact with users on health and wellness content, as a springboard to real-world support networks, to be a companion for conversation or to play games. In the event of a suicidal crisis, Battle Buddy’s primary focus shifts to guiding veterans through their personalized safety plan. We have incorporated deep content from the Veterans Administration – from questions about the G.I. Bill to finding an apartment to relationship advice – to be that supportive bridge, especially for new veterans. In that first year of transition from military to veteran status, when the structure they have been accustomed to falls away, the suicide rates double. Battle Buddy was designed specifically to address the crisis of veteran suicide and be a companion, and sometimes a lifeline. This module is still in development, and through our partnership with Sidekick [an Augmented Human Intelligence platform that empowers subject matter experts to scale their knowledge through secure, expert-guided AI agents], it has enormous potential to address this crisis.
VFXV: You have envisioned a multitude of applications based on the Battle Buddy model. What are some other populations you’re exploring that could be served by a virtual human companion?
Dr. Rizzo: I’m very interested in maternal health care and disparities. So, the idea is to build out a Birthing Buddy – testing this same model. The user can make an AI character and set up the program to be their partner, remind them of doctor’s visits, offer wellness support, have access to detailed resources and data, and engage in dialogue. While using it, the AI software agent is getting to know the mother-to-be’s health and mental health, and after the baby is born – because the relationship and trust are already established – it can be there to support the mother through post-partum depression and any issues. It can harken back to prior conversations and offer a voice of support with no shame. On the issue of whether AI should be a substitute for real-life therapists – let’s say not replace them. But some people may not seek help for myriad reasons – from lack of resources to shame of stigma – and this is where AI can fill the gap and be an agent of change.
VFXV: Your groundbreaking work on Bravemind, a virtual reality exposure therapy for treating post-traumatic stress disorder in veterans, has been a center point of your trauma-related work. How has Bravemind expanded its scope and applications?
Dr. Rizzo: Our combat-related work with veterans who served in Iraq and Afghanistan has continued, and we have now extended the range of traumatic experiences to include sexual assault survivors within a military context. We integrated VR contexts, putting people into these experiences in the field and at U.S. bases. The model was built out, tested and found effective, so now it is a full-fledged tool for both civilian and military sexual trauma.
Along the way, we started working with people in Ukraine to build a program addressing the civilian and combat experience and the great need for support. USC donated the software code base to a developer in Ukraine, and we built a program with several experiences – a wartime scene in a wooded area with trenches and drones; a city scene; and a country village scene taken over by force – where people can walk around destroyed buildings, experience tanks and Russian soldiers on patrol, or hide out in a house as soldiers approach. We also created a metaverse social gathering place for refugees, where people can occupy an avatar and interact in a shared social support experience to talk about grief, loss and hope. A lot of love went into building this, and we expect the first open clinical trial by the end of 2025.
We are developing a similar process in Israel in partnership with clinicians at Ben-Gurion University in Israel. This model to treat trauma includes Gaza-like war scenes, and experiences of Hamas missiles flying into homes and kibbutz communities – it is an apolitical tool, working on capturing the full spectrum of the lived experience on the ground. This model takes a different clinical approach – not just exposure to re-process the experience, but a means to go into a space that looks like your wrecked home, and allow the user to visit and clean it up as a means to work through the trauma.
VFXV: How will these VR tools developed for people in Ukraine and Israel be available?
Dr. Rizzo: All my work has a clinician component.
TOP LEFT: Battle Buddy is an AI-driven mobile health app designed to support veterans by functioning as an ‘in your pocket support agent’.
BOTTOM LEFT: Battle Buddy incorporates deep content from the U.S. Veterans Administration to serve as a rich resource and supportive bridge for new veterans.
In the Ukraine model, the therapy is administered with a clinician, which can be remote; it’s not a self-help model. This is hard medicine for hard problems, in going back to the scene of trauma to confront and process emotional memories in a safe place – even conceptually. Going through the emotional activation with the support and very active engagement of a clinician is key; as someone talks about their story, the clinician can adjust the VR scene to align with the evolving narrative and enhance the realism and effectiveness of the therapy. In Ukraine, the therapy will ultimately be delivered through the International Institute of Postgraduate Education in Kyiv, a psychological training institute with 160-plus clinicians.
VFXV: Using VR to support people with autism spectrum disorder is another focus area of your work. How has this work grown and expanded?
Dr. Rizzo: We developed VITA (Vocational Interview Training Agents) to help people on the autism spectrum who are high functioning and skilled, but have issues with social interaction, to help them succeed in job interviews. We built virtual human characters that reflect different demographics and can pop them into different job contexts [office, restaurant, warehouse] – in total, six characters, seven work contexts and three levels of provocativeness – a nice, neutral or nasty interviewer – allowing for 126 distinct scenario combinations. This can be done on a headset, but mostly we’re delivering this role-play on a laptop or big-screen TV, so the user can sit with a vocational counselor and keep practicing, combining skills training and an experiential approach to confronting fear. Through a partnership with the Dan Marino Foundation in Florida, this 10-year program has been distributed to more than 170 schools, special needs programs and clinicians. We made the application generic enough where we only had to change some lines to make it appropriate for veterans or other disadvantaged populations with challenges to overcome in the job market. And we’re now working to integrate more AI into the program components to expand its capacity.
VFXV: What are some of the hot button ethical issues and implications related to VR/AR/XR and treating mental health?
Dr. Rizzo: The big debate is centered on two issues. First, when it comes to bringing AI into the mix of clinical care – and replacing clinicians with virtual therapists or AI support agents – should we be doing it in the first place? In the ideal world, all humans would be optimal. But looking at the latest World Health Organization stats that estimate 1 billion people have a mental health issue and two-thirds will never see the inside of a therapy office – because of access, resources or stigma – conversational AI can play a vital role in meeting unaddressed needs. Research shows that some patients are more likely to disclose personal information to an AI character as they have less worry about being judged or making an impression. And the aim is always that if you have a serious problem we understand through AI interaction, we will try and guide you to talk with a real person offering genuine empathy and counsel.
The other issue is trust. We don’t fully know how AI agents actually work, and they sometimes hallucinate and give bad information, notably in two cases where the AI agent helped users develop and execute suicide plans. Now, ChatGPT has safeguards if you start talking about suicide. And when it comes to our apps like Battle Buddy, all of the content delivered to the veteran is researched and vetted, and the program has been red team-tested. As we navigate these important considerations and manifest our responsibilities as researchers and clinicians, it is clear that innovative XR deployments are scaling our ability to deliver patient-centered care in ways that transcend traditional treatment.



TOP TO BOTTOM: The Ukraine PTSD exposure therapy project builds on the Bravemind model created to serve military veterans and features virtual city and village environments.
Users can enter destroyed buildings or hide out as soldiers approach in the Ukraine-focused virtual therapy.
Refugees in Ukraine can gather in a metaverse via avatars and interact in a shared social support experience.

HONORING THE PAST WHILE SETTING A NEW COURSE WITH ALIEN: EARTH
By TREVOR HOGG
Novelist-turned-filmmaker Noah Hawley has a habit of putting his own off-kilter twist on franchise building, whether it be psychic powers being mistaken for mental illness in the X-Men-connected Legion, honoring the bloody and sardonic sensibilities of the Coen brothers in Fargo, or combining the corporate-greed-running-scientifically-amok theme with the race to commercialize immortality in Alien: Earth. All three of the television productions found a home on FX, with the third one set two years before Ridley Scott’s Alien. The show consists of eight episodes revolving around a terminally ill child who has her consciousness transferred to a synthetic adult body and then attempts to assist her older brother in finding survivors from a crashed research spaceship containing five invasive and predatory alien specimens that break free and wreak havoc on Earth.
Images courtesy of FX.
TOP: Concept artwork showcasing the skyline of Prodigy City.
OPPOSITE TOP: A practical Xenomorph is filmed causing havoc inside the research facility run by Prodigy. (Photo: Patrick Brown/FX)
OPPOSITE BOTTOM: A fully CG shot of the Maginot, which maintains the Weyland-Yutani Corporation aesthetic established in Alien.
“The remarkable thing, after seven Alien movies, is how little mythology there actually is,” notes Hawley, who serves as Creator, Showrunner, Executive Producer, Writer and Director. “I’m basically working off the first two films as my template. We know there’s a company called Weyland-Yutani, but what is the geopolitical order on Earth? The luxury for me, coming in at this later date, is that I got to create out of whole cloth. There are five corporations fighting to see who will be the monopoly left standing at the end. In choosing to explore the Prodigy Corporation and the idea of the Boy Genius, the Peter Pan mythology of the Lost Boys and human hybrids allowed me to take Weyland-Yutani, which is everything to the films, and make it a component of the TV franchise. In the same way, by introducing these new creatures, I’m able to take the Xenomorph, which is everything in the films, and make it a fan-favorite element in the show. My job is to recreate the same feelings you get from the first two films, but by telling a totally different story. One of those critical feelings was discovering, for the first time, the life cycle of this creature, which expands the horror.”
There always had to be something real in the frame. “I’m not attracted to a pure greenscreen environment,” Hawley states. “There are some shots of ships coming in to land in a city that might be entirely CG, and those can work well. But if you have a green box, I find it tough, and so do the actors, to get into that head space. We had a performer in a Xenomorph suit. For these other creatures, we did not have animatronic versions, but had realistic props made so at least the actor can see what they’re dealing with. I also think about the length of the shot. If I’ve got a suit performer in a Xenomorph costume with a tail on a fishing line for a half-second shot, I can buy that. If it’s a stunt in which the tail is going to throw the performer off balance, we remove the tail and do it as CG.” Hawley sought out some visual effects advice. “I ended up calling Denis Villeneuve while I was making the show, and I told him that every other visual reference I have on my wall was one of his movies. He is so good in Arrival or Dune or Blade Runner 2049 at showing you how big something is. It’s actually very hard. I wanted the spaceship crash to feel like a disaster that is beyond human scale.”
Driving the visual aesthetic was Alien and Aliens. “What I liked about those two movies was that they came from a time when computers weren’t used for much of the design work, so it was more like the manual version,” remarks Production Designer Andy Nicholson. “But more than that, if you lived in 1979, that was your vision of the future. A lot of the shapes and elements from European furniture and car designs of the 1970s to the early 1980s were in the prop dressing. The Nostromo in Alien had such a solid ship design that was very much its own. It wasn’t like Star Wars or 2001: A Space Odyssey. The Weyland-Yutani ship in our show, the USCSS Maginot, has similar crew quarters, canteens, cryo chambers and a bridge, because people move between ships, and you have to know where everything is.” Shooting took place in Thailand at three different studios and on 16 to 21 stages. Nicholson adds, “The crew quarter corridors were white and the engineering and flight corridors were darker. We had them set out on the same stage so you could transition and shoot on the same day between places.”
“The creatures were well advanced in terms of what they looked like before we designed and built the sets,” Nicholson remarks.
TOP: The Maginot passing by planetary rings was one of the 2,000 visual effects shots contributed by Untold VFX, UPP, Zoic Studios, Pixomondo, Fin Design + Effects, Crafty Apes and Cos FX Films.
BOTTOM TWO: The cyborg Morrow (Babou Ceesay) is given a mechanical arm.

“Some of the spaces they appeared in had to be augmented to work with them, but that was more to do with the container vessels. The Xenomorph eggs that get carried into the ship were so big that they dictated the size of a couple of the doors on the sets.” The art department concepted Prodigy “Neverland” Research Island, which is located in New Siam. “We did a two-kilometer by two-kilometer section of Prodigy City in Unreal Engine based around the location we were shooting the crash site. We handed over to visual effects a many-gigabyte package of the city with a kit bash of parts so that you could expand that building type out as far as possible from wherever we needed to.” Elevating the fear factor in the sci-fi franchise is a particular architectural design element. “They’re in this ship with vents everywhere, and who knows how big this alien is? Where’s it going? It doesn’t feel like you shut the door and can be safe. It’s not even safe when Ripley gets into the shuttle at the end of Alien. We wanted to keep that fear going through this, and that’s the reason why there are vents in all the rooms in Neverland, even in the operating theatre.”
The static, animatronic and puppet renditions of the Xenomorph eggs were significant props to construct. “We decided to do a few different versions, and gave them various textures to see which one looked close enough to the original,” explains Sarinnaree “Honey” Khamaiumcharean, Creature & Prosthetic Supervisor. “We had more than 30 eggs on standby that were made from soft silicone and had a clear, slimy liquid put on top. A full version was made where everything moved in a creepy way. When the egg opens you see the placenta of a Facehugger. It was our masterpiece! We also had fun creating a super-detailed two-centimeter Chestburster that was put into a tube placed inside of a lung. It’s going to be the new iconic scene!” Making an alien creature that has never been seen before appear realistic enough to be onscreen was a


TOP: Exploring the layout of the Prodigy R&D facility situated on the Neverland Research Island.
BOTTOM TWO: Atmospherics such as dust emphasize the size and magnitude of the crash site.



TOP TWO: Wendy (Sydney Chandler) jumping from the cliff demonstrates the superhuman abilities of her synthetic body.
BOTTOM: From left: Jonathan Ajayi as Smee, Adarsh Gourav as Slightly, Sydney Chandler as Wendy, Timothy Olyphant as Kirsh, Kit Young as Tootles, Erana James as Curly and Lily Newmark as Nibs leave the Neverland Research Facility to aid the rescue operation taking place at Prodigy City.
significant task for the crew in Thailand. “We had about 10 sizes of Ticks. The big Tick could suck all of your blood out. I called my friend, who is a doctor, and asked, ‘How many liters of blood do humans have?’ It’s four to five liters depending on your size. We put in a sac that had four to five liters of blood to see how it was going to look.” Complicating matters was the climate. Khamaiumcharean notes, “We live in a country that is quite humid and hot, so the bigger challenge was for us to find the material that could be kept outside for a long period of time. It’s difficult to keep things in the fridge all the time. Also, we had to continually remake synthetic blood because every two weeks it would be gone!”
Whether for a television or feature production, the quality and standards for prosthetic makeup are no different. “The problems you come across are exactly the same,” notes Steve Painter, Lead Prosthetic Supervisor & Designer. “I will prepare two or three takes of prosthetics as backups. The fake dead bodies were sourced offsite in the U.K. because there was just no way with the number of people I had in Thailand and the local materials. Sometimes the paint was still wet when we were taking them onto set. It was close.” One has to be able to adapt to the on-set conditions. “If you curl hair in the humidity in Thailand, it straightens by itself, so you have to hairspray the heck out of it to keep the hair where you want it.” An important relationship was forged with the visual effects team.
“If Jeff Okun [VES, Onset VFX Supervisor] needed something from me, I was happy to give him whatever he needed to make his life easier. From my point of view, there are certain things we can’t do physically, and that’s where Jeff came in. We became good friends over the course of the shoots and production.”
“Basically, all the killings were down to us, and it was great fun to try to work out how,” Painter laughs. “There’s a character, Bergerfeld [Dean Alexandrou], who literally gets his face ripped off by the Xenomorph. We had him crawling away wearing green leotards because he was supposed to have no legs. A stuntman representing Bergerfeld was all wired up, and the Xenomorph has his head in its mouth, lifts him up like a ragdoll, shakes him around and then throws him. It was incredible to watch being shot. David Rysdahl’s death was a huge scene for us. I got onboard with one of the Jurassic Park and Star Wars animatronics guys who I’ve known for years, and we designed what had to happen. Noah sent me the Chestburster scene of John Hurt in Alien, but what he did not tell me was that it would be on a beach in Krabi. We did a mock-up of part of the beach in Bangkok and had a raised platform, so all the puppeteers were underneath. Where we tried to improve on the original Chestburster scene is that you never see John Hurt’s legs. I wanted to see the whole body, so I devised a puppetry system where we could puppeteer David’s fake leg. It was the legs moving and twitching that sold it, showing that this was not a man with his head and arms poking through a hole in a table. It looked amazing, and I even got to hit the lens with a blood spurt!”
“I’ve been fortunate enough to work on Star Wars, the recut of Blade Runner and now Alien, so I’m living my kid’s dream of sci-fi,” states VFX Supervisor Jonathan Rothbart. “My mantra on this show was that anytime we started seeing visual effects that felt modern, I tried to push them back to the look and feel of 1979.” The digital augmentation was shaped by the choice to shoot with anamorphic lenses. Rothbart observes, “We spent a lot of time mapping out and focusing on what those anamorphic lenses are doing and how they affect all the images, because each one has its own little texture and look. We did a lot of softening of things to get it to play in the plate, and then we also have the chromatic aberration, where you start seeing the different colors split on the edge of the frames. And, of course, flares and grain level, which is not the


TOP TWO: A rooftop view of the Maginot crashing into the twin towers of Prodigy City.
BOTTOM: Shooting a scene in the control room that houses the artificial-intelligence mainframe nicknamed Mother, which monitors the automatic flight and environment functions of the Maginot. (Photo: Patrick Brown/FX)


lens but us trying to emulate the right kind of film stock.”
Along with the Tick, other new creatures featured in the 2,000 visual effects shots are the Eye Midge, Orchid and Fly. “Anybody who asked me, ‘What’s your job like?’ I answer, ‘It’s the best job in the world!’” Rothbart laughs. “I get to make spaceships and monsters. We’ve got some great facilities. Our primary vendor was Untold Studios, which did the Eye Midge and Xenomorph. Pixomondo came on later as well. Zoic Studios handled the Tick
TOP: Wendy (Sydney Chandler) anxiously waits to be reunited with her brother, Hermit (Alex Lawther), as she flies towards Prodigy City.
BOTTOM: The prosthetic makeup of an infected body is digitally enhanced.

and the Orchid. UPP did the majority of the space work, and Fin Design + Effects did Morrow’s CG hand. The two hardest creatures were certainly the Eye Midge and the Tick. I love the Eye Midge most of all because we’ve been able to put so much character into its little moments. But the Eye Midge is also this evil, vicious creature that is intelligent and takes over whatever vessel it puts itself into. At first, the animators were giving me a sheep with an Eye Midge, but I said to them, ‘It has to be an Eye Midge trying to control a sheep, so it moves quirky.’ As for the Tick, any subtle changes in lighting really affected the way the character looked. It was a lot of work to get the Tick to feel right, and it’s in so many different lighting scenarios. Then there’s the difference between when the Tick is filled and not filled with blood. That was definitely a process to figure out.”
Prodigy City was a major environment build. “We had gotten some concept art from the art department, and Noah was interested in telling the substory of what he called the ‘plus versus minus’ of society,” Rothbart states. “The idea was that the wealthier lived above ground and the poorer people lived in all these dwellings below ground. We tried to come up with a structure that would both be believable and make sense for that concept. We added these levels to the city. Most of the events in this particular season take place in the upper level of that city. It’s a cool concept that I’m hoping Noah will investigate further next season.” Plate photography was always the starting point. “We would strip out a lot of the buildings, then rebuild our buildings on top of that, as well as the lower area underneath, so we felt more of that futuristic city. We’ve got some really huge shots, like the big ‘oner’ when they go to the crash site, which was insanely difficult with so many layers and so much going on.” Sparks are a prominent atmospheric at the crash site. Rothbart notes, “We took what was there and tried to
“I ended up calling Denis Villeneuve while I was making the show, and I told him that every other visual reference I have on my wall was one of his movies. He is so good in Arrival or Dune or Blade Runner 2049 at showing you how big something is. It’s actually very hard. I wanted the spaceship crash to feel like a disaster that is beyond human scale.”
—Noah Hawley, Creator, Showrunner, Executive Producer, Writer, Director

TOP: Illustrating the growth cycle of the Tick.
BOTTOM: The Eye Midge resides in the eye socket of its victim.


minimize the sparks in a lot of places and then rebuild them with our own sparks so we could be more specific with the art direction.”
One of the more stunning reveals occurs when Wendy (Sydney Chandler) displays the superhuman strength of her new synthetic adult body by jumping off a cliff, landing safely on the beach and sprinting at high speed back to the Neverland research complex. “We had a cliff that they had photographed, but it wouldn’t work with the distance and scale,” Rothbart reveals. “We ended up making that a fully CG shot for the jump down, and then we have a takeover from when she stands up and does her initial jump. Those shots are always hard because people have a side-by-side comparison of live-action to CG.” A subtle and extensive effect was the CG hand belonging to Morrow (Babou Ceesay). Rothbart explains, “That was a complicated asset because you can see through to the internals, but you also have the external. We had to make sure that the captured light hits both inside and outside and plays off each other.” The visual effects will not disappoint fans of Alien and Aliens. “All of our facilities did great work. It’s been a joy for me in the sense that you get to think about the cool things you can do with a shot as opposed to how to fix it. That is the best place to be if you’re doing this job.”
TOP: The cryopods and cantina for the Maginot were inspired by the Nostromo, resulting in the original Alien art department team of Production Designer Michael Seymour, Art Director Roger Christian and Set Designers Alan Tomkins and Benjamin Fernandez receiving credit for Episode 105. (Photo: Patrick Brown/FX)
BOTTOM: Kirsh (Timothy Olyphant) supervises the five alien specimens taken from the crashed Maginot.

WINNER OF THE 2024 FOLIO: OZZIE AWARD
Best Cover Design VFXV Winter 2024 Issue (Association/Nonprofit/Professional/Membership)
HONORABLE MENTIONS FOR THE 2024 FOLIO: EDDIE & OZZIE AWARD
Best Cover Design VFXV Fall 2023 Issue and Best Full Issue for Fall 2023 Issue

The Folio: Awards are one of the most prestigious national awards programs in the publishing industry. Congratulations to the VFXV creative, editorial and publishing team!
Thank you, Folio: judges, for making VFXV a multiple Folio: Award winner.

MASSIVE ENTERTAINMENT STRETCHES THE GALAXY FOR STAR WARS OUTLAWS
By TREVOR HOGG
Considered part of the revival of Lucasfilm Games, Star Wars Outlaws is the first open-world Star Wars video game, in which small-time thief Kay Vess becomes entangled with the Pyke Syndicate, Crimson Dawn, the Ashiga Clan and the Hutt Cartel. Set between The Empire Strikes Back and Return of the Jedi, the game was published by Ubisoft and developed by Massive Entertainment, the studio responsible for the real-time tactical games Ground Control and World in Conflict. The action-adventure revisits familiar faces and places while introducing new planets and characters that expand the scope and understanding of the ruthless criminal underworld. The end result went on to receive the 2025 VES Award for Outstanding Visual Effects in a Real-Time Project, highlighting how visual effects have become an integral part of providing players with an immersive experience.
Images courtesy of Ubisoft Entertainment and Lucasfilm Ltd.
TOP: To give the imagery a cinematic feel, inspiration was taken from the Ultra Panavision 70 camera used to capture the footage for Rogue One: A Star Wars Story.
OPPOSITE TOP: The Trailblazer flies through the Khepi system.
OPPOSITE BOTTOM: Laser bolts had to appear not too saturated or luminous, no matter the time of day or environment.
Even though the story is set in the Star Wars universe, an effort was made not to be self-referential when devising the shape language for the game. “It’s not just to reiterate all of the things that you have seen from Star Wars, but to bring new things into Star Wars,” notes Benedikt Podlesnigg, Art & World Director at Massive Entertainment. “One thing that we tried to avoid doing was only looking at Star Wars reference to create new Star Wars because, in the end, you will never create something new. We looked at what inspired George Lucas, Ralph McQuarrie and Doug Chiang when they created new things. What are they looking at? What is the cultural influence? Using the example of Toshara, we examined what if, instead of making a game, we made a movie. Where would we go and film it? Then, we explored that location and drew inspiration from it. The winds shaped Toshara, so we brought aerodynamics into it, which answered many questions for us. It influenced the shapes and patterns.”
The wind needed to be simulated on Toshara, which raised the complexity. “Normally for a game, you would have one wind vector, which gives you a direction and a strength,” remarks Stephen Hawes, Technical Art Director at Massive Entertainment. “The biggest challenge was that we changed the wind system in Snowdrop [the in-house game engine] to be more like the Beaufort system for wind strength. When the art director says, ‘I want strong wind,’ you give him strong wind, and everything is flapping around. ‘That’s too strong. Let’s pull it back.’ But there’s
no word to describe or number to put on this. We literally made a weather vane prop that would spin based on the wind, and it would give a read on how windy it was. We had all of the wind settings with a written English term like ‘still’ for no wind and ‘hurricane’ for a violent storm. On the cantina, you have little slats on the windows that move based on wind strength. If the wind strength is too high, they will move too much. It was important that we got that value nailed down with Ben, then tweaked things, like more



BOTTOM: Ray tracing was an important part of being able to light the cryogenic room properly and consistently.
movement in the grass or less movement on the clouds. That’s an easier distinction to make to get the whole world working. We added a whole different wind track for how the grass moves.”
Ray tracing was an indispensable tool for ensuring lighting continuity. “We used ray tracing to give us quite a bit back in terms of real-time lighting updates,” Hawes explains. “There are many things you can do when the lighting isn’t baked. This allowed us to create interesting set pieces. One of the examples we used as part of our [VES Awards] nomination reel was the collapsing tunnel. You’re escaping a ship that is landlocked, falling apart, and the corridor is slipping vertically while you’re in it. You’re sliding down through it as explosions are going off. There are many moving parts that we can do far more with now to keep the lighting consistent.” A new environmental experience for Massive Entertainment was going beyond the ozone layer into outer space. “For Snowdrop, one of the big new improvements was how we handled space content, especially the nebulae that we have on Kijimi or Toshara, which is one of the space regions we created that has huge volumetric clouds where you have more or less density,” explains Baptiste Erades, Senior VFX Artist at Massive Entertainment. “The engine was never meant to put the player inside such a big mass of overdraw; we usually try not to put the player in that perspective. However, since we are in space, and there are many examples in Star Wars where they have
TOP: The Nix mode allows the character to perform various tasks, such as open doors, distract guards or pick pockets.

this situation, like in the Kessel Run, we had to create a whole new technology that improved our volumetric system so it can also be used for clouds and such things.”
The iconic speeders also have to interact with the various environments. “It was a lot of fun, but still challenging, because at some point when a speeder gets chased by other speeders, you start to have a lot of visual effects and elements reacting to them,” Erades states. “Performance-wise, this was quite tricky. One of the things that was challenging, especially in the Star Wars universe, is that the speeder floats in the air, so you don’t have any real feedback elements. You don’t have any wind blowing, except for when they start moving. We had to find a way to make them feel real. Usually, in other games, you have the wheel turning fast and white smoke coming out. We had to drag some of the ground material up, mud or dust depending on the surface, to make the speeder feel more anchored, even though it was actually flying. It was the same when you jumped. Depending on how your speeder lands, you will have some dirt spikes coming up next to you to emphasize the heavy damage that you can take on top of this huge speed. We’re trying to get all these elements into our scene.”
Research was conducted into the design techniques and camera choices that the various filmmakers had made for big-screen installments of the Star Wars franchise. “Once we had that narrative, we treated it as a movie set,” reveals Bogdan

TOP: The wind system in Snowdrop was altered to honor the Beaufort system for wind strength.
BOTTOM: Much of the world-building was devoted to the arid moon environment of Toshara, with Mirogana City being the central hub of activity.


Draghici, Realization Director at Massive Entertainment. “When doing the mocap, we recorded all of the cameras, so the camera movements and feel come from cameras being operated by real camera operators. We realized that the imagery for this project would be challenging to achieve with our standard camera tools. We had discussions with Stephen [Hawes] and his department. They created this unique lens project that emulates certain types of cameras and lenses. In our case, the camera we were targeting was the same one used in Rogue One: A Star Wars Story. It was Ultra Panavision 70. It’s an anamorphic camera, which comes with a specific set of artifacts and traits. We tried to go as cinematic as possible for each cinematic [shot], not only to support the story but to feel like a small movie.”
Getting the correct visual aesthetic also meant taking a retro approach. “We looked at every reference for characters, clothing or technology from the era when the movie was made,” Podlesnigg remarks. “There were 1960s and 1970s technologies that we looked at for the Trailblazer. There are some muscle cars in there. We used a Swedish motorbike as a reference for our speeder. Even for the characters, Kay Vess’s haircut is inspired by the style seen in the movie. It’s these kinds of references and all of these small things put together that make the aesthetic work. If you take one thing that is too modern or too old, it doesn’t work anymore.”
An important aspect of the world-building was being true to the setting being depicted. “Toshara is a moon with a solid Amberine core that has a crust on top of it that is slowly being stripped away
TOP: Modified BX Commando droid ND-5 serves as the co-pilot of the Trailblazer.
BOTTOM: Kay Vess rappels down the side of a building in Toshara.

by the wind,” Podlesnigg states. “Everything was created with that in mind. The wind direction, architecture, harnessing of wind for electricity, the choice of lenses you see in space, the nebula and the dust clouds revolve around this concept. Our world-building team has done a tremendous job hitting all of these small notes.”
Cinematics have different aspects when it comes to complexity. “We have the Barsha standoff where the intensity of the emotions is quite high, and that is to be reflected in the performance,” Draghici states. “Not to mention, compared to film, all of these performances have to be translated to CGI puppets. How can we make this transfer to preserve as much of the actor’s performance as possible? Then we have the Darth Vader scene. Because Sliro Barsha has such a big ego, he displays artifacts collected from conquered gangs and species in a glass case. I wanted to play with reflections on the glass. Sliro Barsha is admiring his collection, and then, in the glass reflection, you see the door opening and Darth Vader comes in. We had to do all the math behind where the character needs to stand, where he needs to look, what point on that glass he needs to look at, how the camera needs to move around the character to make sure that it’s at the perfect moment for when the door opens and Darth Vader is revealed. And there’s the complexity of seamless transitions and keeping the player immersed.”
Devising a hologram system was not easy. “It sounds straightforward, but we had to answer interesting questions, like what if you look at a hologram from top down?” Hawes states. “I’m proud of

TOP: Among the familiar faces making an appearance is Jabba the Hutt.
BOTTOM: Kay Vess and Nix come across an old ship wreckage site.


the results we got. Moreover, it helped with the narrative because if the script changed and this character was no longer on the ship but needed to be talking, how do we do that? The character can be added through a hologram, which allows for the scene to be placed back in. Keeping in mind the era you’re in, the holograms change in terms of aesthetic. Giving the art director the tools to tweak and make the holograms look as they needed, while also ensuring they worked aesthetically, was a big challenge. In the end, the holograms became quite a strong element within the game.” Blaster fire was another interactive light element. “For tracer bullets, you usually boost the intensity for daytime, otherwise you will not see it,” Erades explains. “You have another intensity for night and maybe a third one for when you’re inside a building. Our game is quite difficult because we have multiple planets with different times of day, so it’s hard to know if it’s nighttime or daytime, and you have space with a different sun that has various colors. It was challenging to keep a consistent look for the laser bolt that was not too saturated or luminous but still felt like a powerful effect in all situations without triggering the lens flare.”
Inspired by World War II aerial dogfights, outer space battles are a trademark of Star Wars and something fans would be expecting. “One of the biggest challenges I had with the lighting artists early on was if you look at the planet, the exposure kept changing violently,” Hawes reveals. “We have a different lens for when you’re in space. The exposure is clamped, so it gives you that Star Wars look where you can see the stars and planet, and
TOP: The first planet introduced in the Star Wars universe, Tatooine gets visited, with Jabba’s Palace being a prominent setting.
BOTTOM: The motion capture and dialogue performances of Humberly González influenced the animation for Kay Vess.

when you’re firing your blaster and have explosions going off, the exposure in the scene isn’t fluctuating. Then it was making sure from my side that the lens is set up for space; it’s going to be different in its thresholds. If you fire a missile, turret or blaster, you can’t just flare. We want flare for the sun and anything else that is triggered, but everything else should be quite subtle. We have things like asteroids. In some maps, there are thousands of objects all updating. Normally, it’s a bad idea to have each object updating individually, so we came up with a system for gathering all of the updates and doing them in one pass.”
Open world games are familiar territory for Massive Entertainment. “The biggest difference was that the Tom Clancy’s The Division games we worked on were city-based open worlds,” Podlesnigg remarks. “Now we’ve added landscapes, space, speeders and spaceships. The biggest question for us was, ‘How do we connect all of it together?’ But with Snowdrop, we have one of the best toolsets on the planet, which is powerful to work with and also fun.” One particular line of dialogue proved problematic. “There’s a line where Kay Vess is with ND-5 in the Trailblazer at the end of the mission, and they’re satisfied at having accomplished what they needed to. ND-5 asks, ‘Where are we going next?’ And her line is, ‘Anywhere we want.’ Believe it or not, we shot over 100 versions of that line to get the proper tone! We had to record it for the announcement trailer and kept sending it to Lucasfilm. People would be like, ‘Maybe here the tone can be different.’ A lot of notes from everybody! I hope that some of the lines we created
become as iconic [as ‘I have a bad feeling about this.’].” Working on an established IP was fascinating. “Not everything exists in the movie and you have to extrapolate,” Erades observes. “It was interesting work to have to respect the IP and the vibe of Star Wars, but also make your own version and push it forward.”

TOP: There were many 1960s and 1970s technologies looked at for the Trailblazer, including muscle cars.
BOTTOM: A Nix puppet was one of the props created for the motion capture sessions.
Welcome to the Fourth Edition of the VES Handbook of Visual Effects!

The Fourth Edition of this award-winning handbook remains the definitive guide for visual effects professionals, packed with new chapters and sections covering essential techniques, best practices and the latest innovations. Updated to reflect the quickly evolving industry landscape, this edition expands on areas covered in the Third Edition, such as previs, AR/VR moviemaking, color management (including ACES), the DI process and matte painting, and adds many new chapters by new writers, including Virtual Production, AI, NeRFs and Gaussian Splatting, and much more.
New in This Edition
With contributions from 30 new Visual Effects experts, this edition explores cutting-edge topics, including:
• Script and shot breakdowns
• Use of Real-time Engines for Virtual Production, Previs and Animation
• Virtual Production Overview with detailed best practices
• Updated Motion Capture to reflect the use of Game Engines, and more
• The Use of AI in Visual Effects and Compositing
• Summary of Techniques for NeRFs and 3D Gaussian Splatting
• Updated Color Management
• Compositing of Live Action Elements and Deep Compositing
• Digital Environments
• Rigging for Animation
• Updated Interactive Games
• Immersive Experiences Utilizing AR/VR Technologies, Hemispheres and Domes
A must-have for anyone working in, or aspiring to work in, visual effects, The VES Handbook of Visual Effects, Fourth Edition covers essential techniques and solutions for all visual effects artists, producers and supervisors, from pre-production to digital character creation, compositing of both live-action and CG elements, photorealistic techniques, virtual production, Artificial Intelligence and
so much more. With subjects and techniques clearly and definitively presented in beautiful four-color, this handbook is a vital resource for any serious visual effects artist.
For the student, this book contains the “gold”: Secrets and practices that will guide your career into the field and business of visual effects. Contained within these pages are the knowledge and expertise of real-world working professionals, including many Academy Award, Emmy Award and VES Award winners. It is like having many experts at your fingertips to explain in clear detail the “whats”, “hows” and “whys” of the entire field of visual effects.
About the Book’s Editors
Jeffrey A. Okun, VES, is an award-winning Visual Effects Supervisor, a VES Fellow, and a member of the Academy of Motion Picture Arts and Sciences, the American Society of Cinematographers and the Television Academy. He has served as the chair of the Visual Effects Society as well as in many other leadership positions, including chair of the LA Section.
Susan Zwerman, VES, has been a member of the VES since 1998. She is a highly respected Visual Effects Producer who has been producing visual effects for more than 25 years. Zwerman is also a well-known seminar leader and author. She is a member of the Academy of Motion Picture Arts and Sciences, Producers Guild of America, Directors Guild of America, and a member and Fellow of the VES.
Susan Thurmond O’Neal joined the VES in 1997 and has been deeply involved ever since. She has served on the global Board of Directors in multiple roles, including Treasurer in 2016, 2nd Vice Chair in 2022 and 2023, and 1st Vice Chair in 2024 and 2025. Over the years, she has chaired the legacy global Education Committee, currently leads the Membership Committee, and in 2019 was honored with the VES Founders Award for meritorious service to the Society.

VES Australia Revisited: VFX Community Thriving Down Under
By NAOMI GOLDMAN



Members of the VES Australia Section gather for a festive pub night and recruitment event.
VES Australia members co-host a forum with Women in Film and TV (WIFT Vic) highlighting women in VFX.
The Visual Effects Society’s international presence gets stronger every year, and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together. Founded in 2008, the VES Australia Section is growing with more than 155 members who represent a broad spectrum of disciplines, including visual effects, virtual production and games, as well as education, research and recruitment. Half the membership is in New South Wales, with a smaller contingent in Adelaide and Melbourne, and a few members in up-and-coming Brisbane and the Northern Territory.
“When it comes to the state of the industry, we are starting to see new growth in the VFX space,” said Aditya Talwar, Chair of the VES Australia Section and CG Supervisor at Netflix Animation Studios.
“DNEG and ILM opened in the last few years, and Wētā has expanded its presence with a new hub in Melbourne. And, with more happening in VFX and animation and the support of government incentives, we are seeing an influx of artists into the region.”
“From my perspective, I see many people who have been dealing with job loss or looking for remote work, and a lot of activity among people investing in learning new skills to get into virtual production,” said Lukas Sarralde, Vice Chair of VES Australia and Founder, Virtual Production Supervisor and Cinematographer for Mandrake Studios. “I’m working to help more VES members go down the path of gaining new knowledge, from Unreal Engine to the skills to become LED wall operators, but this is a learning process that takes time. I’m also focused on helping universities in Sydney update their curriculum to integrate more tech, equipping them to better empower their students and jumpstart their career pathways.”
VES Australia leadership cites its vast geographic scope – one Section encompassing a country that covers an entire continent – as a unique attribute that can present challenges. “Our Section is both big and small,” Talwar said. “While operating across a vast territory, our events are smaller and feel intimate, allowing us to grow close bonds. In this collaborative environment, we have close communications, and everyone has a voice in bringing forth new ideas. That’s exciting because our membership pulls from a cross-section of educators, researchers, hands-on artists and innovators – and those who practice our craft ‘old school’ and ‘new school’. Our Board of Managers represents diversity across professional expertise, gender and geography, which all contribute to the quality of our programming.”
VES Australia’s member recruitment efforts include pub nights, screenings at Netflix Animation Studios and networking events, where members are encouraged to invite prospects via strong word of mouth by Netflix and other VFX companies in the region. The Section Board of Managers had a goal of rebooting its membership numbers after a dip due to the pandemic, and they have largely achieved that aim. “Since 2023, we have pivoted from traditional mixers to educational programs, including our virtual production series,” Sarralde said. “The vision is to tailor our VES Australia member experience into a rich learning experience and a venue to learn valuable marketable
TOP TO BOTTOM: VES Australia hosts “Virtual Production Workshop for Filmmakers on ICVFX & Unreal Engine Cinematics” with Mandrake Studios, TDC, ARRI and the ASC.
skills. From LED processors to camera-tracking systems, we have the venue to share knowledge, workflows and technology, and partner with allied organizations like The American Society of Cinematographers (ASC) to educate and build up our community.”
Among its notable events, the VES Australia Section has hosted a series of virtual production workshops, including: a panel with experts from Unity Technologies, MOD, Nant Studios, Wētā FX and Disguise; a workshop on virtual production at a local Sydney studio, hosted in collaboration with Mandrake Studios, TDC, ARRI and the ASC; and a VES x Dreamscape Virtual Production Showcase focused on in-camera VFX, LED walls and real-time production, with participation from Dreamscreen, Nant Studios and Canon. The Section has also hosted a World VFX Day celebration and holiday mixer at the Atomic Brewery in Sydney, as well as enthusiastic VES Awards nominations events in Sydney at Animal Logic and in Melbourne at Framestore. Members were featured in the CrewCon opening night “State of the Industry” panel and hosted a special Q&A on “The Making of Transformers One” with the Sydney ACM SIGGRAPH Chapter and key members of the ILM Sydney creative team.
Demonstrating its support for diversity and inclusion across the industry, the Section partnered with Women in Film and TV (WIFT Vic) for their Christmas-in-July event and co-sponsored a forum with WIFT Vic, “Invisible Art, Visible Costs: What Filmmakers Need to Know About VFX and the Women Behind It,” which brought together a dynamic group of women VFX supervisors, producers and creative leads.
Moving forward, Section leadership aims to regularize master talks and Q&A panels in collaboration with studios, and to strengthen its presence in Adelaide and Brisbane with more screenings. It is also focused on creating educational programs and working more closely with the education system, exploring opportunities to bring VFX mentorship to large universities so students can learn from experienced members. “When it comes to the VES Awards, it’s essential that we spotlight the [VES-Autodesk] Student Award as an opportunity for students to showcase their work and be part of a first-class show,” Talwar said. “This is part of our efforts to invite students into our world to foster the next generation of VFX professionals.”
Sarralde and Talwar reflected on their membership in the Society and what makes the VES Australia Section a source of pride. “I’m proud of our Section and its strong sense of community and support,” Sarralde said. “As a Board member, I’m able to deliver ideas because I strongly identify with people trying to learn and network in this tough industry. I know their pain. Now, we have a mission to spread the word about all that the VES has to offer, to bring more colleagues into this amazing worldwide community.” Talwar observed, “The VES has really been an anchor for me in the VFX industry. The Society has forged an excellent technical community and has been a champion of standardization, such as the VFX Reference Platform, amidst our fragmented industry around the world. It’s great to be part of a vibrant community that brings together all of the different companies and studios under one umbrella as ambassadors for visual effects.”




TOP TO BOTTOM: VES Australia members and guests assemble for a festive mixer at Atomic Brewery in Sydney.
A night out with VES Australia members at a fun film screening.
VES Australia members gather for an insightful panel on virtual production.
Members of the VES Australia Section convene for their VES Awards nominations event.
VES Celebrates SIGGRAPH 2025
By NAOMI GOLDMAN
The VES celebrated our global community at a high-energy VES SIGGRAPH mixer on August 13th at the Rogue Kitchen & Wetbar in Vancouver’s Gastown neighborhood. The party was hosted by the VES Vancouver Section with support from sponsors Eyeline Studios, Netflix Animation Studios and Barnstorm. The hugely popular event, free to VES members, gave SIGGRAPH attendees the opportunity to network and enjoy meeting old and new industry colleagues and friends.
SIGGRAPH is the leading conference and exhibition on computer graphics and interactive techniques. Twenty-three VES members presented on 30 diverse topics ranging from “Navigating the Future of VFX Education” to “PostViz Workflows for VFX and Virtual Production.” The annual conference has ushered in new breakthroughs, bolstered a community of perpetual dreamers and mapped a future where exponential innovation isn’t just possible – it’s inevitable.
VES Section Board members from Vancouver, Toronto, Washington, Germany, Los Angeles, Montreal, Georgia, New York and the Bay Area enjoyed a leadership luncheon where many connected in person for the first time and shared insights on industry trends.


LEFT, TOP TO BOTTOM: VES members celebrate our global VFX community at the VES party.
TOP RIGHT: Board leaders from nine VES Sections connect for lunch harborside.
BOTTOM RIGHT: VES Vancouver Board members hosted the festive VES SIGGRAPH party.
VES Publishes Career Opportunity Guide and Hosts Career Transition Forum
In the midst of a rapidly evolving industry landscape and economic instability – as VES members may be looking to change their role within the visual effects field or explore opportunities in adjacent or new industries – the VES Education and Outreach Committee published the VES Career Opportunity Guide. Edited by Committee Member Laurence Cymet, the resource reflects the input of more than 26 industry experts, including career and recruitment professionals, industry practitioners and those who have made career transitions themselves. The guide was created to: highlight the deep value that VFX professionals bring to the table, and how to leverage it in career transitions; provide practical guidance on improving position and stature within the VFX industry; assist in exploring opportunities in adjacent industries; offer useful tactics for approaching job and career changes; and outline opportunities for VFX professionals in outlying industries.
Download the VES Career Opportunity Guide at https://www.vesglobal.org/jobboard/.
As a complement to the new guide, the Society hosted a live global webcast around the theme Retooling Your Career: Transferable Skills and Industry Shifts. Moderated by Susan O’Neal, VES 1st Vice Chair and Recruiter at BLT Recruiting, the panel of experts included: Andy Cochrane, Immersive Content Creator and Consultant; Debra Coleman, PCC, Founder of Open Frame Coaching; Laurence Cymet, Head of Technology at WeFX, Educator and VES Toronto Chair; and Simon Davies, Founder of DAX (Digital Artists Xchange) and Digital Creative Talent Specialists member. From SaaS to tech, from gaming to education, the forum explored how the hard and soft skills honed in VFX can transfer into adjacent industries – ones still rooted in familiar tools and teamwork.

Highlights of career advice from the expert panel:
On adopting best practices to provide meaningful value to studios in the current market: “Know your company’s capabilities, know who their direct competition is and bring meaningful connections.” —Susan O’Neal
On roles or departments that are most welcoming for career changers: “We’ve definitely moved away from specialists towards generalists, so make sure that you work to cultivate skills in both 2D and 3D (and totally unrelated skills like photography, etc.). Be flexible and reliable. Always deliver, never give up, never stop learning new things.” —Andy Cochrane
On handling burnout: “If you are feeling burnout, you need to take care of your physical and mental health first and foremost. Rest, be healthy, do things that bring you joy, reset. Then it will be easier to find your way forward.” —Debra Coleman
On opportunities for professionals who have been part of the production on-set side of VFX, including data wranglers and photographers: “Learn everything you can and help out while on set. Outside of entertainment, there are many industries that need people who can observe and gather data from a physical location, including architecture and construction, which have entire fields focused on LiDAR, photogrammetry and other types of reality capture.” —Andy Cochrane
Get more career advice and watch the Retooling Your Career: Transferable Skills and Industry Shifts webcast replay at https://www.vesglobal.org/jobboard/.
VES assembled a panel of VFX professionals who provided expertise on achieving employment goals inside and outside the industry.


Irish VFX Leaps from Leprechauns to Lightsabers

Ireland’s emergence as a premier animation and VFX post-production destination represents one of the most remarkable cultural and economic transformations in the country’s recent history. What began as a small, artisanal animation scene has blossomed into a world-class industry that now attracts some of Hollywood’s biggest productions. The foundation was laid by visionary companies like Cartoon Saloon, whose groundbreaking work on films like The Secret of Kells demonstrated that Ireland could produce animation of extraordinary artistic merit and international appeal. These early pioneers didn’t just create beautiful films; they established Ireland’s reputation as a place where creativity, craftsmanship and storytelling excellence converge.
The industry’s evolution accelerated dramatically with strategic government support, particularly through the enhancement of
Section 481 tax incentives that made Ireland increasingly attractive to international productions. This financial framework, combined with Ireland’s natural advantages – English-speaking talent, proximity to major markets, and competitive costs – created a perfect opportunity for growth. The development of world-class studio infrastructure and robust VFX pipelines has allowed Irish companies to scale rapidly and take on increasingly ambitious projects. Today, established firms like EGG VFX, Outer Limits, Element VFX and Piranha Bar regularly contribute to high-profile productions including The Woman King, The Apprentice and others, while newer players like SSVFX have made their mark on prestige television, contributing visual effects to the massively successful series Shōgun as well as The Penguin, 3 Body Problem, Gladiator II and Ahsoka. (See article page 50)
Image from The Secret of Kells courtesy of Cartoon Saloon.

