Welcome to the Summer issue of VFX Voice! Coming out of a belated and unique awards season, this issue of VFX Voice shines a light on the all-virtual 19th Annual VES Awards celebration, including our honoree profile of Sir Peter Jackson, recipient of the VES Lifetime Achievement Award. Congratulations again to all of our outstanding nominees, winners and honorees! Our cover story goes inside the acceleration of streaming media – which continues to pick up speed. We delve into the making of the epic monster battle Godzilla vs. Kong. We sit down with visual effects artists Alec Gillis and Artemis Oikonomopoulou. We explore hot industry trends including invisible effects and mobile filmmaking, and go further into the TV/streaming surge with a behind-the-scenes look at HBO’s The Nevers. We bring you insights from effects pros tackling small-screen challenges and the Baba Yaga interactive experience. And we serve up expert tips and guidance from the award-winning VES Handbook of Visual Effects. VFX Voice is proud to be the definitive authority on all things VFX. And we continue to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

Lisa Cooke, Chair, VES Board of Directors

Eric Roth, VES Executive Director




[ CONTENTS ]

FEATURES
VFX TRENDS: IPHONE 12 – Advancements cement the viability of mobile filmmaking.
VFX TRENDS: INVISIBLE EFFECTS – Inside the expanding scope of ‘bread-and-butter’ effects.
COVER: STREAMING ACCELERATES – The streaming juggernaut continues to gain momentum.
FILM: GODZILLA VS. KONG – Maintaining proper scale between humans and monsters.
ALEC GILLIS: PROFILE – Effects artist behind some of filmdom’s scariest creatures.
TV/STREAMING: THE NEVERS – ‘Electric’ HBO series mixes period drama, sci-fi and horror.
PETER JACKSON: PROFILE – Recipient of the 2021 VES Lifetime Achievement Award.
THE 19TH ANNUAL VES AWARDS – Celebrating the best in visual effects.
VES AWARD WINNERS – Photo gallery.
PROFILE: ARTEMIS OIKONOMOPOULOU – Cinesite Supervisor finds visual effects to be the perfect job.
SFX TRENDS: TV/STREAMING SURGE – Answering streaming’s high demand for practical effects.
TECH & TOOLS: INVISIBLE EFFECTS – In-camera advances impacting the future of visual effects.
VR/AR: BABA YAGA – Interactive experience allows viewers to decide their path.
VFX TRENDS: SMALL-SCREEN CHALLENGES – Effects pros share their toughest scenes on recent TV projects.

ON THE COVER: Tom Hiddleston is Loki in Marvel Studios’ Loki, streaming on Disney+ this summer. (Image courtesy of Marvel Studios)



SUMMER 2021 • VOL. 5, NO. 3



Visit us online at vfxvoice.com

PUBLISHER Jim McCullaugh publisher@vfxvoice.com


EDITOR Ed Ochs editor@vfxvoice.com
CREATIVE Alpanian Design Group alan@alpanian.com
ADVERTISING Arlene Hansen arlene-VFX@outlook.com; Bernice Howes bernicehowes@me.com
SUPERVISOR Nancy Ward
CONTRIBUTING WRITERS Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan
ADVISORY COMMITTEE David Bloom; Andrew Bly; Rob Bredow; Mike Chambers; Lisa Cooke; Neil Corbould, VES; Irena Cronin; Paul Debevec, VES; Debbie Denise; Karen Dufilho; Paul Franklin; David Johnson, VES; Jim Morris, VES; Dennis Muren, ASC, VES; Sam Nicholson, ASC; Lori H. Schwartz; Eric Roth

OFFICERS
Lisa Cooke, Chair
Emma Clifton Perry, 1st Vice Chair
David Tanaka, 2nd Vice Chair
Jeffrey A. Okun, VES, Treasurer
Gavin Graham, Secretary

DIRECTORS Jan Adamczyk; Neishaw Ali; Laurie Blavin; Kathryn Brillhart; Nicolas Casanova; Bob Coleman; Dayne Cowan; Kim Davidson; Camille Eden; Michael Fink, VES; Dennis Hoffman; Thomas Knop; Kim Lavery, VES; Brooke Lyndon-Stanford; Josselin Mahot; Tim McGovern; Karen Murphy; Janet Muswell Hamilton, VES; Maggie Oh; Susan O’Neal; Jim Rygiel; Lisa Sepp-Wilson; Bill Villarreal; Joe Weidenbach; Susan Zwerman, VES

ALTERNATES Colin Campbell; Himanshu Gandhi; Bryan Grill; Arnon Manor; David Valentin; Philipp Wolf

Visual Effects Society
5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Mark Farago, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Debbie McBeth, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Tom Atkin, Founder
Allen Battino, VES Logo Design
Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2021 The Visual Effects Society. Printed in the U.S.A.





Images courtesy of FiLMiC. iPhone 12 Pro Max photos courtesy of Trevor Hogg except where noted. TOP: Adam Rowland holds a chrome ball and color chart for reference during filming of a VFX shot for An American Pickle. (Image courtesy of Adam Rowland)

Ever since the iPhone debuted on June 29, 2007, a constant focus for Apple and consumers has been on improving its optical capabilities, to the point that a more accurate name would be the iCamera. The technological advancements have not slowed down as consumer expectations increase with each passing year. The iPhone 12 Pro Max incorporates 5G, LiDAR, Dolby Vision HDR and Apple ProRAW, which will only continue to cement mobile filmmaking as a viable and visually acceptable form of cinema. “I have the new iPhone 12 Pro Max that has all of those bells and whistles,” explains Michael Shelton, Visual Effects Supervisor at Pixomondo. “I do shoot a lot with my phone because it’s small, fast and easy to take out. Fifty percent of what I’m capturing is on my phone. I can shoot hundreds and hundreds of pictures. I use applications that mimic the film back of the camera that we’re shooting. I use that a lot of times when I’m scouting a location or I’m with the DP and he is framing up shots. I’ll know if we’re going to be doing an extension, this is what we’re going to see, this is what we might see, and this is what we’re not going to see at all. It’s the best way of doing that because I can’t on my DSLR. It’s a tool that I use on every show every day, but I’d never rely on it. I would need to battle-test something like that in a lot of different scenarios before ever being comfortable leaving our $60,000 LiDAR scanner at the shop. I would love it if it became that compact and easy because that would be a game-changer.”
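The film-back preview apps mentioned above come down to simple lens geometry: the horizontal angle of view follows from focal length and sensor width, so a phone can show what a different camera back will frame. A minimal sketch; the sensor widths used here are illustrative assumptions, not measured values:

```python
import math

def horizontal_fov_deg(focal_mm: float, sensor_width_mm: float) -> float:
    """Horizontal angle of view for a given focal length and film back."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def equivalent_focal_mm(target_fov_deg: float, sensor_width_mm: float) -> float:
    """Focal length that reproduces a target angle of view on another sensor."""
    return sensor_width_mm / (2 * math.tan(math.radians(target_fov_deg) / 2))

# A 35mm lens on a Super 35 back (~24.89mm wide) covers roughly 39 degrees.
# To mimic that framing on a phone sensor assumed to be ~9.8mm wide,
# the app would pick the matching shorter focal length.
fov = horizontal_fov_deg(35, 24.89)
match = equivalent_focal_mm(fov, 9.8)
```

This is the whole trick behind "mimicking the film back": same angle of view, different sensor, different focal length.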



“The primary reason they put LiDAR in there was to be able to get a depth approximation of your scene and direct the focus better in the photograph. It’s not intended to be a set acquisition or even a high-end 3D scanning device as of yet. We’ll see how fast it evolves.” —Carsten Kolve, Facility Digital Supervisor, Image Engine “It has Dolby Vision certified HDR 4K 60 fps and not only that,” states Carsten Kolve, Facility Digital Supervisor at Image Engine. “I have a virtual camera system, LiDAR scanner and facial capture – all of these things are in my pocket. A whole lot of quite advanced technology has been made available to a whole lot of people, including us, to utilize. We certainly make use of it. On some projects we used iPads and iPhones to capture a camera motion for previs or full CG shots. It’s easy to acquire this kind of stuff without having a big motion capture stage. If professional workflows work more into the consumer area, it will hopefully have an impact on the type of tools that are available to us at their price point. The primary reason they put LiDAR in there was to be able to get a depth approximation of your scene and direct the focus better in the photograph. It’s not intended to be a set acquisition or even a high-end 3D scanning device as of yet. We’ll see how fast it evolves.” Augmented reality combined with LiDAR on mobile devices provides the opportunity for previsualization that will assist with framing and composition. “If you want to shoot an action where you have a background that’s not there or on location and

TOP: A great benefit of shooting with a smartphone is the portability factor. MIDDLE: An example of the UI for FiLMiC Pro. BOTTOM: Another example of the UI for FiLMiC Pro.




TOP TO BOTTOM: Steven Soderbergh helped legitimize FiLMiC Pro as a viable option for Hollywood productions by shooting Unsane and High Flying Bird on the iPhone. A collaboration between Moondog Labs and FiLMiC made the anamorphic aspect ratio an option for mobile filmmakers. A LOG gamma curve of painted hills is displayed on FiLMiC Pro. A shoot takes place in Iceland with FiLMiC Pro.

want to replace the building with something bigger, up until now you couldn’t see that building,” remarks Joseph Kasparian, Visual Effects Supervisor at Hybride. “The iPad and iPhone are capable of loading an asset, using LiDAR to properly track the background, and utilizing AR to visualize that asset to get better framing. I’ve seen tons of applications meant for interior design where you can do this. For sure, I would use these apps to visualize and compose something that is not on set. 5G is supposed to go up to 10 gigs a second for a data transfer. You would be able to go through and stream through edits in real-time. It’s mainly a question of how much companies are going to charge us for data transfers! 5G is going to make the whole concept of working in the cloud a bigger reality.” “The idea is that LiDAR helps to find the shape of things, so the iPhone and iPad can do things like defocus that a tiny camera sensor can’t do,” notes Berj Bannayan, Co-founder of Soho VFX. “You get that nice portrait mode where you see how the focus rolls off of objects. AR is becoming an interesting area for consumer electronics. I can scan my living room and upload a piece of IKEA furniture to see how it looks. One of the things that struck me when I saw the iPhone 12 is that I could be constantly doing LiDAR scans of the set while we’re shooting. They’re low resolution, but it certainly can be interesting from an on-set reference standpoint for saying that the camera was approximately 12 feet away from the actor and there was a light approximately 13 feet above his head.” The popularity of content creators on YouTube, Snapchat and TikTok who have mass followings is seen as the democratization of filmmaking through mobile devices. “There is a massive paradigm shift that has happened over the last few years in regards to what passes as acceptable entertainment,” believes Lou Pecora, Visual Effects Supervisor at Zoic Studios.
“Most kids watch way more YouTube than television or movies in the theater, even before COVID-19. People put a camera on themselves, talk and do quick cuts – sometimes less than a syllable long – of them standing in their kitchen or doing something stupid, like the ice bucket challenge. These get millions of hits. Talk about the democratization of entertainment, but is that cinema? [Director/screenwriter/visual effects artist] Gareth Edwards made amateur films that looked amazing using consumer technology in a way that most people don’t have the dedication, discipline and drive to get that result. That’s why there aren’t a million Gareth Edwards.” Mobile software applications have arisen that have enabled Hollywood directors such as Steven Soderbergh to make Unsane and High Flying Bird on the iPhone. “From the outset, FiLMiC Pro was intended to mimic all of the manual controls that you would find on a $10,000 to $50,000 camera, whether it is an ARRI or RED,” explains Neill Barham, Founder and CEO of FiLMiC. “We wanted cinematographers and directors to feel at home on the mobile platform and to also offer an opportunity to people who couldn’t afford those cameras to create their own material, and have it as a learning path to follow their ambitions if they wanted to grow into the Hollywood ecosystem. This coincided with the release of the iPhone 4, which had a 720p HD capability. We knew early on that the technology was only going to get better and that supposition proved to be true.” Aspect ratios from Super 35 to 1.33:1 to DCI are readily available for



mobile filmmakers. “It is a great opportunity for somebody to learn whether the story is best told in 2.40:1 or 1.85:1,” states Barham. “Moondog Labs, which created a 2.40 anamorphic lens that would stretch the image over the sensor, came to us in Beta to create the software that de-squeezed that image so you could look at your full-resolution 2.40:1 aspect ratio in the library and it was done. Now anamorphic lenses far outsell wide and telephoto lenses in the mobile space. It is a great affordable option for $150 when years prior it would have been a multi-thousand-dollar Angénieux lens. Now you have the YouTube and TikTok generation saying, ‘I can do whatever I want.’ People from the ages of 13 to 15 have a skillset that so far exceeds what mine was when I first entered into film school. That is incredible, and it’s starting to reinvent the format and create new and compelling mediums that didn’t exist before.” It is important to keep in mind the capability of the CPU and GPU to ensure that the system does not crash. “When an important process is in effect, we divert resources through other things,” explains Christopher Cohen, CTO of FiLMiC. “For example, if we’re refreshing our histograms at 30 fps and there is something that needs more compute in the system, then we may bring that down to 15 fps and interpolate between histogram models. You won’t see a difference as a user, but we’re able to move that compute to where it’s most needed.” Technology is always changing but some things stay the same. Adds Cohen, “Audio

“5G is supposed to go up to 10 gigs a second for a data transfer. You would be able to go through and stream through edits in real-time. It’s mainly a question of how much companies are going to charge us for data transfers! 5G is going to make the whole concept of working in the cloud a bigger reality.” —Joseph Kasparian, Visual Effects Supervisor, Hybride TOP ROW, LEFT TO RIGHT: An example of using the LiDAR sensor with the Canvas app. MIDDLE LEFT AND RIGHT: An example of footage capture while using the Dolby Vision HDR option. BOTTOM: This photograph was taken in complete darkness using portrait mode. The camera automatically adjusted for a three-second exposure and the LiDAR scanner was able to pick up on the low-light details.
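The 10-gigabit figure in Kasparian's pull quote is easy to put in perspective with back-of-envelope math. A quick sketch; the file sizes are hypothetical, and real links never hit their ideal rate:

```python
def transfer_seconds(size_gb: float, link_gbps: float) -> float:
    """Ideal transfer time for `size_gb` gigabytes over a link of
    `link_gbps` gigabits per second (1 byte = 8 bits, no overhead)."""
    return size_gb * 8 / link_gbps

# A hypothetical 90 GB day of camera originals: 72 seconds at an ideal
# 10 Gb/s, versus two hours (7,200 s) at 100 Mb/s.
day_5g = transfer_seconds(90, 10)
day_lte = transfer_seconds(90, 0.1)
```

The gap between those two numbers is exactly why 5G makes "working in the cloud a bigger reality."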




“We’re starting to see multiple axis stabilization in phones now as well as quantum dot deposition technology. If you were to do that on a full-frame sensor it would be astonishingly expensive, but on smaller sensors it becomes an economic reality.” —Christopher Cohen, CTO, FiLMiC

TOP: A screenshot of the iPhone 12 Pro Max video mode UI. BOTTOM LEFT AND RIGHT: Scaniverse is another app that is able to take advantage of the LiDAR scanner on the iPhone Pro Max.

solutions have become more important with filmmaking on mobile. Tripods are probably more important now than they’ve ever been as people are trying to up their game. We’re starting to see multiple axis stabilization in phones now as well as quantum dot deposition technology. If you were to do that on a full-frame sensor it would be astonishingly expensive, but on smaller sensors it becomes an economic reality.” One thing that cannot be coded is the portability of smartphones. “You can walk into a protest in a crowded street, pull out your phone and you’re one of hundreds of other people doing the exact same thing,” notes Cohen. “You are the invisible filmmaker, and that is incredible because you can get authentic reactions from people without coaxing. You can record real life in a way that you can’t with a RED or ARRI device.” Then there is the matter of quick camera setups. “Claude Lelouch told a great story about shooting The Best Years of a Life,” recalls Barham. “He initially intended to shoot a few shots in crowded spaces on the iPhone and do the rest on 35mm film. Early in production, instead of just sitting around not doing anything while waiting for the camera to be set up for the next shot, Claude grabbed the iPhone and asked his actors to take a couple more shots. After doing that for two or three days, he decided to put the big camera away and shot the rest on the iPhone. He is shooting his next film on the iPhone as well. It is a real revolution, not just an affordable convenience which might have been the impetus initially.” As smartphones have gotten to be more sophisticated, certain complications have arisen. “Taking advantage of new APIs used to be clean-cut in that Apple would use the third-party development ecosystem as a way of vetting ideas before incorporating them into their own camera,” states Cohen. “Where things have gotten tricky is that they’re a huge company with a billion users and want to keep those users as happy as possible.
This means that their tech is often geared towards a broadly palatable consumer application. That runs counter to our intent to satisfy the highest end of the professional market. A lot of the focus now is on how machine learning and AI can take away the creative choice from a user and basically automate a satisfactory result. We’re specifically interested in the 100 million who have a specific idea and look. We have occasionally butted heads against that broad consumer solution.” When it comes to funding R&D, very few companies can match Apple and Samsung. “Apple held out on OLED for a long time while the rest of the industry adopted it, and the reason was it couldn’t fully support the P3 color space,” remarks Cohen. “They waited until the color accuracy was there. The one thing I would caution mobile filmmakers about is that many OEMs will do automatic white balance adjustments to the display itself. I would just go ahead and turn that off.” Being able to shoot in broad daylight is an area that needs improvement. “Neutral density filters are one of the essentials that make the difference as to whether you have great usable footage that doesn’t have blown-out highlights in the sky or dramatically underexposed landscape in the majority of your daylight exterior shots,” states Barham. “Is that something which will come to the native hardware in the next two or three years? I would not bet against it.”
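The neutral density problem Barham raises is pure exposure arithmetic: ND strength is usually quoted as optical density, each stop halves the light, and compensating in-camera means lengthening the shutter. A minimal sketch of that bookkeeping:

```python
import math

def nd_to_stops(optical_density: float) -> float:
    """Stops of light reduction for an ND filter of the given optical
    density: transmittance is 10**-density, and one stop halves the light."""
    return optical_density * math.log2(10)

def stops_to_shutter(base_shutter_s: float, stops: float) -> float:
    """Shutter time that would compensate for losing `stops` of light."""
    return base_shutter_s * (2 ** stops)

# A common ND 0.9 filter cuts very nearly 3 stops, which is why it lets
# a phone hold a cinematic shutter in bright daylight instead of
# clipping the sky.
```

In practice the point of ND on a fixed-aperture phone camera is the opposite direction: keep the shutter where you want it and soak up the excess light in the filter.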





TOP AND OPPOSITE TOP: The forest is replaced by a mountain range and the fortress walls digitally extended by Image Engine in Mulan. (Images courtesy of Image Engine) OPPOSITE BOTTOM: A series of shots needed to be stitched together by MPC as if done in one continuous take for 1917. (Images courtesy of MPC)

In the past, invisible effects were associated with rig removals, set extensions and environmental clean-up. However, advances in photorealism and drone photography have expanded the scope to include digital doubles and what used to be considered physically impossible shots. The ultimate goal is for audience members to get lost in the storytelling and the world-building rather than have their attention drawn to the work of digital artists. Invisible effects are not going to disappear, as CGI has become an accepted part of cinematic language even for independent productions and television sitcoms. Veterans of film and television projects share various insights into the past, present and future of what remains the bread-and-butter work of the visual effects industry.

Lou Pecora, Visual Effects Supervisor, Zoic Studios
“One of the things that I’ve come to rely on are my HDR acquisition tools. Except for specialized cases, I don’t hold up production anymore to go out with shiny and grey balls and a color chart. My camera assistants have the color charts and pop them into the camera at the beginning. Occasionally, I can line up color based on skin tone and other things. I use the Ricoh Theta to do all of my HDRs. I can set my brackets, walk out, everybody clears, and I’m there for a minute at most capturing the full scene and all of the lights in it. If there are light patterns I have them turn the lights on and off. You’re not going to LiDAR every set because with a TV schedule and budget you don’t have that luxury.
“On Legion, I always knew that we were going to have surprises in post, so my incredible on-set data wrangler, Rebecca McKee, would walk around and photogram everything. She had a high-powered laptop and used RealityCapture to build models of every set and location. We definitely used them. The photogram



technology has evolved to the point that it is solid. Rebecca blew people’s minds on set by showing them the models of the rooms that we were standing in. It’s a newer tool that has helped with visible and invisible effects. If we need to get rid of the picture on the wall it can be replaced because we have a model of it.”

“The photogram technology has evolved to the point that it is solid. [Data wrangler] Rebecca [McKee] blew people’s minds on set by showing them the models of the rooms that we were standing in. It’s a newer tool that has helped with visible and invisible effects. If we need to get rid of the picture on the wall it can be replaced because we have a model of it.” —Lou Pecora, Visual Effects Supervisor, Zoic Studios

Arundi Asregadoo, Visual Effects Supervisor, MPC Film
“When looking at the breakdowns for 1917 and The Revenant, you realize how much work has gone into the background in creating those stitches. The process is unassuming sometimes. The most important thing in the opening sequence of The Revenant, when

they’re running through the forest, was to make sure that we weren’t using the same language and devices that would tell we were cutting from one to the next by a wipe of a tree or a horse running to the front. It was simple things like the arrows, which you wouldn’t think about being an actor within that moment or beat. Digital doubles are invisible work like Rachel [in Blade Runner 2049], stage actors who don’t exist anymore or in this time period. When we do CG animals – because they can’t be filmed the way we want them, such as a mouse, and you’re respecting what they do for movements – that’s invisible as well. It’s all in the eye of the beholder.”




“Digital doubles are invisible work like Rachel [in Blade Runner 2049], stage actors who don’t exist anymore or in this time period. When we do CG animals – because they can’t be filmed the way we want them, such as a mouse, and you’re respecting what they do for movements – that’s invisible as well. It’s all in the eye of the beholder.” —Arundi Asregadoo, Visual Effects Supervisor, MPC Film

Michael Shelton, Visual Effects Supervisor, Pixomondo
“What drives the tools that I pick for a project is the amount of time that we have to complete the work. There are some things that are being done right now that are interesting but not quite mature enough. There is content-aware removal of objects. Adobe has been doing some machine-learning-based things that you can find in Photoshop and even in After Effects right now. Content-Aware Fill can analyze footage, try to remove a moving object and rebuild what was behind it. A tool like that is exciting because it opens up a lot of possibilities for automation in that area, which is helpful to our process. As with anything cutting edge, it works well, but under controlled circumstances. There are people who are pushing that technology to get it to a place where it is something that we can consider as part of our toolset.”
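Content-aware removal of the kind described above is, at its core, an inpainting problem: rebuild masked pixels from their surroundings. Production tools rely on sophisticated patch-matching and machine learning, but a toy diffusion fill shows the basic idea. A sketch in pure Python on a single-channel image stored as nested lists:

```python
def diffusion_fill(img, mask, iterations=50):
    """Naively inpaint masked pixels by repeatedly replacing each one
    with the mean of its in-bounds 4-neighbors, so surrounding values
    bleed inward until the hole is plausibly filled."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for _ in range(iterations):
        nxt = [row[:] for row in out]
        for y in range(h):
            for x in range(w):
                if mask[y][x]:
                    nbrs = [out[ny][nx]
                            for ny, nx in ((y - 1, x), (y + 1, x),
                                           (y, x - 1), (y, x + 1))
                            if 0 <= ny < h and 0 <= nx < w]
                    nxt[y][x] = sum(nbrs) / len(nbrs)
        out = nxt
    return out

# Removing a bright "wire" pixel from a flat grey plate: the masked
# pixel converges to the surrounding value.
plate = [[0.5, 0.5, 0.5], [0.5, 9.0, 0.5], [0.5, 0.5, 0.5]]
mask = [[False, False, False], [False, True, False], [False, False, False]]
filled = diffusion_fill(plate, mask)
```

Real content-aware tools go much further, copying coherent texture patches and tracking them over time in moving footage, which is exactly why the quote above flags them as promising but not yet mature.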

TOP TWO: World War I gets recreated by Pixomondo for the HBO series Perry Mason. (Images courtesy of Pixomondo) BOTTOM TWO: It was important for MR. X to recreate Mexico City from the memories of filmmaker Alfonso Cuarón, rather than be entirely period-accurate, in Roma. (Images courtesy of MR. X)

Carsten Kolve, Facility Digital Supervisor, Image Engine
“The hardest part of creating invisible effects is adding all the complex imperfections that exist in the real world. If we are unable to acquire this detail as reference right on set or via dedicated shooting/scanning sessions, we rely in the first instance on the skill of our highly specialized artists to make up for that and fill in the missing data with their artistic ability. Primarily due to their experience and ability to adapt related reference imagery, we are able to fill in the blanks where reference data could not be gathered, or we need to create something completely new that still feels true to the real world.
“While a lot of technological advances have been made in all areas of acquiring reference data and, subsequently, in shot pipelines, we should never forget that our work is primarily guided by artists working closely with technicians in every discipline, who make creative decisions on how to achieve the vision for an effect given all the source data and tools we have at our disposal. VFX is a craft that can’t be reduced to a simple ‘one-button’ tech commodity; instead it’s artists and technologists working together, creating imagery in support of a story that hopefully touches the audience on an emotional level. Working at this intersection of disciplines is what makes working in this field so exciting, even if the ultimate validation for your work is to not be noticed.”



Rob Price, Visual Effects Supervisor, Zoic Studios
“It’s been a lot of improving on what’s already existed, such as the fidelity and detail that you can get out of LiDAR and photogrammetry. Photogrammetry takes a lot of processing power, so refinement of the computing behind it, like the cloud, and just having more resources to crunch the numbers continually gets a lot better for us.
“In the mid-1990s, when doing a set extension, people would say that the only way possible was with a locked-off camera and a single matte painting. But filmmakers want their big moving cameras, so you need more 3D environments that look photoreal. The set extensions need to be at a higher level of quality and believability, so we have to up our game on how we create them.
“It’s a necessity more and more that you need a camera in an environment to help every other department. Even the 2D is made infinitely easier if we have a digital camera that matches the practical one. A lot of what we do with the LiDAR scanning isn’t always to recreate 3D objects. On The Haunting of Bly Manor, all of the interiors were scanned for a dual purpose. We needed to see through the windows from the outside. Also, if we have the geometry for the exact interior, it becomes a lot quicker for that camera track to solve because you have the entire room.”

Jeff Campbell, Visual Effects Supervisor/Partner, SPINVFX
“The finishing resolution has gone up and I hope we have plateaued at 4K. I can’t see any reason to go any higher unless TVs become the size of your living room wall, but they likely will.
“Now we have physically-based rendering (PBR) pipelines that render images based on the flow of light in the real world. Much of what makes a physically-based pipeline different is a more detailed approach to the behavior of light and surfaces. Working physically-based means that all you have to do is put lights where they’re supposed to be, use the correct materials and you’ll get the real result.
“You have to understand the strengths and weaknesses of what a computer can do. Computers are great at rendering solid objects. Animal fur, hair, cloth and skin have come a long way and can look great. Humans are difficult unless they are further away from camera, like in crowds. Simulated water and pyro look great now too.
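The "put lights where they're supposed to be" behavior Campbell describes rests on physically-based shading models. The simplest is the ideal diffuse (Lambertian) surface, whose BRDF is the constant albedo/pi so that a surface never reflects more energy than it receives. A minimal sketch of that one term:

```python
import math

def lambert_radiance(albedo: float, irradiance: float, cos_theta: float) -> float:
    """Outgoing radiance from an ideal diffuse surface: the BRDF is the
    constant albedo/pi, and incident light is scaled by the cosine of
    the angle between the surface normal and the light direction
    (clamped, since light from behind contributes nothing)."""
    return (albedo / math.pi) * irradiance * max(cos_theta, 0.0)

# Light arriving edge-on (cos_theta = 0) contributes nothing; a head-on
# light on a perfectly white surface reflects irradiance/pi.
```

Production PBR stacks specular lobes, Fresnel and energy compensation on top of this, but the principle is the same: get the physics terms right once, and correct materials plus correctly placed lights yield the real result.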

“It’s been a lot of improving on what’s already existed, such as the fidelity and detail that you can get out of LiDAR and photogrammetry. Photogrammetry takes a lot of processing power, so refinement of the computers computing like the cloud and just having more resources to crunch the numbers continually gets a lot better for us.” —Rob Price, Visual Effects Supervisor, Zoic Studios

TOP TWO: The hair of Jean Grey (Sophie Turner) was art directed by Soho VFX for the outer space scene in Dark Phoenix. (Images courtesy of Soho VFX) BOTTOM TWO: Shattered glass is created digitally in Extraction by SPINVFX. (Images courtesy of SPINVFX)




“This [shot in Roma] wasn’t meant to be an exact replication, but how Alfonso Cuarón remembered it. While the archival photos were helpful, we still had to aesthetically tweak after that to get the exact look he was going for.” —Aaron Weintraub, Co-founder, MR. X

“The methods have generally stayed the same – highly detailed assets, photorealistic lighting and real-world references – but the tools have evolved to give us much better and faster results.”

Joseph Kasparian, Visual Effects Supervisor, Hybride
“Some of the great tools used in past projects are connected to the RTX. In Solo, Han Solo is walking with Lando Calrissian, and we’re seeing the Millennium Falcon for the first time. That whole shot was done in real-time on our side. What happened is we built and rendered that shot in Houdini with the basic lights and the environment with global illumination. After that we took all of the elements of that shot and exported them into Flame. We had a machine with two RTX cards in it. We didn’t have real-time ray tracing on it, but we could do look development in real-time. We could put lights and smoke in real-time. We could play with the atmosphere in real-time. In the end, we did that shot with a real-time renderer and completely redesigned the look development using a real-time engine. It was so fast to iterate. You could create 10 different moods in a day because everything was real-time. With the RTX that is now part of the Unreal Engine you can put ray tracing in. Everything in the real-time world is going to change because of that graphics card.”

Berj Bannayan, Co-founder, Soho VFX
“The digital realm of visual effects is basically the same as it was 25 years ago. We can do more with the resources that we have because the computers are faster and the algorithms are better. When we were working on The Incredible Hulk in New York, one LiDAR scan would cover a 25- to 45-degree cone, and that one little section would take 15 minutes.
Then you had to move the camera and do it again. Now we have a LiDAR scanner that can do 360 degrees in two minutes. We can be a lot more confident now in our ability to do something. A lot of our industry is evolutionary rather than revolutionary. It’s a steady march towards more realistic visual effects. Even with the same tools we can do more because we have learned more about what we do. The toolmakers have also learned more. I lead our software team on lighting and rendering. Having photorealistic and physically-based materials, cameras and lights makes the visual effects better and integrates them into the photographic elements so that they really become invisible.”

TOP TWO: A ring of fire is augmented to make it appear more dangerous in Episode 404 of Fargo. (Images courtesy of Zoic Studios) BOTTOM TWO: Hybride makes use of practical camera angles to make their CG work on The Mandalorian believable and grounded. (Images courtesy of Hybride)

Pete Jopling, Executive Visual Effects Supervisor, MPC Episodic “If you get your viewer to accept the world you build then that reality is your new reality. But then you could look at it in a binary way with a show like Chernobyl. It was invisible effects, and that typically




involved more 2D disciplines such as environment work and disguising where the joins are. You don’t want people to question whether visual effects have been used here. However, something like a giant explosion of a nuclear reactor obviously had to be recreated. How do you make it something that you believe is happening right here, right now? That’s a journey you have to take your viewer on. I remember the director telling us that the smokestacks were another character in the story. It played an important part for him in reminding the viewer that this terrible thing was happening no matter how peripheral it might be. The smokestacks are there fairly constantly throughout a lot of the episodes; it creates a tension throughout the whole series. The explosion itself isn’t quite devastating to you, but everything else that follows is.”

“Nowadays, you can pretty much do anything, but a big part of me still enjoys doing something that nobody expects was treated, thinking that it existed in camera on the day it was shot.” —Steve Ramone, Visual Effects Supervisor, SPINVFX

“I remember the director telling us that the smokestacks [in Chernobyl] were another character in the story. It played an important part for him in reminding the viewer that this terrible thing was happening no matter how peripheral it might be. The smokestacks are there fairly constantly throughout a lot of the episodes; it creates a tension throughout the whole series. The explosion itself isn’t quite devastating to you, but everything else that follows is.” —Pete Jopling, Executive Visual Effects Supervisor, MPC Episodic

TOP TWO: Extensive environmental extensions were done by Hybride for Episode 208 of Tom Clancy’s Jack Ryan. (Images courtesy of Hybride)

Aaron Weintraub, Co-founder, MR. X
“In Roma, we had to reconstruct Mexico City in the 1970s. There is a 1,000-frame-long dolly shot of the main character walking across a main thoroughfare, and there’s a movie theater and neon signs. It’s all digital. Most of the foreground elements were built practically, and then a lot of that was replaced, extended, built up and built off into the background. There was a small bluescreen in the background that helped a little bit. There was a lot of roto and neon reflecting in the puddles on the ground that had to be replaced. They had a practical streetcar, but it wasn’t exactly the period one. It was painted blue for blocking and was then replaced with the period-built, full-CG streetcar. This wasn’t meant to be an exact replication, but how Alfonso Cuarón remembered it. While the archival photos were helpful, we still had to aesthetically tweak after that to get the exact look he was going for.”

Steve Ramone, Visual Effects Supervisor, SPINVFX
“Good compositing tools, plate stabilization, color correction, multiple choice of keyers and good grain tools [are important in achieving invisible effects]. A tool that allows the user to create gizmos or subtools is where invisible effects can shine, like Nuke. The biggest improvements were years ago when everything went from Flames and Infernos down to desktop computers and made access easier for everyone. People want to work on big shots, so the need to evolve them isn’t as important as, say, a better renderer or other big tools. I come from a time when trying to get one past the audience was the reward because the tech wasn’t as powerful as it is today, and VFX was highly critiqued for how poor the work looked, mostly because the creative overstepped the limitations of the tech. Nowadays, you can pretty much do anything, but a big part of me still enjoys doing something that nobody expects was treated, thinking that it existed in camera on the day it was shot.”





TOP: The Falcon and the Winter Soldier (Image courtesy of Marvel Studios) OPPOSITE TOP: WandaVision (Image courtesy of Marvel Studios)

At the start of 2020, Netflix, Amazon Prime Video and other streamers were already reshaping the movie and television-watching experience. Then came the pandemic. As home entertainment demand soared, movie theaters shut down and film and TV suffered a slump in production. Studios postponed premieres and tinkered with movie release strategies. Streaming services benefited from the increased demand and grew across the board. Due to all the disruptions that followed – either because of or accelerated by COVID-19 – the new decade seems set to become “the Streaming ’20s.” HBO Max is one of several streaming success stories. Launched on May 27, 2020 during the pandemic, it has been growing rapidly, with 17.2 million activations in the fourth quarter. HBO and HBO Max had 41.5 million U.S. subscribers as of the end of 2020, according to the company (HBO consumers and new subscribers are being directed now just to HBO Max). “So far, HBO Max has exceeded expectations and we have seen some exciting momentum over the last several months,” says Andy Forssell, Executive Vice President and General Manager at HBO Max. “[Last] fall, hits like The Undoing and The Flight Attendant drove record usage on the platform. We also completed our distribution footprint, striking deals with Amazon and Roku, and Wonder Woman 1984 arrived on HBO Max on Christmas Day to record viewing. We’ve grown more subscribers this year than HBO did over all of the last 10 years, so we’re very excited about the enthusiasm we’re seeing and hearing from fans.”



“We not only launched a streaming platform during the pandemic, but our programming teams under Casey Bloys also created standout content born out of the stay-at-home period, including Selena + Chef and Coastal Elites. Both were developed, produced and shot remotely.” —Andy Forssell, Executive Vice President and General Manager, HBO Max

Netflix continued its remarkable rise in 2020 and dominated Nielsen’s top 10 lists of most-streamed titles with a mix of old and new shows. The streamer continues to pump out original programming. It will launch 70-plus original movies this year, 10 of them in languages other than English. Netflix added 37 million paid subscribers in 2020, to stand at 203.7 million worldwide at the end of the year, according to the company. “If you take the U.S. being our most penetrated market, we’re still under 10% of television viewing time... We’ve got a lot of subscribers here in the U.S., but we still have a lot more viewing time that we would like to earn,” said Reed Hastings, Netflix Co-founder, Chairman, President and Co-CEO, in January’s “Netflix Fourth Quarter Earnings Report.” The Walt Disney Company reported that Disney+ had nearly 95 million subscribers worldwide as of its first quarter of 2021, according to research firm Statista, which reports a growth in the service’s subscriber base of almost 70 million since the start of the fiscal year of 2020. The service launched November 12, 2019. Meanwhile, Hulu (controlled by Disney) reached 39.4 million subscribers by year’s end.

Disney+’s The Mandalorian grabbed the No. 1 spot in Nielsen’s top 10 streaming list for the week of Dec. 14–20, which was the first time that a non-Netflix show had accomplished that feat (the next nine places were taken by Netflix). In the following week, Pixar’s animated film Soul, which skipped theaters and went directly to Disney+, scored the No. 1 spot on Nielsen’s list for December 21-27. This didn’t include Wonder Woman 1984, however, as HBO Max wasn’t yet part of Nielsen’s rankings. Nielsen’s list, launched in August of 2020, initially included just content from Netflix, Disney+, Amazon Prime Video and Hulu. It measures minutes watched over the course of a week. For its year-end 2020 tally and subsequent weeks in 2021, Nielsen broke out the top 10 streaming titles in three categories: original series, acquired series and movies (both acquired and original). Disney immediately started gaining ground in the movie category. Amazon announced in January that it now has more than 150 million paid Prime members worldwide (Prime members get free shipping on Amazon products as well as access to Prime Video). Amazon’s Sony-produced series The Boys landed at No. 3 in Nielsen’s list of top streaming titles for the week ending Sept. 6.




TOP: The Mandalorian (Image courtesy of Disney) BOTTOM: Luca (Image courtesy of Pixar/Disney)

That week, The Boys and Disney’s Mulan (No. 10) were the first titles to break Netflix’s monopoly of the list. Amazon is boosting its spending, and will lay out $1 billion for five seasons of an upcoming Lord of the Rings series. NBCUniversal’s Peacock streaming service launched nationally July 15 and has signed up 33 million subscribers, it was reported during the company’s earnings report for the fourth quarter of 2020. Apple TV+ is another growing streaming force. The number of Apple TV+ users will rise from 33.6 million at the end of 2019 to a projected 40 million for 2020, forecasts Statista. And Paramount+, set to subsume CBS All Access on March 4, offers live news and sports content, plus an expanded library of movies and shows, including on-demand programming from MTV, BET, Comedy Central, CBS and other ViacomCBS channels, and movies from Paramount Pictures. Other players and platforms in the streaming mix are YouTube TV, the Roku Channel and set-top devices like Roku, Amazon Fire TV, Apple TV and Google Chromecast, along with smart TVs. This year, some streamers are charging “premium” fees to access certain new films early, while others are offering tiered subscription pricing. Disney+ charged a $29.99 one-time “Premier Access” fee for Mulan last September and Raya and the Last Dragon in March of this year, which offered exclusive access to the titles before



they became free to regular subscribers. Peacock will have three tiers for subscribers: a free, ad-supported subscription tier, as well as $4.99 and $9.99 subscription options that offer more content. HBO will offer an ad-supported HBO Max at a cheaper price point, probably by the second quarter, according to AT&T CEO John Stankey during the company’s Q4 earnings report. Gregory K. Peters, Netflix COO and Chief Product Officer, has a different view about tiers. “We really believe that from a consumer orientation the simplicity of our ad-free, no additional payments, one subscription [is] really powerful and really satisfying to the consumers around the world. And so, we want to keep emphasizing that.” In December, Warner Bros. shocked the movie industry when it announced it would release all 2021 movies simultaneously in theaters and on HBO Max (Wonder Woman 1984 also launched that way on December 25 in the U.S.). Forssell explains, “This decision was a unique, consumer-focused hybrid distribution model created as a response to the impact of the ongoing pandemic. Films will continue to be released theatrically worldwide, while adding an exclusive one-month access period on HBO Max in the U.S. This model allows the consumer to have choice and the power to decide where and how they want to watch.” Disney’s Onward was an example of a movie that had its theatrical window shortened early last year because of theater closures – in this case to two weeks. It became available for digital download on March 20, 2020 (and started streaming April 3). On Christmas Day, Soul skipped theaters entirely in the U.S. and debuted on Disney+. Meanwhile, this year, Disney debuted Raya and the Last Dragon March 5 simultaneously in theaters and on Disney+ (for those paying the Premier Access fee), and Loki begins streaming June 9 on Disney+ with the first of six episodes. Could Netflix one day regularly release movies simultaneously in theaters and streaming?
Ted Sarandos, Netflix Co-CEO and Chief Content Officer, explains that if theatrical windows collapse and Netflix has easier access to showing films in theaters, “I’d love to have consumers be able to make the choice between seeing it out or seeing it at home. There’s a very different experience associated with going out and going to the theater with strangers and seeing a movie, and it’s fantastic. It’s just not core to our business.” Ultimately, there may be room for a reduced or eliminated window. A study in South Korea that looked at theatrical windows reduced from three months to one month, from 2015 to 2018, showed that theatergoers there largely remained loyal to the theatrical experience, even with the option of viewing at home, according to a January 15 article in the Harvard Business Review. “It’s not an either/or equation,” states HBO’s Forssell. “There is room for growth and innovation in both mediums, and uncertain times like the period we’re in now create prime conditions for both. At the end of the day, we are a customer-first company. Right now, we have customers who want to experience our films in theaters and customers who want to experience our films in the safety of their own homes.” Another change coming may be that we will see far more series

TOP: Loki (Image courtesy of Marvel Studios and Disney+) MIDDLE: The Boys (Image courtesy of Amazon) BOTTOM: The Queen’s Gambit (Image courtesy of Netflix)




“[Netflix’s streaming] serves a global audience with incredibly diverse taste. Some of them [movies or series] are hugely impactful in the region that they’re created for, and some of them become very, very global, like we saw with #Alive last year from Korea, which became a very big hit for us around the world.” —Ted Sarandos, Co-CEO and Chief Content Officer, Netflix

TOP TO BOTTOM: A Quiet Place Part II (Image courtesy of Paramount Pictures) #Alive (Image courtesy of Netflix) Raya and the Last Dragon (Image courtesy of Disney) Army of the Dead (Image courtesy of Netflix)

produced, since they are often more economical to make and tend to keep audiences glued to the streamer longer than does a movie. Disney’s The Mandalorian is a good example – it cost about $15 million per episode its first season, or about $120 million for its initial eight episodes, far less than the budget for many blockbuster films. Studios and streamers showed a great ability to change course last year. “We had to address the pandemic head on by being agile and flexible – two qualities that are fundamental to our workplace culture and critical for our success during these unpredictable times,” says Forssell. “We not only launched a streaming platform during the pandemic, but our programming teams under [Chief Content Officer, HBO and HBO Max] Casey Bloys also created standout content born out of the stay-at-home period, including Selena + Chef and Coastal Elites. Both were developed, produced and shot remotely.” Hollywood slowly returned to work as feature films, series and network programs resumed production. “The industry has introduced enhanced safety protocols and drastic new measures designed by guilds and unions, along with epidemiologists, to keep incidences of COVID-19 down on sets,” wrote Ahiza García-Hodges on November 20, 2020 for NBC News. WandaVision was a series that felt the impact of the shutdown. The production started in November 2019, was suspended from March to September of 2020 and then completed in November. Tara DeMarco, Visual Effects Supervisor for WandaVision, comments: “It’s been a long, strange road working on this show through the entire COVID-19 lockdown. All of the [visual effects] artists and teams have been working from their homes for the entirety of post, and we actually made it work.” Disney+ debuted Marvel Studios’ first two WandaVision episodes on January 15, 2021. Such safety measures may stay in place for a while, depending on vaccination numbers and new strains, but filmmakers now understand better how to move forward.
The success of Netflix in the U.S. and overseas with foreign-language programming suggests that any company that hopes to truly compete globally needs to be able to both produce content for a U.S. audience and source content from non-English-speaking countries. Netflix says 60% of its subscribers are located outside North America, according to a December 9, 2020




article by Ellen Gamerman in The Wall Street Journal. She adds, “Netflix reports more than doubling its investment in non-English original content over the last two years. The company has gone from commissioning roughly one in five foreign original TV and movie titles to nearly two in five since 2016.” For TV series on Netflix, 63% of new first-season originals currently in production or development are being made outside the U.S., according to Gamerman. Netflix currently dubs programming in 34 languages and adds subtitles to even more than that number. In terms of the internationalization of streaming, Forssell says, “We have been clear about our global ambitions for HBO Max. Streaming services must be global in operation but local in appeal, so that means adding regional programming as well as very tailored marketing efforts that authentically reach audiences in each country. This is top of mind for us as we prepare to launch HBO Max in Latin America in June, and then in Europe later this year.” Sarandos thinks that the subscription model inspires consumers to be more adventurous about what they watch. “People say, ‘Hey, I don’t watch foreign language television, but I’ve heard of this show called Lupin, and I’m super excited to see it’… and 10 minutes later, all of a sudden, they like foreign-language television. So it’s a really incredible evolution. “[Netflix’s streaming] serves a global audience with incredibly diverse taste,” Sarandos continues. “Some of them [movies or series] are hugely impactful in the region that they’re created for, and some of them become very, very global, like we saw with #Alive last year from Korea, which became a very big hit for us around the world.” As the pandemic winds down, the new normal of viewing should favor streamers. 
Observes Lucas Hilderbrand, Professor of Film and Media Studies and Visual Studies at the University of California, Irvine, in the November 30, 2020 issue of UCI Review: “For decades, well before the coronavirus, there has been a trend toward home viewing. With streaming, that trend accelerated. Now, with the closure of theaters and many people staying home – voluntarily or by public orders – the pandemic has entrenched this. So, viewing habits have already changed in ways that will have long-term effects.” Adds Hilderbrand, “We’re witnessing what may be the irreversible turn from cinema being a theatrical mode to becoming a predominantly streaming medium.” Ultimately, believes Forssell, “The customer will decide what the future looks like.”

TOP TO BOTTOM: Soul (Image courtesy of Pixar/Disney) Space Sweepers (Image courtesy of Netflix) Wonder Woman 1984 (Image courtesy of Warner Bros. Pictures and HBO Max) Bridgerton (Image courtesy of Netflix) Onward (Image courtesy of Pixar/Disney)





Images courtesy of Warner Bros. Pictures and Legendary Pictures except where noted. TOP: Scanline was responsible for the initial skirmish between Godzilla and Kong, when the fire-breathing lizard-king attacks a vessel transporting the mighty ape, inverting the aircraft carrier. While significant fluid dynamics simulations were required, hand animation was also needed to sweeten the procedural work. OPPOSITE TOP: Weta handled many ‘Hollow Earth’ shots as well as some of the more character-heavy Kong moments, owing to their experience with both fur and creature animation. (Image courtesy of Warner Bros. Pictures) OPPOSITE BOTTOM LEFT AND RIGHT: The design of Godzilla, which had evolved from the 2014 reboot film to its King of the Monsters sequel, was retained for Godzilla vs. Kong, directed by franchise newcomer Adam Wingard, pictured here on set. (Photo: Vince Valitutti)

When, as a youth, Visual Effects Supervisor John ‘DJ’ Des Jardin saw the 1976 remake of King Kong, he was at first unaware that most of the scenes featuring the great ape were done with Rick Baker in an ape suit. “It was only later I found out about how the giant 40-foot ape built by Carlo Rambaldi was only used in a few shots,” he admits, “as it was too big to articulate a practical one, except for the full-scale hand holding Jessica Lange. With our Godzilla vs. Kong, prosthetic monsters would have been even more impractical, given these creatures are nearly 10 times the height of that Kong.” Godzilla vs. Kong marks the fourth film in the Warner/Legendary Monsterverse – the third with Godzilla, following Godzilla and Godzilla: King of the Monsters, and the second with Kong, taking place decades after the set-in-the-’70s Kong: Skull Island. While the premise vaguely matches Toho Studios’ King Kong vs. Godzilla, the plot actually hinges more on human machinations and manipulations. These range from keeping Kong captive in a kind of Truman Show-style menagerie, to arranging for the titular battle, in the hope that whichever creature proves victorious will be sufficiently weakened by combat to then succumb to the humans’ own champion when loosed upon him. In keeping with their approach on previous Monsterverse entries, Legendary’s selection of Adam Wingard to direct reflected an inclination towards low-budget and indie-minded filmmakers. To better prep for the assignment, Wingard rewatched the Godzilla oeuvre in its entirety, re-grounding himself in a film series he had often watched as a youngster, but this time with an eye toward delivering something he hadn’t previously attempted – a feature that didn’t carry with it an ‘R’ rating.



While the storyline was being refined, design efforts commenced. “Some preliminary previs had already been done before I came on,” recalls Des Jardin, a veteran of other genre franchises including The Matrix sequels, Man of Steel, Batman v Superman: Dawn of Justice and the theatrical version of Justice League. “There was a war room set up with a ton of fantastic concept art developed by [Production Designers] Tom Hammock and Owen Paterson that really put across the intended feel of the show as it was envisioned by the director. They were still working on the script at that point, but things started solidifying soon after. We were aided in our planning in that a lot of the same production

crew had done Skull Island, so there was already a pretty good idea of what could be handled on set vs. what would fall to visual effects.” MPC, Scanline and Weta were selected as primary vendors. Des Jardin contacted Guillaume Rocheron, who supervised VFX on the earlier two Godzilla films – in 2014 for MPC, then acting as Production Visual Effects Supervisor on the sequel. “One of my earliest conversations with him involved finding out what kinds of challenges I should be on the lookout for,” Des Jardin reports. “Guillaume, having just finished King of the Monsters, had a warning for me about being able to animate the creatures, as the




TOP TWO: Godzilla’s fire breath was modulated to read convincingly in brighter conditions as well as night scenes, and often benefited from on-set interactive illumination to key and drive the post effect. BOTTOM: Since Kong: Skull Island had taken place in the 1970s, license was taken this time out with the size of the great ape – suggesting a late growth spurt – so that Kong would appear more in scale with the Saturn moon-rocket-sized Godzilla.

filmmakers want to see them in order to tell the story while still respecting and maintaining the proper scale to the motion. If the creatures maintain the proper scale, they feel like they’re moving too slow for the drama. So, the dynamics become kind of crazy, to the point that we once clocked Kong as going 400 miles per hour across a particular landscape!” The answer lay in a demonstration of Newtonian physics – with respect to everything around the creatures. “We found the trick was to let environmental effects – the ripples in the water, the smoke being disturbed in the air and the damage as things affected by gravity and destruction fall – portray the scale accurately, even while these characters are performing at the speed the drama demands,” notes Des Jardin. “Things falling at speed from a building all observed a similar ‘realistic’ rate, and so by maintaining and respecting those real-world aspects, it grounded the action, regardless of how crazy-fast the creatures were on their rampages.” The digital model of Godzilla – a design somewhat evolved for King of the Monsters from the first film – was retained for this new entry, while Kong’s Skull Island model was sent over by ILM. The latter had to be aged up, since that film was set back in the 1970s. Wingard has likened the differences in ages for the simian to Clint Eastwood as he appeared in the Dollars trilogy of westerns versus his appearance decades later in Unforgiven. Des Jardin employed a production-side crew to take all the nitty-gritty data on set. “We had witness cameras to make sure everything got covered for the various set extension work in the movie,” he says, “plus whole environments that included the monsters, so we needed a lot of foundation to make that work. And I made myself available to answer questions from the director and DP, addressing concerns about whether their approaches would work with our post processes.” Des Jardin appreciated Wingard’s visual approach for the creatures.
“He was very specific, wanting the lenses used on our shots to be very realistic and in keeping with what production shot. But he also wanted a lot of the creature stuff to be shot less from the human point of view and more from the height that these two live at, and they’re close to 400 feet tall. I think that was an interesting idea, and by getting up on level with Kong as he goes after the other guy, we can really watch them go at it. Weta did a lot of the more emotional Kong acting bits, which does reflect the strengths they’ve demonstrated with primate and fur animation in films past. It’s easier to bond with Kong since we’re all primates. Getting inside the head of a big lizard isn’t quite the same, but by getting close in, you do get an idea of how he is responding to what is going on, so we did many over-the-shoulder type views, like what you’d see in a boxing match.” When the camera was in motion during a battle, the movement was deliberately constrained. “Adam wanted those moves to happen at helicopter speed, not hauling ass through the world in a way that would blow the scale. We reserved really drastically quick moves for special moments and avoided those pull-back-to-infinity moments completely.” Des Jardin praises the work of aerial DP David Nowell. “We were still shooting on the Gold Coast when David shot the aerials in Hong Kong, so I couldn’t be there with him. He did a great job given the



extreme – and to my mind, very arbitrary – restrictions on aircraft in Hong Kong. We used as much real plate footage as possible, then cobbled stuff together when that wasn’t possible. There are a lot of scenes that had to be done with full CG cityscapes, owing to the level of destruction that was taking place. We might anchor it to a plate to get the buildings initially, but then replace and destroy some of those. We’d also have to replace the ocean because X, Y and/or Z happened with the lighting and that part of frame had to reflect these changes.” Scanline took on the daylight Hong Kong work, while MPC handled the more extensive night scenes there. “They were out there a month-and-a-half ahead of us,” Des Jardin states, “which enabled their team to scour Hong Kong for all useful details in the various locations called for in the approved previs. MPC Supervisor Pier Lefebvre handled that project for us.” This early work utilized the company’s Envirocam process, developed for Man of Steel, to capture and reproduce exceptional levels of visual information from the location. Since Des Jardin eschews the model of dividing up work within a shot, vendors took on whole sequences. “The big question in our mind at the start was whether Scanline – which has long been known for their huge water sims and being great with fluid dynamics – could do the character work in the big water scenes as well as the environmental aspects. Because rather than parcel out parts of the shots to various vendors and then try to get those pieces to live together, I wanted to have vendors do everything for the shots in their sequences. There was an early test, for one of the biggest shots in the trailer, the big punch Kong throws from atop the aircraft carrier, and Scanline’s work looked totally impressive even that far back.” Selling these full CG scenes required nuanced touches.
“A lot of hand animation goes into the character work, and a lot of simulations go into the environment and natural phenomena around them,” Des Jardin acknowledges. “Adam is a huge fan of camera shake, too, so all kinds of lens flares and post tricks to make the image less clean and more in keeping with the panic situation. We had stuff hitting the lens during the ocean stuff, which helped give the impression you’re inside the action.” In various locations where Godzilla’s fire breath was deployed, production used interactive sources. “Those lighting rigs come on and turn off to give us big lighting cues that animation then fleshes out,” states Des Jardin. “It’s great to have that on-set interaction with the actors and environment, and it anchors the look of the effect. Even if the timing isn’t perfect, we can usually work with that because you can’t beat those real bits of light interaction.” Finesse was needed to deliver the breath effect in brighter scenes. “Making sure Godzilla’s fire reads in different lighting conditions was mainly a matter of eyeballing things so it reads in a way that makes sense. If you’re in daylight, the intensity has to be modulated to take into account the brightness. In the main, we try not to be too overexposed, because then you start losing detail to the detriment of the actual effect. In a way, that kind of sums up a lot of our work here – keep things real, but show enough to really deliver the levels of excitement.”
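Des Jardin’s trick of letting gravity-driven debris carry the scale has a simple Newtonian basis: free-fall time grows only with the square root of drop height, so falling wreckage keeps a recognizable rhythm no matter how fast the creatures themselves are animated. A minimal sketch (the heights are illustrative; 120 meters roughly corresponds to the nearly 400-foot creatures mentioned above, and air drag is ignored):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def fall_time(height_m):
    """Time for debris to free-fall a given height, ignoring air drag."""
    return math.sqrt(2 * height_m / G)

# Human-scale props vs. wreckage falling from roughly creature height.
for h in (2, 20, 120):
    print(f"{h:>4} m -> {fall_time(h):.1f} s")
```

Debris from creature height takes around five seconds to land; compress that timing and the monsters read as miniatures, which is exactly the cue the team was careful to preserve.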

TOP TO BOTTOM: Rather than mixing full-size animatronics and puppetry with CGI, the creature effort remained strictly in the digital realm so that no compromises in movement owing to limitations on practical effects would impact the storytelling. (Images courtesy of Weta Digital and Warner Bros. Pictures)





Images courtesy of Alec Gillis. TOP: Alec Gillis OPPOSITE TOP: Alec Gillis has been part of some major Hollywood franchises such as Aliens vs. Predator.

There is no holding back Alec Gillis, especially when discussing whether special effects artists get the proper respect and recognition from the film industry. The two-time Oscar-nominee dislikes the practice of special effects being left off of award ballots and feels that more can be done to promote and support the efforts of their practical-minded colleagues. All of this is done good-naturedly as Gillis talks from the porch of his house via Zoom. He has earned the right to his own opinions after being schooled by legends Roger Corman and Stan Winston, co-founding Amalgamated Dynamics, and working on Alien3, Tremors, Starship Troopers, Cast Away, Zookeeper, It and The Predator. Even though special effects have been around ever since French illusionist, director and actor Georges Méliès dazzled audiences with the first of his 500 films in 1896, a lack of awareness and understanding remains today. “A lot of what we do is educating when we’re dealing with clients because people don’t know what goes into it, how meticulously things are crafted, and nothing exists until they put money into it, which is a difficult concept for producers to get,” states Gillis, who had to deal with the complications of running a business under pandemic protocols. “After six weeks of being in quarantine,” he says, “I went into the shop and was looking around thinking, ‘We have enough space to socially distance. We have four bathrooms so everyone can have their own.’ We fall under the category of manufacturing, so we are considered to be essential here in L.A. Nothing was stopping us. People were comfortable with it and came back. We have inched towards finishing projects, and there have been no COVID-19 illnesses.” Born in Phoenix, Arizona, Gillis grew up in Santa Ana, California. “I could see the Disneyland fireworks from my bedroom window. It was a time where my mom would drop my brother and I off when we were nine and 11 at Disneyland. 
We would run around the park for an entire summer. That was amazing because of animatronics. I remember when the Haunted Mansion was new. It was all great fuel for the imagination. The summer of 1969 was when Planet of the Apes came out. We’d hide under the seats in between shows to watch the movie again. It was such a compelling story, and I felt that I was watching an adult movie with interesting themes and, of course, that twist ending. Then you realize that Rod Serling wrote the script and he’s the Twilight Zone guy! You’re becoming a fan of people’s work and start thinking, ‘Individuals make these things that I love, so why couldn’t I be one of them?’” The patriarch of the family was a U.S. Marine turned insurance salesman who was a big fan of learning about how movies were made. “I remember my dad waking me up when Jason and the Argonauts premiered on CBS and seeing the moment when the skeletons are coming up out of the ground,” recalls Gillis. “That was when I was sucked into how wildly creative the world of effects was. But it was always mixed with storytelling.” The experience created yet another fan of Ray Harryhausen. “My business partner, Tom Woodruff Jr., is good about contacting people and sending letters, so when we were in London for Alien3, he arranged for us to meet Ray Harryhausen and his wife Diana. He showed us his big display case of all the skeletons as well as



some animation that was done on an unfinished Grimm’s fairytale short. These British animators had rebuilt the puppets and completed it. It was really good. I always admired the artistry and expressiveness of Ray’s animation, and the boldness of his designs. He was an independent filmmaker and an auteur. Ray did the panels that showed the layout of the effects sequences and story beats, developed it with Charles Schneer, and brought in the director. I liked that.” Plans to attend film school at USC were derailed when his application was declined. “However, I did get accepted by the art school there, but instead decided to get a job,” states Gillis. “I did manual labor, like delivering mattresses. The husband of my high school art teacher taught oceanography at Cal State Fullerton and had a student who made movies. He suggested that I should meet him. I went out to Pomona one night where James Cameron’s friend was a projectionist to see his 35mm print of Xenogenesis. Jim was 25 and

I was 19. I remember thinking, ‘He’s light years ahead of me!’ It was stop-motion, big models, robots and 35mm. We went back to his house and looked at giant pre-production paintings and a seven-foot-long spaceship model. For about a year I helped him with that. I had taken over my mom’s garage with stop-motion puppets and makeup, which caused a Cal State classmate of my sister to offer to get me a job interview for Battle Beyond the Stars. I didn’t have a portfolio, so I decided to bring Jim along because he’d make me look good. We interviewed with Chuck Comisky and the Skotak brothers, Dennis and Robert. Six months later we were brought in to build miniatures. I worked alongside Jim each time he moved up a rung on the ladder.” After dropping out of UCLA’s film school, Gillis was referred to makeup and creature effects wizard Stan Winston by Cameron. “Stan used to remark, ‘I love working with clay, but now people are my clay.’ He was good at putting a crew together, spotting talent and giving it an opportunity. Coincidentally, this was why he was




LEFT TO RIGHT: Gillis helped friend and colleague James Cameron in creating the title adversaries in Aliens. Gillis does touch-up on a lifelike dummy of Bishop (Lance Henriksen) to be included in Aliens. An important aspect in designing believable creatures is understanding anatomy. Human characters get as much attention to detail as their alien counterparts. Gillis instructs the workshop crew at ADI.

able to have his own life. Stan would put a lot on your shoulders and go home, but his presence was felt as he was the boss and through his art direction. Stan corrected my misconceptions about how to run a shop and how to interact with clients. Stan would say, ‘It’s your job to explain to a producer why it needs to be this way. Don’t get tongue-tied or angry. Communicate. Both of you want the outcome to be great.’ He was a team player, had a can-do attitude, was positive and had a big ego. It can’t be an unchecked ego, but you need to have one. Stan could be very opinionated, but he would listen. He told me once, ‘You have to learn to find good ideas and utilize them.’” “Working on Aliens was a surreal experience, because in 1979 Jim and I had gone to see the first one as fans at a theater in Brea, California. We came away thinking it was a tour de force sci-fi film,” states Gillis. “Five years later, Jim is directing the sequel. I was one of Stan’s main guys at that time. It was exciting because Stan’s team had been through The Terminator with Jim and I had been through the Roger Corman experience with Jim, so we were a tight unit. It felt like filmmaking the way it’s supposed to be, where you have a nice bond with the director and there aren’t layers between you.” Subsequently, Gillis partnered with Stan Winston colleague Tom Woodruff Jr. to establish special effects company Amalgamated Dynamics. “In 1987, Tom and I were getting interest in an anthology script that we wrote, and Stan only wanted to work on his own projects. The movie did not happen, and I realized that there was a glass ceiling. I respectfully handed in my resignation to Stan and there was no animosity upon leaving. Tom followed shortly after that. Gale Hurd called us with this script for Tremors,



which was our first feature, but we had also done a couple of other television jobs.” Tremors has gone on to become a cult classic. “Because it was a low-budget movie, we kept it simple and did manual puppeteering,” remarks Gillis. “We suggested that the Skotaks should be brought in to build and shoot the miniature sets while we’d construct the miniature creatures. There was a big cable puppet with levers that was very articulated and simple hand puppets with articulated mouths. There are miniature shots all through Tremors of creatures coming up and diving. It opened up the film. After the studio saw the edit, they decided to give more money, which was used for additional miniature shots.” Amalgamated Dynamics went on to win an Academy Award for Best Visual Effects for Death Becomes Her, as well as nominations for Alien3 and Starship Troopers. “Alien3 had name recognition. Death Becomes Her was an oddball movie, but Robert Zemeckis was riding high on his great work and it had some groundbreaking digital effects. And with Starship Troopers there is always something over the top and sensational about the work of Paul Verhoeven.” For Wonder Boys, a sophisticated dead dog needed to be created for filmmaker Curtis Hanson. “Curtis was calm and affable,” recalls Gillis. “He said, ‘Sometimes I want it floppy but to also have rigor mortis set in.’ An armature was made that had these locking joints. We’d go in, set the joints for the rigor mortis scenes, and then unset them. That was a nice gig, and they gave us the resources to do it right.” Animatronics technology has become precise. “We made a talking gorilla for Zookeeper that had 34 servo motors in its face.

LEFT TO RIGHT: Miniatures are part of the special effects toolset used by Gillis. Gillis is in dangerous proximity to an infamous doll known for being possessed by the spirit of a serial killer. An aerial view of the ADI workshop with the centerpiece being a massive animatronic alien creature. The head of a massive baby created for the 2010 Shanghai World Expo. Gillis poses with the animatronic gorilla created for Zookeeper.




LEFT TO RIGHT: No creature is too big or small for Gillis to create. Gillis demonstrates a deadly moment in a well-crafted piece of prosthetic artwork. After dropping out of UCLA film school, Gillis was referred to makeup and creature effects wizard Stan Winston by James Cameron. Gillis makes adjustments to a larger-than-life special effect. Gillis orchestrates some on-set puppeteering. A favorite childhood film of Gillis was Planet of the Apes, which contributed to his fascination with special effects.

We had the ability with our motion control system to pre-program all of those lip movements to get a beautiful sync and also get the facial performance right.” Though not having the budgets afforded to Rick Baker and Stan Winston, Amalgamated Dynamics has made the best out of the situation. “We’ve been fortunate to be connected to a lot of cool projects. Tremors is the closest to our pure design. We have gotten into franchises like Alien and Predator, which I love working on. You’re trying to push the boundaries to give audiences something new, but not depart too far from it.” “I would love to do more nonhuman full-scale miniature creature characters where we could push the boundaries of animatronics,” states Gillis, “fully motorized and pre-programmed, so we can animate in the way that a digital artist does. Refine, edit, get all of the bugs out of it, and have a character where you can push a button. It’s more of an effect than a live puppeteer. It would be great to have a project where we could invest in new technology to enhance our facial performance.” Animatronics have been created beyond the big and small screen, most memorably as a big baby for the Shanghai World Expo in 2010. “The live installations are fun. We’ve done work for Banksy as well. You’re using a lot of the same construction techniques, but it’s a different drill when it comes to maintenance. There are some amazing animatronics going on in theme parks right now, like the Avatar stuff that Disney has done, which is gorgeous. But, as a storyteller, I love the characters in movies. There is a lot we can do with animatronics or a combo platter with digital.” “We’re still fundamentally sculptures, molds, rubber skins and



mechanisms,” notes Gillis. “But within that we do have silicone skin material where you get real translucency, more flexible motion-control systems that are designed around our use rather than flying lights around stadiums, and smaller and more powerful digital servos that are cleaner in their signal. We are not a deep-pocket industry. Digital had a great advantage in that the gaming companies are making more money than the movie business. They were funding a lot of development, and now you see Unreal Engine and real-time stuff in The Mandalorian. It’s photographically real and fantastic. “Someone would have to come to me with a big chunk of money and say, ‘We want to build AI into a character.’ Maybe at this point that’s just theme parks. But that would be a lot of fun. I have ideas for live attractions that allow you to be in and among crazy creatures and animals and safely interacting with them based on AI.”
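The pre-programmed animatronic performance Gillis describes, where lip movements are keyframed, refined and then played back at the push of a button, works much like a digital animator's curves. A minimal sketch of the idea, assuming a single normalized servo channel (the function names and values here are illustrative, not ADI's actual motion-control system):

```python
# Sketch of keyframed servo playback: author (time, position) keys offline,
# refine them, then sample deterministically at playback time.
from bisect import bisect_right

def sample(keyframes, t):
    """Linearly interpolate a servo position from sorted (time_s, position) keys."""
    times = [k[0] for k in keyframes]
    if t <= times[0]:
        return keyframes[0][1]
    if t >= times[-1]:
        return keyframes[-1][1]
    i = bisect_right(times, t)
    (t0, p0), (t1, p1) = keyframes[i - 1], keyframes[i]
    return p0 + (p1 - p0) * (t - t0) / (t1 - t0)

# A single hypothetical "jaw" channel: closed, open for a syllable, closed again.
jaw = [(0.0, 0.0), (0.25, 1.0), (0.5, 0.0)]
print(sample(jaw, 0.125))  # 0.5 — halfway open on the way up
```

In a real rig each of the 34 facial servos Gillis mentions would get its own channel like this, all sampled against a common clock so the lip sync repeats identically on every take.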

LEFT TO RIGHT: Gillis takes a self-portrait while on set. The Green Goblin and his hoverboard are brought to life in the Spider-Man trilogy. The principal photography takes place on top of a massive mechanical rig. Gillis has received Academy Award nominations for Alien3 and Starship Troopers. Various-sized puppets were created for Tremors.





Photos by Keith Bernstein and courtesy of HBO except where noted. TOP: Mechanical devices and sparks were incorporated into the workshop of Penance Adair (Ann Skelly) to emphasize her ability to create futuristic electrical inventions. OPPOSITE TOP: The inventions of Penance Adair are derived from her ability to see potential energy and what it would do. OPPOSITE BOTTOM LEFT: The coachman is a mechanical device made by Adair that inhabits a world that is an authentic recreation of Victorian London. OPPOSITE BOTTOM RIGHT: A single street block was constructed and redressed to look like several others when needed.

In the HBO series The Nevers, spores are released by an alien spaceship, causing a group of Victorian-era women and men to develop special abilities that result in them becoming social outcasts and subjected to brutal experimentation. Mixing the genres of period drama, science fiction and horror is creator Joss Whedon, which is not surprising considering his credits include the Avengers franchise, Buffy the Vampire Slayer and Firefly. He subsequently left the production with the remaining six of 12 episodes being completed by showrunner Philippa Goslett (Mary Magdalene). “Joss is a veteran of visual effects projects and knows the routine,” notes Visual Effects Supervisor Johnny Han, who was also involved with One Night in Miami and Welcome to Chechnya. “Joss understood from the first script readings how important it was to identify what was going to be visual effects, but once we got the overall idea of how to do them, he trusted us. Philippa is amazing as well. She has been good about embracing the world that we’ve already halfway established and is working that into her material. Even though Philippa has done fewer visual-effects-heavy projects, I’m excited because it means that I can be useful and walk her through the process.” The Nevers was impacted by a high-profile HBO production. “A lot of our team came from Game of Thrones,” reveals Han. “I was the new kid. Everyone was coming fresh off of the finale, so they had such a well-oiled machine and professional way of working – that was such a treat. There wasn’t any drama. Everyone knew what you were there to do, that it was hard work and not everything goes right.”



Among the Game of Thrones alumni is Production Designer Gemma Jackson (Finding Neverland). “I felt it was more make-believe in Game of Thrones,” observes Jackson. “Anything could be considered. While for The Nevers, despite the sci-fi aspect, I kept it strictly in Victorian time.” Special Effects Supervisor Michael Dawson (Snow White and the Huntsman) is no stranger to HBO projects, having previously worked on Band of Brothers and The Pacific. “Every day was a bullet hit or someone was getting blown up or the beach was going up,” remarks Dawson. “This one was Victorian London, so you didn’t get any explosions except for the opium cart going up. The

Nevers was a lot more sedate, but also interesting because it wasn’t the same thing every day.” Halfway through the production, the pandemic caused The Nevers to shut down, leading to some changes in the structure and execution of the series. “The pandemic put it into parts one and two,” notes Han. “That gave us an opportunity to embellish the story because now we’re doing 12 instead of 10 episodes. Even with part one, it was a blessing because during the lockdown the edit was refined. They were shooting as much live-action material as possible, so we were able to identify a lot of things. Normally, with element shoots, you have to guess what you’re going to need, but




TOP: The hand motions of Bonfire (Rochelle Neil) resemble someone making pottery out of fire. BOTTOM: Gemma Jackson referenced her extensive personal library of historical black-and-white photography when recreating Victorian London.

we were able to shoot some shot-specific ones with special effects.” Locations and sets also needed to be revised. Adds Han, “When we got back to work, in a way it became more interesting work. The asylum built out at Langleybury was too claustrophobic for the COVID-19 situation, so we completely transfigured the police station. We took down walls, kept the overall shape and windows, and then made it the asylum with different colors, dressing and accents. That pleased me ridiculously on a level. I defy you to know that it’s the same place.” One of the guidelines established by Joss Whedon was that the story is not part of the steampunk genre. “Steampunk is such a well-established visual motif,” offers Han. “The most obvious thing is that we’re not using steam anyway. Penance Adair [Ann Skelly] is all about electricity. In the show, she is revealed to be the master of being able to see potential energy and what it would do. We didn’t want to make steam and hydraulics. We leaned a lot more towards electric motors and things sparking – that helped to inform the design process.” Jackson uses a different term to describe the show. “Joss’ byword was Victorian sci-fi, which is how I approached it. Other than Episode 106, which is a whole different kettle of fish [because it takes place on another planet], most of it was fairly accurately based on the Victorian era. But we allowed people like Penance



Adair to have extraordinary ideas and plans that were way beyond her era.” A suggestion involving electricity led to Dawson getting hired. “I remember when I first went to the interview to get the job and the producer only had one script. She said, ‘In this show we have a car that comes out of the back of a horse-drawn carriage and how would you go about that?’ I thought, ‘I have no idea.’ I said, ‘Why don’t we make the car electric?’ Later she rung me up and said, ‘You’ve got the show. We mentioned the fact that it should be electric and Joss loved that idea.’ From that point on we were building a fully-electric, three-wheel vehicle from scratch that would fit into the back of a horse-drawn carriage and come out. I was impressed with the fact that we did it and it worked!” The vehicle could go 30 miles per hour. “Stunt Coordinator Rowley Irlam, who is not forgiving in any way, put the car through its paces and said, ‘That is a great little car.’ From then on, it went up and down hills and around London,” says Dawson. “Joss liked the car so much that he commissioned a four-wheel version that we made and will be in the second half of the season.” Among the supernatural abilities of the Touched is the ability to see into the future, walk on water and wield fire. “We call it ‘the ripple’ when Amalia True [Laura Donnelly] sees into the future,” remarks Han. “The DP for the pilot, Seamus McGarvey [Atonement], did some tests with speed ramping the frame rate and maintaining the same ratio as the aperture so it didn’t get any darker or brighter – that would cause the character to jitter. I came up with the idea that if you were looking at your reflection in a vibrating mirror, the edges of your reflection would distort like rippling water. I took Seamus’ test, put on our visual effect distorting it and Joss loved it.” Amalia True gets attacked by the water-defying Nicholas ‘Odium’ Perbal [Martyn Ford]. “It was the biggest effects shoot of the season,” recalls Han. 
“We filmed at a water tank stage at Pinewood Studios. An overhead wire rig was built that could suspend Martyn Ford high enough so that his feet were resting on the water. There were various platforms and a different rig for every shot. We had an underwater camera and crane. Even though the shot is all CG, it is based on the practical water shoot.” Annie ‘Bonfire’ Carbey (Rochelle Neil) has the ability to unleash fire. “We used some practical fire and developed a Bluetooth micro-controlled LED the size of a poker chip that rested in her palm,” explains Han. “We came up with a lot of ways where she would keep her palm facing herself to hide the light from camera and to nicely illuminate her own face. Bonfire shapes the fire in her hands almost like pottery.” Primrose Chattoway (Anna Devlin) is revealed to be a giant. “We would shoot the scene without Anna and have her do her lines off-screen,” says Han, “but when it was time to shoot her plate, we would bring the two or three cameras half the distance closer. If you do your measurements well, you’ll get an optically correct image of a girl being twice as big.” The production design accommodated the character. “For going in and out of Penance’s workshop, we made it look like they cut through the top of the wall and put extra doors up so Primrose

TOP TO BOTTOM: By halving the distance between the camera and Primrose Chattoway (Anna Devlin), Visual Effects Supervisor Johnny Han was able to create the optical illusion of her being the size of a giant. The scene where Primrose Chattoway (Anna Devlin) is revealed to be a giant to Myrtle Haplisch (Viola Prettejohn) by Penance Adair. Special Effects Supervisor Michael Dawson built a functional electric car using a Tesla battery configuration.
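The forced-perspective trick Han describes falls out of simple pinhole-camera geometry: image size scales inversely with subject distance, so halving the camera-to-subject distance doubles the on-screen height. A quick sketch under that assumption (all numbers are illustrative, not production values):

```python
# Pinhole-camera model of the forced-perspective giant trick.
def projected_height_mm(object_height_m, distance_m, focal_length_mm=50.0):
    """Height of an object's image on the sensor under a simple pinhole model."""
    return focal_length_mm * object_height_m / distance_m

# An actor filmed at 4 m, then the camera moved to half that distance.
normal = projected_height_mm(1.5, 4.0)
halved = projected_height_mm(1.5, 2.0)

# Halving the distance doubles the on-screen height, which is how a
# normal-sized actor reads as a giant once the plates are composited.
print(halved / normal)  # 2.0
```

The "do your measurements well" caveat matters because the ratio only holds when the distances (and lens) are matched precisely between the two plates.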





TOP TO BOTTOM: The Galanthi vessel had to feel like an extension of a creature that could grow its own ship. Amber was the prominent material chosen for the Galanthi spacecraft. It was important to portray the spores dropped from the Galanthi as being friendly rather than invasive.

could go to and through,” reveals Jackson. “It was a nice little detail.” Driving the sci-fi narrative is a mysterious alien species. “The Galanthi was a big character design process,” admits Han. “Their spacecraft has an organic, almost skeletal feel to it. We wanted it to feel airworthy, so it almost feels like sails. We thought that amber is a beautiful color and is translucent. Our scene was going to be at dusk, so the sunlight would light the amber nicely. We toned down the amber a lot. It had to feel like an extension of a creature that could grow its own ship.” Not every scene takes place on Earth. “It took a while for us to agree on what that world was,” remarks Jackson. “There is this burnt-out city that had to have an architectural sense to it other than being a pile of rubble. I remember looking at some extraordinary microscopic photographs of fur and eyes, and all sorts of weird and wonderful things that we had up on walls. I got quite involved with underwater stuff like fish gills.” The alien world harkened back to another experience for Dawson. “It is almost like Band of Brothers. There is a battlefield with explosions and bullet hits. We had a lot of snow, atmosphere, wind and dirt. I enjoyed doing that as it was so different from the previous five episodes set in Victorian London.” Approximately 1,200 visual effects shots were made by 12 vendors, including Scanline VFX, EDI, Mackevision, MR. X and BUF, with an in-house team of digital artists providing crucial problem-solving support. “The thing that we didn’t anticipate was how many of the invisible effects we would have to figure out,” states Han. “How do we move the story along without big flashy effects? Whether it meant morphing the performance between two different takes or holding eyelines and preventing the character from blinking.” The Nevers is a wild project. 
“When you read it, it takes a little while to get your head around it because it’s so out of the norm,” says Han. “When you get to Episode 106 where it’s off-world and you think, ‘What is going on? Where is this going?’ It is so exciting to be putting something into reality that is in Joss’ mind.” Jackson enjoyed the creative process. “What happens is you tend to do a lot of research, and then you have to chuck it out and get on with it in some strange way. You have to own it and make it yours. It just comes out of you after a while, which is when it gets to be gorgeous. That’s why I’ve come back to do the other half [of the season]. It is still lurking in me. I want to carry on with it a bit longer.”







Images courtesy of Peter Jackson. TOP: Sir Peter Jackson, VES Lifetime Achievement Award recipient. OPPOSITE TOP: With Braindead miniatures.

Taking us from Skull Island to Middle-earth to the World War I battlefield to breakfast with the Beatles, Sir Peter Jackson has created cinematic experiences that have transported and transfixed audiences worldwide. One of the most innovative filmmakers of our generation, acclaimed director-producer-screenwriter Jackson has created exceptionally humanistic stories through his visionary approach, expansive storytelling and embrace of technology to enhance the moviegoing experience. Jackson made history with The Lord of the Rings trilogy, becoming the first person to direct three major feature films simultaneously. The Fellowship of the Ring, The Two Towers and The Return of the King collected a slew of awards from around the globe – including 12 VES Awards across the trilogy, with The Return of the King receiving his most impressive collection of awards, including three Academy Awards, two Golden Globes, three BAFTAs, and awards from the Directors Guild and Producers Guild. His celebrated filmography includes Heavenly Creatures, King Kong, The Lovely Bones, the three-film adaptation of Tolkien’s The Hobbit, the acclaimed They Shall Not Grow Old and the forthcoming The Beatles: Get Back documentary, slated for release this summer. In recognition of his enormous contributions to filmed entertainment, Jackson was honored at the 19th Annual VES Awards with the Lifetime Achievement Award. On that note, Jackson was clear: “If I was ever going to be honored for my lifetime of work, I would have chosen the VES award, because the dream of making



visual effects was the seed of my whole world as a filmmaker. I’m thrilled and gratified. But I’ve checked the fine print and this award doesn’t call for mandatory retirement. Don’t think I’ve peaked – there’s a lot more to come!” Growing up an only child in a tiny village in New Zealand, Jackson was always searching for things to occupy his imagination, and he credits his ‘alone time’ with sparking his love of visual effects. “I remember when my dad brought home our first black-and-white TV. I was enraptured with British sci-fi shows Stingray and Thunderbirds. I was putty in their hands and I used my Matchbox cars to recreate the scenes. Then came the original King Kong and Jason and the Argonauts, and it opened up a world that I didn’t know existed. That early fascination shaped me as a filmmaker. I strive to make films I like to watch, that have an air of escapism and the fantastic. “I was making movies with my parents’ Super 8 and then a 16mm when I was really young. I wanted to be a filmmaker, particularly working in VFX, in part because it was a solitary activity. As

I was filming home movies, I began to think about camera angles and how to compose shots and tell a story, and I started to learn how to edit and splice shots together. I fell in love with storytelling and realized I’d be frustrated being told what to do… and so a director was born!” Jackson started out making low-budget splatter films. His first, Bad Taste, was shot on weekends over four years and evolved into Jackson’s first feature film. “It was both practical and passion. I loved horror films like Evil Dead and Dawn of the Dead, and when you can only make a low-budget movie, you can’t make a King Kong or a Harryhausen film with spectacle or costumes or exotic locations. The genre allowed me to make an impact without much money. We just needed special effects and a sense of humor. It was a great gateway.” By the time he got to the genesis of the Lord of the Rings movies, Jackson knew they would have the financing to make a cool, modern version of a Harryhausen film – that was the initial spark. “We started working on fantasy film ideas and at every turn




“Once I finish my historical trip with the Beatles, I will go back to fulfilling my childhood vision and do some stop-motion moviemaking. It is amazing to have the chance to realize the aspirations I had in the earliest part of my life.” —Peter Jackson

TOP LEFT TO RIGHT: With Orcs on the set of Lord of The Rings: The Two Towers. With the Palantir on The Lord of the Rings: The Return of the King. On the set of his first film Bad Taste. About to shoot a fight scene with himself for Bad Taste. On the set of puppet animated musical black comedy horror film Meet the Feebles.

realized we couldn’t do something because it was too much like Lord of the Rings. We were frustrated that everything seemed to emanate from Tolkien. So eventually we decided to get the rights and adapt it ourselves.” The Lord of the Rings trilogy was one of the biggest hits of the 2001 holiday movie season and is celebrating the 20th anniversary of its launch this year. Jackson mused on the films’ enduring popularity. “People use Lord of the Rings as a cultural touchstone and you see references all the time, especially Gollum. I can’t point to a magic formula, but I’m of course proud. Here’s where I deviated from films like Jason and the Argonauts. Lord of the Rings is tagged as a fantasy novel, but presents as a historical novel, like an account of Elizabeth I or the Battle of Hastings. It reads like it actually happened in a bygone era. That shift in perspective influenced everything from writing and directing to production design. I shot it as a historical film, which I thought would be an interesting way to treat a fantasy topic. Maybe that’s why people are so invested.” Jackson’s love of history combined with a drive to create new visual experiences translates to his documentary filmmaking. For They Shall Not Grow Old, the Imperial War Museum in London learned of Jackson’s interest in World War I and pitched the idea of a film that took their archival footage and presented it in a fresh way, in concert with the 100th anniversary of the war. “I was intrigued and struck with the question of whether technology existed to make this footage look like it was shot last week. The idea of doing something transformative to make WWI accessible for a younger audience was interesting, and so Weta and Park Road Post started doing tests. “The most amazing surprise was when we hit the original speed of the footage. The cameramen in WWI used a hand-cranked camera with no motors, so all of the shots varied in speed. 
None of it was documented, so we had to go shot by shot and start eyeballing it. But as soon as you hit the magic speed, people started to come alive. They no longer looked like Charlie Chaplin figures. These people became human beings, and we could give the audience the experience of what it was like on the battlefield… to see the war through their eyes. “On Get Back, we are also working with the only existing fly-on-the-wall footage of the Beatles. In all other known footage, the band was performing – on stage or for the press. We had running footage of people not meaningfully aware of being filmed. It’s incredible to see the band and experience them as human beings. I’m excited and emotional about delivering that to audiences.” Moving seamlessly from the fantastical to documentaries, Jackson shares his approach as a filmmaker. “What is the same



about any kind of film is the storytelling at its heart. On They Shall Not Grow Old, we went through 100 hours of footage with different soldiers telling the same story. They risked their lives with the cameras rolling to preserve their legacy. On Get Back, we have 60 hours of never-before-seen footage of the Beatles, filmed by Michael Lindsay-Hogg, and 150 hours of audio. In both cases, I didn’t have the ability to get new shots; we use what we have. But I have always loved editing, and post-production is my favorite part of any project. I am quite happy and privileged to work with other people’s footage and go straight to the editing suite.” When asked for his advice for aspiring filmmakers and visual effects practitioners, Jackson said it often manifests with people asking to become his assistant. He went on to share lessons from his journey. “I hearken back to what I did as a kid. I got a Super 8 camera and made my own movies. None of them were commercial quality, but they taught me a helluva lot. I generally say, ‘Make a film.’ Today, you can do so much with your phone, and the digital technology available to young people is incredible. It’s actually very hard to make a film at home. It demands a passion and dedication to your craft and a support system. But that’s why people should do it. It’s a test. If you want to be a filmmaker, you’ve got no excuses.” In preparing to accept the VES Lifetime Achievement Award, Jackson talked about things coming full circle in his life and how he spent time in COVID-19 lockdown – revealing his joy and reverence for the art of filmmaking and those giants who influenced his work. “When I was 10 or 11, I wanted to be a stop-motion animator. I loved Ray Harryhausen models, but I didn’t have the materials to make the armatures properly, and I experienced a long period of frustration. Last year when New Zealand went into a six-week lockdown, I decided I would go back and make the models. 
I got the foam latex and everything I needed, installed an oven and got started. I made the skeletons from Jason and the Argonauts. I made the movable figurehead from The Golden Voyage of Sinbad, and I made the six-armed goddess Kali – the models I dreamed of making when I was a kid, as fully working armatures. I haven’t actually filmed them yet, because lockdown ended, and I got busy and went back to work on Get Back. “Once I finish my historical trip with the Beatles, I will go back to fulfilling my childhood vision and do some stop-motion moviemaking. It is amazing to have the chance to realize the aspirations I had in the earliest part of my life.”

TOP LEFT TO RIGHT: On the set of King Kong with Jack Black. On the set of The Lord of the Rings with Sir Ian McKellen (Gandalf). Shooting on the set of Mortal Engines. On the set of his first film Bad Taste. On the set of The Hobbit. With his homemade Ray Harryhausen-inspired armature of Kali.





TOP TO BOTTOM: Comedian Patton Oswalt kicks off the 19th Annual VES Awards – gone virtual! VES Executive Director Eric Roth delivers his welcome from the woods. VES Chair Lisa Cooke hands out the show’s first award. The VES Award, in homage to Georges Méliès’ iconic Man in the Moon.

For a complete list of nominees and winners of the 19th Annual VES Awards visit visualeffectssociety.com.

The Visual Effects Society held the 19th Annual VES Awards in April, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues. But this time, the Society delivered the signature event with a twist, as the show was a virtually-produced celebration with viewers tuning in to the streaming show worldwide. Comedian Patton Oswalt served as host of the dynamic virtual show, marking his 10th time at the helm, and appeared as a supersized green ogre, roasting VES Executive Director Eric Roth in the forest and musing with the iconic Man in the Moon. The VES Awards honored talent in 25 categories – showcasing the depth of VFX contributions to all forms of filmed entertainment – from nominees selected by VES members via 32 virtual nomination events conducted across the globe. The Midnight Sky was named photoreal feature winner, garnering two awards. Soul was named top animated film, winning five awards. The Mandalorian was named best photoreal episode and garnered three awards. Walmart won top commercial honors. Sacha Baron Cohen presented the VES Award for Creative Excellence to acclaimed visual effects supervisor, second unit director and director of photography Robert Legato, ASC. Cate Blanchett presented the VES Lifetime Achievement Award to award-winning filmmaker Sir Peter Jackson – along with a star-studded tribute from Andy Serkis, Naomi Watts, Elijah Wood, Sir Ian McKellen, James Cameron and Gollum. Presenters also included directors Zack Snyder and Pete Docter, actors Lauren Cohan, Danny DeVito, Jenna Elfman, Marcus Scribner, Chris Sullivan, Colman Domingo, Demian Bichir and Anson Mount, and VES Chair Lisa Cooke. Jocelyn Moffatt, Autodesk Entertainment Industry Marketing Manager, presented the Autodesk Student Award. “Traditions find a way to persist,” said Lisa Cooke, VES Board Chair. 
“With vision and a lot of hard work, we were proud to host our annual celebration of the artistry, ingenuity and passion of visual effects practitioners worldwide – virtually. We are seeing best-in-class work that elevates the art of storytelling and engages the audience in new and innovative ways. The VES Awards is the only venue that showcases and honors these outstanding artists across a wide range of disciplines, and we are extremely proud of all our nominees and winners.”



TOP TO BOTTOM: Sir Peter Jackson, recipient of the VES Lifetime Achievement Award. Cate Blanchett presents the Lifetime Achievement Award to Peter Jackson. Gollum makes an appearance in tribute to Peter Jackson. Naomi Watts honors Peter Jackson. Outstanding Visual Effects in a Real-Time Project – Ghost of Tsushima. Outstanding Model in a Photoreal or Animated Project – The Midnight Sky; Aether. Outstanding Visual Effects in a Photoreal Feature – The Midnight Sky. Winning team for Outstanding Visual Effects in a Photoreal Feature – The Midnight Sky.




TOP TO BOTTOM: Outstanding Visual Effects in an Animated Feature – Soul. VES Awards presenter and winner for Soul Pete Docter. Outstanding Virtual Cinematography in a CG Project – Soul. Outstanding Animated Character in an Animated Feature – Soul; Terry. Outstanding Effects Simulations in an Animated Feature – Soul. Outstanding Created Environment in an Animated Feature – Soul; You Seminar. Outstanding Supporting Visual Effects in a Photoreal Episode – The Crown; Gold Stick. Outstanding Visual Effects in an Animated Feature – Soul.



TOP TO BOTTOM: Outstanding Visual Effects in a Photoreal Episode – The Mandalorian; The Marshal. Outstanding Animated Character in an Episode or Real-Time Project – The Mandalorian; The Jedi; The Child. Outstanding Created Environment in an Episode, Commercial, or Real-Time Project – The Mandalorian; The Believer; Morak Jungle. Outstanding Compositing in a Feature – Project Power. Outstanding Effects Simulations in a Photoreal Feature – Project Power. Outstanding Compositing in an Episode – Lovecraft Country; Strange Case; Chrysalis. Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project – Lovecraft Country; Strange Case; Chrysalis. Outstanding Visual Effects in a Photoreal Episode – The Mandalorian; The Marshal.




TOP TO BOTTOM: Sacha Baron Cohen presents the Creative Excellence Award to Rob Legato, ASC. Outstanding Visual Effects in a Commercial – Walmart; Famous Visitors. Outstanding Special (Practical) Effects in a Photoreal Project – Fear the Walking Dead; Bury Her Next to Jasper’s Leg. Outstanding Created Environment in a Photoreal Feature – Mulan; Imperial City. Outstanding Supporting Visual Effects in a Photoreal Feature – Mank. Outstanding Visual Effects in a Special Venue Project – The Bourne Stuntacular. Rob Legato, ASC, recipient of the VES Award for Creative Excellence.



TOP TO BOTTOM: Andy Serkis pays tribute to Peter Jackson. Ian McKellen applauds Peter Jackson. Elijah Wood congratulates Peter Jackson. Autodesk Entertainment Industry Marketing Manager Jocelyn Moffatt presents the Autodesk Student Award. Outstanding Visual Effects in a Student Project – Migrants. Jon Landau and James Cameron celebrate Peter Jackson.




TOP TO BOTTOM: VES Awards presenter Danny DeVito. Outstanding Animated Character in a Photoreal Feature – The One and Only Ivan; Ivan. VES Awards presenter Jenna Elfman. Outstanding Animated Character in a Commercial – Arm & Hammer; Once Upon a Time; Tuxedo Tom. VES Awards presenter Anson Mount. VES Awards presenter Colman Domingo. VES Awards presenter director Zack Snyder.



TOP TO BOTTOM: 10-time VES Awards host Patton Oswalt wraps the show. VES Awards presenter Marcus Scribner. Outstanding Compositing in a Commercial – Burberry; Festive. VES Awards presenter Lauren Cohan. VES Awards presenter Chris Sullivan. VES Awards presenter Demian Bichir. The dynamic VES Awards streamed worldwide to members in 40+ countries.





The VES Awards for Outstanding Visual Effects in a Photoreal Feature and Outstanding Model in a Photoreal or Animated Project (Aether) went to The Midnight Sky. (Images courtesy of Netflix)









Soul won the VES Awards for Outstanding Visual Effects in an Animated Feature, Outstanding Animated Character (Terry), Outstanding Created Environment in an Animated Feature (You Seminar), Outstanding Effects Simulations in an Animated Feature and Outstanding Virtual Cinematography in a CG Project. (Images courtesy of Disney/Pixar)







The Mandalorian won the VES Awards for Outstanding Visual Effects in a Photoreal Episode (The Marshal), Outstanding Animated Character in an Episode or Real-Time Project (The Jedi; The Child) and Outstanding Created Environment in an Episode, Commercial or Real-Time Project (The Believer; Morak Jungle). (Images courtesy of Lucasfilm Ltd. and Disney)






Images courtesy of Cinesite and Artemis Oikonomopoulou except where noted. TOP LEFT: Artemis Oikonomopoulou TOP RIGHT: Artemis Oikonomopoulou on set filming Death on the Nile in Egypt. OPPOSITE TOP: Look development of Yellowjacket in Ant-Man. (Image courtesy of Marvel Studios and DNEG)

The ability of the visual effects industry to quickly adapt to the pandemic-inspired, work-at-home mandate was a pleasant surprise for Cinesite Visual Effects Supervisor Artemis Oikonomopoulou. “For everyone I have spoken to, we were all surprised how all of the visual effects companies managed to sort it all out in a matter of one or two weeks. It’s opened the doors now for a new way of working, which was something that was going to happen eventually, but this forced the issue. Since our industry is based a lot on tax breaks, the next step would be for governments to go, ‘Can you work for a London facility but be in France?’ It will be interesting to see how they resolve that. But I will certainly be happy when I’m back in an office again. I never thought I’d say that!” Growing up in Athens during the early 1980s, Oikonomopoulou was a fan of Batteries Not Included, Ghostbusters, and anything by Steven Spielberg. “My parents encouraged us to go to the cinema a lot. It was the golden age of family movies,” she says. The ancient buildings and artifacts were simply part of the landscape. “As a child you spend lots of time being dragged around museums looking at stones and pots from an era which at the time means nothing to you. But when you grow up and think about it, it takes on a different perspective. My mom was an archaeologist.” The first personal computer introduced into the family household was an Amstrad. “There was a lot of Pac-Man and Donkey Kong! It was a different time growing up as there was no Internet. I was subscribing to cinema magazines. It was also a golden age



of Disney animation with The Little Mermaid and Beauty and the Beast. A lot of people say that they got into this job because of Star Wars, but I got into this because I wanted to be a Disney animator.” Computer-generated effects – in particular, the ballroom in Beauty and the Beast and the alien water tentacle in The Abyss – deepened her fascination with how movies were made. “In the mid-1990s, when I was a teenager, I randomly met someone who said, ‘You shouldn’t be a hand-drawn animator, but go and study computers because everyone is going to be using them in the future,’” recalls Oikonomopoulou. “On a whim I applied to Bournemouth University in 1997. At the time, it was the only bachelor’s degree in something like that in Europe. I finished school at the age of 17 and then moved to the U.K.” Oikonomopoulou subsequently graduated with a bachelor’s in Computer Animation and Visualization, and a master’s degree in Special Effects. “When I got to the course and did the degree, I realized that I was lucky because I ended up loving computer graphics. During the final weeks of doing our master’s dissertation, we had a

couple of recruiters from Cinesite come to the university and I had a meeting with them. Two weeks after finishing my course, they offered me a job. It was a time when the visual effects industry in London was picking up with the first Harry Potter film.” Initially, her ambition was to be an effects artist, but the new recruit was considered to be too junior for the position. “I started in lighting and texturing,” remarks Oikonomopoulou. “I’m so happy that I did because it was all about painting pictures. It was before the advances in physically-based lighting. You have your fill and bounce light. You had to create it all yourself. I ended up loving that. One sad thing about this lockdown, outside of having to work from home, is not being able to go to the cinema and galleries because that’s something I do on a weekly basis. First of all, it’s because I love it, but it also trains your creative eye to get better at what you do. You always have to look at art.” Eventually, Oikonomopoulou began to ponder her future beyond what she had been doing. “I was a lighter for six or seven years and then a CG supervisor for a long time. You have to think




“A lot of people say that they got into this job because of Star Wars, but I got into this because I wanted to be a Disney animator.” —Artemis Oikonomopoulou

TOP TO BOTTOM: Oikonomopoulou served as an on-set supervisor on Venom. The part that Oikonomopoulou enjoys the most about visual effects is being able to craft cinematic imagery as she did on Death on the Nile. Oikonomopoulou learned a great deal from filmmaker Andrew Stanton during the making of John Carter. (Image courtesy of Cinesite and Disney)

on your feet all of the time, be organized, make sure that you keep everyone connected and that every department is talking to each other. Over time I realized that I wanted to become a visual effects supervisor. I care about the technical part of my job because it is such a big part of it, but I mostly care about how something looks. I like to look at pictures, give people feedback and bounce ideas off of them. Even with something that is unreal you still have to add a level of reality to be able to sell it to your audience.” On Death on the Nile and Venom, Oikonomopoulou was an on-set supervisor. “It’s about being constantly aware of what’s going on, being able to foresee any issues that might arise on set, and being quick on your feet in coming up with solutions. When I started working, visual effects was considered part of post-production and not such an important aspect of being on set. That has definitely changed. Virtual production is going to be an interesting one to see where it takes us and how much of a tool it will be in the next few years, or it’s going to be something that people try and think it’s too much for what it is. It was the same thing 10 years ago when all of the films wanted to be in stereo. Virtual production is great to help directors to visualize something, because it’s hard shooting and looking at a greenscreen and not really knowing how that would look for another few months down the line. You always have to remember: you can do it with or without.” “I never like being asked what it’s like being a woman in this job, because I don’t see that,” remarks Oikonomopoulou. “You go to work every day and do hopefully as much as anyone else can, independent of who you are. In London, and I’m sure it’s the same in Canada and the States, it’s such a multicultural close-knit group of people. Everyone brings something different to the table, from the way that they see things to their backgrounds. 
When I started working in the 3D department, I was the only woman for eight or nine months. There were a lot more women in the compositing side. But over time that has changed. People have gotten more educated about this industry, and realize that it’s not only about sitting in front of a computer and coding all day. It’s another medium for your creativity. You could be more technical or creative. There are so many avenues and niches you can fit yourself in this job.” Among her career highlights was creating the ‘Thomas the Tank Engine’ sequence in Ant-Man. “What I liked about Ant-Man is it felt different than your usual Marvel Studios movie because of that smaller scale,” remarks Oikonomopoulou, “when he shrunk down to tiny proportions, how little stuff affected him, like fluff on the street. I was a CG supervisor [at DNEG] on that. We were sent a cupboard full of the toys that were in the little girl’s room. Every day we would pick one out and give it to a modeler and texture artist and say, ‘Make this little teddy bear.’ What was so interesting was looking at the minute detail, especially when Ant-Man is running on a carpet – that carpet took us so long! To be at that scale and show all



of the microfibers to sell, that was hard but fun.” Another personal favorite was creating the destruction of the lighthouse cave and a dancing Mandelbulb, which is the embodiment of an alien, for Annihilation. “It is a great movie that is more of a thinking man’s sci-fi. The photography was beautiful. The things we had to make were beautiful and so interesting.” With the growing importance of the Chinese market, Oikonomopoulou worked on the development and execution of the barren Earth landscapes and space environments for Mermaid 2. “It was intriguing trying to connect different cultures and find a common ground. I was on set for six weeks. They work differently, and the language barrier is hard when you’re there on your own. I worked closely with the director, Stephen Chow, because he wanted a lot of concept ideas. [It was unnerving] when you were having a conversation in Mandarin or Cantonese for about half an hour and someone would only translate one sentence! There was a lot of that happening. You had to try to figure out the in-between.” The poor critical reaction to John Carter was a disappointment for her. “The movie didn’t deserve all of the hate that it got. I was on that for two and a half years. Andrew Stanton is a great director and storyteller. You learned so much from the way Andrew talked about what he wanted to see and how he wanted to see it. At the time it was a challenging film for technical reasons. There were a huge number of shots to work on. We had to redesign the whole pipeline [at Cinesite]; 200 to 300 extra artists were hired. 
It was a shame that John Carter was pushed aside in the end.” Machine learning is a scary prospect for the visual effects industry in terms of its unforeseen impact on the creative side. “There are definitely areas that we can improve and economize, but it is still a job in a creative field,” observes Oikonomopoulou. “It still needs people’s eyes, not just a machine to operate behind the scenes.” The advances in smartphones, like the iPhone Pro Max’s inclusion of LiDAR scanning, are something to be excited about. “I had an app on my phone where you could take an HDRI – that stuff is great.” The career appeal lies in the nature of visual effects. “I don’t see it as a job,” she says. “It’s part of who I am. That’s when you enjoy what you do. Sometimes it does take over your life and there are long hours, other times not so much. I can’t see myself doing anything else in my life.”

TOP TO BOTTOM: Oikonomopoulou assisted showcasing the struggle between Eddie Brock (Tom Hardy) and the symbiote inhabiting his body in Venom. Natalie Portman encounters a mysterious alien lifeform in Annihilation. (Image courtesy of DNEG) The microscopic imagery of Ant-Man made it a fun and unique project for Oikonomopoulou. (Image courtesy of Marvel Studios and DNEG) After watching The Little Mermaid and Beauty and the Beast, Artemis Oikonomopoulou initially had ambitions to become an animator.





If you thought that special effects work in television and streaming had ramped up in recent years, you’re not imagining things. More shows, bigger scenes and what might be described as a resurgence in audience taste for practical effects have put SFX, makeup effects and other on-set effects in high demand. Here, several practical effects supervisors look back at what they’ve been working on in the TV and streaming space, as they highlight specific gags and effects sequences in Gangs of London, Lovecraft Country, 9-1-1: Lone Star and His Dark Materials. THE ART AND SCIENCE OF EFFECTS: GANGS OF LONDON

TOP: A careful mix of fuels, mortars and debris made up the explosions for Gangs of London. (Image courtesy of Alexander Gunn)

In the Sky Atlantic series Gangs of London, several shocking moments portraying the brutality of international gangs tearing at the seams of the British capital were brought to life with practical effects. Overseeing that work was Special Effects Supervisor Alexander Gunn of Arcadia SFX. “Gangs of London actually



encompassed virtually every type of physical effect technique. We had rain, wind, snow, smoke, atmosphere. We had fire, we had explosions, we had bullet hits. We had blood squirts. We had people being blown to pieces.” Gunn identifies a scene of a man being killed with a cattle stun gun as one of those key moments. “We built the actual gun, which had moving parts and which the actor could hold. You could actually put it against someone’s head, and it would appear to thump into their skull. The performer had this twin-chambered system on them with flexible pipes to make the blood spurt out. It was also a controllable system to show that the blood spurts slow down as the heartbeat slows.” A shoot-out in a campsite had a similar physical quality. Gunn notes it was heavily previs’d by series creator Gareth Evans, which helped inform the frenetic action as bullets and debris fly. “We had about 20 or 30 bullet hits going off that were rigged into the set and then the rest of it is done with air cannons. We had loads of silicone

TOP: A ship that characters of Lovecraft Country find themselves in proves to actually be underwater, and therefore a set piece that Special Effects Supervisor J.D. Schwalm needed to flood. (Image copyright © 2020 HBO) BOTTOM LEFT: A snow-covered location on His Dark Materials, as generated by the Real SFX team. (Image courtesy of Danny Hargreaves) BOTTOM RIGHT: Flash-burn makeup effects work for 9-1-1: Lone Star. (Image courtesy of Jason Hamer)




glass, feathers for the pillowcases all being shot up. We literally turned it into a maelstrom of crap flying around.” Then there’s the major house explosion. Gunn’s team set off rigged mortars and explosions by hand rather than via electronic firing boxes. “You feel it as it goes,” he says. There was also a very deliberate way the blast was ‘designed’ to look as gritty and dramatic as possible. “With the fireball, if you look at it, it’s got a halo of black behind it. Because there’s so much incidental light coming from the fireball, what I can actually do is frame that with mortar pots that are filled with dark earth and dark debris. We fire the fireballs off, then you fire your debris off. Then you fire another fireball off in the center, and then your big debris cannons behind that. “You want this stuff to come up behind it, give a black wall, so that your fireball is now not lighting up the house,” adds Gunn. “Instead, it’s actually got black behind it, and that actually makes the flame a lot dirtier. And in bright sunlight, it gives you some lovely, rich colors, and it matches the exposure levels of the daylight itself. It is quite an art and a science.” MOLOTOV COCKTAILS AND GENERAL MAYHEM IN LOVECRAFT COUNTRY

TOP LEFT: Convincing and safe interior fire burns were delivered by J.D. Schwalm’s team on Lovecraft Country. (Image copyright © 2020 HBO) TOP RIGHT: One of Gunn’s mandates on Gangs of London was to involve the stunt performers as much as possible in the blasts themselves. (Image courtesy of Alexander Gunn) BOTTOM: For actor interactions with the polar bears in the His Dark Materials, stand-ins were made and puppeteered during shooting. (Image courtesy of Danny Hargreaves)

Fiery effects were amongst a myriad of practical effects work required for HBO’s Lovecraft Country, which also included creature interaction work, blood hits and mechanical rigs. One particular necessary gag was ‘working’ Molotov cocktails, which actors playing a mob of men could throw in a number of scenes. “They were as real as one can get for a Molotov cocktail,” outlines Special Effects Supervisor J.D. Schwalm from Innovation Workshop. “They were made from candy glass, and we also experimented with different fuels cut down with water, with a wick that was soaked in a white gas or an alcohol fuel that would burn a nice bright yellow flame and look just like they needed to.” Indeed, pushing for the real was a large part of the series, and it was further evidenced by another fiery effect created by Schwalm’s team for an interior burning room. “We built the entire set outside out of steel, and then we built a huge tent around it with propane pipes and put fire sprinklers in. Everything was controlled with big automated manifold systems. We were able to deliver a pretty intense fire, safely, in pretty close proximity to the talent.” The talent also got close to the action for a scene that sees the characters enter a fantasy world and find themselves in a ship



submerged underwater. “That entire set had to be built underwater, so my team fabricated a 100-foot-diameter round pool,” says Schwalm. “Then construction came and built their set on top of our pool and then we filled it up with water. We could control the level of the water in the whole pool, and we could also control the current of the water, the heat, the temperature and the cleanliness, depending on if they needed it clear or murky.” At one point the ship’s windows break and water flows in. “To do that,” explains Schwalm, “we built these giant dump tanks and had 10-foot-diameter round plastic pipes filled with about 50,000 gallons of water. On ‘action’ there were trap doors that opened immediately, and the water went down a chute and then burst through those windows and onto the set. I got a lot of text messages the day after that episode came out with people saying, ‘How did you do that? Was that practical?’” GRISLY BUT REALISTIC: THE MAKEUP EFFECTS OF 9-1-1: LONE STAR

Given that the Fox series 9-1-1: Lone Star deals with the rescue stories of fire, police and ambulance departments, it’s not hard to imagine the need for gruesome makeup effects for accident victims and others facing medical emergencies. Indeed, that’s what special effects makeup designer Jason Hamer of Hamer FX was called upon to create for the show, in spades. “The work ranges from burned bodies to broken limbs to conjoined twins connected at the head,” details Hamer. “For that last effect, we had these fiberglass skullcaps that were ratcheted onto the heads of our actors, glued down with medical tape, and then the appliances go over that. We actually built an electromagnet that when charged up made the two pieces stick. The actors were twin brothers who lived together, which also helped curb any COVID-19 stress about them being so close to each other.” More common effects in the show were burned bodies and burned appliances. For flash-burn victims, Hamer and his team developed an outer skin layer with Baldiez, a plastic cap material

The Practicalities of Practical Suits Netflix’s space series Away called on the astronauts of the show to be fitted up in spacesuits that resembled those of the Apollo/Gemini NASA missions. This work was handled by Legacy Effects. “We did a lot of research from the era and tried to stay as close to that period as possible,” outlines Legacy Effects Co-founder John Rosengrant. “We translated the designs to something comfortable and functional that actors could wear, while ensuring purposeful details that reflected actual function. To stay within budget constraints, we had to make the suits work for both IVA and EVA filming. We made boots and gloves that would translate for both purposes, and introduced spacer knit to the interior of the suits to make them believable both inside and outside the spaceship.” Legacy’s team collaborated with the show’s design team, led by Costume Designer Kimberly Adam, on materials both aesthetic and functional, utilizing coated Cordura and spacer knit as the primary fabrics. “Fitting the actors was a primary concern to ensure that they would feel comfortable, but authentic in the suits,” notes Legacy Effects Fabrication Department Head Marilyn Chaney. “A few of the actors experienced slight claustrophobia at the initial fittings, which actually helped with the authenticity of the suits as a whole. We worked with each individual actor in the suits to address issues about fit, comfort and function to ultimately enhance the believability.”

TOP: Ray Panthaki and Hilary Swank in Away. The spacesuits by Legacy Effects referenced early NASA flight suits. (Photo: Diyah Pera. Copyright © 2020 Netflix)

LEFT: One of the main facial-injury makeup effects gags orchestrated by Jason Hamer and his team on 9-1-1: Lone Star. (Image courtesy of Jason Hamer)




that was sprayed onto a garbage bag. “We get these sheets of skin from that, and by applying it to the face or the area that’s burnt, you could create this outer layer of skin that is charred. We ended up doing really thin pieces of silicone that worked as under skin.” For some burned firemen, Hamer consulted reference that showed how the top layer of skin peels back from the under layers. “The outer layer is very black and charred, but then you have this really white, pale under layer.” “To create that look,” continues Hamer, “we had polyfoam bodies with armatures in them that we could pose in really strained positions. We covered the whole body in a drying blood to give it an undertone. And then while the blood was still wet, we sprayed glue and a thin layer of plastic on top. We spray-painted it with flat, black spray paint, and then we heat-gun everything. That black layer shrivels up and creates that outer skin layer, and it exposes that polyfoam covered in the blood underlayer. It was such a simple technique that was very quick and very affordable, but also very effective.”

HIS DARK MATERIALS: LET IT SNOW

TOP LEFT: Body injuries formed part of the main work on 9-1-1: Lone Star. (Image courtesy of Jason Hamer)

TOP RIGHT: 9-1-1: Lone Star requires a multitude of medical emergencies and conditions to be replicated. (Image courtesy of Jason Hamer)

MIDDLE: A fiery scene in Lovecraft Country. (Image copyright © 2020 HBO)

BOTTOM LEFT: A Real SFX crew member distributes fake snow on a His Dark Materials set piece. (Image courtesy of Danny Hargreaves)

BOTTOM RIGHT: The cattle stun-gun rig setup on the performer for Gangs of London. (Image courtesy of Alexander Gunn)

Environmental effects are a significant part of BBC/HBO’s His Dark Materials, especially the generation of snow. Special Effects Supervisor Danny Hargreaves from Real SFX explains the on-set techniques for making non-snow products look like the real thing. “To make snow, we use a machine called a Krendl, which had its original life as an insulation machine where you fire foam or paper into an attic, for example. For spreading out snow, you get bags of fake snow, which is recycled paper that has been dyed white. You put that into this machine. It churns it up with blades and then it fires it down a tube, and then at the end of this tube, it’s got water jets. The paper itself hits the water, and then it turns it into papier-mâché-like material. It very much behaves like snow.” “It’s a very quick way of laying a large area of paper or white down on the floor,” adds Hargreaves. “When you have a set that’s built primarily of foam, it’s just a really natural product that goes on top of the foam, and you just create these lovely waves. All of the sets that you see in His Dark Materials were completely fake and totally paper.” Hargreaves’ other principal challenges during the making of His Dark Materials’ two aired seasons have also included mechanical effects for the balloon-flying sequences achieved with motion bases, and enabling interactive effects for scenes involving the polar bears. Here, of course, the bears would ultimately be CG creations from visual effects studio Framestore, but on set they were brought to life with partial foam pieces, stuffies and puppeteering, plus on-set practical gags. “The integration of practical effects with visual effects – that is, making things move on set when there was going to be a CGI creature added in later – was a huge challenge for us,” states Hargreaves. In fact, Hargreaves makes particular mention of the importance of the co-existence of special effects and visual effects on a show like His Dark Materials.
“We need them and they need us, and I think I’m quite happy to keep it that way. There are certain things that they like to do, while I still try and claw onto the practical elements as much as I can. We come under the same umbrella.”






Images courtesy of Adobe, Autodesk, Foundry, SideFX, Cinefade and Epic Games.

TOP: The physically-based renderer Arnold.

OPPOSITE TOP: Unreal Engine by Epic Games is allowing for more in-camera effects to occur on set in real-time, such as with The Mandalorian.

OPPOSITE BOTTOM: Cinefade was frequently used by cinematographer Erik Messerschmidt while shooting Mank.

Despite the ability to create fantastical worlds and creatures digitally, the majority of the work for the visual effects industry is focused on making unnoticeable alterations, whether it be painting out rigging, extending sets and locations, or doing face replacements for stunt doubles. Leading the way in creating the tools and technology to execute these invisible effects are software companies Autodesk, Adobe, Foundry and SideFX, as well as Epic Games and Cinefade. “It’s incremental advancements in the tools up and down the line with periodic leaps sprinkled in there,” believes Ben Fischler, Industry Strategy Manager at Autodesk. “I spent years doing lighting and rendering in compositing. If you look at the move to path tracers like Arnold and physically-based shading, lighters can think the same way that a DP would on set. The integration of Arnold with Maya and 3ds Max is one of the biggest pieces that we now have. Arnold has both a CPU and GPU mode, and for doing look development and lighting work on a desktop it is incredibly fast, and you can be rendering in the Maya viewport.” Where the technology has evolved most recently is with in-camera visual effects. “It’s a process that is changing the future



of all visual effects,” notes David Morin, Industry Manager for Media & Entertainment at Epic Games. “With in-camera visual effects, the greenscreen is replaced with LED displays on set while shooting live-action elements. This can enable in-camera capture of both practical and digital elements, giving the director and cinematographer more access to the final look of a scene earlier than ever before. This is an important step forward for invisible effects.” “Creating invisible effects has always been much of the ‘bread

“I look at machine learning as the assistant you wish you could hire rather than the thing that is going to replace you. We don’t want to replace people with robots.” —Victoria Nece, Senior Product Manager, Motion Graphics and Visual Effects, Adobe




TOP THREE: Content-Aware Fill in Adobe Photoshop; After Effects uses machine learning to fill in the background when objects are removed.

BOTTOM: Solaris is a suite of look development, layout and lighting tools that enables the creation of USD-based scene graphs from asset creation to final render.

and butter’ of Foundry tools, including Nuke, Katana, Mari, and even in the early days of Foundry’s Furnace toolset for rig removal and clean-up tasks,” states Christy Anzelmo, Senior Director of Product at Foundry. “Nuke’s underlying ethos is to give the artist technical and creative control of what is happening in their shot to achieve those high-quality results. Its highly scalable processing engine, industry-standard color management workflows, along with the ability to quickly set up projections make Nuke well-suited for creating convincing digital matte paintings and other invisible effects. Mari also works well with the very high-quality textures needed for creating photoreal CG elements, with GPU-enabled workflows that enable artists to have a fluid creative workflow when texturing.” “At SideFX, we have a relentless drive toward cinematic/ photoreal quality throughout the pipeline and continually push for significant improvements in our tools for creating character FX [cloth, hair and fur], crowds, water, fire, smoke, destruction and world building,” remarks Cristin Barghiel, Vice President of Research and Development at SideFX. “Houdini Digital Asset technology provides the ability for a technical artist to build and package up a powerful and complex tool and give that asset to a creative artist who can use it even without knowing Houdini. Solaris is a suite of look development, layout and lighting tools that empower artists to create Universal Scene Description (USD)-based scene graphs that go from asset creation to final render. Solaris integrates with USD’s HYDRA Imaging Framework for access to a wide range of renderers such as the new SideFX Karma, Pixar RenderMan, Autodesk Arnold, Maxon Redshift, AMD ProRender and more.” “The professional video workflow crosses many apps,” states Victoria Nece, Senior Product Manager, Motion Graphics and Visual Effects at Adobe. 
“Depending on someone’s role, they are probably using half of Creative Cloud, but it’s a different half than a person sitting next to them. We look at the professional tools as working together as building blocks of that pipeline. But there are also new tools that make things simpler for less experienced users. Premiere Rush is a space where we see there’s an opportunity for more emerging filmmakers, like social video creators, to do it all in one. We tier things to the audience and how much precision and control they need. Someone who is doing more advanced visual effects will want to dive into After Effects where they have every button and knob to dial in the final results.” Autodesk has placed an emphasis on producing more procedural tools. “The best example is the Bifrost Graph in Maya,” states Fischler. “Think of it as a visual programming environment. Teams or individual artists can build custom tools that then can be scaled up to entire teams. You can think of Bifrost as an effects or rigging tool and different solvers can easily be integrated into it. Bifrost is currently part of Maya, but we’re also exploring bringing it into other tools as well. Because it’s a plug-in, the Bifrost development team can iterate independently of the Maya teams. From a software development standpoint, plug-in architecture is smart software design because it’s modular. That allows you to build in such a way where there are fewer dependencies between other parts of



the software. Maya has been around for years, but the code base is constantly evolving, and the team over the last few years has made a big push to make it more modular under the hood. Some of the things that you’ve seen recently are animation caching features, which have allowed for huge performance gains. The Arnold renderer has a plug-in to Maya, which means when you’re using Maya it doesn’t feel like you’re in a different environment.” “In recent years, there has been a push toward helping artists work faster and reduce the often manual work needed to create precise invisible effects,” notes Anzelmo. “One example of this in Nuke is the development of the Smart Vector toolset, which uses a unique type of motion vector to automate the warping of paint and textures over several frames, dramatically speeding up the process when working with organic surfaces like fabrics or faces. Since their release, the tools in Nuke that use Smart Vectors have grown to include the Grid Warp Tracker in Nuke 12 and take advantage of GPU acceleration to generate the Smart Vectors on the fly without a pre-rendering step. Mari has also expanded its toolset for painting digital assets from photographs, making it easier to populate virtual production environments with high-fidelity content.” SideFX has developed a series of solvers to help with the creation of cloth, fire and soft body objects. “Cloth continues to be a major focal point of the Vellum solver, with key improvements to mass/scale/topology invariance, collision robustness and recovery from tangling,” states Barghiel. “Velocity blending and relative motion controls offer more stability during high-speed scenarios. Vellum also has new sliding constraints to create unique effects. The Houdini Content Library now includes a collection of eight different fabrics with unique physical behavior and shaders per fabric. These include silk, velvet, wool, leather, jersey, raincoat, tulle with embroidery and jeans. 
With the addition of a new Sparse solver, artists can now create more impressive fire and smoke shots with detail where they need it. With the solve only taking place in the active parts of the simulation, processing time is cut into a fraction of what was required previously. Whether artists are flattening dough, squeezing gummy bears or adding jiggle to a seal, Houdini lets them choose between the Vellum solver for speed or the Finite Elements [FEM] solver for accuracy. FEM now offers a fast, accurate and stable global-nonlinear solver, a fully-symmetric solver with built-in Neo-Hookean material model and robust recovery from partially penetrating tetrahedral meshes. FEM also has significantly improved handling of fast-moving objects.” “A big part of traditional visual effects, whether or not they’re invisible, is rendering,” observes Morin. “It has historically taken a long time for computers to process the algorithms that define what makes up a visual effect. You would have to program your visuals and then let them render frame-by-frame, usually overnight. As part of this process, visual effects artists developed algorithms to represent simple things from rocks to complex things like realistic water, fur and human hair and skin. As computing power has become faster and more accessible, those algorithms can run faster and faster. The video game industry has contributed a lot to that speed shift as video games have always required real-time graphics and visual effects image processing. The game industry

TOP TWO: Bifrost Graph is a node-based, visual programming environment that enables users to construct procedural graphs to create effects such as sand, fire, smoke and explosions.

BOTTOM TWO: Using the Soft Selection in Nuke. An example of the Grid Warp Tracker in Nuke.




TOP: Volumetric clouds created by using Bifrost Graph.

BOTTOM THREE: Absolute Post creates a Houdini water simulation for Outlander.

made a massive investment in making those algorithms work in 1/30th of a second, which is the required speed for continuous gameplay. Today we’re benefiting from that effort in the visual effects industry with tools like Unreal Engine, which was developed for games, but is now available to filmmakers and content creators to do part or all of their work in real-time. This has also benefited creative flow across teams as production designers can build sets in Unreal Engine and achieve photoreal results right there in the art department. We’ve also seen artists like production designer Andrew Jones use these techniques to great effect on The Mandalorian and creature creators like Aaron Sims use these virtual production tools for character development as well.” “One of the biggest things that we’ve seen transforming the industry is machine learning,” observes Nece. “What was incredibly painful to do by hand and could take days or even weeks to get right, you can do in minutes because of machine learning. We’re particularly excited about Roto Brush 2 in After Effects, which shipped last fall. That uses Adobe Sensei, which is our machine learning technology, to generate a matte over multiple frames. You could select a person or object in your scene and it will track it from frame to frame with incredible precision. It’s introducing the ability to do roto in places where it wasn’t possible before because of a deadline or budget. You see that speedup and that speaks to the other piece of this, which is the democratization of it. You can do this kind of work even if you’re working on your own. You can have a car go through the background, or power lines, or someone in a shot; that’s a space where the visual effects are even more invisible because no one assumes that they’re even doing visual effects in the first place.” “Content-Aware Fill originated in Photoshop and now we’ve brought it into After Effects,” remarks Nece. 
“That lets you take a piece of your image and either fill it in with another piece of your image or, in the case of After Effects, fill it in with something from a different point in time. A colleague calls them ‘time-traveling pixels.’ You have a car go through the background in your shot and it’s a distraction. You want to remove it. You can mask out that car loosely and fill in that hole. You can do it either automatically and it will guess what belongs there based on the other frames in the shot or you can give it more information. We always want to make sure it’s not a black box and that these advanced algorithms are things that you can control, art direct, adjust and improve on so the hand of the artist is still there. For instance, you might need to paint in a piece of road and it will figure out how to move that from frame to frame. And we just put lighting correction in that space. If you had a shot that changed over time in brightness, maybe the sun came out or you had a reflection that shifted, it can now compensate for that. It takes pixels from other points in time to figure out what to put in the space that you’re trying to fill. I look at machine learning as the assistant you wish you could hire rather than the thing that is going to replace you. We don’t want to replace people with robots.” “The Foundry approach is to explore machine learning as a means to accelerate repeat image processing tasks, such as up-resing footage, and to assist artists rather than removing the



artist entirely,” explains Anzelmo. “We have seen some cases where ML can return lost detail or correct an image that would otherwise have been unusable, which in itself is exciting! Today, artists are used to applying an effect and then tweaking its parameters until the effect does exactly what they want. With machine learning, the quality of the result depends highly on how the model is trained, and there isn’t the same ability to tweak the model. The Foundry’s research efforts led to the creation of the ML-Server, an open-source client-server tool for training machine learning models directly in Nuke. Based on learnings from the ML-Server, we have some other exciting projects in the works.” “There have already been examples of AI and machine learning being used in production with Houdini such as Spider-Man: Into the Spider-Verse,” notes Barghiel. “While we don’t see AI and machine learning as widely adopted in the standard media pipeline today, it may someday be a critical part of the creative pipeline. For now, we see the greatest gains being made by using art-directable procedural workflows in conjunction with pipeline automation, to provide artists with powerful tools they can use to create iteratively, leading to the best results from everyone’s time and resource investment.” The precision of the machine learning is dependent on the quality of the data set being provided. “As we release new versions of Flame, the sophistication and the quality of the solution will go up because they continue to train it on larger and different data sets. That’s one of the cool things about it, it’s not static,” states Fischler. “The Flame team is focused on trying to solve specific problems because those are trainable things. Things like sky replacement, which came out with the last version, and face tracking. It’s when you get into the more open-ended challenges that things get tougher. 
Stay tuned because there are going to be some more Flame announcements with additional machine learning tools.” Not all of the innovation is associated with post-production, as the Cinefade system enabled David Fincher and cinematographer Erik Messerschmidt to play with the variable depth-of-field effect on several occasions during principal photography for Mank. “The Cinefade system consists of a variable ND filter that is synced to the iris motor of the camera and controlled via a cmotion cPro lens control system,” explains Oliver Janesh Christiansen, inventor of Cinefade. “The operator varies iris diameter to affect depth of field and the VariND automatically compensates for the change in light transmission, keeping exposure constant. The effect is achieved completely in-camera, giving filmmakers complete control and enabling a novel form of cinematic expression. It allows cinematographers to seamlessly transition between a deep and a shallow depth of field in one shot, resulting in a unique in-camera effect in which the foreground remains sharp while the background gradually becomes blurry, isolating the character and drawing the viewer’s attention. Cinefade VariND, which is controlled remotely, can also be used as a practical exposure tool to hide the transition between an interior and exterior location.”
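The compensation Christiansen describes reduces to simple exposure arithmetic: light transmission scales with the inverse square of the T-stop, so opening the iris admits a known surplus of stops, and the VariND must absorb exactly that surplus. A minimal sketch of the math, with illustrative function names that are not part of any Cinefade or cmotion API:

```python
import math

def stops_difference(t_from: float, t_to: float) -> float:
    """Exposure change, in stops, when the iris moves between two T-stops.
    Transmission scales with 1/T^2, so stops = 2 * log2(t_from / t_to)."""
    return 2 * math.log2(t_from / t_to)

def nd_density_for(stops: float) -> float:
    """Optical density of the ND needed to absorb `stops` of extra light.
    One stop of attenuation corresponds to a density of log10(2), about 0.3."""
    return max(stops, 0.0) * math.log10(2)

# Example: a Cinefade-style move from deep to shallow depth of field.
# Opening the iris from T8 to T2 admits 4 extra stops of light, so the
# variable ND must ramp to a density of roughly 1.2 to hold exposure constant.
stops = stops_difference(8.0, 2.0)   # 4.0 stops brighter
density = nd_density_for(stops)      # about 1.2
```

In practice the lens-control system would ramp both the iris and the filter density continuously over the length of the shot, so the exposure never visibly shifts while the depth of field changes.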

TOP TWO: Unreal Engine was utilized to provide previsualization for John Wick: Chapter 3 – Parabellum.

BOTTOM TWO: Cinefade allows cinematographers to gradually transition between a deep and a shallow depth of field in one shot at constant exposure.





Images courtesy of Baobab Studios.

TOP: Baba Yaga in her hut with the boiling cauldron.

OPPOSITE TOP: A storybook aesthetic was adopted for the visuals.

In Slavic folklore, a supernatural being associated with wildlife appears as a deformed old woman and lives in a hut; she provides the inspiration for Baba Yaga, an interactive VR experience produced by Baobab Studios in partnership with Conservation International and the United Nations ActNow campaign that is available exclusively on Oculus Quest. As settlers encroach on a rainforest, their chief becomes deathly ill, leading her two children to embark on a journey to find a cure in the form of the mysterious Witch Flower. Serving as the writer, director and cinematographer on the project is Eric Darnell, Co-Founder and Chief Creative Officer of Baobab Studios, with the voice cast consisting of Glenn Close, Kate Winslet, Jennifer Hudson and Daisy Ridley. “While it’s about a mean old witch who lives in the forest and maybe eats children,” states Darnell, “Baba Yaga is also a story about recognizing our connection with the natural world. In doing so, perhaps we’ll be more inclined to live in harmony with it.” Since the release of Invasion! in 2016, the independent interactive animation studio established by Darnell, CEO Maureen Fan and CTO Larry Cutler has gone on to win six Emmy Awards and two Annies. “It has been an amazing journey for us,” remarks Darnell. “When we made Invasion! it became clear to us that VR is not like cinema. It’s a different medium. We recognized, which is obvious now, that our superpower is immersion. You can look around, but if you’re running in real-time then we can have interactivity. Characters can respond to what you do. It got us to think about how we can tell stories that make the viewer the main character and feel that their choices are meaningful and matter. I like to think of story as being primarily about characters making big decisions that reveal something about themselves.”



“What I’m most interested in is how people react and respond to the opportunity to go down more than one path, and do people find that compelling enough to go back and try it again? That’s the hope. This is the most complex interactive project that we’ve done, particularly when it comes to giving the viewer the opportunity to make their own decisions about how the story is going to play out.” —Eric Darnell, Co-Founder and Chief Creative Officer, Baobab Studios

Allowing viewers to impact the narrative with their choices means that multiple outcomes need to be produced. “It can get complicated,” admits Darnell. “We explored that more in Baba Yaga, where there is more flexibility in how the story can flow and finish. But the other thing that we have realized over these last five years is sometimes it’s the little things that make the viewer feel engaged and connected with the world. In Baba Yaga, there is a moment where another character offers you a lantern, and more than one person has commented on how it felt magical that she reaches out to you, you raise your hand, and suddenly you have this lantern, and are able to point it around to see the dark forest better.” Machine learning has enhanced the interactivity and immersion as characters can believably respond to the actions of the viewer. “Bonfire was the first time that the viewer was the main character and has to interact with this alien creature that is hiding out there in the dark jungle,” states Darnell. “You can pick up a piece of food and offer it to Pork Bun or throw a flaming log at him. Depending on what you do, the character has to be always ready to respond in a way that makes sense. If you throw the flaming log, it might run away, hide for a minute and come back out when it starts to feel

safe again. We have to find ways to not only be able to react to the range of things that the audience can do, but react to a frequency of doing certain things. When you combine that with trying to find ways for the AI system to work with handcrafted animation, for it to pick from this big library of potential actions, insert them into the scene, and actually modify everything at the same time because it matters where the character is, how they’re oriented to the viewer and where the object of its desire is, it became a complex problem that ultimately paid off for us.” As with the rest of the world, the pandemic caused Baobab Studios to adopt a remote workflow for the production. “Our team had become international and diverse already,” explains Larry Cutler, CTO at Baobab Studios. “For Baba Yaga, co-director Mathias Chelebourg lives in France and so did our original production designer, Matthieu Saghezchi. One of our lead animators was in Poland and our modeler was in the Philippines. We had cracked the challenge of how to build toolsets that enable you to review dailies in VR with everyone present, so when we decided to have everyone switch to being remote in early March 2020, it didn’t change our process that much. There were certain things that we said like, ‘The only way you could actually shoot all of




the 2D footage [for the cinematic version] in VR was if we were all together in the screening room.’ But we were able to do that remotely, which was crazy.” Working within a new medium meant that Baobab Studios had to develop a holistic toolset called Storyteller. “All of our animation is done in Maya and the real-time work in Unity,” states Cutler. “We built an entire toolset called Storyteller on top of those amazing platforms, as well as things that are standalone, such as our own shot database.” Human characters make their debut in Baba Yaga. “Over time,” adds Cutler, “the animation industry has been able to do convincing humans, but they involve simulated clothing, hair and complex rigs. To do that running at 72 fps, rendering both eyes and having systems that are reactive, was daunting for us but exciting to take on for Baba Yaga. We had to have clothing that was believable, hair, nuanced facial performances and great animation: all of these things had to come together. We built our own subdivision surface rendering system within Unity, so you could get sharp creases around the eyebrows and nose of Magda [Daisy Ridley]. Eric wanted the 2D version to still be told from a first-person emotional POV. That meant it had to be handheld and not feel like a single-camera move. As a result, we built a whole toolset for Eric to shoot all of the camerawork himself in VR.” Breaking the fourth wall between the characters and viewer was an area of concern for Ken Fountain, Animation Supervisor at Baobab Studios. “We had to come up with some cool workflow techniques in order to allow what we were doing to connect with the automated look-at system that we had developed. We



were worried about the tiniest details of eye convergences and micro-expressions in facial features that you normally don’t spend a lot of time thinking about because you’re not trying in a movie to persuade an audience to actually do something. Oftentimes in animation, we simplify in order to get the point across more clearly. When you have a character that is two feet away from your face, you need the complexity, so knowing the musculature of the face and making sure that the animators are all using it the same way does add a level of complications to it.” “We had to design our workflow based on branching narratives,” explains Fountain. “Our scene structure had to develop its own language. What we would call different clips of animation, how we would group things together as scenes and pieces of that scene and layers of that scene. The joy of working in a new technology is nobody has written the workflow yet. What makes a difference is how well you have mapped out those ‘what if’ situations ahead of time, because AI is not yet sophisticated enough to be able to do a lot of the heavy lifting. A lot of the heavy lifting has to happen in planning the scene and saying, ‘We’ve tested this with some user tests and have discovered that they want to do this.’ We need to plan the branching action to be able to accommodate what the user wants to do or what they might choose.” Maya has the ability to layer animation. “We don’t want this to look like a bunch of clips stitched together,” adds Fountain. “It needs to look like a hand-animated performance everywhere. We worked with our engineers to come up with great blending strategies, and ways to layer parts of animation to hide transitions and make things feel more continuous. How could we harness what Maya can do and send

OPPOSITE TOP: The hand tracking allows for the Baby Chompies to follow the fingers of the viewer.

OPPOSITE MIDDLE: A screenshot taken from the prologue.

OPPOSITE BOTTOM: Despite being deathly ill, the Chief is determined to protect her children.

TOP: Helping Baba Yaga to emote despite wearing a mask was the varying intensity of her glowing eyes.

BOTTOM: Theatrical lighting is used extensively throughout Baba Yaga.




that to Unity in a way that Unity can use it modularly and be able to blend things together? That came through processes developed across the last three productions.” Impacting the effects style was the pop-up storybook aesthetic of the prologue and epilogue, as well as the decision to adopt a stop-motion approach towards the characters. “Because the whole world of Baba Yaga feels hand-drawn, a lot of our effects are hand-drawn,” notes Nathaniel Dirksen, VFX Supervisor and Head of Engineering at Baobab Studios. “Rather than using Houdini for effects, we would use Procreate. We actually had an animator draw fire on their iPad, animate fire in Procreate or Blender, and do 2D effects. We hadn’t done 2D effects and brought them into VR before. The Witch Flower had a lot of elements going into it, so we used Houdini to maximize the art direction in order to get the effect that we wanted. In the 2D version, we redid the Witch Flower as a 2D effect that was hand-drawn. In the VR piece, the Witch Flowers are a lot further away from the action, so the effect that we had done didn’t hold up well when we got them into the 2D version [where they appear much closer to the camera].” The Oculus Quest is an untethered headset with a GPU closer to a smartphone’s than to a powerful computer’s. “If you give the user the ability to shine a light around the scene, you’ve just made life harder, but it is cool, so you do it anyway!” laughs Dirksen. “Doing real-time lighting is more expensive computationally, so on the Quest, which has limited rendering power, we had to be careful about how much extra computational complexity we had to add in. We wanted to have this moment of Magda giving



you the lantern to rope you into going on the journey with her.” The lighting setups had to be precisely scripted. “The way that things turn on and off is carefully choreographed to make sure that it works,” Dirksen adds. “The forest has a cool, blue-ish, layered background, giving this notion that it goes off into the distance. That starts to fade out and be simplified once the flytraps come out. We have to make sure that we’re not having too many expensive things onscreen at the same time. You set it up in the beginning, establish that it’s there, people get a feeling for the forest, darken things down and focus on a particular area of the scene. Choices like that are helpful in terms of focusing the viewer but also keeping our render budget in line.” Orchestrating the ADR sessions turned out to be extremely complicated because of the safety protocols caused by the coronavirus. “Glenn Close, who plays the mother, already had all of her dialogue recorded, but we still needed to get Kate Winslet, Jennifer Hudson and Daisy Ridley to do all of their voice recordings,” recalls Scot Stafford, Founder and Creative Director of Pollen Music Group. “None of them had home studios. We found a pandemic-proof studio [for Daisy Ridley in London], a pandemic-proof remote recording operation that goes into people’s homes, builds it out and then takes it all down [for Kate Winslet], and an in-house solution in Chicago for Jennifer Hudson [with the engineer who records a lot of her music].
There were several takes that weren’t usable, but we got all of the good ones by the skin of our teeth.” The sound design was built around the idea that the story takes place in an Arctic rainforest. “I thought that concept was fascinating,” remarks Stafford. “I was inspired by some of the otherworldly sounds that you hear as the ice cracks and melts on a frozen lake. There are these crazy squeaks, swoops and sine waves. I created a creature out of those noises that would sound like a birdcall. You hear it three times very briefly. As you progressed, there were different states of immersion, from stylized stagecraft, to being intimate and domestic, to being immersed in this incredibly powerful forest.” “Making sure that different scenarios actually had a satisfying through line was the biggest accomplishment on this film,” notes Cutler. “All of these other pieces feed into that.” At the conclusion of Baba Yaga, viewers make a fateful choice, determined and implemented through the hand gestures they decide to use. “What I’m most interested in is how people react and respond to the opportunity to go down more than one path, and do people find that compelling enough to go back and try it again?” states Darnell. “That’s the hope. This is the most complex interactive project that we’ve done, particularly when it comes to giving the viewer the opportunity to make their own decisions about how the story is going to play out.”

OPPOSITE TOP: The lighting setups had to be carefully choreographed to make sure that everything would render properly in real-time. OPPOSITE BOTTOM: Various details are placed in the background to further the storytelling as well as to draw the viewer into the environment. TOP: Magda is the first human character created by Baobab Studios, and it was important for her to feel present all the time, from her hair and skin to her clothing and facial expressions. MIDDLE: Magda’s clothing was hand-animated to give it a tactile feeling. BOTTOM: The final render of Magda offering the lantern to the viewer, considered to be one of the most engaging moments in Baba Yaga.





These days, television shows and streaming series typically require hundreds, if not thousands, of VFX shots to be managed over a season. But among that multitude of visual effects, there’s often one shot or sequence that stands out as trickier than the others. It might stem from a highly technical shoot, or because a very specific piece of action is required. Sometimes a large number of iterations are carried out on the shot, or the sheer complexity of the VFX called upon to tell the story point makes the scene a tough one. Several effects practitioners share the toughest scenes from their recent television and streaming experiences, with answers ranging from detailed creature animation on The Crown, to intricate shot choreography for The Mandalorian, to managing a live-action explosion shoot on The Stand, to the sheer VFX delivery task at hand on Raised by Wolves.

THE CROWN: MAKING A CG STAG LOOK MAJESTIC

TOP: Maintaining a majestic nature was one of the crucial briefs given to Framestore for the stag in The Crown. (Image copyright © 2020 Netflix)

Season 4 of The Crown features a number of scenes in which members of the royal family are hunting a stag, an animal that appears in the show as a completely CG creation by Framestore. For Creature Supervisor Ahmed Gharraph, who is also Joint Head of CG at Framestore, the stag was one of the toughest things to realize, since it had to match the drama’s sense of grounded reality. “Our aim,” says Gharraph, “was to convince viewers that the stag on their screen was real, rather than a good digital imitation. It’s a goal that’s difficult to achieve, especially on a TV budget, but for us



this goal was vital for the integrity of the show.” Furthermore, the stag would be a key part of the first shots of one of the show’s early episodes, including an opening shot that was almost 500 frames long, “in broad daylight and in 4K resolution,” adds Gharraph. “There was absolutely nowhere to hide. There were wide shots as well as full close-ups of the stag, so the asset had to hold up at all distances.” To help with crafting the creature, Framestore of course considered photo and video reference of real stags. The shoot also made use of a stand-in blue silhouette stuffy as a size guide, while a 2D

TOP: The Krayt dragon in The Mandalorian was almost never shown in its full form, but instead submerged beneath the sand or in its cave. (Image copyright © 2020 Lucasfilm Ltd.) BOTTOM: A combination of elements was composited together by Important Looking Pirates to form this dramatic shot of the blast in The Stand. (Image copyright © 2020 CBS All Access)




animatic provided by the client informed Framestore in terms of staging and animation. In fact, animation was another tricky part of the shots, since one of the main briefs was to maintain a sense of majesty, even after the stag is shot and collapses. “We would see him again throughout different parts of the episode,” notes Gharraph, “where he would continue to hobble on an injured leg and look progressively more tired and worn down, but always having to look majestic.” The final principal challenge for the stag came from its fur and convincingly deforming it with an underlying set of muscles, skin and fat. “We generated approximately 15 million hairs, which were then covered in dirt, grass, water droplets, clumps of mud and so on,” describes Gharraph, who still marvels at the level of detail his team went to for the creature. “We even grew moss in the crevices of his antlers.”

THE MANDALORIAN: STORYTELLING TIME WITH A KRAYT DRAGON

TOP: Original plate for a Mother flying scene in Raised by Wolves. (Image copyright © 2020 HBO Max) MIDDLE: A CG representation of actor Amanda Collin for her transformation into a necromancer. (Image copyright © 2020 HBO Max) BOTTOM: The final Mother shot by MR. X. (Image copyright © 2020 HBO Max)

Industrial Light & Magic had plenty of technical challenges in creating the Krayt dragon for Episode 1 of Season 2 of The Mandalorian. Among them were building a massive beast with multiple legs, making the creature ‘swim’ through sand, and finding ways to cleverly hide the full extent of it underneath the ground. But what also made the Krayt dragon scenes particularly tough, according to ILM Animation Supervisor Hal Hickel, was figuring out the key storytelling beats of the scenes. For example, at one point, the Mandalorian (Pedro Pascal) manages to get swallowed by the Krayt dragon, only to blow it up from the inside. “Figuring out exactly how much of a big cinematic moment to make that and what angles we were going to use took a lot of time,” says Hickel. “We did lots of different variations on that moment.” Earlier, the Krayt dragon is shown exiting its hideaway cave and fired upon by Tusken Raiders with ballistas. It comes out of the cave, backs up and returns again. These beats were things that Hickel and the whole creative team, including the episode’s director, Jon Favreau, had to consider. “We had to keep re-thinking, ‘Well, what are the goals of the heroes? What are the ballistas for?’ Because they don’t seem powerful enough to keep it from going back in? Oh, they’re not to keep it from going back in. They’re to piss it off.” In crafting the Mandalorian’s final battle with the Krayt dragon,



ILM was afforded the flexibility of many of the assets being computer-generated – including the lead character at times – although the shots were a combination of live-action filmed on an L.A. backlot with both the principal actors and skilled stunt performers. “Whatever techniques we use to finally produce the shots,” notes Hickel, “Jon’s motivation was always about, how does it make him feel? If he doesn’t feel the thing he’s supposed to be feeling, whether it’s a laugh, whether it’s a build-up to a kind of a crescendo or an exciting release moment, whatever it is, if he isn’t feeling it, he knows it in his gut. And that final big bang was the button on the whole sequence, so it did need to be just right. It’s not surprising it got a lot of extra scrutiny and discussion.”

THE STAND: MOVING QUICKLY WHEN THINGS DON’T QUITE GO AS PLANNED

Episode 6 of The Stand has a fiery ending, with a scene in which explosives go off in one of the characters’ homes. A practical explosion on a purpose-built house set was filmed in one take with multiple cameras shooting at various frame rates, including Phantom cameras running at 300 fps. Several stunt performers on wires were also filmed being engulfed in the blast. While the explosion was suitably spectacular, a large portion of the footage proved too overexposed to use. “It was a huge disappointment,” admits Visual Effects Supervisor Jake Braver, who was also a producer on The Stand and was second-unit directing that sequence. “It then immediately became a plan to salvage the shots and do a massive restoration effort.” “I walked off set for maybe 20 or 30 minutes, went around the block and had a think about what the options were,” adds Braver. “Our explosion permit was only valid at that location through the following day – they weren’t going to let us back again any later than that – so I was weighing all of the production needs versus all of the visual effects needs.” The same night, Braver devised a solution that would involve restoring sections of the house in order to conduct an additional explosion element shoot, while also working out where digital

TOP TO BOTTOM: A Framestore lookdev render of the stag in The Crown, showcasing the fur simulation. (Image courtesy of Framestore) Industrial Light & Magic needed to simulate extensive sand and debris for the Krayt dragon sequences in The Mandalorian. (Image copyright © 2020 Lucasfilm Ltd.) Important Looking Pirates simulated the house explosion in The Stand, matching what was captured for real. (Image courtesy of Important Looking Pirates) Final explosion composite. (Image copyright © 2020 CBS All Access)




effects would come in (there was already a plan to use CG in the sequence to depict parts of the house breaking apart). An additional shoot did take place, along with the acquisition of bluescreen elements of performers being ratcheted through flames on matching angles. “Then,” details Braver, “Important Looking Pirates took some of the original footage, along with the clean explosion, and the stunt performers and then more elements we shot, and they undertook this remarkably painstaking job of putting it all together. They also did some fantastic simulation work to see the structure of the house rip apart.” While the original outcome of the one-take explosion shoot did not go according to plan – Braver still recalls the “knot in my stomach and dread and anxiety from seeing people looking at the overexposed image on the monitors and then looking at me saying ‘How do we fix this?’” – the ‘spring-into-action’ response made sure the final result was just as visceral for the audience.

RAISED BY WOLVES: THE TOUGH TASK OF DELIVERING NEARLY 3,000 SHOTS

TOP TO BOTTOM: Concept art by Doug Chiang for the Krayt dragon as seen in Season 2 of The Mandalorian. (Image copyright © 2020 Lucasfilm Ltd.) The bluescreen stand-in for the stag on location for The Crown. (Image copyright © 2020 Netflix) Bluescreen element shoot of stunt performers and explosion for The Stand. (Image courtesy of Jake Braver) CG element crafted by MR. X for Raised by Wolves. (Image copyright © 2020 HBO Max)

When Raised by Wolves Visual Effects Producer Ruth Hauer began working on Season 1 of the Ridley Scott series, it was already in the midst of shooting in South Africa. This presented an early, significant challenge for the visual effects team. “We had to jump in and hit the ground running,” recalls Hauer. “The facilities had not been contracted yet. We ended up with 12 vendors worldwide.” The spread of vendors around the globe would also provide a hurdle in terms of time zones for dailies and notes sessions. “And then, of course, COVID hit,” says Hauer, “which put a damper on everything, because now all the vendors had to go virtual with their artistry. So, it was challenging working with 12 vendors in different countries with tight deadlines and almost 3,000 shots. I felt like I was on a fast-moving train always trying to keep up.” The first season required a wide array of visual effects, from creatures to environments, to holograms, and digi-double-type requirements for the main android character, Mother (Amanda Collin). Hauer remembers many individual shots and sequences being hard to nail down, including a moment when the character Marcus (Travis Fimmel) attempts to kill Mother while she is plugged into a simulator. “Marcus is going to kill Mother,” describes Hauer, “and she gets up, and then suddenly rocks start lifting and she starts ‘bamming’ them with her screech. That all had to look photoreal and based in reality. It took a while to get those shots done.” Hauer worked hand-in-hand with Visual Effects Supervisor Raymond McIntyre Jr. on the show, coordinating CineSync sessions with vendors, and keeping tabs on shots, schedule and budget via a combination of a Filemaker Pro database and an Excel spreadsheet. Hauer, who had previously worked predominantly in film visual effects, says that the approach to the VFX of the show was pretty much the same as in film. “It’s the same work. The artists are the artists. The shots are the shots.
Because of who we’re working with, we have to have very high-level shot production. There is no difference between my time at Disney on major motion pictures and the level of visual effects shots on this show.”





Lens Mapping for VFX

By DAVE STUMP, ASC
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Visual Effects – 3rd Edition, Edited by Jeffrey A. Okun, VES and Susan Zwerman, VES

Lens mapping to quantify each lens is crucial to the success of VFX work. Precise measurement of lens characteristics can make the difference between expensive and economical visual effects work. That measurement is usually accomplished by meticulously shooting grid charts through a wide variety of focus and zoom positions in order to reverse engineer the design geometry of the lenses. The optics that make lenses rectilinear must be fully quantified and understood in order to integrate computer-generated images into live-action cinematography.

Before beginning any project, shoot a grid chart for the purpose of reverse engineering the optical paths of all the lenses to be used in VFX work.

Shoot the chart straight on, as large as possible in frame, completely filling the frame, level, square to the camera, and with the frame lines of the imager as precisely parallel to the chart lines as possible. Shoot the chart at the preferred stops intended to be used for most of the production. If exteriors will be shot at T8 and interiors will be shot at T3.5, then shoot the charts at both of those stops. It is acceptable to use frame rate or shutter angle to reach correct exposure. Slate every take with focal length, T-stop, focus distance, and distance from the center of the chart to the image plane.

Then rotate the chart to a skewed angle, as large as possible in frame, completely filling the frame from corner to corner. Focus so that the near and far out-of-focus areas of the chart are equally soft.

Zoom lenses complicate the issues of lens geometry because they move many elements in relation to each other in order to change focal length, making it extremely hard to understand what is happening to the image in terms of optical geometry correction. The no-parallax point in a zoom moves as the focal length changes, sometimes by a very substantial amount. In the case of zooms, it is enormously beneficial to chart the lens at every marked focal length.

Anamorphic lenses further complicate the issues of lens geometry, as they almost always have an optical path with two different no-parallax points and two different focal lengths: a normal focal length in the vertical axis and another focal length (twice as wide) in the horizontal axis. The anamorphic (squeeze) element only functions in the horizontal axis of the lens and has its own no-parallax point, different from that of the vertical axis, vastly complicating the calculation of lens distortion and geometry and making the work of integrating CGI an order of magnitude more difficult, especially in moving shots.
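The grid-chart passes described above exist to feed a distortion solver. As a purely illustrative sketch (not from the Handbook – the single-coefficient Brown-Conrady radial model, the fit_k1 helper and the synthetic grid data are all assumptions made for this example; production pipelines detect chart corners in the photographed frames and fit far richer models per focus and zoom position), recovering the dominant radial distortion term from chart-point correspondences can be as simple as a linear least-squares fit:

```python
# Illustrative only: fit a single radial distortion coefficient k1,
# assuming the simple Brown-Conrady model  x_d = x_u * (1 + k1 * r^2),
# from undistorted/distorted grid-point pairs.

def fit_k1(undistorted, distorted):
    """Linear least-squares estimate of k1 from point pairs, with
    coordinates normalized so the optical center is at (0, 0)."""
    num = den = 0.0
    for (xu, yu), (xd, yd) in zip(undistorted, distorted):
        r2 = xu * xu + yu * yu
        # The residual x_d - x_u = k1 * x_u * r^2 is linear in k1,
        # so k1 falls out of a one-parameter least-squares solve.
        num += (xd - xu) * xu * r2 + (yd - yu) * yu * r2
        den += (xu * r2) ** 2 + (yu * r2) ** 2
    return num / den

# Synthetic chart: an 11x11 grid warped with a known k1 (barrel distortion).
true_k1 = -0.08
grid = [(x * 0.1, y * 0.1) for x in range(-5, 6) for y in range(-5, 6)]
warped = [(xu * (1 + true_k1 * (xu * xu + yu * yu)),
           yu * (1 + true_k1 * (xu * xu + yu * yu))) for xu, yu in grid]

print(round(fit_k1(grid, warped), 4))  # recovers -0.08
```

The same idea scales to the workflow above: the straight-on pass constrains the distortion terms, the skewed pass helps separate focus falloff from geometry, and charting a zoom simply repeats the fit at every marked focal length.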






VES Education Initiative Connects; VES Luminary Interviews Inspire By NAOMI GOLDMAN

VES Education Initiative Builds Momentum

The VES Education Committee kicked off its new Peer-to-Peer Mentorship Program, aimed at connecting mentors and mentees from the VES membership to provide support in all disciplines of visual effects. The pilot program connected 84 members globally in the first quarter of 2021. As part of the outreach initiative, the Committee is collaborating with several schools to educate students and educators alike about visual effects and career possibilities in the field. The VES teamed up with the BRIC Foundation on mentorship and outreach, aimed at increasing diverse representation in the industry, and participated in the dynamic BRIC (Break, Reinvent, Impact and Change) Virtual Summit in March. Additionally, the Committee partnered with Access VFX, Women in Animation and Rise Up Animation to successfully connect nearly 140 members who were interested in mentoring specific age groups and demographics worldwide. The VES looks forward to watching these programs evolve into successful, ongoing service opportunities and benefits of the Society.

Get a Front Row Seat for the VES Luminary Interviews

Watch and enjoy the VES’ insightful Luminary Video Series, featuring some of the biggest names in visual effects – VFX legends, luminaries and VES Award winners from around the globe. Let their stories engage and inspire you to carry forth and create. The latest videos include up close and personal interviews with:

Chris Corbould – acclaimed special effects supervisor and coordinator
Roger Corman – prolific producer and director
Joyce Cox, VES – award-winning visual effects producer
John Dykstra – award-winning visual effects pioneer
Harrison Ellenshaw, VES – acclaimed matte artist and visual effects supervisor
Jonathan Erland, VES – VFX artist, technologist and pioneer
Rob Legato, ASC – award-winning visual effects supervisor and cinematographer
Jim Morris, VES – award-winning film and visual effects producer and President of Pixar Animation Studios
Richard Winn Taylor II, VES – acclaimed creative and cinematic director
Lynda Ellenshaw Thompson, VES – acclaimed visual effects producer
Douglas Trumbull, VES – award-winning visual effects pioneer and filmmaker

More coming soon! VES members can log in to watch the VES Luminary Interviews here: bit.ly/VESLuminary

TOP TO BOTTOM: Rob Legato, ASC Jim Morris, VES Lynda Ellenshaw Thompson, VES


New Zealand Celebrates the VES Awards in Red Carpet Style By EMMA CLIFTON PERRY, New Zealand Section Board member and VES 1st Vice Chair

The celebration of the 19th Annual VES Awards was a truly global affair, but the New Zealand Section experienced the virtual production with some extra in-person panache. When the New Zealand Section realized that they were in a position to host an in-person awards viewing party, the Board of Managers rose to the challenge and spearheaded the festive event at the Roxy Cinema in Miramar, Wellington, which is co-owned by Academy Award-winning film editor and producer Jamie Selkirk, co-owner of Weta Workshop Tania Rodger, and renowned foodie Valentina Dias. With the event running simultaneously with the global live stream, it was hosted as a weekday luncheon. Local celebrities, VES nominees and members, the New Zealand Film Commission and organizations including Women in Film and TV - New Zealand branch were all in attendance. Guests were greeted with an opportunity to pose on the red carpet prior to entering the pre-party. A welcome “bubbles bar” was on offer, with all attendees receiving a complimentary glass to celebrate. James Ogle, Co-Chair of the New Zealand Section, even brought his own VES Award to the event, allowing winners a chance to pose with the statue on the red carpet. With New Zealand’s own Sir Peter Jackson receiving the VES Lifetime Achievement Award and two groups of local nominees in attendance, spirits were high. When the winners of the Outstanding Created Environment in a Photoreal Feature – Mulan; Imperial City – were announced, the audience roared to life with applause, laughter, a few tears and plenty of high-fives. It was lovely to see just how excited the nominees and winners got, even in this nontraditional gathering. We in New Zealand feel incredibly lucky to have been able to host an event like this with our region unrestricted by COVID concerns. The worldwide stream kept us tethered to our global industry colleagues.
The production value of the show was fantastic, and we appreciate everyone who made it possible for VES members worldwide to join together to celebrate our craft in this most unprecedented time.

TOP TO BOTTOM: New Zealand Section Board members kick off the red carpet event. From left: James Ogle, Kay Hoddy, Emma Clifton Perry and Dylen Velasquez. (Photo: Jim Guo) The Mulan; Imperial City team celebrating their VES Awards win. From left: Jeremy Fort, Matt Fitzgerald, Ben Walker and Adrian Vercoe, with Weta Visual Effects Supervisors Joe Letteri, VES and Anders Langlands. (Photo: Jim Guo) VES Awards nominees, guests and members of the New Zealand Film Commission and Women in Film and TV watching the global live stream. (Photo: Jim Guo)




In This Corner, Weighing In At...

Godzilla vs. Kong is the newest iteration of the Monsterverse (see article, page 30), with a special effects legacy that dates back to 1933. That’s when the original black-and-white King Kong debuted to stunned audiences. Even today, it ranks as one of the greatest monster/horror features of all time, displaying the genius of special effects pioneer Willis O’Brien, who blended stop-motion animation with live action – revolutionary at the time. For a while, many people believed Kong was a stuntman in a gorilla suit, but Kong was, in fact, a set of 18- to 24-inch models with rabbit fur, metal skeletons and joints. The film also used rear projection for certain special effects sequences. The filmmakers built a giant hand that was raised in the air by a crane, as well as a 20-foot head with three men inside operating levers to enable Kong’s facial expressions for close-ups. The original 1954 Godzilla did have a man in a suit – a 200-pound costume – for special effects creator Eiji Tsuburaya. The team built highly detailed miniature buildings and sets that Godzilla could stomp through. Director Ishiro Honda also slowed down the action, since slow motion gave the monster a more realistic feel. Hand puppets were used as inserts for close-up action on Godzilla’s face. The original King Kong and Godzilla movies seem quaint by today’s standards. Little did these creatures know at the time that they would meet in 2021 with the help of hundreds of VFX and SFX artists from around the globe.








VFX Voice Summer 2021  

