
VFXVOICE.COM

WINTER 2020

TAKING FLIGHT WITH CARNIVAL ROW

VFX AND SFX IN THE NEW DECADE • OSCAR CONTENDERS • THE IRISHMAN • RAY HARRYHAUSEN’S 100TH • PROFILES: JENNIFER LEE & DAN CURRY




[ EXECUTIVE NOTE ]

Welcome to the Winter issue of VFX Voice! We’re thrilled to celebrate awards season, when outstanding visual effects artistry is in the spotlight. In this issue, we preview the top contenders for the VES Awards, the Oscars and the BAFTAs, and go behind the scenes of The Irishman, Carnival Row and The Dark Crystal. Award-winning VFX trailblazer Dan Curry, VES and Disney Animation’s Jennifer Lee take center stage in this issue’s profiles, and we salute the 100th anniversary of legend Ray Harryhausen.

Get an inside look at trends in virtual production, real-time technology and the state of SFX – just as we get ready to bestow the new VES Award recognizing Outstanding Special (Practical) Effects in a Photoreal Project at the 18th Annual VES Awards. Industry leaders also look into the crystal ball for VFX in 2020 and beyond.

And some exciting news hot off the presses: VFX Voice has won again! The 2019 FOLIO: Eddie and Ozzie Awards, one of the most prestigious national awards programs in the publishing community, has again honored our publication, naming VFXVoice.com Best Website (Association/Nonprofit)! So, thank you for your enthusiastic support of our digital magazine, where we continue to bring you exclusive stories between print issues, only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

Thank you for being a part of the global VFX Voice community. We’re proud to be the definitive authority on all things VFX.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director

2 • VFXVOICE.COM WINTER 2020



[ CONTENTS ]

VFXVOICE.COM

FEATURES
8 FILM: THE VFX OSCAR
VFX Voice previews the array of contenders from 2019.
18 VFX TRENDS: TECH BOOM
Virtual production is poised to reshape the future of VFX.
30 SFX TRENDS: CRAFT CURRENTS
Supervisors survey the state of SFX and the road ahead.
40 FILM: THE IRISHMAN
Production VFX Supervisor Pablo Helman interprets Scorsese.
44 TV: THE DARK CRYSTAL
Practical effects, puppetry and digital meld in Netflix fantasy.
46 TECH & TOOLS: REAL-TIME
How real-time technology is accelerating change for the industry.
52 PROFILE: JENNIFER LEE
Walt Disney Animation CCO’s rise from storyroom to boardroom.
58 INDUSTRY ROUNDTABLE: 2020 VISION
Industry leaders discuss VFX directions in the new decade.
76 COVER: CARNIVAL ROW
Building wings and flying high for Amazon’s fantasy drama.
82 PROFILE: DAN CURRY, VES
Tracking the orbit of a VFX pioneer on the ‘New Frontier.’
86 LEGEND: RAY HARRYHAUSEN
Celebrating the 100th anniversary of the stop-motion giant.

DEPARTMENTS
2 EXECUTIVE NOTE
90 VFX CAREERS
92 VES SECTION SPOTLIGHT: LOS ANGELES
94 VES NEWS
96 FINAL FRAME: METROPOLIS

ON THE COVER: Cara Delevingne as winged faery Vignette Stonemoss in Amazon’s Carnival Row. (Image copyright © 2019 Amazon Studios)



WINTER 2020 • VOL. 4, NO. 1

MULTIPLE WINNER OF THE 2018 & 2019 FOLIO: EDDIE & OZZIE AWARDS

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh
publisher@vfxvoice.com

EDITOR
Ed Ochs
editor@vfxvoice.com

CREATIVE
Alpanian Design Group
alan@alpanian.com

ADVERTISING
VFXVoiceAds@gmail.com

SUPERVISOR
Nancy Ward

CONTRIBUTING WRITERS
Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Barbara Robertson

ADVISORY COMMITTEE
David Bloom, Andrew Bly, Rob Bredow, Mike Chambers, Neil Corbould, VES, Irena Cronin, Paul Debevec, VES, Debbie Denise, Karen Dufilho, Paul Franklin, David Johnson, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair
Jeffrey A. Okun, VES, 1st Vice Chair
Lisa Cooke, 2nd Vice Chair
Brooke Lyndon-Stanford, Treasurer
Rita Cahill, Secretary

DIRECTORS
Brooke Breton, Kathryn Brillhart, Colin Campbell, Bob Coleman, Dayne Cowan, Kim Davidson, Rose Duignan, Richard Edlund, VES, Bryan Grill, Dennis Hoffman, Pam Hogarth, Jeff Kleiser, Suresh Kondareddy, Kim Lavery, VES, Tim McGovern, Emma Clifton Perry, Scott Ross, Jim Rygiel, Tim Sassoon, Lisa Sepp-Wilson, Katie Stetson, David Tanaka, Richard Winn Taylor II, VES, Cat Thelia, Joe Weidenbach

ALTERNATES
Andrew Bly, Gavin Graham, Charlie Iturriaga, Andres Martinez, Dan Schrecker

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Debbie McBeth, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2020 The Visual Effects Society. Printed in the U.S.A.



FILM

THE CONTENDERS: 20 FILMS THAT COULD TAKE VFX OSCAR GOLD By IAN FAILES

In the visual effects landscape, 2019 has again been a big year for superhero films, sci-fi adventures and fantastical realism. Usually that might mean a few obvious front-runners for the VFX Oscar, but the final contenders for the crown (and the eventual winner) can often be surprising, especially in terms of a film’s perceived use of photorealistic CG effects versus more practical methods. This makes picking the possible VFX contenders a tough task.

While several contending films showcase incredibly high-quality photorealistic CG work, fewer relied on models and miniatures, as past winners First Man, Interstellar and, to a lesser extent, Blade Runner 2049 did. Still, a number of 2019 releases made heavy use of practical creature effects, special effects and stunts. With those things in mind, VFX Voice looks at 20 films that may be front-runners for VFX Oscar consideration in 2020. The winner will be announced at the Academy Awards on February 9.

CG HUMAN(ISH), OLD AND YOUNG

TOP: Ad Astra is the story of an astronaut who travels across outer space in search of his father and a higher intelligence. (Image copyright © 2019 20th Century Fox)


Looking back at 2019’s releases, a clear trend is the presence of CG humans, or actors who have gone through some degree of augmentation or modification. The results often blew audiences away with the artistry behind the performance itself – both from the actor and the VFX teams.

Take Ang Lee’s Gemini Man, in which a younger-looking, cloned version of Will Smith battles the current-age Will Smith. To make that possible, the actor performed both roles during filming, but went through a body and facial-capture process for his younger self. Weta Digital then took the resulting data and crafted a completely CG version of the actor. All the nuances Weta Digital manages to put into its digital Smith (based, of course, on the real Smith) perhaps represent the pinnacle of the studio’s work in this area.

What’s incredible is that Gemini Man is just one of several films from the past year where actors have been replicated as either younger or older versions of themselves. The Irishman, from Martin Scorsese, with visual effects led by Industrial Light & Magic (ILM), similarly ‘reduced’ the ages of its main cast, this time to deal with the different time periods of the film. ILM also retains all the well-known nuances of actors such as Robert De Niro, Al Pacino and Joe Pesci in their fully CG versions. Having them in the film, along with such a high-profile director, will bring The Irishman a lot of credibility in the VFX Oscar race.

Other films that dived heavily into the aging and de-aging realm, and CG human-like performances, are Captain Marvel and Avengers: Endgame. While both of those Marvel films employ a spectacular level of effects imagery (think: explosions, fight scenes, interplanetary travel), they may well be most remembered this year for character work. Captain Marvel, from Anna Boden and Ryan Fleck, puts a young Nick Fury (Samuel L. Jackson) on the screen, crafted principally by Lola VFX, which used 2D techniques for its aging and de-aging work (other VFX studios also worked on Fury’s transformation). The sheer number of shots of young Fury in the film makes this a major achievement.

The Russo Brothers’ Avengers: Endgame has multiple aging and de-aging shots, again led by Lola, which provided an old-age Captain America (Chris Evans), an emaciated Tony Stark (Robert Downey Jr.), a de-aged Hank Pym (Michael Douglas) and Howard

TOP: ILM’s CG Genie for Aladdin was based on multiple capture sessions of Will Smith. (Image copyright © 2019 Walt Disney Pictures) BOTTOM: A de-aged Samuel L. Jackson as Nick Fury was one of the central pieces of VFX work in Captain Marvel. (Image copyright © 2019 Marvel Studios)






TOP: Live-action actors were transformed into CG felines for Cats, retaining their distinctive facial features. (Image copyright © 2019 Universal Pictures) BOTTOM: The famous baby elephant made its live-action debut in Dumbo. (Image copyright © 2019 Walt Disney Pictures) OPPOSITE TOP: Crafting a completely believable performance of a younger-looking Will Smith was critical to the success of Gemini Man. (Image copyright © 2019 Paramount Pictures) OPPOSITE MIDDLE: Destruction effects, massive lumbering CG monsters and some intense environments made up the VFX for Godzilla: King of the Monsters. (Image copyright © 2019 Warner Bros. Pictures) OPPOSITE BOTTOM: Stunts, practical effects and CG work touch so many of Fast & Furious Presents: Hobbs & Shaw’s action scenes. (Image copyright © 2019 Universal Pictures)


Stark (John Slattery), and a young Stan Lee (in his last-ever cameo). Although we have seen a lot of this work before in recent Marvel movies, the work here is crucial to the time-travel storytelling in the film.

Meanwhile, Endgame is also a showcase of digital human-like characters, with both Josh Brolin’s Thanos and Mark Ruffalo’s Hulk. New machine learning techniques used by Weta Digital and Digital Domain for their Thanos scenes, and new capture methods adopted by ILM for Hulk (Framestore also worked on Hulk shots), ensured that the characters look richer than ever. We can, arguably, sense the actor behind the performance even more closely than before.

That goes for the central character played by Rosa Salazar in Robert Rodriguez’s Alita: Battle Angel, too. Realized entirely in CG by Weta Digital, Alita had to interact with live-action actors and perform some incredible stunt movements. Weta Digital adopted a new full-body capture suit and helmet-cam for the job that aided in replicating Salazar’s performance digitally. New skin shaders and approaches to hair simulation helped as well.

Then there’s Guy Ritchie’s Aladdin, which features a fully synthetic Genie based on the performance of Will Smith. Here, ILM took both on-set capture of the actor and additional ‘Anyma’ captures to make the character, who retained the charm of Smith but was also made flexible enough to do a myriad of magical things. The studio also had plenty of other work in the film on its plate, from vast landscapes to other CG characters (not to mention a flying magic carpet).

The translation of Will Smith to a Genie kept the essence of the original actor in Aladdin, and something similar was done for the performers in Tom Hooper’s Cats, albeit with a lot of extra fur. This work, led by Mill Film, is noteworthy for its complexity and

in being a somewhat ‘off-beat’ use of capture and CG face tech; audiences might at first have found the translation a little uncanny, but the breakdowns are charming and memorable.

STUNNING CHARACTERS AND LOCATIONS

Jon Favreau’s The Lion King easily has the most stunningly realistic characters and landscapes of any film released in 2019, even if some audience members found some of the performances initially startling. While the original 1994 2D animated film featured lions, hyenas and other animals with very emotive expressions, the 2019 3D animated version went with something more realistic (although of course the animals still talk, and sing). Still, that aspect clearly did not worry too many people given the film’s success, and it’s impossible to go past The Lion King as a major VFX Oscar contender thanks to its advances in virtual production techniques and the CG/animation prowess of MPC. The film ‘feels’ like a live-action one, and that’s because real-world camera rigs and approaches to photography were part of the virtual filmmaking process, which took advantage of real-time tools and VR set scouting, with MPC then delivering the entirely photoreal landscapes and inhabitants of the African plains.

There’s world-building of a different kind in J.J. Abrams’ Star Wars: The Rise of Skywalker, a film in which ILM led the creation of distant galaxies, CG space battles, lightsaber duels and a host of CG characters. At the same time, as in the other recent Star Wars outings, creature and special makeup effects, practical special effects and stunts form a large part of The Rise of Skywalker’s final shots, and that can hold significant sway in the VFX Oscar race.

Jon Watts’ Spider-Man: Far From Home has many CG characters – Spider-Man, Mysterio and the ‘Elementals’ of the film – and it goes to many locations. Perhaps the most impressive VFX








accomplishment here is the augmentation, expansion or recreation of so many real-world locales, such as Venice, Prague and London. That keeps the film grounded, even if it is one of the most VFX-intensive projects of the year, to which many effects studios contributed.

Also a ‘winner’ in terms of feeling grounded in reality, despite playing host to a swathe of game-inspired CG characters, is Rob Letterman’s Pokémon Detective Pikachu. The movie was shot on film, made extensive use of stuffies and stand-ins, and relied on scores of artists (predominantly from MPC and Framestore) to rotoscope and integrate the CG characters into scenes – with the result feeling like one of the most accomplished live-action hybrids of 2019.

Another live-action hybrid, Tim Burton’s Dumbo, manages to hit plenty of emotional beats by re-telling the popular Disney story of the titular baby elephant. MPC led the effects effort here, finding a balance between the cartoony nature of the cute Dumbo and all the real-world aspects that were necessary.

VFX artists on Godzilla: King of the Monsters, directed by Michael Dougherty, were similarly tasked with integrating CG characters (this time, massive ones) into both live-action and fully synthetic locations. The scale of VFX work in King of the Monsters is huge, and crucial to making audiences believe these creatures had been reborn.

Men In Black: International, a film by F. Gary Gray, would simply not have been possible without visual effects. There are CG characters, CG environments, CG vehicles, and scenes that bring live-action actors into places where they couldn’t have shot. The work is immense, and was done by a multitude of providers.

With some killer scenes, and a new take on the liquid-like morphing nature of machines from the future, Tim Miller’s Terminator: Dark Fate might have just enough ‘wow’ moments to move into VFX Oscar contention. It also has, among the slew of CG imagery and digital human work, lots of large-scale practical stunt scenes.


THE NOT-SO-SURPRISING SURPRISES

TOP: Subtle and sometimes invisible effects helped add to John Wick: Chapter 3 - Parabellum’s intense action scenes. (Image copyright © 2019 Lionsgate) MIDDLE: VFX studios were a key part of the design of many of the scenes and characters in Men In Black: International. (Image copyright © 2019 Sony Pictures) BOTTOM: Deliberately staying within a cartoon realm, the characters of Pokémon Detective Pikachu were still finely integrated into the live-action. (Image copyright © 2019 Warner Bros. Pictures)

The ‘big’ films listed above are definitively VFX-driven films. One of the criteria for visual effects Oscar consideration is the contribution the visual effects make to the overall production. But there are sometimes films that make the VFX consideration lists which, at first, do not appear to rely so heavily on VFX. The truth, of course, is that visual effects often form a key part of these kinds of films; they just tend to be more of the ‘invisible’ kind, or used in different ways.

In Chris Butler’s Missing Link, for instance, visual effects are incorporated in extensive ways, from fully CG characters and environments to the compositing of separately filmed stop-motion elements into final shots. And despite its being a stop-motion film, there’s precedent for such a movie to be included in VFX Oscar consideration (The Nightmare Before Christmas and Kubo and the Two Strings).

Ad Astra by James Gray has some very obvious visual effects work for scenes set in space, but the film differs in being a




somewhat more cerebral experience than typical sci-fi releases. The big accomplishment here is the real feeling of traveling into space with Brad Pitt’s character, as if VFX were not involved at all.

The goal of Us director Jordan Peele must surely also have been to tell his story without the audience noticing there were any visual effects in the film. However, ILM helped create key doppelgänger scenes and introduced a mix of horror and intrigue into several shots. Old-school split screens and a desire to keep everything invisible may well get this one over the line.

In that category, too, is John Wick: Chapter 3 - Parabellum, from director Chad Stahelski. Audiences would be intimately aware that stunt choreography and special effects were major players in getting Parabellum’s action scenes onto the screen. On top of that, VFX had a role in tying elements together – for example, by enabling bullets to fly underwater, erasing cameras from reflections in glass walls, or allowing a major motorcycle chase to be partly filmed on greenscreen.

And rounding out this list of 20 possible contenders is David Leitch’s Fast & Furious Presents: Hobbs & Shaw, another film leaning heavily on practical special effects and stunts, where CG and digital visual effects (led by DNEG) were equally important. Any film that shows crazy stunts, elevated by just as intensive VFX work, might have a strong chance at awards success.

Nominations for the five VFX Oscar contenders will be announced on January 13, after lists of 20 and then 10 films go through the consideration process via the Academy’s visual effects branch.


TOP: In Spider-Man: Far From Home, effects simulations for the Elementals and plenty of exotic locations made for a heavy VFX film. (Image copyright © 2019 Sony Pictures) MIDDLE: Terminator: Dark Fate revives the Terminator franchise, one that is fondly remembered for creature effects and breakthrough digital technologies. (Image copyright © 2019 Paramount Pictures) BOTTOM LEFT: With cutting-edge virtual production filmmaking techniques and a supreme level of photorealism on show, The Lion King is likely to be dominant during awards season. (Image copyright © 2019 Walt Disney Pictures) BOTTOM RIGHT: Invisible effects in Us helped tell this horrifying doppelgänger story. (Image copyright © 2019 Universal Pictures)



VFX TRENDS

NEW VIRTUAL TECHNOLOGIES REMAKE VFX’S FUTURE PIPELINE By DEBRA KAUFMAN

Visual Effects Supervisor Sam Nicholson, ASC, who founded and heads Stargate Studios, remembers the pre-digital processes for visual effects. “I started on Star Trek,” he says. “It was all in-camera effects, shooting film and compositing with optical printing.” Now, he says, virtual production has brought back the in-camera effect and promises to bring visual effects from a post-production process to the set.

This isn’t the only nascent trend in visual effects, but it is poised to have the biggest impact. With virtual production, directors, cinematographers and every other department can see and often manipulate – in real-time – the physical set and actors composited with digital images and creatures. “The big change has come with [more powerful] GPUs from NVIDIA combined with Epic Games’ Unreal Engine 4 providing the software for real-time rendering and ray tracing,” says Nicholson. “When you put that together with LED walls or giant monitors, we think that at least 50% of what we do on set can be finished pixels.”

VFX IN REAL-TIME

TOP: Stargate Studios’ ThruView system enables shooting in real-time while actually seeing CG elements – complete with reflections and lighting – integrated into the camera, eliminating the need for greenscreen setups. (Image courtesy of Stargate Studios) BOTTOM LEFT: Sam Nicholson, ASC, CEO, Stargate Studios BOTTOM RIGHT: David Morin, Head of L.A. Lab, Epic Games


David Morin was part of a small team working on Jurassic Park; now he’s Head of Epic Games’ L.A. Lab, which showcased virtual production during SIGGRAPH 2019. A group of companies – what Morin calls a “coalition of the willing” – came together to show the power of using Unreal Engine 4 with a panoply of technology developed to enable virtual production. That included Magnopus’s VR Scout, a multi-user tool that lets creatives “scout” a virtual location; Quixel, which develops photorealistic environments with hi-res scans and photogrammetry; Profile Studios, providing camera tracking for real-time compositing; and Lux Machina, a systems integrator with proprietary tech to marry the digital and physical worlds. “This next phase – where we can do a lot more in real-time – is an exciting moment,” says Morin.

Rob Legato, ASC, Visual Effects Supervisor on The Lion King and The Jungle Book, made use of real-time technologies to bring

digital worlds to photoreal life, to great acclaim. He notes that he always had a bent towards naturalistic filmmaking, and virtual production enables that. “The reason we shoot the way we have for 100 years is because it works,” he says. Working in isolation on a computer took the process away from the production. “Now, you put yourself in virtual reality and you immediately get input,” he says. “All of a sudden you start clicking on all burners because you have people to [collaborate with]. The difference is like composing jazz music one note at a time with five hours between each note, versus hearing them all together.” Foundry’s chief scientist/co-founder Simon Robinson notes that virtual production plays another important role in “removing the boundaries between story, previs, on-set, post viz and postproduction.” With virtual production, the data generated in previz becomes reusable further down the pipeline, he observes. “That really transforms what has typically been a system of processing things overnight and having people check it in the morning.” Virtual production takes VFX from the iterative post-production process to the set which, says Nicholson, speeds up the process dramatically. He first tried out virtual production as Visual Effects Supervisor for the ABC TV show Pan Am in 2011. Now he’s integrated it into the production of an upcoming HBO series Run. “The challenge is 350 visual effects per episode,” he says. “We will do 4,000 shots over the next 10 weeks in Toronto. We synchronize it, track it, put it in the Unreal Engine, and it looks real and shouldn’t need any post enhancements. The entire power of a post-production facility like Stargate is moving on set. We now say fix it in prep rather than fix it in post.” A chorus of VFX facilities and artists applaud the fact that realtime virtual production is the apparent modus operandi for the future, with tangential contributions from other tech spaces. 
“Currently, there is no one thing shaping the VFX business,” comments John Fragomeni, President of Digital Domain. “There are multiple factors at play that are intersecting with and impacting

TOP AND MIDDLE: Unreal Engine spotlights virtual production, putting the latest UE4 tools through their paces with Epic Games, Lux Machina, Magnopus, Profile Studios, Quixel and ARRI, demonstrating real-time, in-camera visual effects. (Images courtesy of Epic Games) BOTTOM LEFT: Rob Legato, ASC, Visual Effects Supervisor BOTTOM RIGHT: Adam Valdez, Visual Effects Supervisor, Technicolor MPC

WINTER 2020 VFXVOICE.COM • 19

11/12/19 1:37 PM


VFX TRENDS

NEW VIRTUAL TECHNOLOGIES REMAKE VFX’S FUTURE PIPELINE By DEBRA KAUFMAN

Visual Effects Supervisor Sam Nicholson, ASC, who founded and heads Stargate Studios, remembers the pre-digital processes for visual effects. “I started on Star Trek,” he says. “It was all in-camera effects, shooting film and compositing with optical printing.” Now, he says, virtual production has brought the in-camera effect back and promises to move visual effects from a post-production process onto the set. This isn’t the only nascent trend in visual effects, but it is poised to have the biggest impact. With virtual production, directors, cinematographers and every other department can see – and often manipulate, in real-time – the physical set and actors composited with digital images and creatures. “The big change has come with [more powerful] GPUs from NVIDIA combined with Epic Games’ Unreal Engine 4 providing the software for real-time rendering and ray tracing,” says Nicholson. “When you put that together with LED walls or giant monitors, we think that at least 50% of what we do on set can be finished pixels.”

VFX IN REAL-TIME
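The “finished pixels” Nicholson describes reduce to the standard “over” composite that a greenscreen pipeline computes digitally in post – and that an LED wall effectively produces optically, in camera. Here is a minimal per-frame sketch of that operation; every array and value is an illustrative stand-in, not anything from Stargate’s actual system:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': lay a foreground onto a background
    using the foreground's alpha matte."""
    a = fg_alpha[..., None]            # broadcast the matte over RGB
    return fg_rgb * a + bg_rgb * (1.0 - a)

# One iteration of a per-frame loop: the engine-rendered background
# goes behind the live-action plate wherever the matte says "actor."
h, w = 4, 4
live_action = np.full((h, w, 3), 0.8)    # stand-in camera frame
cg_render = np.full((h, w, 3), 0.2)      # stand-in engine render
matte = np.zeros((h, w))
matte[1:3, 1:3] = 1.0                    # actor occupies the center

frame = over(live_action, matte, cg_render)   # composited frame
```

Doing this (or its optical equivalent) at the camera’s frame rate, with tracked camera data driving the render, is what lets the crew see the final-looking image on set instead of months later.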

TOP: Stargate Studios’ ThruView system enables shooting in real-time while actually seeing CG elements – complete with reflections and lighting – integrated into the camera, eliminating the need for greenscreen setups. (Image courtesy of Stargate Studios) BOTTOM LEFT: Sam Nicholson, ASC, CEO, Stargate Studios BOTTOM RIGHT: David Morin, Head of L.A. Lab, Epic Games

18 • VFXVOICE.COM WINTER 2020


David Morin was part of a small team working on Jurassic Park; now he’s Head of Epic Games’ L.A. Lab, which showcased virtual production during SIGGRAPH 2019. A group of companies – what Morin calls a “coalition of the willing” – came together to show the power of using Unreal Engine 4 with a panoply of technology developed to enable virtual production. That included Magnopus’s VR Scout, a multi-user tool that lets creatives “scout” a virtual location; Quixel, which develops photorealistic environments with hi-res scans and photogrammetry; Profile Studios, providing camera tracking for real-time compositing; and Lux Machina, a systems integrator with proprietary tech to marry the digital and physical worlds. “This next phase – where we can do a lot more in real-time – is an exciting moment,” says Morin.

Rob Legato, ASC, Visual Effects Supervisor on The Lion King and The Jungle Book, made use of real-time technologies to bring digital worlds to photoreal life, to great acclaim. He notes that he always had a bent towards naturalistic filmmaking, and virtual production enables that. “The reason we shoot the way we have for 100 years is because it works,” he says. Working in isolation on a computer took the process away from the production. “Now, you put yourself in virtual reality and you immediately get input,” he says. “All of a sudden you start clicking on all burners because you have people to [collaborate with]. The difference is like composing jazz music one note at a time with five hours between each note, versus hearing them all together.”

Foundry’s chief scientist/co-founder Simon Robinson notes that virtual production plays another important role in “removing the boundaries between story, previs, on-set, postvis and post-production.” With virtual production, the data generated in previs becomes reusable further down the pipeline, he observes. “That really transforms what has typically been a system of processing things overnight and having people check it in the morning.”

Virtual production takes VFX from the iterative post-production process to the set, which, says Nicholson, speeds up the process dramatically. He first tried out virtual production as Visual Effects Supervisor for the ABC TV show Pan Am in 2011. Now he’s integrated it into the production of the upcoming HBO series Run. “The challenge is 350 visual effects per episode,” he says. “We will do 4,000 shots over the next 10 weeks in Toronto. We synchronize it, track it, put it in the Unreal Engine, and it looks real and shouldn’t need any post enhancements. The entire power of a post-production facility like Stargate is moving on set. We now say fix it in prep rather than fix it in post.” A chorus of VFX facilities and artists applaud real-time virtual production as the apparent modus operandi for the future, with tangential contributions from other tech spaces.
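Robinson’s point about previs data flowing downstream can be sketched abstractly: the same camera track, captured once during previs, feeds both the real-time preview and the final render instead of being rebuilt at each stage. This is a hypothetical illustration, not any real pipeline’s API:

```python
# Hypothetical data (no real pipeline API): a camera track captured
# during previs, kept as the single source of truth for the shot.
previs_camera = {
    "frames": [
        {"frame": 1001, "position": (0.0, 1.7, 5.0), "focal_mm": 35.0},
        {"frame": 1002, "position": (0.1, 1.7, 4.9), "focal_mm": 35.0},
    ]
}

def render_pass(camera_track, quality):
    """Stand-in for a renderer: it consumes the shared track
    unchanged, whether for an on-set preview or the final frames."""
    return [(f["frame"], quality) for f in camera_track["frames"]]

preview = render_pass(previs_camera, quality="realtime")  # on set
final = render_pass(previs_camera, quality="final")       # in post
```

The point of the sketch is only that nothing is “processed overnight and checked in the morning”: the decision encoded in previs survives, unmodified, into every later pass.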
“Currently, there is no one thing shaping the VFX business,” comments John Fragomeni, President of Digital Domain. “There are multiple factors at play that are intersecting with and impacting

TOP AND MIDDLE: Unreal Engine spotlights virtual production, putting the latest UE4 tools through their paces with Epic Games, Lux Machina, Magnopus, Profile Studios, Quixel and ARRI, demonstrating real-time, in-camera visual effects. (Images courtesy of Epic Games) BOTTOM LEFT: Rob Legato, ASC, Visual Effects Supervisor BOTTOM RIGHT: Adam Valdez, Visual Effects Supervisor, Technicolor MPC




“[W]e’re reaching a point of maturity in the industry as a whole. I think the overall trend that drives everything is the global increase in media volume, a lot of it driven by the streaming platform. The focus now is the pragmatic issues of getting it done on time and on budget without sacrificing quality.” —Simon Robinson, Chief Scientist/Co-founder, Foundry

TOP, MIDDLE AND BOTTOM: Chaos Group’s Lavina is a highly realistic real-time ray tracer that leverages NVIDIA’s RTX cards. Here, the robot characters and one-billion-polygon city environment rendered in Lavina illustrate the breadth of the system’s capabilities. (Images courtesy of Chaos Group)


the VFX process from technology development to production to post-production. However, if I had to say what I believe to be the most influential, it would be real-time/game-engine technology and processes. At Digital Domain, we are experiencing that convergence first-hand and embracing it as we infuse it into the traditional VFX execution to help us become more efficient as filmmakers.

“We’re in an exciting era of content creation,” adds Fragomeni, “and I believe the VFX business is somewhere between evolution and revolution, and we’ve taken a proactive role in this climate change by nurturing relationships as collaborators to partner with the major game platform developers. We’ve adopted a hive mentality to development in which we’ve been able to achieve some remarkably creative and technical work across our multiple production groups. The diversity of ways we’ve been able to harness game technology and processes has been very successful. Just this year alone, we’ve seen several significant breakthroughs, particularly within our Digital Human Group, as well as New Media + EXP using game engine technology.

“As a VFX studio,” Fragomeni continues, “we are focused on creating and building new tools and advancing our pipeline to help us work seamlessly between our traditional VFX process while utilizing the best technology that gaming platforms have to offer. Infusing game-engine technology into our creative process has truly widened the gates of what is possible in visual effects, allowing us to build upon our legacy of outstanding visual storytelling and drive us into the next decade.”

Sara Tremblay, VFX Producer/Set Supervisor at Crafty Apes, says, “The immersion of real-time render engines into VFX pipelines, especially at small and mid-level houses, will be a game-changer for the industry. Currently, this technology has made a huge contribution to video games and larger-scale films. Once it becomes a mainstay in VFX pipelines it will give companies the ability to meet the growing demand for CG-enhanced shots on tight deadlines, which streaming content and TV series currently experience.

“The incorporation of these engines will not only help reduce the long and costly render times many CG shots require, but it will do so without sacrificing the aesthetic quality that current render systems produce. It will also give artists and clients more artistic flexibility to explore styles and concepts before committing to a design, all while done in a shorter timeframe.”

Salvador Zalvidea, VFX Supervisor with Cinesite, believes that “real-time is going to have a big impact in visual effects. Most of the exciting technologies we are seeing emerge will be done in real-time and on set, shifting the visual effects process to pre-production and production. This will allow creatives to make decisions on set. We will probably still require some visual effects to be done or refined after the shoot, but iterations will be much quicker, if not instantaneous.”

At SIGGRAPH 2019, Chaos Group also showed off its Project Lavina for real-time ray tracing. Taking advantage of NVIDIA RTX GPU scaling, a new noise algorithm and animation support, Project Lavina demonstrated a real-time one-billion-polygon city designed by Blizzard Entertainment. “Unlike rasters that get slower with more triangles, ray tracing stays the same speed, so you can feed it huge scenes,” says Chaos Group Vice President of Product Management Phil Miller. “Lavina is about combining or merging your biggest possible scenes in the V-Ray format, so you can connect to that for photorealism with the full continuum of quality.”
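Miller’s raster-versus-ray-tracing claim follows from how acceleration structures work: rasterization touches every triangle, while a ray tracer walking a bounding volume hierarchy (BVH) does roughly logarithmic work per ray. A back-of-envelope model – not Lavina’s actual cost function – shows why a one-billion-polygon scene stays feasible:

```python
import math

def raster_cost(triangles):
    # Rasterization processes every triangle: cost grows linearly
    # with scene size.
    return triangles

def raytrace_cost(rays, triangles):
    # With a BVH, each ray traverses roughly log2(N) levels of the
    # hierarchy instead of testing all N triangles.
    return rays * math.log2(triangles)

rays = 2_000_000          # roughly one primary ray per pixel at ~HD
small, huge = 1_000_000, 1_000_000_000

# Going from a million to a billion triangles: raster work grows
# 1000x, while per-ray BVH traversal grows only log(1e9)/log(1e6).
raster_growth = raster_cost(huge) / raster_cost(small)              # 1000.0
rt_growth = raytrace_cost(rays, huge) / raytrace_cost(rays, small)  # 1.5
```

Under this model the ray tracer’s cost is dominated by ray count (i.e., resolution), not scene complexity, which is the sense in which it “stays the same speed” as you feed it huge scenes.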

TOP: Compositing in Nuke. (Image courtesy of Foundry) MIDDLE: Capturing volumetric datasets for future testing. (Image courtesy of Foundry) BOTTOM LEFT: Simon Robinson, Chief Scientist/Co-founder, Foundry BOTTOM RIGHT: Sara Tremblay, VFX Producer/Set Supervisor, Crafty Apes








THE BUSINESS STORY

“I believe that real-time is going to have a big impact in visual effects. Most of the exciting technologies we are seeing emerge will be done in real-time and on set, shifting the visual effects process to pre-production and production. This will allow creatives to make decisions on set. We will probably still require some visual effects to be done or refined after the shoot, but iterations will be much quicker, if not instantaneous.” —Salvador Zalvidea, VFX Supervisor, Cinesite

The rise of virtual production isn’t just a technology story (although more on that later). It’s a business story. At Sohonet, Chief Executive Chuck Parker notes that streaming media outlets’ hunger for content has resulted in skyrocketing episodic production. Statista reports that the number of original scripted TV series in the U.S. went from 210 in 2009 to 495 in 2018. And that’s just in the U.S. According to Quartz, Netflix released almost 1,500 hours of original content in 2018, comprising more than 850 titles streamed exclusively on the platform. Multiply that by Amazon Prime, Hulu, YouTube and the soon-to-debut Apple TV+, Disney+ and HBO Max – and you get an overwhelming number of shows requiring an overwhelming amount of VFX.

“Volume is our newest challenge,” says Technicolor MPC Visual Effects Supervisor Adam Valdez. “What it means is that we’re going to have a lot more filmmakers and series makers who want to use VFX. Virtual production is the bridge we build between CG artists and physical production, because that is really the all-encompassing purpose of it.” Valdez notes, though, that even with more volume in VFX, the price point is still going down. “We have to continue to be reactive to the realities,” he says. “From an artistic point of view, things have never been more diverse and adventurous. We just have to keep working on the cost/price point, and every studio knows that.”

At Foundry, Robinson says that, over the last couple of years, they’ve been having “the dawning realization that we’re reaching a point of maturity in the industry as a whole. I think the overall trend that drives everything is the global increase in media volume, a lot of it driven by the streaming platform,” he says. “The focus now is the pragmatic issues of getting it done on time and on budget without sacrificing quality.” But the overwhelming consensus is that VFX supervisors and artists will not be displaced by the new technologies (including artificial intelligence).
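Annualized, the Statista figures above imply roughly 10% compound growth in scripted series – a quick sanity check on the “skyrocketing” characterization:

```python
# Annualizing the Statista figures quoted above: 210 scripted U.S.
# series in 2009 grew to 495 by 2018 -- nine year-over-year steps.
start_count, end_count = 210, 495
years = 2018 - 2009
cagr = (end_count / start_count) ** (1 / years) - 1
# cagr is roughly 0.10, i.e. about 10% compound annual growth.
```

Sustained 10% annual growth more than doubles output each decade, which is why Parker ties the business case for virtual production directly to episodic volume.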
“Don’t worry about your job,” says Valdez. “It only makes sense to make sure that we have time at the back end to do the best polishing and finishing we can. This is about expediting decision-making and faster feedback with filmmakers. It just helps everyone with more interactivity and less wasted time. It doesn’t kill jobs. It allows some projects to get made, facilitates others and completely improves others.”

Entity FX VFX Supervisor/President Mat Beck, ASC, also urges VFX artists to stay involved. “In this industry, the job is always changing, but it doesn’t mean your experience is devalued,” he says. “The advances that threaten your old job can potentiate you in a newer version of it. New techniques take a while to percolate down from the high end to the average film. Good artists are staying busy, in part by adapting to the environment as it changes.”

THE CLOUD

TOP LEFT: Salvador Zalvidea, VFX Supervisor, Cinesite TOP RIGHT: Mat Beck, ASC, VFX Supervisor and President, Entity FX


Sohonet’s Parker, whose company offers Connected Cloud Services, has some statistics to bolster the case for how the cloud is already changing the VFX industry. He believes that “cloud-first workflows” are being driven by the increase in use of public cloud resources for compute, storage and applications. “VFX rendering has led the industry to the cloud over the last four years for ‘peak’ compute challenges,” he says. “New entrants in VFX are building entire TV shows, commercials and even movies from cloud resources with zero on-prem compute/storage, and cloud storage economics are rapidly evolving. The virtual desktop will drive the next wave of the use of the cloud, compounded by cloud-based Software-as-a-Service (SaaS) applications from Avid, Adobe, Maya, Nuke and others.”

Parker believes that as “the VFX industry turns to platforms and software as services, it will reduce capital expenditure requirements, fixed labor costs and large office leases, which will improve operational cashflow for established players” and make it easier for smaller facilities to establish themselves. He points to boutique firms that “can leverage virtual workstations, cloud-based workflows and real-time remote collaboration to deliver high-quality work faster and at a lower cost.”

Foundry, says Robinson, has already created exactly that paradigm with its Athera cloud-based VFX platform. “Some large-scale projects have already run through Athera,” he says. He adds that migrating to the cloud for an entire project is still in its early days. “It’s possible, but it’s clear to us that it is still seen as technically difficult,” says Robinson. “It’s hard from a business point of view to make the shift, to find staff that understands it. And there are a lot of engineering challenges.” But the promise of what Athera offers is why Foundry is committed to evolving the platform. “A small cluster of artists can spin up a studio for a particular show and do it with marginal infrastructure,” says Robinson. “It’s part of a trend where people want pipeline infrastructure to be more and more turnkey, with more time devoted to the creative task and less to the nuts and bolts of running the computing equipment.”
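The “peak compute” pattern Parker describes – an on-prem farm sized for typical load, with overflow bursting to the cloud only on crunch nights – can be sketched as a toy scheduler. The capacity figure and function here are hypothetical, not any vendor’s API:

```python
# Hypothetical burst-render split (assumed capacity figure).
ON_PREM_NODE_HOURS = 5_000   # fixed nightly on-prem farm capacity

def split_render_load(requested_node_hours):
    """Return (on_prem, cloud_burst) node-hours for one night:
    fill the farm you already own first, burst the rest to cloud."""
    on_prem = min(requested_node_hours, ON_PREM_NODE_HOURS)
    cloud_burst = requested_node_hours - on_prem
    return on_prem, cloud_burst

quiet_night = split_render_load(3_200)    # no cloud spend at all
crunch_night = split_render_load(12_000)  # most of the load bursts out
```

The economics follow from the shape of the function: capital expenditure covers only the steady baseline, while the spiky deadline-driven peaks become variable, pay-per-use cost.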

“VFX artists are becoming just filmmakers with real-time feedback. Because it’s the trend, more digital artists will become good cameramen, directors, animators, artists. Doing it in real-time with real-time input, you begin to create a style that’s your own that will meld into the natural way to make a movie.” —Rob Legato, ASC, Visual Effects Supervisor

OPEN SOURCE

It wasn’t that long ago that visual effects facilities and software developers jealously guarded the technologies – their “secret sauce” – that they developed in-house. That model has been upended to some degree by the advent of open source software – a kind of radical collaboration in which programmers release their code, enabling others to study, change and distribute whatever they create. Open source can speed up the development of software and encourage new features and uses. Pixar built its Universal Scene Description (USD), a 3D scene description/file format for content interchange and content

TOP: On the set of The Lion King, the first feature film ever shot entirely in Virtual Reality. From left: Producer/director Jon Favreau, Cinematographer Caleb Deschanel, ASC, Production Designer James Chinlund, Visual Effects Supervisor Robert Legato, ASC, and Animation Supervisor Andy Jones. (Photo: Michael Legato. Copyright © 2018 Disney Enterprises, Inc.) BOTTOM: Cinematographer Caleb Deschanel, ASC, turning the wheels of virtual production on the set of The Lion King in 2018. (Photo: Michael Legato. Copyright © 2018 Disney Enterprises, Inc.)






“Rapid and responsive prototyping is key. Intelligent, adaptive, modular pipelines capable of flexing to all manner of production scales will drive this force to a deeper creative engagement. We’re facing an evolving VFX landscape where those who harness the rapid adaptation between longform, streaming and VP content will provide the immediate creative playground our clients look for.” —Adam Paschke, Head of Creative Operations, Mill Film

TOP: Still from a demo of real-time in-camera visual effects created with Epic Games’ Unreal Engine 4. (Image courtesy of Epic Games) BOTTOM LEFT: Chuck Parker, Chief Executive, Sohonet BOTTOM RIGHT: Adam Paschke, Head of Creative Operations, Mill Film

creation, among different tools. Although USD is core to Pixar’s 3D pipeline, the company open-sourced the file format. At Foundry, Robinson reports his company has demonstrated the role of USD as an emerging standard for the representation of 3D scene data. “We’re looking at how that representation can be used for real-time playback and fit into workflows where you want to access those open standards and do editorial and playback in these real-time environments,” he says. “If USD is used in every portion of the pipeline, it represents a continuity of decision-making and a consistent framework for visualization. We’re investigating what this will mean for artist workflow.” Robinson adds that his company has published an open sourced machine learning framework that works within Nuke. “It’s a way of having discussion with customers,” he says. “If you’re a Nuke customer today, you have the framework that allows you to download and run samples of interesting techniques published in conferences as well as try what you want in-house. Open acceptance of the need to collaborate with customers and produce open source frameworks for them is important. It’s a hugely collaborative industry and that’s expressed in open source.” ACCURACY, DIGITAL ACTORS AND AI
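A sidebar for the curious: one reason USD spread so quickly as an interchange standard is that a layer can be authored as plain, human-readable text. The minimal .usda sketch below is our own illustration (the prim names are invented, not taken from Pixar’s sample files) of a tiny scene description:

```usda
#usda 1.0
(
    defaultPrim = "Set"
    metersPerUnit = 1
)

def Xform "Set"
{
    def Sphere "Ball"
    {
        double radius = 2
        double3 xformOp:translate = (0, 2, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Any USD-aware tool – a DCC package, a real-time engine, or a review player – can open, layer over, or re-export a file like this, which is what makes it a plausible backbone for the cross-tool pipelines Foundry describes.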

Virtual production and real-time technologies may dominate the conversation, but other developments have been impacting VFX. Valdez reports that he’s also noticed “general improvements of simulation in all aspects of the VFX work – lighting, FX, skin and muscle deformations. As VFX artists we now have to work in a very physically accurate way,” he says. “We’re becoming extremely accurate partially because we’re not cheating a stacked composite of elements, but building proper 3D spaces and working in proper 3D. It used to be more hacked together. Now you have to be very legitimate in space and simulations. It’s challenging.”

Beck believes that we’ll see the end of the uncanny valley for digital humans. “Entire performances are going to be done inside the computer,” he says. To that end, the Digital Human League – a disparate group of scientists and technologists – has formed Wikihuman, an open source central location for its work to understand and share knowledge about the creation of digital humans. The result is a non-commercial, free collection of technologies that provides artists with a baseline for creating digital humans.

AI and machine learning are also enabling the development of more tools in the VFX/post industry, with numerous companies working on applications. Foundry is partnering with a university and a post-production facility to examine the intersection of rotoscoping and machine learning. “The task of roto is still a creative one that will be done by artists, but there are a lot of accelerations you can do that will make it more efficient,” says Robinson, who adds that the project is on a two-year timescale to get to a prototype. “We want the artist firmly in the loop.”

The creation of digital media – long the purview of visual effects – will play an increasing role as the distribution and consumption of media assume different forms.

Autodesk Insights

Karen Dufilho

Award-winning Executive Producer, Google

Vision Quest: The Power of Trusting Your Voice

Image courtesy of Ryan Wai Kin Lam

Academy and Emmy award-winning producer Karen Dufilho has been working at the intersection of story, animation and innovation for almost two decades. From early on, Dufilho was attracted to artists who could create new worlds, and wanted to lend her talents to bold and unique storytelling. She helmed the Shorts Division at Pixar Animation Studios, overseeing its acclaimed original production slate, and went on to serve as Executive Producer at Google Spotlight Stories, where she was instrumental in introducing immersive, narrative content and producing groundbreaking short films.

The creative voices producing content and making decisions should be a reflection of our audience. Things are shifting, probably not enough, and there’s a lot more work to do. The lack of diversity in our business is an entrenched problem. If you’re not actively working to change things and be part of the future, or to realign people in the exchange of new ideas, or to bring other points of view into your work and your teams, then you’re doing it wrong.

Trust that your voice is unique and belongs here; you will find an audience.

From a corporate standpoint, diversity is essential. It’s about creating a space where everyone can thrive and create work that reflects and amplifies the real world that we are all living in. And the bottom line (however you measure that) will reflect that as well. The truth is, everyone’s ready.

I’m obsessed with stories and the people who tell them. I’ve had the great opportunity to support and enable a range of talent and projects through new tech, new thinking and new platforms. Most recently that’s included telling stories in 360 and VR. But at the end of the day, there’s nothing like creating something out of nothing, building a world, putting on a show. I didn’t really have a mentor coming up and I always felt that was missing. I definitely had support from people I worked with, and took every opportunity I could. But it takes a concerted energy, and an openness, to put someone else’s success and growth on your own to-do list. So it is a big priority for me to incorporate mentoring into my daily workflow and how I think about my own professional long-term strategy.

My advice to aspiring professionals: Trust that your voice is unique and belongs here; you will find an audience. There are so many more places and platforms to share your work and aggregate fans and followers, so build it, nurture it, grow your army of fans; therein lies your power. I see emerging new creative artists embracing this changing landscape in stride. Have tenacity and grit, and be that unstoppable force. And remember that it truly takes a village to bring visions to life, so maintain that sense of awe and respect for your peers and collaborators.

Ask Me Anything – VFX Pros Tell All: Join us for our series of interactive webinars with visual effects professionals. Ask your questions, learn about the industry and glean inspiration for your career path. Register today at: VisualEffectsSociety.com/AMA

VFX TRENDS

Today it’s virtual reality and augmented reality, the latter of which in particular is poised to become part of many vertical industries. Tomorrow? Beck notes that it’s likely that people will experience media through direct neural inputs and even provide interactive neural feedback at the same time.

GAME ENGINE DEMOCRATIZATION

“Unlike rasters that get slower with more triangles, ray tracing stays the same speed, so you can feed it huge scenes. Lavina is about combining or merging your biggest possible scenes in the V-Ray format, so you can connect to that for photorealism with the full continuum of quality.” —Phil Miller, Vice President of Product Management, Chaos Group
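Miller’s scaling point can be illustrated with a toy model (our sketch, nothing to do with Chaos Group’s actual renderer): a rasterizer must consider every triangle in the scene, while a ray tracer walks a bounding volume hierarchy (BVH), so the work per ray grows roughly with the logarithm of scene size rather than linearly.

```python
# Toy model of ray tracing vs. rasterization scaling (illustrative only).
# A rasterizer touches all N primitives; a ray descending a balanced BVH
# visits about log2(N) nodes. We simulate the descent over index ranges
# and count visited nodes instead of building a real tree.

def bvh_visits(n: int, target: int) -> int:
    """Nodes visited descending a balanced BVH over n primitives."""
    lo, hi, visits = 0, n, 1          # count the root node
    while hi - lo > 1:                # descend until a single-primitive leaf
        mid = (lo + hi) // 2
        lo, hi = (lo, mid) if target < mid else (mid, hi)
        visits += 1
    return visits

for n in (1_000, 1_000_000):
    print(f"{n:>9} primitives: raster touches {n}, BVH visits {bvh_visits(n, 0)}")
```

Growing the scene a thousandfold roughly doubles the per-ray traversal cost instead of multiplying it by a thousand, which is why a ray tracer can be “fed huge scenes” without the per-frame cost exploding.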

TOP: Visual effects can be achieved on set in real-time with Stargate’s ThruView, making them part of the filmmaking process. The combined output is piped into Unreal Engine at high resolution, playing back at up to 60 frames per second, and produces final-quality pixels. (Image courtesy of Stargate Studios) BOTTOM: Phil Miller, Vice President of Product Management, Chaos Group

In the shorter term, virtual production – like every other major technological leap forward – is exclusive to tentpole movies (or episodic TV). But that will change. Valdez points out that there’s so much interest in using it that it most likely will become more affordable over time. “We’re trying to make people understand that you don’t need tons of money or a giant mocap stage to take advantage of these tools,” says Valdez. “They’re part of enabling a better process.”

He also notes that, with game engines introduced to movie/TV production and post, the gaming generation is poised to make digital content. “The new generations are so much into games,” he says. “And they’ve been working with world-building for years.” That massive community and ecosystem around game engines, Valdez adds, is increasingly interested in making movies. They may add more to the mix, but they won’t overtake established VFX artists. In fact, says Legato, “VFX artists are becoming just filmmakers with real-time feedback. Because it’s the trend, more digital artists will become good cameramen, directors, animators, artists,” he says. “Doing it in real-time with real-time input, you begin to create a style that’s your own that will meld into the natural way to make a movie.”

Finally, notes Adam Paschke, Head of Creative Operations, Mill Film, “Rapid and responsive prototyping is key. Intelligent, adaptive, modular pipelines capable of flexing to all manner of production scales will drive this force to a deeper creative engagement. We’re facing an evolving VFX landscape where those who harness the rapid adaptation between longform, streaming and VP content will provide the immediate creative playground our clients look for. I foresee us offering a more responsive and expansive canvas on which to layer our collaborative playfulness.

“We will take advantage of deeper workflow technology that diffuses the administrative nature of complex data-handling and that instead engages the immediacy of flourishing idea-generation,” he adds. “That experience will bring fluidity, free from computational hindrance. Optimization of cloud storage, cloud computing, cloud rendering, machine learning, metric gathering and auto-optimization should be humming at the core of our facilities. Versatility of our tools will strengthen asset development within the tuning arena of emerging virtual production technology. Machine learning will lubricate rapid prototyping and optimization of consistent volume delivery for streaming content. Embracing dynamic, open-source vertical and horizontal update technology will enable the most modernized fine-tuning of feature film realism.”

Concludes Paschke: “Appropriate investments in the right corners of pipeline dynamism sound like a nerd’s utopia but will, within the next decade, ensure the creative moxie of all key creatives is right there at the ready, returning the energetic free-form element back to the artform. Bring on this future. We’re all ready!”



SFX TRENDS

HIGH DEMAND, STREAMING, TIGHT SCHEDULES MARK INDUSTRY GROWTH

By TREVOR HOGG

TOP: Sandra Bullock floats in zero gravity in Gravity, with practical assistance from Neil Corbould, VES. (Image copyright © 2015 Twentieth Century Fox Film Corporation) BOTTOM LEFT: Hayley J. Williams BOTTOM RIGHT: Neil Corbould, VES


In order to develop a better understanding of the current status, as well as what the future holds for the SFX industry, VFX Voice asked SFX supervisors from around the globe to provide insight about the state of their unique craft – combining imagination, engineering, technology and artistry to conjure practical magic on the big screen.

Hayley J. Williams (Annihilation, Maleficent: Mistress of Evil) has frequently collaborated with filmmaker Tim Burton (Dumbo) and has founded her own special effects company located at Longcross Studios in the U.K. “My father was a special effects supervisor and I worked closely with him for a good number of years. I’ve also worked closely with the likes of Neil Corbould, VES. It’s very much about learning the job on the job and having very good teachers.” In regards to the lack of female special effects supervisors, Williams believes, “Over the years there has gotten to be a few more of us in the industry. It wasn’t something you came across frequently when I started. Lack of opportunity back then had something to do with it. It’s obviously a bit of a fight. I’ve built my reputation up now, so I don’t feel like I face that fight anymore. For sure, early on I did.”

Neil Corbould, VES, is a celebrated member of one of the most renowned families in the history of special effects. With their uncle, Colin Chilvers, pioneering the way as the Special Effects Supervisor on Superman (1978), brothers Chris, Ian, Paul and Neil Corbould have each gone on to forge careers working on high-profile projects such as Inception (2010), Gravity (2013), Doctor Strange (2016) and Black Widow (2020). “When I was a special effects technician on Superman, we had a little tin shed at Pinewood Studios with a crew of 15 or 20 people,” recalls Neil Corbould, who has received Oscars for Gladiator (2000) and Gravity (2013). “Today, special effects crews are a lot bigger, with 80 or 90 people for a movie of that sort of size. Because the technology has moved on, we have more resources at hand, like CNC cutters, milling machines and waterjet cutters. What we can do now is far greater than what we could do back then, and for a reasonable price.”

Twenty-five years ago, a visual effects supervisor told Neil Corbould that special effects had five years left as a practical entity. “We’re probably stronger now than we’ve ever been. I like to work with people who embrace what we do and build on something that we’ve produced.”

Chris Corbould is a mainstay of James Bond films, including the latest, No Time to Die (2020), and received an Academy Award for Inception. “There was a time when I thought that the role of SFX would totally diminish and give way to the amazing world of VFX. Fortunately, there is a proportion of audiences that still crave some reality in the films they go to watch, with James Bond, Mission: Impossible and Christopher Nolan’s films being examples where physical effects play a major role. SFX crew numbers continue to rise on major films, and it is not unusual to have crews in excess of 100 technicians. However, the secret behind all these films is a good marriage between SFX and VFX, where the prime objective is to give the most realistic result while taking into consideration safety, cost and filming schedule.

“The two departments, although specializing in totally different skills,” he continues, “perfectly complement each other, where VFX will add/remove specific elements in an SFX scene and SFX will create real elements for VFX to enhance digital scenes. It is the duty of every SFX supervisor to continually explore the possibilities and work closely with VFX to create those jaw-dropping moments that audiences talk about and remember for years. The most crucial part of these events involves the development of original content and storyline, and all SFX supervisors should strive to be part of this creative process and inspire the next generation of SFX.”

TOP: Chris Corbould won an Oscar for Inception, which featured a gravity-defying hallway. (Photo courtesy of Warner Bros. Entertainment) MIDDLE: From left: Tessa Thompson and Natalie Portman in Annihilation, with Hayley J. Williams looking after the special effects. (Photo: Peter Mountain. Copyright © 2017 Paramount Pictures) BOTTOM LEFT: Chris Corbould BOTTOM RIGHT: Tony Kenny


Tony Kenny and his company, Dynamic Effects Canada, have worked on the Amazon series The Boys as well as features such as Pompeii (2014) and X-Men (2000). “We do everything wireless. Before, we would have miles and miles of wire running around for various things like bullet hits. It’s so much easier now in terms of digital gauges. Everything is much quicker. It’s so much easier to rig it, hook it into a receiver, stand back with a transmitter and fire it off. We use iGNIUS out of England, which is technically a blasting company. iGNIUS have their instruments set to an odd frequency that is not used by cellphones, walkie-talkies or anything else around a film set. They’re safe. We’ve been using them for years and they’re fantastic.”

R&D is project driven, Kenny adds. “We end up building things for one show, and on the next one we will have something similar, so we’ll use the same rig that was built and modify it. We’re constantly modifying our gear to make whatever we need happen on set.”

TOP: The Electra crash rig used for Amelia was built by Tony Kenny. MIDDLE LEFT: John Frazier MIDDLE RIGHT: Joel Whist BOTTOM: Joel Whist executed numerous explosions for War for the Planet of the Apes. (Image copyright © 2017 Twentieth Century Fox Film Corporation)


John Frazier has received Academy Awards for Spider-Man 2 (2004) and for technical achievements such as the pneumatic car flipper. “We’ve all been in those situations where we’re rushed. That’s when you say slow down. I remember one case when I got into it with Michael Bay. We were working on The Island (2005), and it was a dangerous stunt. We had the two stunt players on motorcycles being hung by the back of a truck and we’re going down the freeway. Michael said, ‘I’m going to go up to the other end, grab a couple of shots, and then we’re going to come back and do this.’ I said, ‘Okay, when you come back, I’ll need five to 10 minutes of your time.’ That was so we could go over the whole scenario and contingencies of what might happen. He went to the other end, got his shot, came back and said, ‘Let’s mount up and do the shot.’ I said, ‘No. We agreed that I’d get my 10 minutes.’ He said, ‘I used your 10 minutes at the other end getting the shot.’ I answered, ‘You’ve got to find 10 minutes.’ Without getting into a screaming match I told him to take a timeout. I thought that was going to be my last day, but that’s what you have to do. You can’t let them rush you.”

Joel Whist has a diverse blockbuster résumé that includes The Cabin in the Woods (2012), Sucker Punch (2011), The BFG (2016) and War for the Planet of the Apes (2017). “There’s so much 4K photography happening now, and the imagery is so crisp. We’re using a lot of atmosphere on stage and on set, because if you don’t, the image is too clean. It’s like watching a soap opera in the 1970s. Instead of using a haze you’re getting thick smoke, and that starts to become an issue with certain restrictions and crews. Normally, you would come onto a stage, fill it to a level that the DP likes, keep it at that level and keep shooting. What’s happening now is that you’re not allowed to put any atmosphere in on certain shows until the last minute. When you put it in that way, then it’s luck of the draw, because you’re just filling a section of the stage, which means you can get layers of smoke and movement of atmosphere, which is a dead giveaway.”

J.D. Schwalm has followed in the footsteps of his father, Special Effects Supervisor/Coordinator Jim Schwalm, to establish Innovation Workshop, and has worked on Venom (2018), First Man (2018) and Fast & Furious Presents: Hobbs & Shaw (2019). “When you shoot a car into a building or flip a car, there are imperfections that take place, whereas computers are going to give it more of an algorithm. Doing gags for real allows the CG guys to follow that template of the imperfection and make them bigger and better.”

TOP: Transformers: The Last Knight would not be complete without explosions from long-time Michael Bay collaborator John Frazier. (Photo: Andrew Cooper. Copyright © 2016 Paramount Pictures and Hasbro) MIDDLE: A seamless integration of special and visual effects in First Man led to J.D. Schwalm winning an Academy Award. (Image courtesy of Universal Pictures) BOTTOM LEFT: J.D. Schwalm BOTTOM RIGHT: Terry Glass


“I see less puppeteering going on. Now creatures are becoming mostly all CG. Maybe it’s a guy wearing a mocap suit with dots all over it... We’re at this equilibrium of where now it’s more enhancement. Shoot it practically and get what you can. It gives visual effects something to build off of. I feel fairly confident that I’ll be employed until I retire!” —Frank Iudica, Special Effects Supervisor

TOP: Frank Iudica utilized a rotating platform along with a rotating wire rig for the spacewalk scenes featured in Ad Astra. (Image copyright © 2019 Twentieth Century Fox Film Corporation) BOTTOM LEFT: Laird McMurray BOTTOM RIGHT: Frank Iudica


Oscar-winner First Man (2018) was an example of the art, special effects and visual effects departments coming together to make a seamless hybrid of practical and digital elements. “First Man proved that if everyone does their homework and studies the shots, you can make an amazing special effects movie affordably,” Schwalm says.

Technology has had a major impact on the special effects process. “Technology and the advancement of technology have given my company the ability to have a full-fledged manufacturing facility right down to fully automated computerized machines, milling machines, robotic arms, 3D printers and laser cutters. All of these things allow us to prototype in an extremely fast manner.”

Terry Glass, who early in his career was an effects technician on Raiders of the Lost Ark (1981) and most recently supervised Angel Has Fallen (2019) and 6 Underground (2019), wonders about the ramifications of the U.K. leaving the European Union. “Brexit came up on the Marvel show I’m working on. In theory, we’re going to leave the country with lots of equipment while we’re still part of the European Union, and we could be coming back out of the European Union. Do we have to bring it back on a different form? Do we import it back into England or is it just returning?

“They had this damn referendum without actually thinking what would happen if they said, ‘We want to go out,’” says Glass. “Our explosive legislation that affects us greatly is based on European explosive legislation. Will that change? I hope so because it will make life easier.”

Tax credits make a big difference, says Glass, adding, “The production world has gotten so corporate. Before, you used to get producers coming up through the floor as first and second ADs. Now producers are coming through the accounts side of it. Sometimes they don’t see what actually goes on to make it. Without the money you can’t have the big effects.”

Streaming services have had a significant impact on U.K. productions. “Everything seems to be Netflix at the moment. They’re coming at it from a slightly different angle. We still work on the premise that

whatever we’re going to do is going to be seen on a big screen, not television, although most things are shown on a TV screen these days!”

Laird McMurray has collaborated on Take This Waltz (2011), Pacific Rim (2013) and Suicide Squad (2016). “In terms of photorealism, [the industry is] not going to replace actors. I don’t think [audiences] will accept it. It’s one thing to go to a Disney movie and say that the lion looks real. Actors sell movies. People want to read about them in the tabloids. Molly’s Game (2017) had everyday effects where you’re doing some rain on a window, wind on a ski slope and maybe some blowing snow. Those types of effects are still cheaper – especially when they’re [built] around the actor – to do practically than to generate [digitally].”

Streaming services are responsible for a bad trend, according to McMurray. “They spend amazing amounts of money on their programming, which is wonderful and looks great, but are still shooting on a TV-style schedule. That’s anathema for effects guys. We need time to develop things and ensure safety and test. Television schedules don’t allow for that. You start at 7 a.m. on Monday, finish at 7 a.m. Saturday and are back at work at 7 a.m. on Monday. I won’t do that.”

Frank Iudica started off as a special effects technician on Alien: Resurrection (1997) and subsequently supervised Fear the Walking Dead (TV series premiering 2015) and Ad Astra (2019). “I see less puppeteering going on. Now creatures are becoming mostly all CG. Maybe it’s a guy wearing a mocap suit with dots all over it and he’s the Velociraptor, where it used to be [if possible] you would try to have the actual thing on hand. In Alien: Resurrection, there was this baby alien nine feet tall that was entirely animatronic. We’re at this equilibrium of where now it’s more enhancement. Shoot it practically and get what you can. It gives visual effects something to build off of. I feel fairly confident that I’ll be employed until I retire!”

“I have a theory that CG is so brilliant and able to take you to a place that is beyond reality that in many ways as human beings we need to feel grounded before we go there. It’s almost like we need to cue up before we get on the roller coaster ride. What a lot of productions feel is by using practical effects you ground your audience, and then they are ready and willing to go somewhere in CG.” —Neal Scanlan, Special Effects Supervisor

TOP: A ship gimbal for Pacific Rim constructed by Laird McMurray. (Photo courtesy of Warner Bros. Pictures) BOTTOM LEFT: Neal Scanlan (Photo: Ed Miller) BOTTOM RIGHT: Mark Hawker

SFX TRENDS

“I see less puppeteering going on. Now creatures are becoming mostly all CG. Maybe it’s a guy wearing a mocap suit with dots all over it... We’re at this equilibrium of where now it’s more enhancement. Shoot it practically and get what you can. It gives visual effects something to build off of. I feel fairly confident that I’ll be employed until I retire!” —Frank Iudica, Special Effects Supervisor

TOP: Frank Iudica utilized a rotating platform to go along with a rotating wire rig for the spacewalk scenes featured in Ad Astra. (Image copyright © 2019 Twentieth Century Fox Film Corporation) BOTTOM LEFT: Laird McMurray BOTTOM RIGHT: Frank Iudica

Computer-based motion-control platforms are a major innovation, in essence creating a civilian flight simulator. “It used to be that they would have to either free fly it or do a flight sequence that needs to be choreographed with turbulence,” Iudica says. “You can, through rehearsal, get all of that dialed in, record that move and repeat it. Now you can have a jet fighter flying through a canyon and plug that action into the craft so the platform is following that move and the actor is pretending.”
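The record-and-repeat workflow Iudica describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual control software: a rehearsed move is stored as timestamped poses, then replayed by interpolation so the platform repeats the identical motion take after take (the `Pose` and `MotionProgram` names are invented for the example).

```python
import bisect
from dataclasses import dataclass

@dataclass
class Pose:
    t: float        # seconds since the start of the move
    roll: float     # degrees
    pitch: float    # degrees
    yaw: float      # degrees
    heave: float    # metres of vertical travel

class MotionProgram:
    """Record a rehearsed platform move, then replay it deterministically."""

    def __init__(self):
        self.keys: list[Pose] = []

    def record(self, pose: Pose):
        """Append one rehearsal sample (assumed to arrive in time order)."""
        self.keys.append(pose)

    def sample(self, t: float) -> Pose:
        """Linearly interpolate the recorded move at time t for playback."""
        times = [k.t for k in self.keys]
        i = bisect.bisect_right(times, t)
        if i == 0:
            return self.keys[0]          # before the move starts
        if i == len(self.keys):
            return self.keys[-1]         # after the move ends
        a, b = self.keys[i - 1], self.keys[i]
        u = (t - a.t) / (b.t - a.t)
        lerp = lambda x, y: x + (y - x) * u
        return Pose(t, lerp(a.roll, b.roll), lerp(a.pitch, b.pitch),
                    lerp(a.yaw, b.yaw), lerp(a.heave, b.heave))
```

In practice the same idea lets a previsualized flight path (say, the jet-through-a-canyon move) be imported as keyframes and fed to the gimbal at the set's control rate.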

“With the 120 frame rate you have no motion blur, so everything is crisp and clear. Doing something like squibs you can [clearly] see what [the result] is. Or if we’re shooting dust balls [the camera] picks up everything. We’ve done rain in the past with rain interacting with the sets and actors. In the deep background you can do a wet down and get away with it. But when you’re at 3D HD at 120 frames you can see exactly where the rain starts and stops.” —Mark Hawker, Special Effects Supervisor

TOP: Neal Scanlan on the set of Solo: A Star Wars Story with Six-Eyes. (Photo: John Wilson. Image copyright © 2017 Lucasfilm Ltd.) BOTTOM: A complicated gimbal was constructed by Mark Hawker for the A Wrinkle in Time cast to stand upon. (Image copyright © 2018 Walt Disney Pictures)

Neal Scanlan has developed a reputation for creating creature effects from Babe (1995) to the Star Wars franchise. “It’s natural within any industry to push boundaries as far as one can go, and to use all of the technology and methodology that you can in order to create the most convincing effect. Nowadays, I’m looking back to the Babe days [when the approach to effects was more hands-on]. For example, when Blue is lying on an operating table in Jurassic World: Fallen Kingdom (2018), that was a complete rod puppet. Twenty puppeteers were on a rostrum working together to bring that to life. To me, there’s something about placing your hand onto an entity and bringing it to life that gets something which is soulful and magical that could never be achieved robotically.”

Visual effects are best when digital and practical elements are combined, says Scanlan. “I have a theory that CG is so brilliant and able to take you to a place that is beyond reality that in many ways as human beings we need to feel grounded before we go there. It’s almost like we need to cue up before we get on the roller coaster ride. What a lot of productions feel is by using practical effects you ground your audience, and then they are ready and willing to go somewhere in CG.”


Mark Hawker has a career spanning Pirates of the Caribbean: On Stranger Tides (2011) to A Quiet Place (2018), and had to deal with the 120 fps frame rate and HDR imagery of Gemini Man (2019). “With the 120 frame rate you have no motion blur, so everything is crisp and clear. Doing something like squibs you can see [clearly] what [the result] is. Or if we’re shooting dust balls [the camera] picks up everything. In the past we’ve done rain interacting with the sets and actors. In the deep background you can do a wet down and get away with it. But when you’re at 3D HD at 120 frames you can see exactly where the rain starts and stops.”

In conclusion, the digital revolution has enabled filmmakers to capture their unbridled imagination, which has expanded the scope of special effects, while streaming services have increased the demand. There appears to be no shortage of work, but ever-shortening production schedules may reach a breaking point: either more time is budgeted for effects work, or cinematic ambitions will have to be curbed for the sake of safety. The rapid growth in technology has been critical in making the manufacturing process faster and more efficient, thereby allowing special effects supervisors to enhance the storytelling in ways that were once considered impossible.
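There is simple arithmetic behind Hawker's point about 120 fps: for a fixed shutter angle, exposure time per frame — and therefore the length of the blur streak a moving object leaves on each frame — is inversely proportional to frame rate. A back-of-the-envelope sketch (the 180° shutter and rain speed are illustrative assumptions, not figures from the production):

```python
def blur_streak_mm(speed_mm_per_s: float, fps: float, shutter_deg: float = 180.0) -> float:
    """Length of the blur streak left by an object moving at constant speed.

    Exposure time per frame = (shutter angle / 360) / fps,
    so the streak length is speed x exposure time.
    """
    exposure_s = (shutter_deg / 360.0) / fps
    return speed_mm_per_s * exposure_s

# Rain falling at roughly 9 m/s (9000 mm/s) with a 180-degree shutter:
print(blur_streak_mm(9000, 24))   # 187.5 mm streak at 24 fps
print(blur_streak_mm(9000, 120))  # 37.5 mm at 120 fps -- five times shorter
```

At five times the frame rate each drop smears a fifth as far, which is why tricks that hide in 24 fps motion blur (a background wet-down standing in for rain) become visible at 120 fps.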



FILM

TRANSFORMING AGES, FACES AND PLACES IN THE IRISHMAN By TREVOR HOGG

Images courtesy of Netflix, Fábrica de Cine, STX Entertainment, Sikelia Productions and Tribeca Productions. TOP: Ray Romano (Bill Bufalino), Al Pacino (Jimmy Hoffa) and Robert De Niro (Frank Sheeran). OPPOSITE TOP TO BOTTOM: Original plate photography of Robert De Niro shot with a RED DRAGON camera. The younger version of Robert De Niro was created with the help of the infrared photography captured by two ALEXA Mini cameras. Making the ‘youthification’ process more challenging was not using facial markers and headcams on Robert De Niro. In order to create younger versions of De Niro, Al Pacino and Joe Pesci, two years were spent building a library of performances of the targeted age for the three actors.


Whenever Robert De Niro and Martin Scorsese collaborate, gangsters seem to be part of the storyline, whether it be Mean Streets, Goodfellas, Casino or now The Irishman. The biographical crime drama is based on the book I Heard You Paint Houses: Frank “The Irishman” Sheeran and Closing the Case on Jimmy Hoffa by Charles Brandt, which recounts the life of mob hitman and World War II veteran Frank Sheeran, who claimed to have killed labor leader Jimmy Hoffa. Netflix invested $200 million in the production, which features De Niro starring alongside Al Pacino, Joe Pesci, Harvey Keitel, Bobby Cannavale, Anna Paquin and Ray Romano.

Complicating matters for Production Visual Effects Supervisor Pablo Helman (War of the Worlds) is the fact that Frank Sheeran (Robert De Niro), Jimmy Hoffa (Al Pacino) and mafioso Russell Bufalino (Joe Pesci) appear throughout the 300 scenes at different ages, and Scorsese did not want to utilize traditional motion-capture techniques, as he felt they would hinder his filmmaking process.

Helman previously collaborated with Scorsese on Silence. “It’s incredible to work with and learn from such a master,” acknowledges Helman. “Marty blocks everything on paper to give everybody who is part of the production a sense of how he’s thinking about scenes. Marty has an incredible eye and memory. Then, oddly enough, it’s all about the gut and feeling. Sometimes when Marty looks at the work that we’re doing he starts to remember how he felt about this or that performance or why he selected a particular performance. It’s up to us to interpret what he has in mind and make him happy.”

The Irishman is a performance-heavy movie. “The cinematic narrative that Marty uses is that of a masterwork,” remarks Helman. “There is a lot of dialogue and close performances that have to do with the eyes, the chin, how the nostrils move, and the timing choices made by the actors. There were some previs pieces that we had to make, and besides the facework that we’re doing in the movie there are numerous complicated situations that had to do with starting the shot on location and marrying that into a set or vice versa. Those types of things are difficult to improvise. The previs informed everybody about the methodology and resources needed to make the shot work.”

The story spans a period of 40 years with a theatrical runtime of 210 minutes. Of the 2,300 shots, about 1,700 had visual effects, with 1,000 featuring de-aged faces. ILM served as the main vendor, with the work divided between facilities in San Francisco and Vancouver, while 2D makeup fixes were provided by SSVFX and Vitality VFX. “We see Robert De Niro when he is 24, 30, 36, 41, 50, 65, and after that the makeup takes over to get to 83 when he dies. Joe Pesci is more limited in that he starts at 50 and ends at 83. Al Pacino starts at 44, ends at 62, and was a completely CG performance.”

Because of the frequent cross-cutting between the various ages of Sheeran, Hoffa and Bufalino, it was impossible to cast different actors for each of them and still have the audience make a connection with the characters. “Marty is all about performances and character. It has been a difficult effort extracting performances from the actors and translating them into the younger versions.”

De Niro, Pacino and Pesci made different acting decisions for their roles. “Robert De Niro is the quietest of the three,” observes Helman. “His performance is subtle, with his eyes, the movement of the chin, and the way his face wrinkles into showing us an expression. He internalizes everything going on around him. Joe Pesci is subtle but sarcastic. You always know that there’s something behind what he said. Al Pacino is boisterous because Jimmy Hoffa was a hyper person.”

No keyframe animation was utilized for the de-aging. “Everything was captured in 3D and re-targeted into a younger version of the actor,” states Helman. “The process enabled us to understand how a performance gets put together in the faces of these actors. It has been an incredible experience just to learn what De Niro means when he raises an eyebrow.

“I come from having worked with motion capture for five years and having markers on the actors’ faces as well as being in a controlled environment,” explains Helman. “After our first meeting with De Niro and Marty we were told that they weren’t going to wear any markers or helmets on set. The system that we researched and developed is called Flux, which is performance capture based on lighting and textures. We also created a three-camera rig so wherever the center RED DRAGON camera with a Helium sensor was moving, there were two ALEXA Mini witness cameras situated to the left and right of it that were capturing infrared footage. The two witness cameras were outfitted with infrared lights in order to throw infrared light onto the actors that was not seen by the center camera. What that did was neutralize






“We see Robert De Niro when he is 24, 30, 36, 41, 50, 65, and after that the makeup takes over to get to 83 when he dies. Joe Pesci is more limited in that he starts at 50 and ends at 83. Al Pacino starts at 44, ends at 62, and was a completely CG performance.” —Pablo Helman, Visual Effects Supervisor

TOP: Al Pacino as Jimmy Hoffa was a completely CG performance as he starts at age 44 and ends at 62. BOTTOM: The Flux rig featuring a RED DRAGON camera with a Helium sensor and two ALEXA Mini witness cameras capturing infrared footage.


the light direction so Flux could take those three cameras, triangulate the actor in front of the camera, and generate geometry that captured every facial nuance.” Because Flux is dependent on lighting and texture, De Niro, Pacino and Pesci did not wear any makeup.

“In order to create a younger version of the actors we spent two years building a library of performances of the targeted age for all of these actors,” reveals Helman. “Every one of those performances were catalogued by content, such as moods. That was also part of the software. Flux captures and interprets the performance at the contemporary age of the actor. The actors are 76 and 78 years old. We also had versions of the actors at all of the ages that needed to be represented. There was an effort to put all of those renders through our library, and that gave us an output of several faces from different projects they had been in, taking into consideration their performance, lighting and lens size.”

At the beginning of the project, a Medusa session was held where De Niro, Pacino and Pesci were asked to do specific performances. An interesting discovery was that their approach towards acting had changed from when they were younger. “There were some expressions that we couldn’t find any examples of,” states Helman. “That is also part of how Scorsese subtly designs a character. Four years ago, when starting The Irishman, I thought, ‘We’ll make De Niro look like he did in Goodfellas or Casino.’ But it wasn’t the case. We’re making someone who appears as the character of Frank Sheeran. When De Niro played Al Capone in The Untouchables, he was a completely different person.” Adjustments were made to the body posture to create a sense of history. “We also had to do the hands as well as mix digital hair with what was on set in order to get rid of wig lines.”

Sets and locations had to be seamlessly blended together. “There was a difference in exposure and light,” notes Helman. “The interiors used LEDs because the infrared light does not contaminate the spectrum. But when you go outside there is a

lot of infrared light that comes from the sun, so we had a specific set of filters. It has been an incredible collaboration with Rodrigo Prieto ASC, AMC (Silence), as what we were trying to do had to be exact and match his lighting and the performance of the actors.” The Phantom camera was utilized for slow-motion shots. “Some shots were shot 300 or 400 frames per second. It’s great to see that, because the motion blur is minimized and you can really see the performances on the faces.”

A close partnership was had with production designer Bob Shaw (The Wolf of Wall Street) in determining what needed to be practically and digitally built to convey the proper time period. “Scorsese is a method director, meaning that if there are 700 extras on set and the camera is looking in one direction, he still wants them to be performing even if they’re not in the frame. He is trying to build a world as much as possible that is real. But obviously there is the economics part of it and the logistics of shooting in New York.” As a production designer recreates a period with set dressing and props, the same approach was taken in making age-appropriate characters. “Before Marty saw any work, he asked me, ‘What is this going to do to my movie?’ I answered, ‘It’s going to create the context for the period that you’re trying to depict.’”

“The main achievement here is that we captured the actors’ performances on set under theatrical light without markers on their faces,” believes Helman. “Also, there was no other performance capture in a controlled environment. What you got was what you got. It’s taking technology away from the facial performance that takes place and translating those performances into younger versions of themselves. You’re going to see an hour and a half worth of work where Robert De Niro, Al Pacino and Joe Pesci age in front of your eyes. I’m looking forward to going to the back of the theater and listening to people react to what we’re trying to do. It was an incredible project for us to work on because the visual effects are completely at the service of the live-action shoot.”
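The article does not disclose how Flux solves for geometry, but the three-camera rig Helman describes rests on standard multi-view triangulation: a feature observed by two or more calibrated cameras pins down one 3D point. The textbook direct linear transform (DLT) sketch below is offered purely as a generic illustration of that principle, not as ILM's actual solver; the toy camera matrices are invented for the example.

```python
import numpy as np

def triangulate(views):
    """Least-squares triangulation of one 3D point from calibrated cameras.

    views: list of (P, (u, v)) pairs, where P is a 3x4 projection matrix and
    (u, v) is the pixel at which the feature was observed. Each view
    contributes two rows of the homogeneous system A @ X = 0 (the classic
    DLT construction), which is solved via SVD.
    """
    rows = []
    for P, (u, v) in views:
        rows.append(u * P[2] - P[0])   # u * (row 3) - (row 1)
        rows.append(v * P[2] - P[1])   # v * (row 3) - (row 2)
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]                         # null-space vector = homogeneous point
    return X[:3] / X[3]                # de-homogenize

# Two toy cameras one unit apart, both seeing the world point (1, 2, 5):
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
print(triangulate([(P1, (0.2, 0.4)), (P2, (0.0, 0.4))]))  # ~[1. 2. 5.]
```

With the witness cameras flanking the taking camera, every tracked facial feature yields such a point per frame, which is one way a deforming face mesh could be recovered without markers.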

TOP TO BOTTOM: The frequent cross-cutting between the various ages of Sheeran, Hoffa and Bufalino made it impossible to cast different actors for each timeframe and still allow the audience to make a connection with the characters. The four different stages required to create a younger version of Robert De Niro. Joe Pesci portrays Russell Bufalino from the age of 50 to 83. Pablo Helman, Visual Effects Supervisor (Photo: Greg Grusby)



FILM

“We see Robert De Niro when he is 24, 30, 36, 41, 50, 65, and after that the makeup takes over to get to 83 when he dies. Joe Pesci is more limited in that he starts at 50 and ends at 83. Al Pacino starts at 44, ends at 62, and was a completely CG performance.” —Pablo Helman, Visual Effects Supervisor

TOP: Al Pacino as Jimmy Hoffa was a completely CG performance as he starts at age 44 and ends at 62. BOTTOM: The Flux rig featuring a RED DRAGON camera with a Helium sensor and two ALEXA Mini witness cameras capturing infrared footage.

42 • VFXVOICE.COM WINTER 2020

PG 40-43 THE IRISHMAN.indd 42-43

the light direction so Flux could take those three cameras, triangulate the actor in front of the camera, and generate geometry that captured every facial nuance.” Because Flux is dependent on lighting and texture, De Niro, Pacino and Pesci did not wear any makeup. “In order to create a younger version of the actors we spent two years building a library of performances of the targeted age for all of these actors,” reveals Helman. “Every one of those performances were catalogued by content, such as moods. That was also part of the software. Flux captures and interprets the performance at the contemporary age of the actor. The actors are 76 and 78 years old. We also had versions of the actors at all of the ages that needed to be represented. There was an effort to put all of those renders through our library, and that gave us an output of several faces from different projects they had been in, taking into consideration their performance, lighting and lens size.” At the beginning of the project, a Medusa session was held where De Niro, Pacino and Pesci were asked to do specific performances. An interesting discovery was that their approach towards acting had changed from when they were younger. “There were some expressions that we couldn’t find any examples of,” states Helman. “That is also part of how Scorsese subtly designs a character. Four years ago, when starting The Irishman, I thought, ‘We’ll make De Niro look like he did in Goodfellas or Casino.’ But it wasn’t the case. We’re making someone who appears as the character of Frank Sheeran. When De Niro played Al Capone in The Untouchables, he was a completely different person.” Adjustments were made to the body posture to create a sense of history. “We also had to do the hands as well as mix digital hair with what was on set in order to get rid of wig lines.” Sets and locations had to be seamlessly blended together. “There was a difference in exposure and light,” notes Helman. 
“The interiors used LEDs because the infrared light does not contaminate the spectrum. But when you go outside there is a lot of infrared light that comes from the sun, so we had a specific set of filters. It has been an incredible collaboration with Rodrigo Prieto, ASC, AMC (Silence), as what we were trying to do had to be exact and match his lighting and the performance of the actors.” The Phantom camera was utilized for slow-motion shots. “Some shots were shot at 300 or 400 frames per second. It’s great to see that, because the motion blur is minimized and you can really see the performances on the faces.” A close partnership was formed with production designer Bob Shaw (The Wolf of Wall Street) to determine what needed to be built practically and digitally to convey the proper time period. “Scorsese is a method director, meaning that if there are 700 extras on set and the camera is looking in one direction, he still wants them to be performing even if they’re not in the frame. He is trying to build a world as much as possible that is real. But obviously there is the economics part of it and the logistics of shooting in New York.” Just as a production designer recreates a period with set dressing and props, the same approach was taken in making the characters age-appropriate. “Before Marty saw any work, he asked me, ‘What is this going to do to my movie?’ I answered, ‘It’s going to create the context for the period that you’re trying to depict.’” “The main achievement here is that we captured the actors’ performances on set under theatrical light without markers on their faces,” believes Helman. “Also, there was no other performance capture in a controlled environment. What you got was what you got. It’s taking technology away from the facial performance that takes place and translating those performances into younger versions of themselves. You’re going to see an hour and a half worth of work where Robert De Niro, Al Pacino and Joe Pesci age in front of your eyes. I’m looking forward to going to the back of the theater and listening to people react to what we’re trying to do.
It was an incredible project for us to work on because the visual effects are completely at the service of the live-action shoot.”
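The three-witness-camera setup Helman describes rests on classic multi-view triangulation: once the same facial feature is located in two or more calibrated views, its 3D position is overdetermined. Below is a minimal sketch of the standard linear (DLT) method; this is generic textbook geometry, not ILM’s proprietary Flux code.

```python
import numpy as np

def triangulate(proj_mats, points_2d):
    """Recover one 3D point from its projections in N calibrated cameras.

    proj_mats: list of 3x4 camera projection matrices.
    points_2d: matching list of (u, v) image coordinates.
    Each view contributes two linear constraints; the point is the
    null vector of the stacked system (linear/DLT triangulation).
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.stack(rows))
    X = vt[-1]          # null vector, homogeneous coordinates
    return X[:3] / X[3]  # de-homogenize
```

With three cameras, every tracked feature yields six constraints on three unknowns, which is part of what makes markerless facial geometry reconstruction tractable.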

TOP TO BOTTOM: The frequent cross-cutting between the various ages of Sheeran, Hoffa and Bufalino made it impossible to cast different actors for each timeframe and still allow the audience to make a connection with the characters. The four different stages required to create a younger version of Robert De Niro. Joe Pesci portrays Russell Bufalino from the age of 50 to 83. Pablo Helman, Visual Effects Supervisor (Photo: Greg Grusby)

WINTER 2020 VFXVOICE.COM • 43



TV

WORLD-BUILDING: ADDING SCOPE TO THE DARK CRYSTAL: AGE OF RESISTANCE By IAN FAILES

Images courtesy of DNEG and Netflix. TOP: One of the many digital environments created by DNEG for Age of Resistance. OPPOSITE TOP TO BOTTOM: A bluescreened puppeteer operates a Mystic on set. DNEG removed the puppeteer and extended the environment for the final shot. The characters encounter a Nurloc, performed by a greenscreened puppeteer. The completed Nurloc composite.


Netflix and The Jim Henson Company’s journey back to the Dark Crystal world in the streaming series The Dark Crystal: Age of Resistance brought with it an incredible opportunity to return to the practical effects and puppetry seen in the original 1982 film. Yet, with so much advancement not only in puppetry but also in digital visual effects, this prequel series could take advantage of the latest in filmmaking technology. Indeed, Age of Resistance contains a whopping 4,288 shots, crafted by DNEG and overseen by Visual Effects Supervisor Sean Mathiesen. Here, the VFX Supervisor breaks down the wide range of work involved, all the way from puppeteer and rod removals to expansive environments and CG characters. With more than 4,000 shots to complete, it was a two-year process for Mathiesen. What he took away from the production, in particular, was the sheer level of detail that went into making the show and filling out the world of Thra in which it occurs. “There are so many fine nuances that make everything feel as magnificent as they do,” he says. “That goes for the puppetry and the visual effects, too.” Since the show would be relying on puppets and puppeteers performing the action on stages, a significant portion of Mathiesen and DNEG’s visual effects duties was puppeteer and rod removal. When they were in full view of the camera, puppeteers tended to perform scenes wearing bluescreen or greenscreen clothing. Clean plates, photographic reference, scanned sets and an enormous roto and paint effort led to the final shots. Some shots were also filmed entirely against greenscreen or bluescreen. For a number of the puppets, in particular the Gelfling, DNEG carried out facial augmentation work in terms of blinks, brow movements and sometimes jaw lines. The character Deet, for example, had eyes that took up a large amount of space in the puppet head. Normally servos would be able to take care of some blinking, but they could not fit into the head.
“So we would give her blinks, and then move eyebrows and cheekbones and squidge up her nose just a little bit,” says Mathiesen. “It was all those kinds of very subtle movements that don’t actually read much unless they aren’t there.” Generally, the puppeteers worked inside a sunken set with the floor elevated on a platform, allowing them to puppeteer from below. But for shots requiring Gelfling to walk or run as ‘full-body’ characters, these were crafted in CG by DNEG. “We scanned the puppets and then made digi-doubles of most of the hero ones and some background puppets as well,” explains Mathiesen. “Then whenever they do a stunt, jump out a window or run across the floor, that would be digital, too.” An extension of the Gelfling digital-double work was the completely CG creatures, of which there were many. This included The Hunter, or skekMal, who at one point leaps from tree to tree. Then there are the Crystal Skimmers, large flying transportation creatures, plus many other background – and sometimes foreground – inhabitants of Thra. The Spitters, for instance, began life as puppets but were particularly complex to operate, given that they had six legs. DNEG re-created the creature for certain actions, plus for a moment where they all combine. One creature, Lore, was initially planned to be entirely realized in CG, but actually came to the screen mostly via puppetry. It is essentially a pile of rocks that comes to life. DNEG crafted a digital version of Lore that aided in removing puppeteers for certain shots. However, notes Mathiesen, “We wanted to make sure that they made a puppet version of it, at least to have as lighting reference to put in any scene where CG Lore would be. But then the puppeteers rose to the occasion and figured out how to puppeteer it, completely dressed in blue, with the creature connected to boots and rods to move in the right way.” The world of Thra was also brought to life via exotic landscapes and extensions crafted by DNEG.
The sanctuary trees in the Caves of Grot were full CG environments, as were several others. Sometimes, the studio would take stock footage plates, such as a volcanic plain in Iceland, and alter that to stand in for the world. “We might remove the notable features of Iceland’s mountain ranges and replace those with a more fantasy-oriented range,” details Mathiesen. The final piece of the VFX puzzle for Age of Resistance was magic, a mix of lighting FX, dust, lightning, strange beams, and other depictions of wizardry and religious-like imagery and symbology on the show. Along with the creatures and environments, this magic added to the already complex nature of the narrative, notes Mathiesen, but it also meant he could find specific aspects of the rich world of Thra to draw upon in completing the visual effects shots with DNEG. “There’d be things about the final shots where I’d say to the team, ‘Make it more blue.’ I’d go off into some weird abstract story about how Thra has three suns and one of them has a little bit more blue, and so therefore we should composite the shot a certain way. Everybody would say, ‘Oh my God, why didn’t you just say you want it more blue?’ It might be easier to do that way, but I found myself thinking like that once I’d been part of the deep mythmaking side of Henson’s world.”
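Underlying the blue- and greenscreen puppeteer removal described above is matte extraction. Here is a toy blue-dominance keyer on floating-point RGB frames, purely illustrative; production keying (and the roto and paint effort DNEG describes) handles spill, soft edges and motion blur far more carefully.

```python
import numpy as np

def bluescreen_matte(frame, threshold=0.3):
    """Alpha matte from blue dominance: 1.0 = keep (foreground puppet),
    0.0 = bluescreen (puppeteer/backing to be replaced)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    blue_dominance = b - np.maximum(r, g)
    return np.clip(1.0 - blue_dominance / threshold, 0.0, 1.0)

def composite(fg, bg, alpha):
    """Place the keyed foreground over a digital environment extension."""
    a = alpha[..., None]
    return fg * a + bg * (1.0 - a)
```

A pure-blue pixel keys to 0 (fully replaced by the environment plate), while puppet tones key to 1 and survive the composite.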



TECH & TOOLS

FOR COMPANIES AT THE FOREFRONT, THE FUTURE IS REAL-TIME By IAN FAILES

The art and technology of visual effects always seem to be changing, and one technology has brought about rapid change for the industry. That technology is ‘real-time’. For the most part, this means real-time rendering, particularly via game engines and the use of GPUs. Real-time rendering has been used for a great deal of virtual production, for everything from previs, set scouting, live motion capture, simulcam work, on-set compositing and in-camera visual effects to actual final-frame rendering. Some of the many benefits of real-time include instant or faster feedback to cast and crew, and the pushing of more key decisions into pre-production and shooting instead of post. The landmark projects from this past year to incorporate real-time include The Lion King and The Mandalorian, but it’s a technology that’s permeating a wide range of both large and small VFX productions. Looking to the future, VFX Voice asked a number of companies invested in real-time how their tools and techniques are likely to impact the industry.
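One of the workflows listed above, in-camera visual effects on LED walls, rests on a simple geometric idea: the wall imagery is re-rendered every frame from the tracked film camera’s position using an off-axis projection, so the background parallax stays correct through the lens. Below is a sketch of the textbook computation (a Kooima-style generalized perspective projection); this is the general technique, not any specific engine’s implementation.

```python
import numpy as np

def off_axis_frustum(pa, pb, pc, pe, near):
    """Frustum extents (l, r, b, t) at the near plane for a flat screen.

    pa, pb, pc: screen lower-left, lower-right, upper-left corners.
    pe: tracked camera/eye position.
    The result feeds a glFrustum-style asymmetric projection matrix.
    """
    vr = (pb - pa) / np.linalg.norm(pb - pa)  # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)  # screen up axis
    vn = np.cross(vr, vu)                     # screen normal, toward eye
    va, vb, vc = pa - pe, pb - pe, pc - pe    # corners relative to eye
    d = -np.dot(va, vn)                       # eye-to-screen distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return l, r, b, t
```

As the tracked camera moves off-center, the frustum becomes asymmetric, which is exactly what keeps the rendered background lined up with the physical set.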

TOP: A still from Unity’s real-time rendered The Heretic, which utilized many of the latest authoring workflow tools that the game engine maker is implementing into its engine. (Image courtesy of Unity Technologies) BOTTOM: The seam between the LED wall and the live-action set is designed to be hidden, with the background, lighting and many other elements able to be changed in real-time. (Image courtesy of Epic Games)

TOP: A one-billion polygon KitBash3d city designed by Evan Butler, as rendered in Lavina. (Image courtesy of Chaos Group)

REALLY REAL-TIME: IN-CAMERA VFX

Epic Games, maker of Unreal Engine, is one of the companies at the forefront of real-time. A recent demo set up by Epic Games showed real-time technologies coming together to enable in-camera VFX during a live-action shoot. The demo featured an actor on a motorbike filmed live-action against a series of LED wall panels. The imagery produced on the walls blended seamlessly with the live-action set, and could also be altered in real-time. “An LED wall shoot has the potential to impact many aspects of filmmaking,” suggests Ryan Mayeda, Technical Product Manager at Epic Games. “It can enable in-camera finals, improve greenscreen elements with environment lighting when in-camera finals may not be feasible, improve the ability to modify lighting both on actors and within the 3D scene live on stage, to modify the 3D scene itself live on stage, for actors to see the environment that they’re acting against in real-time, to change sets or locations quickly, to re-shoot scenes at any time of day with the exact same setup, review and scout scenes simultaneously in VR, use gameplay and simulation elements live on stage and much more.” “Filmmakers love it because they can get more predictability into exactly what a given scene or project will demand,” says




Mayeda, on the reaction to the demo. “Art directors get very excited about the virtual location scouting features and the ability to collaborate in real-time with the director and DP. DPs have been blown away by the range of lighting control virtual production affords and the speed at which they can operate the system. VFX teams see the possibility of unprecedented collaboration and more simultaneous, non-linear workflows.”

THE RENDER PIPELINE

Unity Technologies is another of the big players in real-time rendering and virtual production with their Unity game engine. Like Unreal Engine, Unity is finding an increasing amount of use inside studio production pipelines, whether those studios are making games, experiences or VFX shots. One major area of development for Unity has been the creation of its Scriptable Render Pipeline, part of its push to make content authoring workflows more accessible for developers. Unity’s Scriptable Render Pipeline allows developers to adjust and optimize the approach to rendering inside the engine: a AAA game like Naughty Dog’s Uncharted series, which targets 30 frames per second on higher-end hardware like PCs and consoles, has constraints that are quite different from a mobile VR game that might need to run at 90 fps on a mid-range phone, for instance. “By using the Scriptable Render Pipeline, developers are given a choice of what algorithms to use,” outlines Natalya Tatarchuk, Vice President of Graphics at Unity Technologies. “They can take advantage of our two pre-built solutions, use those pre-built pipelines as a template to build their own, or create a completely customized render pipeline from scratch. This level of fine-grained control means that you can choose the solution best suited to the needs of your project, factoring in things like graphic requirements, the platforms you plan to distribute on, and more.”

REAL-TIME RAY TRACING FROM TRADITIONAL RENDERERS

Advancements in real-time rendering are not lost on companies that have traditionally built offline renderers. Chaos Group, the maker of V-Ray, for example, is developing its own real-time ray tracing application, dubbed Project Lavina. It allows users to explore V-Ray scenes within a fully ray-traced environment in real-time. “We’ve built this from the ground up to be a real-time ray tracer, so that the speed of it is always going to be the thing that’s paramount,” outlines Lon Grohs, Global Head of Creative, Chaos Group. “The second part of that is that NVIDIA’s new cards and hardware really make a big difference in terms of that speed. “So, imagine if you’re working with a 3D application and your scene is just ray-traced all the time,” adds Grohs. “There’s no toggling back and forth between all your frames, and as you put on a shader or a material or a light that is what it is, and it’s giving you instant feedback.”


TOOLS FOR VIRTUAL PRODUCTION

Films known for their adoption of virtual production, such as Avatar, Ready Player One and The Lion King, largely relied on purpose-built real-time tools and workflows engineered to enable, for instance, set scouting, simulcam shooting and VR collaboration. Glassbox is among a host of companies offering more of an ‘out of the box’ virtual production solution for filmmakers. Their tools, DragonFly and BeeHive, work with principal content creation software such as Unreal, Unity and Maya, and are aimed at letting users visualize virtual performances on a virtual set (seeing a mythical CG creature fly through a fantasy land in real-time, for example) and at enabling collaboration with a virtual art department and the virtual sets. “Although techniques like virtual production have existed before,” acknowledges Glassbox Technologies CPO and Co-founder Mariana Acuña Acosta, “studios have not been able to effectively replicate the workflow of a live-action production in VFX, games and VR development without huge investments in bespoke tech and manpower.” “DragonFly and BeeHive change that,” says Acuña Acosta. “We have created an ecosystem of platform-agnostic tools based on the latest VR and game engine technologies that offer studios of all sizes and all budgets a truly disruptive set of tools capable of transforming filmmaking and content production.”

PERFORMANCE, IN REAL-TIME

Real-time technologies are enabling animators and visual effects artists to capture and create characters at faster speeds and at higher levels of fidelity than ever before by sometimes inputting themselves or actors directly into the character. Reallusion’s iClone, for example, enables that ability via its Motion Live plugin. The plugin aggregates the motion-data streams from an array of industry motion-capture devices, such as tools from Xsens, Noitom, Rokoko, OptiTrack, Qualisys and ManusVR. John C Martin II, Vice President of Marketing at Reallusion, describes how this works. “Choose a character in iClone, use Motion Live to assign a face mocap device, like the iPhone, which uses our LiveFace iPhone app to transfer the data, a body mocap device or a hand mocap device. All are optional or can be used at the same time for full-body real-time performance capture.” This forms just part of iClone’s larger system for character creation and animation, which also recently was made to connect to Unreal Engine with the iClone Unreal Live Link plugin. “The process begins with our Character Creator 3 content to generate a custom morph and designed character ready for animation,” states Martin. “Animate the characters with iClone and transfer characters, animation, cameras and lights directly to Unreal Engine. FBX can still be imported as always, but Live Link adds so much more to what you can do with the content, especially in real-time.”
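The per-device, everything-optional aggregation Martin describes can be pictured as a simple merge of whichever streams are live on a given frame. The field names below are illustrative stand-ins, not Reallusion’s actual Motion Live API.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MocapFrame:
    face: dict = field(default_factory=dict)   # e.g. blendshape weights
    body: dict = field(default_factory=dict)   # e.g. joint rotations
    hands: dict = field(default_factory=dict)  # e.g. finger curls

def merge_streams(face: Optional[dict] = None,
                  body: Optional[dict] = None,
                  hands: Optional[dict] = None) -> MocapFrame:
    """Combine whichever capture devices are live into one animation frame.

    Any device can be absent: a face-only session, a body-only session,
    or all three together for full-body performance capture.
    """
    return MocapFrame(face=face or {}, body=body or {}, hands=hands or {})
```

The point of the pattern is that each device stream is independent, so a performer can drive face, body and hands from different hardware simultaneously or use any subset alone.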

When Real-Time and VFX Meet

Perhaps one of the ultimate examples of real-time being embraced by visual effects practitioners is Stargate Studios CEO Sam Nicholson’s ThruView system. This is a series of tools that allows a scene to be filmed in live-action but with real-time rendered or pre-rendered CG, or pre-composited live-action elements, displayed in real-time and captured in-camera. “For certain applications, like driving, flying, planes, trains and automobiles or set extensions, it is an ideal tool,” argues Nicholson. “What it solves particularly, as opposed to greenscreen, is shallow depth of field, shiny objects, and real-time feedback in your composite image. Rather than trying to make decisions two weeks downstream or someone else making that decision for you, you have all the tools you need on set in one place to create the image that you want to create in real-time.” For a recent show featuring a train sequence, Nicholson’s team shot real moving train footage with 10 cameras. Then, the 14 hours of plates (200 terabytes worth) were ingested, color-corrected and readied for an on-set train carriage shoot. “We’re playing back 8K on set, using 10 servers serving 40 4K screens, one for each window on a 150-foot-long train set,” explains Nicholson, who notes the real-time setup also allows for live lighting changes, for example, when the train goes through a tunnel. “ThruView gives you all of the horsepower and the creative tools that you have in a traditional greenscreen composite, but now in real-time,” continues Nicholson. “We’re going after finished pixels, in camera, on set, with the ability to pull a key and do post-production work on it if you want to, but to get you at least 80% of the way there and not have to back up and start over again.”
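Nicholson’s figures imply substantial sustained data rates. Here is a back-of-envelope check, assuming the 200 terabytes covers the full 14 hours across all 10 cameras (the article does not say whether the 14 hours is per camera or total, so treat this as an estimate):

```python
hours = 14
terabytes = 200.0
cameras = 10

seconds = hours * 3600                       # 50,400 seconds of plates
aggregate_mb_s = terabytes * 1e6 / seconds   # total ingest rate in MB/s
per_camera_mb_s = aggregate_mb_s / cameras   # per-camera rate in MB/s
```

That works out to roughly 4,000 MB/s across the rig, or about 400 MB/s per camera sustained, which helps explain why playback required 10 servers feeding 40 4K screens.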

OPPOSITE TOP TO BOTTOM: A scene from Kevin Margo’s Construct short rendered with Chaos Group’s real-time ray tracer Lavina. (Image courtesy of Chaos Group) A screenshot from the Unity game engine. (Image courtesy of Unity Technologies) Glassbox’s DragonFly virtual production tool in action via a simulcam setup. (Image courtesy of Glassbox) A DragonFly scene inside Unreal Engine. (Image courtesy of Glassbox) A screenshot from iClone’s Unreal Live Link interface. (Image courtesy of Reallusion)

SKETCHING ART IN NO TIME

The possibilities of real-time are sometimes still in the research phase. The winner of the SIGGRAPH 2019 Real-Time Live! event,

WINTER 2020 VFXVOICE.COM • 49

11/12/19 2:21 PM


TECH & TOOLS

Mayeda, on the reaction to the demo. “Art directors get very excited about the virtual location scouting features and the ability to collaborate in real-time with the director and DP. DPs have been blown away by the range of lighting control virtual production affords and the speed at which they can operate the system. VFX teams see the possibility of unprecedented collaboration and more simultaneous, non-linear workflows.” THE RENDER PIPELINE

Unity Technologies is another of the big players in real-time rendering and virtual production with their Unity game engine. Like Unreal Engine, Unity is finding an increasing amount of use inside studio production pipelines, whether those studios are making games, experiences or VFX shots. One major area of development for Unity has been in the creation of its Scriptable Render Pipeline, part of its push to make content authoring workflows more accessible for developers. Unity’s Scriptable Render Pipeline allows developers to adjust and optimize the approach to rendering inside the engine: a AAA game like Naughty Dog’s Uncharted series which uses higher-end hardware like PCs and consoles at 30 frames per second performance has constraints that are quite different from a mobile VR game that might need to run at 90 fps on a medium-end phone, for instance. “By using the Scriptable Render Pipeline, developers are given a choice of what algorithms to use,” outlines Natalya Tatarchuk, Vice President of Graphics at Unity Technologies. “They can take advantage of our two pre-built solutions, use those pre-built pipelines as a template to build their own, or create a completely customized render pipeline from scratch. “This level of fine-grained control means that you can choose the solution best suited to the needs of your project, factoring in things like graphic requirements, the platforms you plan to distribute on, and more.” REAL-TIME RAY TRACING FROM TRADITIONAL RENDERERS

Advancements in real-time rendering are not lost on companies that have traditionally built offline renderers. The maker of V-Ray, Chaos Group, for example, is in development on its own real-time ray tracing application, dubbed Project Lavina. It allows users to explore V-Ray scenes within a total ray-traced environment in real-time. “We’ve built this from the ground up to be a real-time ray tracer, so that the speed of it is always going to be the thing that’s paramount,” outlines Lon Grohs, Global Head of Creative, Chaos Group. “The second part of that is that NVIDIA’s new cards and hardware really make a big difference in terms of that speed. “So, imagine if you’re working with a 3D application and your scene is just ray-traced all the time,” adds Grohs. “There’s no toggling back and forth between all your frames, and as you put on a shader or a material or a light that is what it is, and it’s giving you instant feedback.”

48 • VFXVOICE.COM WINTER 2020

PG 46-50 REAL-TIME.indd 48-49

TOOLS FOR VIRTUAL PRODUCTION

Films known for their adoption of virtual production such as Avatar, Ready Player One and The Lion King largely relied on purpose-built real-time tools and workflows being engineered to help enable, for instance, set scouting, simulcam shooting and VR collaboration. Glassbox is among a host of companies offering more of a virtual production solution ‘out of the box’ for filmmakers. Their tools (DragonFly and BeeHive), which work with principal content creation software such as Unreal, Unity and Maya, are aimed at letting users visualize virtual performances on a virtual set, for example, by seeing a mythical CG creature fly through a fantasy land in real-time. Secondly, they enable collaboration with a virtual art department and the virtual sets. “Although techniques like virtual production have existed before,” acknowledges Glassbox Technologies CPO and Co-founder Mariana Acuña Acosta, “studios have not been able to effectively replicate the workflow of a live-action production in VFX, games and VR development without huge investments in bespoke tech and manpower.” “DragonFly and BeeHive change that,” says Acuña Acosta. “We have created an ecosystem of platform-agnostic tools based on the latest VR and game engine technologies that offer studios of all sizes and all budgets a truly disruptive set of tools capable of transforming filmmaking and content production.” PERFORMANCE, IN REAL-TIME

Real-time technologies are enabling animators and visual effects artists to capture and create characters faster and at higher levels of fidelity than ever before, sometimes by inputting themselves or actors directly into the character. Reallusion’s iClone, for example, enables that ability via its Motion Live plugin, which aggregates the motion-data streams from an array of industry motion-capture devices, including tools from Xsens, Noitom, Rokoko, OptiTrack, Qualisys and ManusVR.

John C Martin II, Vice President of Marketing at Reallusion, describes how this works: “Choose a character in iClone, use Motion Live to assign a face mocap device, like the iPhone, which uses our LiveFace iPhone app to transfer the data, a body mocap device or a hand mocap device. All are optional or can be used at the same time for full-body real-time performance capture.”

This forms just part of iClone’s larger system for character creation and animation, which also recently gained a connection to Unreal Engine via the iClone Unreal Live Link plugin. “The process begins with our Character Creator 3 content to generate a custom morph and designed character ready for animation,” states Martin. “Animate the characters with iClone and transfer characters, animation, cameras and lights directly to Unreal Engine. FBX can still be imported as always, but Live Link adds so much more to what you can do with the content, especially in real-time.”
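Martin’s description amounts to merging optional, per-device data streams into a single character pose each frame. A toy sketch of that aggregation pattern (the function and channel names are invented for illustration, not Reallusion’s API):

```python
def merge_mocap_streams(face=None, body=None, hands=None):
    """Hypothetical aggregator in the spirit of Motion Live: every device
    stream is optional, and whatever is connected drives the character."""
    pose = {}
    for stream in (face, body, hands):
        if stream:
            pose.update(stream)
    return pose

# Channel names below are invented for illustration.
frame = merge_mocap_streams(
    face={"jaw_open": 0.3, "brow_raise": 0.1},  # e.g. iPhone via LiveFace
    body={"spine_bend_deg": 12.5},              # e.g. an inertial suit
)  # hands omitted: still a valid, partial performance frame
print(frame)
```

Because each source is independent, a face-only session and a full-body session flow through the same code path, which mirrors the “all are optional or can be used at the same time” behavior Martin describes.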

When Real-Time and VFX Meet

Perhaps one of the ultimate examples of real-time being embraced by visual effects practitioners is Stargate Studios CEO Sam Nicholson’s ThruView system, a series of tools that allows a scene to be filmed in live-action but with real-time rendered or pre-rendered CG, or pre-composited live-action elements, displayed in real-time and captured in-camera.

“For certain applications, like driving, flying, planes, trains and automobiles or set extensions, it is an ideal tool,” argues Nicholson. “What it solves particularly, as opposed to greenscreen, is shallow depth of field, shiny objects, and real-time feedback in your composite image. Rather than trying to make decisions two weeks downstream or someone else making that decision for you, you have all the tools you need on set in one place to create the image that you want to create in real-time.”

For a recent show featuring a train sequence, Nicholson’s team shot real moving train footage with 10 cameras. Then, the 14 hours of plates (200 terabytes worth) were ingested, color-corrected and readied for an on-set train carriage shoot. “We’re playing back 8K on set, using 10 servers serving 40 4K screens, one for each window on a 150-foot-long train set,” explains Nicholson, who notes the real-time setup also allows for live lighting changes, for example, when the train goes through a tunnel.

“ThruView gives you all of the horsepower and the creative tools that you have in a traditional greenscreen composite, but now in real-time,” continues Nicholson. “We’re going after finished pixels, in camera, on set, with the ability to pull a key and do post-production work on it if you want to, but to get you at least 80% of the way there and not have to back up and start over again.”
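As a back-of-the-envelope check on Nicholson’s numbers, 40 screens across 10 servers works out to four screens per server, and a 150-foot set puts one window roughly every 3.75 feet. A hypothetical layout helper (not Stargate’s software) makes the arithmetic concrete:

```python
def window_assignments(num_windows=40, num_servers=10, set_length_ft=150.0):
    """Spread the window screens evenly along the set and round-robin
    them across the playback servers."""
    spacing = set_length_ft / num_windows  # 3.75 ft between windows
    return [
        {
            "window": i,
            "server": i % num_servers,  # 40 screens / 10 servers = 4 each
            "position_ft": i * spacing + spacing / 2,
        }
        for i in range(num_windows)
    ]

rig = window_assignments()
print(len(rig), rig[0]["server"], rig[-1]["server"])
```

The real system also has to keep all 40 streams frame-locked to a master clock; the assumption here is only the static layout, which is the easy part of the problem.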

OPPOSITE TOP TO BOTTOM: A scene from Kevin Margo’s Construct short rendered with Chaos Group’s real-time ray tracer Lavina. (Image courtesy of Chaos Group) A screenshot from the Unity game engine. (Image courtesy of Unity Technologies) Glassbox’s DragonFly virtual production tool in action via a simulcam setup. (Image courtesy of Glassbox) A DragonFly scene inside Unreal Engine. (Image courtesy of Glassbox) A screenshot from iClone’s Unreal Live Link interface. (Image courtesy of Reallusion)

SKETCHING ART IN NO TIME

The possibilities of real-time are sometimes still in the research phase. The winner of the SIGGRAPH 2019 Real-Time Live! event, NVIDIA’s GauGAN, is one such technology. It allows a user to paint a simple iteration of a scene and then instantly see it transformed, via a neural network, into compelling landscapes or scenes. NVIDIA is already a principal provider of the graphics cards that power a large amount of real-time rendering, and has a significant hand in machine learning research.

“GauGAN tries to imitate human imagination capability,” advises Ming-Yu Liu, Principal Research Scientist at NVIDIA. “It takes a segmentation mask, a semantic description of the scene, as input and outputs a photorealistic image. GauGAN is trained with a large dataset of landscape images and their segmentation masks. Through weeks of training, GauGAN eventually captures correlation between segmentation masks and real images.”

At the moment this is demo technology, which users have been able to try at NVIDIA events and also online, that could perhaps aid in generating production scenes quickly and accurately from minimal input, although the ‘training’ of GauGAN used around a million landscape images released under Creative Commons licenses on the internet.

REAL-TIME HUMANS

While photoreal digital humans make waves in feature films and on television shows, real-time generated CG humans are beginning to have a similar impact. Digital Domain has capitalized on the studio’s own CG human experience and its internal Digital Human Group to jump into real-time humans.

“We were shocked by how well our real-time engine, Unreal, handled our feature-quality assets,” says Doug Roble, Senior Director of Software R&D at Digital Domain, who performed his own live avatar at TED in 2019. “To generate the look of the real-time versions we used the same resolution models, textures, displacements and hair grooms from our high-resolution actor scans. We did do some optimization but focused on ensuring we kept the look.”

The TED Talk, which had Roble on stage in an Xsens suit and Manus gloves, wearing a helmet-mounted camera, all fed into Ikinema to solve a ‘DigiDoug’ model running in Unreal, saw his avatar projected in real-time on a screen behind him. Machine learning techniques were able to add to the facial expressions Roble was making. “All of this data from face shapes to finger motion to facial blood flow all needs to be calculated, transmitted and re-rendered in 1/6th of a second,” notes Roble. “It is truly mind-boggling that it is all actually possible.”

TOP TO BOTTOM: Digital Domain’s Doug Roble during his TED Talk that showcased a real-time CG human representation of himself. (Photo credit: Bret Hartman/TED) Doug Roble demonstrates how he was scanned in a Light Stage to prepare for his digital-double transformation. (Photo credit: Bret Hartman/TED) A user tries out NVIDIA’s GauGAN experience, which allows them to sketch simple shapes that will be transformed into a landscape. (Image courtesy of NVIDIA) The GauGAN interface. (Image courtesy of NVIDIA) A ThruView shoot still involves using a typical camera and soundstage setup, but with pre-rendered or pre-processed footage strategically placed around a set-piece, like this vehicle. (Image courtesy of Stargate Studios)
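Roble’s “1/6th of a second” is a hard end-to-end deadline of roughly 167 milliseconds per update, which every stage of the pipeline must share. A toy budget check (the stage costs below are illustrative assumptions, not Digital Domain’s measurements):

```python
def frame_budget_ms(stage_ms, update_interval_s=1 / 6):
    """Compare pipeline stage costs against the end-to-end deadline."""
    budget = update_interval_s * 1000.0  # ~166.7 ms per update
    used = sum(stage_ms.values())
    return budget, used, budget - used

# Stage costs are illustrative only, not measured numbers.
stages = {
    "mocap capture": 25.0,
    "ML face solve": 60.0,
    "body/finger solve": 30.0,
    "render + transmit": 45.0,
}
budget, used, slack = frame_budget_ms(stages)
print(f"budget {budget:.1f} ms, used {used:.1f} ms, slack {slack:.1f} ms")
```

Framing latency as a fixed budget makes the engineering trade-off visible: any stage that grows must steal milliseconds from another, which is why asset optimization and machine-learned solvers matter so much for live performance.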



PROFILE

JENNIFER LEE: CREATING NEW ANIMATED CLASSICS THAT INSPIRE THE IMAGINATION

By TREVOR HOGG


Born and raised in Rhode Island, Walt Disney Animation Studios Chief Creative Officer Jennifer Lee has fond memories of growing up in a small state with hundreds of miles of coastline. “It has a lot of blue-collar towns, hard workers and beautiful beaches. I spent a lot of time on the water. You didn’t have to have money to do that.”

Lee showed an early interest in drawing, which was encouraged by her salesman father, Saverio Rebecchi. “He would provide me with a lot of art supplies.” The matriarch of the family, Linda Lee, was a nurse and supported her “wild creative mess” of a daughter. “My mother was a huge reader, and my sister Amy, a straight A student, would read to me all of the time. Growing up I had a ton of books.”

Seeing Star Wars at the age of five, Lee developed a love for the science fiction genre. “I constantly drew stories. Cinderella was my favorite by far. It got me through bullying and things like that. As I got older, I became a big fan of J.R.R. Tolkien and Toni Morrison.”

An interest in literature led Lee to earn a bachelor’s degree in English at the University of New Hampshire. “I never considered myself a writer. I had an incredible English teacher in my junior year who saw that I had a passion and deep understanding of literature. I loved Shakespeare more than anything else at that time. To have someone acknowledge that I had something to contribute meant a lot to me. I didn’t know what I wanted to do with my life but I knew that I loved stories. Film wasn’t an option for me at that point. Where I grew up I didn’t know it was possible, and books were something I always knew. I went on to study English while I was painting on the side and doing art classes. It was a strange combination that would come together, but I didn’t know it then.”

After graduating, Lee became a graphic artist designing audio books for Random House in New York City. “My biggest reason for getting that job was because I got free books! It was during the transition from analog to computers. I understood computers and shifted into the graphic arts side. It also led to video. I started putting the pieces together in my twenties that I was visually telling stories all of the time and had a visual sense. But it wasn’t until screenwriting when I realized that was where I belonged. Screenwriting was like someone gave me a translator for what was in my head.”

Two major incidents led Lee to develop an interest in filmmaking. “I heard someone say something outrageous and I turned it into a scene. I had never done anything like that before. I quickly took a screenwriting class at NYU and wrote a whole script in six weeks. My husband at the time was working at The New York Times on Saturdays, which meant that I had the day to myself. So I started going to see movies at the Sony multiplex and Lincoln Center. I realized that part of me was waking up and I started doing experimental videos.”

On the cusp of turning 30, Lee decided to apply to Columbia University School of the Arts to get a Master of Fine Arts in film. “It had the program that for a minimum of three, if not four to five years, you would go broke but get to do everything in film. For two years you have to direct, produce, write and act as well as go from short to feature filmmaking. You were completely immersed and forced to understand the collaborative nature of it by being put into teams.”

“[Increasing the number of female directors] is about access. We’re making huge changes here. I’m no longer the only female director. We’re working hard to find that talent. There are great filmmakers out there.” —Jennifer Lee, CCO, Walt Disney Animation Studios

The aspiring filmmaker formed a creative partnership with a fellow classmate who would later call her to join him at Walt Disney Animation Studios to co-write Wreck-It Ralph. “The first day of film school, Phil Johnston and I were thrown into a group together. We were turning 30 within a couple of days of each other, both were married, and came from completely different backgrounds and sensibilities. Our collaboration came organically. Phil and I would work on each other’s films. I never tried to rewrite him. I was trying to support his work and vice versa. At the time I was doing socially conscious and science fiction films while Phil was doing hilarious comedies. Even after graduation, we met every week, shared pages and forced each other to keep working. We formed a trust and understanding of each other’s writing that made our collaborating on Wreck-It Ralph possible.”

Working on a Hollywood production meant dealing with studio executives and a large production crew. “Getting involved with something this collaborative was one of the biggest and most rewarding challenges,” notes Lee. “You’re not in it alone. Everyone is giving their best. The best idea wins. Your job becomes being able to filter the good and bad and protect the project, but be flexible. Wreck-It Ralph was a different boot camp of learning how you work in an environment where the ideas and criticisms are always flowing, and know what to listen to. Navigating it was about sticking to the craft that I learned. I started to see that my job was

OPPOSITE TOP: Jennifer Lee, CCO at Walt Disney Animation Studios. (Photo: Ricky Middlesworth. Copyright © 2019 Disney) OPPOSITE BOTTOM: A scene from Wreck-It Ralph where Vanellope von Schweetz (Sarah Silverman) and Ralph (John C. Reilly) are in the video game world of Sugar Rush. (Image copyright © 2016 Disney) TOP: The Ice Castle is in the distance as Sven transports Kristoff (Jonathan Groff), Olaf (Josh Gad), Elsa (Idina Menzel) and Anna (Kristen Bell) in Frozen 2. (Image copyright © 2019 Disney) BOTTOM: Lee is also involved with live-action filmmaking, such as co-writing the script for A Wrinkle in Time with Jeff Stockwell. (Image copyright © 2017 Disney)






“I was initially asked to join [Frozen] as a writer and said ‘no’ because I had seen a rough cut of a different version and didn’t see a story in it. But then I spent a week with Chris Buck, Robert Lopez, Kristen Anderson-Lopez and some other directors to brainstorm a new version, and by the end of it I was madly in love.” —Jennifer Lee, CCO, Walt Disney Animation Studios

TOP LEFT: Zootopia’s first bunny officer Judy Hopps (Ginnifer Goodwin) finds herself face to face with the fast-talking, scam-artist fox Nick (Jason Bateman). (Image copyright © 2017 Disney) TOP RIGHT: Anna (Kristen Bell) and Olaf (Josh Gad) venture far from Arendelle in a dangerous but remarkable journey to help Elsa find answers about the past in Frozen 2. (Image copyright © 2019 Disney) BOTTOM LEFT: Elsa encounters a Nokk, a mythical water spirit that takes the form of a horse, who uses the power of the ocean to guard the secrets of the forest in Frozen 2. (Image copyright © 2019 Disney) BOTTOM RIGHT: Providing the comic relief once again are the duo of Sven and Olaf in Frozen 2. (Image copyright © 2019 Disney)


not to stifle ideas but to protect the journey as a whole. “Coming in here as a writer before becoming a director really helped me. I had a focus, and with the support of Phil, Rich Moore and the whole team, by the time I finished Wreck-It Ralph I understood the process.”

With the release of Frozen, a retelling of the Snow Queen fairy tale, Lee became the first female director of a feature film at Walt Disney Animation Studios and the first of her gender to helm a feature that grossed more than $1 billion at the worldwide box office. “I was initially asked to join as a writer and said ‘no’ because I had seen a rough cut of a different version and didn’t see a story in it. But then I spent a week with Chris Buck, Robert Lopez, Kristen Anderson-Lopez and some other directors to brainstorm a new version, and by the end of it I was madly in love.”

As the production progressed Lee became a co-director with Buck. “When John Lasseter asked me to direct with Chris, it allowed us to divide and conquer. It was a challenging schedule. I was intimidated by the production side here because I hadn’t done it in animation before, but I knew the story room well. Chris and I could stay connected, but if he needed to go to environments or effects or character, I was running the story room. It had a more organic feeling than we thought it would. Because of that process I did join in the production side. I got nervous as to what I had to offer, but then realized I was working with the best in the world, and what I had to bring to them was the motivation, intentions, needs and wants of the character and story, and they knew what to bring back with their incredible skills and artistry.”

Lee went on to co-write and co-direct the short film Frozen Fever, provide the story for Zootopia, and co-write A Wrinkle in Time. “I had different roles on those. With Zootopia I was in the story room keeping the story evolving, and working in the edit room. It was a supportive role. That skill was part of what led to me becoming the Chief Creative Officer at Walt Disney Animation Studios.

“With A Wrinkle in Time being live-action, the writing process is different. You’re not in the room every day. I got to work with Ava DuVernay, which was extraordinary. But I realize the incredible way that we work here [is vital], where everyone stays connected to the whole process. Then coming to Frozen 2 – Bobby, Kristen and I also [collaborated on the] Broadway [adaptation] together. Doing a sequel is so incredibly challenging in its own way, but what we do have is a lot of trust, respect, and a rhythm to how we work. There is even a rhythm in production with the same crew for the most part, because they wanted to come back. We have been surprising each other with what we’ve been able to do. Frozen 2 let us expand on what we had done as opposed to make a big change. It felt like we were growing.”

When Lasseter was displaced as Chief Creative Officer for Pixar and Walt Disney Animation Studios, Pete Docter was appointed CCO of the former and Lee of the latter. “What I love about Pete is that he has been involved with so many films. I was struggling with a part on Frozen 2 a few months ago and called him. Pete said, ‘We’ve been through that and this is what we did.’ He has so much experience with problem-solving and is so generous. Pete is busy too. He is directing at the same time. A lot of time we will call each other to commiserate and ask, ‘Did you sleep this week?’ We brought Frozen 2 up there [to Pixar] a few times. They come down here with their films. It’s such a pleasure to get to work with him.”

TOP: Frozen 2 provided an opportunity to expand the world and characters from the original movie. (Image copyright © 2019 Disney) MIDDLE: Pixar Animation Studios and Walt Disney Animation Studios have screenings of each other’s projects to assist with the development process, as was the case with Frozen 2. (Image copyright © 2019 Disney) BOTTOM: A flashback scene in Frozen 2 featuring Queen Iduna (Evan Rachel Wood) with her daughters Anna and Elsa. (Image copyright © 2019 Disney)






“I realized I was working with the best in the world, and what I had to bring to them was the motivation, intentions, needs and wants of the character and story, and they knew what to bring back with their incredible skills and artistry.” —Jennifer Lee, CCO, Walt Disney Animation Studios

TOP: Frozen 2 reunites original blockbuster directors Jennifer Lee and Chris Buck, producer Peter Del Vecho, voice cast members Idina Menzel, Kristen Bell, Jonathan Groff and Josh Gad, and songwriters Kristen Anderson-Lopez and Robert Lopez. (Image copyright © 2019 Disney) MIDDLE: In Frozen, Elsa wondered if her powers would be accepted by those around her. In the sequel she has to use them in order to save the world. (Image copyright © 2019 Disney) BOTTOM: Jennifer Lee (Photo: Ricky Middlesworth. Copyright © 2019 Disney)

Lee has been open about her battle with self-doubt, which is a common attribute for writers. “We have a joke about that. When the actors come in it’s the most fun day. When the writers work together it’s miserable because we’re so critical! I fundamentally believe when I struggle with self-doubt that I bring it back to the work. I’ve learned now in my older years to ask for what I need rather than fault myself for what I can’t do.”

As to the issue of increasing the number of female directors, Lee remarks, “It is about access. We’re making huge changes here. I’m no longer the only female director. We’re working hard to find that talent. There are great filmmakers out there.”

For Lee, the project dictates the audience. “I don’t believe that anyone here feels that they only make stories for kids or families directly. It’s for everyone. We tell stories that matter to us. We weren’t thinking about an age on Frozen. There is something to how we do our films that there are always things to contemplate. They’re evocative, emotional, entertaining, and there’s hope and joy. Part of that is ingrained in the people who want to make these films. What I do like is the opportunity of experimenting with the short form and always growing the artform. There is definitely a huge energy for that now and we support it 100%. To look at who we are and how we make our films is timeless, timely and resonates with all ages, but at the same time what we have never done is turn down a fascinating idea and say, ‘It’s not young enough.’ We go with what inspires us.”

Virtual and Augmented Reality have had a major impact on the animation process. “We have a whole group working on VR and had the short film Cycles last year, which was amazing,” remarks Lee. “Cycles wasn’t a straight narrative, and you’re in tears at the end. It was powerful. We’re continuing to do that work and announcing more stuff in that area. I love that I’m learning a different method of storytelling. Even on the new projects that we’re working on right now, VR allows you to tell stories in a different and more visceral way and not necessarily in a linear manner. It’s important for us to stay involved because VR makes us better at our craft, but also gives us opportunities to keep growing in how we bring stories, emotion and imagination to the world.”

Lee has developed a philosophy towards writing and directing. “You become the audience and ask, ‘How do I feel?’ Then you become the characters and ask, ‘Do I believe them?’ We ground it by asking, ‘What is this experience and is it moving me the way that it should?’ If you keep it at any greater distance than that, you won’t have something that resonates.”

Unlike live-action, everything in the world of animation needs to be created from scratch, notes Lee. “Animation is the one area where every single thing that is there has been built from the imagination, and there’s so much that we can do with it. There’s this energy now of experimentation, exploration and technology catching up with the imagination and inspiring it to go to new places. That is what I find to be exciting.”



INDUSTRY ROUNDTABLE

VFX ENTERS THE ROARING ’20s

VFX Voice asked a cross section of industry executives, artists, practitioners and visionaries this question:


What single creative process do you think will most define and shape VFX for the new decade, and why?

By JIM McCULLAUGH

Here are their responses.

“We will see [virtual production] used in more areas of the filmmaking process, not just previs and postvis... or to create entire films like The Lion King, but throughout the VFX pipeline.” —Lindy De Quattro, Production VFX Supervisor, MPC TOP: Star Wars: The Rise of Skywalker (Image copyright © 2019 Lucasfilm and Walt Disney Studios) BOTTOM: Lindy De Quattro

Lindy De Quattro, Production VFX Supervisor, MPC: “Virtual Production has actually been around for years already, but after the success and publicity of The Lion King and the emergence of the AR/VR industry, it’s clearly the new hot catchphrase in our industry. I think the next 10 years will see it really take off and be embraced by many more filmmakers and at a variety of budgets. We will also see it used in more areas of the filmmaking process, not just previs and postvis – which were the earliest uses of VP – or to create entire films like The Lion King, but throughout the VFX pipeline.

“The big breakthrough for Virtual Production was the integration of real-time rendering engines from the game industry, like Unreal Engine and Unity. It took a while for these engines to mature enough to be able to handle film-quality renders in real-time, but they’re basically there now. While it does require some extra planning and prep work to get a virtual scene set up, once that work is done the filmmaker has amazing flexibility to not only visualize a scene, but also to modify the virtual elements as they work in real time. After the initial setup, it’s a very fast and cost-effective way to work.”


TOP LEFT: Thilo Kuther TOP RIGHT: Rob Bredow MIDDLE: Karen Dufilho BOTTOM: Sonic the Hedgehog (Image copyright © 2019 Paramount Pictures and Sega of America, Inc.)

“As the industry sprints and the audience reacts, the one single creative process that will define and shape the VFX industry is a well-thought-out and considered approach to the allotment of time: time managed, time dedicated, time saved, time respected, for an artist’s inclination to create from the heart – be that with code, or pencil and paper.” —Karen Dufilho, Producer

Thilo Kuther, Founder and President, Pixomondo Global: “Previsualization. VFX, which has long been considered a supporting industry in the film community, will become its own industry as a disruptor of the status quo because it is using technology at every level of the process: AI, game engines, Virtual Reality. These tools will create a new type of filmmaker.

“With previs, the post-production phase is now becoming the pre-production phase, as VFX houses offer technological tools to filmmakers who can make production decisions without risking financial consequences. When we build our previs set, as a filmmaker you can see it and make any adjustments before it’s built. It is similar to the process of industrial design: build a prototype, test the prototype, fine-tune the prototype, and then build the final product. Moving forward, this process will be capable of taking over production from start to finish, not just pre-production. Within five years, VFX will literally overtake productions, and no longer be just a supporting player.”

Rob Bredow, Sr. Vice President, Executive Creative Director and Head of ILM: “We are seeing a dramatic shift enabled by the maturing of virtual production technologies where – with close collaboration with filmmakers, production designers and DPs – VFX can create effects in-camera again. This benefits everyone. Actors aren’t standing in a sea of green, expected to imagine the environment they are meant to be reacting to. Cinematographers can compose and light their scenes as they would on a traditional production, with all of the elements right in front of them, each reacting in real-time to changes as they are made. Filmmakers are able to focus on the story that the shot needs to tell without the unknowns. To enable this, visual effects teams are involved from the very beginning of production.

“Savvy filmmakers have been engaging their visual effects supervisors early in the production cycle for years, to advise on shooting methodologies, determine best approaches, and help design shots for maximum effect – that’s not new. What is new, however, is the close collaboration between all the departments to achieve the vision during the production cycle, often in camera.”


Karen Dufilho, Producer: “You’ve heard of the Butterfly Effect, right? Where a small change in input – the flap of a butterfly’s wings – can have a large effect on outcome: a tornado occurs in another part of the world.

“Things are chaotic. The entertainment industry is no exception. Tools change, studios’ power shifts, tax exemptions dictate work/life choices. Meanwhile, we’re inundated with more shows, content and choices now than ever before. The U.S. media and entertainment industry is expected to reach more than $825 billion by 2023. The U.S. gaming industry is expected to reach nearly $26 billion in revenue in 2019. VR gaming has grown more than 14% since 2018.

“But in the chaos, there is a constant. Every line of code, every word written (and rewritten), every sketch sketched, traces back to that proverbial wing that resulted in the tornado. Never underestimate the power of pencil and paper on the world.

“As the industry sprints and the audience reacts, the one single creative process that will define and shape the VFX industry is a well-thought-out and considered approach to the allotment of time: time managed, time dedicated, time saved, time respected, for an artist’s inclination to create from the heart – be that with code, or pencil and paper. The results could blow you away.”

“Today, the cutting edge of technology development is in deep learning and AI. Most of us have seen demonstrations of the striking potential of these technologies to produce imagery without an artist driving the creative process. The missing piece in this technology is the integration of deep learning AI techniques into an artistically driven creative VFX process.” —Xye, Founder, Yannix/ynx.vfx

TOP LEFT: Kim Davidson TOP RIGHT: Xye BOTTOM LEFT: The Mandalorian (Image copyright © 2019 Lucasfilm and Walt Disney Studios) BOTTOM RIGHT: Top Gun: Maverick (Image copyright © 2020 Paramount Pictures)

Kim Davidson, President & CEO, SideFX: “I’m a big believer in immersive worlds and the impact they will have on the creative process – particularly for VFX. Imagine being on the set of a castle siege with armies charging, trebuchets launching fireballs and walls crumbling. However, in this case the set, armies and VFX are 100% CG, and the director and cinematographer are in free-roam VR outside the castle. They can slow, pause or rewind the action. They walk around and interact with the armies and VFX – adding more soldiers, increasing the size of the fireballs, or changing the landing spot of a falling stone block.

“Our industry is already well along the path to immersive filmmaking, as many of the necessary components have been adopted or are being rapidly advanced. And leveraging proceduralism along with AI will allow art directors and set designers to readily and stylistically populate their themed immersive worlds – worlds where actors perform and stories unfold. Visual effects involving earth, wind, water and fire complete the illusion of this reality. Visual effects living inside the bigger visual effect of an immersive world.”


Xye, Founder, Yannix/ynx.vfx: “Creativity is always enabled by the tools and technologies used to express that creative idea or concept. In VFX, this grew out of the optical effects of pre-digital matte paintings and projection technologies and was redefined by the computer and the digital manipulations it enabled. Giving artists access to tools that could precisely manipulate color and texture in three dimensions revolutionized the special effects in movies. Today, the cutting edge of technology development is in deep learning and AI. Most of us have seen demonstrations of the striking potential of these technologies to produce imagery without an artist driving the creative process. The missing piece in this technology is the integration of deep learning AI techniques into an artistically driven creative VFX process. In professional movies and video, it isn’t sufficient to create a compelling image. The essential need is to create the imagery that best captures the imagination of the filmmaker. While face replacement is a constant need in VFX, ‘deepfake’ technology is useless to a creative process if the filmmaker can’t call out the details they want adjusted and refine the performance accordingly.”

“Visualize, communicate and execute is, and will continue to be, the single most important creative process for anyone in VFX – and many other fields. ... If no one can understand what you are going to create then they cannot understand how it helps the story and, more than likely, it will be rejected. But to effectively communicate these ideas is essential – and nine times out of 10 will lead to acceptance.” —Jeffrey A. Okun, VES, Visual Effects Supervisor

TOP LEFT: Norman Wang TOP RIGHT: Jeffrey A. Okun, VES BOTTOM LEFT: The King’s Man (Image copyright © 2019 20th Century Fox) BOTTOM RIGHT: Artemis Fowl (Photo: Nicola Dove. Copyright © 2019 Disney Enterprises, Inc.)

Norman Wang, CEO and Co-founder, Glassbox Technologies: “The VFX industry has been the pioneer of many groundbreaking innovations, and, as with any industry, true transformation comes from advances in technology making great ideas from the last decade faster, better and more accessible. Such is the case with the latest advances in real-time production. Real-time or Virtual Production is now the bridge between pre-production, production and post-production. Moving forward, it could simply replace these separate parts thanks to recent and forthcoming advances in real-time and extended reality technology, as well as the power of real-time rendering engines. I anticipate we will see more post-production-quality visual effects created with the efficiency and rapid turnaround of pre-production, all while under the intensity of an on-set production environment.”

Jeffrey A. Okun, VES, Visual Effects Supervisor: “Visualize, communicate and execute is, and will continue to be, the single most important creative process for anyone in VFX (and many other fields).

“Visualize: To be able to close one’s eyes and see the words on the page bloom into images that continue to change and evolve until the shot, sequence or project takes on a life that is bigger than yourself.

“Communicate: If no one can understand what you are going to create then they cannot understand how it helps the story and, more than likely, it will be rejected. But to effectively communicate these ideas is essential – and nine times out of 10 will lead to acceptance.

“Execute: Obviously, you have to be able to make your idea real. To do so one needs to understand the ever-changing techniques and processes, as well as how to work with others effectively and with an eye toward budget and time.”

TOP LEFT: Fredrik Limsater TOP RIGHT: Neishaw Ali MIDDLE: Saker Klippsten BOTTOM: Birds of Prey: And the Fantabulous Emancipation of One Harley Quinn (Image copyright © 2020 Warner Bros. Pictures)

“The next 10 years promise a significant period of evolution. Technical developments like web 3.0 and cloud, and creative innovation in real-time and spatial computing, promise a change not just in visual effects, but in how we perceive the world. ... Machine learning has the power to disrupt established VFX production models and empower faster, more efficient approaches to tackling work.” —Fredrik Limsater, CEO, ftrack

Fredrik Limsater, CEO, ftrack: “The next 10 years promise a significant period of evolution. Technical developments like web 3.0 and cloud, and creative innovation in real-time and spatial computing, promise a change not just in visual effects, but in how we perceive the world.

“The one technological progression I’m most interested in at ftrack, however, is that of machine learning. AI has the potential to open up new pathways for creative work, but also to bring more adaptive, agile workflows to production pipelines. Machine learning could mine data to predict the length of specific tasks or forecast workloads, for example, and thus enable a more fluid approach to studio schedules. At ftrack, we’re looking at AI-powered analytics to perfect project bids and provide automated, historical insight on tasks as they’re assigned.

“Machine learning has the power to disrupt established VFX production models and empower faster, more efficient approaches to tackling work.”

Neishaw Ali, President and Executive Producer, Spin VFX: “For decades, visual effects has been considered a black hole by some directors and producers, who have to wait for the magical creation of their vision after shoot wrap and often feel alienated from the post process. But what if filmmakers could create and see their vision come alive during the shoot, albeit at a 70% quality level, instead of having to wait until the wrap of visual effects? I feel the single most creative process that will redefine and shape VFX in the next decade will be the immersive way in which the director and key creative team can drive the creation of visual effects during prep and shoot, ahead of post. We are fortunate to be on the leading edge of technology, developing emerging tools for real-time creation of VFX on set.

“Early examples of this were seen in the Avatar movies and The Jungle Book, and more recently on The Lion King, in which the production team created a complex virtual representation of the scene that is intelligent and intuitive, designed to work in a way familiar to the filmmaker and to allow for true real-time collaboration, whereby they can design and create the layout and shots, and even select the most dramatic lighting during the shoot, applicable to their storytelling. However, in order for this to be successful, producers have to spend more time during early prep engaging the director, DP, visual effects and key creative team to construct the best approach for their show.”


Saker Klippsten, CTO, Zoic Studios: “If I were to take a guess, I would say that the single most important process will be the use of game engine technology such as Unreal Engine as a means to facilitate a faster final look development process, thus giving the creatives better tools to envision their story. Game and TV/film industries have been on a collision course for at least 20 years. The current process involves having highly skilled artists use Maya, Nuke and other similar tools that are very complicated to achieve a look, which then needs to be composited and rendered on CPUs. Game engines allow for faster real-time and near-real-time rendering capabilities, giving instant feedback and the ability to iterate rapidly. Many now support ray-tracing directly in the engine, which has put the quality level on par with traditional methods.”

TOP LEFT: Eliot Sakharov TOP RIGHT: Lauren McCallum MIDDLE: Aaron Weintraub BOTTOM: Peter Rabbit 2 (Image copyright © 2019 Sony Pictures Animation)

“Virtual Production will be the bleeding edge of VFX innovation in this next decade, giving filmmakers the tools to bring their real-world toolkits onto virtual, digital stages. Getting DPs and directors comfortable with virtual asset creation and the potential of virtual cameras is key as the industry really embraces virtual workflow.” —Lauren McCallum, Global Managing Director, Mill Film

Eliot Sakharov, Business and Technology Strategist, Media & Entertainment, Microsoft: “The cloud and intelligent edge will provide the compute power necessary for user experiences that defy today’s tech limitations. You’ll engage directly with your entertainment platform through AI and sensory interactions, with holographic and virtual environments nearly indistinguishable from the real thing. AI is being infused into every aspect of our experiences today, ranging from what we watch and listen to, to how we travel. Editorial and VFX are no different. Over the next decade we’ll see further and further advancements in capture technology, where volumetrics and photogrammetry will radically change how we immerse ourselves. With these lifelike captures we can begin to use AI to generate text-to-speech voices, gaze adjustments and interactive elements in xR that we never thought possible. Having a single interaction automatically translated into 80 languages will increase inclusion, immersion and engagement. No single advancement will define it, as it will be an ecosystem of technologies that are only just beginning to take shape. The end result? Immersive storytelling that expands the human/machine interface.”


Lauren McCallum, Global Managing Director, Mill Film: “Virtual Production will be the bleeding edge of VFX innovation in this next decade, giving filmmakers the tools to bring their real-world toolkits onto virtual, digital stages. Getting DPs and directors comfortable with virtual asset creation and the potential of virtual cameras is key as the industry really embraces virtual workflow. At Mill Film we believe that when you combine technical innovation with the most diverse creative teams, you’re bringing every possible resource and tool to your client.”

Aaron Weintraub, Sr. VFX Supervisor, MR. X: “There are so many emerging technologies that it’s difficult to choose just one, but I believe that the explosion of applications utilizing the recent advancements in Deep Learning is probably the most influential and pervasive development that we’ve seen in this industry in recent years. It’s been stated before, even in the pages of this magazine, but it bears repeating: Deep Learning is going to turn the way we do VFX on its head. The current cost of entry for the average artist is somewhat steep, requiring significant coding ability and computing power, but as the technology becomes adopted and packaged into off-the-shelf applications, and processor speeds continue to accelerate, we will see these obstacles decrease dramatically. We are going to see tools that we could literally only dream about before. Every time you sat frustrated, trying to force software to produce a result that you – as a human – intuitively knew how it should appear, wondering why it couldn’t just know how to do it, you will find a solution in Deep Learning.”

TOP LEFT: Aleksandar Pejic TOP RIGHT: Janelle Croshaw Ralla MIDDLE: Kim Libreri BOTTOM: Mulan (Image copyright © 2019 Disney Enterprises, Inc.)

“Many of the laborious and/or repetitive tasks in VFX have the potential to be solved and executed more efficiently by machines. This might replace today’s typical outsourcing workflows. Even simple examples like face swap show the potential impact machine learning has, particularly in regard to digital humans, de-aging and entirely virtual characters.” —Janelle Croshaw Ralla, VFX Supervisor

Aleksandar Pejic, VFX Supervisor, Cinesite: “The quality and complexity of visual effects have quadrupled over the past several years by utilizing the latest technology in computer graphics. What has not changed much, in my view, is the process itself, which is deeply connected to the filmmaking process. There is a story a director wants to tell and a shot which often requires some visual effects in order to visually support the narrative, and that’s where we help. Depending on the movie, shots are invisible when we work on a drama or character-driven project, or sometimes very apparent when we help superheroes save the world, but the creative process is very similar in both instances. It involves a number of steps in order to get the final result. Some of those steps, like rotoscoping, camera and object tracking, and compositing, are very labor intensive and will benefit tremendously from the latest software and hardware developments in AI and ML [machine learning]. We can also see some areas of animation where character tools have combined physical simulation with AI/ML. This significantly speeds up the iterations, producing amazing results that once took days, if not weeks.”


Janelle Croshaw Ralla, VFX Supervisor: “Based on the latest tech right now, the answer might be AI and machine learning, as well as AR and its impact on virtual production. Many of the laborious and/or repetitive tasks in VFX have the potential to be solved and executed more efficiently by machines. This might replace today’s typical outsourcing workflows. Even simple examples like face swap show the potential impact machine learning has, particularly in regard to digital humans, de-aging and entirely virtual characters.

“In the grand scheme of things, 10 years isn’t that far away. Ten years ago, after coming off of The Curious Case of Benjamin Button, we thought fully digital humans would quite possibly take over filmmaking and become widespread. The opposite ended up happening in many regards.”

Kim Libreri, CTO, Epic Games: “Virtual production will define and shape VFX for the new decade. While it may be a stretch to describe virtual production as a single creative process, it is the production methodology that will single-handedly impact the role that VFX plays in modern filmmaking. Real-time technology is making it possible to integrate processes that were formerly relegated to post-production cycles and bring them into the live-action shoot. Rather than passing assets or recreating them for each step in the production process, digital shots can be shared and updated anytime – even on set, in real-time – to satisfy the creative vision of the director or DP. Virtual production brings a level of creative freedom and spontaneity to the set that we haven’t seen in decades.”

TOP LEFT: Adam Myhill TOP RIGHT: Steve May MIDDLE: Chris Ferriter BOTTOM: Onward (Image copyright © 2019 Disney/Pixar)

“When I think of things that will revolutionize the industry, the first thing that comes to mind is game engines, specifically in relation to their use in virtual production workflows. … We’re quickly approaching a day where the visual quality of real-time rendering will match the quality levels that you would expect from a traditional VFX finals pipeline.” —Chris Ferriter, CEO, Halon Entertainment

Adam Myhill, Creative Director, M&E Innovation Group, Unity Technologies: “The future of filmmaking, especially in the VFX industry, is real-time. Up until now, in almost all cases, if a studio or director wants to explore a ‘what if?’ scenario, they are probably facing a staggering amount of extra work. Re-rendering that section, compositing – you’d see the results days later, or more. With the shift to real-time tools, you’re able to try out those ‘what ifs’ and see the changes right before your eyes, whether you’re doubling the size of a digital dinosaur, changing the lighting in a scene, moving some set around for a more dynamic shot or trying out a totally different camera approach. Creators are now free to experiment, test, iterate, and see everything in context, without waiting for a traditional pipeline to render it all out. Neill Blomkamp, director of Adam: The Mirror, famously changed the time of day for a scene on the last day of production. These tools are also breaking down silos across departments, allowing everyone to be involved in the process from the start. The types of creative input on a project are no longer limited by your stage in development. You can now easily shift between storyboarding, color grading, working on the edit, trying out different music, and experimenting with different lens language, all in one location.”


Steve May, Chief Technology Officer, Pixar Animation Studios: “The technological ability to produce high-quality CG content in real-time will have a dramatic impact on our creative process. It will enable artists, producers and directors to iterate creatively in the moment. This will result in a significant change to traditional production pipelines, artist tools and job descriptions. Production pipelines will become far less linear. Walls between artists of different disciplines will be blurred, which will increase the demand for generalists and tools that enable generalist workflows.”

Chris Ferriter, CEO, Halon Entertainment: “When I think of things that will revolutionize the industry, the first thing that comes to mind is game engines, specifically in relation to their use in virtual production workflows.

“At Halon we’ve been using game engines for all our visualization needs for several years now. The real-time nature allows for high-level look development, rapid iteration, and a very effective feedback loop with our clients. The real power lies in interactivity, though, allowing you to scout virtual sets or engage in multi-user collaborative sessions using VR or virtual cameras. Taken one step further, we can push all the content created using those tools into a virtual production setup utilizing LED panels to provide that interactivity on set. This includes on-set visualization, but also the potential to up-res and shoot final pixels for in-camera VFX.

“We’re quickly approaching a day where the visual quality of real-time rendering will match the quality levels that you would expect from a traditional VFX finals pipeline.”

“Creative processes evolve alongside technological advancement, and we’re on the cusp of a massive paradigm shift with growing access to real-time tools, cloud-based compute power and low-latency 5G networks. These core technologies will transform how artists work in a way that may be unrecognizable five years from now, let alone the next decade.” —Ed Ulbrich, President and GM, Method Studios

TOP LEFT: Ed Ulbrich TOP RIGHT: Leslie Chung BOTTOM LEFT: The New Mutants (Image copyright © 2020 Marvel Entertainment and Walt Disney Studios) BOTTOM RIGHT: Underwater (Image copyright © 2017 20th Century Fox)

Ed Ulbrich, President and GM, Method Studios: “Creative processes evolve alongside technological advancement, and we’re on the cusp of a massive paradigm shift with growing access to real-time tools, cloud-based compute power and low-latency 5G networks. These core technologies will transform how artists work in a way that may be unrecognizable five years from now, let alone the next decade. The continued adoption of machine learning, AI and neural networks will also heavily influence the creative process moving forward, automating repetitive tasks to free artist bandwidth so that they can focus on creative ideation.

“We’re already seeing many of these core technologies in production, and the timeframe to navigate difficult tasks is going to collapse as tools and technology continue to improve. It’s incredible what artists can accomplish today with real-time rendering technology and a desktop, nimbly handling work that used to require a hardcore renderfarm. Rapid prototyping can be done in nearly full resolution, and temp versions look more like finals.”

Leslie Chung, VFX Supervisor, Crafty Apes: “I think the expansion of collaborative ability through real-time virtual production will change the way filmmaking and VFX marry together.

“As fantastical films make more epic demands of VFX, the ability for directors, DPs, visual effects supervisors and editors to choreograph a scene in real-time together will become the biggest game-changer of our industry. I foresee it enabling a more streamlined and inspiring workflow, in which the impossible is directed by the creative team and then executed by a visual effects house. You’d have so many pieces of the concept puzzle able to be generated and molded much more quickly than you ever could before. With this reduction in iteration, so much of the guesswork can be taken out of how best to hit a director’s vision for his or her film. Ideally, then, the VFX studio, director and supervisor could riff more quickly and creatively together.”



COVER

CLASHES BETWEEN HUMANS AND FLYING FAERIES STOKE VFX FIRES IN CARNIVAL ROW By KEVIN H. MARTIN

TOP AND OPPOSITE TOP: Image Engine was responsible for shots requiring full CGI creature work. To ensure credible interaction between characters, a black-suited stunt performer festooned with tracking markers would give the actor a physical presence with which to interact. (Images courtesy of Image Engine, Legendary Television and Amazon)


Screenwriter Travis Beacham’s original take on Carnival Row was as a feature film story, one that made the Hollywood Black List as one of the best unproduced scripts. Beacham and showrunner René Echevarria expanded it into a series for Amazon Prime Video. The fantasy tale is set in a quasi-Victorian realm where man and faery share an uneasy existence. A series of gruesome murders further exacerbates the tensions between races as human detective Rycroft Philostrate (Orlando Bloom) crosses paths with a Fae, his one-time lover Vignette Stonemoss (Cara Delevingne), who has been illegally smuggling faeries from their ruined homeland into his Republic.

Shot in the Czech Republic, mostly at Prague’s Barrandov Studios, production of Carnival Row’s initial season was accomplished in 108 days. Visual Effects Supervisor Betsy Paterson, who had overseen effects for the Netflix version of Marco Polo and supervised features ranging from Marvel’s The Incredible Hulk to The Ring Two and Yogi Bear, joined the project early on, during what became a nearly 20-week pre-production period. “The design process was very much a group effort, a totally collaborative process,” she relates. “VFX became involved very heavily with the art department as well as the special effects makeup team, both of which created a number of sketches for the locations and characters. We looked at all of this work and made a determination about which aspects could best be handled on set. Sometimes that meant production built a partial set, and oftentimes it meant that a creature would be done with a prosthetic.”

The wings of the faeries were an immediate concern. “There are practical wings that the actors wear when they are static on the ground, but when they start to flap and fly, that transitions to VFX,” Paterson explains. “Hand-painted silicone wings were molded and dressed for each actor – their hero wings, if you will – and then there were hundreds of wings that could be employed for background faeries. The wings are translucent and iridescent, so getting that to look just right was very tricky. They only glow a little and in specific circumstances – just enough to seem magical – and the play of light from sources upon the wings is something production relied on a lot.”

Depicting flying creatures in fantasy has always been a filmmaking task that doesn’t offer easy solutions. “Right up front, we put a lot of stunt people up on wires. We experimented with how the wing motion would look in different conditions, such as taking off and landing. We knew that it was going to be a case of having actors on wires for as many shots as they possibly could do, because we wanted to keep things physical and ‘real’ in that way whenever possible. The general thrust was to avoid using greenscreen except when there was no alternative to doing so.”

“Hand-painted silicone wings were molded and dressed for each actor – their hero wings, if you will – and then there were hundreds of wings that could be employed for background faeries. The wings are translucent and iridescent, so getting that to look just right was very tricky. They only glow a little and in specific circumstances – just enough to seem magical – and the play of light from sources upon the wings is something production relied on a lot.” —Betsy Paterson, Visual Effects Supervisor

MIDDLE AND BOTTOM: Crowds of airborne faeries were often achieved by shooting multiple live-action performers on wires. Important Looking Pirates handled a number of these shots, adding wings with flapping cycles that reflected their physical capabilities, as damaged creatures would often demonstrate weakened wing action. (Images courtesy of Important Looking Pirates, Legendary Television and Amazon)

WINTER 2020 VFXVOICE.COM • 77

11/12/19 2:26 PM


COVER

CLASHES BETWEEN HUMANS AND FLYING FAERIES STOKE VFX FIRES IN CARNIVAL ROW By KEVIN H. MARTIN

TOP AND OPPOSITE TOP: Image Engine was responsible for shots requiring full CGI creature work. To ensure credible interaction between characters, a black-suited stunt performer festooned with tracking markers would give the actor a physical presence with which to interact. (Images courtesy of Image Engine, Legendary Television and Amazon)

76 • VFXVOICE.COM WINTER 2020


Screenwriter Travis Beacham’s original take on Carnival Row was as a feature film story, one which made the Hollywood Black List as one of the best unproduced scripts. Expanding it to a series for Amazon Prime Video involved Beacham and showrunner René Echevarria. The fantasy tale is set in a quasi-Victorian realm where man and faery share an uneasy existence. A series of gruesome murders further exacerbates the tensions between races, as human detective Rycroft Philostrate (Orlando Bloom) crosses paths with a Fae, his one-time lover Vignette Stonemoss (Cara Delevingne), who has been illegally smuggling faeries from their ruined homeland into his Republic.

Shot in the Czech Republic, mostly at Prague’s Barrandov Studios, production of Carnival Row’s initial season was accomplished in 108 days. Visual Effects Supervisor Betsy Paterson, who had overseen effects for the Netflix version of Marco Polo and supervised features ranging from Marvel’s The Incredible Hulk to The Ring Two and Yogi Bear, joined the project early on, during what became a nearly 20-week pre-production period. “The design process was very much a group effort, a totally collaborative process,” she relates. “VFX became involved very heavily with the art department as well as the special effects makeup team, both of which created a number of sketches for the locations and characters. We looked at all of this work and made a determination about which aspects could best be handled on set. Sometimes that meant production built a partial set, and oftentimes it meant that a creature would be done with a prosthetic.”

The wings of the faeries were an immediate concern. “There are practical wings that the actors wear when they are static on the ground, but when they start to flap and fly, that transitions to VFX,” Paterson explains. “Hand-painted silicone wings were molded and dressed for each actor – their hero wings, if you will

“Hand-painted silicone wings were molded and dressed for each actor – their hero wings, if you will – and then there were hundreds of wings that could be employed for background faeries. The wings are translucent and iridescent, so getting that to look just right was very tricky. They only glow a little and in specific circumstances – just enough to seem magical – and the play of light from sources upon the wings is something production relied on a lot.” —Betsy Paterson, Visual Effects Supervisor

MIDDLE AND BOTTOM: Crowds of airborne faeries were often achieved by shooting multiple live-action performers on wires. Important Looking Pirates handled a number of these shots, adding wings with flapping cycles that reflected their physical capabilities, as damaged creatures would often demonstrate weakened wing action. (Images courtesy of Important Looking Pirates, Legendary Television and Amazon)

– and then there were hundreds of wings that could be employed for background faeries. The wings are translucent and iridescent, so getting that to look just right was very tricky. They only glow a little and in specific circumstances – just enough to seem magical – and the play of light from sources upon the wings is something production relied on a lot.”

Depicting flying creatures in fantasy has always been a filmmaking task that doesn’t offer easy solutions. “Right up front, we put a lot of stunt people up on wires. We experimented with how the wing motion would look in different conditions, such as taking off and landing. We knew that it was going to be a case of having actors on wires for as many shots as they possibly could do, because we wanted to keep things physical and ‘real’ in that way whenever possible. The general thrust was to avoid using greenscreen except when there was no alternative to doing so.”




TOP TO BOTTOM: The production art department provided detailed conceptual art for the neo-Victorian cityscape, which was then built practically in Prague as a mix of one- and two-story structures. MR. X replaced the contemporary aspects above the set build and extended the industrial landscape in 3D, often embellishing the skies and clouds. (Images courtesy of MR. X, Legendary Television and Amazon)

For scenes of faeries in flight, their wings would be produced via CGI, though finessing and finalizing the look of the wings was time-consuming. “With the faeries, it was a case of iterate, iterate and iterate,” she states. “It became a matter of figuring out the speeds, whether they were more hummingbird or dragonfly – we wound up somewhere in the middle most of the time. We had fight scenes where somebody gets hurt, and the combat tactics involve deliberately going after your opponent’s wings to disrupt their flight pattern, so the injury manifests visually in the change to the wing motion. We have a few large crowd scenes in the sky during flashbacks – some with performers shot as greenscreen elements, some shot practically against the set and location, and others that are fully CG. For the most part, they were hand-animated, because I think the most we ever had was about 60, so flocking-type solutions weren’t needed. Pixomondo spent quite a while developing that wing look, coming up with a lot of interesting stuff. I monitored their progress, making sure it stayed within our desired window for the look.”

Paterson elected to break up vendor assignments by type of effect whenever possible. “I find that grouping all the creature shots or environmental work with dedicated vendors is the best way to maintain consistency over the run of a show,” she declares. “For some of the more complicated scenes that involve flying and fight scenes, we may choose to have a single vendor do everything. Pixomondo did environments as well as flying in one instance, and UPP did everything in another, but when the most efficient and effective way to work involves sharing elements between vendors, then we’ll go that route as well.” Other principal VFX houses included Important Looking Pirates, MR. X and Rhythm & Hues. Image Engine was responsible for the show’s full CGI creature work.
“There’s a very big baddie, plus various additional monsters met along the way,” Paterson reveals. “In most cases, we would have a stunt guy dressed in a black suit on set to give the actors something to play to or physically push on, which gets you more credibility than simple miming. For the big creature, we put our guy in a rubber suit and up on stilts, to get the proper height and eyeline. We also had many actors wearing prosthetics, so there was invariably a fair amount of cleanup on that end of things as well, with seams and weak edges. Every show requires some of that kind of final polish, and here there were all those flying rigs that needed painting out, too.”

Major action sequences and select key moments benefitted from the use of previsualization. “When you’re doing separate pieces on multiple locations for a single shot, every aspect needs to be considered and accounted for in advance of shooting,” notes Paterson. “We did previs whenever it was really necessary, particularly where we had to figure things out in some detail, like the specific beats during that first time we see Cara fly. She dives off a cliff, and in a very elaborate camera move, we follow while moving around her as she starts to fly. We had Cara on a wire atop a mountain and she was just so game, she just jumped right off. The visceral aspect of getting something live on location just can’t be duplicated.

“From there, it got very difficult,” she continues. “We had two transitions during that shot, first going from her in live-action


TOP TO BOTTOM: Pixomondo developed specific characteristics for the look of faery wings. Translucent and iridescent, the wings proved tricky to refine, needing to convey a magical quality as well as a seemingly credible means of propulsion. (Images courtesy of Pixomondo, Legendary Television and Amazon)

to CG as she falls, and then we come in on her face, and that is a greenscreen element of the actress. Because this was so close, there wasn’t anything to hide behind, like mist or objects between her and the camera, so it was a matter of working the transitions in when the camera was behind her. We knew her face would be live-action, but it would be on a body that was fully CG.”

The principal design influence in the city is Victorian. “The art department gave us a pretty detailed painting for that environment,” says Paterson. “Then we took that forward into 3D and figured out the details. Production had built huge sets that went up one or two stories, so we replaced and extended everything above that level, plus the sky in many instances. We got away with 2-1/2D versions in some instances, but most of those were fully 3D, since directors like moving the camera a lot these days, and we don’t want to hold them back or impose limits on their vision.”

Part of that vertical extension included an elevated train. “There was a lot of discussion about just how much interactive presence we could create for the train passing overhead,” Paterson states. “We created a kind of cart that could go by, but the effect of its passage wasn’t really noticeable on our day shoots. Ultimately, we were able to use that at night, with a big array of lights rolling down the track inside the cart, suggesting illumination from within.”

Another Victorian-looking element is a Jules Verne-like airship, which originated in the art department. “We took that much further along with development through the VFX artists,” acknowledges Paterson. “Airship battles were another aspect that benefitted from previs, with these vessels using a primitive form of hand-cranked machine guns against the faeries.
Those were built practically so the actors could use them, while we added the cannon fire coming out and the resulting explosions, which often feature practically shot pyro and dust elements. But when the explosion is supposed to take place in the middle of 100 extras, you’re not going to dare try actual gas pyrotechnics. Though we enhance all of the explosions in post, usually by embellishing with a hot, fiery core to the blast, it is always best to start with something real for the actors. They had special effects create a blast with a bunch of cork, so there was lots of debris raining down that couldn’t actually hurt anybody. Along with the actual dust and smoke, that really helps sell it.”

Paterson finds, “The color pipelines are pretty streamlined these days, so production LUTs are available right from the beginning. Then if everybody plays by the rules,” she chuckles, “everybody winds up working on the RAW neutral uncorrected plate. We have monitor tables that the DP is involved in creating, so you can view the footage in the proper color space as it looks in the final.”

To stay ahead of potential issues with the HDR finishing pass, Paterson had a 4K monitor set up in her office to do what she calls the “final final color check. That lets me see just how far we can push the HDR. The color range is just so huge now, and you can find yourself very surprised by stuff that you would never see on a standard monitor, but which is very evident – and potentially very distracting, even blinding – in HDR. You also obviously have to really trust your colorist, but even so, in my position, I am visiting there a lot during that process – just to make sure it all holds up and proceeds as expected.”



PROFILE

A veteran of more than 100 feature films and television productions, renowned visual effects producer, supervisor and main titles designer Dan Curry, VES, has enjoyed a storied career spanning over three decades working with some of the industry’s most influential and respected filmmakers. His inventive, groundbreaking work has been recognized with seven Primetime Emmy Awards (12 nominations) and a VES Award for Outstanding Visual Effects in a Broadcast Series, and in 2018 he was named a VES Fellow.

Curry is best known for his imaginative and trailblazing work on Star Trek: The Next Generation, Star Trek: Deep Space Nine, Star Trek: Voyager and Star Trek: Enterprise – an 18-year professional journey that he describes as “a breathtaking continuous flow, like watching a movie from the beginning to its completion.” His features portfolio also includes exploring the “final frontier” on the big screen, including Star Trek: Generations and Star Trek IV: The Voyage Home. His longtime residence in Asia and extensive world travel have been major sources of artistic inspiration, influences that found their way into the beloved sci-fi franchise.

THE EARLY YEARS

VES FELLOW DAN CURRY: A RENAISSANCE MAN’S CREATIVE JOURNEY TO THE STARS By NAOMI GOLDMAN

All images courtesy of Dan Curry. TOP: Dan Curry, VES, with his celebrated creation, Klingon weapon the bat’leth. BOTTOM: Visible Klingon.


Hearkening to his roots as a child in New York City, Curry recalls escaping from the heat of summer into the cool air of the corner movie theater, and marveling at the wonder of cinema. “Moments of transition in my life are marked by movies and how they influenced me – The Thing, Spartacus, Forbidden Planet, House of Wax. When I saw The Beast from 20,000 Fathoms, I realized there was a difference in the realities of the background, and I read up on rear projection. Using a broken 8mm projector that I could only advance one frame at a time, I shot with my brother and stretched tracing paper on the inside of a cardboard box to make a crude rear projection system – all so I could have dinosaurs chase my brother. I was about 10 years old and I was hooked.”

A celebrated painter, Curry started drawing and studying perspective at an early age. The artform came naturally. “I vividly remember moments in elementary school, walking down the hallway and seeing a drawing of warriors by an older student. I could figure out how they drew the jawline, and that very moment changed how I understood facial anatomy. I started doing storyboards for movies in my head before I really understood what storyboards were.”

Curry was an undergrad at Middlebury College in Vermont studying fine arts and theater when he launched himself into the world of theater production, an experience that shaped his understanding of story trajectory, character arcs, and coordination among every crew member to achieve a singular performance. “One day, I stumbled across an old wind-up Bolex 16mm movie camera in the basement and managed to talk the school into funding a medieval epic about an Icelandic peasant hiding out from his overseer. The theater department went all in on props and costumes, and we made something new as a creative community.”

GLOBETROTTER TO THE GALAXY

After college, Curry joined the Peace Corps, an experience that greatly shaped his life and career. He was assigned to build small dams and bridges for a remote, rural Thai village, and immersed himself in the culture and wisdom of the people. After a seven-month trek in Nepal, he moved to Bangkok and subsequently directed a Thai TV series (My Tree and the Magic Chopsticks), doing his own animation. He taught architectural drafting at Khon Kaen University, served as the production designer for the Bangkok Opera, designed a library for U.S. Information Services on an island in the Gulf of Siam, and won the commission to be the production designer for the King of Thailand’s Royal Ball. Curry learned to speak fluent Thai and Lao, and became a martial arts expert – a skill that found its way into the Star Trek franchise.

When he returned to the U.S., Curry worked as a bio-medical illustrator, and after receiving his MFA, taught fine arts, theater design, perspective drawing, graphic design and set fabrication. While in the MFA program at Humboldt State University, happenstance led him on the next leg of his creative path. Marcia Lucas (then wife of George Lucas) saw some of Curry’s paintings while on campus for a lecture on editing Taxi Driver. She introduced Curry to Dennis Muren, VES, matte artist Mike Pangrazio and others, and referred him to Peter Anderson, VES, at Universal – where Curry made his mark as a matte painter on Buck Rogers in the 25th Century and Battlestar Galactica.

In the late 1980s Curry met Gene Roddenberry, who was planning to revive Star Trek. “They initially thought I would create 40 stock shots that would suffice for the whole series. That lasted about a week and a half!” During his 18 years on the Star Trek franchise, Curry worked as Visual Effects Supervisor, then producer and second unit director. He also did extensive artwork for the shows, including conceptual design, motion-control photography, animation, electronic and optical compositing, matte painting, spaceship design, storyboarding and martial arts choreography.
Curry used his unique point of view and experience as a martial artist to develop a fighting style for the Klingons. He also designed several Klingon hand-to-hand combat weapons, including the Sword of Kahless, the mek’leth and, most notably, the primordial weapon inherited by alien character Worf – the bat’leth. “I have never liked movie weapons that just look cool but can’t be used. I am proud that the Korean Martial Arts Association recognized my bat’leth as the first new bladed weapon of the last century that is practical.”

He often used wacky everyday objects for the “worker bee” alien ships to preserve the budget for the bigger ships seen in close-ups. “Especially in the first season of Next Generation, I would make a lot of the guest alien ships of the week out of things like a shampoo bottle and a couple of toy submarines glued on the side, and I would add gray balls and other things stuck together that worked when backlit just right. I made a utility ship out of a broken toy robot with glued-on razor handles. I discovered a world filled with spaceship parts, if you just skew your perspective.”

During his tenure on Star Trek, the field advanced from analog to the digital age. “Early on, we were reticent to use CG, because it didn’t look that good. When we got to Deep Space Nine, our character Odo had to be CG. And because the space battles were big sequences, we started to embrace CG as a more efficient way to

TOP: Matte painting in oils of the original Galactica. MIDDLE: The Nacelle Tube Control Room. From Star Trek: The Next Generation episode “Eye of the Beholder” (1994). BOTTOM: With Amelia Earhart’s plane, Voyager.


deliver complex shots, but using a hybrid approach – CG in the background with physical ships in the foreground. There was some CG in the Voyager title sequence, but it is primarily paintings and models. Our first fully CG show was Enterprise. By that time the image quality had really improved, and it put a whole new set of limitless tools at our storytelling disposal.

“I was really proud of being part of the Star Trek family. There is something unique about that. Our stories, our characters carried a promise about the capacity of the future and an enterprise that is greater than the sum of its parts.”

IT’S A FAMILY AFFAIR

As a renowned main titles designer, Curry has created more than 118 feature title sequences. He cites the backstory of Back to School with Rodney Dangerfield as one of his most rewarding design experiences. “I made a mini-movie for the film. The director wanted me to create a bio of Rodney’s life from age 12 to 55 with stills. By happenstance, I grew up in a neighborhood really similar to his character’s, so I went and got shots from one of my old family albums, blew them up and did oil paintings – over the face of my father! Now picture me with composer Danny Elfman as we merged math and music to create the sound for this visual overture.”

ROOTS AND WINGS

TOP: Curry consulting with Star Trek director Jonathan Frakes on a VFX shot. MIDDLE: Directing Michael Dorn and Alan Scarfe in Battlestar Galactica. BOTTOM: On set working with liquid nitrogen.


Curry points to many historic role models as he built and shaped his career, including Ray Harryhausen, John Ford, Michael Curtiz, Albert Whitlock, Frank Capra and Thomas Edison, as well as a host of painters, including Picasso, Rembrandt, Dali, Monet, Toulouse-Lautrec, Cézanne, Michelangelo and Leonardo da Vinci.

When asked what advice he gives to young film students and aspiring creatives, Curry emphasizes the power and importance of history. “You need to be a renaissance person if you want to be a great filmmaker. If you want to creatively contribute to our artform, you need to understand the history of where we evolved from – the history of art, architecture, music and theater design. Even if your goal is to work in film, be involved in at least one live theater production so you can see how a depiction of character evolves over time and how an army of people comes together. You need to know how to draw – it’s essential. Some of the things we do are so esoteric that it’s hard to describe verbally, so drawings are universal. And you would be wise to learn to speak at least one other language. It gives you another way to understand how thoughts are created, how thinking evolves, how a culture evolves.

“I think film is the most powerful art form that our species has evolved. With that power I believe that as storytellers we have a responsibility to make a positive impact on our species and our society as a whole.”

Advises Curry, “The computer is the best paintbrush and best typewriter ever created, but technology does not give you a brain or ideas – that has to come from you. With the tools we have now, the only limit to what we can create is our own imagination. So the deeper you think and the more you embrace the phenomenon of life for yourself, the more you can give out to other people.”



LEGEND

GEARING UP FOR RAY HARRYHAUSEN’S 100TH ANNIVERSARY By BARBARA ROBERTSON

Images courtesy of the Ray and Diana Harryhausen Foundation.

To the legions of visual effects artists, animators and filmmakers who have been inspired by him and followed in his footsteps, Ray Harryhausen – stop-motion pioneer, visual effects creator and filmmaker – needs no introduction. Or does he?

“The thing that most stands out about Ray Harryhausen and is the most difficult to convey,” says filmmaker John Walsh, pausing, “is he didn’t invent stop-motion animation. Willis O’Brien did that with King Kong and Mighty Joe Young. But Ray stands apart.”

In 1989, while a young film student, Walsh made a documentary entitled Ray Harryhausen: Movement Into Life, released the next year. The self-described fanboy stayed in touch with Harryhausen, and is now a trustee of the Ray & Diana Harryhausen Foundation. “Ray was the person who would instigate a project,” Walsh says. “He was not brought in to facilitate someone else’s vision. He facilitated his own vision. I can’t think of anyone else who’s done that. When we think of the great visual and special effects people – Dennis Muren, VES, Rick Baker, and so forth – they are working on other people’s films. Willis, too, for the most part. But Ray consistently worked on his own films, on stories he wrote, creatures he created. He’s a unique part of film history.”

Harryhausen made 16 feature films during his lifetime and is perhaps best known for Mighty Joe Young (1949), The 7th Voyage of Sinbad (1958), Jason and the Argonauts (1963), One Million Years B.C. (1966) and Clash of the Titans (1981). His honors include the prestigious Gordon E. Sawyer Award from the Academy (presented by Ray Bradbury), the Visual Effects Society’s Lifetime Achievement Award, induction into the VES Hall of Fame (posthumously), the Science Fiction Hall of Fame, an honorary BAFTA (presented by Peter Jackson), the British Fantasy Society’s Wagner Award for his lifetime contribution, and a star on the Hollywood Walk of Fame.
Some of his most famous films have been restored and re-issued in 4K, including The 7th Voyage of Sinbad, which premiered last September in London – along with a surprise. At that premiere, Walsh launched his book, Harryhausen: The Lost Movies (Titan Books). It turns out that in addition to the 16 films Harryhausen worked on that made it into the cinema, many more didn’t. “I thought the number of films Ray didn’t make would be maybe 40 or 50,” Walsh says. “It was 80. We’ve found amazing examples of offers he received, and projects he was involved with.” The “we” Walsh refers to is the Ray & Diana Harryhausen Foundation, which safeguards Harryhausen’s collection of more than 50,000 objects.

CENTENNIAL EXHIBITION

In 1986, long before his death at age 92 in 2013, Harryhausen established the Foundation to look after his collection, protect his name, and educate new generations of stop-motion animators. Foundation trustees are now his daughter Vanessa Harryhausen, Simon Mackintosh and Walsh. Actress Caroline Munro, who appeared in The Golden Voyage of Sinbad, and special effects artist Randy Cook are advisors.


This year, the Foundation will celebrate the 100th anniversary of Harryhausen’s birth with a major exhibition at the Scottish National Gallery of Modern Art in Edinburgh from May 23 to October 25, 2020. “We’ll have the entire collection of his main creatures, concept art, storyboards and test footage,” Walsh says. “We expect to occupy two or three floors and a dozen different rooms or spaces.”

As part of the restoration process for previous exhibitions and the upcoming centenary, the Foundation has done MRI scanning and 3D mapping of many Harryhausen models to learn what’s inside. “We’re restoring pieces as we go, trying to get things back as close to how people remember them as possible,” Walsh says. “Ray never kept a log of repairs, never photographed making the creatures, or kept detailed notes on the manufacture. So we don’t know which compounds were used. And the only way to find out what’s inside is to scan. I think people are surprised to learn how heavy and robust these creatures are. They are not delicate. The rubber inside is hard and thick, and the ball-and-socket armatures are steel, and they’re larger than people think.”

THE COLLECTION

Of course, with 50,000 objects and growing, the collection from which they’ll draw for the exhibition extends beyond his main creatures, beyond the movies for which he’s known. “[The collection has] everything he kept from the early 1960s onward,” Walsh says. “And he kept everything. Some of the most interesting sketches we have were on the back of letterhead or scribbled on legal pads – he would use both sides. It wasn’t that he was impoverished. He was successful and lived a comfortable life in West London. I think it was because he was a child of the Great Depression.”

“Everything” also includes objects that were sent to Harryhausen, and the Foundation is happy to continue receiving items for the collection. “We’re always surprised by what people donate to us,” Walsh says. “We have an interesting mask in the collection of the Cyclops from The 7th Voyage of Sinbad, sent by a fan in 1960. That fan was Rick Baker, who was a schoolboy then. Ray was always generous with his time. It was the same with me when I was a film student. But the latest discovery was Ray’s preliminary sketches on a legal pad for John Huston’s Moby Dick.” It was a project Harryhausen had never discussed publicly.

THE LOST FILMS

Hidden within the collection are hundreds of objects from the films that were never made. When the trustees were considering what they could do for people who couldn’t make it to the centennial exhibition in Edinburgh, Walsh suggested documenting Harryhausen’s “lost films” – the films he never made – for the public at large. Walsh didn’t know what he’d find when he started digging into the massive collection, but he found treasures. “If you asked me to find details about Jason or Clash, I could get that for you in a day or two, but the lost films are scattered amongst

OPPOSITE TOP: Ray Harryhausen with Medusa from Clash of the Titans (1981). (Photo: Andy Johnson) OPPOSITE BOTTOM: Kraken in Clash of the Titans. TOP: Ray animates Snakewoman from The 7th Voyage of Sinbad (1958). MIDDLE AND BOTTOM: Soldiers in the Skeleton’s Army from Jason and the Argonauts (1963). (Photo: Andy Johnson)



the collection,” Walsh says. “It was an amazing experience. We didn’t know what we were looking for, but we found amazing artwork from films he didn’t make.”

Complicating the search was the knowledge that Harryhausen hadn’t been forthcoming about projects that were turned down. “I asked him why he didn’t make Sinbad Goes to Mars, because I had read the book,” Walsh says. “He said he didn’t want to talk about it. I suppose it was frustrating for him. It’s not a great creative process. It’s a scar. So he made fewer attempts to make other films, but the attempts he made were more detailed and the artwork more elaborate.”

Among Walsh’s finds were some unexpected objects. “That there were [things from] unmade Sinbad films wasn’t a surprise,” Walsh says. “But Ray tried to make Conan films in the 1960s. He also tried to get involved with H. G. Wells’ The Food of the Gods. He didn’t make Force of the Titans, the follow-up to Clash of the Titans, but we have his artwork and sculptures for that.”

FUTURE PLANS

TOP TO BOTTOM: Animating Sabre Tooth Tiger and Trog from Sinbad and the Eye of the Tiger (1977). Positioning a skeleton for Jason and the Argonauts (1963). Ray Harryhausen, John Walsh and John Landis during commentary for 2012 recordings. Book cover for Harryhausen: The Lost Movies by John Walsh (Titan Books).

The Foundation doesn’t seek donations and instead hopes to raise money through projects such as books, speaking events and films. “We’ve been developing Force of the Trojan,” Walsh says. “And we have a delicious solution for the question of digital versus stop-motion. We want to have a fight between a digital character and a tabletop character. People can make up their own minds about which technique is most magical.”

That fits with the Foundation’s goal: to protect Harryhausen’s legacy and educate a new generation of filmgoers. “Ray thought this would be a forgotten technique like shadow puppetry,” Walsh says. “But there are more minutes of stop motion being produced today than at any time during Ray’s career. So we would like to set up a scholarship for a course for students doing stop motion. We’ll do that in 2021, once the centenary is out of the way.” If money were no limit, the Foundation would have a permanent museum housing a film school for students who wish to learn stop motion.

Walsh notes that Harryhausen never had a video assist camera, and no one helped him with loading film or lighting. He just got on with it. He looked at dailies the next day or the day after. “I think it gave him a certain continuity of performance,” Walsh says, “an individual performance that’s hard to get when working to an overall house style. We’ve played the 4K films to very young audiences and they love it. Kids 10 and 11 cover their eyes because they don’t want to be turned to stone. Parents say they don’t get that from today’s films.”

It’s hard to imagine a major feature film being made today with one visual effects artist instigating the project – writing the script, designing the characters, building them and animating them. “I think that’s Ray’s legacy,” Walsh says. “To show it can be done. Ray was unique. He was the person who would instigate. Maybe there are visual effects people out there who will think, ‘I can be that guy.’ You never know.”





[ VFX CAREERS ]

VFX HR Lightning Round for 2020 By JIM McCULLAUGH

VFX Voice asked Human Resources executives and recruiters to share their thoughts on several important questions about the VFX workplace of the future. Here’s what they had to say.

VFX Voice: What is the most important skill to develop for a long-term career?

CLOCKWISE: Susan O’Neal, Brooke Brigham, Anna Hodge

Susan O’Neal, BLT Recruiting: “To me, the most important skills to develop for a long-term career are those ‘soft’ skills that are ingrained in a person’s character and shaped by experiences outside of the studio. Each project will have different needs insofar as technical expertise, but all projects need people with well-developed communication skills – great team players who are curious and adaptable, with a solid work ethic and problem-solving skills. When hiring producers, I look for people who have worked in and around customer service. I find people who are used to working for tips have wonderful insights into the subtleties of human nature and can adapt the project’s voice to gain the greatest creative and fiscal success. When reviewing résumés for junior artists, I tend to look for extra-curriculars, and will ask questions not only about proficiencies on the technical side, but also about the individual’s background and interests outside of the studio.”

Brooke Brigham, Head of Recruiting, Zoic Studios: “Soft skills and the ability to get along with all personality types. VFX is highly collaborative and global. Great technical and/or artistic skills will be a given; leadership, teamwork and adaptability, along with interpersonal skills, will take someone to higher-level roles.”

Anna Hodge, Manager of Training and Education, Rising Sun: “In visual effects, it is crucial for an artist to understand the principles of composition and light, observation, anatomy, movement and aesthetics, and have a general passion for watching films. A basic understanding of a programming language doesn’t hurt either! These skills will assist you in becoming an artist, but if you are looking at a long-term career in VFX, you also need to adapt and change to industry needs.”

VFX Voice: What are the most important things producers are looking for when hiring?
O’Neal: “Depending on the role and the project, producers will want the most talented, smartest and most accountable person possible on their team. The talent part can be ascertained by the reel, which is the artist’s calling card. Vimeo and YouTube links are sufficient, but as I work primarily in high-end commercials, the preference is for the


“Change is also inevitable given that the overall industry has shifted so much in the last 10 years in the areas of client expectations, tax incentives, outsourcing, and delivery platforms with streaming services and Netflix emerging.”
—Anna Hodge, Manager of Training and Education, Rising Sun

artist to have a well-presented website with reel, bio, résumé, contact details and, ideally, a ‘personal’ section which may showcase work done not-for-pay – personal projects like photography, sculpture, improv comedy. I’ve seen all kinds of interesting things. The first thing on the reel should be the most recent and best work.”

Brigham: “Outside of technical or job-related skills, the most important thing hiring managers are looking for is culture fit. That doesn’t mean the person has to be like everyone else, but the hiring manager has to see the candidate within the team and environment, and ask whether the candidate’s unique personality and skill set would add value to the team.”

Hodge: “When hiring, VFX producers look for good communicators, team players and people who can work calmly under pressure to a schedule and budget. They need to be organized, great multi-taskers, able to forecast, resourceful and solution-focused, ensuring that things run smoothly behind the scenes at all times. Even at peak times of craziness when there is a lot going on, a good hire will never panic!”

VFX Voice: Should you specialize? Or be a generalist?

O’Neal: “The decision to specialize or be a generalist depends largely on where the artist wants to work. If you want to work at a large animation studio where projects last three years, you may enter as a generalist, but at the end of the show you can become a specialist due to the nature of large projects. It’s much more cost-efficient in these cases to have artists specialize (layout, rigging, lookdev, etc.).
If you choose to work in short-form projects with quicker turnaround, you will likely be valued more if you’re a generalist with a specialty.”

Brigham: “We have a really great mix of specialized skill sets and generalists at Zoic, so this is a tough question for us. We like having a mix of both. Even if someone has a specialization, they tend to also have secret skills that come in handy in our fast-paced environment.”

Hodge: “This really depends on the company that you’re looking to work for. In a small facility, it’s a good idea to have a variety of skills, as this will carry you through to other roles in that company where hiring a specialist may not be feasible. At a larger facility, being a specialist allows you to really master a particular skill like FX, compositing or animation. You can also progress faster from junior through to senior artist. Most importantly, don’t restrict your opportunities to just feature film VFX.”

VFX Voice: What’s the most important tech to understand right now?

O’Neal: “I am seeing a rise in the need for people to be proficient in real-time VFX production. It’s interesting how old-school techniques like rear-screen projection (so that everything is shot ‘in-camera’ with only cleanup to happen in VFX) are being employed again in a real-time capacity. I often get requests for artists who can develop game-engine-ready assets, and I’m seeing more and more finished VFX being delivered in time for the live-action shoot with the artists. The line between previs/postvis/VFX production is blurring.”

Brigham: “The tech to understand right now, and which will be important in five years, is game engine technology. It’s going to change how lighting, layout and environment pipelines are currently run and affect all sorts of production. Learning Unreal Engine and C++ and understanding where it excels and where the current weak points are will put you ahead of the curve.”

Hodge: “There’s a lot going on in the area of AI and machine learning and its influence on VFX. AI serves many purposes but, ultimately, it makes many tasks easier to do in less time. It also provides opportunities for advancement in areas like motion-capture, facial-capture systems and content creation. Several companies are already ‘dabbling’ with AI and machine learning, and the results are pretty amazing. Real-time rendering and the blending of games and VFX is another area to keep an eye on.”

VFX Voice: What will be the most important skill to have five years from now?

O’Neal: “As in any industry, adaptability and curiosity are key skills for growth.
It’s important for an artist to show mastery of specific skills insofar as software usage goes, but keeping an eye out for current trends in production and dabbling in the unknown assures a cutting-edge advantage over those who ask no questions about process and presume that the status quo is here to stay.”

Hodge: “Given the nature of the VFX industry, with technology changing and advancing rapidly, a successful artist should expect change and embrace it. Change is also inevitable given that the overall industry has shifted so much in the last 10 years in the areas of client expectations, tax incentives, outsourcing, and delivery platforms with streaming services and Netflix emerging.”





[ VES SECTION SPOTLIGHT: LOS ANGELES ]

Soaring High in the Film and TV Capital of the World By NAOMI GOLDMAN

TOP: The L.A. Section Board of Managers celebrating the thriving VFX community. BOTTOM: L.A. Section members at the “Light Fields and Google” AR special event at Google HQ in Playa Vista.


The Visual Effects Society’s growing international presence owes much to its dynamic network of Sections, which galvanize their regional VFX communities while advancing the reach and reputation of the Society and the industry worldwide. The thriving L.A. Section, the largest of VES’s regional groups, boasts 1,400 members spread across Los Angeles and Orange counties, and cites the diversity of its membership and Board of Managers among its greatest strengths.

“It’s vital to have a wide range of experiences and voices in the room as we develop more inclusive and innovative programming. Creativity suffers when we don’t have people different from us inspiring or challenging us to think differently,” says Section Co-Chair Sarah McGrail. “Some of our members started in production; others began working in film schools, interning at VFX houses, or at home creating their own content. They work in film, episodics, commercials, animation, games, VR/AR – you name it, we have a member that can do it!”

When asked about the unique nature of the L.A. market, the Section leadership is quick to point out what makes the area so special. “Being in the heart of Hollywood, we set an example of quality and integrity for visual effects as the gold standard for the industry,” says Kaitlyn Yang, Section Co-Chair. “And with some of the best film and VFX schools in town, the next generation of filmmakers is watching us. We have the potential to help shape and support these aspiring artists and innovators.”

“There’s something special about creating visual effects in Los Angeles, a city where the art, craft and technology of moviemaking have been coming together for generations,” says McGrail. “Our proximity to Hollywood and Silicon Beach gives us access to great minds and entrepreneurs pushing the edges of what’s next.”

There is a hotbed of VFX activity and inspiration at home in L.A., where companies are doing groundbreaking work. Apple, Netflix and Hulu have local production hubs, and local VFX facilities do a huge amount of work to support them. The L.A. Section Managers point to their members as an enormous point of pride, with some of the ‘original heroes of VFX’ among their ranks.

The Section has a robust roster of programming, including film screenings, demos, educational and special events. To grow its ranks, it holds Spring and Fall Membership Drive mixers, and members are encouraged to bring prospects to events year-round to experience firsthand the benefits of VES membership and community. The Section’s signature events are the Summer BBQ and Holiday Party, which draw hundreds of attendees. Rotomaker serves as this year’s annual Gold Section Sponsor, and the roster of generous sponsors has enabled the Section to significantly amp up its overall programming calendar.

“It’s vital to have a wide range of experiences and voices in the room as we develop more inclusive and innovative programming. Creativity suffers when we don’t have people different from us inspiring or challenging us to think differently.”
—Sarah McGrail, L.A. Section Co-Chair

The Vision Committee held a large-scale event, “Light Fields and Google AR,” at Google’s Playa Vista headquarters, which featured Academy Award winner Paul Debevec, VES, and Ivan Neulander, a Lead Rendering Engineer at Google, along with a special-access tour of the Google HQ. The Third Floor and Picture Shop have provided space for meetups, membership drives and board meetings, and the Section has collaborated with industry organizations including SMPTE, Women in Animation, the PGA, the DGA and the Art Directors Guild to serve each other’s missions. The Section also worked closely with VES HQ staff to put on last summer’s wildly successful SIGGRAPH party, which attracted VES members and guests from around the globe.

Looking to the year ahead, the Section Managers are focused on several key goals. “We are going to do more to celebrate the amazing accomplishments of our members, and to share their artistic and technological innovations with the L.A. Section and the world,” says McGrail. “Expect to see us highlighting VES’s benefits to members, including health care, networking events and subscriptions to VFX Voice,” adds Yang, “as well as encouraging more engagement among members, including joining committees and running for the Board, and continuing to develop diversity initiatives to further the movement on this important facet of the VFX industry.”

The Section Board members offer an impassioned roundup of what belonging to the Society means to them. “I belong to the VES because of my fellow members, for the strong camaraderie, to celebrate their accomplishments and share in their creative journeys, which is all an amazing amount of fun!” says Toni Pace Carstensen, Section Treasurer.

“The VES brings together the legendary artists who made VFX a household name with the new ‘blood’ carving pathways forward. If you have visual effects in your blood, as I do, this is the place you want to be,” says Steven Blasini, Board of Managers member.

“What we do as artists is now a part of every field, format and department in the entertainment industry. It’s up to us to help our VFX community and each other, and to shape the perception and reality of our corner of the entertainment industry while continuing to bring the magic of storytelling to life,” comments Van Ling, Board of Managers member.

“Being a part of VES and serving our members is a privilege,” concur McGrail and Yang. “Our members are a worldly bunch! We love connecting with members in other Sections, sharing knowledge and helping new Sections grow and succeed, because we are part of a rich and collaborative global community.”

TOP: Bringing live demos of the latest VR tech to members and guests. MIDDLE: L.A. Section Secretary Pam Hogarth enjoying the Summer BBQ with special guests. BOTTOM: Welcoming the LA VFX community to its festive annual Summer BBQ.





[ VES NEWS ]

VES 2019 Honors Celebration, VES/Autodesk Initiative, and VES Bay Area Section’s ‘FAANG’ Forum By NAOMI GOLDMAN

TOP LEFT: Lifetime member Michael Fink, VES, and his wife Melissa. TOP RIGHT: Toni Pace Carstensen and Bill Taylor, VES, enjoying the rooftop party.

In October, the VES honored a host of distinguished visual effects practitioners at a special reception at the Beverly Hilton Hotel in Beverly Hills, California. The honorees included:

Founders Award Recipient: Susan Thurmond O’Neal
Lifetime Member: Michael Fink, VES
Honorary Member: Mike Brodersen of FotoKem
VES Fellows: Neil Corbould, VES; Harrison Ellenshaw, VES; Susan Zwerman, VES
VES 2019 Hall of Fame Inductees: Walt Disney, Stanley Kubrick, Stan Lee

“Our VES honorees represent a group of exceptional artists, innovators and professionals who have made significant contributions to the field of visual effects,” states Mike Chambers, VES Board Chair. “We are proud to recognize those who helped shape our shared legacy and continue to inspire future generations of VFX practitioners.”

TOP LEFT: Celebrating VES 2019 honorees, from left to right: Neil Corbould, VES; Mike Brodersen; Susan Zwerman, VES; Susan Thurmond O’Neal; Michael Fink, VES; Michelle Lund for her grandfather, Walt Disney; Harrison Ellenshaw, VES; and Doug Trumbull, VES, for Stanley Kubrick. TOP RIGHT: Founders Award recipient Susan Thurmond O’Neal flanked by Mike Chambers, VES Chair; Brooke Lyndon-Stanford, VES Treasurer; and Eric Roth, VES Executive Director. BOTTOM: Michelle Lund, granddaughter of Walt Disney, on his induction into the VES Hall of Fame.


VES AND AUTODESK LAUNCH NEW INITIATIVE TO SPOTLIGHT DIVERSITY IN VFX

This past fall, the VES launched the dynamic “Ask Me Anything: VFX Pros Tell All” initiative in concert with Autodesk, a worldwide leader in 3D design, engineering and entertainment software. Together, VES and Autodesk are shining a light on diversity in visual effects with a lineup of web talks featuring successful visual effects professionals from different backgrounds. The live webcast series kicked off in October with special guest Karen Dufilho, creative visionary and Academy- and Emmy Award-winning Executive Producer at Google Spotlight Stories. Other VFX pros slated for the interactive forums ahead include Sidney Kombo-Kintombo, award-winning Animation Supervisor at Weta Digital; Neishaw Ali, Executive Producer and President at SPIN VFX; Greg Anderson, Head of Studio-NY and Senior VFX Supervisor at FuseFX; and Lauren McCallum, Global Managing Director at Mill Film. For more info and to register for free live webcasts, go to: www.visualeffectssociety.com/ama.

MIDDLE: Susan Zwerman, VES, and Chuck Finance. BOTTOM: VES Bay Area Section Chair David Valentin (second from left) flanked by VFX and tech pros at its first FAANG forum. (Photo: Anet Hershey Photography)

VES BAY AREA SECTION HOSTS DYNAMIC VFX-TECH CROSSOVER FORUM

In September, the VES Bay Area Section held a dynamic panel discussion and networking event, “FAANG: Opportunities for VFX Veterans at the Bay Area’s Leading Tech Companies.” In addition to the FAANG companies (Facebook, Apple, Amazon, Netflix and Google), the Section broadened the professional development roundtable to include Adobe and NVIDIA. Almost 100 VES members and guests attended the interactive forum to hear insights from visual effects practitioners who have parlayed their crossover skills into positions with leading tech companies.

“Our first FAANG event was part of our focused effort to more fully embrace Silicon Valley as a part of our Section and our community – recognizing and forging stronger relationships with the talented professionals who make the tools, as well as the artists who use them,” says Bay Area Section Chair David Valentin. “We will certainly plan more events like this to help our diverse members connect and leverage their talents across our expansive industry.”





[ FINAL FRAME ]

Peering into the Future – in 1927

People in every era are curious about the future. Legendary director Fritz Lang was certainly curious back in the original “Roaring Twenties,” when he made his landmark 1927 sci-fi film Metropolis. Created in Germany, this black-and-white masterpiece centers on a future utopian city called Metropolis, where wealthy residents enjoy the good life above the surface while downtrodden workers toil ceaselessly underground for the benefit of these elites. The film is notable not only for its subject matter, but also for its pioneering visual and special effects. Eugen Schüfftan was credited with much of the special effects wizardry. He fabricated a miniature of the city, mounted a camera on a swing, and devised the “Schüfftan Process,” in which mirrors were used to create the illusion that actors were occupying miniature sets.

The image above depicts the city. The shots that establish the city, with cars, planes and moving elevated trains, were filmed using stop-motion photography. The autos were modeled after a new generation of taxi cabs found in Berlin. It took several months to build this miniature city, which was inspired by Lang’s 1924 visit to New York, where he was dazzled by the skyline, tall buildings and lighting. In addition to Schüfftan, Ernst Kunstmann was credited with special effects. The other technicians included Erich Kettelhut for trick photography and painting effects, model makers Willy Muller and Edmund Zeihfuss, trick photography assistant Hugo O. Schulze, and Konstantin Irmen-Tschet for special photographic effects sequences.



