While working on the Star Wars film Rogue One, director of photography Greig Fraser, ACS, ASC, proposed shooting cockpit scenes using an LED screen that displayed the exterior space environments. When implemented, the process would allow the cinematographer to capture in-camera interactive light and reflection effects from the background visual effects created ahead of shooting by Industrial Light & Magic (ILM).
That pioneering work initiated by Fraser became a jumping-off point for the Disney+ series The Mandalorian. ILM, along with several other vendors, collaborated on the creation of a virtual, largely in-camera workflow, which employed a screen-based capture volume that successfully matched the look and feel of the original Star Wars features. Taking place after the events depicted in Return of the Jedi, the eight-episode first season chronicles the exploits of a bounty hunter (Pedro Pascal) as he roams the stellar backroads of a galaxy far, far away. However, before The Mandalorian could go to space, ILM VFX supervisor Richard Bluff, along with associate Kim Libreri and ILM creative director Rob
Bredow, met with Fraser to talk about a virtual production approach with Lucasfilm staffers. "And," as Fraser explains, "it turned out that [ILM chief creative officer] John Knoll had been looking at building technology that would let us expand that Rogue approach to whole environments."

Bluff, who was on board ten years earlier when George Lucas was exploring the possibility of a live-action Star Wars TV series, picks up the narrative. "Jon [Mandalorian showrunner Jon Favreau] was adamant that, due to the scope and scale of what would be expected from a live-action Star Wars TV show, we needed a game-changing approach to the existing TV production mold. It was the same conclusion George [Lucas] had arrived at more than 10 years ago during his TV explorations; however, at that time the technology wasn't around to spur any visionary approaches.

"Jon already had extensive experience in immersive film and multimedia workflows from directing The Jungle Book and The Lion King," Bluff continues. "And he knew that any breakthrough was likely to involve real-time game technology. He challenged me and other key creatives to fully explore how we could crack the obvious production challenges of a sprawling live-action Star Wars TV show."

Bluff says Libreri had been pursuing game-engine technology to support animated or live-action productions, and that became a component of what was pitched to Favreau. "ILM then partnered with Epic [Games] to make their Unreal Engine for gaming into a robust production tool," Bluff adds, "allowing for real-time display on LED screen walls." (In June 2018, a 35-foot-wide
capture volume was built to test the screen's potential.)

Nine-millimeter-pixel-pitch LED panels had been used on Rogue One, a limitation Fraser now sees as archaic compared to the 2.8-millimeter panels currently deployed. "The results on screen have a lot less moiré, which is the trickiest part of working out the shooting of LED screens," Fraser reveals. "If the screen is in very sharp focus, the moiré can come through. That factored into my decision to shoot as large a format as possible, to ensure the shallowest possible depth of field." (A back-of-the-envelope illustration of the optics involved appears at the end of this section.)

Fraser chose the ARRI ALEXA LF, adding that "Panavision was just building the Ultra Vista lenses, so we may have been the first to use them. They have a fast fall-off, so we duck the moiré issue. And the anamorphic aspect is very pleasing, keeping with the established softer analog look going back to 1977. I took them along while shooting Dune, but now they're back for season two."

Rogue One's second-unit director of photography, Barry "Baz" Idoine, was chosen to follow Fraser's work on the pilot and complete the first season. "Prepping in May and June, we wanted to see what lens worked best for the volume and gave us the aesthetic," Idoine elaborates. "Panavision's Dan Sasaki gave us prototypes for the 75mm and 100mm. We asked for two full sets [T2.5 50mm, 65mm, 75mm, 100mm, 135mm, 150mm and 180mm], based on our desired focal lengths. Combining the LF sensor with the 1.65 squeeze on the Ultra Vistas, you get a native 2.37 [aspect ratio]. Those lenses have a handmade feel in addition to being large format, and a great sense of character. We didn't use any diffusion filtration at all."

While details of the virtual production process were being ironed out, production designer Andrew Jones relied heavily on Lucasfilm's brain trust in conceptualizing the look of the show. "There was a clear aesthetic coming from their art department in San Francisco, led by Doug Chiang," Jones reports. "That team contributes massively to every Star Wars project and was central to our show, creating concept art for every set. We tried to interpret and reproduce their concepts faithfully, including as much production value within the scope of this series as we could on this crazy schedule. We tried not to let the process influence the concepts, initially at least. Getting deeper into things, we realized what was possible with these [LED] screens within the capture volume. So we began offering up more ideas for the environments through a concept artist of our own in Los Angeles. Doug was in contact with Jon throughout, providing weekly reviews."
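Fraser's and Idoine's figures above lend themselves to a quick back-of-the-envelope check. The Python sketch below is purely illustrative: the ALEXA LF open-gate dimensions (4448 x 3096 photosites), the Ultra Vistas' 1.65x squeeze, the 2.8-millimeter LED pixel pitch and the T2.5 stop come from the article or published ARRI specifications, while the 100mm focal length and the camera-to-actor and camera-to-wall distances are assumptions chosen only to make the arithmetic concrete.

    # Illustrative numbers only -- not production figures from the show.

    def delivered_aspect(width_px: int, height_px: int, squeeze: float) -> float:
        """Native sensor aspect ratio multiplied by the anamorphic squeeze."""
        return (width_px / height_px) * squeeze

    def background_blur_mm(focal_mm: float, t_stop: float,
                           focus_mm: float, wall_mm: float) -> float:
        """Blur-circle diameter on the sensor for a point on the LED wall
        when the lens is focused on the actor (thin-lens approximation)."""
        aperture = focal_mm / t_stop
        return aperture * (wall_mm - focus_mm) / wall_mm * focal_mm / (focus_mm - focal_mm)

    def projected_pixel_mm(pitch_mm: float, focal_mm: float, wall_mm: float) -> float:
        """Size of one LED pixel as imaged on the sensor if the wall were in focus."""
        return pitch_mm * focal_mm / (wall_mm - focal_mm)

    # ALEXA LF open gate is 4448 x 3096 photosites; the Ultra Vistas squeeze 1.65x.
    print(f"delivered aspect ~ {delivered_aspect(4448, 3096, 1.65):.2f}:1")   # ~2.37:1

    # Assumed setup: 100mm prime at T2.5, actor 3 m from camera, LED wall at 6 m.
    blur = background_blur_mm(100, 2.5, 3000, 6000)    # ~0.69 mm
    pixel = projected_pixel_mm(2.8, 100, 6000)          # ~0.047 mm
    print(f"wall blur circle ~ {blur:.2f} mm vs. one LED pixel ~ {pixel:.3f} mm")

Under those assumed distances, the wall's blur circle comes out roughly fifteen times wider than the image of a single LED pixel, which is why shooting large format with fast anamorphic glass and a shallow depth of field suppresses moiré from the panels so effectively.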