Special Edition CineAlta Issue 9 - Billy Lynn's Long Halftime Walk


behind the scenes

BILLY LYNN’S LONG HALFTIME WALK

World’s first 120 fps 4K 3D feature film

Director Ang Lee, cinematographer John Toll, ASC and team break new ground

Beyond Definition

Issue 9


Billy Lynn’s Long Halftime Walk Opens the Door to the Future

Interviews by Peter Crithary
Written by David Heuring

Billy Lynn’s Long Halftime Walk is a feature film based on an acclaimed, best-selling novel written by Ben Fountain. At the heart of the tale is a 19-year-old soldier who survives a harrowing battle in Iraq, and is subsequently brought home for a patriotic victory tour that culminates in a spectacular halftime show at an NFL football game. The contrast between the brutal reality of war and the perception of it back home is stark.

To translate the novel to the screen, and to communicate to audiences the sensory overload of combat, three-time Academy Award® winner Ang Lee (Life of Pi; Crouching Tiger, Hidden Dragon; Brokeback Mountain) envisioned nothing less than a new kind of cinema experience. Lee’s focus and drive led to a revolutionary new format, and inspired filmmaking and technology pros to rethink and reinvent the process at every step.

In technical jargon, this new cinematic format is referred to as 4K 120 frames per second HDR 3D. But many who have seen it say it’s a glimpse of the future of cinema, with the potential to save theatrical movies. “I see this as comparable to when sound was introduced to movies in 1927,” says Ben Gervais, who served as technical supervisor on the project. “What is considered ‘normal’ is about to be redefined.”




The Drive to Innovate

The path to a new format began with Lee’s desire to improve the 3D experience, specifically the problem of motion judder, which the director found annoying and limiting in the making of Life of Pi. His biography helps explain the mindset that he brought to the undertaking. Lee says that given the choice, he might never have become a film director. “It’s really the only thing I do well,” he says. “I grew up in Taiwan in the 1960s and ‘70s. We were taught to do our study, go to college and become a useful person – anything else was considered a distraction. I was always artistically inclined, but I didn’t have a lot of chances to get in touch with art. I felt guilty. My mind was always wandering – it still does. But we did watch movies, and I knew I loved them.”

Lee eventually graduated from the National Taiwan College of Arts in 1975 and then came to the U.S. to earn a B.F.A. degree in Theater Direction at the University of Illinois at Urbana-Champaign, followed by a Master’s degree in Film Production at New York University. He was on his way.

“When I got to see how films are made, it was immediately clear that it was the right thing for me,” he says. “Everything began to make sense. I felt I could function in that world. I trust movies more than real life. Filmmaking has never felt like work to me. I see my career as a very long film school. To me, there’s nothing like getting into a dark room with a group of people and touching each other’s hearts on a deeper level, on an emotional, irrational level, to share the mysteries that we don’t dare to speak of in real life. That’s my motivation.”

The path to a new format began with Lee’s desire to improve the 3D experience, specifically the problem of motion judder...


Ang Lee Director


Lee sees Billy Lynn’s Long Halftime Walk in the context of his evolution from a pure dramatic director to more of a visual stylist. “After my first four movies, which were focused on drama and character, I made a conscious decision to make more visually interesting films,” he says. “I begin with the philosophy and metaphor of the story, and then think in terms of texture and technology. Whenever you truly understand something, it opens up three more questions. And Billy Lynn is a good example of that. I’m not a technical guy. I just have a lot of curiosity about drama and storytelling. I want to see inside humanity. This is really the beginning of a new quest to find a deeper cinema through storytelling.”

Working with cinematographer Claudio Miranda, ASC, Lee took 24-frame 3D technology to new heights on Life of Pi, which earned four Oscars, including best directing, cinematography and visual effects, along with seven other nominations. In retrospect, Lee says, it was only a beginning.

“This is really the beginning of a new quest to find a deeper cinema through storytelling.” – Ang Lee

“After a while I noticed that although we were shooting 3D, our thinking was still 2D,” he says. “Although we were shooting digital, our heads were still thinking in film terms. It takes time to convert. We had not truly embraced what digital could be. We still have so much to learn and explore. Experience will teach you so much – how our minds work, how we see things, how we study faces, and the spiritual aspect of entertainment – our emotions and how they are passed from one person to another. We’re peering into people’s lives. That’s the pleasure of cinema. In 2D, you see a picture of a person, instead of a person. We are designed to communicate in 3D, with two eyes. What’s important is our personal investment into it – we project our emotions, our dreams and our thoughts into it. I think with the new media, we can go further and deeper. We haven’t gotten there yet. But we’ve seen something new, and now I have to get into this, in spite of the difficulty.

“The way the human mind works is a wonder,” he says. “I don’t think that more clarity kills the imagination. Movies haven’t changed for a long time, and I think we’re all dying for a change. The multiplex is crap. There’s no management, no quality control. And they keep recycling the same stuff, year after year. You might as well watch them on your iPhone. When I was a kid at the movies, I got excited and my blood pressure shot up. I want to be that person when I go to the theater now.”



New Sensations

Life of Pi sparked Lee’s initial exploration of higher frame rates. In many scenes, the character is on a life raft in the water, in constant motion, which made it difficult to read the emotions on his face. To compensate for strobing, the shutter angle was opened up, and that required some digital sharpening. Due to limitations in 3D technology at the time, Lee was quite restrained in terms of stereo depth, except for some of the CG shots. He points out that the standard frame rate of 24 frames per second was set many decades ago because it was the minimum speed for an intelligible soundtrack. “It was chosen because it’s the cheapest,” he says. “You get hooked to those standards and it’s very hard to get rid of them.”

Lee began to search for a way to represent action and performance subtleties simultaneously. He and his team researched 24 fps stereo imaging and how it interacts with human perception. Gervais points out some of the format’s weaknesses. “Unlike with a 2D movie, where it’s just a picture on a wall that happens to be moving, in 3D your brain wants to believe that thing hanging out in space in front of you is real,” he says. “At 24 frames, which is usually shot with a 180 degree shutter, you’re seeing a lot of motion blur, and then gaps where there’s no motion. Then motion picks up later in the frame because you’ve dropped 180 degrees. And for fast moving objects, your brain thinks there’s something wrong with you, and starts sending signals to your body, triggering eye strain, headaches, and things like that. Low brightness just adds to the eyestrain problem.”

A test James Cameron had done at 60 fps raised new questions. Lee also visited Douglas Trumbull and was inspired by his experiments in high frame rate and perception. Lee made some 60 fps tests around a boxing-themed film he was considering. He saw that ramping up the frame rate and using brighter laser projectors alleviated or eliminated many of the problems of 24 fps 3D. He was intrigued. Later, Tom Rothman from TriStar approached Lee with the Billy Lynn property. Lee saw it as the perfect chance to test the new technologies he had been exploring.
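The cadence Gervais describes is easy to work out. The sketch below (our arithmetic, not the production’s) computes how long each frame is exposed and how long the dark “gap” lasts for the two relevant cases: conventional 24 fps at a 180-degree shutter, and the 120 fps, 360-degree shutter configuration the film ultimately used.

```python
# Sketch: exposure time and dark "gap" per frame for a given
# frame rate and shutter angle (cases taken from the article).
def shutter_timing(fps: float, shutter_angle: float) -> tuple[float, float]:
    frame_period = 1.0 / fps                          # total time per frame (s)
    exposure = frame_period * shutter_angle / 360.0   # shutter open
    gap = frame_period - exposure                     # shutter closed
    return exposure, gap

for fps, angle in [(24, 180), (120, 360)]:
    exp, gap = shutter_timing(fps, angle)
    print(f"{fps} fps @ {angle} deg: exposed {exp*1000:.1f} ms, dark {gap*1000:.1f} ms")

# 24 fps @ 180 deg: exposed 20.8 ms, dark 20.8 ms -> motion recorded only half the time
# 120 fps @ 360 deg: exposed 8.3 ms, dark 0.0 ms  -> continuous motion sampling
```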


Lee began to search for a way to represent action and performance subtleties simultaneously.


“It’s not a gimmick – it’s film language. It’s very complicated, and very lively. I can see through the skin, through to the actors’ consciousness and how they feel. It’s terrifying and exciting at the same time.” – Ang Lee



“It’s all sensation, it’s all about experience beyond storytelling,” says Lee. “I thought if I could bring these new sensations – the battling soldier, the Dallas halftime show – that would be incredible. I thought it would be a good way to examine the new media, and also a good way to examine society and humanity. The content was a chance to really use this new cinema, because the battling soldier has a hyper-real mentality. Then you take them to Dallas, at halftime on Thanksgiving Day. It’s overwhelming.”

It’s not just the spectacle that is better represented, according to Lee. Emotional subtleties are also communicated more effectively. He discerns a different effect on human physiology for each step up in quality. “I think the clarity brings something to the audience,” he says. “When you see with two eyes, your mind works differently. You pick up nuances. Now, we can see performance during action. You can get to the emotion. It feels more human. With more information, you are more relaxed, and you don’t stare so hard. After seeing it, there’s a euphoric exhaustion. I find myself less judgmental.

“The higher it goes, whether from 2K to 4K or from 60 frames to 120, the smarter and more emotional the response,” he says. “The brightness, the lack of strobing, the resolution – everything comes together like a cocktail. There’s no need to discuss it – it can’t be described in words. It’s not a verbal experience. You can see it, whether you like it or not. It functions at a different level than the movies we’re used to seeing. It’s visceral and personal and heartfelt. You want to wander into that world. You don’t need a filmmaker to use focus to tell you where to look, then cut, and look at something else. You have more room to explore the world. It’s a mystery. It’s not a gimmick – it’s film language. It’s very complicated, and very lively. I can see through the skin, through to the actors’ consciousness and how they feel. It’s terrifying and exciting at the same time.”



Testing Begins and a Strategy Takes Shape

Determined to pursue high frame rate 3D as a practical production format, Lee continued his research. Early on, he enlisted director of photography John Toll, ASC, with whom he had never previously worked. Toll was surprised to get the call, in part because he had little or no previous first-hand experience with 3D. He read the book and saw its potential, and he had “great respect” for Lee’s previous films. Lee explained his concept of the movie and his goal of creating a new kind of cinema.

“Ang thought that this particular visual approach would be a great demonstration of how this technology could enhance any story, including intimate, character-driven dramatic material,” says Toll. “He wanted to use the technology to enhance not only the viewing experience but the emotional connection of the audience to the story.”

Toll says that both he and Lee came away with questions. The technical challenge took over the conversation, and Lee encouraged Toll to see the 60 fps test he had done a year earlier for the boxing project. “Up until that point, I was still questioning the value of the whole concept,” Toll recalls. “But there was something about it that actually made sense. I could really see how 60 frames projected at 60 frames, at normal speed, really brought something unusual. To this day, anyone who has seen it has a really hard time describing it. Seeing that, as much as anything, got me really interested in the project. Also, the idea that Ang Lee was doing something really innovative and interesting made it irresistible.”

Toll notes that a common assumption in the filmmaking community is that higher frame rates will necessarily deliver a television look. “That’s not what I was seeing,” he says. “I always use the word accessible. Somehow the images are more immediate and more accessible. It feels like you’re looking at reality as opposed to images on a screen – more so than in any viewing situation I’ve ever seen. I felt like I was in the movie as opposed to watching the movie.


“He wanted to use the technology to enhance not only the viewing experience but the emotional connection of the audience to the story.” – John Toll, ASC

John Toll, ASC Cinematographer



“We could see even more of a difference with the 120. Each step of the way was more confirmation that we were really onto something.” Toll also looked at the Trumbull high frame rate tests, and was further intrigued by their intimacy and accessibility. “That confirmed that we were on the right track,” Toll says. “I basically became a believer, but always with questions in my mind about how we would pull it off – the amount of light we would need and the nature of the equipment. It was a little daunting, but I think we all recognized the challenges and just kind of jumped into it.”

Soon Toll was shooting tests designed to determine the most appropriate camera for the job. Five different cameras and three different 3D rigs were tested. These tests fell into the category of research, as the project was not yet a sure thing in October of 2014. “There were so many technical questions – there still are – and the most fundamental was: Which camera are you using and which rig does it go into?” says Toll. “Everyone still had a lot of questions because what we were doing was really unexplored territory – high frame rate, with 120 fps as an ideal, 4K 3D.”

Based on those mix-and-match tests, it was decided that the combination of the Sony F65 camera and the Stereotec 3D rig was preferable. Toll says the Stereotec rig was appealing in part because it was simpler, modular, and allowed easier access to the camera body. Some adaptations were necessary to make the Stereotec rig compatible with the F65’s weight.

“We felt the image quality from the F65 was quite impressive and exactly what we were looking for,” says Toll. “It was pretty subtle, but on examination you could see the difference in resolution. In this initial testing, 120 frame 3D projection was not available, but we made a decision early on and then went through a whole period of conversations about all other aspects of the film.

“I didn’t interpret any of our test results as proving one piece of equipment superior to another,” he says. “We just thought the F65 was more appropriate. I felt it had a cleaner image. I can’t describe it any other way. It has great resolution and dynamic range, and I like the way it holds highlights and shadows, just in terms of the subtleties of the image.”

A wide range of lenses was tested. “We went with the Master Prime lenses because we were interested in maximum detail,” says Toll, “not necessarily critical sharpness, but maximum detail. We wanted to see everything in focus, and that’s part of what the 120 aspect gives you – you become much more aware of everything in the frame, because of the clarity.”

“We felt the image quality from the F65 was quite impressive and exactly what we were looking for.” – John Toll, ASC



Consistency of the lenses also had a major impact on the decision. “With the 3D rig, lens matching is critical,” Toll says. “It was amazing how many sets of lenses the assistants had to go through until they actually found satisfactory matches. Not only did the Master Primes look great, but we were more successful in finding matches with them than with other lenses.

“We spoke about clarity, and with a combination of the F65 and those lenses we believed we were achieving maximum clarity. Don’t ask me for a precise definition of clarity because I can’t give you one and I believe it can mean different things to different people, but when Ang and I looked at an image together, I knew what he meant and he knew what I meant.”

Eventually Toll committed to the project. “I knew it was going to be very ambitious, but I was pretty naïve about the true complexity – what it would actually take on a day-to-day basis to put this all together and accomplish the job on a 45-day schedule,” he says. “It was never a budget-heavy or schedule-heavy film. It was always scheduled as if it were a traditional 2D movie. Budget and schedule were major concerns. The schedule was adjusted to 48 days by the time we went into production, and we came in right on schedule, which was, in addition to the technical challenges, an incredible practical challenge.

“Fortunately, we had a great many other people who did have experience in 3D,” says Toll. “I brought my camera crew, and basically we had a long period of prep so we could work out all the techniques – how we were going to use the camera, how we were going to use the 3D.”

The twelve weeks of camera prep were technical and painstaking. “We ate up the majority of prep getting the equipment together and into the kind of shape where we could actually go out and shoot efficiently,” says Toll. “Ideally, we would have explored the various frame rates and frame blending, but we ran out of time. We never really had the opportunity to shoot much material and view it in the way that we would view our finished product. Ben Gervais built a projection room and set up a 60 fps 3D projection system in our production facility in Atlanta. 60 fps projection gave us a very good idea of how everything was working, but we seldom had the advantage of seeing it at 120 because of the limited availability of 120 frame projectors.”


“With the 3D rig, lens matching is critical. It was amazing how many sets of lenses the assistants had to go through until they actually found satisfactory matches.” – John Toll, ASC



When it came to the actual shoot, the physical size of the camera rig had a major impact on placement and movement. Toll says that the mirror box of the 3D rig was bigger than any matte box. The rig, which weighed more than 100 pounds, was almost always mounted on a remote head and one type of crane or another. Traditional Steadicam was not going to work. The Grip Trix electric camera dolly often came in handy. “I had definite ideas about camera movement,” says Toll. “And during prep, Ang was designing shots that were Steadicam shots. We had planned on putting as much camera movement into the shots as possible, and actually accomplishing that was quite involved. We’d sometimes have a short telescoping crane on a Grip Trix with the rig on it. We did a huge amount of dolly work in the Georgia Dome, where we shot the football game. It was moving on the field, moving through the concourse. We’d use the cameras on a crane on a Grip Trix much as you would use a Steadicam, and that allowed us more takes. We had a great grip crew, and it worked out pretty well. In my mind, it does feel like Steadicam at certain points, which was our goal.”

“We had planned on putting as much camera movement into the shots as possible, and actually accomplishing that was quite involved.” – John Toll, ASC




In terms of the look, the emphasis was always on reality. Digital Imaging Technician Maninder Saini customized the look to Toll’s specifications, using CDL values and a LUT applied after that. Gervais estimates that there were roughly a dozen LUTs generated over the course of the shoot. Finer changes for each shot were logged in the CDL. For color work, Flanders OLED monitors were chosen in part for cost reasons, and also because of the ease of calibration. LG OLED 55-inch televisions were the monitors for on-set 3D viewing because active 3D glasses on the set add too many complications with sync and flicker. A passive, fixed-pattern, polarization-based monitor was preferable.

“We spoke quite a bit about contrast, and how it is used in 2D to create a sense of depth,” says Toll. “We wanted to experiment with the idea of using less contrast in lighting and seeing detail as our eyes might see it, while letting 3D itself add depth to the image. In terms of lighting, this was one of the biggest changes from 2D photography.”

Less contrast did not necessarily translate to diminished control over the light. In real life, humans are not generally conscious of selective focus, so Toll and his team put an emphasis on creating greater depth of field. At 120 frames per second, that also had significant implications for lighting and exposure. One particular shot illustrates Toll’s thinking well. In the stadium, the camera is on a close-up of the lead soldier, with his Bravo Company comrades seated along a row behind him. Toll wanted to keep the soldiers, as well as the rest of the stadium, in focus as much as possible.
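To give a feel for how stopping down serves that depth-of-field goal, here is a quick sketch using the standard hyperfocal-distance approximation. The focal length and circle of confusion are illustrative assumptions, not production data.

```python
# Sketch: hyperfocal distance H = f^2 / (N * c) + f, the standard
# depth-of-field approximation. Illustrative values, not from the show.
def hyperfocal_m(focal_mm: float, f_number: float, coc_mm: float = 0.025) -> float:
    h_mm = focal_mm**2 / (f_number * coc_mm) + focal_mm
    return h_mm / 1000.0  # meters

for stop in [2.8, 8, 11, 16]:
    h = hyperfocal_m(focal_mm=32, f_number=stop)
    # Focusing at H keeps everything from H/2 to infinity acceptably sharp.
    print(f"f/{stop}: hyperfocal ~{h:.1f} m, sharp from ~{h/2:.1f} m to infinity")
```

At f/2.8 the sharp zone begins around 7 m; at f/11 it begins around 1.9 m, which is why the deep stops quoted below buy “almost unlimited” depth of field.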

“Even with an exposure index of 800, we were shooting at stops of 8, 11 and sometimes 16 – not only inside, but in a football stadium,” he says. “You’re seeing quite some distance inside a very large interior, and we were trying to maintain almost unlimited depth of field. That meant huge photographic challenges in terms of light, and lighting in a way that has an aesthetic and has a look.”

Lee also had distinct notions about how light and depth interact. “A big part of lighting is creating depth,” says the director. “But now, with 3D, depth is a given. So when I see standard movie lighting, it looks strange. I need more fill light because I want to see all the details. The question becomes how to light that interestingly. I think interesting textures for now can replace what we used to enjoy so much about shadows. Framing and lighting must be different. You don’t rely on the frame so much because the side is not as effective as the middle – like our vision.

“You’re seeing quite some distance inside a very large interior, and we were trying to maintain almost unlimited depth of field.” – John Toll, ASC

“So many of the things we do in cinema must be adapted – focus-pulling, performance, writing, makeup,” says Lee. “If an actor tries to act, it looks like they are trying to act. You can see thoughts in the actors’ eyes. I think the format does the art department a favor. We can see everyone’s effort. You see every detail that the art department contributed. Everything speaks. Everyone’s work contributes to the frame and to the story.”


“Even with an exposure index of 800, we were shooting at stops of 8, 11 and sometimes 16 – not only inside, but in a football stadium. You’re seeing quite some distance inside a very large interior...” – John Toll, ASC




Stereography: Creative Use of Depth in Visual Storytelling

Stereographer/3D Supervisor Demetri Portelli worked as a first AC on many productions before segueing into stereography on Resident Evil: Afterlife. Clearly he took to the assignment – his subsequent work has been recognized twice with the Lumiere Award for best live action stereography, given by the International 3D Society. His peers acknowledged his contributions to Martin Scorsese’s Hugo, which earned five Oscars including best cinematography, and to Jean-Pierre Jeunet’s The Young and Prodigious T.S. Spivet, which was named best 3D film at the Camerimage festival in Poland in 2014. Portelli is currently working as part of a team of four that is not aligned with any equipment vendors.

“My philosophy is that you learn by doing,” says Portelli. “3D was a bumpy road at the beginning for a lot of people, but our standards have really gone up. My background is in streamlining things on the set, but I’m also trying to provide stereo photography that’s usable in every frame. On this project, Ben [Gervais] was able to sit with Ang and understand what he needed and what would best serve the production. Certain key pieces of equipment were brought together, so there was no number one 3D vendor. Instead, there was a combination with Stereotec and Panavision, and then a very customized lab and workflow to accomplish things that had never been done before. I think that this independent approach was the only way this project could have been realized. It’s almost more like independent filmmaking – putting smaller pieces together in the most efficient way. That’s what a 3D project really is: a customization to serve one project most efficiently, because there is no blanket solution.

“My philosophy is that you learn by doing. 3D was a bumpy road at the beginning for a lot of people, but our standards have really gone up.” – Demetri Portelli


Demetri Portelli Stereographer/3D Supervisor



“We also consolidated people – having one guy with three skills as opposed to one,” he says. “Rather than overloading the set with too many people, we had people who multitasked, who are very dynamic and ready to learn and adaptable. It’s a very political and careful approach to bringing 3D into production. On the projects Ben and I have worked on, there have been challenges. But we accept the challenges, and we try to make every shot work in native photography. Conversion is a very useful tool to have – the Méliès footage in Hugo is a good example – but it’s good to be making a 3D movie at every step.”

When it comes to 3D rigs, Portelli sees two companies as a cut above. “Cameron Pace and their rigs, but also their on-set software to make 3D, actually work wirelessly and efficiently,” he says. “Stereotec in Munich has made really fantastic rigs that deliver a very good end product. Florian Maier is an excellent mechanical designer, and it was exciting to see him doing his first U.S. movie.”

Some parts of the rig Portelli handles himself. “Over the years, we’ve been fighting for the 3D beamsplitter mirror to have quality and durability on set,” he says. “We’re continually refining to get a cleaner image so both cameras are getting good color and good resolution. When one camera shoots the reflection and is softer than the other, that’s not good. I’m trying to balance all stereo aspects of the film, to make it as closely matched as possible. Then it’s a much more pleasing experience, much more immersive.


“We manufacture our own mirrors because we also want to control the coating on the glass,” says Portelli. “We want to control the sharpness, the color coating and the stop loss. So we’re working with our own supplier for that. We don’t want to be forced to use a filter because of an inferior mirror. Let’s have a good piece of glass and let’s not filter at all. Let’s try and capture all that information. The glass and the coatings take you into the screen. I’ll save a certain piece of glass for night scenes. I have a special box for glass that would be good with candlelight, so it’s more immersive.”

“We manufacture our own mirrors because we also want to control the coating on the glass. We want to control the sharpness, the color coating and the stop loss.” – Demetri Portelli



In the big picture, gimmicky 3D hurts more than it helps. “Every project is unique, and you have to see it from beginning to end,” says Portelli. “We talk about how there’s a flow to the film, with key punctuated moments. It’s like camera operating. A well-planned and well-conceived moment punctuated with 3D, you can build up to without cheapening the medium. It’s unfortunate when 3D is tacked on at the end. When directors and cinematographers are kept out of the 3D process, and nothing’s edited in 3D, it’s not really 3D. 3D has a beautiful history. Look at the movies from the 1950s, Dial M for Murder or House of Wax. They shot those without monitors on the set. They had very good stereographers – the 3D is volumetric and really explores space that’s baked into both eyes. Two different pieces of information give that satisfaction to your brain. That’s what we’re providing with two cameras and two lenses and two perspectives. With conversion, you’re taking one image and duplicating it. It’s not as true or perfect, and it can be bizarre and alienating.”

Like everyone else, Portelli came onto the project having only seen 60- and 48-frame 3D tests. He needed to be convinced that the difference would be significant enough to justify the additional logistical burden of shooting 120 fps. “The big surprise was that it’s physiologically the most comfortable 3D experience I’ve ever had, which is wonderful,” he says. “People say 3D has been hard work. Well, it’s been hard work because 3D at 24 fps is not really ideal. 3D in a dark cinema is not ideal. It’s like trying to read in the dark. So the laser projector is also a massive solution, saving 3D photography and paving the way for future enjoyment. The way I could relax in my seat and absorb information and look around in a shot was remarkable and fascinating. The detail is an amazing tool to have. It can be beautiful. It can be ugly. But it’s just amazing to have those options. It really opens the door.

“On Billy Lynn, we were working at an extremely high level,” he says. “It’s almost indescribable, the sensations you’re getting. People are at a loss for words. Very subtle pieces of information are wrapped up in that image, capturing human sensitivities. For me, it’s always been about the close-up and looking into the actor’s eyes. When it’s done right, you can look not just at the eyes, but into the eyes, and understand the soul of the character and their experience.”

“Very subtle pieces of information are wrapped up in that image, capturing human sensitivities. For me, it’s always been about the close-up and looking into the actor’s eyes.” – Demetri Portelli


The division of labor had Gervais focused more on workflow, while Portelli’s main concern was depth. “I build roundness and I build screen presentation, all on set,” he says. “The only thing I’m doing in post that’s different is affecting convergence and screen presentation after the edit. But I really want to do that with the creatives, with the cinematographer, the camera operator, and the director, so that it is an actual 3D movie. Only a few crews have done it enough to reach a new level of exploration. I think it’s new ground.”

Depth scripting, which is similar to storyboarding, can be a helpful exercise, but the plan often gets thrown away once the camera team is on the set, encountering physical reality. A dynamic and reactive team is essential, according to Portelli, but it’s good to have some broad strokes established.

Mobility is essential, and the pressure on equipment designers is always to get smaller and lighter. That’s partly about creative freedom, but it’s also about efficiency and speed in production. Handheld was not in the brief, but weight is always a concern. Cables and pieces of the housing were stripped off to save ounces. The Stereotec rig was not originally designed to handle the weight of two F65s, but with modifications it served well.

“The 3D rig held up beautifully, and so did the F65 in all the tests and under hard conditions,” says Portelli. “There were bumps and strange things where we expected to lose image or to not be able to align the photography afterwards, but it’s there. People did not believe we could get the two F65s and the 3D rig into a limousine or a Humvee without actually cutting these cars apart. We were also able to get our director in a Sprinter van, totally wireless with a 55-inch 3D monitor, along with our cinematographer and me so that we could react as the car went down the highway. We could pull 3D, we could pull the iris and even move the camera if need be. That technical build has to be very well planned. For us, that was a huge accomplishment. We could do beautiful work and get our whole shooting day.”

Portelli’s team emphasizes respect for the cinematographer and the camera crew. “If I’ve done my job in prep with my 3D technicians, I’m not touching the camera or the rig at all,” he says. “I’m able to stay with Ang in the tent. Everything should be excellent – the sync, the alignment, the timecode, the capture quality. There isn’t a single stereo pair in the film where I’ve had to throw out one eye. In a very tight schedule, in a hard shooting environment like in Morocco, the performance was great. A camera problem or a 3D problem can cost a lot of money in visual effects to fix. And I’m really proud of the fact that 100% of this 3D photography with these two cameras is in the movie. As 3D camera people, we don’t want people to say they’ve had to fix our work.”

“The 3D rig held up beautifully, and so did the F65 in all the tests... People did not believe we could get the two F65s and the 3D rig into a limousine or a Humvee...” – Demetri Portelli


Lee considers stereoscopy and its storytelling potential in every decision, a mindset that dovetails with Portelli’s approach. “Ang always says, ‘We’re going to choose an IO, and then we’ll light it,’” says Portelli. “I’ve never heard a director say that before. That’s a 3D filmmaker. It’s like a dream for me, because I can set up the 3D shot. Of course, it didn’t always happen that way, but it was part of the language and part of the objective. Ang also said, ‘We’re shooting 120 4K 3D, so rather than try to make digital look like the last 100 years of film, why don’t we start exploring, and try to find the aesthetic and the look of the digital age? We might not get it right. But let’s not always do conventional lighting and shooting.’ That’s what the project is for him – making discoveries. And what’s interesting is that you do make discoveries, but you also realize what you love about the last 100 years of cinema. You put those together and move the medium forward.”

Portelli saw the project and the format as an opportunity to go further with 3D. “I’m trying to encourage a 3D storytelling language,” he says.



“We’re shooting 120 4K 3D, so rather than try to make digital look like the last 100 years of film, why don’t we start exploring, and try to find the aesthetic and the look of the digital age?” – Demetri Portelli

“Where before we’ve been conservative, I would like to push the limits with stereo, and do more handheld and Steadicam. At 24 frames per second, we were more limited, but with the higher frame rate there is less motion blur between the eyes, so it’s more comfortable. This film does push the envelope a bit. It’s watchable 3D, but there are moments when you’re slightly uncomfortable. But it’s a depiction of war.

“To me, it’s important that when you go into that movie theater, you know it’s a 3D movie,” says Portelli. “I’m in the business of removing the presence of the screen and making things immersive both for depth, but also to explore the space with the audience. People have really neglected that language. It takes courage. So many 3D projects are pushed back out of fear, and the separation, the divergence is intolerable with certain light levels. You’re getting a lot of ghosting and it’s because people don’t have the confidence to play in front of the screen. It drives me absolutely crazy. Nobody wants gimmicky or cheap. But we have to find a way to make it work.”
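The divergence Portelli warns about can be made concrete with a standard stereography calculation. The sketch below is purely illustrative – none of the interaxial, convergence, lens, or screen numbers are from the production – but it shows why background separation must be watched: once positive parallax on the screen exceeds the human eye separation, the audience’s eyes are forced to diverge.

```python
# Sketch: screen parallax for a converged stereo rig, using the common
# approximation  parallax_on_sensor ~ f * t * (1/C - 1/Z).
# All numbers are illustrative assumptions, not production values.
def screen_parallax_mm(f_mm, interaxial_mm, converge_m, subject_m,
                       sensor_w_mm=24.9, screen_w_m=12.0):
    p_sensor = f_mm * interaxial_mm * (1/(converge_m*1000) - 1/(subject_m*1000))
    return p_sensor / sensor_w_mm * screen_w_m * 1000  # scaled to screen, in mm

EYE_SEPARATION_MM = 65  # beyond this positive parallax, the eyes must diverge

for z in [2, 5, 20, 200]:  # subject distances in meters
    p = screen_parallax_mm(f_mm=35, interaxial_mm=30, converge_m=5, subject_m=z)
    note = "DIVERGENT" if p > EYE_SEPARATION_MM else "ok"
    print(f"subject at {z:>3} m: parallax {p:+7.1f} mm on a 12 m screen ({note})")
```

With these toy numbers, anything past about 20 m lands behind the divergence limit, which is exactly the kind of trade a stereographer manages by pulling interaxial and convergence shot by shot.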



“A soft background is very beautiful in 3D because it allows you to look where you’re supposed to look – to focus on the roundness and the information in the face.” – Demetri Portelli


Cameras and lenses that deliver a sharp image are important to Portelli, but softness still has a place in 3D filmmaking, he says. “A soft background is very beautiful in 3D because it allows you to look where you’re supposed to look – to focus on the roundness and the information in the face, for example,” he says. “But there is a case to be made for realism. We’re chasing realism since Leonardo da Vinci, hoping to connect as a human to other humans, or connect to an environment or a space. There are moments in all the 3D films I’ve worked on where the lighting has been just right, the lens has been just right, all the conditions have been just right, so I feel like I could reach through the screen and grab a wine glass. Then I’m there. We achieved it. As stereographers, we’re always trying to find that sweet spot, like cinematographers do with their lighting. We have to understand: are we building realism, or are we trying to make something expressionistic and alienating like Dr. Caligari? Those are artistic words: realism and expressionism and naturalism. There’s no mathematical setting for 3D that is the same from one actor to another. Every human being is different. That’s what I love about 3D. It’s so creative, as opposed to merely technical. It’s a volumetric way of understanding and there are so many different variations.”



High Tech Serving Artistic Goals

Lee and producer Brian Bell turned to Ben Gervais to help surmount the technical challenges in both production and post. Gervais teamed with Portelli on the award-winning and ground-breaking 3D films Hugo and The Young and Prodigious T.S. Spivet, and his other credits include X-Men: Days of Future Past, Crimson Peak, and Pacific Rim. On Billy Lynn, Gervais was involved in the early testing that led to the decision to shoot 120 frames per second.

“The 120 frame rate made the most sense, partly for aesthetic reasons, and partly because of technical reasons,” says Gervais. “It gave us a lot of options in post, as far as generating the deliverable versions of the movie. We knew we would still have to deliver a 24-frame version of the movie, and we knew that most projectors in cinemas now can only do 60 frames per second. So 120 was the most practical. Then it really became a matter of putting our noses to the grindstone and figuring out how we were actually going to accomplish that.

“We knew we were doing something that had never been done before,” he says. “John [Toll] had not shot in 3D before, and it’s part of the learning curve to get a director of photography into the headspace where he or she understands the considerations involved. Educating them is part of the prep process, so they can make informed decisions when lighting the film. It’s important to involve the director of photography right out of the gate. It’s a cooperative thing.”
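The practicality Gervais describes comes down to simple arithmetic: 120 divides evenly by every common deliverable rate, so each output frame can be built from a whole number of captured frames. A quick sketch of the math (ours, not the production’s code):

```python
# Sketch: 120 fps divides evenly by the common deliverable rates,
# so every output frame blends a whole number of captured frames
# with no irregular cadence.
CAPTURE_FPS = 120
for target in [24, 30, 60]:
    group = CAPTURE_FPS / target
    print(f"{target} fps deliverable: blend {group:g} captured frames per output frame")
# 24 fps: 5 frames, 30 fps: 4 frames, 60 fps: 2 frames
```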

“We knew we would still have to deliver a 24-frame version of the movie, and we knew that most projectors in cinemas now can only do 60 frames per second. So 120 was the most practical.” – Ben Gervais


Ben Gervais Production System Supervisor
Photo Credit: Patrick Thompson



Gervais says that the filmmakers were aware that very few people had seen 3D 4K imagery. “Even with a 4K projector, when you project 3D material, it always goes down to 2K,” he says. “One happy accident occurred when we did finally manage to project, with two projectors, 4K 3D imagery. We discovered that unlike 2K versus 4K in 2D, where there isn’t actually that much of a difference and you need a trained eye, in 3D it’s a totally different conversation. The difference, when we did A-B comparisons, was very noticeable. The illusion of stereopsis, the 3D space that you perceive when you watch a 3D movie, is based on your brain fusing fine detail from the two eyes and placing it in space, so the more fine detail that exists in the image, the more nuanced and real the 3D depth space appears. And I believe that’s why 4K has really shone for us.”

Obviously, Lee’s take on the imagery was the most important. Once the problems Lee saw in 24 fps 3D were eliminated, the goal became images that render a certain immediacy. “Ang reacts to images on a more emotional level,” says Gervais. “I wouldn’t necessarily set up a double-blind test for him, but if I had a bunch of options for him to evaluate, I would just put a number up in the top corner. Then we would work backwards – ‘Why does Ang prefer number 4?’ With our 4K tests, he said it was a more intimate feeling, with a higher emotional impact. The more technically-minded of us could really take apart the images and point to individual differences.”

“The fact that we’re at 120 means that we have the ability to show people all that detail without having to slow it down.” – Ben Gervais

Lee advanced a theory about why slow motion is used so extensively in action scenes – to place emphasis, and to make apparent things that 24 fps imagery is incapable of presenting clearly. “The fact that we’re at 120 means that we have the ability to show people all that detail without having to slow it down,” says Gervais. “So why slow it down? Why don’t we just keep the pace up by presenting it in real time? We don’t have to use slow motion as a sort of crutch.”

That’s one of many discoveries made along the way. “It becomes an image you can’t ignore,” says Gervais. “It’s easy for a viewer to detach from a 24-frame 2D movie. You’re very much a third party, a detached viewer watching something. Watching this 120 fps footage, you don’t really get a choice as a viewer. If you are looking at the screen, you are there with those people in that space and you can’t choose to detach yourself from it because it’s so real-looking.”


The F65’s sensor produces raw data in 24 fps mode that is 8K. In high speed mode, the F65 delivers a 60i signal at 1080. Some broadcast infrastructure tools were adapted to translate the 60i from the SDI output into 60p, so the imagery could be viewed on 3D monitors live on the set. The resolution was diminished in the monitor images, but the 60-frame cadence was roughly accurate.

When it came to dailies, the infrastructure did not initially exist to view 120 fps 4K. At the time, only one projector series in the world was capable of projecting 120 frames in 4K – the Christie Mirage series, a non-cinema projector usually used for visualizations like video walls and flight simulators. The production needed two to see 3D at native resolution and frame rate, so Christie lent two laser-based projectors one week before production, giving the filmmakers an opportunity to see test material in the format they actually shot in. Christie provided the projectors one more time, a week before they went to Morocco. That meant that prior to post, the filmmakers saw the actual 120 fps 4K 3D images twice – once just prior to and once during production.

“To be honest, it was kind of a shock all around,” says Gervais of the experience. “We had all seen tests that went to 24, 48 and 60-frame 3D, all in 2K.

The 60 was definitely an improvement over the 48, but it wasn’t as much of a jump as 24 to 48. So we thought 120 would be sort of another incremental bump. But it wasn’t an incremental bump at all. It was something completely different. We spent a lot of time theorizing about what it might look like, but none of us were prepared for it, and that’s one of the difficulties with this format. You can explain to people that it’s new and different and awesome, but until people lay eyes on it, they don’t really understand the visceral impact it has on the viewer.”

The challenges rippled down to every aspect of the production. Even acquiring enough media posed an issue. At one point the camera team was using more than 120 high-speed 512-gig cards. On an average day, about 100 minutes of stereo footage was generated, which works out to about eight terabytes. On the biggest days, that number spiked to 18 or 19 terabytes.
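Those daily totals are easy to sanity-check. A back-of-the-envelope sketch (our arithmetic; the per-frame size is inferred from the quoted figures, not stated in the article):

```python
# Sketch: sanity-checking ~8 TB for ~100 minutes of stereo footage at 120 fps.
minutes = 100
frames = minutes * 60 * 120 * 2        # seconds * fps * two eyes
total_bytes = 8e12                     # ~8 TB (decimal)
per_frame_mb = total_bytes / frames / 1e6
print(f"{frames:,} frames/day -> ~{per_frame_mb:.1f} MB per compressed RAW frame")
# 1,440,000 frames/day -> ~5.6 MB per frame, a plausible compressed RAW size
```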

The F65’s sensor produces raw data in 24 fps mode that is 8K.


Using a compression algorithm (Sony compressed RAW) saved considerable time and space. “Uncompressed raw would have been six times the amount of data...” – Ben Gervais



At the near-set data-wrangling station, a system was designed from the ground up. A half-dozen Sony SRPC4 rigs each held 10Gb Ethernet cards. A copy of each card would go to a spinning disk over a 10-gigabit Ethernet network, and another copy would go onto a solid state SAN, serving as a working copy. Everything was backed up to LTO tape within a couple of days as well, after which the cards could be reused on the set.

Using a compression algorithm (Sony compressed RAW) saved considerable time and space. “Uncompressed raw would have been six times the amount of data, which would have been very difficult to accommodate given our budget level,” says Gervais. “We were very happy with it. There weren’t any circumstances where we ran into compression artifacts or anything like that. The F65 sometimes leaves a little bit to be desired in the darks, but we actually got very lucky in that. You’re down one stop right away due to the beam splitter, and shooting at 120 puts you down another two-and-a-third stops. 3D wants a deep stop, as well. We shot with a 360-degree shutter, so we got one stop back. But you’re still talking about a camera with an effective rating below 200 ASA. And you’re trying to shoot in a stadium, without a blockbuster budget. There’s only so much light you can pump in. So that was definitely one of the challenges.”

For some shots, bumping the exposure rating of the camera to 1600 was a solution. “Obviously, a commensurate amount of noise then gets added back in,” says Gervais. “One of the bonuses that we found – another sort of accidental discovery, but it was definitely something that we appreciated – was that when you play back this footage at 120, the individual frames go by so fast that your brain tends to fuse out the noise. We discovered this during a noise reduction test. We were A-B comparing footage that had been noise-reduced and footage that hadn’t, and none of us could tell the difference. We were trying to figure out where we had screwed up! But when we paused the footage, it was obvious which was which. Even when we frame blend down, averaging frames to get down to 60 frames or 24 frames, it also averages out the noise. So shooting at 120 and frame blending down to, say, 24, under most circumstances actually gave us a lower noise floor than if we had shot it at 24. We don’t get a lot of things for free at 120, but that is one thing we discovered that we do get for free.”
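The noise-floor benefit Gervais describes follows from basic statistics: averaging N frames of independent noise reduces its standard deviation by roughly the square root of N. A small demonstration on synthetic frames (our sketch, not production code):

```python
# Sketch: frame-blending 120 fps down to 24 fps averages 5 frames per
# output frame; independent sensor noise then drops by about sqrt(5).
import numpy as np

rng = np.random.default_rng(0)
clean = np.full((1080, 1920), 0.5, dtype=np.float32)     # flat gray test frame
noisy = clean + rng.normal(0, 0.05, (5, 1080, 1920))     # 5 frames, sigma = 0.05

blended = noisy.mean(axis=0)                             # one 24 fps output frame
print(f"per-frame noise: {np.std(noisy[0] - clean):.4f}")
print(f"blended noise:   {np.std(blended - clean):.4f}")  # ~0.05/sqrt(5) ~ 0.022
```

Five-frame averaging buys a bit more than one stop of noise reduction, which matches the observation that 120-to-24 blends looked cleaner than native 24 fps capture.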

Photo Credit: Patrick Thompson



The dailies pipeline was essentially invented for the project. Dailies were especially crucial given the unprecedented nature and exploratory ethic of the shoot. Colorfront was used in the dailies stage for color work, syncing audio and for rendering out the various deliverables – dailies for viewing in the studio, Avid media for editorial and Pix media for the studio. Colorfront played a key role in data translation from capture, QC, creating dailies, incorporating Toll’s color tweaks from the set, and delivering the files in the proper form to editorial. Audio sync is part of that process.

As it did in every other way, Billy Lynn’s Long Halftime Walk stretched the capabilities of Colorfront’s software and hardware and required adaptation. “Ben approached us and asked if we could do some specific development that would allow the Billy Lynn team to work the way they wanted to work,” says Colorfront’s Bruno Munger. “Having a piece of software that was fast enough was part of the solution. We also had to make special arrangements in order to work with Avid Media Composer at 60 fps. We customized the software and the configuration to make sure that the system was as fast as possible.”

The raw file produced by the camera had to be extrapolated in order to be viewed on a screen. That involved decompression and colorspace conversion to match the native colors of the camera on a regular computer monitor – processing-intensive steps that must not diminish clarity or resolution. “We couldn’t have done it without nVIDIA Quadro graphics cards,” says Munger. “All Colorfront technology is based on nVIDIA Quadro and nVIDIA GTX cards and CUDA Shading Language for nVIDIA. With stereo, it’s more data and a tremendous amount of processing. You still have to see everything the same day. So the speed of processing, the conversion of format A to format B, must be as fast as possible. That is the number one challenge.”
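Toll’s on-set tweaks rode through this pipeline as CDL values with a LUT applied afterward, as described earlier. The ASC CDL math itself is simple; here is a minimal sketch with illustrative values and a toy 1D curve standing in for the real viewing LUT:

```python
# Sketch: ASC CDL transfer (slope, offset, power per channel), followed by
# a viewing LUT. All values are illustrative, not from the production.
import numpy as np

def apply_cdl(rgb, slope, offset, power):
    # out = clamp(in * slope + offset) ** power, applied per channel
    out = np.clip(rgb * slope + offset, 0.0, 1.0)
    return out ** power

def apply_lut_1d(rgb, lut):
    # toy 1D LUT: look up each channel value in a precomputed curve
    idx = (rgb * (len(lut) - 1)).astype(int)
    return lut[idx]

pixel = np.array([0.40, 0.35, 0.30])                  # sample RGB value
graded = apply_cdl(pixel,
                   slope=np.array([1.05, 1.00, 0.95]),
                   offset=np.array([0.01, 0.00, -0.01]),
                   power=np.array([1.0, 1.0, 1.1]))
lut = np.linspace(0, 1, 1024) ** (1 / 2.4)            # stand-in display curve
print(apply_lut_1d(graded, lut))
```

Keeping the per-shot creative decisions in lightweight CDL values, with the heavier LUT shared across setups, is what let roughly a dozen LUTs cover the whole shoot.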

Gervais appreciated the performance. “Those machines had Titan Xs in them,” says Gervais. “Even with 2K 60 frames per second stereo media, we were getting about 75 or 80 frames per second render performance out of those machines, which is pretty impressive. Our Baselight has six GTX 980s in it. The guys at Filmlight know how to code for a GPU very well, so they’ve optimized just about every tool we use. Some of the stereo tools don’t really benefit that much from GPU acceleration, but the rest of our workflow definitely does, and they’ve pushed it pretty far.


The raw file produced by the camera had to be extrapolated in order to be viewed on a screen.

Photo Credit: Patrick Thompson



The F65, when it runs 120 frames, generates 24-frame time code, but at five times speed.

There’s no such thing as real time when you’re trying to play back 120 frames per second 3D footage and doing a color grade. But it definitely buffers fairly quickly, so we can watch it back in real time within a reasonable amount of time.”

After a 12- to 18-hour turnaround, dailies were screened on a single-projector 3D system in 2K at 60 fps on a 20-foot screen. “The dailies process really helped us, especially in testing,” says Gervais. “We’d shoot a test, and get the whole crew in the room, and at first it was a little bit like a funeral every time we did it. Everybody would get very pale and walk out of the room and not feel very good about themselves because it all looked fake. So we had to learn some new tricks. Every single department had to pay much more attention to the way they did things. A strong backlight might look good in 2D at 24 frames, but in 3D at 120, a strong backlight looks wrong. Every department rose to the challenge and improved their craft so that it didn’t look fake. More actual locations were used in order to achieve that realistic look.”

Once editorial had a cut, they would send it to the lab, where the process was similar to a standard movie. But since dailies were 60 fps, the highest frame rate Avid supported, Tessive Software provided a solution that took 120 fps to 60 fps using a complex algorithm. (Tessive has since been acquired by RealD.) That made all the time code invalid once the footage went into the Avid. In order to conform, the EDLs had to be translated and recalculated using a DIY solution.

The F65, when it runs 120 frames, generates 24-frame time code, but at five times speed. Add to that the fact that the cameras don’t start and stop on the same frame even when they are connected. The time code is often off by about three frames. So part of the dailies process was creating a database of the first matching frame in each eye using original camera code. Adobe Premiere was helpful there. An Excel spreadsheet had to be created that served as a sort of paper conform relating all the various materials. This complex process has now been distilled by Gervais to about 6,000 lines of code that allows him to upload a 60-frame EDL from the Avid, cross-reference everything, and generate two EDLs that match back to the original camera footage. Those were brought into Baselight, and the proper footage could be referenced for the online.
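The conform logic Gervais describes can be pictured with a small sketch. This is our own simplification, not his actual tool: it maps a 60 fps editorial frame count back to 120 fps camera frames and applies a hypothetical per-eye sync offset like the ones recorded in the first-matching-frame database.

```python
# Sketch: mapping a 60 fps editorial event back to 120 fps camera frames,
# with a per-eye start offset (the cameras rarely start on the same frame).
# Simplified illustration; the production tool handled far more cases.
def editorial_to_camera_frame(edit_frame_60: int, eye_offset_120: int) -> int:
    return edit_frame_60 * 2 + eye_offset_120   # 120/60 = 2 capture frames each

# Hypothetical offsets from the first-matching-frame database:
offsets = {"left": 0, "right": 3}   # right eye started ~3 frames late
edit_in, edit_out = 1440, 1560      # a 2-second event in 60 fps editorial frames

for eye, off in offsets.items():
    cam_in = editorial_to_camera_frame(edit_in, off)
    cam_out = editorial_to_camera_frame(edit_out, off)
    print(f"{eye} eye: camera frames {cam_in}-{cam_out} at 120 fps")
```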



The dailies lab was designed from the beginning to be somewhat portable. When the shoot shifted to Morocco, the entire dailies lab was flown there from Atlanta and re-built on site in five days. After that, the entire rig was packed up again and flown to New York, where with some modification it was repurposed into a post- and visual effects facility.

The production eventually acquired two of the Christie Mirage projectors, which allowed them to see 120 4K stereo throughout postproduction, and the entire DI is being done in-house with this setup. Crowd replacement effects are being done by an outside firm, but the in-house visual effects team of three is doing stabilization and other effects using Nuke, Stereoid and other software. Roughly eight Mac Pros and a few PCs comprise the render farm, and Sony Pictures contributed two Baselight transfer stations, which function as extra storage and render nodes. Clipster is also on loan from Sony.

Part of the plan from the start was to preserve the ability to do a high dynamic range (HDR) pass on the film. With that in mind, visual effects pulls were done in Open EXR 16-bit with a combination of the Sony RAW Viewer tool and the Baselight. For the online, most of the demosaicing was done in Baselight, thanks to its solid internal de-Bayering capability and NVIDIA® GPU acceleration. For non-vfx shots, one eye is often carried as RAW. A lot of the original footage can be kept as camera original, which saves space. The camera negative size for the whole film is roughly 400 terabytes. Two hours in stereo at 120 4K in 16-bit Open EXR is a significant amount of data, Gervais notes. A two-hour movie at 120 frames per second in stereo is roughly 1.7 million frames. Anything that helps streamline the workflow and save space is pursued.
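The 1.7-million-frame figure checks out with simple arithmetic, and it shows the scale of a full 16-bit EXR pass. A sketch of the math (the per-frame EXR size is our estimate, not a quoted figure):

```python
# Sketch: frame-count and storage arithmetic for a 2-hour, 120 fps stereo finish.
frames = 2 * 3600 * 120 * 2            # hours -> seconds * fps * two eyes
print(f"{frames:,} frames")            # 1,728,000 -> "roughly 1.7 million"

# Rough uncompressed size of one 4K 16-bit RGB EXR frame (our estimate):
bytes_per_frame = 4096 * 2160 * 3 * 2  # width * height * channels * 2 bytes
print(f"~{frames * bytes_per_frame / 1e12:.0f} TB uncompressed for a full EXR pass")
```

That lands around 92 TB for the cut alone, on top of the roughly 400 TB of camera negative covering every take, which is why carrying one eye as RAW wherever possible mattered.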

Part of the plan from the start was to preserve the ability to do a high dynamic range (HDR) pass on the film.



The technical challenges were unrelenting. At times, unique, unprecedented problems were presenting themselves on a daily basis.


“Normally, my MO going into a production is to try and do visual effects pulls with the camera vendor’s tool, because it’s using their exact color science,” says Gervais. “It’s easily available to other visual effects vendors – anybody who wants it can just go online, download the tool, and get exactly the same results that we’re getting, which is very useful. But in this case, the guys at Filmlight have been very fastidious in making sure that they get results that are very similar, if not identical, to the quality of the output that we get from the Sony RAW Viewer tool. So we felt comfortable after some testing just using the Baselight.”

The technical challenges were unrelenting. At times, unique, unprecedented problems were presenting themselves on a daily basis.

“There’s nobody we could ask,” says Gervais. “There weren’t a lot of examples that we could look at. Using the tools at our disposal, how do we solve this problem? Trial and error. Now that we’re in the home stretch, I can say for the most part it’s been successful, and a lot of that is due to the fact that we are not a facility. If we need to build something or make a modification, we’re a little more nimble than a post facility would be. We’ve only got one client: ourselves. So it’s just about making whatever works for us work. We were working in the dark, to some degree. Having a classical dailies feedback mechanism allowed us to watch the footage every time and learn. We could see what worked and what didn’t. We refined our technique as we went forward, and that was true of basically every department, not just the camera team and the 3D team. Every single department on the set had to up their game and adjust the way they work for the format.”



Inventing New Creative Options in Post

Another of the many variables explored was the amount of dark time in the projector. In flight simulation, designers sometimes manipulate the amount of black between each frame in order to make computer generated motion imagery appear more smooth. In theory, the black period is filled in by the brain with interpolated motion, making a sequence of sharp frames appear to be smoother. But when the Billy Lynn team increased the dark time, this effect wasn’t seen, in part due to the fact that the footage was shot with a 360-degree shutter. The images are sharp, but do include some degree of motion blur.

Gervais’s research in this area led him to a concept called “the flicker fusion frequency.” “We found that most people find that point between 60 and 90 cycles or frames per second,” says Gervais. “It’s the point where something is flickering so fast that you can’t tell it’s flickering. Under some circumstances the human eye can see up to 500 hertz. But for the most part it’s lower, and we are exceeding that with the 120. You stop being able to see individual frames in a picture. It doesn’t matter how fast an object is moving, especially if there’s motion blur. The motion is just completely continuous. You can track everything. You can watch faces as they move across the frame, instead of having to pan with them. Camera operating at 24 frames has evolved to cater to the fact that if we didn’t pan the camera with the main character, he or she would be so blurry you wouldn’t be able to see facial expressions. We discovered that we don’t have to do that. We don’t necessarily have to follow the character or punch in for the close-up. There’s so much detail that the audience can just absorb what’s going on. So we have some new choices editorially, too.

“There’s a lot about human perception that is still unknown, especially with shot footage,” says Gervais. “A lot of systems use 120 hertz footage, like gaming and VR and flight simulators. But almost all of that is CGI. We’re one of the first projects to test 120 on a large scale with real footage, and stereo on top of that.”
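The panning constraint Gervais mentions is easy to quantify: the distance a subject travels across the frame during one exposure sets its blur. A rough sketch with illustrative speeds and framing (not measured values), assuming the film’s 360-degree shutter at both rates:

```python
# Sketch: how far a subject smears across the frame during one exposure.
# A subject crossing a 4K-wide frame in 3 seconds, shutter open for the
# full frame period (360 degrees) -- illustrative numbers only.
FRAME_WIDTH_PX = 4096
crossing_time_s = 3.0
speed_px_per_s = FRAME_WIDTH_PX / crossing_time_s

for fps in [24, 120]:
    blur_px = speed_px_per_s / fps   # travel during one exposure
    print(f"{fps} fps: ~{blur_px:.0f} px of motion blur per frame")
# 24 fps: ~57 px -> a face smears unless the camera pans with it
# 120 fps: ~11 px -> expressions stay readable without following the subject
```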


“Under some circumstances the human eye can see up to 500 hertz. But for the most part it’s lower, and we are exceeding that with the 120.” – Ben Gervais


...editing in the native 120 fps 4K format was impossible... the Avid software that offers 60 frames per second playback was in beta...

Tim Squyres Editor



Tim Squyres served as editor on Billy Lynn’s Long Halftime Walk, as he has on all of Ang Lee’s films but one, going back to 1991. He was instrumental in designing the room in which Life of Pi was cut in 3D, but in the case of Billy Lynn, editing in the native 120 fps 4K format was impossible. Compromise was unavoidable. For example, the Avid software that offers 60 frames per second playback was in beta – a rare situation for a studio feature film editing room. “The best we could do was 60 frames HD, side-by-side, so that’s how we worked on this film,” says Squyres. “We try to watch in as close to the real environment as possible. For a 2D movie, most editors are good at imagining how what you see in front of you will translate to a big movie screen. But because of the new variables, we felt it was important to work on a 12-foot screen, at as high a frame rate as we could.”

Avid’s 60 fps support was officially released during the shoot, but staying in the beta program offered some advantages for the production. “We have more direct communications with the Avid engineers,” says Squyres. “We never install new betas the first day they come out. We let somebody else do it, because we can’t afford to have something that doesn’t work. Generally the software has worked well for us. Some performance issues and weird bugs have come up that were unique to our workflow. Everything takes a little longer. You would like it to be easy, but it’s not yet.”

The post workflow was designed to facilitate collaboration. Visual effects, editorial and the lab were all together, making the creative process more fluid. “The round trip time from Ang to Tim’s editors to us working in the lab made for pretty quick feedback,” says Gervais. “You can just walk down the hall and say, ‘Hey, I tried something here, what do you think?’ The way we designed the production was around the formats. We really kept our options open so that we were never backed into a corner by choosing to bottle it a certain way. It’s really exciting because we’re experimenting all the time and just throwing stuff against the wall to see what sticks. It’s a humbling experience, because as movie makers we feel like we know what we’re doing, and this really shows us what we don’t know.”

The DI is being used to address some issues that come with shooting native 3D, including geometry correction, sharpness differences and polarization flaws from the mirror. “Those would not be issues in a conversion,” says Squyres. “But what you gain shooting native, I think, is more important. We have a lot of scenes with 3D dust in the air. You can’t post-convert that. You can fake it, but it never looks right. There are many factors in filling up the space well. With faces, there’s real dimensionality, as opposed to post-converted footage. It’s really stunning footage. In Morocco, we were shooting outdoors in bright sunshine, and it’s stunning when you can see detail all the way to the horizon.”



“When we did Life of Pi, we were given a lot of rules about how you should cut differently with 3D,” he says. “I was very skeptical of those rules. I decided right away that it was a 3D movie, so I didn’t want to be thinking in 2D. From day one, I worked only in 3D, all day every day. I strongly recommend that approach. Most editors are very good at knowing how something translates to a theater. But I didn’t have confidence that I had that ability in 3D.”

In the cutting room in New York, Squyres worked with a 12-foot screen and a Christie projector. Eventually, the rig expanded to enable projection of side-by-side HD. In the Avid media, the two eyes are squeezed horizontally into a single 1920x1080 frame, with the left eye and right eye next to each other. The images are then unsqueezed and superimposed in projection, as opposed to being shown sequentially, as is the case with a single projector.
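As an illustration of that side-by-side squeeze, here is a minimal sketch (assuming simple 8-bit RGB arrays; the decimate-and-duplicate resampling is for clarity only, since a real pipeline would filter when scaling):

```python
import numpy as np

# Sketch of the side-by-side squeeze: each 1920x1080 eye is halved in
# width and packed into one 1920x1080 frame; unpacking splits the frame
# and stretches each half back to full width. Plain column decimation
# and duplication are used for clarity; production resamplers filter.
def pack_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    return np.concatenate([left[:, ::2], right[:, ::2]], axis=1)

def unpack_side_by_side(frame: np.ndarray):
    w = frame.shape[1]
    left, right = frame[:, : w // 2], frame[:, w // 2 :]
    return left.repeat(2, axis=1), right.repeat(2, axis=1)

eye_l = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in left eye
eye_r = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in right eye
packed = pack_side_by_side(eye_l, eye_r)           # still (1080, 1920, 3)
```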

“We figured that the closer I could get to the theatrical experience, the less translation I would have to do,” says Squyres. “For this film, it was really important to have that good, solid screen environment to cut from. However, we were not able to edit at 120 frames. In fact, we could barely come up with a hardware and software combination that would actually work at 60.”

The richness of the imagery meant that extensive coverage was often unnecessary. The tight 48-day schedule also kept coverage to a minimum. That helped make organizing the footage simpler.


“It’s interesting having new variables to work with in editing, like stereo adjustments,” says Squyres. “Usually, you can go to a close-up, you can go to a wide, and you can cut and dissolve. Between Pi and Billy Lynn, I went back and cut a 2D movie, and it felt kind of lame, because there was nothing to do except cut or dissolve.”

Squyres says that improvements in cameras and on-set 3D rigs have a positive effect in editing. “We’ve had fewer alignment issues on Billy Lynn,” he says. “On Life of Pi, there was a lot of misalignment – more in the first half of the shoot than in the second half. So that suggests that the team was getting used to the rigs. It may also be the pipeline. But the quality in terms of alignment and stereo issues was significantly better on this film than on Pi.”

Squyres is quick to add that Life of Pi had a much greater CG component. “Huge expanses of our frames were CG,” he says. “All of those beautiful horizons and seascapes and clouds – that was not in camera. Cinematographer Claudio Miranda, ASC was very good at composing with that in mind. But it’s hard to compare. Billy Lynn was very different. We’re using CG to fill our football stadium and for some enhancements in the battle, but CG is not nearly as fundamental a part of the visual look of the film as it was in Life of Pi.”



“Avid looks at their client base and what people are asking for and puts resources where the most clients will make use of them.” – Tim Squyres

On Squyres’s wish list are improved ways of moving stereo information between Avid and Baselight systems, and greater facility with stereo EDLs. “I look forward to the day when that carries over, so that when we start in the Baselight, we’re starting from the base I already established in the Avid,” says Squyres. “Native support for 120 fps is also on the wish list, but I don’t expect that anytime soon. Stronger hardware support for 60 fps would be great, and I wish we had better stereo analysis and correction tools.

“When we did Life of Pi, that was on Media Composer 5, which had no stereo support at all,” says Squyres. “So we had to do everything by hand on that one, and kind of trick it into doing things. The stereo tools are quite good, but there’s a lot more they could do. Beyond that, just supporting better resolution – ideally I would be carrying discrete left and right eyes rather than side-by-side. Right now you can’t output two separate full streams.”

Squyres, who has been an Avid user going all the way back to 1992, says that the small number of end-users for such advanced tools is one factor slowing development. “A lot of 3D is conversion now, and that happens in a different environment, which I think is a shame,” he says. “I think it should be part of editorial. Often, the editor isn’t involved in the conversion. So continued development of these tools depends on how many films shoot native, and how many convert. If there’s strong support for shooting native, it’ll make perfect sense to integrate better tools into Media Composer. Because when you shoot native, ideally you cut that way, too. Avid looks at their client base and what people are asking for and puts resources where the most clients will make use of them. Right now, you know, high-frame-rate 3D is a pretty niche market. We’re hoping they realize that this might be the future.”

At least two dozen release versions of the final film are being mastered, including some with varied brightness levels, some in 2K and others in different frame rates, including 2D at 24 frames per second. “We have the tools, mostly in the form of the frame blending software, to really optimize each of those versions, to make it as good a viewing experience as possible,” says Gervais. “Often we’re taking that 120 frames per second and bringing it down to 24 or 30 or 60 or whatever the various delivery formats of the film will be, and we’re able to do that in a way that gives more creative power. We’re seeing things like less noise when we bring it down from 120 to 24 than if we had actually shot it at 24. Because we shot at 120, we can make a better 24-frame version or a better 60-frame version. We’re always refining our techniques to put that power right back into Ang’s hands, which is really where it should be.”
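The noise observation follows from basic statistics. Here is a minimal sketch of the down-conversion idea, with plain frame averaging standing in for RealD TrueMotion’s proprietary weighted blending:

```python
import numpy as np

# Sketch of deriving a 24 fps version from 120 fps material by averaging
# each run of five frames. A plain mean stands in for RealD TrueMotion's
# weighted blending; averaging n frames reduces uncorrelated sensor
# noise by roughly sqrt(n), which is consistent with the derived 24 fps
# version looking cleaner than a native 24 fps capture would.
def downconvert(frames: np.ndarray, factor: int = 5) -> np.ndarray:
    """frames: (n, h, w, c) at 120 fps -> (n // factor, h, w, c)."""
    n = (frames.shape[0] // factor) * factor
    groups = frames[:n].reshape(-1, factor, *frames.shape[1:])
    return groups.mean(axis=1)

clip_120 = np.random.rand(120, 270, 480, 3).astype(np.float32)  # 1 s stand-in
clip_24 = downconvert(clip_120)                                 # 24 frames out
```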







Combined with 120 fps cinematography, the RealD TrueMotion frame-blending software goes a long way toward achieving Lee’s goal of smooth 3D motion. With TrueMotion, custom shutters were developed for Billy Lynn that weight individual input frames in non-linear fashion. The shutter angle can be “reset” to a number greater than 360, if desired. “We can add motion blur to literally just one part of the frame or just one character, and leave the rest of it sharp, or vice versa, and it doesn’t add to the expense of the effects,” says Gervais. “We can render out multiple versions of the same shot, and then, without involving visual effects at all, in DI we can combine two versions – for example, so that an actor’s face has a narrower shutter angle but the rest of the scene has a wider shutter angle, so the viewer’s eyes go to that person. One shutter, called ‘Carbon,’ basically has a sharp leading edge but a slightly blurred trailing edge to the motion blur, so that we can have the perception of smooth motion but still see some fine details. So as a face moves fairly quickly, you can still see the expression on the face, but without this huge perception of individual frames and judder at 60 frames, for example. It’s not an obvious defocus or anything like that.”
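The weighting idea can be sketched in a few lines (the weights below are invented illustrations; the actual TrueMotion shutter curves, including “Carbon,” are proprietary):

```python
import numpy as np

# Illustrative non-uniform frame weighting in the spirit of the custom
# shutters described above: even weights over five 120 fps frames
# emulate a full 360-degree shutter at 24 fps, while weights skewed
# toward the last frame keep a sharp leading edge with a soft trail.
# These weights are invented examples, not RealD TrueMotion's curves.
def weighted_blend(frames: np.ndarray, weights) -> np.ndarray:
    """frames: (n, h, w, c); weights: length-n, normalized internally."""
    w = np.asarray(weights, dtype=frames.dtype)
    return np.tensordot(w / w.sum(), frames, axes=1)

group = np.random.rand(5, 270, 480, 3).astype(np.float32)  # 5 source frames
frame_360 = weighted_blend(group, [0.2, 0.2, 0.2, 0.2, 0.2])
frame_sharp_lead = weighted_blend(group, [0.05, 0.10, 0.15, 0.25, 0.45])
```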


“But there are some interesting creative possibilities brought on by shooting with a higher frame rate,” says Gervais. “We essentially acquire the ability to change shutter angle in post or manipulate the noise. The square-wave shutters that the TrueMotion software has look exactly as if you had shot them that way, but with the bonus of a slightly lower noise floor. The idea is to optimize temporal resolution with the goal of manipulating the perception of motion judder – how harsh or how smooth the output appears based on how the frames are weighted. There’s always a tradeoff between the motion blur needed to achieve smooth motion and the loss of resolution that comes with it.”

Even though very few theaters can show the film in its native format, the additional information captured will help every presentation. “The normal paradigm is you shoot a bunch of frames, and then you project those frames in the theater,” says Squyres. “With this, it’s more like we’re capturing a lot more data, and then using that data to create all different kinds of formats and all different looks. So it’s a different way of thinking about shooting, and that’s something that gets into editorial too, because then we can come up with different ways of looking at the film. We can make some things look more normal – more like they were shot at 30 or 24 – in the context of films where some scenes look like that. That’s something we’re still working on.”

Regarding projection, standard DCI cinema servers don’t support playback of 120 fps 4K 3D, so the production needed another solution. 7thSense Design supplied its top-of-the-line Delta Infinity III servers, one for each projector. They use solid-state drives to play out uncompressed 4K 4:2:2 DPX sequences as 4 x 2K tiles at 120 fps over DisplayPort links. The servers are genlocked together and can hold over 24 terabytes each; Billy Lynn comes in at around 17 TB per eye, so they worked perfectly. Along with Christie®, 7thSense partnered with the production to show the film in select theaters in this format, in addition to providing support for the screenings at NAB and IBC.
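Those capacity figures are consistent with a back-of-the-envelope estimate (assuming roughly 20 bits per pixel for 10-bit 4:2:2 before DPX word padding, and a runtime of about 110 minutes; both are assumptions, not figures from the production):

```python
# Rough sanity check of the ~17 TB-per-eye figure quoted above.
# Assumes 4096x2160 frames, ~20 bits/pixel for 10-bit 4:2:2 (ignoring
# DPX word padding and headers), and a ~110-minute runtime.
WIDTH, HEIGHT = 4096, 2160
BITS_PER_PIXEL = 20
FPS = 120
RUNTIME_S = 110 * 60

frame_bytes = WIDTH * HEIGHT * BITS_PER_PIXEL // 8    # ~22 MB per frame
rate_bps = frame_bytes * FPS                          # ~2.65 GB/s per eye
total_bytes = rate_bps * RUNTIME_S

print(f"{frame_bytes / 1e6:.1f} MB/frame, {rate_bps / 1e9:.2f} GB/s, "
      f"{total_bytes / 1e12:.1f} TB per eye")          # ~17.5 TB per eye
```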

“We can render out multiple versions of the same shot, and then without involving visual effects at all...” – Ben Gervais



Looking Forward

The Billy Lynn project began as an exploration, and one of the lessons the principals took away is how much there is yet to learn. “Ang will be the first person to say that even though we’re finished, we’re definitely not experts, because it’s such a sensitive medium,” says Gervais. “For all of us, it’s like being back in film school again. We’re starting from square one and figuring out what works and what doesn’t. We don’t want it to look dated like some of those first visual effects blockbusters that came out in the early ‘90s. We thought they looked really realistic and cutting edge at the time. But right now we think our film looks pretty great. We couldn’t be too dogmatic about our approach. We had to leave our egos behind, and just be objective and true to what’s on the screen. None of us knows anything about this new format. Nobody does. So we have to just approach it with open minds and try.

“We hope that Billy Lynn is a good movie on its own,” says Gervais. “But we also feel that this puts a new palette of tools in front of filmmakers, and allows them to discover a new way of connecting the characters with the audience. We hope the film shows you that the audience can be immersed in a story in a new way, which is something that we haven’t really experienced in a long time.”

Toll says that throughout the endeavor, everyone involved had the sense that it was a grand experiment.


“...we also feel that this puts a new palette of tools in front of filmmakers, and allows them to discover a new way of connecting the characters with the audience.” – Ben Gervais



“You have to give full credit to Ang, because he was 100% committed to the idea,” says Toll. “It was always Ang’s faith in the idea that motivated everyone. We had to believe that all the sweat and effort that went into shooting at 120 was going to pay off, and in retrospect it has. We were always working toward a goal, but we had no guarantee that it was actually going to be worth it. I don’t think anyone really knew where we were going to land. Ang is the main reason this worked. Not only does he have an incredible eye, he’s probably the most visually coherent director I’ve ever worked with. Ang actually knows how to use the technology to enhance the story, which is what it’s all about. You can’t approach this technology lightly. It’s incredibly demanding, and it requires somebody who’s very aware of the demands it places on you – not only how to use it, but how to plan everything around it. It’s not like ‘Let’s just pull it off the shelf and turn it on and use it.’ You really need to make adjustments and sacrifices for the technology.”

Lee’s vision and leadership stretched every aspect of filmmaking technology to the limit. “It’s rare that a filmmaker will come to us with a technological challenge that will change a lot of different parts of the industry at once,” says Toll. “Usually, we’ll push the boundaries on the effects or something like that, but with this one, we were really pushing the boundaries lens to lens, and that became a very exciting proposition for us. You hear about all the technology, and you might think it’s a giant VFX film, but it’s really not. It’s about storytelling, and what Ang and the team have done with the technology is actually tremendous.”

Although Lee thinks of himself as a non-technical person, he is aware of the importance of technology to his artistic goals. “Filmmakers often don’t want to touch new technology,” he says. “The technology people are too separate from filmmakers. I think they can teach us something, and I’m glad they are trying to improve that aspect. I’m not smart when it comes to technology, but I’m interested in the way things look. When I see something they have to offer, it inspires me. You cannot just do science without the input of the artist. The collaboration is very important. It’s how we get to the human heart. That’s the end goal.”


Workflow Summary

DAILIES:
• Two Sony F65s in 8K with Sony RAW de-mosaicing at 120 frames per second
• Ingest F65 footage from cards
• De-mosaic to ProRes 4444 on Mac Pro
• Frame-blend from 120 fps to 60 fps using RealD TrueMotion on a render farm of Mac Pros
• Add color and sound, transcode (and some color grading) with Colorfront Express Dailies on PCs with NVIDIA GPUs
• Deliver to Avid for editing

OFF-LINE EDIT:
• Avid Media Composer, EDL for VFX and final conform, on Mac Pro

STEREO CORRECTION AND VISUAL EFFECTS:
• The Foundry NUKE, Ocula stereo plug-ins, Stereoid

CONFORM:
• F65 RAW media and VFX into FilmLight Baselight
• Stereo correction and color correction, with some 120 fps 4K review
• HDR grade on Baselight at Dolby’s Vine Theater
• Output 120 fps media

FOR LOWER FRAME-RATE VERSIONS:
• Export to render farm for frame blending using Mac Pros
• Reconform media into Baselight and output final lower-fps media

120P PROJECTION:
• Christie projectors
• 7thSense Design Delta Infinity III servers

“It’s rare that a filmmaker will come to us with a technological challenge that will change a lot of different parts of the industry at once.” – John Toll, ASC


sony.com/35mm

© 2016 Sony Electronics Inc. All rights reserved. Reproduction in whole or in part without written permission is prohibited. Features and specifications are subject to change without notice. Sony and the Sony logo are trademarks of Sony. All other trademarks are the property of their respective owner.



