TM Broadcast International #94, June 2021




EDITORIAL

A couple of weeks ago the organizers of IBC communicated something that nearly all of us had taken for granted: the exhibition will finally take place in December. All eyes are now set on NAB Las Vegas, which is scheduled to be held next October. At the moment no other dates are being considered for this show, but the thought that it is still early for such a big event is strongly gaining ground. The market is more inclined towards holding more local, less crowded events, which gradually recover the normality of our sector.

TM Broadcast does not stop, and one more month comes crammed with content that, in addition to being of great interest, shows that broadcast does not stop either. We bring to these pages an important Disney project: the unification of its sports and entertainment networks into a single facility in The Woodlands, Texas. Jon Edwards, Vice President of Media Engineering at Disney Media & Entertainment Distribution (DMED), told us how this integration has been carried out, which, among many other challenges, has had to face an immense geography and more than 240 affiliated stations for ABC alone. We also have The Collectv and Box to Box Films, two major UK companies that told us how the famous Netflix series Formula 1: Drive to Survive is made. We also spoke with Blizzard Entertainment about the benefits of working in the cloud in esports, and Canada's Take 5 Productions described the technology behind the acclaimed Vikings series. But these are just some of the contents that you will find in these pages. Go ahead and you will discover much more!

Editor in chief Javier de Martín editor@tmbroadcast.com

Creative Direction Mercedes González design@tmbroadcast.com

Key account manager Susana Sampedro ssa@tmbroadcast.com

Administration Laura de Diego administration@tmbroadcast.com


TM Broadcast International is a magazine published by Daró Media Group SL. Centro Empresarial Tartessos, Calle Pollensa 2, oficina 14, 28290 Las Rozas (Madrid), Spain. Phone: +34 91 640 46 43

Editorial staff press@tmbroadcast.com

Published in Spain ISSN: 2659-5966


SUMMARY

6 News

18 Formula 1: Drive to Survive
“Formula 1: Drive to Survive” showed us the personal face of Formula 1, with drivers at the core of the documentary. Available on Netflix, it was produced by Box to Box Films, with the collaboration of The Collectv.

34 Disney in The Woodlands
The aggregation of signal distribution in a single location is becoming a trend in the TV industry in the US. With the challenges of an immense geography and over 240 affiliate stations for ABC alone, Disney has decided to take this path.

48 Overwatch League
Blizzard Entertainment is a developer of well-known videogames like Diablo, Overwatch and Warcraft. In recent years, the company has made a bet on esports with tournaments and massive venues around the globe, including the Overwatch League, the world’s first city-based esports league, with 20 teams based all over the world.

60 Innovating into the future: How esports is winning the production game

64 Esports takes live cloud production to the next level

70 Video over IP: Controlling the IP environment and its security

86 Take 5 Productions
We jump into a boat with Take 5 Productions, a production company established in Toronto and responsible for huge successes like Vikings, Vikings: Valhalla and The Handmaid’s Tale. We had the opportunity to discuss with Nick Iannelli, EVP Post Production, the challenges of such big shows, working with big studios and platforms, and the current situation of the industry.

90 Carissa Dorson on ‘A Little Late with Lilly Singh’


NEWS - PRODUCTS

Ateme launches its new cloud-native solution Pilot Media

Ateme has launched its new cloud-native media supply chain solution, Pilot Media. It facilitates the migration of production and operations to new service models, whether those be on-premise or in a public or hybrid cloud. According to the company statement, users of this new technology can expect a drag-and-drop workflow builder to support a nimble operating model; and

management insight, with dashboards updated in real time and reporting tools which provide management information to support capacity and workload planning. Jean-Louis Lods, VP Media Supply Chain Solutions, ATEME, said, “Viewers today have more options than ever in terms of what content to watch – both on TV and on their personal devices. With this, content and service providers are

facing fierce competition. We are excited to launch PILOT Media to the market as in doing so we are not only enabling our customers to compete more efficiently and maximize ROI by reducing their operational costs, but we are giving them the opportunity to create new revenue streams, for example by launching a D2C OTT service or through rapid onboarding for distribution.” 




Sivac-ONE Compact: New portable solution for UHD (4K) contributions

Sivac-ONE Compact is a real-time encoding solution based on the well-known Sivac-ONE Media Processor (SMP) card, designed for HEVC/H.265 video transport at 4:2:2/4:2:0 and 10 bits per pixel. Sivac-ONE Compact provides highly reliable video transport with latencies lower than 150 ms, making it a solution for interactive video applications such as sports contributions, live events, news, remote production and return feed signals. In compact hardware with a power consumption below 30 W, this model is capable of processing one 4K UHD video channel (HDR, WCG) at 2160p50/60 or up to four FHD/HD channels at resolutions

up to 1080p50/60, in both cases supporting up to 16 channels of digital audio (stereo). Like the SMP, the Compact version offers different Input/Output options to suit the needs of each project. Available inputs are 12G/4x3G/HD/SD-SDI and 2xDVB-ASI, or 2xIP UDP/RTP outputs with FEC SMPTE 2022-1/2/7 (Multicast/Unicast). The Sivac-ONE Compact solution offers the possibility of a dual redundant 12 Vdc source,

which differentiates it from other portable solutions on the market. In addition, it has two built-in SFP/SFP+ 10G connectors for future applications. With this solution, Sapec meets the needs of different 4K applications, for example when the conditions of mobile connections (5G/4G/3G) are not ideal and transport over IP or ASI is needed, still with a solution that is easy to transport and operate.
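As a rough, software-only illustration of the class of encode a box like this performs (the Sivac-ONE Compact itself is dedicated hardware whose internals are not public), the sketch below drives ffmpeg's libx265 encoder with low-latency settings for a 4:2:2, 10-bit, 2160p50 feed pushed out as MPEG-TS over UDP. The input file, multicast address and exact parameters are invented for the example.

```python
# Illustrative only: the Sivac-ONE Compact is dedicated hardware, not ffmpeg.
# This approximates the same class of contribution encode in software:
# HEVC (H.265), 4:2:2 chroma, 10 bits per pixel, 2160p50, low-latency tuning.
import subprocess

SOURCE = "capture_2160p50.mov"         # hypothetical input capture
DESTINATION = "udp://239.1.1.1:5000"   # hypothetical multicast destination

cmd = [
    "ffmpeg",
    "-re", "-i", SOURCE,               # read the input at its native rate
    "-c:v", "libx265",
    "-pix_fmt", "yuv422p10le",         # 4:2:2, 10-bit, as on the SMP card
    "-s", "3840x2160", "-r", "50",     # 2160p50
    "-tune", "zerolatency",            # minimize encoder-side buffering
    "-x265-params", "keyint=50",       # short GOP for fast channel join
    "-f", "mpegts", DESTINATION,       # MPEG-TS over UDP (the IP output class)
]
subprocess.run(cmd, check=True)
```

A hardware encoder also adds a FEC layer such as SMPTE 2022-1/2/7 on the transport; that is outside what this sketch shows.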




NDI Version 5 enables remote Premiere Pro video output anywhere

NDI has released the NDI Version 5 update, which allows cloud-based video connection and transmission from any location, enabling remote video production workflows. NDI for Adobe Creative Cloud, from the Vizrt-owned brand, gives editors and producers the opportunity to bring Premiere Pro into their local live productions’


workflows and collaboration for review and approval. With NDI 5, these workflows open up beyond the local area network so editors and artists can use NDI for live production and collaboration across multiple offices and locations, including their homes. “The ability to send video over NDI to anywhere in the world

over an internet connection is a huge step forward for creatives,” said Sue Skidmore, head of partner relations for Adobe Video. “With NDI for Adobe Creative Cloud, video professionals and creators can collaborate more closely as they create and shape visual stories – regardless of their physical location – speeding up and streamlining workflows.”
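For context on how NDI endpoints find each other in the first place: on a local network, discovery is typically done over mDNS under the `_ndi._tcp.local.` service type, and NDI 5's cross-location features sit on top of that local layer. Below is a minimal, hypothetical sketch of the advertisement step using the third-party Python zeroconf package; the source name, address and port are invented, and this is not the NDI SDK itself.

```python
# Hypothetical sketch: advertise an NDI-style source on the LAN via mDNS.
# Real NDI senders do this (plus the actual video transport) inside the SDK.
import socket
from zeroconf import ServiceInfo, Zeroconf

info = ServiceInfo(
    "_ndi._tcp.local.",                               # NDI's mDNS service type
    "EDITBAY1 (Premiere Out)._ndi._tcp.local.",       # invented source name
    addresses=[socket.inet_aton("192.168.1.20")],     # invented host address
    port=5961,                                        # invented stream port
)

zc = Zeroconf()
zc.register_service(info)   # NDI-aware tools on the LAN can now discover it
try:
    input("Source advertised; press Enter to withdraw it...")
finally:
    zc.unregister_service(info)
    zc.close()
```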



NEWS - SUCCESS STORIES

The Théâtre du Châtelet relies on Riedel comms solutions

Théâtre du Châtelet adopted a full suite of Riedel's award-winning communications and video signal routing solutions.

Riedel has announced the integration of its comms and networking solutions into the Théâtre du Châtelet, one of the oldest and most famous Parisian theatres. The company has installed in the dramatic centre Riedel's Artist digital matrix intercom, SmartPanel user interface, Bolero wireless intercom, and MediorNet real-time network. The theatre opened its gates back in 1862, has an auditorium capacity of 2,038 people, and has undergone a complete renovation over the last few years. The new communications system is anchored by an Artist-128 matrix intercom mainframe, which enables seamless AES67 and Dante connections over the theatre's IP network to the Bolero wireless intercom system and additional Riedel Performer digital partyline solutions. The update also included Riedel's SmartPanel 1200 series user interface, allowing the theatre to become the world's first in-theatre installation of the all-new SmartPanel RSP-1216HL. The theatre manages its intercom environment using 15 RSP-1216HL and five RSP-1232HL panels.

The installation was completed with the DECT-based Bolero wireless intercom system, with eight antennas serving 35 beltpacks. In addition, the Théâtre du Châtelet has deployed three MediorNet MicroN high-density media distribution network devices.

“For this upgrade, future-proof systems were extremely important. That’s why we took our time with testing; we were looking for solutions that can accompany us for the next 20 years,” said Stéphane Oskeritzian, Head of Sound, Théâtre du Châtelet. “We chose Riedel not only because their solutions performed the best in our tests, but also because we were searching for a solid partner. After visiting Riedel’s headquarters in Germany and having very open, constructive dialogue with their creative and innovative people, we felt confident that a partnership with Riedel could be just what we had been looking for.” Oskeritzian added, “We wanted to cover our entire facility, a lot of space, with a minimum of antennas. After our extensive testing of the leading wireless solutions on the market, Bolero was far and away the best at meeting that challenge. We also tested and reviewed every other aspect of these systems, such as haptics, feature set, and the beltpacks’ ease of use. The verdict: Bolero is outstanding!”

“With the marvelous Théâtre du Châtelet as the latest example, Riedel’s communications and networking solutions are fast becoming the standard for historic, prestigious performance venues throughout Europe,” said Franck Berger, General Manager, Riedel France & Africa. “This installation is especially gratifying, since Artist, Bolero, SmartPanel, and MediorNet were chosen by very high-level professionals at the theatre who took over a year to dive into the details and subtleties of all major intercom and video network manufacturers. It’s a big testament to the technological excellence of our solutions and how they are perfectly suited to the demands of live entertainment.”



Vox Media incorporates Dante to equip the audio network of its studios

Vox Media trusted Audinate's Dante to equip its studio facilities. The editorial group is in charge of editorial references like "New York Magazine" and a couple of years ago jumped into the production of shows and podcasts. They have studios and facilities all across the US.

Miles Ewell is the Director of Production Technology for Vox Media. They built the company's new studio facility in New York with Dante technology. "I use Dante any chance I can. I don't even want to use broadcast headphone amps in our podcast studios anymore. I just want to use the RedNet headphone amps," said Ewell. "Being able to control everything remotely, like Dante coupled with certain manufacturers like Focusrite, you could do everything from somewhere else and that is so appealing."

"At our Broad Street, New York facility, we have something like 24 Dante devices, including RedNet devices, in our server room to talk to our comms system and our Skype units," said Ewell. "We have a bunch of QL consoles throughout the floor and audio suites and live control rooms and podcast control rooms, and we use a lot of AVIO adapters because they're so easy to deploy."

The Director of Production Technology continues: "I'm taking a fresh look at our podcast infrastructure. In San Francisco, we have a small studio, and we use Dante from the console to the computer. In DC, we have two podcast studios that are both QL1s with RIOs, and those are fully integrated," said Ewell. "I see a lot of opportunity for using Dante. In both New York and DC, in our multi-room facilities, they always share a Dante network. I set up our studios in each facility so that they were connected. So we had a primary control room, and then we could have a session with people in each room separately, and I was able to do that extremely quickly with our Dante setup."
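Dante's wire protocol is proprietary, so purely as a generic picture of what an audio-over-IP network does under the hood, the sketch below ships 1 ms chunks of PCM to a multicast group at a fixed packet interval. The group, port and payload layout are arbitrary assumptions, not Dante's actual format.

```python
# Generic AoIP illustration (not Dante): multicast small, regularly paced
# PCM packets, the basic pattern behind low-latency audio networking.
import math
import socket
import struct
import time

GROUP, PORT = "239.69.0.1", 4321      # hypothetical multicast group and port
RATE, CHANNELS = 48000, 2             # 48 kHz stereo, common in AoIP
SAMPLES_PER_PACKET = 48               # 1 ms of audio per packet

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)

t = 0
for _ in range(1000):                 # one second of a quiet 440 Hz test tone
    samples = []
    for _ in range(SAMPLES_PER_PACKET):
        value = int(32767 * 0.2 * math.sin(2 * math.pi * 440 * t / RATE))
        samples += [value] * CHANNELS # duplicate the sample on both channels
        t += 1
    payload = struct.pack(f"<{len(samples)}h", *samples)  # 16-bit LE PCM
    sock.sendto(payload, (GROUP, PORT))
    time.sleep(SAMPLES_PER_PACKET / RATE)                 # pace packets at ~1 ms
```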


NEWS - BUSINESS & PEOPLE

Zixi announces global availability of Zixi as a Service (ZaaS)

ZaaS enables customers to deploy IP transport workflows with all the features and functionality of the SDVP. The offering allows media and entertainment companies to easily extend existing on-premises infrastructures and capabilities to the cloud, delivering broadcast-quality video while also being able to provision, orchestrate, manage and monitor complex workflows from anywhere in the world using the ZEN Master control plane. ZaaS supports a redundant video pipeline, protecting against all failure scenarios for near-100% uptime with Zixi's patent-pending hitless failover, which provides redundant transmission options for higher reliability and disaster recovery. Zixi Broadcasters are deployed in geographically diverse data centers and, on both the sending and receiving sides, Zixi software is connected to independent ISPs and employs hitless failover for maximum reliability. Many customers leverage ZaaS for universal aggregation and distribution to multiple affiliates for 24x7 live linear workflows, and users can also create workflows for occasional-use events, activating dynamic pull targets in the cloud as needed.
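Zixi's hitless failover is patent-pending and its internals are not public, but the general idea it shares with SMPTE ST 2022-7 can be sketched simply: send the same sequence-numbered packets down two independent paths and emit the first copy of each to arrive, so a dropout on either path never reaches the output. A toy version follows; real implementations add reordering buffers and timing alignment.

```python
# Toy model of hitless (seamless) redundancy merging, in the spirit of
# SMPTE ST 2022-7. Packets are (sequence number, payload) pairs; each
# sequence number is emitted once, from whichever path delivers it first.
from typing import Iterable, Iterator, Tuple

Packet = Tuple[int, bytes]

def hitless_merge(path_a: Iterable[Packet], path_b: Iterable[Packet]) -> Iterator[Packet]:
    """Interleave two redundant feeds, de-duplicating by sequence number."""
    seen = set()
    for pair in zip(path_a, path_b):
        for seq, payload in pair:
            if seq not in seen:
                seen.add(seq)
                yield (seq, payload)

# Path A drops packet 2 and path B drops packet 4; the merge still yields 1..5.
a = [(1, b"x"), (3, b"x"), (4, b"x"), (5, b"x")]
b = [(1, b"x"), (2, b"x"), (3, b"x"), (5, b"x")]
print(sorted(seq for seq, _ in hitless_merge(a, b)))  # [1, 2, 3, 4, 5]
```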

Setplex and VIVOplay announce joint venture for OTT services in Latin America

VIVOplay will deliver the Setplex end-to-end OTT platform capabilities with VIVOplay channels to Internet Service Providers (ISPs) in the Latin American region. Setplex and VIVOplay will initially roll out their combined service offerings in Bolivia, Ecuador, and Peru, and then in Argentina, Chile, Colombia, and Uruguay. VIVOplay will have 40 channels available for their subscribers in the Andean cluster by the end of 2021. “As a leading OTT player in the Latin American region, VIVOplay has evolved into a virtual MVPD that offers a rapid solution for Internet providers in the region that want to offer television to their subscribers in an end-to-end solution that incorporates the best programming with the best technology,” said Carlos Hulett Guinand, founder and CEO of VIVOplay. “We have found in Setplex a unique platform that enables us to meet this customer demand effectively and at a very low cost for ISPs in the region. For VIVOplay, signing this agreement for Latin America with the main IPTV operator in Europe and North America puts us at a high level of competition that will allow us to grow with the speed that the market demands.”




Ross Video acquires Primestream

Ross Video has announced the acquisition of Primestream. Primestream Media Asset workflow solutions enable the capture, production, management and distribution of media assets, and they are used within various market verticals, such as Enterprise, Digital Media, Sports and Broadcast. These solutions support a comprehensive range of workflows for live feeds and file-based content, from HD to 4K and VR/360, enabling users to streamline editing, newsroom metadata logging and control room playout, and to seamlessly share, edit, review and approve content prior to broadcasting live, automating on-demand playout, or publishing to a wide range of OTT platforms. The acquisition will see the Primestream portfolio

and teams led by President/CEO Claudio Lisman and EVP Namdev Lisman, along with the company’s R&D and technical support teams, all transition over to Ross. Ross will look to blend its Streamline media asset management solution with the Primestream products over time, to create a fully converged graphics and production asset management platform. Claudio Lisman is extremely excited at the prospect of life as part of Ross Video. “We are extremely proud of what we have achieved here at Primestream, from both a solutions development and customer engagement perspective. We have built an exceptional team that’s very customer centric in approach. In joining Ross, we are joining a like-minded group that will help us reach new international

markets, scale much more effectively and realise our aspirations.” David Ross, Ross CEO, is equally excited by this acquisition and its significance. “Ross is unique in the live production space, not just because of the depth of our product portfolio, but the emphasis we have consistently placed on solving customer problems, regardless of the technology platform in use. Acquisitions have played an important role in our success and this acquisition – our largest to date – will be no exception, I’m sure. The combination of Primestream MAM and media tools plus our world leading automation, graphics, and powerful newsroom computer system tools is extremely enticing and will provide customers with a truly world-class, end-to-end integrated Media Asset Management solution.”


SPORTS PRODUCTION

“Formula 1: Drive to Survive” showed us the personal face of Formula 1, with drivers at the core of the documentary. Available on Netflix, it was produced by Box to Box Films, with the collaboration of The Collectv. Both companies have built a strong reputation creating premium sports content, especially in F1. We talked about the challenges of a production like this with Chris Sarson, Managing Director of The Collectv, and Tom Rogers, Head of Post-Production at Box to Box Films.



FORMULA 1: DRIVE TO SURVIVE




The Collectv: Bringing data management to support the show

What's the history of The Collectv?

Chris Sarson: The Collectv has been operating for over seven years. Our background comes from post-production. We deal directly with the studios, as well as a lot of different shows, and receive media from all kinds of locations. Our philosophy is born out of trying to make media arrive from location to post-production as seamlessly as possible. We found Tom through a mutual friend when Drive to Survive was initially starting. We talked with the team at Box to Box Films to help them understand what they wanted to achieve. The next step was trying to make that possible with Netflix, which was for us a new broadcaster. They have got very high standards and so does Box to Box Films. We had to try and make sure that we met those standards.

Chris Sarson, Managing Director of The Collectv

OUR FOCUS WAS ALWAYS ON THE DATA MANAGEMENT. OUR TEAM WORKS TO ENSURE THAT WE PROVIDE WHAT BOX TO BOX NEEDS TO DELIVER SHOWS. WE HAD THE CONSTRAINTS OF FORMULA 1, WHICH IS A BIT OF A FLYING CIRCUS.

Aside from Formula 1, we do other sports and reality shows, provide managed services and manage installations for production companies in their own in-house facilities. We do a range of things but fundamentally, our focus is on getting things right for post-production, whether that is on location or in a facility itself.



You’ve worked with Formula 1 since 2012. How has that collaboration evolved?

We have worked within the Formula 1 community since 2012, but with lots of different broadcasters and other partners. We started with Sky F1, which involved Timeline and Gravity, who were the main partners for Sky.

After that, we worked with Channel 4, but through Whisper, which is another client of ours. We got to know Formula 1 themselves, and we now supply resources to them across their facilities in the UK and still out on location. We have built a strong reputation, and then we had the success of Drive to Survive. We have been a part of that, only a small part, doing our data management, but it has helped build our reputation.

What is the role of The Collectv in the production of Drive to Survive?

Our focus was always on the data management. Our team works to ensure that we provide what Box to Box Films needs to deliver their shows. We had the constraints of




Formula 1, which is a bit of a flying circus. We helped develop the backup system for the cards to make sure we can get them in quickly enough, turn them around and create the proxies. We also have to flex through the system, depending on the deadlines to deliver the shows.

What changes have there been from the first season to the third?

The system we started with is not the system that we have today. It was a physically bigger system, which meant it had to go in the cargo planes as part of Formula 1's biggest shipments. We used to have two racks of equipment. That's now gone down to three or four cases of equipment that are light and small enough to fit in the hold of planes. Therefore, we have lost some overall processing power, but the flexibility and portability of the systems are worth more, and we've got enough processing power to get away with a slight loss.

WE REALLY NEED TO BE AIMING FOR THREE COPIES OF THE MEDIA AS SOON AS WE CAN GET HOLD OF THE ORIGINAL CAMERA CARDS. WE HAVE OUR OWN NEAR LINE STORAGE, WHICH HAS GOT TO BE FAST.



One of the most important things about Drive to Survive is the behind-the-scenes action of being able to go anywhere, do anything and see everything. The new system allows much more flexibility and last-minute decisions to happen, rather than having to pre-book space on a plane weeks before and things like that. I think that's been the most important evolution that we've done so far.

What technical equipment is necessary on your part for this production?

Storage is the main thing. We really need to be aiming for three copies of the media as soon as we can get hold of the original camera cards. We have our own nearline storage, which has got to be fast. We need a good number of disks and 10GbE connectivity. They've all got to be fast. They can't be the cheapest

stuff out there. They've got to meet a threshold because we only have a certain amount of time before the power is going to be cut, and we've got to go to the next race. The other part of it is the copying and the Avid machines. We have up to four workstations plus other Macs to help manage the workflow, so they do the copying from the cards. We have loads of card readers. We copy and we transcode that




media through those workstations to make the Avid proxies. It's really important for us to make those Avid proxies wherever we can because it helps to get it into the edit quicker. It’s also a good safety net, a good check for us because if it goes into Avid and it transcodes, there's a good chance those files are not corrupt. If they were, Avid would spit them out or not transcode them.
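The discipline described here, three verified copies before a card is trusted, maps onto a simple pattern: hash the source file, copy it to each destination, then re-read and re-hash every copy, much as the Avid transcode doubles as an implicit integrity check. The sketch below shows that pattern with invented mount points; real offload tools would also write checksum manifests and kick off proxy transcodes.

```python
# Minimal sketch of a checksum-verified, three-destination card offload.
# Destination paths are hypothetical, not The Collectv's actual setup.
import hashlib
import shutil
from pathlib import Path

DESTINATIONS = [Path("/mnt/nearline"), Path("/mnt/backup_raid"), Path("/mnt/lto_stage")]

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

def offload(card_file: Path) -> None:
    source_hash = sha256_of(card_file)
    for dest_dir in DESTINATIONS:
        dest = dest_dir / card_file.name
        shutil.copy2(card_file, dest)
        # Re-read the copy from disk: a mismatch means a corrupt copy.
        if sha256_of(dest) != source_hash:
            raise IOError(f"Checksum mismatch on {dest}")
    print(f"{card_file.name}: 3 verified copies ({source_hash[:12]}...)")
```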

In a production like this, an immense amount of data is generated daily. How do you handle it? What is your workflow? Are cloud services an important part of it?

As far as the data management goes, we don't generally rely on cloud services, and that is for a number of reasons. Security is really important, but also, we are at tracks around the world and Formula 1 directly has huge and fantastic connectivity, so we don't

always have access to that as a separate production team. Towards the end of the season, we need to get some media to Tom, as quick as possible. With the relationship that Box to Box Films has with Formula 1, in the last few races of the season we share some of the connectivity with Formula 1 to get the proxies back to the UK even quicker, so those edits can start as quick as possible. It's not cloud computing, but that's where connectivity is important and it would be very hard to meet the strict deadlines without that connectivity that we rely on. Although we work very closely with Formula 1, we try to work by ourselves from a data management point of view, as much as we can. We rely on backing up to data tapes and physical drives that are secured and password protected. We are still in quite a physical world, generally in the way in which we work.




Box to Box Films: Driving premium sports stories to the screen

You have great experience in documentaries. What makes Drive to Survive different from other productions?

Tom Rogers: I think there are universal qualities to the projects that Box to Box Films does, and that's about creating premium sports content and opening up some of those sports to a wider audience. The brief when we started Drive to Survive was to create a show that not only appeals to existing racing fans, but also broadens the Formula 1 audience. The most important parts of Drive to Survive are the drivers, the team principals, the characters that make up that sport, and that's something that can sometimes be lost in the conventional coverage of Formula 1. The viewing figures for Formula 1 are increasing year on year, and hopefully Drive to Survive is playing its part in that, so the human stories remain the focus.

Tom Rogers, Head of Post-Production at Box to Box Films

THE BRIEF WHEN WE STARTED DRIVE TO SURVIVE WAS TO CREATE A SHOW THAT NOT ONLY APPEALS TO EXISTING RACING FANS, BUT ALSO BROADENS THE FORMULA 1 AUDIENCE.

With copyrighted content like F1, where does the FIA mark the limit for recording in the circuits?

With Formula 1, there are a lot of stakeholders at play. You've got Formula 1; they own the content that they create and are the ultimate rights holder to the sport. They were very keen for the show to happen. Besides that, there are certain areas of a race track where you need additional permissions, because they are run by the FIA: for example, the stewards' room, where a lot of the sporting decisions are made, or the press conferences. The FIA have been fantastic and supportive of the show. But you've also got ten Formula 1 teams, who have their own interests and sponsors and responsibilities. I think we were very fortunate when we did Season 1 that there were a few teams



that really committed to the show early.

How is that relationship with the Formula 1 teams?

Formula 1 has been traditionally a very secretive sport, which is understandable. The teams are spending vast amounts of money developing cutting-edge technology. They don't want the other teams to see what they are doing because that could give away their competitive edge. Season 1 was a massive leap of faith for some of those teams. We were very lucky in the first race we filmed, which was Melbourne in 2018. Haas and Red Bull allowed us to film with them as very early adopters. I think once people saw them doing it and what we were about, they became more open to it and more trusting. It was a slow process. Inevitably, when Season 1 launched, more people came on board.

It's pretty widely reported that Mercedes and Ferrari elected not to do Season 1, which I understand. They are two massive premium names in the sport and were slightly more hesitant than some of the others. I think once they saw the show and realized the value it brings, they elected to take part in Seasons 2 and 3. We operate slightly differently for each team, depending on what they're comfortable with. There is a review process with them in the edit, but

that is not for editorial. It is purely for intellectual property. It gives the teams the opportunity to ask us to blur certain confidential parts of the car or replace a shot where we captured something confidential on a computer screen. Without the teams there wouldn't be a Drive to Survive.

Is everything on the documentary recorded by your team or do you have access to content from the F1 organization?



THERE ARE 24 TRACKSIDE CAMERAS, PLUS 80 ON-BOARD CAMERAS, HELICOPTERS, RF ROAMING CAMERAS IN THE PADDOCK AND PITS.

Part of our agreement is that we can use any footage that is captured by Formula 1 during a race weekend. The general rule of thumb is if you are seeing a car on track, it is Formula 1 footage. They have spent decades perfecting the capturing of the race content. They have an enormous number of cameras around the track. There are 24+ trackside cameras, plus 80 on-board cameras, helicopters, RF roaming cameras in the paddock and pits.

The infrastructure that would be required from us to duplicate that effort would be crazy. They have a really talented pool that works across their offering. The really nice thing I think that Drive to Survive has done is highlight some of that camera work that didn't make it onto the TV broadcast. Our focus is on a battle between somebody fighting for 10th and 11th position because that is part of our narrative whereas the TV coverage may be focusing on the race lead. Some of the beautiful work from the guys at Formula 1 is getting more air time and gets highlighted in the show which I think is great.

How many cameras do you use for the coverage? What other equipment did you use for the recording?

In terms of the cameras that we use, Season 3 was slightly different just because of COVID and the restrictions that we were working with. In Seasons 1 and 2, we predominantly



chose a set number of races that we went to as a big team. We call those our major races. In a 20-odd race season that would normally be about 10 to 15 of those races. That could be three or four PD shooting teams deployed, and the cameras were the Sony FS7; we've migrated to the Sony FX9 for Season 3. It's a good small, compact body and you get great pictures out of it, with a massively versatile range of lenses available to you, and we can shoot at the specs that are required by Netflix. We capture at 25 frames progressive. Sometimes we have a DoP shooting on Red; we've used the Red Gemini before and we also supplement that with some GoPros. Those PDs with the camera are supplemented by a sound recordist. They use a lot of Zaxcom microphones. The beauty of this microphone is that it not only transmits like a conventional radio mic, but it also captures to a memory card in a little battery pack. When our

characters disappear off out of range, we're still capturing the audio. That for Drive to Survive is invaluable because it leads to those most candid moments. We remember in Season 2, Guenther Steiner from the Haas team had a pretty heated argument with his drivers, Kevin Magnussen and Romain Grosjean. We didn't have pictures to go with that, but all of that audio was captured because he was still wearing his Zaxcom radio mic. That's a massively key component of the show. Then normally those two roles are supported by a producer, whether that's an AP or a field location



producer, just to help those crews navigate around the paddock. Season 3 was slightly different, because if we were going to film with a team, we had to fully embed with that team. We basically had to go through the COVID testing protocol with the team, and fly, eat, travel and be accommodated with them. To avoid confusion, our camera guys would even dress in the team kit so that it was obvious that they were within that team bubble. As a result we scaled back some of those crews. Our series director embedded with Red Bull as a one-man shooting team. He did sound, pictures, everything for the Red Bull shoots, because they could accommodate one person within that team infrastructure. What it did mean during the COVID year is that instead of going to maybe 10 or 15 races in a big way, we would go to most races with a slightly hybrid, medium-sized offering. That leads into what Chris was saying before; it's all about having to scale back the kit. During the COVID year, we needed to be extra



flexible in terms of how we could get to our races and manage the media that was coming in. The technological advances that The Collectv has made for us over the last few years really pays dividends when it comes to managing and working in a COVID-19 safe environment and following social distancing rules.

You chase the drivers all around the world. In terms of logistics, what are the challenges of this?

THE REALLY NICE THING I THINK THAT DRIVE TO SURVIVE HAS DONE IS HIGHLIGHT SOME OF THAT CAMERA WORK THAT DIDN'T MAKE IT ONTO THE TV BROADCAST.

We have an incredible production team back at Box to Box Films. They worked flat out for the whole year, basically organizing everything from filming permits to travel, accommodation, care and staffing; it's a challenging show to put on. We also have a great crew that spent a lot of time away from home. It's a show where you are away a lot to film it, and without their dedication there wouldn't be a series.

For Season 3 the stakes were even higher, because not only have you got a global sport, but a global sport during a global pandemic, which really brought into focus all of those logistical challenges even more. You couldn't simply book a last-minute trip to Madrid to film with Carlos Sainz. We needed to plan well in advance to



make sure that the testing was carried out and that our crew were isolated for the correct amount of time, as required by whichever country we were traveling to and from. Something as simple as shooting in someone's house became incredibly complicated, and actually almost impossible. That's why a lot of the scenes that you see in Season 3 happen outside. There were a lot of challenges that had to be taken into account, not only logistics.

A documentary has no script. What is the post-production workflow to create the content once the season is over?

We have to try and predict the future, because we can't film every single event. For example, Carlos Sainz at Spain's GP makes sense. Then, in terms of how we deal with the material in the edit, it all comes in. We ingest and process all of it. The Collectv transcodes on location for our big races, but sometimes we have smaller teams going off doing other shoots that require transcoding back in the UK. Once it is all in, we have a team of loggers who then go through the

material to transcribe it, and then make that available to the edit producers. We have an edit producer for each episode and they are basically feeding those editors the material that we need for the story. The editors then start working on what they believe the story is in conjunction with the senior team, the executive team, the series directors and also feedback from the crew on the ground. Then we watch many cuts, we feedback as a team, we share cuts with Netflix, and they feedback. It's a very sort of organic process that we go



through before we then eventually start to lock episodes, and it takes time.

With COVID-19, are you editing content remotely? What do you think about remote workflows?

I think actually this has been one of the positive things that came out of the crazy months that we've all been through. I was always open to working remotely, but we didn't have a choice this year; we had to. Ultimately it wasn't safe to have all of our editors and producers together in a facility, and it would have been irresponsible for us to do that. We had to figure it out and sort of pivot the production to being remote, and we did that via Avid and Teradici, with our post-production partner.

The Collectv have also been instrumental in advising us on the latest workflows when it comes to remote editing, and I'm pleasantly surprised at how seamless that's been. We actually find that quite a few editors now prefer it and I think the future of


editing is a hybrid combination of the two. I believe pretty much all of our productions will have a remote element in them now. I also think there will inevitably be a point when getting everybody together is beneficial. It normally comes towards the end of a production when you are just trying to get to picture lock. You’ve got notes coming from Netflix and execs, you are trying to navigate how to address those as a team and communication is a lot easier face-to-face for those moments.



I think a sort of hybrid form is where we are heading, and I think the technology providers have made massive improvements. It has also enabled us to open up our talent pool. We had an editor based in South Africa while working on Season 3. That would never have happened in previous seasons, because you would have had to be in London. Opening up opportunities for talent that doesn't live in the location of a show is great as well.

With the F1 season going on, are you recording a new Drive to Survive season right now? If so, what is new and what are the challenges of this year?

Tom Rogers: I think the challenges with any future seasons of Drive to Survive are inevitably just keeping the format fresh. There are obviously characters that we've known and grown to love during the previous three seasons, and it would be nice to maintain those characters and to keep the interest going for those. Similarly, I think with the regulation changes coming in at the end of the year, that political sports business angle will be interesting.

Chris Sarson: I would like to add the high standards that have already been set as another challenge. This is a series that has become really popular. Everyone has really high expectations, and that includes The Collectv and Box to Box Films. It is important that we reach and exceed those expectations.



NETWORK DISTRIBUTION



DISNEY

The aggregation of signal distribution in a single location is becoming a trend in the TV industry in the US. With the challenges of an immense geography and over 240 affiliate stations for ABC alone, Disney has decided to take this path. We had the opportunity to discuss this integration at the Digital Center 3 in Texas with Jon Edwards, Vice President, Media Engineering at Disney Media & Entertainment Distribution (DMED).

Machine Room. Photo credit: Jon Edwards




Where does this project come from and what was it about?

This goes back to the 21st Century Fox acquisition by Disney. As part of that effort, we had been on standby looking for an opportunity for the sports and entertainment networks to consolidate. Prior to this we had a lot of disparate sites. If you go back three to five years, we had very site-specific broadcast facilities. I am standing right now in the Burbank center in California, which is one of them. This is where the Disney channels were originating; I have been here for 20 years now, and we started in the building back in 1997.

This effort overall was to consolidate a lot of our entertainment and sports networks into a new facility. Part of the 21st Century Fox acquisition included a broadcast facility in The Woodlands, Texas. What is great about this one is that it was built to withstand a lot of the weather events that happen there, and it is a world-class, first-rate broadcast facility in general. When the leadership team got to see the site, they said, "This is perfect". When the opportunity and the timing presented themselves, we put up a project that predated the acquisition itself, and we were activated to consolidate all of these broadcast networks into a single site in Texas.

Media Operations QC Room Pre-Roll-Out. Credit Jon Edwards.

How many phases did it take to complete it?

We had very aggressive timelines, including what



is called a transfer of service agreement with Fox. We had to vacate the Pico Lot facility out on the Westside of Los Angeles within two years. The countdown pretty much started with that effort. Luckily, on the Disney cable side, we had a concurrent project here, in which we put all of our assets in the Disney data center. We combined the FX and National Geographic networks that were at the Pico Lot with the Burbank Disney cables. We had about 12 to 14 months to do it, which was not a very long time.

OUR FIRST EFFORT WAS TO MOVE THE FX AND NATIONAL GEOGRAPHIC NETWORKS FROM PICO IN AN ACCELERATED TIMELINE. INSTEAD OF 24 MONTHS, WE ACTUALLY DID IT WITHIN ABOUT NINE.

As we made this migration, we had to make sure that we were sensitive to the timing and handling of broadcast networks. A lot of the work was done in the background but then brought forward as time permitted. Our first effort was to move the FX and National Geographic networks from Pico on an accelerated timeline. Instead of 24 months, we actually did it



within about nine. It was definitely a team effort across organizations to do this and we were able to get them live in September of last year.

What are the advantages, and also the risks, of this integration?

The company already had plans to do a consolidation effort of its entertainment linear broadcast networks. There were several different variations of it prior to the acquisition, but it had always been on the back burner. The reason is that it is always advantageous to have all of your networks in a single location. That way, they can share infrastructure, connectivity, networking, and the technology that sits underneath it all in one location. The things that make it a primary place to be are that you have got all your media assets now running into a single location. That's very advantageous for sharing content among different initiatives.

THE COMPANY ALREADY HAD IN ITS PLANS TO DO A CONSOLIDATION EFFORT OF ITS ENTERTAINMENT LINEAR BROADCAST NETWORKS.

For example, The Simpsons and Family Guy were very popular shows. Not only did we get them in-house; now we can share that same Family Guy episode, not only on the FX networks, where it ran before, but also on Freeform, our other network on the Disney cable side, which can now run the same content. I will give you another technological advantage. Because you have all of your feeds on a single site, you can now share them from that one site, as opposed to having to point to eight different areas of service that pay for a link from one location to the next. Now they are all in a single location, so I can share all my feeds from one location out to vendor partners, and that is really advantageous. Another point is that we can now share our linear

streams with our internal partners, like Hulu, that



are getting the benefit of having these feeds from a single location. They can call a single number; have a single support structure, a single engineering and transmission team, etc. The other part of your question resonates with me because there is

actually an industry trend in doing this. Our competitors and partners in the field have been doing the same thing, looking at sites where they can aggregate a lot of their linear broadcast networks into a single location.

What were the main technological challenges of this project?

That is a hard question to answer, because I lived it along with the teams. We were built on a lot of legacy infrastructure that was nearing the end of its life. When you were asking about how quickly we did this, there was also a time clock with several systems. One of the biggest challenges was trying to beat that clock in order to come off of those platforms very quickly. I am still surprised today that we did it. I speak a lot to the play-out system side for linear broadcast, but there was also a very massive media network that needed to be pulled in at the site as well. You have got a 21st Century Fox infrastructure that was aging, with thousands of assets that needed to be moved not only to us but, of course, to Disney Plus and other apps like Hulu, and they also had to come to the linear side as well.

ESPN Disaster Recovery Unified Space. Credit Jon Edwards.

ABC Room - Pre-Build - March 10 2021. Credit Sam Exley.

Space Build. Credit Jon Edwards.

We had a very massive effort that took months, weekends, and holidays of the team's time, because there was a lot of work to be done to make sure that those assets were moved from the Pico Lot into the new systems. Meanwhile, that clock was on the wall: "Hey, these systems are probably going to fall over, and if they do they might not come back up." It was a point of no return. We had to do it. Everybody chipped in and made sure that it happened.

What infrastructure are you using right now in your center?

On our side of the fence, we have a vendor deal with Imagine Communications that runs our linear broadcast platform. For example, in your case, if you are watching Freeform or Disney cable at home, FX, National Geographic or ABC, it's running through that software. It takes all of the assets or live events that are happening and puts them together. It's very much like a playlist that you would create on your own, but that playlist needs to be of technological enterprise grade for it to run. We have a lot of backup systems to it as well, so that is one of our primary vendors for that technology.

Control Room Phase I Wide - September 2020. Credit Sam Exley.

Cables Control Room - Phase II - March 10 2021. Credit Jon Edwards.

WE HAD A VERY MASSIVE EFFORT THAT TOOK MONTHS, WEEKENDS, AND HOLIDAYS OF THE TEAM'S TIME TO DO IT BECAUSE THERE WAS A LOT OF WORK TO BE DONE TO MAKE SURE THAT THOSE ASSETS WERE REMOVED FROM THE PICO LOT INTO THE NEW SYSTEMS.
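To make the "playlist, but enterprise-grade" idea concrete, here is a toy model of a linear playout list. It is not Imagine Communications' software, just an invented structure showing how file-based assets and live windows can sit on one timeline that an automation system steps through.

```python
# Toy playout list: invented for illustration, not a real playout system.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PlaylistEvent:
    start: datetime
    duration: timedelta
    asset_id: str        # file-based program, promo, or commercial
    live: bool = False   # True marks a live event window instead of a file

playlist = [
    PlaylistEvent(datetime(2021, 6, 1, 20, 0), timedelta(minutes=22), "FAMILY_GUY_S19E01"),
    PlaylistEvent(datetime(2021, 6, 1, 20, 22), timedelta(minutes=2), "PROMO_FREEFORM_FALL"),
    PlaylistEvent(datetime(2021, 6, 1, 20, 24), timedelta(hours=3), "ABC_LIVE_EVENT", live=True),
]

# A real system adds frame-accurate timing, redundancy, and as-run logging.
for ev in playlist:
    kind = "LIVE" if ev.live else "FILE"
    print(f"{ev.start:%H:%M:%S} {kind:4} {ev.asset_id} ({ev.duration})")
```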




ABC Primary Control Room - Pre-Production - October 2020. Credit Jon Edwards.

On the media asset side, we use Evertz Mediator-X or Mediator 10, and that makes sure that we get all of those assets. All of the commercials that you see on-air, all of the promos, all of the programs like Family Guy, The Simpsons or Disney Channel shows, all those shows need to run through the system before it makes it to air. They are basically the central point of a lot of our media workflows for linear broadcast.

How many signals and how much content pass through this facility right now?

That's a good one. On the FX networks and National Geographic we have six East Coast and two West Coast networks. On the ABC side, we have four time zone-related feeds with East, West, Mountain, and Pacific. For Disney Channels and Freeform, we have four East and four West networks. After that there are a lot of derivative feeds that

come out of those. ABC is not just ABC. ABC feeds 240 affiliate stations across the US. Our feed is the primary for stations all across the United States and so we have to make sure that whatever we feed goes out to them. We are basically 240 networks if you really calculate it. Those are the primary feeds so your question actually breaks out even further. We actually have partner deals with folks like Hulu Live TV, YouTube TV, Sling



ABC Primary Control Room - In Production Milestone - March 10 2021. Credit Jon Edwards.

and Dish Networks as well, for their apps. The feeds actually start to disperse out even further. It is a hard question to answer as there are actually thousands of endpoints at the end of the day.

What are the technological challenges of distributing all the signals and all this content through a vast country like the United States?

A lot of it depends on the satellite systems currently,

in order to make sure that everybody gets the feed at the same time. For ABC in particular, all of our affiliated stations around the US want the feed at the same time, so they want it with minimal latency. We are always looking at the technology and the distribution to make sure that everybody is getting the same feed. From day to day, our challenges are making sure that everything performance-wise is good; that all the media assets come in the right way and are playing in the right format. There are also audio challenges; there is what we call ancillary data in the feeds. All of that needs to run 24/7. We work on those to prevent anything impacting on air. On the engineering and operations side we are in a constant state of making sure things run correctly, so that the viewer doesn't see those problems. So there is the



day-to-day challenge to your question. The strategic challenges are just getting those feeds to everybody without problems, because it is a vast network, and we also feed networking systems over what we call terrestrial, which is point-to-point or internet connectivity. We don't control a lot of that. We are leveraging the same internet you are using right now to do the Zoom call.

For example, we keep varying line of sight with Hulu as an internal partner, but they are constantly calling out, "Hey, can you check this?" Because there are hundreds of feeds, literally



hundreds of feeds. Almost every day there are at least a handful of issues that we have to follow up on and go back and find out. The good news is they are not typically Disney failures, but we make sure that everything is working correctly.

This whole project was a really big migration. Is this process 100% completed? Are you planning to move everything to the cloud? And is it profitable for Disney?

Yes, this process is 100% complete and yes, it is profitable. We are leveraging a hybrid model of both onsite resources as well as public cloud. For example, some of our assets come through Amazon Web Services. We leveraged some of their infrastructure to help support us for media, mostly media file movement. To your question, one of the areas we are still exploring is doing everything that I just described in the cloud. There are some technological things that we have to overcome. We continue to aim to improve issues like lag and latency. We can't get the same speed through public cloud quite yet, but it is gaining momentum.

Media Bullpen - Pre-Roll-Out - March 10 2021. Credit Jon Edwards.

Building Facade - Pre-Branding - October 2020. Credit Jon Edwards.

There is some newer technology on the horizon that seems really strong for us. We are still looking at that and we keep that door open.

Could you explain a bit more about this hybrid model?

I will keep it to the media side. For example, we have to get some content from a vendor. Let's say it is a Freeform show in production with a completed file. They will leverage public cloud, with Amazon, to send the file up for some processing. Once it is in the system, we have an

oversight on it and we can step it through different processes. We will send that file back down from Amazon to us locally, so that we can play the file out for linear broadcast. For example, tonight on any of our networks, a lot of the files that you are seeing are leveraging that model of pulling from a public cloud source. We call them Amazon S3 buckets. There is also transcoding. Amazon has put a lot of media into their service with us. For our consumers, we transcode



files with high-resolution that you can't play on your phone because they consume too much bandwidth. For that, we do a process in the cloud to transform them into a format that serves us better on the broadcast side.

WE HAVE EIGHT OWNED ABC STATIONS THAT ARE ALL ALREADY STARTING TO MIGRATE OVER INTO THE WOODLANDS. BY THE END OF THIS FISCAL YEAR, WE WILL BE DONE WITH THAT PROJECT. AFTER THAT, WE ARE LOOKING AT INTERNAL REVIEWS TO ADD MORE TO THIS FACILITY.
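The file movement described above, pulling a finished file back down from an Amazon S3 bucket to local playout storage, looks roughly like the boto3 sketch below. The bucket, key and paths are invented, since the actual pipeline is not public.

```python
# Hedged sketch of the "pull from a public cloud source" step with boto3.
# All names are placeholders; credentials come from the usual AWS config.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-mezzanine-bucket"           # hypothetical S3 bucket
KEY = "freeform/show_101_completed.mxf"       # hypothetical object key
LOCAL_PATH = "/playout/ingest/show_101.mxf"   # hypothetical landing path

# Download for linear playout; in the workflow described, any transcode to a
# broadcast-friendly format would already have happened in the cloud.
s3.download_file(BUCKET, KEY, LOCAL_PATH)
print(f"Pulled s3://{BUCKET}/{KEY} -> {LOCAL_PATH}")
```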

Another big piece is the things we could do remotely. Think back to the beginning of the conversation: throughout the pandemic my team and the vendors could not get to the site. Everything we did was over the cloud. We leveraged our networks and certain tools and applications to get into the site and do everything remotely, without ever being on site. We've had vendors building massive systems in The Woodlands in Texas without ever even setting foot in the facility. We were leveraging all of this on cloud, the infrastructure and public internet, from our Disney Global networks and our Disney Media and Entertainment Distribution production network. All of those networks combined allowed us to move forward. As an example, this weekend we had a person in Idaho that I was working with on Saturday and Sunday because of an

issue. He was able to gather logs and video clips remotely so we could work with Evertz, who is a key partner, remotely as well.

What's next? What's coming in the future?

We have eight owned ABC stations that are all already starting to migrate over into The Woodlands. By the end of this fiscal year, we will be done with that project. After that, we are looking at internal reviews to add more to this facility. The regional sports networks, which are now owned by Sinclair, are still in The Woodlands right now. Once they take them out of our facility, the floodgates are open. We can start moving other processes within the company in there, and we have a lot we can do in the facility. We are looking to add more and more over time to make this a center of excellence for the company. That's our next step.


ESPORTS



OVERWATCH LEAGUE

Blizzard Entertainment is a developer of well-known videogames like Diablo, Overwatch and Warcraft. In recent years, the company has made a bet on esports with tournaments and massive venues around the globe, including the Overwatch League, the world's first city-based esports league, with 20 teams based all over the world. With the pandemic freezing those events, the company was prepared with an alternative option to continue its broadcast activity, having decided in 2017 to move its production to a cloud-based workflow. Corey Smith, Director of Live Operations and Global Broadcast at Blizzard, tells us a bit more about this process.




Corey Smith, Director of Live Operations and Global Broadcast at Blizzard

The pandemic forced you to move to a remote production very quickly. How did you make this change? How is your workflow currently?

The pandemic was interesting because in 2017 we made the decision to decouple ourselves from traditional streaming providers as an outsourced model. We were going to take all the knowledge that we had been gathering and go to the cloud with our distribution, transmission systems, asset management and all our line records. We started building the majority of our transmission infrastructure in the cloud, while we were


still based in our Blizzard Arena in Los Angeles. We had the entire television studio accompaniment already at that facility as a purpose-built esports arena. In late 2019, when we started seeing the beginning of the pandemic and how events like NAB were cancelled, we were like: "Oh my, this thing is real." It was a pretty big concern we were tracking. By the time March 2020 came around, Los Angeles County essentially closed down and we all shifted to work from home. I haven't been back to the office since March 12 of last year. That's a long time to work remote.

APAC Virtual Set.

We had set ourselves up for success, though, because of our partnership with Grass Valley, which goes back to 2018. With all of our transmission work and other distribution systems already in the cloud, AMPP became a very easy complement to our existing platform. Basically, we took the physical studio venue environment and deployed it in the cloud. The only thing we really had to figure out then were the issues with the casters and the talent in the field. Dealing with consumer internet latency and upload speed is a very



real problem. The consumer internet infrastructure in the United States has high download and little upload. It's not a symmetric circuit but consumer-grade internet. You can't normally get commercial grade in most places, and that was part of the issues we had to overcome. Most of the time in March and April we spent trying to get back online and figuring out those talent kits in the field.

WE TOOK THE PHYSICAL STUDIO VENUE ENVIRONMENT AND WE THREW IT UP IN THE CLOUD. THE ONLY THING WE REALLY HAD TO FIGURE OUT THEN WERE THE ISSUES WITH THE CASTERS AND THE TALENT IN THE FIELD TO DEAL WITH THEIR LATENCY AND UPLOAD SPEED.

How did your people adapt to this new workflow?

Most of the workflow is very similar to a normal hardware control surface. It's just the nuance of clicking buttons with a mouse now as opposed to actually sitting at a console. The workflow is very similar in terms of how you cut production at a physical studio, but they just happen to be managing control surfaces that control our cloud infrastructure.

You partnered with Grass Valley to develop AMPP, your cloud software platform. What can you do with this platform? How long did it take you to develop it? We've been working with Grass Valley since 2018. Now we have a full audio production suite and the ability to do short-form video playout as a replay mechanism. We've attached some of our infrastructure to it. Our graphics engine is basically an NDI feed into the system via Vizrt, and we run Trio in the cloud. In terms of the full AMPP ecosystem, it's basically what you would have at a physical production venue. You could take your K-Frame switcher, attach it to the cloud and manage your show, just like our A1 is doing now. By the time we get back to a truck world, we can take some of that infrastructure that people have already spent money on to achieve greater ROI. We want to take the investments made in


hardware and attach it to our cloud-based infrastructure. The crewing of those trucks is very tuned to that hardware and the nuances of mobile production. It's critically important for us as leaders in the industry not to disrupt the workflow of the existing crew and what they're capable of doing. Nobody wants to roll up a truck and then have to retrain or hire specialty staff to operate a new piece of gear or software. Part of that whole journey for us was to not be so intrusive on the market that operators weren't available to run it. That was our journey with Grass Valley. We have not skipped a beat in terms of being able to take our studio show and do it in the cloud.

Speaking of cloud, what can you tell us about the Aloha Project? Project Aloha is our solution to enable regular play between our teams based in North America and our teams based in


PROJECT ALOHA IS OUR SOLUTION TO ENABLE REGULAR PLAY BETWEEN OUR TEAMS BASED IN NORTH AMERICA AND OUR TEAMS BASED IN ASIA.

Asia. During the Overwatch League's tournament cycles, the best North American teams fly to Hawaii and, utilizing undersea fiber connectivity, we're able to connect those teams

directly to an online game server in Tokyo that our teams in Asia also connect to.

What does it take to make this happen? A lot of research went into figuring out the best location to send players to. A series of network investigations finally led us to team up with the University of Hawaii. The University is well positioned from an internet perspective to take advantage of the undersea fiber infrastructure, which allowed us to connect directly with our game servers in Tokyo. Measurable latency needed to be under 100ms for competitive game play to occur.
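To put that 100ms budget in concrete terms, here is a minimal round-trip probe of the kind such a network investigation might use; the hostname is hypothetical and this is not Blizzard's actual tooling:

```python
# Minimal sketch: measure TCP round-trip time to a game server endpoint
# and check it against the 100ms competitive budget mentioned above.
# "gameserver.example.net" is a hypothetical host, not a real Blizzard endpoint.
import socket
import statistics
import time

HOST, PORT = "gameserver.example.net", 443
BUDGET_MS = 100.0

samples = []
for _ in range(10):
    start = time.perf_counter()
    # A full TCP handshake is a reasonable stand-in for one network round trip.
    with socket.create_connection((HOST, PORT), timeout=2):
        pass
    samples.append((time.perf_counter() - start) * 1000)

rtt = statistics.median(samples)
print(f"median RTT {rtt:.1f} ms over {len(samples)} probes")
print("within competitive budget" if rtt < BUDGET_MS else "too slow for play")
```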

Outside of the cloud platform, what tools do you use for production? There's not really a whole lot else we use outside of our AWS cloud environment. There are some partners along the way that help us externally with some of the POV camera work, but for the most part, everything we do is 100% cloud across the board.

Producer workstation.

RCS Overwatch League custom control application.


What is the equipment of your remote production studios? The talent cameras are basically Sony A7S cameras. A NUC gives us the ability to remote into those boxes for camera shading. We can change the look, the feel, the colour balance, the gain, the saturation, everything you can do on a local camera. We've set it up this season to remote in and operate all of that at the talent homes, which has worked out pretty well. The cameras at the venue have been a combination of Sony and Grass Valley from time to time, depending on the production partner, what our requirements are and what's available.

Observer workstation.

WE'RE LOOKING FORWARD TO CONTINUING TO DRIVE THE EVOLUTION OF THE CLOUD-BASED VIDEO PRODUCTION WORKFLOW TO ALSO INCLUDE HIGH VALUE VIDEO REPLAY.

Engineering workstation.

As in any sports broadcast, the use of replays that highlight the most important or spectacular plays is common. How do you do it? There are a couple of different ways of doing replay. Right now we have a line record within vMix. We use that to turn those clips around and act as a source for playback in the show, and it's working out pretty well. We're looking forward to continuing to drive the evolution of the cloud-based video production workflow to also include high-value video replay, whether it's through GV making advances to AMPP for those features or whatever happens down the road with our other technology partners.

Graphics are of great importance in production. What system do you use? Do you introduce any element of augmented reality or virtual reality? We have a virtual set currently. We work with RCS, and they have helped us develop some of those sets over the last couple of seasons. We're going to continue with that enhancement and maybe even go the XR route.

I'm a big fan of the extended reality stuff right now, where you can build a 3D space instead of living in a 2D world. You can be more dynamic than just having folks sitting


behind a desk and having that flat feel. I think the show needs to feel like it has opened up and evolved, with the talent able to walk around a set so there is a sense of depth of space. In terms of the graphics capabilities, we have a couple of different layers that we deploy in our environment, depending on the league. We also deploy on more of our master control environment with the use of Singular, an HTML5 graphics renderer that decorates the distribution feeds. Think of it this way: we provide a world feed, but on top of that we provide different feeds in Korean, French, German, Spanish… and the ads playing in each one, what we call split fork execution, are basically dirtied in at the master control level, where we have one input coming in and multiple outputs going out. Where you would see an English billboard for one sponsor, on the Korean feed you'd see that same billboard localized in Korean, or German, or French, or Spanish. We have a way to actually internationalize our broadcast to those specific regions, with specific feeds and specific ad inventory, based on some of the Singular work we've done.
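As an illustration of that split fork idea, the sketch below forks one clean world feed into several decorated regional outputs. The feed names and ad slates are invented, and in the real chain the decoration happens in the master control graphics layer, not in a script:

```python
# Minimal sketch of "split fork execution": one clean world feed in,
# several decorated regional feeds out. Feed names and ad slates are
# hypothetical; real decoration happens in the master control graphics layer.
from dataclasses import dataclass

@dataclass
class Frame:
    program: str        # the clean world-feed picture
    overlay: str = ""   # region-specific billboard burned in on top

# Region -> localized billboard for the same sponsor slot.
AD_SLATES = {
    "en": "Sponsor billboard (English)",
    "ko": "Sponsor billboard (Korean)",
    "de": "Sponsor billboard (German)",
    "es": "Sponsor billboard (Spanish)",
}

def split_fork(clean: Frame) -> dict[str, Frame]:
    """Dirty the single clean input into one decorated output per region."""
    return {region: Frame(clean.program, slate)
            for region, slate in AD_SLATES.items()}

outputs = split_fork(Frame(program="Overwatch League world feed"))
for region, frame in outputs.items():
    print(f"{region}: {frame.program} + {frame.overlay}")
```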

And for statistics, what system do you use? In what processes are you using AI? There are a couple of different stats systems. Over the last several years, we've been developing our own stats platform internally. The stats system ties into the game engine. It allows us to pull the stats directly from the game, interpret the data, and then present it for broadcast. It feeds into our graphics system as well for use on air, and into different analysis dashboards. Our competitive operations team and the team coaches can read those dashboards and see how their players and team are performing. It's an in-house development project that we've been undertaking within the Overwatch League. The use of AI right now in our leagues is a stats project we've been working on with IBM. They take the raw data that we give them, they interpret it, figure out different algorithms to apply to the data, and come up with interesting stack rankings. Then we're able to take those and apply them to the broadcasts, presenting where IBM ranked the player or the team from a statistical standpoint.
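As a toy illustration of the pipeline shape described here (raw stats out of the game engine, derived metrics, then a stack ranking handed to graphics), consider the following sketch; the stat names and the ranking formula are invented for the example:

```python
# Toy sketch of the stats pipeline shape described above:
# raw per-player data from the game engine -> derived metrics -> stack ranking.
# Stat names and the ranking formula are invented for illustration only.

raw_stats = [
    {"player": "alpha", "eliminations": 24, "deaths": 6, "healing": 0},
    {"player": "bravo", "eliminations": 11, "deaths": 4, "healing": 9800},
    {"player": "charlie", "eliminations": 18, "deaths": 9, "healing": 1200},
]

def interpret(row: dict) -> dict:
    """Derive a broadcast-friendly metric from raw game-engine counters."""
    impact = row["eliminations"] * 2 + row["healing"] / 1000 - row["deaths"]
    return {"player": row["player"], "impact": round(impact, 1)}

# Stack-rank and hand the ordered list to the graphics layer.
ranking = sorted((interpret(r) for r in raw_stats),
                 key=lambda m: m["impact"], reverse=True)
for place, metric in enumerate(ranking, start=1):
    print(f"#{place} {metric['player']} (impact {metric['impact']})")
```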

WE'VE BEEN DEVELOPING OUR OWN STATS PLATFORM INTERNALLY. THE STATS SYSTEM TIES INTO THE GAME ENGINE. IT ALLOWS US TO PULL THE STATS DIRECTLY FROM THE GAME, INTERPRET THE DATA, AND THEN PRESENT IT FOR BROADCAST.


What do you think about the future of esports? The challenge is still going to be getting people back to venues 100% and enjoying local sports again, regardless of whether it's football, baseball, basketball or esports. We are going to continue to push technology. It's going to be a hybrid approach either way; whether or not we do any production on site, we have to change how we think of venue production. For the most part, a lot of our stuff is going to still be cloud-based. We are still going to assume that we are cutting the show in the cloud and doing distribution the way it's done today. The YouTube audience, as an example, doesn't really care what's going on at the venue during halftime. This is traditionally what you also see in other sports today. You also want people to walk around and have that social experience at the venue, not be locked into their seat.

APAC Virtual LED Wall.

THE CHALLENGE IS STILL GOING TO BE GETTING PEOPLE BACK TO VENUES AND ENJOYING LOCAL SPORTS AGAIN, REGARDLESS OF WHETHER IT'S FOOTBALL, BASEBALL, BASKETBALL OR ESPORTS

We have to look at how we evolve our production and the league. There are certain elements of traditional sports that we have to take in, but our players aren't really on a traditional field of play; it's virtual. We have to figure out ways to connect the community and the fans to this virtual world regardless of how we do the broadcast. How do we thread the narratives that get people engaged? That's always going to be the goal, and hopefully our tools will continue to help us tell that story. 



OPINION

Innovating into the future: How esports is winning the production game

By Michael Armstrong, VP Sales, EMEA, LTN Global

Professional or amateur gamers coming together from across the world to compete, active fan participation in immersive games, and groundbreaking technologies — esports has always embraced innovation. Esports is one of the few industries that wasn't negatively affected by the pandemic over the last year. In fact, its popularity is growing exponentially, with a fan base expected to reach 474 million viewers by the end of the year, with a lift from mainstream broadcasters like ESPN and CBS. Esports producers can now capitalize on innovation in production workflows and fan engagement to drive the industry forward and create new revenue opportunities.

Born digital-first When it comes to delivering rich content formats and high resolution, esports is already a step ahead of traditional sports broadcasting and HD SDI workflows. Platforms like Twitch, Facebook, and YouTube have been



broadcasting simple livestreams of multiple competitors across the world for years. However, COVID-19 accelerated growth across the esports industry and increased the international consumer appetite for live content. The streaming of one-on-one action has evolved into multiplayer global online tournaments with centralized workflows. Before the pandemic, esports shows brought professional gamers together at one venue.

During COVID-19, the industry used centralized workflows to produce broadcast-quality content with feeds from a multitude of remote locations. EA’s Madden Bowl 2020 showcases how remote production workflows can deliver the high-quality content ESPN and platforms like Twitch expect. Until last year, Madden Bowl was produced with players in the EA studio in San Jose, California, using standard, on-premise esports

workflows. With an in-person show off the table, EA had to get multiple 1080p 60fps high-quality, low-latency signals out of 40 individual homes and back to a centralized production facility. In addition, EA needed to manage the signals of up to eight different announcers in different locations conversing with each other about the game in real time. Connectivity was a key challenge in producing an event of this scale for linear broadcast. Having a fully managed, reliable multicast network backbone is a critical success factor for supporting features like rich video formats, data overlays, 3D graphics, replays, and audio mixing. The production team needed to ensure announcers were seeing the game with minimal latency, which meant they needed to feed the signals to the announcers' homes at the same time they were fed back to the centralized production facility.



Madden Bowl 2020 demonstrated the potential of remote production workflows for esports. Although physical events with an on-site audience are gradually returning, remote production models will continue to be deployed in esports production. Existing IP-based workflows, lower costs, and higher production efficiencies mean remote production will play a crucial role moving forward.

Raising the bar for esports Gamers play esports for gaming audiences. As such, esports must deliver the immersive video gaming experience fans expect. In addition, with wagering interest increasing, producers must cater to that audience's needs. Providing data and insights into specific player performance at scale will feed the wagering and viewing appetite, increasing esports' presence in the mainstream sports betting landscape.

Personalization can add a further layer, enabling fans to consume esports content the way they prefer. The esports industry can jump on the opportunity to enable multiscreen viewing experiences and

customized data and graphics to meet individual consumer interests. Streaming platforms like Twitch provide a good example of how personalization features can be rolled out. These platforms offer features like the ability to choose the announcers viewers want to follow or to switch between cameras. Creative elements like virtual fan engagement and giving fans on-screen time offer audiences new ways to engage. Even after the return of fans to physical venues, virtual engagement will enable global audiences to stay connected with the industry round the clock and outside of major events. This will create more opportunities for esports to monetize content and expand its fan base.

Being game for innovation The esports industry should be ready to capitalize on the growing interest from consumers and broadcasters. Remote production will bring together geographically dispersed players, announcers, and consumers, enabling the industry to grow its audience, while creative production features like virtual fan engagement will deliver compelling new ways to engage.

Besides the dedicated fan base, esports is increasingly front of mind with wagering audiences. Data analytics, personalization, and immersive viewing experiences will bring more ways to consume esports content and create new revenue streams. Esports producers playing the innovation game are set to win. 



Esports takes live cloud production to the next level

By Mike Cronk, VP of Advanced Technology at Grass Valley

With its roots in the arcades, esports was once niche compared to the sports business but has matured spectacularly into a professional industry in its own right. As competitive video games continue to integrate into popular culture, global investors, brands, media outlets, and consumers are all paying attention. Even this year, despite spectators being locked out of stadia, live events truncated and esports companies forced to produce remotely, esports has managed to grow an astonishing 14.5%, from $947.1 million in 2020 to over a billion dollars for the first time. That's according to esports and video game analysts Newzoo, which also forecasts that $833.6 million in revenues – over 75% of the total esports

market – will come from media rights and sponsorship in 2021. Audiences are growing too, with the global games live-streaming audience hitting 728.8 million in 2021, up 10% from 2020. The future of esports on the consumer side will likely be powered by mobile, which will further reduce barriers to entry and allow even more gamers and fans to pour in. In particular, 5G connectivity will reduce latency so dramatically that it will allow even deeper real-time interaction between fans and players and perhaps lead to new gaming formats. On the production side the future is already here. Live production is happening today in the cloud, bringing incredible flexibility, efficiency and reach for esports teams,

leagues and publishers to grow the market. EA Sports and Gfinity have pioneered this exciting development. Electronic Arts Competitive Gaming Entertainment (CGE), the company's esports division, began fully distributed, remote production for its major competitive gaming events EA SPORTS™ FIFA 21 and Apex Legends. The end-to-end cloud workflow allows EA to deliver broadcast quality storytelling to its global fanbase – with a



production team working from their respective homes – regardless of where the live tournament is taking place. London-based international esports and gaming solutions provider Gfinity began producing and delivering multi-day

live esports broadcasts for one of its virtual professional motorsports series, entirely in the cloud. This included remotely controlled camera feeds from multiple individual player locations across Europe monitored, synchronized and switched in the cloud.

Esports takes the lead Like many organizations producing live events, EA and Gfinity made the move to remote production for business continuity forced by COVID-19. Both projects were also



enabled through GV AMPP (Agile Media Processing Platform). This is the Grass Valley cloud-based SaaS platform that marries broadcast-quality production tools with the agility of deploying virtual machines on a dime, allied with the massive firepower of cloud compute. What AMPP enables, and what EA and Gfinity have proved, is that the complexities of professional live production in the cloud can be solved with the right solutions partner. These major brands delivered unmatched viewing experiences for gaming fans around the globe. Traditional sports broadcasters, hampered as many are by legacy equipment investments, are sitting up and taking note. Latency is a key issue that comes with the fast-paced nature of esports – even a slight syncing issue impacts the end-user experience. But with AMPP, that's all taken care of. Thanks to its innate

low latency and intelligent timing management capability, there is virtually no difference from punching the show in a physical studio.

With GV AMPP, everyone from the technical director to the talent works remotely as necessary, with no compromise on their ability to deliver stunning content. The EA team can manage live production of its gaming events more efficiently, rapidly pivoting geographically to meet esports fans' needs across Europe, Asia and the Americas.

It wasn't always this way. A lot of early moves to the cloud were in reality attempts to lift and shift tools made for bespoke hardware onto virtual machines, or just to transport signals from one facility to another. It took time and specialist knowledge to adapt existing systems for IP and COTS servers. Lack of interoperability between equipment rendered certain workflows unworkable, or risked vendor lock-in. Systems integration teams were learning how to piece together broadcast engineering with IT on the job.

As an analogy, the early transition to IP and cloud felt like a visit to RadioShack. You'd buy all the different parts off the shelf then head home to



solder the capacitors, transistors and LEDs to the circuit board to build a rudimentary PC. It may have worked, but it's no way to make a living or run a business at scale. The resources required to do this are astronomical. Contrast that with today, when we all have a smartphone in our pocket.

Its onboard computer can power a flashlight, a calculator, a camera and we can download all manner of apps to personalize our experience. It is, in essence, a unified platform where tools and services are accessed from a single familiar interface.

Unified platform That’s how we conceived GV AMPP. As part of the larger GV Media Universe, GV AMPP sits at the center of a comprehensive ecosystem of connected solutions, services and marketplaces that make cloud-based media workflows a reality.


The range of applications available through GV AMPP mirrors everything needed to run professional live production in the cloud. These include master control, flow monitors, clip players, a master control switcher, I/Os, test signal generators, AV mux, delay modules, graphics, multiviewers, recorders and streamers. GV AMPP also provides access to extensive probing and monitoring capabilities for transparency across all parts of the workflow. In an industry first, EA's workflow utilized Grass Valley's GV Korona switcher control panel connected over the public internet to a K-Frame production center engine running on GV AMPP in the cloud. These are the same K-Frame switchers used to produce the Super Bowl. For EA, the familiar operator-specific UI is central to ensuring productions run smoothly and consistently. AMPP Master Control is accessed using the same AMPP



switching fabric, which also provides the feed that’s critical to successfully switching programming. Being able to handle everything from live switching to multiviewers in the cloud through AMPP extends a company’s ability to produce content remotely and deliver improved utilization of resources by spinning up and down apps as required.
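To make "spinning up and down apps" concrete, here is a hedged sketch of elastic compute control on AWS using boto3. The instance IDs are placeholders, and AMPP manages this through its own control plane rather than a script like this:

```python
# Hedged sketch: elastically start/stop cloud compute for production apps,
# the way a platform might spin capacity up before a show and down after.
# Instance IDs are placeholders; GV AMPP uses its own control plane, not
# this script. Requires the "boto3" package and valid AWS credentials.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
SHOW_INSTANCES = ["i-0123456789abcdef0"]  # placeholder worker node IDs

def scale_up():
    """Bring production worker nodes online ahead of the broadcast."""
    ec2.start_instances(InstanceIds=SHOW_INSTANCES)

def scale_down():
    """Release the compute once the show wraps, stopping the spend."""
    ec2.stop_instances(InstanceIds=SHOW_INSTANCES)

scale_up()
# ... run the show ...
scale_down()
```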

New capabilities And the power is growing all the time. GV AMPP has been enabling live esports productions in the cloud since February 2020, ingesting, processing and switching 4-8 sources. In less than a year that has ramped up to 30-camera productions, and we can go even higher. This capability is going to increase over time as we optimize certain software processes and as compute gets faster and better. More than this, the choice for any professional sports organization is now real.

There is no longer a need to stick to the rigid cost and content parameters of the traditional broadcasting model. Leveraging unified platforms, such as GV AMPP, allows true global reach as well as the ultimate in distributed remote production. Live events can be produced on the US West Coast with signals routed from a venue in Europe or Australia via an AWS datacenter on the US East Coast. Workflows can be centralized, enabling productions to base key talent and crew in one gallery, saving time and cost on travel while creating more content from more events. The same cloud SaaS platform also enables all production personnel to work from home, even over modest internet connections, in an ultra-distributed fashion. Not only is this possible, it is happening now, and esports has taken a pioneering role in the transformation. 


TECHNOLOGY



VOIP

In a time in which broadcasters are gradually deploying their systems over IP technology, there are several important issues to keep in mind. First, control: how to handle all the systems involved in a new environment. Second, interoperability and interconnectivity, from the exchange of signals between systems to their operation. And, last but not least, security: the system is no longer isolated, but connected to others, including the Internet, and must be properly protected.

By Yeray Alfageme


Control Unified control of the various systems involved in the capture, production and broadcast of content has always been one of the most important issues in the design of audiovisual systems. There is no point in having the best equipment if the machines do not understand each other or we cannot control them as a whole. For this purpose GPIO (General Purpose Input Output) was originally created. It is a very simple but effective system, and all broadcast equipment has GPIO inputs and outputs to


control their functions. From audio-follow-video up to control of graphics and content playback, everything can be automated with this simple yet versatile system. To progress a little further in the control of systems, control ports based on serial protocols, either RS-232 or RS-422, began to be implemented. Again, if we look at it from today's perspective, these systems are very simple and require specific wiring to operate, but for a long time, and indeed also today, they have been the

basis for the control and automation of entire systems. With the ambition to unify all these control systems, SWP-08 emerged. It is much more flexible than GPIOs but still requires serial interfaces, as well as specific wiring and connections. Some manufacturers have implemented serial protocols over Ethernet cables, but this is nothing more than using a different cable and connector for the same purpose; they are not IP communications.
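For a flavour of what that serial-era control looks like in code, here is a minimal sketch using the pyserial library; the port, baud rate and command bytes are generic placeholders rather than actual SWP-08 framing:

```python
# Minimal sketch of legacy serial device control, of the kind SWP-08-era
# routers expect. Port, baud rate and command bytes are generic placeholders,
# not the actual SWP-08 message framing. Requires the "pyserial" package.
import serial

# RS-422/RS-232 links in broadcast gear typically run at fixed, modest rates.
with serial.Serial("/dev/ttyUSB0", baudrate=38400,
                   bytesize=serial.EIGHTBITS,
                   parity=serial.PARITY_NONE,
                   stopbits=serial.STOPBITS_ONE,
                   timeout=1.0) as link:
    # Hypothetical "set crosspoint" command: route source 3 to destination 7.
    command = bytes([0x02, 0x03, 0x07, 0x03])  # placeholder framing
    link.write(command)
    reply = link.read(8)  # wait up to 1s for the device's acknowledgement
    print("device replied:", reply.hex() or "<nothing>")
```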




The flexibility of IP control Looking at IP as a purely communications protocol, which in fact it is, its great flexibility and adaptability have been proven thanks to the fact that it does not matter what type of data it conveys. Packets are transported in the same way whether they contain audio, video, documents or a phone call. The same thing applies to the network. This is of great help, especially when we want to exchange audio and video signals outside the purely broadcast environment. Imagine a current contribution and distribution system that does not support IP protocols. Unthinkable, right? Returning to control protocols, implementing them over IP technology has several advantages. The same wiring is used; in fact, even the same connectors, without the need to add anything extra, both for the transport of the audiovisual signal and for control. A single RJ-45

connector, or fiber depending on bandwidth needs, used to connect the equipment via Ethernet to the network, can serve all purposes. This provides great flexibility in the configuration and control of the equipment, automatically achieving great interoperability. In fact, protocols such as SWP-08 have been updated in order to enable transport over IP technology, thus

allowing broadcasters to use IP infrastructure for everything from audio and video to control and all the rest of the necessary operations on the systems.

SDI is rigid, but highly interoperable The more we immerse ourselves in the IP world, the more we realize that interoperability of equipment is ultimately the key. Something that was not even a question in SDI environments, where all



equipment could exchange signals with everything, remains a problem to be solved in the IP world. SDI is rigid, no doubt about it: specific cabling, only one type of signal, unidirectional, and a long list of drawbacks that were previously assumed to be inherent in these systems and that are now questionable in current data environments. In addition, SDI needed to exchange signals synchronously. Ask the technical heads of mobile units and witness the problems that sync signals, clocks and wordclocks always cause each time signals are exchanged between mobile units on a TV compound. Madness.

In contrast, IP is highly flexible and asynchronous, it allows multiple formats through the same network, it is bidirectional, and data of all kinds can coexist with audio and video signals, as long as all equipment units understand each other. That last condition was not an issue in the SDI world, although it is the main headache in IP environments: interoperability once again.

The challenges of interoperability The great flexibility that IP presents comes with new challenges to solve. For example, a camera with a data interface of up to 25 Gbps can exchange signals bidirectionally, not just send the signal it

records; all this through a single physical connection. The use of UDP or TCP over the IP protocol is transparent: the signals arrive from one side to the other and that's it. AMWA was created in an attempt to solve these interoperability issues. AMWA is a free community of manufacturers, broadcasters and different industry participants whose greatest achievement is the creation of NMOS (Networked Media Open Specifications). NMOS establishes a framework through which all compliant systems can communicate with each other using IP technology within a media world. No more interoperability problems. Or almost...

The NMOS specifications I wish NMOS were the solution to all problems but, like any other environment, NMOS was not born with all its capabilities from the



beginning; it has been evolving. IS-04 began by offering registration and discovery functionalities within the network. Equipment items are able to discover each other and to make themselves visible within the network in order to be discovered. This makes things much easier. IS-05 introduced the concept of interconnectivity. In other words, a microphone and a mixing console, for example, can exchange audio signals and 'self-configure' to define the bitrate, sample rate and specifications of the signal to be exchanged. For this purpose the Session Description Protocol is used, as it facilitates this interoperability. IS-06 provides another leap of abstraction from the network, thus allowing some devices to control others. For example, controlling a matrix from a control surface, having a camera tell a mixer what format to work in and asking for the return PGM signal, or

having an audio mixing console configure all the microphones on the network in the same way. NMOS IS-07 was yet another huge leap forward, adding event control and tally to the specification. These are two very important aspects of the control of audiovisual systems. 'When this occurs, that happens' is essential in all productions. For example, nobody can imagine not being able to run a macro in a video mixing system: that is event control.
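To see what IS-04 discovery looks like from a client's point of view, here is a hedged sketch that asks a registry's Query API for the senders it knows about; the registry address is hypothetical and version paths vary by deployment:

```python
# Minimal sketch: list media senders registered in an NMOS IS-04 registry
# via its Query API. The registry address is hypothetical; version paths
# vary by deployment (v1.2/v1.3). Requires the "requests" package.
import requests

REGISTRY = "http://registry.example.net:8080"  # hypothetical IS-04 registry

resp = requests.get(f"{REGISTRY}/x-nmos/query/v1.3/senders", timeout=5)
resp.raise_for_status()

for sender in resp.json():
    # Each sender resource carries a unique id and a human-readable label.
    print(sender["id"], "-", sender.get("label", "<unlabelled>"))
```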

And finally, security The title of this section is neither trivial nor the last thing to consider. Security of our signals was something that 'almost' no one paid attention to in traditional linear signal environments: SDI, MADI and AES. It was unthinkable that someone would come and steal your signal. It was only in distribution and contribution environments that a simple layer of

security was put in place through BISS encryption. And I still remember the number of encoders and decoders that I have configured with the BISS key 12345678, but I would rather not say where. Security was not a problem, and here the tense is important: it was not, because now it is. Not only because the signals we transmit through an IP network or the Internet can be stolen and copied (this is obvious), but because someone can



break into our systems, change them or, even worse, control them for us without us even knowing. Imagine a situation in which we are in the middle of a production and, without any operator noticing, the signals that are sent and the way the equipment works are controlled by a third party. Scary, right? Well, it is possible, it is real and it has already happened. Our IT colleagues are over 30 years ahead of us

in this area of security. They were the first to deal with these issues, they are the experts, and they know what systems to implement and how to configure them. It is important that we do not treat security as just one more layer of the project, but as something to take into account from the project's outset, with an impact on the design of the final solution. Otherwise it will be too late, again.

Conclusions In this second installment on VoIP we have focused on interoperability and security, two issues that in traditional linear environments were barely worth considering, but that in IP environments are essential to address right from the start. Migrating to IP does not only mean swapping coaxial cable for RJ-45 or fiber; it also means changing our mindset and adopting new standards. If we want to make use of the innumerable advantages of the IP environment, we must ensure that we tackle all the challenges and are able to overcome them. In the final installment we will talk about a practical application and see how to implement an IP production environment within a real production, with the aim of closing this trilogy in the most practical way possible. 


PRODUCTION



TAKE 5 PRODUCTIONS

We jump into a boat with Take 5 Productions, a production company established in Toronto and responsible for huge successes like Vikings, Vikings: Valhalla and The Handmaid's Tale. We had the opportunity to discuss with Nick Iannelli, EVP Post Production, the challenges of such big shows, working with big studios and platforms, and the current situation of the industry.


Nick Iannelli, EVP Post Production

Where did the path of Take 5 begin? What is the focus of the company's activity? I've only been here for two years, but Take 5 has been around for 13 years. We've always been part of the production community in Toronto. Over that time they've had extensive relationships with studios like CBS Studios, Showtime, and MGM. Through that, Take 5 has been able to build up a reputation internationally, and therefore we are able to attract these types of high-profile projects.

Take 5 usually works on co-productions, as it has done on Vikings. Why is that?

Vikings was a special case in which the structure of the show was best served by setting it up as an Irish/Canadian co-production. This allowed the show to take advantage of incentives from both Canada and Ireland, but it also allowed us to showcase the talented producers, editors, sound teams and VFX artists that work in Canada. Take 5 is a production service company primarily

servicing projects that shoot in Toronto. Over the years, we've been able to convince our production partners, mainly some of the Hollywood studios, that leaving post in Toronto was a viable option. Typically, the studios would shoot in Toronto and take their post back to LA. We've been able to demonstrate that we have the level of talent in Toronto that they are used to. Over the past decade the Toronto



post community has really matured, to the point that we are competing for the same awards as our Hollywood counterparts. We helped nurture and support these post creatives, so now we've created a bit of a speciality in which we will produce just the post and VFX for projects that shoot elsewhere; that's what happened with Vikings and now the spin-off series Vikings: Valhalla.

What are the main technical challenges you face when working with other companies? A lot of the challenges come when we go into a new country or city where we don't have relationships with the post facilities. If a show is in Toronto, we know the suppliers and what those facilities offer. But when we go into a new territory, in some cases there isn't a post-production facility that can service shows of our size. Then we need to establish a relationship and work out a workflow that matches our way of working. Or in some cases we work with one of our existing partners to find a solution. This mainly pertains to dailies: securing the camera data, applying colour, creating Avid media, syncing sound, etc. With Vikings, we've been fortunate to work with a local vendor in Ireland, Screen Scene. They are a post-production company based in Dublin and we've used them for quite a few

WE’VE CREATED A BIT OF A SPECIALITY IN WHICH WE WILL PRODUCE JUST THE POST AND VFX FOR PROJECTS THAT SHOOT ELSEWHERE

shows, and most recently we're working with them on Vikings: Valhalla. We have some interesting shows coming up: one project shooting in Calgary, Alberta and another in Halifax, Nova Scotia. In both cities there isn't a large post infrastructure, so we're having to put something together that's going to service these shows. There's always a large amount of data that's captured, and getting that pushed across to our post teams in Toronto on a daily basis would


ordinarily be a challenge. But nowadays, with the robustness of the internet and the cost of data transfers coming down, it's fairly easy to manage and to get data pushed on a timely basis. The workflow on a show like Vikings: Valhalla is such that, at the conclusion of each day's shoot, dailies are processed overnight, and then the Avid media gets pushed into a secure catch folder that we receive on our side. This allows our editors and assistants to start right away in the morning. This is a case in which the time difference helps us. Ultimately the full camera data will be sent over, using the LTO camera masters.
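As an illustration of the receiving end of that overnight push, here is a minimal catch-folder watcher built on the third-party watchdog package; the path and the notification step are hypothetical, not Take 5's actual pipeline:

```python
# Minimal sketch of a dailies "catch folder" watcher: when overnight Avid
# media lands, flag it for the morning edit team. The path is hypothetical
# and this is not Take 5's actual pipeline. Requires the "watchdog" package.
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

CATCH_FOLDER = "/mnt/secure_catch/vikings_valhalla"  # hypothetical mount

class DailiesHandler(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        # A real pipeline might verify checksums here and notify the editors.
        print(f"New dailies media arrived: {event.src_path}")

observer = Observer()
observer.schedule(DailiesHandler(), CATCH_FOLDER, recursive=True)
observer.start()
try:
    while True:          # keep watching until interrupted
        time.sleep(60)
finally:
    observer.stop()
    observer.join()
```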

What is your role in the post-production workflow? From a post-production standpoint, we have from five to seven shows that we're actively post-producing in one form or another, whether or not we're handling all post-production, which also encompasses visual effects. I oversee all the post and visual effects for those series from a Take 5 perspective. That includes everything from putting the team together, hiring the team, working with the various cinematographers, and helping to establish dailies workflows based on camera formats, to making sure we stay on schedule and on budget. Most importantly, I'm here to support our teams, making sure they have what they need to get the job done, and to make sure our studio partners are happy. This is my first time working on the studio side, as my background was actually on the post facility side. I ran Deluxe Toronto for many years, which is now Company 3's post-production operation in Toronto. It's one of their larger operations, as it encompasses dailies services as well as full picture and sound finishing. It also services both feature films and television series. I was there for about 16 years and


then decided to leave, make a jump, and try something a little different in my career.

Do you have your own equipment or do you rent it? We have a certain amount of infrastructure, but if we have a new show where it goes beyond our means we will rent what we need. In addition, we also have a small visual effects team that services and works on all of our shows. The entire render farm, storage and workstations are in-house.

How long is the life cycle of your equipment? Sometimes it's not so much about the life cycle but more about adding infrastructure because of growth. As you know, years ago you had a 50 MB hard drive and that was a lot of storage. Now you've got 500 GB or 1 TB. Similarly, on our side, when we had 100 terabytes it was a lot of storage, but right now we're looking to bring in a petabyte of storage to


WITH VIKINGS AND THE SPIN OFF SERIES VIKINGS: VALHALLA THE VFX VENDOR WE USE IS MR. X TORONTO. THEY'VE BEEN OUR PARTNERS FOR MANY YEARS AND HANDLE ALL OF THE HEAVY LIFTING FOR THE SERIES.

support our visual effects and post operation. Granted, that's probably not even a lot of data compared to some of the bigger facilities that exist.

What do you think about the constant updates of formats and technology, and how do you adapt to them? Years ago, everything was hardware-specific. If you wanted to do something, there was a specific piece of hardware that did that. Thankfully, over the last 10-15 years, we've gone away from that. Now there is a workstation and software to do what you need to do. That allows us to repurpose a workstation that might not be fast enough anymore for visual effects or editing, but might work as a file server of some kind that we'll use for our corporate office. Typically, as you upgrade, you look to find a reuse for some of the older infrastructure.

We want to talk about Vikings and Vikings: Valhalla, the two big shows of the house.



What camera equipment do you use on set? For the original Vikings series, the cinematographer used Alexa cameras shooting UHD resolution for all seasons. On the new series, Vikings: Valhalla, currently in production, our cinematographer Peter Robertson is using the Sony VENICE at 6K. We are just starting to colour-correct the episodes in Dolby Vision and the footage looks absolutely gorgeous.

They are series with a lot of visual effects. How do you work on that?

With Vikings and the spin-off series Vikings: Valhalla, the VFX vendor we use is Mr. X Toronto. They've been our partners for many years and handle all of the heavy lifting for the series. We also use our in-house team, which consists of 6-8 artists dedicated to the series. The majority of the work our in-house team does is clean-ups and environment extensions, mainly 2D work. Some of the software Mr. X uses is Maya, Houdini and Nuke. I'm also sure they're using a lot of their own proprietary code and scripts. Our in-house team primarily works with Nuke, a little Houdini, and Photoshop for matte paintings.

There are a lot of water scenes, which are always complicated. What was the main challenge there? Mr. X uses water simulations that they've perfected, and they look amazing. We're always striving for a photorealistic look, and right now their water simulations are one of their specialties. It's not

just about creating the water sims, but integrating the different layers of water, foam and waves into the plates, which adds to the realistic look. They will take a plate which consists of a boat with our actors and just make it come to life. The action may not be perfect, but they always find a way to enhance it and make it look better.

There's a lot of post-production work. What was the main challenge you faced during this production? The time. It's just getting it all done within the schedule that you have. Everybody wants to create great-looking visual effects. Sometimes getting it to that perfect state of photorealism takes a little more time. Sometimes we're lucky and have the time so we can do that; sometimes we're under tight schedules and it's a little bit more challenging. Ultimately, I think it's always about getting the shots looking as photorealistic as possible. You really don't


want to take anybody out of the scene to say, "Oh, that looks like a visual effect." It should always feel very natural and like it was captured that way. I think that's always the biggest challenge.

You have another big international success like Vikings, which is The Handmaid's Tale; it's a completely different production. What are the differences between them? With Vikings, you know that visual effects play a big role, with large battle scenes or a big storm sequence. The audience is aware that when we take someone's head off, VFX helped make it happen. With a show like The Handmaid's Tale it's more about the hidden. The viewer never feels that VFX is involved in creating a scene or a location. In season 3 of Handmaid's, there was a big scene that took place at the Lincoln Memorial / Washington Monument. Shooting on location in Washington was a big challenge because we couldn't close that area off to the crowds walking around. We tried to have as much crowd control as we could, but without closing the area off entirely, we knew we'd need VFX to make the scene work as the producers wanted it to look. This required a lot of work from Mavericks, our VFX vendor, which included rotoscoping, painting out unwanted people, 3D extensions, cleaning up the environment, and adding and duplicating the handmaids. As an audience member, it looks like we went to Washington, shot our scene and captured everything in camera.


Similarly, in season 2 we had a scene that takes place entirely at Fenway Park in Boston. For a variety of reasons we couldn't go to shoot in Boston, so our VFX supervisor Brendan Taylor and VFX producer Stephen Lebed went to Boston and took photographs and scans of Fenway Park. We then shot all the foreground action at a ballpark in Toronto, and the Mavericks team put it all together, making it feel like the entire scene was shot on location at Fenway Park.

We understand asset and content management is critical for you. How do you manage that? Do you use a cloud service for it? We use a variety of software and services. From a visual effects standpoint, we're running everything through Shotgun. It manages all of our elements, assets and shot production with our in-house team, and it also helps us manage our vendor shots. On the post side, I'd love to tell you that we have a great asset tool that tracks it all, but most of it is just spreadsheets: Excel, shared docs, Avid tools and things like that, where we track everything.

And for your media management? We use stuff like Media Shuttle, Aspera and PIX. Right now we're


using a variety of things. As an example, we use PIX to distribute cuts to executive producers and to the network. If we're moving small data between visual effects vendors and ourselves, we use Media Shuttle, which is a secure tool for doing that. Now, with COVID, where we're not doing sound reviews and we can't all go into a mix theatre anymore, we're using Frame.io to send files to producers and creative execs to listen to. With colour correction, the biggest challenge is getting a properly calibrated image to our cinematographers and showrunners. So in most cases we will link them up with a remote streaming solution that supports high-bandwidth, high-bit-depth streaming: tools like Streambox Chroma and an OLED display. Particularly if it's a Dolby Vision master we're creating,


there’s an added challenge in making sure we can initiate the Dolby Vision function on the display. We also don’t want to always stream the colour but provide a file that the DP can review and make notes on. In this case we’ve been looking at Moxion for that. It has full Dolby Vision integration built into their service. Overall, we’re continuously looking for very specific tools and services that do exactly what we need in helping provide a solution.

With the entire COVID situation, remote workflows became important. Are you working on them, or do you have any plans to advance them? Currently our visual effects team continues to work remotely. They're all at home. Our editing teams are somewhat of a hybrid, as some editors and assistants are working

from home and some are in the office. The film and television business in Ontario has been deemed an essential business, and therefore we're allowed to operate through all of the lockdowns we've had. The industry has proven to the government that we can maintain a safe way of working because of the various protocols in place, which include wearing multiple layers of PPE and, the biggest factor, our testing protocols. COVID is not being spread on set. Likewise in post-production, we're allowed to operate. We've set everybody up to work remotely or come into the office; they have the option. At times we've wanted to do our part to curb the spread and keep our staff safe, and have implemented a full work-from-home directive. The amazing thing is that most staff want to come to the office.



We've created a very safe office environment because everyone has their own edit suite. We don't have a lot of open environments, and we've put up plexiglass shields in open areas. Everyone wears PPE and, most importantly, we have a lab come in on a weekly basis and test our entire staff. We know that we're all safe, and we've been working like this since August; thankfully, we haven't had a single positive test. Everyone feels safe about coming in. I think from a mental health standpoint everyone enjoys getting out of the house, as it helps with breaking up the monotony of being inside for months on end, plus working together is much more efficient. The communication is obviously a lot better when you can actually interact with people.

Companies like Netflix or HBO have replaced TV networks as series producers. What difference do you find between working on a TV project and, for example, with Netflix?

From our standpoint not a whole lot has changed. The streamers, like Netflix, Amazon and Paramount+, have specific technical requirements that include delivering the projects in HDR (Dolby Vision) or in Dolby Atmos for sound. They typically want 4K/UHD resolution, and therefore the native camera sensor needs to be 4K or larger. From our perspective these minor challenges add to the overall quality of the show, which makes for a much better viewing experience for the audience. We like the delivery to some of the streamers like Netflix, as all we need to deliver is a single IMF file. We don't have to worry about sending a version to Spain, Italy, Germany, and South America. It's one file that goes to Netflix, and that's it.

What are the challenges for the future of Take 5? There is still the pandemic, so we are dealing with COVID and working through that: making sure we continue to provide a safe environment for everyone, and providing solutions that allow us to pivot in the event there is another outbreak. The other challenge we're dealing with now is having to relocate our office. We're looking for new space, which we'll need to build out to our specifics over the next few months. The upside to the move is that it's a good opportunity to re-evaluate how we're set up and how we do things. The new spaces will allow us to create an environment that works for all of our post and VFX teams.



CINEMATOGRAPHERS

Carissa Dorson

on ‘A Little Late with Lilly Singh’

Photo by Chris Oeurn




First, it would be great to know more about you. How has your career as a cinematographer developed and how did you end up on television? I have been shooting comedy ever since I moved to Los Angeles in 2012. My

friends and I shot sketch comedy for fun in film school at Florida State University, and it continued when we moved to L.A. I was introduced to producers at CollegeHumor and began working for them as a camera assistant, until they


trusted me enough to start shooting sketches for them. My career as a DP grew from there as I networked within the internet comedy world. I met a lot of producers and directors that I admired, and fostered relationships with them by shooting their passion projects. Years of doing this led me to where I am now, shooting A Little Late with Lilly Singh on NBC. I was recommended to Lilly when she was looking for a DP for her special, Sketchy Times with Lilly Singh in September 2020. We loved working together on that, and then she asked me to shoot Season 2 of her late night show.

However, you still work a lot on short films. Which medium do you love the most, TV or cinema? It's so hard to choose! I get a lot of personal fulfilment from telling a meaningful story through a narrative short or feature film. A TV show feels more like an ongoing conversation, which is

really fun for me. I like to switch it up and work on a variety of projects.

How has the show evolved technologically over the seasons? I joined the show for season 2, which is currently airing. It’s vastly different from season 1,

which was a traditional talk show shot in a studio with broadcast cameras. We now shoot on-location at a house, and the show has a more casual vibe to fit Lilly’s personality. We utilize EVA1 and GH5 cameras and shoot 23.98 rather than the usual broadcast framerate,



29.97. The lighting is quite different too, since I love using big soft sources which are not traditional to late-night. The look depends on the scripts planned for each day, and some of them call for more heightened cinematic storytelling. But we prioritize comedy

above everything else. Because of Covid, all of our interviews are virtual this season, so that has presented another technical challenge. Our technical producer, Mike Hammeke, works his magic every day to make these Zoom interviews happen.

Usually, US TV talk shows are characterized by not being especially complex in terms of their photography. Nonetheless, 'A Little Late with Lilly Singh' has an interesting look, which captures both the naturalistic feel of a talk show and beautiful

Photo by Iliana Ipes.


photography. How did you develop this look? When I started this job, I recognized that they wouldn't have hired me if they wanted it to look like any other talk show. The look of the show was very much based on Lilly and what she does best, with YouTube as her background. We wanted half of the show to feel behind-the-scenes and candid, welcoming the viewers into her world. These are some of the funniest, most charming parts of the show! Another huge piece of the show is sketch comedy, which is what I know best. Lilly does a lot of character work, and we feature her playing multiple characters in sets that our art team is able to create in the production house. The look is always dictated by the script. For example, we filmed a Wizard of Oz-inspired sketch where Lilly finds herself in Emerald City and talks to the Wizard. That sketch was saturated with color and had a soft diffused look. Finally, we do a lot of interviews and

segments in our main living room, where I worked with the production designer, Bonnie Bacevich, to create a colorful and laid-back vibe.

What are the main challenges of 'A Little Late with Lilly Singh'? We have a very fast-paced schedule, so I always have to form a plan to get efficient coverage. Covid safety

protocols also slow things down because we can't have too many people in the room when we are setting up and shooting. When Lilly is playing multiple characters in multiple scenes, it involves a lot of quick jumping around from place to place, which means that I have to keep track of lighting continuity. On top of that, we don't have any color correction on the show because of the tight post turn-around, so finding the look in-camera is another fun challenge!

What were your main visual references for the show? I take a lot of inspiration from Key & Peele, SNL Digital Shorts, and Inside Amy Schumer. Those shows really go for it when it comes to cinematically heightening their sketches, and I really admire that.

Could you tell us more about the cameras and lenses you chose for this project? We shoot with three Panasonic EVA1 cameras and three Lumix GH5S cameras, all serving different purposes. Our main A and B EVA1 cameras are outfitted with Canon Cine 17-120 Servo Zooms, and the third EVA1 has a Canon L-series 24-70 zoom. We mainly use Lumix G 12-35 zooms on the GH5S cameras. One GH5S serves as our behind-the-scenes camera, one is on a Ronin-S gimbal, and one is Lilly's "rant" camera, to which Lilly delivers monologues in her "rant room."

You chose to shoot with a LUMIX GH5S as your main camera rather than a cinema or broadcast camera (although you also use multiple EVA1s). Why did you choose this option? Do you think mirrorless cameras are going to be a standard in these types of TV formats in the future?

Going into the show, I knew we would need several cameras to accommodate all of the different types of segments we were shooting, and I knew I wanted them to be small in size. We’re a small crew that needs to move quickly, and the Panasonic EVA1 and GH5 cameras allow us to do that without compromising on image quality. I have seen mirrorless cameras like the GH5 used in film and


TV more than once. It all depends on the needs of the project, but the GH5 is a great option for a B or C camera that needs to be smaller than the main cameras. Despite its size, I often can’t tell the difference in the image!

We’ve seen that you chose to capture Full HD, but isn’t true that industry standards are moving to 4K + HDR? Didn’t you think about opting for this format to extend its broadcast later on VOD platforms? Yes, it does feel strange to go back to HD when 4K is more of a standard these days. The reality is that 1920x1080 is still the standard in late night, and it allows for an easier post turn-around for a show that airs four times a week. I’ve been pleased with how it looks, and think that dynamic range and color are ultimately more important than resolution.

'A Little Late with Lilly Singh' was shot outside studios, which we suppose poses a lot of

challenges, especially when it comes to lighting. How did you solve them? I feel more at home lighting on-location than in a studio. It's what I'm used to! During pre-production I worked with

Lilly and our production designer, Bonnie Bacevich, to determine our hero spots that we would return to throughout the show. Then I worked with my gaffer, Joe Baltazar, to create a semi-permanent lighting plan for those



areas. The art team also purchased LED strips to build into some of the set design, such as the clouds mounted to the wall behind Lilly's couch. I did camera tests with those strips to make sure they didn't flicker on camera. We mounted some LiteMats to the ceiling, but kept the majority of our lighting on stands to move around for different setups.

We suppose that you, as a main part of the industry, opt for LED solutions in order to provide extra flexibility and operability. Is that so? Yes! I love having the flexibility of LED lighting, and I try to use RGB lights when the budget allows. Rosco provided us with several of their DMG Lumiere Mix lights, which give us access to Rosco's full gel library within each of the lights. They also provide effects lighting, so we can come up with lightning flashes, TV flickers, or police lights within seconds.

Another great deal is the fact that you strictly followed all COVID measures. Could you tell us more about this? We’ve grown accustomed to the Covid safety protocols, which involve getting tested three times a week and wearing masks and face shields on set. Our Covid safety officer makes sure we are social distancing whenever possible. The biggest challenge is each room has a maximum capacity, so we have to work in shifts while setting up. Art team goes first, then lighting, etc. Our largest room has a capacity of seven people, which speaks to how small our crew really is! Covid has also meant that all of our interviews are remote. Our guests appear on a large TV screen that Lilly sits across from in the living room.

Furthermore, we’re sure COVID has affected your postproduction workflows as well. Are you editing the program remotely? Yes, the show is edited

remotely and our post supervisor, Jesse Schiller, did a great job creating that workflow. All of the editors are using Avid from home. The dailies are uploaded to FineCut, which makes it easy for me to review the footage.

Finally, do you think that in the future TV programs will improve their final look thanks to the work of DPs specialized in these formats? TV has been moving in a more cinematic direction for years, thanks to very talented DPs, producers and directors. I'm really grateful to the executive producers, Lilly Singh and Polly Auritt, for giving me the chance to use my cinematic brain on this show. We're making a late night show that feels refreshing and unique. I hope it leads the way for other unique late night shows, and more female and BIPOC hosts! 


