TM Broadcast International #105, May 2022


EDITORIAL

May is a really important month for our industry. Face-to-face meetings have returned thanks to the NAB Show. Attendance was lower due to the lingering hangover from the pandemic, but senior representatives of the global broadcast industry have told us that this gave them the opportunity to delve deeper than usual into the innovative solutions that manufacturers made available to the public. In the immediate future, other smaller but also important fairs will be held to put our market's novelties in the hands of all the professionals who make it up. Manufacturers, distributors and end users will be able to get closer to the technology at trade shows such as CABSAT for the Middle East market, or at London's MPTS.

We have put together a very complete issue in which you can learn about the big news thanks to the testimonials of those who make this technology work. First, we have prepared a thorough report on the state of broadcast in esports. Its growth has been remarkable, and audience figures are now comparable to those of major sports leagues such as the NBA or the NFL. To find out how this industry is growing, we interviewed Riot Games and The Switch, two companies closely related to the growth of this content. Riot Games, in particular, offered us an exclusive interview about one of its major creations: Project Stryker. The company aspires to broadcast content globally, 24/7, in a remote and scalable manner.

The Dubai World Expo has recently closed, having shown the socio-cultural capabilities of 192 nations. To get all this content exhibited, broadcasts ran about 17 hours a day for 182 days. This large production involved more than 300 professionals. Here you can find out how they pulled it off.

We have also looked at the American industry. American television is in a process of transition and growth with the implementation of the ATSC 3.0 protocol. We spoke to Madeleine Noland so that she could tell us what the capabilities of next-generation TV are. Finally, taking a closer look at content production, we interviewed the CTO at production company World of Wonder, responsible for the successful "Drag Race" format, and director of photography Eric Steelberg, known for his career with Jason Reitman and for the recent release of "Hawkeye" on Disney+. You will find all the technical features of their work in this magazine. Enjoy!

Editor in chief: Javier de Martín, editor@tmbroadcast.com
Creative Direction: Mercedes González, mercedes.gonzalez@tmbroadcast.com
Key account manager: Susana Sampedro, ssa@tmbroadcast.com
Administration: Laura de Diego, administration@tmbroadcast.com
Editorial staff: press@tmbroadcast.com
Published in Spain. ISSN: 2659-5966
TM Broadcast International is a magazine published by Daró Media Group SL, Centro Empresarial Tartessos, Calle Pollensa 2, oficina 14, 28290 Las Rozas (Madrid), Spain. Phone +34 91 640 46 43



SUMMARY

6 News
20 Riot Games: Project Stryker. Riot's great project to reach the world with esports. We spoke with Scott Adametz, Director of Technology at Riot Games, about each and every one of the capabilities of this incredible project.
32 The Switch. Spreading the feed of esports. We spoke to Charles Conroy, VP Gaming at The Switch, to ask him what a company like theirs has to offer in the esports landscape.
42 ATSC 3.0 – Next-Gen TV. Here is everything you need to know about the ATSC 3.0 protocol and the international organization ATSC from its president, Madeleine Noland.
50 World of Wonder. The technology behind the success of "Drag Race". We talk about how technology has changed over the years of the RuPaul format, and about the difficulties and solutions implemented in the wake of the pandemic.
60 Eric Steelberg, Director of Photography. "Virtual reality has a potential we cannot yet imagine"
68 Dubai Expo 2020. Exposing the world from Dubai
78 Test Zone: Apantac UE-4-II-K


NEWS - PRODUCTS

EVS launches Neuron COMPRESS for JPEG XS processing and PROTECT for content securing

EVS has recently released two new product lines and added several software capabilities to its Neuron Network Attached Processor. Built on an IP backbone, and providing SDI connectivity as an option, Neuron is EVS' Media Infrastructure offering. This solution is designed as "the glue" for IP infrastructures and can perform real-time processing tasks in SDI, IP, or hybrid production environments.

The introduction of Neuron COMPRESS provides new JPEG XS encoding and decoding capabilities. The new addition is a high-density solution, offering up to 32 UHD or 64 FHD encoders/decoders in one rack unit. "By adding JPEG XS compression to EVS Neuron, EVS is setting new standards in terms of network performance and flexibility in various IP environments. It is a key enabler for the successful deployment of remote production workflows, providing low-latency transport of high-resolution signals for easy sharing between multiple sites, while preserving the quality of the video," commented Jean Pierre Nouws, Solution Manager at EVS.

Neuron PROTECT, on the other hand, protects broadcast networks from single points of failure, using intelligent, redundant, isolated (automatic) changeovers. These communicate with each other, meaning that if one fails, the other continues to output content on the network. Available within EVS' Neuron, the PROTECT line uses encapsulation standards like ST 2022 and ST 2110 and is capable of handling up to 16 channels of full HD or 4 channels of UHD, transporting these over two redundant 100Gb/s Ethernet connections. Optionally, Neuron PROTECT can be enhanced with an SDI I/O module. This adds 40 physical SDI HD-BNC connectors, allowing easy integration of existing SDI operations in an IP environment, acting as a bridge or gateway. PROTECT can be used as a multi-channel A/V-over-IP transceiver, A/V probe, up/down/cross converter, automatic changeover (ACO) and firewall.
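To put those density figures in rough perspective, here is a back-of-envelope sketch of why mezzanine compression such as JPEG XS makes 32 UHD channels per rack unit practical on 100Gb/s links. The 10:1 ratio below is an assumption for illustration (JPEG XS is typically run at visually lossless ratios in that general range), not a figure from EVS:

```python
# Back-of-envelope: uncompressed vs JPEG XS bandwidth for UHD p60 video.
# Illustrative numbers only; not taken from EVS documentation.

def uncompressed_gbps(width: int, height: int, fps: int,
                      bit_depth: int = 10,
                      samples_per_pixel: float = 2.0) -> float:
    """Approximate active-video bit rate for 4:2:2 sampling,
    i.e. on average two samples per pixel."""
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

uhd = uncompressed_gbps(3840, 2160, 60)  # ~10 Gb/s before protocol overheads
ratio = 10                               # assumed JPEG XS compression ratio
per_stream = uhd / ratio

print(f"UHD p60 uncompressed  ~ {uhd:.1f} Gb/s")
print(f"UHD p60 JPEG XS ({ratio}:1) ~ {per_stream:.2f} Gb/s")
print(f"32 such streams ~ {32 * per_stream:.1f} Gb/s, fits one 100GbE link")
```

Uncompressed, those same 32 UHD streams would need on the order of 320 Gb/s, which is why lightweight compression is the enabler for this kind of density.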



Bridge Technologies integrates SR Live Metadata into its monitoring solution VB440

Bridge Technologies has recently announced the integration of a new functionality to the VB440 that provides the ability to monitor and display SR Live Metadata within the VB440's GUI. The addition of SR Live Metadata capabilities represents yet one more expansion to the multitude of tools, standards and formats which the VB440 can not only monitor, but turn into meaningful and usable data for creatives to use on-the-fly in remote and live production settings.

SR Live Metadata can be embedded in SDI signals, MXF files, and IP networks, and by combining with compatible products, it is possible to simplify operations and prevent setting mistakes during conversion. Thus, the purpose of SR Live Metadata within the VB440 is to register the creative decisions and adjustments from the cameras during production, from which the data can then be checked to ensure the quality of signals and packet behaviours across the broadcast chain, from the initial camera output all the way along the chain through switches, encoders and other network components.

The focus of Bridge Technologies is to support Sony's SR Live for HDR technology to push the boundaries of HDR production. This sits at the heart of the VB440's mission statement, which is to eliminate the need for complex (and expensive) additional rack equipment and display monitors, and instead allow for a holistic suite of production activities to be undertaken through nothing more than an HTML5 browser.

"It's a privilege and a pleasure to be working with Sony in pushing forward their impressive and much-needed SR Live for HDR workflow by accommodating the use of SR Live Metadata within the VB440. And becoming an SR Live for HDR licensee simply makes good sense to us; we've been leading the way when it comes to forwarding IP workflows in the field of broadcast, and so aligning with other like-minded, high-level and progressive industry practitioners is always rewarding," said Simen Frostad, Chairman of Bridge Technologies.



NEWS - SUCCESS STORIES


Red Bee Media develops Together TV charity television streaming service

Red Bee Media has recently helped Together TV to launch its free-to-use streaming service. Together TV is known for content that inspires positive social change on its linear channels. The platform developed by Red Bee Media will ensure this content is handled with capabilities including content protection with just-in-time delivery of DRM (Digital Rights Management), geo-blocking, security and content management processes. The UK-based streaming service is now available through app stores and online for iOS and Android devices, on desktops and tablets, with smart TV apps to follow this year. As a community-owned charitable television channel, Together TV reflects its purpose-driven ethos throughout the streaming platform. It does this by connecting


the programming to genre-specific resources and campaigns to "Channel Your Inner Good". For example, those watching a gardening series will be encouraged to "Channel Your Inner Grower". Together TV's catch-up service will act as an extension of its linear TV channel. The programming will continue to include factual entertainment and thought-provoking documentaries, in addition to the original content created through the annual Diverse Film Fund initiative. The content catalogue will become richer and deeper with more fan favourites as the channel's programming becomes available to catch up and stream. Alexander Kann, Chief Executive at Together TV, explained: "We are proud to launch this brand-new service that allows our viewers to enjoy Together TV's inspiring shows and documentaries for free, anytime and anywhere

in the UK. In addition, we hope this platform will help viewers to feel good and connected to the world around them. We selected Red Bee Media based on the company’s service delivery expertise, market understanding and as a good company to work with. We are delighted to have worked so closely with Red Bee Media and Nowtilus who have both worked tirelessly to help us reach this soft launch.” 



ARD relies on CGI's dira!, OpenMedia and Viura solutions for newsroom workflows

The newsroom and radio system provider CGI has recently announced an agreement with Germany's public service broadcaster group, ARD. The objective of the contract is to deploy media management and planning systems throughout the ARD's nine regional broadcasters. The framework covers deployments of dira!, OpenMedia, and Viura. The agreement was signed at the end of 2021 between Norddeutscher Rundfunk and Bayerischer Rundfunk, ARD's lead buyers for the dira! and OpenMedia products, and CGI Deutschland, and covers key legal, procedural and economic data for the long-term deployment, operation, and expansion of CGI software solutions.

The dira! product family covers workflows required for media management in radio journalism, from the production of broadcasts and the exchange of materials and programs, to broadcast planning and actual broadcasting, to post-production and archiving. The newsroom system OpenMedia is a tool that helps with researching, planning and producing newscasts, magazines and sports formats. Viura is a software solution for intelligent camera control in the broadcast studio. The tool can be integrated into different playout and newsroom environments, regardless of manufacturer, and is based on the selected video hardware and CGI's control software.

"We are delighted to have signed this extensive framework agreement with the ARD, which confirms our strong partnership and provides the basis for long-term planning security on both sides," commented Thomas Roth, Senior Vice President & BU Leader at CGI. "As well as providing the ARD with increased efficiencies in the buying process, it provides the basis for increased standardisation efforts in the ARD community and will potentially enable its members to gain further powerful workflow benefits via the common use of system resources."



TVUP Media Telecom relies on Ad Insertion Platform's ad services for its OTT Tivify

Spanish company TVUP Media Telecom, a provider of TV services through its OTT platform Tivify, has selected Ad Insertion Platform (AIP) to monetize its advertising inventory across its network. This will be done through its SSAI platform, DAIConnect. TVUP's OTT platform, Tivify, offers free access to more than 130 streaming channels, including some of the most popular in Spain, as well as a selection of free ad-supported streaming TV (FAST) channels. It enables distributors to offer a wide variety of TV channels and premium content to their customers, including advanced functionalities such as apps and games. The platform has an app for all screens: television, cell phones and tablets, and WebPlayer.

AIP provides a 360-degree monetization service within the TVUP interface by programmatically selling video ad inventory to AIP's demand partners and enabling server-side ad insertion (SSAI). The combination of AIP's ad serving capabilities and the ability of its SSAI technology to insert dynamic ads into video content on TVUP's FAST channels will increase the value of the platform.

"This year we will expand our FAST channel strategy to offer more free entertainment options to our Tivify users," said Eudald Domènech, CEO of TVUP. "Thanks to AIP's programmatic advertising sales services, we can increase our advertising revenue and maximize fill rates. With AIP's DAIConnect, we have the most advanced SSAI solution on the market, allowing us to seamlessly stitch video ads within our complex digital infrastructure."

"We look forward to working with TVUP to implement a 360-degree solution for their OTT platform, ranging from technical implementation to monetization of their entire inventory," said Laurent Potesta, CEO of Ad Insertion Platform. "Large and ambitious players like TVUP are increasingly turning to AIP to help improve their advertising revenues."
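For readers unfamiliar with server-side ad insertion, here is a minimal, generic sketch of the manifest-stitching idea behind SSAI products of this kind. It is illustrative only and does not reflect DAIConnect's actual API; the function and file names are hypothetical:

```python
# Generic server-side ad insertion (SSAI) at the HLS manifest level.
# A simplified illustration of the concept; not AIP's DAIConnect API.

def stitch_hls(manifest: str, ad_segments: list[str], every_n: int = 3) -> str:
    """Insert ad segment URIs after every N content segments,
    marking each splice point with a discontinuity tag."""
    out, content_count = [], 0
    for line in manifest.splitlines():
        out.append(line)
        if line and not line.startswith("#"):       # a media segment URI
            content_count += 1
            if content_count % every_n == 0:
                out.append("#EXT-X-DISCONTINUITY")  # timeline/codec break
                for ad in ad_segments:
                    out.append("#EXTINF:10.0,")
                    out.append(ad)
                out.append("#EXT-X-DISCONTINUITY")
    return "\n".join(out)

content = "\n".join(["#EXTM3U", "#EXT-X-TARGETDURATION:10"]
                    + [f"#EXTINF:10.0,\nseg{i}.ts" for i in range(6)])
print(stitch_hls(content, ["ad_break/ad1.ts"]))
```

Because the ads are stitched into the stream on the server rather than requested by the player, they are far harder to block and can be targeted per viewer, which is the commercial point of SSAI.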


SRG and Broadcast Solutions develop Commentary Control Room integrated into OB vehicle

Swiss public broadcaster SRG SSR has recently developed, jointly with Broadcast Solutions, a Commentary Control Room solution. The companies have created a solution incorporating the necessary technology and various connection possibilities into a compact vehicle. Equipped with four-wheel drive, air conditioning, UPS, cable drums and equipment, the new production tool, called Cosmo, weighs less than 3.5 tonnes and packs the maximum of workflows and equipment into the smallest possible space. The vehicle offers teams on location more flexibility to run parts of the programme independently. On the technology side, compatibility with SRG production formats (audio/video/IP) and connection to the unilateral broadcasters' home studios (Internet, WAN, stadium) are of the highest importance.

Cosmo is designed, primarily, as a mobile commentary control room that collects commentary signals from venues, processes them and then forwards them to OB vans, SNGs or data lines. However, the vehicle can also be used as a technical operation centre. In this case, the focus is on receiving and distributing audio and video signals to broadcast networks, WAN connections or IT networks and the Internet.

The OB vehicle's interconnectivity is ensured via a Riedel MediorNet MicroN system plus a Net Insight Nimbra solution. The audio/video network is implemented via MediorNet and provides connectivity and signal distribution in the broadcast network. Three remote audio/video stageboxes can be connected to the vehicle for signal pickup. Riedel's decentralised router solution processes a maximum of 72×72 HD-SDI video signals, including the corresponding audio tracks (embedding and de-embedding), with MicroN units in the vehicle and the stageboxes. Net Insight's Nimbra system provides signal transmission via the Internet or a GbE line. The audio network rests on the Dante format, managing up to 220×228 audio signals. All audio signals coming from the commentary stations are collected, controlled and processed in the vehicle. For this purpose, a DHD mixer plus software panel, effects, and monitor are available. A monitor wall and video measuring devices are in place for monitoring the video signals.



NEWS - BUSINESS & PEOPLE

SoftIron expands Technology office to broaden its patent portfolio and contribute to the open source community

SoftIron Ltd. has recently announced the expansion of its CTO office as it realigns internally and expands into new engineering roles. This development reflects the company's ambition to expand its patent portfolio and intellectual property while deepening its engagement with the open source communities through strategic engineering initiatives. Quoting the press release, "the company is looking for engineers who are passionate and dedicated to the open source ethos and to developing sophisticated software


that makes a meaningful impact." The first two projects that will benefit directly from the work of this expanded engineering team will be the software-defined storage project Ceph and the software-defined data centre switching project SONiC.

"SoftIron's commitment to open source is fundamentally about our commitment to the needs of our customers, where scalability is a critical vulnerability for them as they invest in sizable infrastructures that can lock them into untenable situations," said Van Alstyne, Chief Technical Officer for SoftIron. "We see open source as a way to offer flexibility and the freedom to scale for our customers while ensuring that their needs are at the center of our own development. Rather than adding more responsibility to existing engineers, we envision a role in which internal engineering advocates can spearhead deeper product integrations that directly respond to our customers' needs. These feature developments will ultimately become valuable contributions to the code of the respective projects they are involved with."



Cobalt Iron patents machine learning authentication control optimization for its Compass platform

Cobalt Iron Inc. has recently announced that it has been granted a patent on its technology for machine learning (ML) optimization of authentication control. The patent describes new capabilities for Cobalt Iron Compass. The platform already has the ability to optimize user authentication and access to IT resources dynamically based on what's happening in the environment. This new patent extends that capability by applying new ML techniques, adding yet another layer of intelligence and security to enterprise IT resources and operations. These new patented ML

techniques continuously improve authentication controls over time, learning from the results of previous controls. What's more, the technology automatically adjusts authorization controls based on conditions, events, project status, access activities, etc. This eliminates the pervasive security risks of outdated and unresponsive authorization controls and makes the entire IT infrastructure more secure and smarter. The techniques disclosed in this patent are:

– Collecting training data, including environment event data, user permission access patterns, access control duration data, security events and alerts, project data, cyber event information, security event logs, data protection operational results, and the like.
– Analyzing training data to determine the effectiveness of authentication controls during past conditions and events.
– Generating ML rules to potentially adjust authentication controls during future conditions and events.
– Monitoring various conditions and events, including environmental events.
– Dynamically adjusting user authentication privileges based on generated ML rules.
– Modifying the duration of user authentication privilege settings in response to generated ML rules.
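As a thought experiment, the control loop those claims describe might look something like the sketch below. Every name, signal and threshold here is invented for illustration and greatly simplified; this is not Cobalt Iron's implementation:

```python
# Hypothetical sketch of a dynamic authentication-control loop.
# All names, signals and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class AccessEvent:
    user: str
    resource: str
    off_hours: bool       # condition/event data feeding the model
    failed_logins: int    # recent security signal for this user

def risk_score(event: AccessEvent) -> float:
    """Toy stand-in for the learned rules: combine a few signals."""
    score = 0.2 if event.off_hours else 0.0
    score += min(event.failed_logins * 0.15, 0.6)
    return score

def adjust_privileges(event: AccessEvent, base_ttl_min: int = 480) -> dict:
    """Dynamically tighten authentication controls as risk rises."""
    r = risk_score(event)
    if r >= 0.5:          # high risk: deny and demand re-authentication
        return {"allow": False, "require_mfa": True, "ttl_min": 0}
    ttl = int(base_ttl_min * (1.0 - r))   # shorter privilege duration
    return {"allow": True, "require_mfa": r >= 0.2, "ttl_min": ttl}

print(adjust_privileges(AccessEvent("ops1", "backup-target", True, 2)))
```

The point of the patented approach, as described, is that these adjustments are re-learned and re-applied continuously rather than configured once and left to go stale.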



ESPORTS

Riot Games: Project Stryker
Riot's great project to reach the world with esports

League of Legends is a game that has created a huge community around the world. Its players number in the millions and its fans in the billions. This community is hungry for content, and Riot Games knows it. How can this company serve and offer content 24 hours a day, 7 days a week, to such a huge number of users? Adding more challenges, how do you do it when you also multiply this community exponentially by adding more games like Valorant or League of Legends: Wild Rift? This is how Project Stryker was born. The idea was born to bring scalability, simultaneity and professional features to all events taking place in any corner of the world, remotely and without the need for on-site hardware. We spoke with Scott Adametz, Director of Infrastructure Engineering, Esports at Riot Games, about each and every one of the capabilities of this incredible project.


We have seen that the esports industry is here to stay. In fact, it has only grown. Has the form of esports broadcasting grown in parallel with the growth of esports?

Funnily enough, I came from traditional sports. We're rather new to the esports industry. I've been here at Riot almost five years, and when I first joined I realized how many parallels there are between traditional sports and esports. While the content's completely different, the backend production techniques, the goals of telling the stories, it's the same. With that being said, I think esports has come a long way on its own, figuring it out as it went. This is something that's pretty unique. Some of the most amazing innovations have actually come out of esports. That is why I'm still part of this industry, because this is where the innovation is happening.

Why does the esports industry have this innovation power?

Just think for a moment about the purpose of this whole entity. Project Stryker is here to delight fans, to bring joy to billions of fans around the world. Everything we do is about delivering an incredible experience to our fans, viewers and players. That's Riot's mantra. When you have that as your north star, you find new and innovative ways to deliver new content experiences, new ways to produce content that are more efficient and allow us to create a lot more of this content. It's not about innovation for innovation's sake, and it's not just about the broadcast and the technology behind the scenes. It's about why that technology needs to exist for incredible experiences to be produced.


What is the origin of Project Stryker?

This was years ago, before the pandemic, if we can all remember what that was like. I remember being in our Los Angeles Campus [Riot Games Los Angeles Campus], and I was watching two engineers on the team that I supported playing a game that had not been released. This was a pre-production version of what would become our game called Valorant. They were completely invested in this game. They had been playing for five or six hours. I was curious about what they were playing and why they were so invested in spending so much time in it. I watched them play for a bit and I could just see how emotionally invested and how excited they were, and frankly, how good the game was. At that stage, it got me thinking, "This is going to probably be a success." Riot had grown organically around a single game: League of Legends. And, after that, I kept thinking: "What happens if this game becomes even half


Scott Adametz, Director of Infrastructure Engineering, Esports at Riot Games. Copyright Riot Games.

as big as League? How would we create esports content around that? How would we delight fans with this additional game?” As an analogy, it’s as if FIFA suddenly added golf to its repertoire. It was groundbreaking. I went deep and started to come up with an idea of how we might service an additional title with all of the esports components behind it, the broadcast production

needs, and put together a pitch and went through the channels internally to say: "We need all this capacity to produce content around this new game". The rest is history because it was very quickly approved and we began working on what would become Project Stryker. What we have here is the first of three production facilities around the world



that will have the purpose of allowing us to remotely and centrally produce content for any number of titles, any number of sports, from any number of regions, and in an efficient way.

So these facilities are there to produce content related to several Riot games, aren't they? Project Stryker won't host any event, right?

Exactly, these facilities won't host the actual tournament. This facility in Dublin [the first one built] is a content factory that allows us to have amazing events around the world. What we're trying to do is become the backend service that allows all of our competitions to happen more efficiently and, to be honest, more cost-effectively.

THIS FACILITY IN DUBLIN [THE FIRST ONE BUILT] IS A CONTENT FACTORY THAT ALLOWS US TO HAVE AMAZING EVENTS AROUND THE WORLD.

What is this facility for?

This facility behind me is essentially a place to do as many simultaneous events as possible. It isn't a big sound stage with merch booths and hotdogs and popcorn. That's not what this is. This is the video control and the audio control rooms that would be behind the scenes, producing any number of content versions in every language possible. For example, a Brazilian team in São Paulo will be using infrastructure in Dublin to produce events there. Their equipment remains the same. They haven't moved; they're still in their control room. All we have done is give them access to very powerful equipment behind the scenes through a network so they can produce the same program with more features, with

higher production quality and without having to buy, build and maintain equipment in São Paulo. There are six production control rooms and six audio control rooms in this place just to start. And there is room to expand. That's the purpose of this facility.

How do you achieve this capacity? I mean remote access to powerful tech and simultaneity.

It all starts with the network. Riot has what's called Riot Direct. This has been one of the things that has set Riot apart from other game studios, and it was birthed out of necessity. It is a global ISP on the level of any of these network providers. We developed it in order to support the number of players playing League of Legends and give them all the best experience. We needed to take that traffic onto our backbone as early and as close as possible. That network is one of the most undervalued assets in Riot's repertoire. When we wanted to add Valorant we didn't have to start


from scratch. We were able to leverage what we had learned from running a massive game like League of Legends. The funny thing is that we approached them about adding video. This is something not many people know, but Riot has been doing remote productions for all of its big worldwide events for seven years. By putting that traffic into Riot Direct as early as possible, we can make sure that the video and audio get to where we're going to produce the event from, to the production control room, and then send the final signals to the site, or to YouTube or Twitch, or to any of our distribution partners.

What are the technical characteristics of Riot Direct? It’s undersea fiber all over the world connecting continents. We do points of presence at all the major operator hostings around the world where we pick up connections from local ISPs and say, “Hey, if you need access to any of our games, we can give you access to that on our backbone, and throw that traffic to us as fast as possible.” That does two things. It gets it off that operator’s network, so they can spend their time serving whatever it is to their customers, and it also allows us to make sure that that traffic is guaranteed end-to-end.

IT’S UNDERSEA FIBER ALL OVER THE WORLD CONNECTING CONTINENTS. WE DO POINTS OF PRESENCE AT ALL THE MAJOR OPERATOR HOSTINGS AROUND THE WORLD WHERE WE PICK UP CONNECTIONS FROM LOCAL ISPS.


Now the fun part is that the type of traffic being sent for the games is very similar to the type of traffic being sent for the video. Naturally, we are already using a network that is designed for this.

With the network, everything works remotely, right? What is the workflow? Do you receive the signals from the point where they originate and process them at this facility?

Stryker Dublin - Technical Operations Center. Copyright Riot Games.

Here's where it gets a little different. Let's just pick an example that's coming up: MSI, the Mid-Season Invitational. It's League's middle-of-the-year event. We'll be picking those signals up in Busan in Korea, and we'll be bringing them to the nearest Riot Direct point of presence to get them onto the network, and then from there, they don't actually come to this building. They go to a data center. What we did differently about building this facility (and we did it so that we forced ourselves to think differently and push the boundaries of what was technically possible) is that there is no equipment room in this building. We do not have video signals in this building. The idea around this facility is not to be where all of the equipment is. Let's put that in a data center where it makes sense. We do that for a couple of reasons. One, it forced us not to fall back on the old models of SDI, because the data center happens to be far enough away from us. That makes us think about how we could give these control rooms to regions. The other reason is that this possibility gives us the opportunity to expand production without having to build highly technical and complex data centers or equipment rooms in each of our regions around the world. They still get the benefits but they don't get any of the added complexity.

How did you scale up all this infrastructure without depending on physical facilities?

This is the first of three facilities. The idea here is not to build one giant facility to service the world.


Stryker Dublin - Production Room. Copyright Riot Games.



THIS IS THE FIRST OF THREE FACILITIES. THE IDEA HERE IS NOT TO BUILD ONE GIANT FACILITY TO SERVICE THE WORLD. WE'D ACTUALLY TRIED THAT AND WE REALIZED THAT ONE FACILITY ANYWHERE WASN'T GOING TO SOLVE THE PROBLEM.

We'd actually tried that in the past. Los Angeles is the hub of Riot Games. We realized that one facility anywhere wasn't going to solve the problem because we would overburden that one, we would create a single point of failure, and it would need to be massive. What we did instead was to follow the sun: we carved the world into three swaths of time. For each part, we could build one facility. Each would have to be a third of the size, but together they would be enough to satisfy the largest show. Our number was 18 simultaneous productions around the world, which is what Riot has traditionally done. That is 18 languages plus English.
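Sketched in code, that three-swath, follow-the-sun arrangement might look like the following. The facility names beyond Dublin and the exact hour boundaries are assumptions for illustration, not Riot's actual plan:

```python
# Follow-the-sun handoff, sketched. Facility names after Dublin and the
# hour boundaries are illustrative assumptions.
from datetime import datetime, timezone

SWATHS = [
    ("Dublin", 0, 8),        # owns UTC 00:00-07:59 (illustrative)
    ("Facility 2", 8, 16),   # owns UTC 08:00-15:59
    ("Facility 3", 16, 24),  # owns UTC 16:00-23:59
]

def active_facility(now_utc: datetime) -> str:
    """Return the facility holding the baton for a given UTC time."""
    for name, start, end in SWATHS:
        if start <= now_utc.hour < end:
            return name
    raise AssertionError("swaths cover all 24 hours")

print(active_facility(datetime.now(timezone.utc)))
```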

We said, "How could we do this?" We did not want to subject the facilities to the intensity of working around the clock, i.e. 24/7 in three eight-hour shifts per day. What we decided was that each of the facilities would be operational during daylight hours and then pass the baton to the next one. That way we are able to offer service 24 hours a day, seven days a week.

What technology can we find in your facilities?

Let's start at the data center because that's probably the coolest part. Everything in here is a network, which is routers, firewalls, etc. One of the biggest network players, obviously, is Cisco.


I approached them and said we're looking at a very large production facility and we have other needs in the space. We worked out a partnership, and that was what birthed this idea that they would be partners with us, not just a vendor or a manufacturer providing us gear. True to form, they have a dedicated, massively brilliant engineer who has helped us solve a lot of problems and avoid pitfalls common to massive 2110 networks, like PTP. It is an entirely Cisco or Cisco Meraki network, based on whether it's the video fabric or the infrastructure. In the production areas, the JPEG XS side is on Nevion. Since 2019, we've been testing with them to do JPEG XS. They were very early in having that codec available as a test preview, and they have performed admirably. We're using them today for all of our contribution feeds. Within the facility, it's a mix, actually. We're not beholden to any one broadcast vendor. We actually think that we should be able to support any, but they do need to align to standards. They need to be 2110 compliant and have a pathway to 2110-22, which is our hope to be able to keep compressed essence as our primary format.

What challenges did you find in developing the network?

We set out to make it not a Layer 2 network. We have some brilliant network engineers at Riot. We contacted them and said, "You may not understand video, but if you were to build a network and you had these requirements and it had to operate at this level of performance, what would you do?" They replied that they would make it fully routed, and all the areas and all the ports would be a Layer 3 subnet. The idea was to have total control. We communicated that to our transmission providers and they said, "Wow, well, I mean it's possible. It's very complex, but it's possible and it has a lot of advantages. If you are able to do this, we are able to work in that environment."
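As a toy illustration of what a fully routed fabric with centralized, per-request route arbitration implies (a theme Adametz expands on below), here is a hedged sketch. It is deliberately simplified; it is neither Cisco's IP Fabric for Media nor Riot's implementation, and all names are hypothetical:

```python
# Toy model of per-flow route admission with global state on a fully
# routed media fabric. Simplified illustration only.

class Link:
    def __init__(self, capacity_gbps: float):
        self.capacity = capacity_gbps
        self.reserved = 0.0

    def headroom_for(self, bw: float) -> bool:
        return self.reserved + bw <= self.capacity

class RouteArbiter:
    """Admits a flow only if every hop on its path has headroom,
    so an admitted flow can never oversubscribe a link later."""
    def __init__(self, links: dict):
        self.links = links

    def request_route(self, path: list, bw_gbps: float) -> bool:
        hops = [self.links[name] for name in path]
        if not all(link.headroom_for(bw_gbps) for link in hops):
            return False                 # refuse up front: deterministic
        for link in hops:
            link.reserved += bw_gbps     # book capacity on every hop
        return True

fabric = {"leaf1-spine1": Link(100.0), "spine1-leaf2": Link(100.0)}
arbiter = RouteArbiter(fabric)
print(arbiter.request_route(["leaf1-spine1", "spine1-leaf2"], 1.2))  # True
```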

Stryker Dublin - Production Room. Copyright Riot Games.



Everything is deterministic. Every flow is where it is because we have designed it to be there, not because it just finds its way, builds its way freely, or has to remember where it goes. A single meeting point where everything is: that's not how it works. We let the network decide, and that's where Cisco comes in. Its IP Fabric for Media and its non-blocking multicast maintain the state of the entire network and make decisions every time a route is requested. We didn't realize that that hadn't been done very often. I think now that we've built it, we've realized its value. It's not without its challenges, but I think we've encountered new challenges, not the same ones that others have had to fight before. We've created our own nuance, which is great.

What is the broadcast control layer?

Right now we are using Grass Valley Orbit as our broadcast control layer. This would be the equivalent of a broadcast router control system. Below that, Orbit talks directly to the devices on the network, and that's where the actual routing tables are updated. That's the current functionality. It's kind of hard to take a legacy broadcast application and bring it into something new. We don't know if that's where we're going to be in a couple of years. However, Riot has a very strong software development team, and they have ideas. There's a world where we build something custom ourselves. For the initial release, it's Grass Valley Orbit. For our video switchers, it's all native 2110. As everybody knows, that just means they connect to the network right at the edge of the switcher; inside, they are still SDI-based switchers. We've left two full production control rooms in a state where we're going to try new things. That's all I will say. They are physically built. They have the same intercom, the same multiviewer, the same number of seats, the same basic infrastructure network and positions. We have populated them with a legacy contingent of transmission equipment. That's where we believe there is a power that, as Riot, we will discover over time.

How do you manage stored media?

The perennial problem that occurs when you develop facilities like this is that, of course, they will produce content, and what do we then do with that content? We have developed global content operations to acquire, enhance and enrich content with metadata and then make it available to all our partners around the world. That same content can be versioned to find new outlets or incentivize new markets. The content is enriched to be searchable. It is indexed through machine learning and artificial intelligence to extract data from it, including team names, plays, game server data sources and texts. We can go back and use it to create even more content in the future. It's been a challenge for Riot, because we wanted to do it centrally. If you don't do it this way, you're leaving it to all the sections to figure it out on their own. We wanted to solve it for everybody, and what we've achieved is that all the content that goes through Stryker goes into one big repository, which right now is running in the public cloud, available to everybody who needs it.


What are your next steps?

The Dublin facility is in its first shadow. Riot has a model of crawl, walk, and run. The crawl is during Masters 1, which is happening as we speak. This facility is in shadow mode, basically following along with the production from Iceland and creating our versions. It's a mix of an engineering shakeout, to make sure the facility is technically sound, but also a training opportunity: building up our operators' confidence in the facility and how they work, and then working out any kinks or bugs that we discover along the way. Our next step will be a walk. That's where we get to add to existing productions and create something that was not possible before. It won't be developing a complete show, because that would be running, but it will be something that Stryker is capable of. After that, the next step will be to run, and for that we will have to wait until the event that will be held at the end of the summer this year in Madrid. This event will be the first to be produced entirely with Stryker.



Stryker Dublin - Production Room. Copyright Riot Games.


The objective of this project is to serve content to a global audience. What are you developing to achieve this capability, and what is your vision for the future?

One of the objectives of these facilities is to offer content to fans in any language: wherever they live, whatever language they speak, they should be able to enjoy the content. In those rooms, potentially, a traditional switcher doesn't have the capacity we need to do the types of shows, nor is it smart to build 19 different rooms to do the same thing in a traditional way, just 19 different times in parallel. What we're going to do is around automation and cloud. We believe there is the possibility of building our shows in an ephemeral way that allows us to have 19 different graphics engines connected to the same show, all working from the same rundown, but with 19 different outputs, and potentially 19 different casters broadcasting the same show from all over the world, aggregating that content in the public cloud. In our view, this is the future of broadcasting.
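As a rough sketch of that one-rundown, many-outputs model (all names and languages below are hypothetical; this is not Riot's tooling):

```python
# One rundown cue driving many language-specific outputs, sketched.
# Hypothetical names throughout.
from concurrent.futures import ThreadPoolExecutor

LANGS = ["en", "es", "pt-BR", "ko"]   # stand-ins for the 19 outputs

CAPTIONS = {
    "match_start": {"en": "Match start", "es": "Comienza la partida",
                    "pt-BR": "Início da partida", "ko": "경기 시작"},
}

def render(event_id: str, lang: str) -> str:
    """Each 'graphics engine' renders the same cue in its own language."""
    return f"[{lang}] {CAPTIONS[event_id][lang]}"

def fan_out(event_id: str) -> list:
    # Every engine consumes the same rundown event, so the outputs stay
    # in lockstep instead of being N independent re-creations of the show.
    with ThreadPoolExecutor(max_workers=len(LANGS)) as pool:
        return list(pool.map(lambda lang: render(event_id, lang), LANGS))

print("\n".join(fan_out("match_start")))
```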


Spreading the feed of esports

We spoke to Charles Conroy, VP Gaming at The Switch, to ask him what a company like theirs has to offer in the esports landscape. The Switch is dedicated to the infrastructure behind the transmissions. As specialists in the esports arena, they have collaborated with major international competitions such as the Overwatch League, the Call of Duty League or BLAST Pro Series. Their job is to bring the excitement of competition to any place in the world, whatever its characteristics. The esports audience has grown significantly during the pandemic and international interest has only grown. A curious fact is that betting on esports grew during these times. This statistic is always indicative that a particular sport is on the rise. Those who contribute to the distribution of the signals know a lot about bringing these disciplines to their current level of audience.


What is The Switch? What kind of services does it offer?

The Switch is one of the world's largest transmission providers. Our company has two arms. One is transmission. We work on some of the world's largest events, including the Academy Awards, the Super Bowl, the entire PGA Tour, and the NBA. We send that signal to wherever it needs to go. We've been doing this for 34 years. On the production side, our other arm, we produce events such as the Latin Grammys, red carpets, award shows and, of course, esports events. We have done these events for the last five years. We worked on The Game Awards, we worked on E3. We support In-gaming, the Overwatch League, the Call of Duty League, BLAST Pro Series, WePlay, and ESL.

What is your bond with esports? How did you get that connection?

This is a crazy story. I've been in esports for 17 years. I started as a Counter-Strike player


Charles Conroy, VP Gaming at The Switch.

when I was in high school and quickly realized I was not going to be a world champion. But I also realized that there was not much organization in the space. I put together one of North America's first professional teams and we travelled around the world representing the United States at various world cup events. DirecTV decided to come out with a TV show called "The Championship Gaming

Series". This was in 2007, so quite early. This was the first-ever global league that was fundamentally an esports league existing on three different continents. I ran the Dallas team there. They burned through cash too quickly, and frankly, 2007 was probably too early for esports to be on linear television, so that league folded. From that league, my good friend and then



business partner, Jason Lake, reformed Complexity Gaming. Jason and I ran Complexity together for nine years until eventually selling it to the Dallas Cowboys. At that point, I exited the company. I didn't know what my next move was, and I got a call from The Switch. They said, "Look, we happen to produce an esports event. We don't know a ton about this market, but it's something we want to be in." That was three and a half years ago, when The Switch launched the gaming vertical. I am in charge of the gaming business unit there. In those three and a half years, we grew The Switch from nothing in the gaming space to this great company in transmission.

What kind of technical infrastructure do you have to produce your own esports events? Do you rent or own it?

It's a mix. Because we do global events, we rent a lot of fiber, but because of our scale and the events we broadcast, we can get fiber

THE SWITCH IS ONE OF THE WORLD'S LARGEST TRANSMISSION PROVIDERS

at a very preferential rate. Also, our hardware, such as our encoders and decoders, and our studios and control rooms, are owned by us. Most of the hardware we own; in terms of fiber lines, we have partners around the world.

Could you explain to us what differences in broadcasting you have found between esports and regular sports such as the NBA or NFL?

Broadcasting esports compared to the NFL or NBA is entirely different. I think the biggest difference is that there are no physical limitations within a video game. Obviously, in the NFL, you can't capture a variety of angles due to ball interference and just physics, whereas in esports, you can capture everything. I truly believe esports and gaming are going to evolve broadcast technology, because we can do more with it. All the other things that are new in the mainstream broadcast world, like all these OTT solutions, have been around in esports for 10 years. Esports is by far the most cutting-edge factor in broadcast technology right now.

Is there a possibility of transferring these capabilities you have mentioned, which give esports broadcasts their essence, to the traditional sports world?

Yes. If you take the example of, say, European football, it would require every player to wear a camera and it would require that camera to remain steady all game. If you take the example of American football, you put the camera in the helmets, but just by sheer impact,


those cameras break. It's been tried before.

Is there a possibility for on-player cameras to be developed?

100%, but every attempt so far at capturing player angles has not been successful.

What do you think about offering another kind of experience relying on new technology such as virtual reality?

There are a lot of opportunities to take. The only thing that'll be hard to capture is going to be the player views, the first-person views. Right now, putting a camera on a player is just too hard. However, you can capture about 90% of it through drones and other techniques like that. Then, to your point, I think virtual reality is a natural next step for viewing both traditional and electronic sports. Let's say you're courtside at the Knicks. That ticket is a $5,000 ticket. For $5 you can put on a headset and be courtside at the Knicks, and look to your left and look to your


right and have a courtside experience. I think for sure virtual reality is the next step in broadcast viewership, and esports is taking it to the next level. You’re not just courtside at a Knicks game, you’re in the game. You have full control to basically fly around like Superman, for lack of a better term.

What is your particular broadcast pipeline in a specific event?

That totally depends on what takers have bought the rights to the event. For example, let's take our relationship with BLAST Pro Series. They send the event signal out to 32 different takers that are both linear and digital channels around



the world. Our job there is to take the video signal and send it out. It may be ingested differently on a German TV channel than it is on Twitch. It’s a completely different signal. Our job is to encode and decode that signal in real time. For example, if you’re watching an event on linear

I THINK VIRTUAL REALITY IS A NATURAL NEXT STEP FOR VIEWING BOTH TRADITIONAL AND ELECTRONIC SPORTS.

TV in Spain and someone is watching it on the internet in Germany, we need you both to get that video signal within milliseconds of each other, just for integrity purposes. That's a big part of our job as far as delivering the video signal. On site, bringing in private internet lines to carry the signal in and out is incredibly important. A lot of venues, even major stadiums, have really poor internet, and it's not strong enough to guarantee 100% uptime on a video signal. Another thing that I think The Switch is incredibly good at is that we can create and build internet lines pretty much

anywhere. With enough lead time, if you pick a warehouse in the middle of nowhere in England, we can get a proper video feed out of that. We just have to build the fiber.

Apart from the fiber provision, do you provide other technological infrastructure at these events?

We can. We actually produced the entire Latin Grammys in the cloud instead of in a physical control room, through our MIMiC product. MIMiC is a control room in the cloud, so you can cut a TV show without everyone needing to be in one place. For example, your director can be in Sweden and your graphics guy can be in Ecuador and your audio guy can be in Miami. Cloud-based production, obviously, especially during COVID, was a huge selling point for us. Then, being able to remote-produce things from our control rooms, like what we did for the PGA Tour, was also very advantageous, because with the


PGA Tour, you travel all around the country, and you don't want the producers getting COVID. Being able to work out of our control room and produce the entire thing from Burbank was crucial to getting the PGA Tour done last year. From an esports perspective, we offer the same full suite of services. Some people choose to use us just for internet, whereas other people choose to use us for the entire production, in which case we would staff directors, camera guys, audio people, an entire production from start to finish.

What about advanced graphics creation in your events? How do you implement techniques such as augmented reality? Does it have a place in esports?

Yes. I think one company that is very good with augmented reality is a company called WePlay, from Ukraine. They've done a great job introducing augmented reality into their events. It's worth noting that that's not what we do at The Switch, but if a client has augmented reality capabilities, obviously, we can support that. I think with a virtual reality signal, because it's typically 360 cameras getting images, it takes a lot more bandwidth on the internet to send that signal. We have that bandwidth. We can put up to 10 gigs redundant, that's 10 gigs by 10 gigs, in one place to get all those camera signals out. That wasn't possible 10 years ago. The way we support AR and VR is by providing enough bandwidth so that that signal can get out of the venue and get to its destination.

Has esports broadcast grown at the same time and in the same way as esports?

Yes, I think the numbers and the audience numbers of esports speak for themselves. They're bigger than professional basketball, they're bigger than Major League Baseball. They continue to grow every day. The viewership is better than that of many major sports leagues and is only trending upward. Yes, as far as what you're saying, the audience for esports is growing and people are trying to figure out how to capture those eyeballs, because young people don't read newspapers, they don't watch traditional TV. It's a very difficult audience to capture from a consumer perspective. They're also a desirable audience, because if you look at the average revenue for an esports event, it's higher than for most other sports. These are people you want to engage at a young age to build brand loyalty, but how do you reach them? You don't reach them through a radio ad, you don't reach them through a magazine article. This is where their eyeballs are and this is what people are trying to reach, and that's what makes the viewer demographic incredibly attractive to many advertisers.

How will the esports broadcast industry grow?

I think the esports broadcast industry is going to grow as technology grows. We have augmented reality and virtual reality, which are the next frontiers. Esports has been very quick to embrace both of those. It'll be interesting to see how those affect broadcast tech on a greater level, like leading into traditional sports. I gave you the courtside-seat example at a Knicks game earlier. I think esports will continue to be a leader in technology when it comes to broadcast. We'll set trends and people will follow them. As esports becomes adopted into the mainstream, we will have the opportunity to showcase those technologies. I'm very excited about it.

How can you get their attention?

Interactivity is what esports fans want. They want to be able to interact as much as possible. Twitch has given them the ability to do that by being able to live-chat during games. You can't really do that during an NBA game, or an NFL game. The more interactive they make broadcasts, the more the esports fan base will respond.

How can broadcast evolve into this interactivity, apart from a platform like Twitch?

There are a lot of ways to make broadcasts more interactive. The easiest example is that they're opening up esports stadiums where some of the seats are going to have screens in front of them. You'll be able to choose which angle you want to watch the game from. You can't do that at an NFL game. If you watch an NFL game on TV, you're looking at the camera angle they're offering you. The camera angles and broadcasts in esports break all those boundaries. You can see the players in first person to see what they are experiencing as they are making plays.


I think what sets esports apart is the ability to interact with that level of control for people who are naturally fans of interactivity and who, frankly, have a shorter attention span or demand higher technological standards. It's something you can't do in traditional sports.

We assume that esports production, as you mentioned, always relies on remote production workflows, but has the pandemic changed these workflows in any way?

I think the pandemic changed workflows for both traditional sports and esports. In fact, there were no traditional sports on for over a year, and in that time, esports viewership exploded because it was quite literally the only thing to watch. When sports were gone, esports were there, and that's because they can exist in a virtual environment where people don't have to be in the same room. I think that the pandemic forced a lot of companies to examine their workflows and explore remote workflows.

Does the traditional sports broadcasting industry have the opportunity to look at and learn from esports production workflows?

Yes, absolutely. I think traditional sports broadcasting can learn a lot from esports. I also think, frankly, esports can learn a lot from traditional sports broadcasting. The two worlds don't talk to each other enough. What's been really interesting over the past few years is seeing people that produce things like NASCAR and the NFL coming over to esports and realizing how difficult it is, because you have so many more camera feeds to deal with. You're not talking about three to six stationary cameras. You're talking about 32 cameras. It's a harder show to direct. With that said, there's a level of storytelling in, let's say, the NFL that esports still needs to learn from.

How can this collaboration take place?

I think it's already happening. Games like Madden use NFL producers that are now running shows with esports producers. The more collaboration there is between the two worlds, I think the stronger they'll both become. People are already collaborating. I encourage way more of it. I don't think there should be a barrier between the two worlds.

What will be the esports-related future of The Switch?

I think we've shown that we're here to stay. I think we've made a pretty big impact in the space by bringing the skills we've gained from 34 years in traditional sports, in an authentic way, into esports, not trying to change workflows but trying to enhance them. We're going to grow, to specifically answer your question, by continuing to be adaptable and supportive to our clients. We're going to be able to fix difficult problems that very smart people may have, and that's something we're very good at.



STANDARDS

ATSC 3.0 – NEXT-GEN TV

ATSC is an international organization that seeks to find the best way to transmit the television signal. So far, they have succeeded with the introduction of next-generation television and the ATSC 3.0 protocol. Its implementation is mainly in the Americas, and many U.S. cities are already enjoying the next generation of television. Here is everything you need to know about the ATSC 3.0 protocol and the international organization ATSC from its president, Madeleine Noland.



Madeleine Noland, ATSC President.

What is ATSC and what are its objectives?

The Advanced Television Systems Committee, Inc. is an international, non-profit organization developing voluntary standards for digital television and multimedia broadcasting. The ATSC member organizations represent the broadcast, broadcast equipment, motion picture, consumer electronics, computer, cable, satellite, and semiconductor industries. We are defining the future of television with the ATSC 3.0 next-generation broadcast standard. Our mission is to create and foster implementation of voluntary Standards and Recommended Practices to advance terrestrial digital television broadcasting, and to facilitate interoperability with other media.

What activities does ATSC promote to achieve its objectives?

ATSC develops voluntary standards and recommended practices. To achieve this, ATSC has three types of groups, each with its own purpose and charter: Technology Groups, Implementation Teams, and Planning Teams.

Technology Groups are formed to draft technical documentation, including Standards and Recommended Practices. These groups evaluate technologies for inclusion in ATSC Standards and Recommended Practices and draft these documents. They may also develop Technology Group Reports. Technology Groups focus their discussions on technology, relying on other groups for any exploration of a given topic beyond the technical facets. Technology Groups are open to all ATSC members. They can create sub-groups, including Specialist Groups, which in turn can form Ad-hoc Groups.

“These teams prove out ATSC technologies in the field via demonstrations, field trials and/or proof-of-concept implementations.”

Implementation Teams are formed to provide a venue for industry discussions related to implementation of ATSC Standards. I-Teams may address business, regulatory and technical requirements for successful roll-out of ATSC Standards. I-Teams do not draft Standards or Recommended Practices; however, they may create Implementation Guides.

Planning Teams are groups designed to study a given topic, often prior to ATSC starting any technical effort. They are formed by and report to the ATSC Board of Directors. Open to all ATSC members, Planning Teams are free to explore a range of facets behind a topic, including industry impact, technical viability, technology maturity, and more. Planning Teams study emerging technologies that may benefit the broadcast ecosystem and/or work to develop new verticals for the broadcast industry.

How do your sponsors and partners contribute to the development and implementation of your standards?

ATSC members are the driving force behind all that ATSC does. They work together to develop consensus-driven solutions for the broadcasting ecosystem. Many of our members are also sponsors, whose generosity makes much of what we do possible. The organization has several sponsorship levels. The top level is Platinum, comprising Gaian Solutions, LG, Pearl, Samsung, Sinclair Broadcast Group, and Zenith. The Gold level is composed of Ateme, Crown Castle, Dolby, Nielsen, and Sony. Next is the Silver level, formed by American Tower, Comark Communications, DekTec, DigiCap, Eurofins, GatesAir, NAB, SCTE, and Triveni Digital. And finally, the Bronze sponsors are DTV Innovations, Enensys, Fincons Group, Ganiatech, Hewlett Packard Enterprise, BTS, Nagra Kudelski, PMVG, Rohde & Schwarz, Thomson Broadcast, TitanTV, and VBox Communications.


What is ATSC 3.0 designed for?

ATSC 3.0 is a second-generation digital broadcast standard. It is designed to modernize television broadcasting and also to open new business models and verticals to broadcasters. As the world’s first IP-based broadcast system, it is capable of delivering high-quality television services and any other type of data. In addition to 4K UHD, HDR pictures, advanced audio, web-based interactivity, advanced emergency messaging and more, the system can also deliver software/firmware updates to IoT devices, map updates to vehicles, distance-education study materials, etc. It is a new IP data delivery system for TV and more.

What is this standard capable of? What are its characteristics? How does it work?

ATSC 3.0 has a new physical layer that is currently the world’s most efficient one-to-many data broadcasting system. It can send data to fixed or mobile devices. The IP backbone allows any type of data to be delivered over the air by a broadcaster, and the efficiency of the system allows broadcasters to deliver television services and data services at the same time. The IP backbone also enables hybrid OTA/OTT use cases, such as delivering mainstream content OTA and auxiliary content OTT. For example, a broadcaster may deliver the dialog track in several primary languages OTA and several other languages OTT. The audience can select which dialog track to listen to, and the receiver can merge that track with the video.
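That hybrid dialog-track scenario is easy to picture in code. The sketch below is not the ATSC receiver API – the standard defines its own signaling – but a minimal TypeScript model, with hypothetical track data, of how a receiver might prefer the viewer’s language and fall back to an over-the-air track when no matching option exists:

```typescript
// Hypothetical receiver-side model of ATSC 3.0 hybrid audio selection.
// Track names, types and the selection policy are illustrative only;
// a real receiver would follow the signaling defined by the standard.

type Delivery = "OTA" | "OTT";

interface DialogTrack {
  language: string;   // e.g. "en", "es", "fr"
  delivery: Delivery; // broadcast (OTA) or broadband (OTT)
  url?: string;       // broadband location when delivery is "OTT"
}

// Prefer the viewer's language; fall back to the broadcast default.
function selectDialogTrack(
  tracks: DialogTrack[],
  preferredLanguage: string
): DialogTrack {
  const match = tracks.find((t) => t.language === preferredLanguage);
  if (match) return match;
  // Fall back to the first over-the-air track, which is available
  // even without an internet connection.
  const ota = tracks.find((t) => t.delivery === "OTA");
  if (!ota) throw new Error("no over-the-air dialog track signaled");
  return ota;
}

// Example: primary languages OTA, an additional language OTT.
const signaled: DialogTrack[] = [
  { language: "en", delivery: "OTA" },
  { language: "es", delivery: "OTA" },
  { language: "fr", delivery: "OTT", url: "https://example.com/audio/fr.mp4" },
];

const chosen = selectDialogTrack(signaled, "fr");
console.log(`mixing ${chosen.language} dialog (${chosen.delivery}) with OTA video`);
```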

How does ATSC 3.0 achieve better audience measurement, and how can this standard target a specific advertisement to a specific audience?

The interactive system of ATSC 3.0 is based on web technologies: HTML5, CSS, JavaScript. For receivers that are connected to the internet, this web-based system can operate much like a webpage would, supporting both personalized content and audience measurement. The system enables opt-in/opt-out for these types of services.

How does ATSC 3.0 achieve bidirectionality and send personalized information to each television device?

ATSC 3.0 is a one-way system; however, it works in concert with the internet to provide an uplink capability that allows these types of services. For example, an ATSC 3.0 app running on a connected TV can utilize the TV’s internet connection to enable a wide variety of services.
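To make the opt-in and uplink ideas concrete, here is a small hedged sketch in TypeScript using only generic web APIs. The endpoint, event shape and class are all hypothetical – a real broadcaster app would run inside the receiver’s HTML5 runtime and follow the standard’s own signaling – but it shows the basic pattern: measurement events travel over the TV’s broadband connection, and only if the viewer has opted in:

```typescript
// Illustrative sketch of the opt-in gate described above, written against
// generic web APIs (fetch). The endpoint and event shape are hypothetical.

interface ViewingEvent {
  serviceId: string; // channel/service being watched
  contentId: string; // programme identifier
  timestamp: string; // when the event occurred
}

class MeasurementClient {
  constructor(
    private endpoint: string,
    private userOptedIn: boolean // set from the viewer's opt-in choice
  ) {}

  // Uses the TV's broadband connection (the "uplink") only when the
  // viewer has opted in; otherwise the event is silently dropped.
  async report(event: ViewingEvent): Promise<void> {
    if (!this.userOptedIn) return;
    await fetch(this.endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(event),
    });
  }
}

// Example usage with a hypothetical broadcaster endpoint.
const client = new MeasurementClient("https://example-broadcaster.tv/measure", true);
client.report({
  serviceId: "1.1",
  contentId: "evening-news",
  timestamp: new Date().toISOString(),
});
```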

What are your expansion plans and how do you aim to gain other markets?

ATSC is working hard to educate broadcasters worldwide about ATSC 3.0, and when a country or region expresses interest in learning more, our members respond to those inquiries and support deeper exploration of the technology. For example, India is exploring ATSC 3.0 for direct-to-mobile and MNO broadcast traffic offload scenarios. ATSC formed an Implementation Team to conduct technical demonstrations in Delhi to help with this exploration.

Why is the division between DVB, ATSC and other standards needed? Wouldn’t it be easier to standardize as much as possible?

A worldwide converged broadcast standard has been studied and can be held forth as a lofty, laudable goal which would include not only ATSC and DVB, but also our colleagues in Japan and China. While I cannot speak for ATSC, I personally hope that systems can gradually converge. For example, DVB and ATSC both have HEVC specified. This could be an area of “low-hanging fruit” where convergence is possible. There are other examples like this, which suggest that each element of the systems can be considered separately and evaluated for possible convergence on a case-by-case basis.

How can ATSC 3.0 incorporate technological advances that are about to be adopted, such as the use of 5G networks for multimedia content transmission?

ATSC members are studying this exact question through a technical Ad Hoc Group, TG3-11 on ATSC 3.0 – 5G Harmonization. TG3-11 is studying various possible convergence architectures and plans to report its findings to its parent group, TG3. From there, ATSC members may decide to begin a new project or take some other course of action.

What are your plans for the future and how can the standard be improved?

ATSC 3.0 was designed from the beginning to be extensible and evolvable. ATSC members have numerous ongoing projects to extend the standard. For example, TG3/S43 is working on a Core Network for Broadcast. Technology Group 3, with due regard for existing standards organizations and activities, develops and maintains international technical Standards, Recommended Practices and other documents for the distribution of television programs and other data using advanced terrestrial broadcast technology, internet and other transports. Technologies considered may be improvements to current systems or entirely new systems that are compatible or incompatible with current systems. As appropriate, TG3 may engage in activities to address implementation issues regarding ATSC Standards and other documents.

Another example is IT-5, the Tower Network Implementation Team. The goal of the Tower Network Implementation Team (TN-IT) is to design, implement, test, validate, and demonstrate the Inter-Tower Communications Network (ITCN) and In-band Distribution Link (IDL). IDL is a one-way distribution system that will provide the program feed to Single Frequency Network (SFN) towers in the manner of an STL (Studio-to-Transmitter Link). The ITCN may additionally distribute Broadcast Internet data. In-band full-duplex technology may be used on a portion of the broadcast spectrum to distribute TV programming and data, potentially unrelated to video content, where transmission and reception occur simultaneously on the same RF band. ITCN is designed to link all broadcast towers to form a Tower Communications Network (cluster) for control, monitoring, data communication, and localized datacast and broadcast services. The systems developed may utilize Artificial Intelligence (AI) to monitor, configure and direct process flow, bandwidth requirements and diversity schemes for ITCN and IDL, along with other technologies to enhance overall capabilities and to allow for future growth. Channel sharing and bonding may be considered.

Planning Team 4 on Future Broadcast Ecosystem Technologies is continually monitoring the changing technology landscape for new innovations that can benefit broadcasters. ATSC members are both the users of ATSC 3.0 and the stewards of its evolution.



PRODUCTION



WORLD OF WONDER

The technology behind the success of “Drag Race”

World of Wonder (WOW) is a Californian production company born in the 90s. It has always been focused on stories; audiovisual technology is simply the vehicle that has served to tell them as well as possible. Now, riding the crest of the wave, WOW is the company behind the world-famous “Drag Race” format. The phenomenon has crossed borders and the format has been reproduced in many languages. Here you will find the technology that has helped this content reach all parts of the world and inspire such devotion. The words below are the result of our interview with Tom Wolf, Chief Operating Officer of World of Wonder (WOW). We talk about how technology has changed over the years of the RuPaul format, about the difficulties and solutions implemented in the wake of the pandemic, and about the future and present technologies – such as virtual reality, OTT or HDR – that have already changed the way they create this successful content.



Tom Wolf, Chief Operating Officer of World of Wonder (WOW).

How was World of Wonder born?

WOW was born when Randy and Fenton, who met at film school, realized that there were stories they wanted to tell that weren’t being told. They have always been screen agnostic; the size of the screen doesn’t matter; it’s the dimensions of the story that counts. So they weren’t that bothered about making movies for the big screen, and they weren’t even worried about making network shows for television. Instead, they saw tremendous potential in cable and even in public access. Their first show was based on highlighting the punk and anarchic shows made by ordinary people for public access television.

Telling stories that other people don’t tell or don’t think to tell has always been part of the DNA of World of Wonder, and time and again those stories that might come from the margins have proven to be central to who we are as people. They made “The Eyes of Tammy Faye” in 1999, because they thought Tammy Faye was an important but marginalized and misunderstood character. Almost a quarter of a century later that documentary has been made into a feature film, also called “The Eyes of Tammy Faye”, and Jessica Chastain, who plays Tammy, has won a number of awards and is nominated for an Oscar as Best Actress.


Since 1991, technology has changed a lot. Have you experienced this technological growth and advances in your content production workflows?

I could talk about this for hours – investing in this space is really important to us. World of Wonder has always worked hard to be an early adopter of new technology in our production workflows. From using Avids in our online process in the 90s and quickly taking up Final Cut Pro, to using prosumer DV cameras in the early 2000s, and virtual sets today – from the outset we’ve really tried to be at the forefront.

For example, we created an all-digital workflow early on. When I started out, we had a dub room, and all the media would come in and be transferred to VHS, where we would watch the dailies at our desks; then in 2005 or 2006 we were one of the first companies to digitize all that material instead. We’ve always tried to push the boundaries, and a big part of that is owning and investing in our equipment so that we can be adaptable and nimble as a full production outfit.

Content is a priority for you (story/visibility/characters/talent), especially content that contributes to normalizing and recounting LGBTQ+ lifestyles, but is audiovisual technology also important in telling the story of these lifestyles?

It isn’t! The goal is for technology to be invisible – a tool to tell the story. The stories themselves are what make the content; everything else is a tool for the storytellers. It doesn’t really matter if we shoot in SD or HD or 4K; at the end of the day stories are going to persist. It’s why people still watch series from the 90s, even though they’re in SD: they resonate across time and generations. While tech is a big part behind the scenes, it doesn’t impact the storytelling. In fact, it’s really the other way around. We have a story to tell, and we go back and figure out how we’re going to tell it. Story leads always.

Do you own the technology you use in your projects?

We own as much as economically possible in pretty much every space – cameras, post facilities, a studio, the storage, editing machines and software – but where it makes sense and we have specific gaps, we rely on third parties and their expertise. Essentially though, the goal is to own our core equipment, and it always has been. We have over 5PB of on-prem storage, 35+ edit bays, and own most of the cameras and supporting tech we use. It’s unusual in the industry, but we invest heavily in the tech side because it’s efficient – we get multiple years out of the equipment, and it makes sense for us.

How was WOW Presents Plus born and why?

It came about for both practical and creative reasons. The traditional distribution model for US-produced TV just wasn’t keeping up with the global market and the expectations of viewers. Fans in Germany, for example, were sick of waiting months to see Drag Race. We felt (and still feel) that the best way to grow the fan base is to engage with them directly, ideally at the same time as the US fans. The other reason is that we knew we had creative ideas for shows and content that our fans would love, but that a traditional channel was unlikely to fund. It’s a wonderful way to flex our creative muscles, source stories and ideas, and reach our global fanbase. From a business point of view, the landscape of SVOD is dynamic and ever-changing – basically we see the major conglomerates and networks fighting over 90% of viewers.

For us, that remaining 10% is our focus, and it’s an incredibly valuable space, and one we celebrate in. We know there is room for people to have multiple subscriptions because they currently do. Subscription fatigue is exaggerated, and if you offer content that is valuable and exclusive, people will sign up. While it makes sense for the ‘big guys’ to fight it out for attention and numbers, our perspective is different. We know what happens when there are three channels on television, and that’s seeing marginal voices get pushed out. Three ‘people’ end up controlling the world and that becomes all there is, so there is always space for content that is more specific and fits the point of view that our audience has. Essentially, we don’t need to be as big as Netflix; we can service a lot of people and make those people really happy.

The RuPaul franchise is a very successful content production. What challenges have you encountered during development and what solutions have you implemented?

RuPaul’s Drag Race started in 2009 in SD, migrated to HD, and is now in 4K. So, from a tech point of view, the biggest challenge has been media storage and management. Moving all camera source material from all over the world to World of Wonder’s main storage in LA is one of the biggest challenges. We rely on SoDA and the cloud to transfer media during production, plus IPV’s Curator to process and track all our media.

What is the technical infrastructure needed to create the “RuPaul’s Drag Race” format?

This is a hard one to answer – every format has different needs, whether that’s down to what production is like locally, to what the expectations are of the crew and team in each territory, and to what scale the show is. There is a difference between Drag Race Italy and Drag Race Philippines – they have different expectations, so the structure is different. But what they all have in common is high production value. It’s all about quality of lighting and quality of camera, but from there everything changes based on the specific needs.

How has the pandemic affected your production company, and have you implemented remote working modes?

Remote working has fundamentally changed World of Wonder – and for the better. Overnight we had to transition an entire production from our LA offices to remote working, which we managed to do successfully. Some people are now back in the office, but most are still at home. All edits now happen remotely and there is a combination of different approaches: some are controlling machines in the office from home, some have media at home, some use IPV’s Curator for remote media workflows. It’s down to the individual setup. The real big change is we’re now hiring from anywhere! Prior to this, you had to live in LA – now, we have both a WFH policy and a ‘Work From Anywhere’ policy, and that’s going to continue to expand. Going from a workforce talent pool of 8 million people to 7 billion people is pretty powerful. For example, we have a show shooting in the UK, being edited in the US – but is it really being edited in the US? Three of the editors are based in the UK, and I’m not even sure exactly where – and it doesn’t even matter. It’s allowed us to be much more flexible. Early on, while we were figuring out the best workflows, there were of course teething problems, but it’s been a smooth transition. I think the biggest learning curve was the belief that a large media-based production company needed to be run centrally; that turned out to be a myth. When push comes to shove, there are solutions, and those solutions are not particularly expensive – our solutions were simple and successful. The only problems we have are the same as we’ve always had, which is computers crashing! The one thing I personally worry about is data piracy and data protection – that keeps me up at night – but we have control over that stuff; we just spend a bit more time on it now.

The major change on set is that the floor map is different – between season 12 and season 13 it became much bigger. It’s more physically spread out, and we have more individual trailers and lots of testing. That’s the key: testing regularly and being vigilant in that. But we’re starting to see some easing of guidelines; for example, we’ve gone from testing every day to about three times a week and only masking when you’re in close proximity to others. In terms of talent and production, you wouldn’t know any different – apart from the plastic/glass between judges. I may be cynical, but I don’t think it will ever go away – it’s certainly not over, and I think in some way restrictions will never be fully gone. Regulations will loosen up, but you’ve still got to build the safest set possible.

Let’s talk about the future of audiovisual technology. Today, content producers shoot their formats in Ultra High Definition as a base, but what about other advances such as HDR? Will they become standard formats? Is World of Wonder adapted to these standards?

The short answer is yes. I do think HDR will become standard and expected as a base level – we have adapted to that already and will continue to do so. Drag Race is shot in Ultra HD and HDR. The truth is, shooting and finishing in these formats has advanced faster and further than distribution. We can shoot in 4K and HDR, but the viewers watching it aren’t necessarily getting that yet. We like to be there early on these standards because our content lives a long time. We’re blessed that Season One of Drag Race is still interesting. The sooner you can move to a higher quality standard, the longer life your content has. We shot in HD between season 2 and season 13, and then in 4K from season 13 onwards, even though no-one had that tech yet. Season 13 will be 4K forever and will have longevity.

We see now, with older shows shot on film, that they can be remastered and look really good – which is not something you can do with digital media. It’s all about making your camera masters as high quality as you can, even if distribution doesn’t support it yet, because some day it will, and you will regret not doing it. The perfect example is Season One of Drag Race, which was shot in SD (because we had a low budget), and it’s something I have regretted every day since!

Today’s content consumption goes through streaming and video-on-demand platforms. What will happen in the future? What will happen to linear TV, and how do you see the market for OTT platforms in the future?

Just like radio, we think linear TV will have a long life after the ‘replacement’ technology takes over. There is room for both to happily exist. We’ll see linear-first companies continue to consolidate and add OTT services to their portfolio, but there will always be occasions where viewers just want to put on a marathon while they finish chores or relax in front of live sporting events. In terms of specific event TV – which Drag Race is absolutely a part of – I’m not convinced it only applies to linear. In the beginning, the show helped change the ‘bar market’ in America significantly. When the series aired on a Monday, it became the biggest night of the week for many LGBTQ+ hangouts because they were showing the series each week. When there is a winner or loser, I believe event TV is really compelling – however, we see from our viewership on WOW Presents Plus that the second the show drops on the platform, the numbers skyrocket; so that’s a clear example of SVODs being able to drive event viewing too.

5G and edge computing, virtual production environments, augmented and virtual reality, remote cloud-based production: what is World of Wonder’s roadmap for the future?

We use remote productions, we use cloud-based productions – we use these tools, but really, it’s a production, regardless of what tech stack or tools you use. It’s all about what’s best for the content. We will do augmented or virtual reality when we have a story to tell that needs it, not just because it’s cool. All our media hits something in the cloud at some point, but that’s been true for years. It’s not really about having a specific roadmap; it’s finding a solution to a problem in the most efficient way possible, and a lot of the technology around that is already being implemented.



DOP IN TV SERIES

Photo by Sebastian Vega.



ERIC STEELBERG

“Virtual reality has a potential we cannot yet imagine”

Eric Steelberg is an experienced cinematographer with a long career in filmmaking that started right out of high school. He grew up shooting on analog, which has given him a great respect for the old ways. Today, he is a professional who has shot in every possible format with directors like Jason Reitman – a friend since childhood, with whom Eric has a great cinematographic relationship – Marc Webb, Richard Glatzer, and Wash Westmoreland. We talked to him to find out how much technology influences his work, which camera setups and lenses he trusts the most, and what he thinks of revolutionary technologies such as HDR, virtual production and on-demand content consumption.




Who is Eric Steelberg and how did he enter the world of cinematography? I grew up in Los Angeles and my parents were fond of still photography. As a teenager, I took still photography classes in high school and spent many hours in the darkroom. I’ve always been fascinated by film and dedicated most of my free time to watching it. I don’t remember doing much more than that. I was obsessed. Some friends and I started a film class at our school and acquired film and video cameras of all kinds, from Super 8 to 16mm film and VHS. All my friends wanted to direct and produce, I was the only one interested in photography, so I was able to use all the cameras and shoot everyone’s projects. After high school, I applied to some of the more popular film schools, but they didn’t accept me. I decided to continue working in film on low budget projects, then commercials and finally feature films.


What was your progression like? Where did you start and how did you get involved in bigger productions?

Once I had been working in commercials for a handful of years, and doing additional photography on small movies, I had an agent and a reel of work that was decent. These were the days of film; there was no such thing as shooting digital yet. There were fewer DPs working than there are today because the job was more inaccessible. Once, I got a call from a producer who had seen a very, very small movie I had made on a brand new Sony digital camera – this was about 2003 – and she said she was producing a new movie in Los Angeles, also very small, and the two directors wanted to shoot it on the same new Sony digital camera, the F900. I read the script; it was wonderful and charming and I fell in love with it. We made that movie in 17 days and it went on to win the Grand Jury Prize and the Audience Award at Sundance in 2006. The film was called Quinceañera. From there I got some attention, and my director friend Jason Reitman – who I met when we were fifteen and with whom I had done a lot of shorts and commercials – told me he had a script called Juno that he was really excited about and wanted to send me. The massive success of that film launched my career as it is now, and it has never slowed down. Really, I’ve never sought out bigger productions; they’ve just ended up evolving into stories with bigger palettes.

Considering the beginning of your career and where you are now, what are the technological differences you have seen?

The undisputed difference is digital cinematography. It has been a double-edged sword. We were promised faster and cheaper, but we all know that is far from the truth in practice. Somehow, everything in camera is much more costly than it was when we were shooting film. Let me be clear though: I love digital, and my best work has been using digital cameras, but not because they were digital. A few years back I did a film called The Front Runner and we went back to shooting on 35mm. It was so fast. We did fewer takes and worked much faster. I think with digital there is this thing where everyone on set is over my shoulder looking at an image which is very polished and looks finished. But because of that, everyone also feels like they can comment on every little detail, including the light, focus, etc. It has become too democratic. With film, you could see a low-quality video that reproduced what the camera saw. That video was an approximation that did not have the same fidelity that we are used to now. As a result, everyone had to trust that the cinematographer was exposing correctly, that the operator was framing correctly, and that everything was in focus. They were trusted to do the job. Now, they are often second-guessed. Most of the time, I welcome the input. I consider my job and my personality to be very collaborative. I never know where a great suggestion might come from, and I encourage my crew to speak up if they have an idea. Some great shots have come from a dolly grip or an electrician with those kinds of ideas.




A cinematographer is prepared to work with any material. But there are always preferences. Of all the ones you have worked with, can you tell us which has been your favourite camera and lens choice and why? I get asked this often and the truth is every project has its own perfect combination. It’s a recipe. The same perfect recipe for a meal on one holiday isn’t the same for another many months away. I think it’s a good analogy. With every film or show I start from scratch testing cameras and lenses to see what feels right for this new narrative. I’m proud of my work on a movie called Labor Day; I think it may be one of my best. On that film I used the original RED Epic, Master Primes, Super Baltars, Fujinon Premier Zooms, and an old Cooke 20-60 mm zoom. And this was after much testing. It just seemed right blending those things and the result is great. But I have also never used that combination again, even though it was my favorite, because it was only perfect for that film.


Copyright Columbia Pictures, Photo by Kimberly French.

All this being said – and this is where I start to be confusing – my last two projects, Ghostbusters: Afterlife and Hawkeye, have been photographed using the Alexa LF and Panavision anamorphic T Series lenses. I tested other combinations, but right now that recipe is the one that gets the best out of me. I think I am also going to use it on my upcoming shoot. Panavision has always been my partner and enabled me to explore my visuals with unparalleled support.

Does Eric Steelberg have a personal style?

I don’t think so, but I have been told it’s naturalistic with flourishes… whatever that means!

You have worked on numerous occasions with Jason Reitman. Has his personal vision influenced your work and vice versa?

We have enjoyed a very long and respectful relationship. Like I said before, we met when we were teenagers and made many short films together, which then led to commercials and feature films. Ideas bounce off one another; we speak very intimately about our films and the visual approach we want to use. Ours is a very complementary relationship, and there is freedom to pitch all ideas without judgement because we both want what is best. We challenge one another to get that result.

Given this relationship, does Eric Steelberg’s work change when you are not working hand in hand?

I have worked with, and hopefully will continue to work with, many other talented directors who have their own incredible ideas and points of view. We all get hired based on our work with other people, and I’m sure most of my interest has been from movies I’ve done with Jason. I never want to think I’m reusing a “style” or a “look” from another film I have done. I would never do it if I were asked to, because different narratives must involve different techniques. I hope my work changes from project to project, with Jason and without him. If it doesn’t, I’m not growing and exploring the infinite ways I can tell stories.

Has Marvel’s Hawkeye comic influenced the planning and production of the series?

We were very influenced by the graphic novel. In my first meeting with director Rhys Thomas, he talked to me about the importance of referencing the visual elements of the comic. If you notice, there are shots in the series that are copied directly from the graphic novel series. It was very important to Rhys to respect and honor the wonderfully complex visual sensibility that it had.

What has been the biggest challenge you have faced in filming your part of Hawkeye?

It was a very long schedule… 100 days with multiple crews and units, all filmed out of order. Trying to keep the visuals consistent when shooting out of order on such a large scale was very difficult, but we had a great team.

What equipment have you used to shoot your part of Hawkeye?

The equipment was the same for the whole program: Arri Alexa LF cameras with Panavision T-Series anamorphic lenses. We also used quite a bit of the Alexa Mini LF with DJI Ronin gimbals and Mo-Sys remote heads. A RED Komodo was used for some stunt and VFX work.

Photo by Dale Robinette.

How have you adapted your workflows to HDR?

I was lucky enough to have two HDR monitors on set. However, I am not a fan of HDR, frankly. I think it is too good. What I mean is that I don’t know any cinematographers who want to see everything. Our whole job is to use light and shadow to control and create what you see and what you don’t see. Suddenly, the marketing people come in and say, “Hey guys, now we can make everything visible above and below what’s being exposed.” Sure, there are some cases where it can benefit the shot, but in general I don’t like it. Shadows are supposed to be strong; highlights are supposed to disappear at one point. However, it looks like it’s here to stay.


What do you think of virtual production? Will it change the job of a cinematographer, and what advantages and disadvantages do you see?

I think this is an exciting new technology that I will be dealing with in my future. It’s pretty amazing, but you also have to take into account a very specific set of parameters for it to be used to replace a location. It also requires an enormous amount of prep time – many, many months – pre-lighting and blocking virtual environments. People are going to have to allow directors and cameramen to start their work months earlier than usual. I guess that’s the disadvantage: it will need extra time, money, and crew availability. The advantage is being able to shoot in places and locations that would otherwise be impossible – and have them be totally credible and seamless!

Nowadays, the amount of television and film content on offer is very large. Do you think it is easier nowadays to be a cinematographer? Do you think this offering will be maintained over time?

There are so many wonderful opportunities right now with the amount of varied content being produced for screens of all sizes. There has never been a better time to be a cinematographer and find work. At the same time, because there are so many, it doesn’t seem as specialized as it used to be, and that has made it difficult for people to stand out. The choice is very wide. There are lots of people doing amazing work who struggle to find good projects to work on. All I can say is that if you love it, and you think you were born to do it, it’s worth the struggle, because there’s no better feeling in the world than being in a theater, surrounded by hundreds of people, when the lights go down and your images appear on the screen. It’s something to aspire to and something that continues to drive me today: people’s common love of sharing stories.

Taking all these technologies in mind, what do you think will be the next technological revolution in the industry?

I think there is enormous potential in augmented and virtual reality, in ways we cannot yet imagine. I don’t think it’s going to replace film or television, but it will be an additional, complementary experience.



EVENTS



DUBAI EXPO 2020




The Global Exposition of Nations, better known as Expo, was to have been held in Dubai in 2020. Due to international health conditions, holding it as scheduled was impossible: the agenda had to be delayed by more than a year and the infrastructure had to be changed to embrace new remote working strategies. Expo 2020 Dubai celebrated the convergence of 192 nations over six months. The global scale of the Expo involved broadcasting the content, images and communication of the events held at a level never before accomplished. From October 2021 to March 2022, more than 17 hours of broadcast content per day were produced, for a total of 182 days. Content was served to a multitude of international media thanks to a team of more than 300 professionals. This is how they made it happen. This interview was conducted in conversation with Mandy Keagan, Vice President of Media Services at Expo 2020 Dubai, and Esteban Galan, Senior Director of Broadcasting at Expo 2020 Dubai, who leads the planning group responsible for the design and delivery of the facilities and technical services for the venue’s broadcast operation.

What have been the responsibilities of the Dubai Expo 2020 outreach team, and how many professionals have been on the team?

The broadcast management team was composed of 29 professionals and was divided into three areas: production, technical, and media asset management. The broadcast management team was part of Media Services, which planned and provided all the necessary services and facilities for the media who attended Expo 2020 Dubai and also for those who covered the event remotely. The Expo 2020 Dubai media services team also managed the Host Broadcaster operation, which included more than 300 team members.

Expo 2020 Dubai created a specific infrastructure for broadcasting the signals produced on site. How did you develop it? How are these facilities set up? Have you interconnected them with fibre?

During more than three years of planning, Expo 2020 Dubai created a full-scale operation for Host Broadcast content across different channels, to assist all broadcast coverage on site. We set up a full fibre-optic network to be able to produce remotely, power provision, compound spaces for outside broadcast vans, the 10-floor Expo Media Centre, plus additional platforms and studios around the site. Expo 2020 Dubai also provided six studios for official broadcasters, three bookable studios and two beauty cameras.

The Expo 2020 Dubai production facilities provided a mix of 4K and HD. The production room covering the Al Wasl venue, the heart of Expo, was fully 4K equipped with 12 cameras, expandable up to 24, all fully 4K 12G equipped, including the video server. The two beauty cameras installed around the site were also 4K. The rest of the facilities were HD, including two production galleries that provided remote coverage from all the connected positions around the site. The bookable studios and portable units were provided in standard HD quality.


Your facilities produced international signals and others for your own use. Could you explain more about how you produced, managed and broadcast the content of Expo 2020 Dubai TV?

Expo 2020 Dubai produced an average of 17 daily hours of broadcast, which were distributed in various ways: via worldwide satellite coverage, on a cloud streaming service, on a 24/7 YouTube channel, on social media, and on our Media Asset Management (MAM) system, where high-resolution content was available to download.

What technology have you developed to manage such a large amount of multimedia resources? Have you implemented a cloud solution?


Expo 2020 Dubai created the Media Asset Management System (MAM), a cloud-based platform to streamline the preparation and distribution of videos, audio and still images to media organizations around the world. The first objective was to minimize on-premise deployment and leverage cloud-based services. Secondly, we wanted to build a platform that could provide the speed and scale of a global news agency. Thirdly, we wanted to ensure uncompromised quality of distributed content in order to appeal to broadcasters and publishers globally. Finally, we wanted to facilitate and ease the distribution, search, review and retrieval of all media assets available in English, Arabic and French.




What technology did you offer to third parties? Was production and control space available at Expo 2020 Dubai for other broadcasters?

Expo 2020 Dubai’s dedicated media centre in Al Wasl Plaza was a high-quality, technology-driven, world-class media facility. The Expo Media Centre (EMC) was a hub for accredited media, with facilities that included: media workrooms and work spaces for 300 media; TV and radio studios; editing suites and voiceover booths; space for organizations to set up remote newsrooms; conference and briefing facilities; interview rooms; meeting spaces; a photography gallery; support services and information provision; third-party suppliers to support visiting media, including production staff, services and broadcast transmission services; and site-wide broadcast and press facilities, including stand-up, camera and photography positions, and broadcast compounds.


The vast majority of services and facilities were provided to accredited media free of charge. This is because we recognized that minimizing costs to the media maximized their ability to cover the event.

Expo 2020 Dubai was planned to be held during the year that its name indicates. How has COVID-19 affected the infrastructure that you originally planned? How much has the infrastructure that was built changed with respect to those previous plans? And what about the human resources that were planned – have they been affected?

Indeed, Expo 2020 in Dubai had to be postponed for a year due to the COVID-19 pandemic, and with that came strategic modifications to adjust to the challenges the world faced. For the Expo media team, we focused on providing better facilities so that media from around the world who were no longer able to travel to the Expo could access our services remotely. These facilities included the Media Asset Management system and the assets we provided to media – including video press releases for the media to cover our events remotely – and ensuring that our press conference facilities included live camera feeds for remote access and the ability to ask questions as if they were in the room.


Apart from the pandemic, what are the main challenges that the Expo 2020 Dubai broadcasting team has encountered during these months of the show? How have you solved them?

Expo 2020 Dubai welcomed more than 200 participants, including 192 nations – a record in the history of World Expos – posing quite a challenge for our team. Indeed, all participants contributed to Expo’s programming through a great number of activities and events, including daily National Days (celebrating each individual country’s participation), and we needed to make sure all these activities were covered and made available live, in addition to the Expo 2020-led activities. This represented approximately 60 live events a day, requiring the team to be flexible, resilient and working to the highest standards of delivery.

What was the most innovative technology you implemented in your content?

To enhance the broadcast experience in Al Wasl during Expo 2020 Dubai’s Opening Ceremony, the latest augmented reality was implemented live on six cameras, including one cable camera. This required precise calibration to match the animation with the movement of the camera to achieve a realistic effect. The cloud MAM used AI-based cognitive services for automated metadata generation and a distribution platform capable of efficiently handling high volumes of high-resolution content. It also provided automated transcription in different languages. Five 5G portable encoding solutions from TVU Networks provided the flexibility to cover events anywhere on site.
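As an illustration of what such an automated-metadata flow can look like, here is a minimal TypeScript sketch. Everything in it is hypothetical – the interview does not disclose the Expo MAM’s actual services or data model – but it mirrors the described behaviour: each ingested asset gets machine-generated tags plus transcripts in the three distribution languages (English, Arabic and French):

```typescript
// A generic sketch of the automated-metadata flow described above.
// Every function below is a hypothetical stand-in for a cloud
// cognitive service (speech-to-text, vision tagging), not a real API.

interface AssetMetadata {
  assetId: string;
  transcripts: Record<string, string>; // language code -> transcript text
  tags: string[];                      // auto-generated descriptive tags
}

async function transcribe(assetId: string, language: string): Promise<string> {
  return `transcript of ${assetId} in ${language}`; // placeholder result
}

async function autoTag(assetId: string): Promise<string[]> {
  return ["expo", "national-day"]; // placeholder result
}

// Ingest one asset: transcribe it in the three distribution languages
// named in the interview, then attach machine-generated tags.
async function ingest(assetId: string): Promise<AssetMetadata> {
  const languages = ["en", "ar", "fr"];
  const transcripts: Record<string, string> = {};
  for (const lang of languages) {
    transcripts[lang] = await transcribe(assetId, lang);
  }
  return { assetId, transcripts, tags: await autoTag(assetId) };
}

// Example usage with a made-up asset identifier.
ingest("al-wasl-opening-0001").then((meta) =>
  console.log(`${meta.assetId}: ${Object.keys(meta.transcripts).length} transcripts`)
);
```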




Everyone is talking about Al Wasl. What is the AV infrastructure that it houses?

The dome featured a trellis design, with pieces of a specially made projection-screen material stretched tightly between each section. This created a 360-degree, 25,380-square-metre projection surface, visible from both inside and outside. It was one of the largest 360-degree projection surfaces in the world.

We worked with Jacobs Mace, Expo 2020’s Official Programme Delivery Management Provider, and Christie, our Official Projection and Display Partner, to install Christie’s D4K40-RGB pure laser projector, which features an all-in-one design with no external chillers or laser racks and is smaller and lighter than other projectors in the same class. Its design suited the limited space available to house the projectors, and with 40,000 ANSI lumens per projector, it met Expo 2020’s requirements for producing a bright image suitable for broadcast. A total of 252 Christie D4K40-RGB projectors were installed in 42 projector pods located around the inside perimeter of the dome. The pods, which were large enough to hold a compact car, had glass fronts, were air-conditioned and rear-accessible, and protected the projectors from the weather elements in Dubai, such as heat and dust. Each pod was able to hold six projectors in two stacks of three.
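Those figures are internally consistent, as a quick back-of-the-envelope check shows – 42 pods of six projectors accounts for the full installation, and the combined output exceeds ten million ANSI lumens. The snippet below only restates the arithmetic from the numbers quoted in the interview:

```typescript
// Sanity check of the projection figures quoted above.
const pods = 42;
const projectorsPerPod = 6;        // two stacks of three per pod
const lumensPerProjector = 40_000; // ANSI lumens per D4K40-RGB

const totalProjectors = pods * projectorsPerPod;          // 252
const totalLumens = totalProjectors * lumensPerProjector; // 10,080,000

console.log(`${totalProjectors} projectors, ${totalLumens.toLocaleString()} ANSI lumens total`);
```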

Christie Conductor, an advanced monitoring and control software solution for Christie 3DLP® projectors, was used to control the projectors. It could automatically turn the projectors on or off, as well as access interrogator logs, update firmware and perform health checks.

And to create a fully immersive experience in the dome, audio was integrated above the projection capsules.

Having reached the end of the 182 days of Expo 2020 Dubai, what have been the assessments and achievements of the broadcasting team?

Our greatest achievement was to produce an average of 17 hours a day of live and livestreamed content over the whole event and to make it all, along with a million photographs, available to the media worldwide.



TEST ZONE

Apantac UE-4-II-K

Tangible multi-visualization – it’s possible!

Test performed by Pablo Martínez

They are with us in our day-to-day routines, one of the technical elements without which we could not carry out our work in today’s modern, minimalist production controls and mobile units. They have always had their staunch opponents and advocates in our technical production market; however, we cannot deny their necessity and efficiency in production environments. Yes, we are referring to multiscreens, the so-called multiviewers. Long gone are the monitoring bridges made up of countless independent display monitors – huge arrays that, almost without realizing it, we have transformed into multiple-display systems on a single screen, which allows us to be much more efficient in terms of both space and power consumption. This lab will be devoted to analyzing the UE-4-II-K model from manufacturer Apantac. With a proven track record in the broadcast equipment industry and a large product portfolio, the innovation they show in their products is at the forefront of development in an industry like ours – an industry that demands scalability of technical solutions by leaps and bounds.

First contact

Truth be told: on paper, the features of this piece of equipment are unbeatable, but once in my hands, and before including it in a setup to perform a battery of tests in a real environment, the possibilities it offers already seemed enormous. We are not dealing with a mere multiviewer, even though it is the little one in the “Crescent UE Multiviewers” family. The possibilities it offers can drastically simplify the visualization of our signals, thanks above all to its ‘secret weapon’: it features a KVM system that takes it to a higher level of integration, with excellent response times.

Technical features

The UE-4-II-K gives us the ability to work concurrently with four HDMI 2.0 UHD inputs featuring auto-detection and audio decoding (up to eight channels per input). This enables us to work with different signal resolution settings on the same system; for example, it avoids having to configure the screen output of different computers to a specific resolution for display. We have two HDMI outputs with a maximum resolution of 4096x2160p (4:4:4). It also has two SDI BNC outputs: one 12G-SDI and the other 3G-SDI. As for control and interaction, it features four USB Type-B connectors for connection with the computers that we monitor; work can be carried out directly from the multiviewer by connecting a mouse and a keyboard to the two silk-screened USB Type-A connectors. As for the rest of the I/O system, we have an RJ45 connector for control and configuration from the Apantac jDirector software, an RJ45 connector for the RS232 interface, and four RJ50 connectors: two for interaction via RS232 (1 in, 1 out) and two for GPI interaction (eight GPIs per connector).


‘Stress’ tests

An important factor that we always consider when evaluating new equipment is reliability. Although the testing period is ‘finite’, it always provides us with very relevant information about possible future unexpected behavior. Hence the ‘stress’ test. As on previous occasions, we conducted this test focusing on the equipment’s behavior versus temperature changes and intensive handling over a short period of time. Its technical specifications indicate that it can work within the temperature range of 0-45ºC.

The test configuration was set up in an insulated rack with independent temperature control, with the equipment powered from an unstabilized power outlet (220V/44-52Hz) and four HDMI input signals connected, three at 3840x2160 resolution and one at 1920x1080 resolution. All sources came from four computers, which connect via USB to the UE-4-II-K in order to use its integrated KVM. The output monitor on which we watched the behavior during the test was set to 3840x2160 resolution (9ms response time), which is the standard resolution allowed by the equipment.

We planned a 3x24h test, with a weighted variation of the ambient temperature in the closed environment of 10-40ºC, in steps of one hour of variation and one hour of constant temperature. During this period the four computers looped high-resolution videos, two of them full screen. In this case no UMD associated with the temperature alarm was configured, but the signal-failure alarm identifiers and four audio channels were associated with each window of the multiviewer. Once the test started, a KVM control test was performed from the multiviewer on each PC every four hours: the video on playback was modified and the menu of the computer was navigated for two minutes. We also switched between the different presets that we had previously defined in its configuration in order to alter the order and display of the different windows, and each multiviewer input was also set to full screen; the interaction times of each process were measured in order to ‘calibrate’ the system response.
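For readers curious how such interaction times can be captured systematically, here is a minimal TypeScript sketch of a timing harness. The `switchPreset` function is a placeholder – the UE-4-II-K is driven from the keyboard shortcut or from jDirector and exposes no such API – so this only illustrates the trigger-to-completion measurement idea, not a real control interface:

```typescript
// Sketch of a timing log like the one used to 'calibrate' system response.
// switchPreset is a hypothetical stand-in for whatever actually triggers
// the layout change on the multiviewer.

async function switchPreset(preset: number): Promise<void> {
  // Placeholder: simulate the multiviewer settling on the new layout.
  await new Promise((resolve) => setTimeout(resolve, 1800));
}

// Time a single preset change from trigger to completion.
async function timePresetChange(preset: number): Promise<number> {
  const start = performance.now();
  await switchPreset(preset);
  return performance.now() - start;
}

// Cycle through the defined presets and report each response time.
async function runCalibration(presets: number[]): Promise<void> {
  for (const p of presets) {
    const ms = await timePresetChange(p);
    console.log(`preset ${p}: ${(ms / 1000).toFixed(2)} s`);
  }
}

runCalibration([1, 2, 3, 4]); // expect values around 1.7-1.9 s per the test
```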

After three days of testing and with the data obtained, we can say that this is a robust piece of equipment that will not create any problems in 24/7 controlled environments. With regards to the values analyzed, the temperature inside the unit was kept within the working margins, especially thanks to the built-in forced front ventilation system. As for usability and response times under these conditions, there was no apparent decrease in functionality. Response times between preset changes were around 1.7-1.9 seconds, and KVM operation to take control of each computer was accurate and unchanged.

Real tests

Once the ‘stress’ test was completed, we were already clear about the environment in which we should implement the unit. As I said at the beginning of this lab review, a multiviewer allows us to maintain minimalist workspaces. Surely, many readers will find themselves in situations similar to ours: little space and the mandatory control of two or more computers from a single operating position in the control room or in the mobile unit. In that case, the UE-4-II-K is our loyal ally. Due to the idiosyncrasy of the playout system in our production controls, we necessarily need two screens for each computer as well as two PCs per playout position (main and backup). This means that, at best, as long as we have ‘room’, we must have four displays of at least 17” with two keyboards and two mice, or, alternatively, two displays associated with an external KVM for fast switching between computers (although in this case, and depending on the configuration and operating system, this can be a problem due to the delay in switching the video signal caused by a dual-output KVM on a single computer). This second option is the one we currently have implemented, so the test in a real environment was clear. The two 17-inch screens were replaced by a 32-inch monitor, with a main working preset of four windows including an ID UMD and a two-channel audio UMD for each window. All other presets were configured for 2:2 and 1:1 layouts. This was possible thanks to the keyboard shortcut provided by the UE-4-II-K (Ctrl->Ctrl->Fxx) for changing presets. The ease of use of the jDirector configuration software is worth highlighting: it allows virtually any combination of windows, including UMDs of the label, clock, temperature, audio-level and logo types.

We ran it in parallel during the broadcast of two programs, using it as a display and control element on a single monitor. At the same time, its SDI output was integrated into one of the windows of the control room’s general multiviewer, which allowed the producer and assistant to have a reference of the playout schedule. Simplicity and operability proved unbeatable: first, thanks to the possibility of monitoring all four signals concurrently, and second, thanks to the possibility of switching between working views (presets) according to the operator’s needs.

As a second test in a production environment, it was configured as a display for monitoring events and warnings in the systems department. The multiviewer enabled automated control of four computers, with the different concurrent processes displayed on a single screen. If it became necessary to interact with a computer that might generate errors, it could also be shown full screen on the multiviewer. For this test, a main preset was made comprising four windows, a digital clock synchronized by means of an NTP server, and four labels showing each window’s ID. It was kept in parallel with the current system to verify its functioning while its behavior was analyzed by colleagues in the department. Very positive comments were reported regarding the accessibility, ease of use and efficiency of the equipment for this type of control environment.

What could we miss here? To name one thing, I miss a power switch, both for primary power and for redundant power, built into the chassis. Technically it is not complicated, and operationally, for maintenance tasks and operation outdoors, it would be of great help.

Conclusion

I was truly surprised by the performance of the UE-4-II-K with the integrated KVM module. As you have read, a single piece of equipment can solve a number of problems for us. Unfortunately, there are many details that could not be discussed here, but I invite you to visit Apantac’s website and take a look at the technical specifications and manuals of this series: excellent-value multiviewers which I would recommend you consider for your projects. Finally, thanks to Michel Rudelle of APANTAC LLC for letting us try the equipment, as well as to IB3 for allowing us to use their facilities to carry out this laboratory test.



