Sports Tech Journal — Fall 2020

Page 1

FALL 2020

VOLUME 14, ISSUE 2 · PUBLISHED BY SVG

VENUE SPOTLIGHT SoFi Stadium, Allegiant Stadium, and Globe Life Field

ADVANCING THE CREATION, PRODUCTION, & DISTRIBUTION OF SPORTS CONTENT

SPORTS PRODUCTION IN COVID-19 ERA
SVG COVID-19 Sports Production Operations Guide

PLUS

WHITE PAPERS: Expanded Section Includes Insights From 25+ Industry Experts
SVG UPDATE: The Latest on the SVG Virtual Series of Events


ANY IP NETWORK, PROTOCOL, CLOUD PROVIDER & EDGE DEVICE

With over a decade perfecting broadcast-quality live video delivery over IP, Zixi is the preferred choice to live stream the world’s most valuable content.

700+

CUSTOMERS WORLDWIDE

200+

INTEGRATED TECHNOLOGY PARTNERS

WWW.ZIXI.COM

99.9999% RELIABILITY WITH HITLESS FAILOVER




in this issue

upfront

4 FROM THE CHAIRMAN: I Remember It Like It Was Six Years Ago
6 COVERING THE FIELD: Confronting a Changing World

SVGupdate

8 SPORTS GRAPHICS FORUM · SPORTS OTT FORUM · SVG CHAIRMAN’S TECHNOLOGY IN ACTION SERIES · ESPORTS PRODUCTION VIRTUAL SERIES

FALL 2020 Volume 14 Issue 2

9 AT-HOME PRODUCTION SERIES · SVG COLLEGE SUMMIT: VIRTUAL CAMPUS
10 SPORTS CONTENT MANAGEMENT VIRTUAL SERIES · SVG VENUE SUMMIT: VIRTUAL SERIES

coverstory
SPORTS PRODUCTION IN COVID-19 ERA: SVG COVID-19 SPORTS PRODUCTION OPERATIONS GUIDE, PAGE 12

SPOTLIGHT, PAGE 55
58 SOFI STADIUM
60 ALLEGIANT STADIUM
64 GLOBE LIFE FIELD
68 SYSTEMS INTEGRATOR Q&A: BECKTV
70 SYSTEMS INTEGRATOR Q&A: DIVERSIFIED
72 SYSTEMS INTEGRATOR Q&A: NEP GROUP

white papers
74 AJA VIDEO SYSTEMS: Establishing a Safe, High-Quality Sports Production With Modern Technologies
76 ARVATO SYSTEMS: Cloud-based Remote Editing for Sports Organizations
78 BECKTV: Remote Commissioning and Training For a New Broadcast Facility
80 CANON: Operational Innovations in 4K UHD Broadcast Television
82 CLEAR-COM: 5 GHz Elevates Intercom Capabilities
84 COBALT DIGITAL: Bit-Rate Evaluation of Compressed HDR Using SLHDR1
86 DIVERSIFIED: Remote Content Creation in the Age of COVID
88 ECODIGITAL: Optimizing Sports Archives for New Times, New Challenges
90 EVS: Moving Towards Live Production Anywhere in the Post-COVID Era
92 GOOGLE CLOUD: Creating Schedule-Adjusted Metrics in NCAA Basketball for Differentiating Content and More Accurate Team Evaluation
94 GRASS VALLEY: Virtualizing Content Creation
96 HAIVISION: Remote Production: Why Synchronization Matters for Live Sports
98 INTEL SPORTS: Immersive Media Experiences With Intel True View Delivers New Reality for Sports


100 MEDIAKIND: Increasing Value of Sports Content: Machine Learning for Up-Conversion HD to UHD
102 MEDIAPRO: The Future of Production Is Automatic, Flexible, and Remote — and “Hands Off”
104 NCAM: Inserting AR Into the Remote Production Workflow
106 NEVION: Optimizing Timing of Signals in IP Remote Production
108 RIEDEL COMMUNICATIONS: Custom-Engineered Technologies and Remotely Managed Services for Large-Event Communications
110 SIGNIANT: Resilience and Innovation Define the Return of Live Sports
112 SONY: Virtualization and Orchestration in IP Live Production Systems
114 STUDIO NETWORK SOLUTIONS: The Game Is Back On: Rethinking the Sports Broadcast Workflow
116 SUPPONOR: Virtual Advertising: Cutting Through the Regulation
118 TAG V.S.: Achieving Low Latency in 100% Software IP for Live Production
120 TELOS ALLIANCE: Telephony in the Modern Video Production Facility
122 TELSTRA: How Network Connectivity Will Shape the Future of Remote Production
124 SPONSOR DIRECTORY
130 FINAL BUZZER: Change The Game


TAKE THE FIELD WITH SOLUTIONS FROM RIEDEL

Real-Time Networks for Video, Audio, Data and Communications

THE NEW MEDIORNET MUON: IP Video Processing SFP
ARTIST: Digital Matrix Intercom
BOLERO: Wireless Intercom (1.9 GHz DECT)

www.riedel.net



ASSOCIATION LEADERSHIP

FROM THE CHAIRMAN

I REMEMBER IT LIKE IT WAS SIX YEARS AGO

By Mike Davies, SVG Advisory Board Chairman

March 2020: every day, every hour, there was a new update from the sports world — sports postponed, sports outright cancelled, sports rescheduled, even sports without fans. Remember how impossible that sounded at the time: “No fans? No possible way.” One after the other, each league and sanctioning body announced its position, and crews around the country started the process of unwinding ongoing and future sports productions as the grim reality of the severe consequences the virus would bring came into clearer and clearer view.

At Madison Square Garden, the Big East Tournament on Fox Sports 1 had just begun. As we watched the strange scene of a basketball game with the stands near empty, we wondered when the shoe was going to drop. During halftime of the first game, we got the inevitable announcement: go ahead and strike everything as quickly as you can, and get home as quickly as possible. No, we don’t know if we are coming back. Scenes like these were happening across the country and throughout the world. Huge spring events like the NCAA tournament, the NBA and NHL regular seasons, the start of MLB, and NASCAR all hastily crafted plans to shutter it all. If you listened carefully, you could hear all the gears of a million different processes and industries simultaneously grind to a halt.

Flash forward just a few strange weeks. I was speaking with Tim Eichorst, a good personal friend whose company, Rush Media, has served the industry with production services and compact mobile units for hundreds of small games throughout the year. As an owner who deals almost exclusively in small trucks, he was wondering how we were going to get going again if the virus was going to be with us for a while. He knew he didn’t have the answers, but he did have some ideas. The industry was going to have to start a conversation of its own and develop protocols and ways that we could operate in the midst of a pandemic. We knew we needed to bring together all of the people who were likely thinking about the exact same thing, and that the Sports Video Group had the unique and cross-sectional membership to do it. Maybe we could form a group — a task force — to try to start figuring things out. Tim even had a name for it: “Clean Freaks.”

And that is how a months-long conversation was born. The group — SVG sponsors, networks, industry consultants — came together via Zoom for weekly conversations, sharing perspectives, experiences, and expertise. We agreed that we needed to turn the ongoing conversation into a deliverable, and the SVG team organized the group into subgroups to answer questions we all faced. We didn’t want to invent protocol, but we did want to create guidance based upon the limitless experience and knowledge of the membership. The product of this is the SVG COVID-19 Sports Production Operations Guide, and, as far as protocol guidance goes — which admittedly doesn’t put it in competitive company — it is a good read with questions and answers. You will find it on page 12, so please spend some time looking it over.

The ongoing conversation continues to evolve with new faces, new developments, and new topics. Operating with the virus has already taken different turns, and that makes the weekly calls still important. If you would like to become part of them, please contact Ken Kerschbaumer at kenkersch@sportsvideo.org.

This support network was brought together uniquely by the community of the Sports Video Group — members sharing what they were doing to safely continue sports production during the pandemic: what was happening on the road, in the bubble, and overseas. This industry of ours is uniquely kind when it wants to be and never fails to support its own. On a personal note, it has been even more than that — it gives me a chance to get together each week and see a wide group from all over the world. I feel a sense of community, albeit virtual, that I am privileged to be a part of, and I take that as one positive out of a sea of challenges these last few months have dealt us all. So, a personal thank you from me for all of your time and input.

I would also like to thank the sponsors who have come in to help contribute to the SVG Sports Broadcasting Fund, allowing us to open it up to those affected — personally or within their family — by COVID-19. The Fund, like these conversations, has never been so important.

So stay healthy, good luck with the ongoing task of meeting our individual and collective challenges, and — most importantly — continue being there for each other. <


>CHAIR

Mike Davies, Fox Sports, SVP, Technical and Field Operations

>EXECUTIVE COMMITTEE

Ken Aagaard, SVG Chairman Emeritus and HOF Chairman • Andrea Berry, PRG, VP and General Manager, Broadcast and Television • Eric Black, NBC Sports Digital, CTO • Jason Cohen, CBS Sports and CBS Sports Network, VP, Remote Technical Operations • Mike Connelly, Fox Regional Sports Networks, SVP, Production • Scott Gillies, VENN, CTO • Steve Hellmuth, SVG Chairman Emeritus and NBA Entertainment, EVP, Media Operations and Technology • Jeff Jacobs, Skyline Sports and Entertainment, Principal • Patty Power, CBS Sports, EVP, Operations and Engineering • Tom Sahara, SVG Chairman Emeritus • Susan Stone, MLB Network, SVP, Engineering and Operations

>ADVISORY BOARD MEMBERS

Adam Acone, NFL Network, Director, Media Operations and Planning • Glenn Adamo, Ivanhoe Media and Entertainment, President • Peter Angell, Industry Consultant • Onnie Bose, NFL, VP of Broadcasting • Chris Brown, Turner Sports, VP, Sports Production Technology • Tab Butler, MLB Network, Senior Director, Media Management and Post Production • Chris Calcinari, ESPN, SVP, Remote Production Operations, ESPN and ABC Sports • Mary Ellen Carlyle, Dome Productions, SVP and GM • Ken Clausen, HBO, Director of Production • Joe Cohen, The Switch, President, Sports • Michael Cohen, Bizzy Signals Entertainment, President/Executive Producer • Don Colantonio, Industry Consultant • Scott Davis, CBS Sports, VP of Broadcast Operations • Jim DeFilippis, Industry Consultant • Ed Delaney, Industry Consultant • Jed Drake, Industry Consultant • David Dukes, PGA Tour Entertainment, Senior Director, Technical Operations • Jerry Gepner, CP Communications, COO • Steve Gorsuch, Industry Consultant • Ken Goss, NBC Sports, SVP of Remote Operations & Production Planning • Mark Haden, NHL, Group VP, Broadcast Technology • Ed Holmes, The Holmes Group, Principal • Deb Honkus, NEP Broadcasting, Chairman of the Board • George Hoover, Industry Consultant • Darryl Jefferson, NBC Sports Group, VP, Postproduction and Digital Workflow • Robert Jordan, CVE, 1337 Facilities, CEO • John Kvatek, University of Central Florida Knights, Senior Associate AD/External Operations • John Leland, PSL International, LLC, Principal • Glen Levine, NEP, President, U.S. • Louis Libin, Broad Comm, President • Jodi Markley, ESPN, EVP, Content Operations and Creative Services • Bernadette McDonald, Major League Baseball, SVP, Broadcasting • Grant Nodine, NHL, SVP, Technology • Ken Norris, UCLA, Director of Video Operations • Gary Olson, GHO Group, Managing Director • Del Parks, Sinclair Broadcast Group, SVP and CTO • Scott Rinehart, University of Notre Dame, Director, Broadcast Technology • Larry Rogers, FirstInTV, President • Mike Rokosa, NHRA, Technology Executive • Scott Rothenberg, NEP, SVP, Technology and Asset Management • Oscar Sanchez, CONCACAF, Director of Broadcast Operations • Bruce Shapiro, Broadcast Consulting • Tracey Shaw, WWE, SVP, Network and TV Operations • Jack Simmons, Industry Consultant • Don Sperling, New York Giants Entertainment, VP and Executive Producer • Jerry Steinberg, Industry Consultant • Patrick Sullivan, Game Creek Video, President • Jason Taubman, Game Creek Video, VP Design/New Technology • Larry Tiscornia, Major League Soccer, VP, Broadcasting • Jacob Ullman, Fox Sports, SVP, Production and Talent Development • John Ward, Industry Consultant • Ernie Watts, Turner, Manager, Transmission Operations Center • Mike Webb, YES Network, VP, Broadcast Operations • Jeff Willis, Industry Consultant • Dave Zur, KSE Media Ventures, SVP, Operations & Engineering



COVERING THE FIELD

CONFRONTING A CHANGING WORLD

PLATINUM SPONSORS

PREMIER SPONSORS

ADDER TECHNOLOGY • CALREC • DIGICO • CLEAR-COM, AN HME COMPANY • COBALT DIGITAL • COMREX CORPORATION • DAKTRONICS • EDITSHARE • EVERTZ • EVS • GRAVITY MEDIA • IBM • IBM ASPERA • IHSE USA • IMAGINE COMMUNICATIONS • IRON MOUNTAIN ENTERTAINMENT SERVICES • JOSEPH ELECTRONICS • MARKERTEK • MICROSOFT • NELCO MEDIA • PANASONIC • SHURE • SIGNIANT • SWIFTSTACK • TEDIAL • TELESTREAM • TELOS ALLIANCE • TERADEK • THE STUDIO – B&H • TVU NETWORKS • VIZRT GROUP

CORPORATE SPONSORS

AJA VIDEO SYSTEMS • ALDEA SOLUTIONS • AMAGI • ARISTA NETWORKS • ARVATO SYSTEMS • ATEME • AUDIO-TECHNICA • BLACKBIRD VIDEO • BRAINSTORM • BRIDGE DIGITAL • CARINGO • CATDV • CIS GROUP • CISCO SYSTEMS • CLARK WIRE & CABLE • CREATIVE DIMENSIONS • CROWN CASTLE • DALE PRO AUDIO • DELAPLEX • DIMETIS • DMC BROADCAST GROUP • ECODIGITAL • EEG ENTERPRISES • ELUVIO • ENCO SYSTEMS • ENCOMPASS DIGITAL MEDIA • ENDEAVOR STREAMING • FASTLY • FINGERWORKS TELESTRATORS • FOCUSRITE • FOR-A • FUSE TECHNICAL GROUP • G&D NORTH AMERICA • GLOBECAST • GRABYO • HAIVISION • HARMONIC • IBM WATSON MEDIA • IMAGE VIDEO • IMAGEN • IO INDUSTRIES • JB&A • JVC PROFESSIONAL VIDEO • LEADER INSTRUMENTS • LEGRAND AV DIVISION • LEVELS BEYOND • LIMELIGHT NETWORKS • LTN GLOBAL COMMUNICATIONS • MARSHALL ELECTRONICS • MASSTECH • MATROX • MAXON • MEDIA LINKS • MEDIAKIND • MPE • MULTIDYNE • NCAM • NET INSIGHT • NEVION • NTP • OBJECT-MATRIX • OPENDRIVES • PIXELLOT • PLIANT TECHNOLOGIES • POLYGON LABS • PRIMESTREAM • PRIMEVIEW • PRODUCTIONHUB • QLIGENT • QUANTUM • QUANTUM5X • RCN BUSINESS • RED BEE MEDIA • RT SOFTWARE • SANKEN/BRAINSTORM ELECTRONICS • SEACHANGE INTERNATIONAL • SENCORE • SENNHEISER • SKYLINE COMMUNICATIONS • SMT • SNEAKY BIG STUDIOS • SOLID STATE LOGIC • SOS GLOBAL • SPECTRA LOGIC • SPORTLOGIQ • SPORTRADAR US • SPORTZCAST • STATS PERFORM • STEVENS GLOBAL LOGISTICS • STUDIO NETWORK SOLUTIONS • SUPERSPHERE VR • SUPPONOR • SYNAMEDIA • SYNCWORDS • TAG V.S. • TATA COMMUNICATIONS • TELLYO • TELSTRA • THE VIDEO CALL CENTER • TIGER TECHNOLOGY • TSL PRODUCTS • TV GRAPHICS • UNIQFEED • UTAH SCIENTIFIC • VARIANT SYSTEMS GROUP • VENUE EDGE • VERITONE • VERIZON MEDIA • VIDEON CENTRAL • VIMOND • VISLINK • VISTA WORLDLINK • VITAC • WORLD WIDE TECHNOLOGY• WOWZA MEDIA SYSTEMS • WSC SPORTS • XCITE INTERACTIVE • XYTECH SYSTEMS

MOBILE/INTEGRATOR SPONSORS

3G WIRELESS • ADMIRAL VIDEO • AE LIVE • AERIAL VIDEO SYSTEMS • AIDA CONTENT MANAGEMENT • ALL MOBILE VIDEO • ALPHA VIDEO • ANTHONY JAMES PARTNERS • ARCTEK SATELLITE PRODUCTIONS • ASG (ADVANCED SYSTEMS GROUP) • AV DESIGN SERVICES • AVI SYSTEMS • AZZURRO GROUP • BECKTV • BROADCAST MANAGEMENT GROUP • BSI (BROADCAST SERVICES INTERNATIONAL) • BSI (BROADCAST SPORTS INTERNATIONAL) • C360 • CAT ENTERTAINMENT SERVICES • CHESAPEAKE SYSTEMS • CINESYS • CONFERENCE TECHNOLOGIES (CTI) • CORNERSTONE AV • CP COMMUNICATIONS • CREATIVE MOBILE SOLUTIONS • CSP MOBILE PRODUCTIONS • DIVERSIFIED • DNA STUDIOS • DOME PRODUCTIONS • DX3 MEDIA GROUP INC. • ES BROADCAST • F&F PRODUCTIONS • FILMWERKS • FLETCHER SPORTS • GAME CREEK VIDEO • GEARTECH USA • HB COMMUNICATIONS • HIGH ROCK MOBILE TELEVISION • ILLUMINATION DYNAMICS • IMS PRODUCTIONS • INTEGRATED MEDIA TECHNOLOGIES • INTOTO SYSTEMS • KAUFMAN BROADCAST • KMH AUDIO-VIDEO INTEGRATION • LH COMPUTER SERVICES • LIVE MEDIA GROUP • LYON VIDEO • MEYERPRO • MOBILE TV GROUP • MOVICOM • PROGRAM PRODUCTIONS • PSSI/STRATEGIC TV • REALITY CHECK SYSTEMS • RUSH MEDIA GROUP • SDTV • SHOTOVER • SKYCAM • SMARTCART SVX • SOUTHWORKS • SPARX TECHNOLOGY • T2 COMPUTING • THUMBWAR • UNISAT • VIDOVATION


By Karen Hogan Ketchum, SVG, Director of Production and Editor, SportsTech Journal

The past six months have forever altered life as we know it. Phrases like “social distancing,” “personal protective equipment,” and “pandemic” have taken on new meaning, and — seemingly overnight — our homes became more than just the place we slept. They became our offices, our schools, and our refuge from all that was happening just outside our front doors.

The sports-production industry is forever altered as well. For Team SVG, the challenge quickly became: how do we adapt our ever-growing schedule of in-person events to a virtual world? How do we keep our membership informed and connected? How can we stay connected when we’re spread across several states? And how quickly can we become experts in Zoom? Outside of our organization, we’ve talked to countless people in our industry who are finding their professional lives turned upside down as well. The truck compound has long been the heartbeat that powers the sporting events we love. But, as anyone who’s ever set foot in a truck can attest, finding six feet of personal space was all but impossible even before the pandemic hit. What could be done?

As SVG Advisory Board Chairman Mike Davies explains in his letter on page 4, the industry set out to determine the protocols that would be necessary for keeping sports on television — and for keeping those behind the scenes safe and healthy. Those conversations resulted in SVG Clean Freaks, a group that has convened weekly over Zoom throughout the pandemic to share experiences and develop a guide for returning to live sports production.

The SVG COVID-19 Sports Production Operations Guide is the cover story of our Fall 2020 SportsTech Journal (which, like our Spring 2020 edition, is being published in a digital-only format). It is designed to provide guidance for those who may have questions about how to approach some of the challenges that our industry is facing, given the requirements for safety protocols, social distancing, and more. Turn to page 12 to read up on the group’s recommendations for Personnel Management (page 14), Compound Design (page 20), Compound Entry (page 26), Entering and Exiting the Truck (page 30), Third-Party Providers (page 34), Catering (page 38), Vests/Access Wear (page 42), Audio (page 44), and Cleaning and Disinfecting (page 50).

The Guide is also available online at https://www.sportsvideo.org/svg-covid-19-sportsproduction-operations-guide/, where it will continue to be updated as new information becomes available and new sections are completed. In addition, SVG’s Sports Production COVID-19 Resources page collects the latest resources, documents, and articles from around the industry in one easy-to-navigate location. Check it out at https://www.sportsvideo.org/resources/covid-19resources/.

While our work with Clean Freaks and the COVID-19 Sports Production Operations Guide is certainly the most important work we’re doing for the industry, we continue to find new ways of informing, educating, and connecting the amazing people who comprise our industry. Following our Sports Graphics Forum in February and Sports OTT Forum in March, the coronavirus shut down our calendar of in-person events for the foreseeable future. Team SVG — led by the tireless efforts of our editors — kicked into high gear and churned out an impressive slate of virtual events.

Beginning on page 8, read up on those two in-person events, followed by six multi-day virtual events: the SVG Chairman’s Technology in Action Series, Esports Production Virtual Series, At-Home Production Series, SVG College Summit: Virtual Campus, Sports Content Management Virtual Series, and SVG Venue Summit: Virtual Series. Be sure to check out the Tech Portals associated with each event for video interviews, case studies, and more from the sponsors who made these events happen.

Also in the Fall 2020 edition, the Venue Spotlight (page 55) shines on three game-changing sports venues that opened this summer: Los Angeles’ SoFi Stadium (page 58), Las Vegas’ Allegiant Stadium (page 60), and Arlington, TX’s Globe Life Field (page 64). SVG’s Kristian Hernandez also sat down with systems integrators BeckTV (page 68), Diversified (page 70), and NEP Group (page 72) to hear their thoughts on how the pandemic has affected business and where we go from here.

And, to ensure that the sports-production industry gets the opportunity to hear the latest expert technology insights directly from the source, we’ve expanded our White Paper section to include a record 25 submissions. Turn to page 74 for in-depth reports from leading companies on a wide variety of topics, including at-home production during COVID-19, virtualization and machine learning, and much more.

Lastly, the Fall 2020 SportsTech Journal looks ahead to the future of our industry. In his letter on page 130, SVG Executive Director Ken Kerschbaumer shares SVG’s plans to cultivate diversity and inclusion within the sports-production industry through the creation of SPIRIT (Sports Production Inclusion Responsibility in Technology).

We wish all of you continued health and safety during these trying times, and we hope to see you again in person as soon as possible! <



Draw Fans Deeper into the Game with PIERO Live Down & Distance The Unified Broadcast Solution from Ross offers viewers a truly engaging broadcast experience and the addition of PIERO Live Down & Distance takes it to yet another level. With PIERO Live, you can quickly and easily deliver the features that fans and sponsors have come to expect from a football broadcast – First Down lines, Red Zone markers, advertising logos and more!

EASY OPERATION: The intuitive user interface offers maximum simplicity and flexibility for operators.
OPTICALLY TRACKED: All graphics in PIERO Live are applied in real-time using optical tracking – no camera heads required.
CUSTOM GRAPHICS: Enhance your broadcast with Red Zone markers, play clocks, field goal target lines and more.
rossvideo.com/piero


EVENTS RECAP

SPORTS VIDEO GROUP CONFERENCES IN REVIEW


From left: CBS Sports’ JP LoMonaco, |drive|’s Nick DiNapoli, NBC Sports’ Chad Hudson, Big Studios’ Joss Meinert, ESPN’s Tim O’Shaughnessy, and Turner Sports’ Jordan Shorthouse

>SPORTS GRAPHICS FORUM FEBRUARY 26 · NEW YORK CITY

The SVG Sports Graphics Forum kicked off 2020 with a full day of sessions at the SVA Theatre in New York City, addressing the latest technological advances and creative-design trends in broadcast graphics. Spearheaded once again by the SVG Graphics Committee, the sixth-annual event covered a wide variety of topics, including augmented reality and virtual analysis, data integration and sports betting, and OTT and streaming.

> SPORTS OTT FORUM MARCH 10 · NEW YORK CITY

SVG hosted its third-annual Sports OTT Forum at The Paley Center for Media in New York City. Over-the-top distribution and direct-to-consumer live-streaming packages are poised to dramatically change the world of live sports video. While the entire media ecosystem is approaching this new era of consumer behavior, the sports-media industry faces its own unique set of challenges and opportunities. The day-long event covered streaming quality, the OTT stack, latency, and piracy, in addition to three high-level keynote conversations.

From left: SVG’s Brandon Costa, Verizon Media’s Peter Gallagher, fuboTV’s Ben Grad, and Bleacher Report’s Raphael Poplock offer a view from the top.

>SVG CHAIRMAN’S TECHNOLOGY IN ACTION SERIES APRIL 16, 23, and 30, and MAY 6 · VIRTUAL

In lieu of SVG’s annual pre-NAB Chairman’s Forum, this year’s event went fully virtual. The SVG Chairman’s Technology in Action Series featured a mix of video interviews, panel discussions, and case studies every week for four weeks and was made available to SVG Platinum sponsors, event sponsors, and SVG members. The series was complemented by a two-hour online version of the Sports Content Management Roundtable.

> ESPORTS PRODUCTION VIRTUAL SERIES MAY 21 · VIRTUAL

With live sports on hiatus, esports events made the leap from fringe to front stage in a matter of weeks. These online tournaments are being produced remotely and pulling in massive audiences — not only on Twitch and YouTube, but also on major linear sports networks like ESPN. The SVG Esports Production Virtual Series took attendees behind the scenes of these live productions and went inside the groundbreaking virtualized and cloud-based technologies being used by Activision Blizzard Esports, EA, and more.

CHECK OUT THE ESPORTS TECH PORTAL!



>AT-HOME PRODUCTION SERIES

MAY 28 (Live Events), JUNE 4 (Editing and Graphics), and JUNE 11 (Audio and Communications) · VIRTUAL

Call it what you like: at-home production, REMI, home-run production, or even simply remote production. Prior to the coronavirus outbreak, it was seen as a way to more cost-effectively produce smaller shows, cut costs on larger shows, and get the most out of equipment and facilities. But, within the first two months of the pandemic, at-home workflows quickly became the lifeblood for an industry challenged like never before. The At-Home Production Series provided plenty of virtual panel discussions, keynotes, and case studies designed to help the sports-production industry embrace new workflows today while also getting ready for tomorrow.

CHECK OUT THE AT-HOME TECH PORTAL!

> SVG COLLEGE SUMMIT: VIRTUAL CAMPUS

JUNE 22 (Your Changing World), JUNE 23 (Your Evolving Control Room), JUNE 24 (Your Live Event Productions), and JUNE 25 (Your Social Synergy) · VIRTUAL

In response to the nationwide shutdown of collegiate athletics due to the COVID-19 pandemic, SVG proudly announced the SVG College Summit: Virtual Campus 2020, a four-day series of keynotes, panels, presentations, and video-based networking events. Each day featured two hours of exclusive live programming, showcasing some of the most critical topics and smartest minds in college sports video and providing attendees with the opportunity to learn from, connect to, and share ideas with the best and brightest in the field in an entirely new — and socially distant — format.

CHECK OUT THE SVG COLLEGE TECH PORTAL!


CHECK OUT VIDEO ON DEMAND FROM EVERY SVG VIRTUAL EVENT! www.sportsvideo.org/vod-events/



EVENTS RECAP continued

> SPORTS CONTENT MANAGEMENT VIRTUAL SERIES

JULY 22 (Cloud-Based Workflows), JULY 23 (MAM and Orchestration Tools), JULY 29 (Best Practices in Storage and Archiving), and JULY 30 (AI and Machine Learning) · VIRTUAL

SVG’s 14th-annual Sports Content Management Forum was presented as an entirely virtual four-day series of keynotes, panels, presentations, and video-based networking events. Two hours of exclusive live programming each day featured media-asset-management (MAM) leaders from major broadcasters, leagues, teams, OTT outlets, esports organizations, and vendors offering firsthand perspectives and behind-the-curtain looks at their respective workflows. This year’s agenda addressed the current state of MAM and archiving, cloud and virtualization, AI and machine learning, object storage and next-gen storage technologies, and much more.

CHECK OUT THE SPORTS CONTENT MANAGEMENT TECH PORTAL!

> VENUE SUMMIT: VIRTUAL SERIES AUGUST 26-27 · VIRTUAL

When the COVID-19 pandemic forced the shutdown of all professional sports in the United States, in-venue production was forever changed. Now, as sports make their eventual comeback, how will this sector of the industry respond to this setback? The SVG Venue Summit Virtual Series took an in-depth look into how professionals can continue to innovate in this new normal and what’s to come in the future. In addition, the two-day event spotlighted two game-changing venues opening this summer: Globe Life Field in Arlington, TX, and SoFi Stadium in Los Angeles. <




Create Excitement. Control With Ease. Connect Everyone.
Whether you’re looking to create captivating live TV, move to an all-IP workflow, playout from the cloud or simply upgrade your existing infrastructure, Grass Valley’s award-winning technology powers an end-to-end ecosystem of reliable, open standards-based solutions that enable smart, responsive workflows, connecting your content to any audience at any time. It’s Content Your Way. Find out more at grassvalley.com
Copyright © 2020 Grass Valley Canada. All rights reserved. Specifications subject to change without notice.


COVID-19 SPORTS PRODUCTION OPERATIONS GUIDE

TABLE OF CONTENTS
PART 1 PERSONNEL MANAGEMENT (page 14)
PART 2 COMPOUND DESIGN (page 20)
PART 3 COMPOUND ENTRY (page 26)
PART 4 ENTERING & EXITING THE TRUCK (page 30)
PART 5 THIRD-PARTY PROVIDERS (page 34)
PART 6 CATERING (page 38)
PART 7 VESTS/ACCESS WEAR (page 42)
PART 8 AUDIO (page 44)
PART 9 CLEANING & DISINFECTING (page 50)

Version 1.0 published June 2020

Welcome to the SVG COVID-19 Sports Production Operations Guide. In early April, SVG began hosting a series of meetings with industry leaders to discuss the return of sports production during the coronavirus pandemic. This guide is a result of those conversations. It is designed to provide guidance for those who may have questions about how to approach some of the challenges that our industry is facing, given the requirements for safety protocols, social distancing, and more. Please note that this guide is simply a starting point for your own organization’s internal discussions, as we are well aware that the wide range of sports productions cannot be served by one document. SVG’s weekly meetings are ongoing, and this document will continue to be updated in the future.

NOTICE AND DISCLAIMER
These documents have been developed by SVG’s Editorial Team based on interviews with leading sports-production professionals and are provided for informational and educational purposes only. They should not be read, used, or interpreted as industry standards or best practices. SVG does not warrant the accuracy or completeness of the information provided by interviewees and assumes no responsibility for errors, omissions, or updates or for injury or damage to persons or property arising out of or related to the use of information contained in this document. In no event shall SVG, its employees, and its contributors be liable for any loss of profit or any other commercial damage or injury to persons or property caused or alleged to have been caused directly or indirectly by this document or its use. All decisions regarding the subjects covered within must be made by each operator based on its individual research, resources, and corporate requirements.





PART 1 PERSONNEL MANAGEMENT

This section offers recommendations on minimizing staff, working in small groups, managing runners, and more.

1.1 What is the first step to be taken with respect to staff planning for a production?
The goal is to minimize the number of people who need to be onsite.
• Contact the league/team/federation/organizer to discover any limits to the number of personnel allowed onsite. This also applies to off-venue studio locations, etc.
• Evaluate the size of the production (cameras, replay, support personnel).
• Evaluate whether it is possible for people to work on the show from remote locations.
• Keep in mind that there may also be additional people onsite to ensure that safety protocols are practiced. The added headcount will impact the size of the production team.
• Develop a plan in case key personnel get ill. That can range from having out-of-town personnel staged at a local hotel to having a list of available local professionals who can step in if needed. Another option is to know which production personnel onsite can take on different job duties in the event of an emergency.

1.2 Can you explain the concept of having a production team work within a bubble or a working group?
In the past, working groups were often created to make transport easier, create a sense of unity, and make it easier for people to collaborate on a task. But, during the coronavirus pandemic, small groups working inside a “bubble” help confine any outbreak to as few people as possible. Each bubble is assigned its own dedicated work areas, rest areas, catering areas, toilets, transportation, and more. If a crew member falls ill or exhibits symptoms, small groups make it much easier for medical personnel to trace contacts, test the individuals involved, and contain the outbreak, potentially saving not only the production but also lives.

1.3 How many people should be in a bubble?
Simply put, as few as possible. There are key considerations: overall size of crew and compound, number of trucks and production areas, ability to accommodate support facilities for multiple bubbles, and size of the venue, broadcast center, studio, etc. The upper limit on the number of people in a bubble comes down to simple arithmetic: the square footage of the production space divided by the floor area each staffer requires to maintain 6 ft. of separation under CDC social-distancing guidance.

1.4 Should the bubbles be defined by grouping people working in the same confined space or by job duty?
The bubble should be defined by where people are working, NOT by job duty. For example, if the A1 and A2 are in different trucks, they are in different bubbles even though both are working audio.

From left: Producer Jeff Gowen, Director Brian Maas, and TD Joel Blosser inside the Arizona Diamondbacks truck
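A rough sketch of the occupancy arithmetic behind question 1.3 follows; the 280 sq. ft. room and the 40 sq. ft. per-staffer allowance are illustrative assumptions for the example only, not figures taken from the Guide or from the CDC:

$$
N_{\max}=\left\lfloor\frac{A_{\text{production space}}}{A_{\text{per staffer}}}\right\rfloor,
\qquad\text{e.g. }\left\lfloor\frac{280\ \text{sq. ft.}}{40\ \text{sq. ft.}}\right\rfloor=7\ \text{people}
$$

Whatever per-person allowance a production adopts, the bubble should be planned below that ceiling rather than at it, since the simple division ignores consoles, equipment racks, and traffic lanes that reduce the usable floor area.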



Make it. Manage it. Show the world.
Expertly Engineered Solutions Worldwide
• OB Vans/Mobile Units • Flypacks • Studio Production • Centralized Production • Host Broadcast Support • Systems Integration • Post Production • Augmented Reality • Connectivity & Transmission • Wireless & RF • Specialty Capture • Automated Ingest • Media Asset Management • Master Control • Remote Commentary • Live/Non-Live Graphics • Premium Playout • Advanced Content Delivery • Intuitive System Monitoring • Video Display • Audio • Lighting

BEHIND POWERFUL PRODUCTION NEPGROUP.COM



The CBS Sports compound at the PGA Championship in July was one of the largest compounds in action this year.

It is also recommended that, if possible, crew members handling a similar function — such as replay or graphics — have some geographical diversity within the compound (or, better yet, via remote access), which limits the risk of an outbreak’s taking down an entire department. There will also be broader limitations on movement. For example, some may be limited to the venue/field of play, others to the compound, others to the studio.

1.5 How can you tell if someone is in a bubble?
It is important that it be easy to determine a person’s bubble visually and from a distance so that extra social distancing can be maintained for people in different bubbles. Recommendations include:
• Color-coded wristbands as well as color-coded credentials, both of which must be worn at all times. The wristband must also be worn outside the compound in social situations, meetings at hotels, etc.
• Different-colored shirts/jackets/hats/vests. Though more expensive, this makes it much easier for someone to be identified.

1.6 How can you tell what areas of the compound and venue are considered one bubble vs. another?
Each bubble’s facilities (work area, catering, toilets, break room) must be easily identified by color coding and signage. Other recommendations include:
• Colored banners and flags
• Zones marked on the ground with chalk, paint, etc.

1.7 Can a person in one bubble talk face-to-face with a person in another bubble?
The first option should be communication via intercom, radio, phone, text, and email. If a face-to-face conversation must happen, it is recommended that 6 ft. of social distancing be maintained and facial masks be worn. Staffers working within the same group are advised to stick to CDC guidelines of a 6-ft. minimum distance. And, again, facial masks must be worn at all times.

1.8 Can a person in one bubble enter the physical workspace of another bubble?
No.

1.9 How can people in different bubbles communicate in the event of a work emergency?
In the case of an emergency, consult the onsite production manager for best steps and procedures.





1.10 Does each bubble need a separate place for eating meals, separate toilets, and separate wash areas?
Yes. Catering areas can be shared, but mealtimes should be staggered.

1.11 Does the bubble need to extend outside the compound and venue? For example, can bubbles come together at the hotel or share transportation to and from the venue?
It is important that the bubble mentality follow crew members wherever they go. It is also recommended that, when off premises, crew members maintain quarantine guidelines and avoid bars, restaurants, and other public spaces.

1.12 Do you recommend hiring personnel to do nothing but monitor compliance with guidelines and rules? Is one person enough, or are more needed? And is that number based on size of compound or number of people?
Yes, it is recommended that additional staff be hired not only to monitor behavior but also to be an ultimate authority with respect to discipline and conflict resolution. These people should be free of other duties (such as engineering or production responsibilities) because the position is important and requires full-time availability and focus.

1.13 Should a nurse or doctor be in the compound along with a quarantine area in the case of emergency?
Many broadcasters will provide additional medical personnel given the severity of the current crisis. If that is not possible, the venue’s medical staff is an option. But contact venue personnel to ensure that the production team will be able to access in-venue medical facilities, because there may be strict limitations on movement.

1.14 What happens if someone gets sick? Do they immediately leave, or are they quarantined? What about others exposed to that person?
Broadcasters, production teams, leagues, and venues are developing their own comprehensive guidelines for what to do if someone falls ill. Many rules will follow CDC guidelines, but others will be a variation.

MotoAmerica was among several motorsports to return to action in June.

Managing Runners

1.15 Runners are an important part of a production, and their role is often to be able to go anywhere and do anything. How will their duties change given social-distancing guidelines?
Runners will be more important than ever, and the way they work will change drastically. Recommendations include:
• Have runners dedicated to each bubble.
• Give them more-specific job duties so that their movement is more restricted. For example, a runner who goes to local stores and shops should not enter the compound.
• Consider a higher level of PPE for those who may leave the compound or have more interaction and exposure to the outside world.

Holding Camera/Production Meetings

1.16 Can in-person production meetings be held?
It is recommended that all production meetings be held via comms.

Weather Delays

1.17 What is the protocol for protecting staff during a weather delay?
Each production will have a different set of rules pertaining to weather delays. If space is an issue, one recommendation is to ask production personnel to wait out the weather delay in their respective cars.



Migrating to IP is a journey. We’re with you every step.

Broadcasters and content creators have known Sony for definitive cameras, advanced switchers and super motion replay servers. Now they know us as leaders in Media over IP. We’ve implemented cost savings and ultimate performance of SMPTE ST 2110 IP technology across our entire live production line. We support you with key administration tools like our Live System Manager and Live Element Orchestrator. We can also help you design, configure, test and implement your IP system. All of which proves that media over IP changes everything. Except the leader. See the future at pro.sony/IPLive

Operation • Acquisition • Processing • Production • Recording & Playback

© 2019 Sony Electronics Inc. All rights reserved. Reproduction in whole or in part without written permission is prohibited. Features and specifications are subject to change without notice. Sony and the Sony logo are trademarks of Sony Corporation.



PART 2 COMPOUND DESIGN

This section is an overview of compound design and covers such topics as spacing between facilities and how a compound needs to be designed differently during the current coronavirus crisis.

2.1 What is the minimum distance between production-truck units, production trailers, and other facilities?
If a compound is located outdoors in a parking lot, it is recommended that trucks be as far apart as functionally possible. Ideally, each truck or production trailer would have at least 8-10 ft. of dedicated space in front of the entrance (outside of the stairs). Compounds located inside a venue or in a tight on-the-street location may have to be more tightly configured but, if possible, should have a minimum of 8-10 ft. between trailers so that people can move.

2.2 Should every corridor between vehicles be one way, or, if corridors are large enough, can two-way travel be accepted?

Ideally, there would be two lanes within each corridor separating trucks — one for each direction. However, if space is limited, this can be adjusted to reroute people through single-lane corridors in the compound. Clearly, in a large number of situations, that is simply not possible. In those instances, everyone just needs to be aware of social-distancing guidelines and PPE so that others entering or exiting the unit can do so easily. Everyone also needs to be aware of the belly-bay area and ensure that those who work there are able to work and maintain social distancing.

2.3 If creating a negative pressure inside a production vehicle (exhausting air), where is the safest place to exhaust? Should you create an area cordoned off or blow it under the truck in the wheel-well area?
With the additional air exhaust in hot and humid conditions, the AC units will struggle to keep up with the cooling needs. In the past, the air has been precooled from the outside with external AC units blowing cool air into the mobile unit’s intakes.

The NBA’s broadcast compound at Disney World’s Wide World of Sports


2.4 Because washing hands is the number-one way to help prevent infection, where should communal hand-washing stations be located in the compound?
Keeping handwashing stations about 10 ft. from the end of stairs is probably the best scenario. In the initial rollout, 6-ft. distance markings on the ground would be helpful.



Media Connected

Let AT&T Global Video Solutions (GVS) help you virtualize your event. Flexible, Cost-Effective Connectivity to Virtualize Your Events – reduce the risk to your employees & attendees. Getting your content to your audience is becoming increasingly urgent, and virtualization is a must. GVS has the technology to meet your needs.

GVS Booking Desk: 800-221-7680

© 2020 AT&T Intellectual Property. AT&T, Globe logo, Mobilizing Your World and DIRECTV are registered trademarks of AT&T Intellectual Property and/or AT&T affiliated companies. All other marks are the property of their respective owners.



Crew members must be aware of social distancing and not congregate at hand-wash stations. These stations will be in addition to individual sanitization stations in each room of the truck, enabling crew members to sanitize their individual workstations.

2.5 What are the best procedures for placement of port-o-Johns and restrooms?
Individual working groups should be assigned a dedicated restroom with adequate space for queuing. It also may be worth contacting the venue to see if any restrooms in the venue can be dedicated for use by the TV-production crew that will work inside the venue.

2.6 Does there need to be an area created for storage of luggage/personal items to limit the personal effects stored within the production areas?
Current guidelines for many productions require people to take their own transportation (rental car, personal car) to the venue. It is recommended that they store as many personal effects as possible (luggage, additional backpacks and cases, additional jackets) in the car to limit crowded storage in the compound/truck. Individuals who do not have their own vehicle onsite should build additional travel time into their schedule to return to their hotel to pick up their luggage after the show.

2.7 Is more time needed for setup and strike of the compound?
The short answer is yes: more time needs to be set aside for setup and rigging. Consider adding several additional hours for park/power-up of the compound, checking gear, and sanitization. Also account for more time during strike to allow all equipment and cables to be re-sanitized. This will make it much easier for the next production team to get to work without having to worry about first cleaning the cables. Be sure to read the instructions for the cleaning solution, because some of it must be sprayed and then allowed to dry prior to wiping. In general, working with masks and, potentially, gloves will likely slow down the work process. Social distancing will mean that many more functions need to be performed by just one person.

2.8 What about foul weather or severe weather? Do tents need to be provided to keep crew members at a good distance from each other? If so, should the tents be located in the compound?
No central emergency shelter area is factored into the compound design.

The American Cornhole League had a producer onsite, with the rest of the Tupelo Honey production team back at the broadcast center.




Unlock the value of your content with Google Cloud AI Get started: cloud.google.com/media

Enrich your video content

Search your library seamlessly

Activate audience insights

© 2020 Google LLC, 1600 Amphitheatre Parkway, Mountain View, CA 94043.



All personnel are encouraged to return to their personal vehicles. If they do not have personal vehicles, they should evacuate to the venue’s safe-shelter area or return to their hotel.

2.9 For productions deploying both an A unit and a B unit, could there be an overlap of personnel working in both?
If the A and B units are in the same bubble, selected personnel (such as engineering staff who need to support equipment in both) can work in both units. But operators should need to access only the area where they work, relying on intercoms, radios, text, etc., for communicating with staffers in other units. If the units are in separate bubbles, personnel cannot work in both, because doing so risks exposing more crew members to the virus in the event of an issue. This may require additional engineering staffing, so plans should ensure that bubbles have dedicated engineers available to solve issues, problems, and repairs. If additional engineering personnel (for example, a separate EIC overseeing each unit) are not available to service multiple bubbles, it is recommended that those crew members be given additional “bubble credentials” so that they can handle emergency repair and technical support. Another alternative is to create a separate area where the EIC can remotely access and oversee equipment within various mobile units without having to be physically inside the unit.

2.10 If equipment breaks in the stadium and needs to be replaced, is there a DMZ so that the tech inside can enter the compound?
No. Crew members should be isolated in their own areas/bubbles. Individuals from the “outside bubble” can enter the compound, but they cannot enter any trucks or trailers.

2.11 Does a general DMZ space need to be created so that personnel from different groups can meet? How big should it be?
No dedicated space should be created for meetings inside the compound. Large meetings should be held virtually via intercoms/radios/text/phone. If in-person meetings must be held, they can be held in open-air areas within the compound and must respect social distancing and PPE.

2.12 Can any crew positions be located outside of the truck, or do they all need to be in a dedicated space inside a trailer?
The more crew positions located outside the main production trailers — or even out of the compound — the better; the goal is to keep the crew as small as possible. If additional space/trucks cannot be secured onsite and outside workstations/flypacks are absolutely required, they must be set up at socially distanced locations.

2.13 Does there need to be a dedicated medical-testing area in the compound?
If possible and space permits, a dedicated medical-testing area should be located at the compound. The final decision sits with the league/broadcaster/venue.

2.14 What to do with smokers? Should there be a separate smoking area for each working group/bubble?
No smoking area will be provided in the compound. Smokers must plan accordingly.

2.15 Is there a staging area for crews about to start their shift?
No. Everyone stays in their car until they enter the compound. If entry to the compound is staggered, it is recommended that personnel remain in their car until their entry time, to prevent gatherings near the compound entrance.




UA SERIES / 4K UHD LENSES

UA125 & UA46: THE EVOLUTION CONTINUES
NEW 4K HDR BROADCAST LENSES FROM FUJINON

4K demands a higher dimension of performance, and our expanded lineup of 4K broadcast lenses meets this challenge. Extending the limits of “High Resolution”, “High Contrast” and “High Dynamic Range”, Fujinon’s cutting-edge optical technology presents the next standard in optical performance — image quality that exceeds the high expectations of imaging professionals.

UA 4K SERIES 2/3" PORTABLE ZOOM LENSES:
13x4.5 | 14x4.5 | 18x5.5 | 18x7.6 NEW | 22x8 | 23x7.6 NEW | 24x7.8 | 46x9.5 | 46x13.5

UA 4K SERIES 2/3" STUDIO & FIELD BOX ZOOM LENSES:
27x6.5 | 70x8.7 | 80x9 | 107x8.4 | 107x8.4 AF NEW | 125x8 NEW

FUJIFILMUSA.COM | FUJINON.COM



PART 3 COMPOUND ENTRY

This section lays out recommendations concerning compound entry. Please note that every production will be different and that these recommendations are simply a starting point for developing a more comprehensive plan.

3.1 Should there be a single point of entry to the production compound, or are multiple points of entry preferred?
It is advised that there be a single point of entry to the entire production compound for all purposes. If crew size warrants, a staggered arrival schedule is recommended to prevent bottlenecks at entry.

3.2 How should social distancing be set up at crew call/compound entry?
Staggered arrival times will help prevent crowding at the entrance. If crew members arrive and there appears to be a crowd or line at the compound entrance, they should exercise enhanced social distancing (more than 6 ft. is ideal), wear a mask, and wait for those ahead to enter the compound. If possible, identifiers on the ground (tape/chalk/paint markings) make it easy to maintain social distancing. Staggered start times also help ease congestion, and common sense should apply. Simply put, if crew members arrive and there is a crowd or line, they should maintain their distance, be patient, and approach when congestion has eased.

3.3 Should COVID-19 testing be conducted onsite at the entrance to the compound?
No. At this time, COVID-19 testing is not fast enough, practical to implement, or accurate enough to make it part of the compound-entrance protocol. Instead, it is recommended that every single person who enters the TV compound (from crew members to third-party support) be required to have their temperature taken every time they enter the compound. The industry-accepted temperature of a person entering the production compound should not exceed 100.4 degrees Fahrenheit. That temperature-check result supersedes all proof of negative testing.

Game Creek Video trucks onsite in Orlando in the NBA bubble


3.4 What is the protocol if a person entering the compound registers a higher temperature?
A crew member who exceeds a temperature of 100.4 degrees Fahrenheit must be removed from the show.





A third-party support person who registers a temperature in excess of that number will not be granted access to the compound.

3.5 Should initial entry into the compound be coordinated with the league/venue’s overall entry protocols for entering venue grounds?
Yes, whenever possible. The compound-entry process should be coordinated with overall entry to the facility (through the league and/or host). At entry, crew members should be given an identifier (colored wristband, colored credential, etc.) for a specific “bubble/work location” in which league/venue staff with that ID color will have free rein to roam.

3.6 Should entry/exit of compound be limited in any way?
Yes, movement in and out of the compound should be limited. That movement may be directed by the working group, or access may be granted via crew credential. Crew members will be required to have their temperature checked every time they enter the production compound. For more on this question, see the Personnel Management section of this document.

3.7 How should credentialing be handled? Should credentials be delivered in the mail prior to the event (when possible)? Will there still be a designated place to obtain a credential onsite? Should credentialing be digitized (for example, credentials handled through an app and entry granted as with a mobile boarding pass)?
If physical credentials need to be handed out, a designated location should be established either just outside or just inside the compound entrance. The credential staff area should feature plenty of space for credentialing personnel to do their jobs without violating social-distancing guidelines, feature personnel wearing masks, and provide a plexiglass barrier to protect the staff distributing credentials. It is recommended that the plexiglass barrier be 6 ft. tall, be as wide as the table, and have a cutout through which credentials can be safely passed. At this location, it is also recommended that crew members receive colored wristbands identifying their working group and a map showing the locations of key spots in the compound and where in the venue they have access. NOTE: Some organizations are beginning to use smartphone apps for credentialing. Although this is a heavy lift that may take some time to implement, it is encouraged that staff be credentialed this way if it is possible on the particular production.

3.8 What is the protocol for sanitizing articles of clothing, including shoes/backpacks, upon entry into the compound? Is there a limit to the size of the backpack?
It is recommended that crew members entering the compound limit what they bring in to a normal-size backpack to minimize their physical presence within the compound. Larger duffel bags, suitcases, backpacks, etc. need to be stored in their owner’s car or back at the hotel. Cleaning of incoming backpacks, jackets, etc., is not required. If desired, it is recommended that UV wands be provided.

3.9 Are crew members required to sanitize hands immediately on entry to the compound at a designated station?
Yes. Hand-cleaning stations should be used by all incoming personnel.

3.10 Is it possible to conduct metal-detector tests while maintaining all safety measures?
Yes, but additional detectors may need to be installed to prevent people from having to congregate.

3.11 Is PPE gear distributed immediately upon entry? If so, how? Is it done prior (via mail/at hotel) to arrival at the compound?
Ideally, PPE is distributed to crew members in “go bags” mailed to them at home and including all the PPE required. If that is not possible, PPE bags should be available onsite and at lodging locations.

3.12 Do third-party entities (catering, fuel, power personnel) come in through the primary entrance?
Yes. They must have their temperature taken and be required to wear a mask.






This section deals with protocols for entering and exiting a truck, sanitization recommendations, signage, cleaning workstations, and more.

PART 4 ENTERING & EXITING THE TRUCK

4.1 What are some general guidelines for how to enter a production truck safely in the age of COVID-19?
The most important thing is to make sure you are allowed to enter the production truck/production trailer. Many productions are creating "bubbles" composed of working groups with their own dedicated facilities. If you are not sure you can enter, please assume you cannot and contact production management to find out if you can enter. If you are allowed to enter the unit, you are required to:
• Make sure that your mask is on.
• Clean your hands prior to touching railings or door handles (if possible, avoid touching them). If everyone follows these steps, the risk of the handles and railings being contaminated falls greatly.
• Please note that if you are wearing gloves, those need to be cleaned as well, as an unclean glove can carry the virus.
Once inside the truck, please wash/disinfect your hands again.

4.2 How is signage handled at the entrance of the truck? How can signage be as specific as possible?
Signage should be placed both at the bottom of the stairs and at the door into the mobile unit. Signage should include reminders on safety policies, working groups that are allowed in that area, reminders about face covering and hand sanitizing, etc.

4.3 Are there antimicrobial substances that can be used on handles?
The EPA has a list of recommended cleaning substances. To see the approved list, please CLICK HERE. One note: in general, spraying and quickly wiping does not kill the germs and virus but simply moves them from one point to another. Proper spray cleaning involves spraying, letting it sit for a number of minutes, and then wiping.

Top Rank Boxing was ESPN's first in-house live production to return following the sports shutdown.







PART 4 ENTERING & EXITING THE TRUCK continued 4.4 If a production unit only has one door, what is the recommendation for entering and exiting? If the trailer or unit only has one door, it is recommended that: • Those who are outside of the unit remain respectful in terms of distance from the stairwell so that those inside can easily exit. • Those who are inside of the unit, if possible, should be more than six feet away from the door so that someone entering the truck can enter without breaking social distancing rules (depending on the floorplan). • When entering or exiting the unit, please sanitize hands prior to touching handles or bars. Ideally, avoid touching handles or bars (use shoulder, elbow, etc.). 4.5 Should you only have one door for entry and another door for exit so people don’t need to pass on stairs? Or should specific groups enter/exit out of a specific door? The group determined that it was more beneficial to segregate the truck by function group/work areas (i.e. “control room”) and have groups use a designated set of stairs rather than need to sidle by others to get to an exit staircase. Also, stair placement can be impacted by compound layout, so it’s not always feasible to have two sets of stairs properly located for egress and entrance. 4.6 Do crew members need to clean their shoes prior to entering the truck? The recommendation of the group is no, that would be minimally effective. But if there are shoe cleaning pads/materials provided, they are there for a reason and it is recommended that — out of respect — cleaning should take place. 4.7 If shared headsets have a UVC clean box or electrostatic hand sanitizer (Zogics), should it be located inside of the truck, near the entrance? Given that there will be fewer people in the truck, it is recommended that an area close to the entry to the truck or tent be set aside for a UVC clean box as well as other cleaning supplies. Small UVC cabinets take only a couple of minutes to sanitize a headset, radio, or other unit, so please simply follow the instructions of the UVC unit. 4.8 Is there a need for someone to be stationed near the entrance to make sure no one goes in who should not (and also make sure each person has a mask, gloves, etc.)? Who provides this person, and is this person a professional or PA? While it is not recommended that someone is dedicated to each unit/trailer, there should be someone within viewing distance of the unit to monitor traffic and also deal with any issues. It is recommended that this person be a management representative (i.e. production or technical manager, when possible). It is up to each individual crew member to be vigilant and respectful of where they should and should not go. 4.9 Do there need to be different rules for office trailers vs. production trucks? The same entry/exit rules should apply to all facilities, if possible (i.e. some smaller production units may only have one door). Each facility has its own unique dynamics, but basic principles must apply across all trailers. 4.10 What are considered necessary work materials that are allowed to be brought inside the truck and can fit within your own personal workstation area? It is recommended that individuals be allowed to bring in their own personal briefcase or small backpack with their work tools enclosed (i.e. computer, timer, etc.). All personal cases must fit comfortably under the individual workstation. 4.11 What are best practices for exiting the truck after the workday and/or during strike? 
Upon strike, all crew are responsible for wiping down/disinfecting their own workstation, removing all external materials from their area, and returning equipment upon exiting the truck for the night. If there is a personal item you would like to leave in the truck overnight, you must clear it with the EIC/ truck manager first.







PART 5 THIRD-PARTY PROVIDERS

This section is designed to provide general guidance to third-party production entities that may arrive and need additional guidance for fitting into the overall safety plan, steps they need to take to ensure they are able to operate at full capacity, and more.

5.1 What type of companies involved in a sports production are considered a third-party?
More than ever, all third-party providers should consider themselves part of the core team. This means a number of things:
• Adhering to the safety protocols laid out for the production
• Understanding compound entry/exit guidelines
• Understanding that testing of a variety of types will be done onsite
• Possibly signing various waivers with respect to self-monitoring, pre-event activities, and more. They may be required not only by the production but also by the venue, league, team, and more.
In addition, subcontractors, if there are any, will also be expected to adhere to the same policies and guidelines.

5.2 What steps should a third-party take prior to an event to ensure that they adhere to any travel and/or transport guidelines?

The previous ways of traveling prior to an event are no longer the status quo. Providers contracted for a job should ensure that the client shares all of the relevant quarantine requirements, testing requirements, and screening requirements. The client will most likely be able to share comprehensive guidelines so that the third-party team can be fully prepared to work on the production and be an important part of the show. But it is critical that this information be provided as soon as possible, because the protocols sometimes need to begin days before the actual event. Also, comprehensive details on travel from home to the event should be provided.

5.3 What should a third-party vendor expect in the way of testing and/or changes to pre-COVID-19 protocols, such as entering and exiting the compound and venue?
Because each client will have different protocols and methods, third-party vendors should have a clear understanding of how the operations will take place and, at the least, should be prepared to undergo a temperature check and possibly an antibody test. In certain situations, third-party teams may need to be not only tested but also quarantined while awaiting test results and should expect to sign a release for sharing test results with management onsite. In general:
• Ensure that travel arrangements allow for testing and waiting for results.
• Contact the client and make sure timing requirements are understood.
• Subcontractors need to be made aware of timing and testing requirements.

5.4 By the time a third-party company arrives onsite, its personnel, facilities, and equipment often have been in transit. What kind of travel details should a third-party company provide so that the client has an understanding of possible exposure and risks?
As many details as possible should be provided to the client, and travel events that could lead to accidental exposure should be tracked. Tandem drivers should keep track of things like rest stops, meals, etc. If team members have taken a commercial flight, the client might require additional days of quarantine, and, if it does, travel plans should be adjusted accordingly. Also, extra buffer days should be planned in case of delays due to travel, testing, etc. Third-party vendors may be given a staggered arrival time to maximize safety and ensure that arriving entities can be handled properly. If equipment is shipped with safety seals, it must arrive with the seals intact. If the seals are not intact, the equipment should be properly cleaned and sanitized.


5.5 Does a third-party provider need to provide extra crew/staff onsite in case someone falls ill?
Every entity involved with the production is expected to provide additional staff and crew in the event of crew-member illness. Third-party vendors should contact the client to learn what the requirements are with respect to backup crew and personnel. Those individuals are likely to be quarantined in a hotel in the event they are required.






PART 5 THIRD-PARTY PROVIDERS continued


5.6 If the third-party provider's guidelines don't match those of the client or the compound-management team has a third guideline, which must be followed?
To ensure that everyone is operating under the same guidelines, the client's protocols and best practices supersede all other guidelines. Third-party guidelines that exceed the client's (are more conservative, require greater distancing or more cleaning) should be adhered to, but client guidelines should be considered the minimum acceptable practices. Again, those protocols should be understood as early as possible so that third-party employees are not denied entry and/or required to leave.

5.7 How can a third-party bring in a repair technician or have refueling take place if guidelines may, at first glance, prevent someone from entering the compound without comprehensive screening and testing?
In an emergency situation, steps can be taken to allow support personnel to come in without being a risk to the production. First, they should expect to be subject to all testing and PPE protocols and should expect to be delayed in entering the compound. Alert the production-management team to the possibility of new personnel's entering the compound or venue and find out the steps to take for them to be in compliance.

5.8 If a third-party has multiple staffers, do they need to arrive in separate vehicles, or can they travel together?
It is recommended that staffers arrive in their own rental vehicle or car, but production management can provide accepted transportation options.

5.9 When a third-party delivers equipment to a compound, what steps should be taken to ensure that the equipment is clean?
All equipment is required to have an inspection sticker indicating the date it was cleaned by the supplier. If possible, suppliers should provide extra cleaning kits with each shipment specific to the equipment type.

AVP Beach Volleyball returned with onsite truck productions for its Champions Cup in Long Beach in July.







This section is focused on catering. We respect that catering is a customized offering highly dependent on the budget of the show and the size of the crew, and is also largely determined by union guidelines that might be specific to the production or geographic region. The guidelines outlined in this document are simply suggestions that have proved successful in some of the first productions executed since the COVID-19 outbreak.

The Fox Sports Detroit catering area


PART 6 CATERING

6.1 Should catering be offered?
Yes. It is recommended that catering still be offered on all productions. The goal of this section is to offer general guidelines for delivering that catering in a safe and clean manner.

6.2 Does the length of the production day impact whether meals should or should not be served? If so, what's the cutoff mark?
The recommendation is that the length of the production day should not matter. The same offerings as those prior to COVID-19 should be provided on productions. Also, all union rules should be referenced and abided by.

6.3 Can a buffet still be used?
It is advised that all buffet-style service be abolished.

6.4 How should meals be distributed?
The industry is recommending that a location in the compound be designated to enable crew members to eat in a safe and socially distant manner. High levels of traffic to this area should be reduced by specifying meal shifts for crew members. A food-serving professional should serve meals in a box placed at an assigned seat. Crew members are requested to keep their masks on until sitting at the seat. At that point, the mask can be pulled down or removed for eating. Crew members should be required to sanitize their hands prior to entering the designated eating area. It is recommended that a hand-sanitizing station be placed at the entrance to the area. And it is recommended that the designated eating area be cleaned following each eating shift.

6.5 Where should meals be ordered from?
Where food is ordered from is truly a case-by-case circumstance. The production may choose to order from a local restaurant or catering service or, if the venue allows for it, use the food-service options within the venue. What should be universal is that any food-service professionals and food deliveries should clear the guidelines outlined in the Compound Entry section of this document.






PART 6 CATERING continued


6.6 What are the requirements of the catering company?
It is recommended that any catering option have food-service certification. Any food-service professionals entering the compound to handle and deliver food should wear masks at all times.

6.7 How are meal choices determined?
This is determined on a case-by-case basis and depends on the size of the crew and the budget for the show. If possible, crew members may be offered a menu to choose from prior to arriving onsite or on the morning of each production day. Otherwise, a basic rotation of meal options will suffice. (Note: the needs of vegetarians, vegans, and those with specific food allergies should be respected.)

6.8 How should water be distributed? Will other drinks be offered?
It is encouraged that sufficient hydration be provided for all crew members. Water should be distributed in single-serving options. The best option is single-use bottles spaced out on a table. A single cooler where crew members help themselves to drinks is discouraged. When appropriate, crew members may be encouraged to bring their own reusable water bottles. If they are asked to do so, a communal refrigerator and/or cooler is not recommended. It is acceptable to allow crew members to bring their own drinks (sports drinks, soda, etc.) with them.

6.9 Should coffee be offered?
It is not currently recommended that coffee be provided. However, if your production chooses to offer coffee, it should be distributed in single-serving cups filled by a single professional dedicated to pouring the drinks. A communal coffee machine, pot, or carafe where crew members help themselves is highly discouraged.

6.10 How will craft services occur?
It is advised that the craft-services table be abolished. If a craft-services table is necessary, every item offered should be in a single-serving package set out on the table. Items should not be placed in a bowl for crew members to dig through. In lieu of the craft-services table, many broadcasters are choosing to give each crew member a curated snack box prior to the start of the production. Crew members are discouraged from sharing or trading the items in it.

6.11 Are crew members allowed to bring their own food and drink to the site?
Yes. However, it is discouraged that communal coolers and/or refrigerators be used in the compound. Crew members may be allowed to bring a small cooler to keep with them at their workstation.

Yellow wrist-band lunch







PART 7 VESTS/ACCESS WEAR

This section is focused on items required to be worn on a crew member's person to gain access or authenticate their presence in restricted areas of the live venue. This is not to be confused with swag.

7.1 How should vests/access wear be distributed? Where should crew members pick them up?
If possible and practical, vests/access wear required to be worn should be mailed to staffers' homes prior to the event. If that is not possible or practical, it is recommended that a station be set up to distribute these items just inside the entrance to the TV compound. It is recommended that this station be as visible and obvious as possible to obviate crew members' looking through multiple trailers to find it. All vests/access wear should be distributed by someone wearing a facial mask. If they need to be handled and broken down by size, a face mask should be worn. Hands should be washed prior to distribution of each item.

7.2 Do crew members keep the vest/access wear for the entirety of the production (especially in multi-day events)? Or do they need to return the items prior to leaving the compound at the end of every day?
It is recommended that crew members retain their vest/access wear for the entirety of the production. For a multi-day production, it is not recommended that crew members turn these items in at day's end unless there is genuine concern that the items may have been contaminated or come into contact with someone ill during the day. Crew members are encouraged to hold onto these items for the duration of the event. If it is deemed necessary to wash the vest/access wear each night, it is recommended that the production provide the crew member with multiple vests at check-in. In these circumstances, washing the vest/access wear would be the responsibility of the crew member. When vests/access wear are returned to the operations team at the conclusion of the event, collection should be done by an individual wearing PPE, and the items should be sent off for full cleaning/sanitization prior to their use on a future production.

7.3 When applicable, should crew members receive a vest/access wear to keep for a whole season? If so, should a crew member receive two (or more) items to ensure that they can wash their vest/hat each night?
Yes, on both accounts. The ultimate goal is to have as few hands as possible on a vest/access wear. A crew member working on a multiple-week season should be responsible for the items, cleaning and keeping them in good condition. In these instances, multiple vests/access wear should be delivered or distributed to the crew member to allow the items to be rotated and kept clean.

7.4 Are the vests/access wear sanitized each night? Does the crew head ensure that the items are picked up by a local laundromat and returned the next morning? If so, who is responsible for coordinating this?
It is not recommended that these items be collected every day and sent out for cleaning. Again, the goal is to ensure that chain-of-custody is as limited as possible. With only the crew member handling the vest/access wear, there is no need for nightly sanitization.

7.5 What should crew members do if they think their vest/access wear has been compromised?
A crew member concerned that an item of clothing has been contaminated (for example, accidentally touched by someone else) should contact the production-management team for possible replacement and/or cleaning instructions.

7.6 Can a UV cleaner box clean the vests onsite?
If it is necessary to sanitize vest/access wear items in the compound, a UV cleaner box can be used for this purpose. However, the items must be hung within the box so that the entire item can be exposed to the UV light. Folded vests placed in the box will not be sanitized properly. Otherwise, cleaning of these items can be handled through traditional laundry services offered at the hotel, the athletic venue, or a local laundromat.






PART 8 AUDIO

This section deals with handling and cleaning microphones and audio equipment, such as radios, and how to mike up talent safely.

8.1 General Recommendations
• Crew must wear masks and gloves, including work gloves while handling equipment.
• DO NOT use any disinfectant sprays directly on the equipment; it could cause irreparable damage.
• Always use safe, approved disposable wipes on equipment. Wipes are usually 70% isopropyl alcohol (IPA); higher concentrations are no more effective. IPA is a great disinfectant: it kills bacteria, fungi, and viruses. It also dissolves oils and grease buildup and dries fast without leaving residue.
• The important detail is to sanitize hands after touching an item that may have come into contact with contamination. Common items (cellphones, pens, notepads, glasses, keys) are often the source of contamination, being used often and usually without hand-sanitizing.

A1 Joe Carpenter in Game Creek's Riverhawk production truck outside MLB Network's studios

8.2 What is the best way to mike talent when social distancing is a concern?
Crew members must wear masks, sanitary gloves, and safety glasses or a face shield when working with talent, and they should wear gloves and wash their hands before and after working with each member of the talent. If medical gloves are running low or unavailable, gloves may be used beyond the manufacturer-designated shelf life in lower-risk settings, if feasible (for example, non-surgical, non-sterile work with people who have no known COVID-19 diagnosis). The user should visibly inspect the gloves prior to use and, if there are concerns (for example, discolored or visible tears or holes), discard the gloves. Extend the use of medical gloves by not changing the gloves between people with no known infectious diseases. Gloved hands should be cleaned between individuals and at other times when hand hygiene would normally be performed. Alcohol-based hand sanitizers may degrade vinyl gloves. If a glove becomes damaged (for example, discolored, deteriorated, visible tears or holes), contaminated (for example, with body fluids), or no longer provides a liquid barrier, replace it. Consider using non-medical gloves, such as those used for food service, embalming, cleaning, or other industrial-grade gloves that most closely align with the ASTM standards for medical gloves as outlined in the FDA's Medical Glove Guidance Manual. Be aware that counterfeit medical and non-medical gloves may be on the market, especially during this time of increased demand.
Reusable masks, face shields, safety glasses, and containers should be disinfected and placed in a sealed, sanitary storage case until next use. Used gloves should be safely discarded at the end of the shoot, and hands should be washed before and after disinfecting equipment.

8.3 Can headsets and earpieces be shared?
It is recommended that each crew member and talent have their own headset and earpiece. At the start of a shift, the new user should put on their own disposable covers and windscreens before use. Shared headsets may be used with sanitary earcup covers and windscreens.

8.3 USEFUL LINKS
https://www.amazon.com/Stretchable-Headphone-Disposable-Sanitary-Large-Sized/dp/B01MZABCNG/ref=sr_1_6?crid=3559VFAUXUBNE&dchild=1&keywords=headset+covers+disposable&qid=1591280498&sprefix=headset+cover%2Caps%2C144&sr=8-6
https://www.amazon.com/Disposable-Sanitary-Earpiece-Covers-Headphones/dp/B00C7BXWGO/ref=sr_1_3?crid=3559VFAUXUBNE&dchild=1&keywords=headset+covers+disposable&qid=1591280575&sprefix=headset+cover%2Caps%2C144&sr=8-3
https://www.amazon.com/Wode-Microphone-Windscreen-Headset-Protection/dp/B07D2B8HXN/ref=sr_1_8?dchild=1&keywords=foam+windscreen&qid=1591280696&sr=8-8







PART 8 AUDIO continued
At the end of a shift, users should remove and throw away any used headset covers and windscreens and wipe down headsets with disinfectant wipes. Also, ear cushions and windscreens (microphone covers) or voice tubes should be replaced every time a new person uses the headset. Leatherette and foam ear cushions and reusable windscreens (microphone covers) or voice tubes should be replaced every six months or sooner if they become clogged with makeup or otherwise soiled. Headset plastics, consoles, and equipment should be cleaned regularly with approved wipes, especially when a headset is assigned to a new user. A secondary sanitization step using UV-C or heat may also be used to improve disinfection. And, again, after equipment has been disinfected, it should be placed in a sanitary sealed container to prevent contamination before its next use.

8.4 What are the best processes for deploying microphones?
Microphones should be wiped with approved disinfectant wipes before and after use. Remove any windscreens (microphone covers) from the microphone boom to allow the surface to be wiped down completely. Disinfected microphones and headsets should be placed in sanitized storage containers to prevent contamination before their next use. Microphone and headset windscreens need to be changed regularly.

8.4 USEFUL LINKS
https://www.amazon.com/Wode-Microphone-Windscreen-Headset-Protection/dp/B07D2B8HXN/ref=sr_1_8?dchild=1&keywords=foam+windscreen&qid=1591280696&sr=8-8
https://www.amazon.com/Windscreen-Microphone-Handheld-Perfect-Recording/dp/B07L9S1YVZ/ref=sr_1_18?dchild=1&keywords=foam+windscreen&qid=1591280734&sr=8-18

8.5 How should windscreens be safely deployed?
It is recommended that new windscreens be used, but, when foam versions must be reused, they may be cleaned of visible debris, then washed with mild detergent and water or an approved disinfectant, such as 70% isopropyl alcohol. Note that the windscreens must be completely dry before being reinstalled on the microphone or stored in individual sanitary containers until next use.

8.6 How should lavalier microphones be handled?
First, crew members must wash their hands before and after handling the mic. The A2s must wear gloves when they touch equipment that the talent will handle. Face shields are necessary for A2s since they may need to be up close to talent to make adjustments and provide assistance. Ideally, the talent will be able to place the lav mic themselves, but, with the different uses, lav mics, and pack configurations, it is likely that the talent will need some sort of help. Wireless and lavalier microphones must be disinfected before and after each use. If the microphone is to be used multiple times by the same person, it may be placed in a sealed container, such as a zip-lock bag, between uses. The microphone, cables, and wireless pack must be disinfected after final use and returned to its sanitary storage case. A foam windscreen must be either replaced with a new one or properly disinfected and dried before being returned to its sanitary storage case.

8.7 What are some recommendations for cleaning hard surfaces of equipment in the audio area?
First, disposable gloves should be worn for cleaning and disinfecting surfaces. Reusable gloves should be dedicated to cleaning and disinfection of surfaces for COVID-19 and should not be used for other purposes. Follow manufacturer instructions for the cleaning and disinfection products used. Hands should be cleaned immediately after gloves are removed.
If surfaces are dirty, they should be cleaned with a detergent or soap and water prior to disinfection. For disinfection, most common EPA-registered household disinfectants should be effective. Follow manufacturer instructions for all cleaning and disinfection products (concentration, application method, contact time, etc.) because they can all be deployed differently. Additionally, diluted household-bleach solutions (at least 1,000 ppm sodium hypochlorite, prepared from household bleach with a 5%-6% sodium hypochlorite concentration) can be used if appropriate for the surface. Follow manufacturer instructions for







application, ensuring contact time of at least one minute and allowing proper ventilation during and after application. Check to ensure that the product is not past its expiration date. Never mix household bleach with ammonia or any other cleanser. Unexpired household bleach will be effective against coronaviruses when properly diluted. Prepare a bleach solution by mixing:
• 5 tablespoons (1/3 cup) bleach per gallon of room-temperature water or
• 4 teaspoons bleach per quart of room-temperature water
Bleach solutions will be effective for disinfection up to 24 hours.

8.8 Any recommendations on cleaning soft (porous) surfaces?
For soft (porous) surfaces, such as carpeted floor, rugs, and drapes, remove visible contamination if present and clean with appropriate cleaners indicated for use on these surfaces. After cleaning, launder the items as appropriate in accordance with manufacturer instructions. If possible, items should be laundered in the warmest appropriate water for them and dried completely. Otherwise, products that are EPA-approved for use against the virus that causes COVID-19 are suitable for porous surfaces.

8.9 Electronics like cellphones, tablets, touchscreens, keyboards, etc. also need to be cleaned. Any recommendations?
With electronics (cellphones, tablets, touchscreens, remote controls, keyboards), remove any visible contamination. Follow manufacturer instructions for all cleaning and disinfection products. Also consider use of wipeable covers for the electronics. If no manufacturer guidance is available, consider the use of 70%-alcohol-based wipes or sprays to disinfect touchscreens. Dry surfaces thoroughly to prevent pooling of liquids. All equipment should be wiped down with disinfectant wipes when taken from the storage case and disinfected again before being returned to it. To prevent cross-contamination, if the equipment must be moved or put back into the case between uses, it must be placed in a sealed container, such as a zip-lock bag, before being put into the case. After final use, the equipment must be properly disinfected before being returned to the storage case. Wipes must use 70% isopropyl alcohol or other approved disinfectant, and UV-C sanitization can also be used if the equipment is first wiped clean of dirt and oil residue. As always, crew members must wash their hands before and after disinfecting the equipment.
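As a rough sanity check on the recipes above, the short calculation below estimates the resulting sodium hypochlorite concentration, assuming household bleach at about 6% strength; the volume conversions and helper function are illustrative only and are not part of the guide or any manufacturer's instructions.

```python
# Rough check of the bleach dilution recipes, assuming ~6% sodium hypochlorite
# household bleach. Volumes are approximate conversions; always follow the
# manufacturer's instructions for the actual product.
def hypochlorite_ppm(bleach_ml: float, water_ml: float, bleach_strength: float = 0.06) -> float:
    """Approximate sodium hypochlorite concentration (ppm) of the diluted solution."""
    return bleach_strength * bleach_ml / (bleach_ml + water_ml) * 1_000_000

print(round(hypochlorite_ppm(74, 3785)))  # 5 tbsp (~74 mL) per gallon (~3,785 mL) -> ~1,150 ppm
print(round(hypochlorite_ppm(20, 946)))   # 4 tsp (~20 mL) per quart (~946 mL)    -> ~1,240 ppm
```

With 5% bleach, the gallon recipe lands just under 1,000 ppm, which is why erring toward the full 1/3 cup (or using fresher, stronger bleach) helps keep the solution at or above the threshold cited earlier.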

8.10 Any suggestions for cleaning a radio, such as removing heavy dust, soil, mud, grime, stains, etc.?
Prepare a solution of a non-abrasive dish detergent and water, with no more than 0.5% detergent in the solution. Some manufacturers recommend using distilled water. Apply the solution to the surface of the radio with a soft, non-abrasive cloth. Note: do not apply any liquid directly to the surface of the radio; apply it to the cloth, then wipe the radio with the cloth. Next, use a stiff, non-metallic, and short-bristled brush to loosen and remove dirt from surface and crevices of the radio. Wipe the debris and moisture away with a dry, soft, lint-less, absorbent cloth. Be sure to remove all moisture from the radio, including any metallic contacts, connector ports, cracks, and crevices. Also, allow the radio to fully dry before attempting to install the battery/batteries, charge, or use the device.

MotoAmerica A1

8.11 What steps are best for disinfecting a radio?
Wipe down the radio using IPA in a 70%-80% concentration; below 70% will not be effective. Apply the isopropyl alcohol to a soft, non-abrasive cloth and wipe the surface of the radio. Do not apply the solution directly to the radio. Be sure to wipe into the cracks and crevices in the radio to effectively disinfect it. Some radio manufacturers allow the use of an antibacterial wipe, but excess liquid must be squeezed out of the wipe first so that it is merely damp, not wet, to avoid over-saturating the radio with fluid. Be sure to remove all moisture from the radio, including metallic contacts, connector ports, cracks, and crevices. Allow the radio to fully dry before attempting to install the battery/batteries or charge or use the device.

8.11 USEFUL LINKS
https://www.buytwowayradios.com/blog/2020/04/how-to-clean-and-disinfect-your-two-way-radios.html







This section concerns recommendations for how to choose the correct cleaner/disinfectant, how to apply cleaning materials and ensure cleaning staff are protected, basics on air filtration, and more.

PART 9 CLEANING & DISINFECTING 9.1 What steps can be taken to ensure that disinfection is done properly? When disinfecting against SARS-CoV2 (the virus that causes COVID-19), use disinfectants that are on EPA’s List N: Disinfectants for Use Against SARS-CoV-2 and formulated with the active ingredients recommended by EPA’s Design for the Environment Logo for Antimicrobial Pesticide Products. (As of May 2020, the active-ingredient list includes hydrogen peroxide, citric acid, L-lactic acid, ethanol, isopropanol, peroxyacetic acid, and sodium bisulfate). If concentrated disinfectants are diluted to the proper solution using a portion-control device, put in place a testing protocol to ensure that the correct dilution rate is achieved. Inexpensive test strips (under 10¢ each) are available for many disinfectants. Cleaning-chemical products should meet EPA Safer Choice Standard; Green Seal standards GS-37, GS-40, GS-52/53; UL Ecologo 2792, 2795, 2777, 2798, 2791, 2796, 2759; or should be used only with devices that use water, ionized water, electrolyzed water, or aqueous ozone and have thirdparty–verified performance data equivalent to those standards. If the device is marketed for antimicrobial cleaning, performance data must demonstrate antimicrobial performance comparable to EPA Office of Pollution Prevention and Toxics and Design for the Environment requirements, as appropriate for use patterns and marketing claims. 9.2 What type of hand soaps should be used? First, prioritize hand-washing with plain soap and water over hand sanitizers when possible. Hand soaps should meet one or more of the following standards: EPA Safer Choice, Green Seal GS-41, or UL Ecologo 2784. Or they should have no antimicrobial agents (other than as a preservative) except where required by health codes and other regulations (for example, food-service and healthcare requirements). When soap and water are not available, use hand sanitizers that contain at least 70% alcohol. 9.3 Is there a recommendation for paper towels, mops, buckets, etc.? Use paper towels, wiping/drying products, mops, buckets, and other tools that meet one or more of the following standards: EPA comprehensive procurement guidelines for janitor paper and plastic trash-can liners; Green Seal GS-01 for tissue paper, paper towels, and napkins; UL Ecologo 175 for toilet tissue and hand towels; or FSC certification for fiber procurement. Also, use cleaning equipment with ergonomic-design features to reduce worker injuries from, for example, vibration, noise, and user fatigue. 9.4 Any recommended procedures on cleaning and disinfection? Procedures should meet the joint requirements of CDC and EPA on Reopening Guidance for Cleaning and Disinfecting Public Spaces, Workplaces, Businesses, Schools, and Homes. Procedures should also optimize cleaning-personnel resources and minimize unnecessary use of valuable cleaning products and equipment. Do not overuse or stockpile disinfectants or other supplies. When possible, adjust spaces to minimize frequently touched surfaces and regularly update cleaning personnel on building-occupant activities to ensure that cleaning aligns with the way the building is being used. Also identify “high-touch points” along with frequencies for cleaning and disinfecting the different objects so designated and have procedures for quantitative testing of surface cleanliness. 9.5 What do we do about protecting those who are cleaning? 
Provide personal protective equipment (PPE), including eye protection, masks, gloves, and gowns, for all cleaning personnel as required by the products and processes being used. Also, consider the requirements of the building and its occupants relative to COVID-19. Use tools, equipment, and procedures that reduce worker ergonomic injuries (for example, to the back, shoulders, and knees). Also, train personnel on how to properly put on PPE, take it off, and dispose of it. Train personnel on the hazards of the cleaning chemicals used, in accordance with OSHA's Hazard Communication standard (29 CFR 1910.1200), and comply with OSHA's standards on







Bloodborne Pathogens (29 CFR 1910.1030), including proper disposal of regulated waste and PPE (29 CFR 1910.132). Train on the basics of infection control and the science of cleaning, PPE, ergonomics protection for workers, hazards of disinfectant and other chemical products, disposal of cleaning chemicals, proper use and maintenance of chemical-dispensing equipment, and other products and equipment used in the cleaning process. 9.6 Quality and safe airflow is also a concern. Any recommendations for that issue? For a building environment, take steps to improve ventilation in the building: • Increase the percentage of outdoor air (for example, using economizer modes of HVAC operations) potentially to as high as 100% (first verify compatibility with HVAC-system capabilities for both temperature and humidity control as well as with outdoor/indoor-air–quality considerations). • Increase total airflow supply to occupied spaces, if possible • Disable demand-control–ventilation (DCV) controls that reduce air supply based on temperature or occupancy. • Consider using natural ventilation (opening windows if that is possible and safe to do) to increase outdoor-air dilution of indoor air when environmental conditions and building requirements allow. • In general, take advantage of the ability to spread the production team and personnel over a wider geographic area to allow more social distancing.

Ultraviolet Germicidal Irradiation cabinets are being used to clean headsets, mics, and other pieces of equipment.


9.7 What about air-filtration methods?
Increase air filtration to as high as possible (MERV 13 or 14) without significantly diminishing design airflow. The fraction of particles removed from air passing through a filter is termed "filter efficiency" and is provided by the Minimum Efficiency Reporting Value (MERV) under standard conditions. MERV ranges from 1 to 16; higher MERV means higher efficiency:
• MERV ≥13 filters (or ISO equivalent) are efficient at capturing airborne viruses.
• MERV 14 (or ISO equivalent) filters are preferred.
• High-efficiency particulate air (HEPA) filters are more efficient than MERV 16 filters.
Overall effectiveness of reducing particle concentrations depends on several factors:
• Filter efficiency
• Airflow rate through the filter
• Size of the particles
• Location of the filter in the HVAC system or room-air cleaner
Increased filter efficiency generally results in increased pressure drop through the filter. Ensure that HVAC systems can handle filter upgrades without negative impacts to pressure differentials and/or air-flow rates prior to changing filters. Generally, particles with an aerodynamic diameter around 0.3 μm are most penetrating; efficiency increases above and below this particle size.
Consider running the ventilation system even during unoccupied times to maximize dilution ventilation. Generate clean-to-less-clean air movement by re-evaluating the positioning of supply and exhaust-air diffusers and/or dampers and adjusting zone-supply and exhaust-flow rates to establish measurable pressure differentials. Have staff work in areas served by "clean" ventilation zones that do not include higher-risk areas, such as visitor reception or exercise facilities (if open). Also consider using portable HEPA fan/filtration systems, along with ultraviolet germicidal irradiation (UVGI) as a supplement, to help inactivate the virus.
Be sure to implement changes and confirm that building systems are operating as expected. If using air-treatment measures, use devices with third-party testing to ensure that no harmful byproducts are produced. To minimize ozone generation, for example, the air-cleaning device should be listed and labeled in accordance with UL 2998, and ultraviolet-generating devices in supply air or spaces shall not transmit 185-nm wavelengths; this wavelength produces ozone.
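To make the filter-efficiency discussion concrete, the rough comparison below estimates how much "filtered clean air" a single pass through different filter grades delivers for a given supply airflow. The efficiency values are ballpark single-pass figures for 1-3 micron particles, not ratings from any manufacturer or standard, and the function is a simplified illustration rather than an engineering calculation.

```python
# Illustrative single-pass comparison of filter grades for a given supply airflow.
# Efficiency values are rough ballpark figures, not manufacturer or ASHRAE data.
FILTER_EFFICIENCY = {"MERV 8": 0.35, "MERV 13": 0.85, "MERV 14": 0.90, "HEPA": 0.9997}

def filtered_clean_air_cfm(supply_cfm: float, filter_grade: str) -> float:
    """Approximate 'clean air' delivered per pass = airflow x single-pass efficiency."""
    return supply_cfm * FILTER_EFFICIENCY[filter_grade]

for grade in ("MERV 8", "MERV 13", "MERV 14", "HEPA"):
    print(grade, round(filtered_clean_air_cfm(2000, grade)))  # for a 2,000-cfm air handler
```

The same airflow through a MERV 13 filter delivers more than twice the filtered clean air of a MERV 8, which is the practical reason the guide pushes toward MERV 13/14 wherever the HVAC system can tolerate the added pressure drop.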





As new information comes to light in our ever-changing world, the SVG COVID-19 Sports Production Operations Guide will continue to be updated to reflect the latest industry guidelines and requirements for safety protocols, social distancing, and more.


STAY UP TO DATE AT www.sportsvideo.org/svg-covid-19-sports-production-operations-guide/ And for the latest resources on sports production during COVID-19, check out www.sportsvideo.org/resources/covid-19-resources/


A SPECIAL SUPPLEMENT TO

SPOTLIGHT FEATURING
• SoFi Stadium (p. 56)
• Allegiant Stadium (p. 60)
• Globe Life Field (p. 64)
• Systems Integrator Q&As


VENUE SPOTLIGHT > SoFi STADIUM

SoFi Stadium Progresses With Installation of Massive 4K, Dual-Sided Videoboard By Kristian Hernandez

Many aspects of sports have continued to be put on hold, but amid the glitz and glamour of Hollywood, SoFi Stadium has completed the installation of a videoboard that dwarfs the sign that bears the town's name. With a 4K, dual-sided structure that weighs 2.2 million pounds and holds 80 million pixels, Samsung and HKS have given one of the NFL's newest venues the largest LED display in sports.

SoFi Stadium features a one-of-a-kind dual-sided videoboard.


“The Samsung LED technology represented in the videoboard at SoFi Stadium is unlike anything fans have ever experienced,” said Jason Gannon, managing director, SoFi Stadium and Hollywood Park. “The design as well as the board’s video and audio capabilities are the first of their kind in sports and will set a new precedent for the in-stadium experience.”






With 70,000 sq. ft. of LED display and 80 million pixels, this dual-sided videoboard is now the largest in sports.

Inside the SoFi Stadium video control room


When fans have the opportunity to enter the new home of the Los Angeles Rams and Chargers, they will stare at 70,000 sq. ft. of LED real estate that hangs 122 ft. above the field of play. For a venue that expects a capacity of over 70,000 individuals, the team at SoFi decided to expand this behemoth to 120 yards long (1.2 times longer and 1.5 times wider than the actual turf that sits below) and to give it a maximum panel height of 40 ft. and a minimum panel height of 20 ft. From any seat in the house, fans will be able to decipher messaging and material seen on the screen with the help of 4K technology and the largest graphics control system in the history of sports. If the sheer scale of the display isn't enough, the centerhung is dual-sided to allow patrons in the lower half of the bowl to enjoy content with the same clarity.
"The monitor is designed to activate the entire seating bowl and is unprecedented in the world of sports and entertainment," said Mark A. Williams, FAIA, principal, HKS. "The display is shaped to extend the field of play allowing the 360-degree, two-sided board to provide every seat location and fan an unparalleled experience in content and viewing options. This design integration of technology and entertainment will elevate every visitor's experience while enjoying live events."
On the audio side, the massive structure packs a hefty punch. Integrated by WJHW, the stadium's JBL audio system has the wattage equivalent of 1,500 home theater systems. Of the 4,500 speakers in the building, more than 260 are embedded into the videoboard. In addition, more than seven miles of loudspeaker cable winds throughout its interior.
"SoFi Stadium features cutting-edge display technology and serves as another example of Samsung's commitment to changing the way fans interact with live sports," said Harry Patz, SVP and General Manager, Display Division, Samsung Electronics America. "We are proud to outfit the stadium with an iconic, first of its kind, double-sided 4K LED display and additional Samsung digital signage technology to ensure that each fan has a unique experience every time they visit Hollywood Park. SoFi Stadium will instantly become one of the NFL's most talked about stadiums, and Samsung is honored to be a part of its history from the start."
Barring any changes to the NFL schedule, the first preseason game played inside the walls of SoFi Stadium will pit the Rams vs. the New Orleans Saints on Friday, Aug. 14 at 7 p.m. PT. The Rams will also host the venue's first regular season game when the Dallas Cowboys come to town on Sunday, Sept. 13 at 5:20 p.m. PT. <
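As a back-of-envelope check on the published figures, roughly 80 million pixels spread across about 70,000 sq. ft. of LED implies a pixel pitch on the order of 9 mm. The short calculation below shows the arithmetic; the figures are rounded and purely illustrative, not Samsung specifications.

```python
# Back-of-envelope consistency check of the published SoFi videoboard figures:
# ~80 million pixels over roughly 70,000 sq. ft. of LED (both faces combined).
pixels = 80_000_000
area_sqft = 70_000

px_per_sqft = pixels / area_sqft        # ~1,143 pixels per square foot
px_per_foot = px_per_sqft ** 0.5        # ~33.8 pixels per linear foot
pitch_mm = 304.8 / px_per_foot          # ~9 mm implied pixel pitch

print(round(px_per_sqft), round(px_per_foot, 1), round(pitch_mm, 1))
```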




VENUE SPOTLIGHT > ALLEGIANT STADIUM

Allegiant Stadium Becomes Raiders’ Newly Minted Fortress on the Las Vegas Skyline By Kristian Hernandez

The Autumn Wind, a poem written in 1974, has become a mantra for the franchise, but those same stanzas could personify the road less traveled to polish off Allegiant Stadium amid the COVID-19 pandemic. Unfazed by the obstacles, the Las Vegas Raiders and their partners remained confident and unflappable in the face of adversity. Topped off with a full-IP control room and a black-glass exterior with the largest outdoor videoboard in the league, it's a perfect fit for the Silver and Black in Sin City.

With the largest outdoor videoboard in the NFL, Allegiant Stadium is another addition to the landscape in Las Vegas.


“Our exterior wall, which is right around 350 ft. wide by almost 80 ft. tall, is the largest outdoor board in the NFL,” said Justin Lange, manager of audio, video, broadcast, and Cisco Vision operations, Allegiant Stadium. “There’s no reason to install a baseband control room now since that’s not what’s going in any venue, so the decision to go IP is because it’s the future.” One of the NFL’s newest stadiums is nearing its completion, but before the edifice progressed to where it currently stands, the plan was put in place by a familiar company. Lange, who previously worked with the Minnesota Vikings, recruited Alpha Video to integrate this brand-new control room. “I’ve worked in just about every control room in the Twin Cities that Alpha has installed, so I’m very familiar with their work,” said Lange. “It was a really easy transition to come out here and know the integrator on the AV side.” With a chemistry that was founded in past projects, the process of erecting Allegiant Stadium was starting off on the right foot. Likewise to choosing Parsons Electric as a local partner in Minnesota, the team decided to go with The Morse Group, a local company in Las Vegas, as the licensed contractor. The staffers at The Morse Group have been heavily involved, but in a time where construction has continued in the face of the ongoing pandemic, Lange relied on the familiarity of Alpha Video. “There was some newness to [working with a new contractor], and they’ve turned out to be great, but having Alpha was a no-brainer,” he continued. “They have made things super slick and smooth.” As in-venue production moves down the pipeline, technologies are always changing and evolving. One in particular, IP, has garnered the interest of

SPORTSTECHJOURNAL / FALL 2020



in-venue professionals for quite some time. While others continue to plan how they will gradually move away from SDI and baseband models to workflows steeped in IP, Las Vegas' new sports venue will move forward with a full-IP backbone. In a space that comes with a slew of hardware and software, diving deep into the IP waters was the best option.
"We have an infrastructure with a ton of single-mode fiber everywhere, so there's a lot of flexibility that we have with video transport on an IP router," said Lange. "We'll be able to send three signals down a single fiber, so you can do three sends and three receives over a duplex pair."
Inside the newly devised control room, a mixture that consists of an Evertz EXE 2110 router and three EVS IPDirectors is at the center of the IP foundation. As for cameras, the IP network is able to control 12 Panasonic PTZ cameras for use throughout the stadium. Despite having a strong base to execute a videoboard show over IP, it's not always easy being the ones going full bore, but rather than settling on the conventional method of production, there's no turning back when the industry's driving a car without a rearview mirror.
"It's more difficult to conceptualize and deal with [IP], but ultimately, that's where everything is headed," said Lange. "If you're installing a new control room and you're going to spend $7 to $8 million, you might as well make it an IP solution because in five years, we're not going to have a single baseband install out there."
In addition to the IP-flavored technologies, the control room is filled with other deployments as well. Some notable highlights include a Ross Video Acuity production switcher; multiple EVS XT3 servers that include 12 1080p inputs, one super-slo-mo input, and three 4K inputs with Epsio Zoom control; Riedel Bolero wireless intercom and Artist 128; and 250 TB of Quantum's NAS storage.
While IP is the headliner of this build, the large-scale videoboards are no slouch. At an eight-millimeter pixel pitch, these LED displays will project content with precision and clarity. While the mesh board handles material outside of the stadium, a trio of structures will take care of duties on the inside. These three videoboards will be the driving force, with a playout that will be similar for fans sitting anywhere in the stadium.
"The south board is just a little over 250 ft. long and 48 ft. high," said Lange. "The north boards will both be a little over 120 ft. long and 48 ft. high. The 16x9 aspect ratio on both sides is exactly the same, so we will be able to match some of that content from the north boards to the south board."
Along with these gigantic displays, the venue is also outfitted with a 30-ft.-high and 10-ft.-wide marquee at the 15-yard-line and nearly two full rings of ribbon boards that are 112 pixels tall. To showcase the action on these videoboards, the Raiders will use the aforementioned Panasonic PTZs and seven Sony 4300 manned cameras (five hard and two handheld).
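Taking the south-board dimensions quoted above at face value and assuming the stated eight-millimeter pitch applies across the full face, a quick estimate of the board's native resolution looks like the sketch below; the numbers are approximate and for illustration only, not published specifications.

```python
# Quick estimate of the south board's native resolution from the figures above:
# roughly 250 ft. x 48 ft. at an 8 mm pixel pitch (approximate, illustrative only).
FT_TO_MM = 304.8
width_px = int(250 * FT_TO_MM / 8)    # ~9,525 columns
height_px = int(48 * FT_TO_MM / 8)    # ~1,828 rows

print(width_px, height_px, width_px * height_px)  # ~17.4 million pixels total
```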



The Raiders will be operating HDR videoboards over a full-IP network.
Constructing a $1.84 billion venue is a difficult yet rewarding task; completing two massive undertakings of this size in the last four years (U.S. Bank Stadium opened in 2016) is an entirely different feat. Luckily for Lange, his time in the Minneapolis–St. Paul metropolitan area gave him not only a strong relationship with Alpha Video but an invaluable blueprint for bringing this idea from paper to reality. For an endeavor of this magnitude and caliber, there was a fair share of logistical and technological hurdles along the way. When it came to solving an issue with equipment, an important lesson Lange carried from his past was a financial one. “We were able to be really strategic with our money, which is another great part about working with Alpha Video because they know I’m not trying to pull one over on them and they were 100% willing to work with me on this,” he said. “We used a little strategic magic there to make sure that we’re taking away money from things that we don’t need and allocating it to things that we do need. Ultimately, it’s just about being a decent person and explaining your reasoning in trying to get things done.” Much like the construction of the Black Hole’s new home itself, Lange navigated stormy seas to make it to Las Vegas, and in true Raider fashion, he overcame those hurdles to push this ambitious project over the finish line. There is still some work left to be done on the venue before hosting the organization’s first season in the desert. And despite the fact that preseason games have been canceled and fans will not be in the stands in 2020, there is still an opportunity for Lange and his team to grow and plant the seeds for a fruitful future at Allegiant Stadium. “We have a bunch of toys to play with, so this is an opportunity to really buckle down and make sure that everything gets finished up properly,” he concluded. “It’s extremely disappointing that we’re not going to be able to have fans in the building for Year One, but we’re going to use this time to get a good handle on how the system works and make improvements before anyone ever sees the inside of this building.” <


ADVERTORIAL

RECENT NEWS #026

RIEDEL’S MEDIORNET, ARTIST, AND BOLERO FORM ALL-NEW ROUTING AND COMMUNICATIONS BACKBONE AT NEW JERSEY’S PRUDENTIAL CENTER Prudential Center, the world-class sports and entertainment venue located in downtown Newark, New Jersey, has replaced its aging signal and communications infrastructure with an all-new signal routing and intercom backbone based on Riedel solutions. Riedel’s MediorNet real-time signal transport, processing, and routing technology is integrated with an Artist digital matrix intercom mainframe and Bolero wireless intercom to enable seamless and high-quality communications throughout the facility. Opened in 2007, Prudential Center has been the home arena for the NHL’s New Jersey Devils and the Seton Hall University Pirates men’s basketball team, and hosts more than 175 concerts, family shows, and special events each year. Known by fans as “The Rock,” the arena seats 16,500 for hockey

games and almost 19,000 for basketball. The new Riedel solution replaces an intercom system that was original to Prudential Center and a router that had reached its end of life. “When the time came to upgrade our routing and comms systems, we explored several vendors in both categories. Ultimately, Riedel was the perfect choice for our budget, the requirements of our facility, and the technical capabilities of MediorNet, Artist, and Bolero,” said Joe Kuchie, Senior Manager, Scoreboard and Live Production, Prudential Center. “In particular, we have been blown away by the outstanding voice quality and range of the Bolero system when compared to tests we ran on other solutions. Other advantages are the integrated features of MediorNet and the flexibility to grow the system just by adding more nodes where needed, rather than having to buy a completely new router.” Connecting Prudential Center’s control room to the arena floor, the new Artist- and Bolero-based intercom system powers communications for all in-arena productions. The MediorNet real-time network routes all video content and feeds from


the control room to the scoreboard and monitors throughout the building, and also transports feeds to replay operators for all sporting events. One especially valuable capability of the new Riedel intercom system is its point-to-point features, which allow individual crew members to speak directly with one another rather than having to share a crowded main channel during a game. With each Bolero beltpack acting as a wireless key panel integrated with Artist, users are able to place point-to-point calls whenever and wherever needed. “With Bolero’s fantastic range, our game-day workflows have improved significantly. Now we are able to talk to our floor managers and camera operators wherever they are, instead of waiting for them to return within range,” Kuchie added. “With our newfound communication range, we have new opportunities for filming in-game features from various locations in our building, capturing live footage from angles that we previously couldn’t use, and generating unique, fan-engaging content that translates into new revenue potential.”

www.riedel.net


VENUE SPOTLIGHT > GLOBE LIFE FIELD

Texas Rangers Open Doors to Globe Life Field With Full-IP Control Room By Kristian Hernandez

After the COVID-19 pandemic delayed the original start of the 2020 MLB season, baseball fans in the Lone Star State had to contain their excitement a little longer before watching their Texas Rangers take the field in their new home. After a nearly four-month wait, the league’s first full-IP control room was more than ready to go when the ballclub hosted the Colorado Rockies on Opening Day on Friday, July 24. “We are deploying an IP-based routing system with Lawo, and it will be one of the first of its kind,” said Chris DeRuyscher, Senior Director, Ballpark Entertainment and Production, Texas Rangers, speaking prior to Opening Day. “We’ve decided to jump in with both feet, so we’re not a hybrid.” In 2017, the organization began searching for possible partners for this new endeavor. Over time, the process narrowed to two final candidates to complete the job. “WJHW was the consultant on the project and they hired Diversified Systems to integrate the control room, which they’ve done a great job with,” said DeRuyscher. “WJHW put together a great plan and Diversified did a great job to execute it. We’ve been really thankful for the team that we’ve assembled on this thing.” Prior to the national sports shutdown, both venues under the Rangers’ jurisdiction were quite busy with activity. Globe Life Park, the franchise’s former place of residence from 1994 to 2019, was housing the team for normal operations while the new stadium was being outfitted. “We were given the best of both worlds where we were editing, producing, and doing everything we needed to do [in Globe Life Park] until [the contractors] said, ‘Okay, you guys are good to go to come
Built across the street from Globe Life Park, Globe Life Field hosted its first official MLB game on Friday, July 24.




CINEMA LENSES

OUTSTANDING IMAGE QUALITY FOR BOTH ON AND OFF THE FIELD. With high-end television production soaring in multiple program genres, and HDR and WCG further enhancing image quality on all fronts of 1080p and 4K, the need for elevated image performance and creative flexibility has never been higher. Ever cognizant of ceaseless creative aspirations in motion imaging, Canon has been advancing large format Cinema Lenses on multiple fronts. An expanding family of 4K full frame prime lenses, the longest 4K S35mm CINE-SERVO zoom lens in the world, and its renowned companions, the highly innovative 4K COMPACT-SERVO lenses – all are helping to enhance digital cinema cameras in every conceivable studio and on-location shooting situation.


CINE-SERVO LENSES: 17-120MM T2.95-3.9 | 25-250MM T2.95-3.95 | 50-1000MM T5.0-8.9
COMPACT-SERVO LENSES: 18-80MM T4.4 EF | 70-200MM T4.4 EF
PRIME LENSES: CN-E14MM T3.1 L F | CN-E20MM T1.5 L F | CN-E24MM T1.5 L F | CN-E35MM T1.5 L F | CN-E50MM T1.3 L F | CN-E85MM T1.3 L F | CN-E135MM T2.2 L F



WWW.USA.CANON.COM/CINEMALENSES

©2020 Canon U.S.A., Inc. All rights reserved. Canon is a registered trademark of Canon Inc. in the United States and may also be a registered trademark or trademark in other countries.


over here,’” he said. “We were truly [operating] between the new and old ballparks because we were going to be jumping back and forth for the foreseeable future.” Unlike at a majority of one-sport venues, the transition was a bit different because athletic events were still being produced in the older venue. Before the XFL dissolved, the team was still going full bore with home games of the then-Dallas Renegades. To manage it all, DeRuyscher split his department into two factions: one spearheading the one-mile move into the new digs and one that would hold down the fort. “We promoted one of my employees, Mandy Lawson, to director of event production,” he continued. “She was in charge of all non-Rangers baseball events, which included the XFL and now includes FC Dallas’ minor league soccer team, North Texas SC, as well as concerts.” Located near the third-base side underneath the upper deck, in-venue operations will be conducted within a fresh control room. Although the full-IP workflow is the headliner in this newly minted space, other notable elements are worth considering. The crew will be using an Acuity production switcher from Ross Video, multiple replay servers from EVS, cameras from Sony (HDC-3100L, HDC-3500L, FS5s, and an FS7) with Canon lenses (UJ90X9B, P01-DSS, and CJ24ex7.5BIRSES), Bolero from Riedel Communications, and storage servers from Quantum. On the software front, the team will leverage Daktronics’ Show Control and Live Clips Player, Ross XPression for computer graphics, and MAM services from CatDV. Following another trend, the organization will make the leap from HD to HDR and outfit its production equipment with full HDR capabilities. Below the massive retractable roof, which was installed to combat the high temperatures of Texan summers, Globe Life Field will feature LED videoboards in four separate locations. One of these four will look very familiar to those who attended games in the old confines. “We took the left field video board at Globe Life Park and, at the new ballpark, it’ll be our center field out-of-town scoreboard,” says DeRuyscher. “It’s 80 ft. high by 21 ft. wide and it is the only thing we’re taking over [to the new stadium] since we put that video board in in 2016, so the components are still good.” Due to the massive size of the stadium, some fans seated in right field and left field will have an obstructed view of the center field board. To solve this issue, a 39 ft. by 110 ft. left field videoboard and a 57 ft. by 157 ft. right field videoboard were erected to accommodate every seat in the house. “At Globe Life Park, we have a giant right field board, but that board is smaller than our left field board at Globe Life Field, and the video board in right field at Globe Life Field is gigantic with a


ton of LED,” he says. “We’ll also have a monstrous ribbon board that goes from foul pole to foul pole and runs along our main concourse.” This main-concourse ribbon board is two ft. by 916 ft., but an entirely new element, a first in professional baseball, will be seen behind the dish. “We’re going to be the first team in Major League Baseball to have digital home plate signage,” he says. “Instead of rotational signs, all of our signage behind home plate is going to be LED that we will control from the control room.” This structure will be two ft. by 108 ft. As for pixel dimensions, the pre-existing center field structure is 432 x 1,632, while among the new structures, left field is 2,208 x 792, right field is 3,000 x 1,152, the main concourse is 18,336 x 48, home plate is 5,888 x 128, and the right field ribbon is 4,224 x 72. Baseball will be played in 2020, but it will be unlike what we’re used to seeing. Instead of a packed house ready to give a hearty welcome to its team, this new stadium will be without the noise of the crowd. As an alternative, the franchise is allowing fans to be present as cardboard cutouts, named “DoppelRanger,” behind home plate and the dugouts of each team. The construction and integration of this brand-new control room, however, is no less impressive. When fans eventually have the opportunity to see this edifice in person, with its blend of uniqueness and high-quality technology, it’ll become the new cornerstone of the Texas Live entertainment district for years to come. “This whole area has been amazing from the get-go and it’s been successful without the ballpark, but with the ballpark, it’s going to go to another level since we’re right next to AT&T Stadium,” he concluded. “This is going to be a whole new world since we’re going from one way of thinking to the other and I’m really pumped about it. We feel like we’re going to have as good an entertainment production package as anybody in the world.” <
Ross Video’s Acuity production switcher is at the helm of an IP-based control room with a Lawo router at the core.



VENUE SPOTLIGHT > SYSTEMS INTEGRATOR Q&A
SVG SIT-DOWN: BECKTV WEATHERS CORONAVIRUS WITH SAFETY PROTOCOLS, NIMBLE STRATEGY
Since 1982, BeckTV has been responsible for conceptualizing, designing, and building detailed television facilities and control rooms throughout sports. During the coronavirus pandemic, current projects may be experiencing schedule delays. SVG sat down with three members of BeckTV: Matt Weiss, VP and senior engineer, Eastern Region; Brendan Cline, director of engineering and senior engineer; and Paul Nijak, director of operations and senior engineer, to learn how the company is handling these unforeseen circumstances.
How is BeckTV continuing to communicate with its clients? How is the team adopting new ways to get in contact with new clients?
Weiss: We are continuing weekly project calls with clients on Zoom and over the phone, and we’re also developing what we’re calling a COVID two-week look-ahead in our scheduling. We have this temporary look-ahead, where we say we’re going to be onsite the week of April 13, but we’re going to look ahead to the next week after that to see if it’s safe for our crew to be onsite in the city that we’re traveling to.
Cline: Once the announcement was made that NAB was canceled, there was a domino effect that happened on a daily basis where we were trying to adapt, but we couldn’t keep up with it. What we really had to do was tell people that we don’t know what we’re going to see tomorrow. It’s hard to project [what’s going to happen] in a week or two weeks when things are changing so quickly. It’s not just with clients. It’s also about how we communicate and distribute information internally so that we [all understand] how we’re supposed to be operating, how we’re supposed to be safe, and what the company’s plans are. After that, we can ask ourselves what our response will be to our clients.

Top: Matt Weiss; Middle: Paul Nijak; Bottom: Brendan Cline

What other guidelines are being put in place to ensure a safe working environment for all involved?
Weiss: We have certain staff that might live with an elderly relative, so we don’t want to put them out in the field. It might be safe and they’re following the CDC guidelines onsite and PPE [personal protective equipment] is being adhered to, but then we don’t want them to bring that home to their family members, so we’re telling them to not go onsite and stay at home. We’ve also sent technicians home with build kits of wiring, so they’re not even at the office. We are limiting in-person contact, so anybody that can work remotely and doesn’t need to be in the office is working from home.
Cline: We have about six people in our office in Austin, but we are still trying to build [elements] like racks and do commissioning and other things in the shop. It’s the safest and most controllable place that we know of, but we’re making sure that we don’t have four people on top of each other while doing it and we’re not staying overnight.
Has the coronavirus forced the company to think differently about how control rooms are constructed? What new technologies can potentially aid remote workflows moving forward?
Cline: It’s too early to know, but live television is not going to go away, and these control rooms are going to continue to be control rooms, with possible remote support for those that are able to do that or some sort of disaster production kit. I think that working from home is going to be something that changes production, but in terms of producing a show, you still need things like cameras, wires, and a switcher. Automation helps, and a lot of stations have automation where they want to have one director and one producer, and those are the only people allowed in the building, but I don’t think it changes the entire outlook of how a control room may look.
Weiss: [VP and senior engineer] Paul Kast in our New York office had a client that was locked out of a facility and was trying to build a show list to do a live remote show. They built it in automation, but the control panel that was on the console was the only element that could load the show. There was no button in the software that would allow them to start the show, so even the manufacturer spent hours trying to figure it out.
Cline: We’re driven by what the industry wants to do, so right now, we’re trying to maintain the jobs that we have. We have a lot of clients asking for some changes because they need to change their workflow, but we’re not going out and designing this without a client in mind that’s asking us to do it. It’ll be interesting to see the different requests for changes coming out of this. – KH <
To read this interview in its entirety, CLICK HERE



When the crowds return...

Amplify the experience

Copyright © 2020 Grass Valley Canada.


VENUE SPOTLIGHT > SYSTEMS INTEGRATOR Q&A
SVG SIT-DOWN: DIVERSIFIED MAINTAINS PROACTIVE APPROACH DURING TIME OF UNCERTAINTY
Founded in 1993, Diversified has emerged as a global partner with more than 50 offices serving a dynamic and diverse global clientele. In the time of the coronavirus, every aspect of the sports video production community is feeling the effects. SVG had the chance to speak with Duane Yoslov, SVP, sports & live events; Anthony Cuellar, SVP, global marketing; and Chris Sullivan, VP, business development, sports & live events, about how business is continuing in the wake of this sports hiatus, how safety protocols are being developed for all employees, and how the company is planning for a future when live events return.
How is the company continuing to communicate with clients? How is the team adopting new ways to attract new clients?
Cuellar: Our sports and live events team is one of our key specialties, but for many years, we have also been working with clients in corporate, education, government, and many other verticals to provide communication and collaboration solutions. We’ve used these same solutions to stay connected via virtual rooms in Zoom, [Microsoft] Teams, and [Cisco] WebEx. It’s been really interesting for us because as the markets changed, we’re in a position where our customers and clients have started to request solutions immediately. Initially, we set up a COVID-19 page on our website for business continuity, to inform and share solutions that we’re offering free of charge to customers to cope with this. That’s one way that we started to provide communication and collaboration tools to our clients. This has helped during the crisis and shaped how we’ve approached it across all of our business lines, internally as well as externally.
Sullivan: We’ve done a lot of [video conferencing] before, but now, it’s become the [common] way of doing business. Since everybody is starting at 8:00 in the morning, it seems like the calls are running right through 9:00 at night, so it hasn’t slowed down. We’ve stayed really busy, just in a different way.
Are your active projects continuing or have they been put on hold?
Yoslov: The travel restrictions have had a profound impact on our active projects. In total, there were more than 300 job sites that were affected globally. A small handful of those were sports or stadium projects, so we’ve regrouped, come up with contingency plans, and, for the most part, kept intact the teams that are servicing those projects. We’re reacting day-by-day as the environment changes.



From left: Duane Yoslov, Anthony Cuellar, and Chris Sullivan
Cuellar: There was a security project in California that would normally have been installed by our team in Virginia. They weren’t able to travel people, but it was a mission-critical job and something that had to be done. We were able to take the expertise of our local AV team in California and have them do the implementation and get the work done for our clients while not having to travel. Those are the types of projects where we have been able to work across the country since we have so many offices.
Sullivan: We have people onsite at the Texas Rangers’ [Globe Life Field] and [SoFi Stadium in] L.A., but there were some folks that decided, due to family reasons, not to go to work, and that’s fine. It’s their choice to decide whether they want to go onsite [or not]. We’re taking the necessary steps at these sites, so they’re doing temperature checks when you go in and wearing masks. All of those preventative measures are in place to make sure it has been as safe as possible.
Has the coronavirus forced the company to think differently about how control rooms are constructed? What new technologies can potentially aid remote workflows moving forward?
Yoslov: The construction and requirements of a control room are built around fan engagement, but when considering other technologies that will enable these production crews to continue to engage their fans through live production and production of digital or OTT content, there are a lot of different concepts and proposals. Our media workflow group is engaging with production teams about technologies that allow remote editing in the cloud. As for the physical control room, we’ve been talking about physical barriers or personal protection gear, but not in terms of reshaping the layout of the rooms. While we don’t know how long this will impact venues or production, there will be an end to this since vaccines and other medical therapies are in rapid development. Most of our efforts are focused on a more temporary solution than redesigning a room. Since real estate is always at a premium, there are a lot of concepts stirring around on how to minimize the footprint of the overall control room. – KH <
To read this interview in its entirety, CLICK HERE


SAME GAME NEW EXPERIENCE Whenever and however live sporting events come back, we know getting information and connecting with your fans and audiences will be more important than ever when they come to your venues. Let us help you design your facility communication plan.

EVERY MOMENT MATTERS www.daktronics.com/show


VENUE SPOTLIGHT > SYSTEMS INTEGRATOR Q&A
SVG SIT-DOWN: NEP INTEGRATED SOLUTIONS’ SCOTT NARDELLI ON CONTROL ROOM BUILDS, EVENTUAL RETURN TO LIVE SPORTS

Has the coronavirus forced the company to think differently about how control rooms are constructed? What new technologies can potentially aid remote workflows moving forward?

NEP Integrated Solutions is responsible for designing robust systems to meet the demanding requirements of broadcasters across the globe. In order to navigate the uncharted waters of the coronavirus pandemic, many in-venue teams are relying on at-home technologies to stay in tune with their facilities. SVG sat down with Scott Nardelli, SVP and GM, NEP Integrated Solutions, to break down how external and internal communication is maintained and how safety is kept the utmost priority, and to predict what needs to be done to create the control rooms of the future.

There’s been a lot of discussion about it, and I don’t think anybody really has the right answer yet. When you look at control rooms, there are a lot of people in there. Is it possible to just put up plexiglass and separate everybody? Sure. There may be some temporary fixes, but there are a lot of deeper issues that underlie the positioning of people. It’s not just sitting within six feet of each other with a sneeze guard in between. What impact does the ventilation system have? How does that air transport from one part of the room to another? How and when is [the air] filtered? There have to be some serious discussions, not only on the immediate and reactionary methods, but on the long-term prevention of spreading communicable diseases in tight spaces. From a technology standpoint, we can do what we need to do since it’s readily available. Everybody’s migrating to IP, and it’s certainly possible that coronavirus pushes [remote production] to the forefront. I think it will, because [it requires] less travel and lowers the overall risk. This might be a tipping point where people start to look at remote production a lot more seriously. Not just because of the pure efficiency model, but for the health and safety model and other benefits that it brings as well.

How is the company continuing to communicate with clients? How is the team adopting new ways to attract new clients?
We have our traditional broadcast clients and league and Fortune 500 clients, and we had been using video conferencing with them prior to the pandemic; whether it’s [Cisco] WebEx or [Microsoft] Teams and now Zoom, they’ve all been using them. I’ve been using BlueJeans, which is a web-based conferencing program, to communicate with not only my team but my customers for years, so that part of it isn’t all that new. The new part of it is the frequency of [these video calls], and I think that’s important because even in our normal day-to-day business, it’s about being able to see somebody face to face. It’s good to actually be able to see how people respond sometimes to some of the things you say. The video aspect, which is more prevalent today than it’s been in the past, keeps people engaged. To some extent, it makes you better prepared for work and helps set the stage when we are working from home and in a different environment.
Are your active projects continuing or have they been put on hold?
Some [project timelines] have been delayed, but others have accelerated. The design, build, engineering, and consultative part of the business is still moving along relatively quickly at a decent pace. We’re in good shape there and we have a fair amount of work. The integration side, which is the boots-on-the-ground installation, is fluid and very much day-to-day. With some of the cities and states starting to slowly open back up, we’re going to see some changes and be able to move [forward], but we have to remain agile. We’ve got to be able to pivot at any moment because the situation changes. For example, an installation team is going to go do a project in New York next week, then New York extends restrictions two more weeks, and we decide to go to Atlanta instead because Georgia is opening up.



How will the sports video production community adapt its strategies in the next 12 to 24 months?
Without the advent of a treatment and a vaccine, most of what we’re going to do in the next 12 to 18 months is provide more reactive solutions, with an eye to the future and to developing long-term creative solutions. It’s difficult to know what changes will be mandated and what will become best practices. Some may be temporary, and some may in fact be permanent. Ultimately, our strategy will be to work closely with our clients to help them navigate the challenges we are all facing due to the pandemic. We’re all focused on the near term, but I would say most people in the industry are also focused on rallying their troops, keeping people positive, and keeping their attitudes up. As an industry, when challenged [in the past], we would share insights and information, and then everyone would develop solutions to overcome [an obstacle]. I think that will be the same [case] here. We’re a resilient industry with a lot of smart individuals. – KH <
To read this interview in its entirety, CLICK HERE


DON’T MISS SVG’S UPCOMING VIRTUAL SERIES EVENTS! Sept 23-24 TRANSPORT VIRTUAL SERIES

Oct 7-8 IP PRODUCTION VIRTUAL SERIES

Oct 28-29 AT-HOME PRODUCTION VIRTUAL SERIES
Nov 17-18 ESPORTS PRODUCTION SUMMIT
Dec 14-15 SVG SUMMIT
Information coming soon! Dates subject to change. Learn more at www.sportsvideo.org/events/

AND CHECK OUT VIDEO ON DEMAND FROM EVERY SVG VIRTUAL SERIES EVENT!
SVG CHAIRMAN’S SERIES: TECHNOLOGY IN ACTION
ESPORTS PRODUCTION VIRTUAL SERIES
AT-HOME PRODUCTION SERIES
SVG COLLEGE SUMMIT: VIRTUAL CAMPUS
SPORTS CONTENT MANAGEMENT: VIRTUAL SERIES
SVG VENUE SUMMIT: VIRTUAL SERIES
Learn more at www.sportsvideo.org/vod-events/

Go to www.sportsvideo.org to learn more about these events, including registration details and sponsorship information.


WHITEPAPERS
Establishing a Safe, High-Quality Sports Production With Modern Technologies

By Andy Bellamy, AJA Video Systems, Product Marketing Manager

As sports resume amidst a global pandemic, athletes are returning to a markedly different environment, with stadiums implementing changes to protect the health of coaches, managers, players, staff, and fans. One such change is that crowded stadium stands will now sit empty, making live broadcasts and streaming paramount to reach audiences watching from home. Much like returning athletes, production professionals also face new challenges and changes as they return to work and adhere to guidelines that ensure safer working conditions for crew members. Productions are exploring how they can reduce the number of staff present at the



venue, in the OB truck, or in-studio; spacing equipment further apart on-set to practice social distancing; and facilitating remote workflows where possible, among other considerations. Emergent and established technologies are making these changes possible without compromising production quality. The first step to creating a more socially distant and safer working environment, however, is to begin to better understand the distinct types of technology available, including fiber, broadcast over IP, and streaming.

> ACHIEVING DISTANCE WITH FIBER
For the purposes of complying with new guidelines for distancing, fiber technology can be harnessed to simply extend existing SDI cabling. SDI video cable runs, including those which are 12G-SDI, have limitations to how far they can be run without degradation or signal dropout. But these shorter distances can be significantly extended by using fiber converters like the FiDO range from AJA. Fiber optical converters allow the transmission of digital information (in this case, digital video and audio) as light pulses through glass or plastic fibers almost as thin as a human hair, and can extend high-frame-rate 4K/UltraHD video and audio up to 10km. They are available in two types: single-channel fiber converters, which carry a single video and audio feed, and


dual-channel fiber converters, which carry more than one feed. Professionals can also opt to use single-mode or multi-mode fiber converter models. Multi-mode fiber cable is more flexible for cable runs and can be curved and taken around corners without loss of signal, but overall distance is curtailed compared to single-mode. As fiber converters are also generally format agnostic, various SDI-based signal types can be transported, even RAW camera formats. Fiber converters are easy to deploy and available as transceivers, transmitters, or receivers. Rather than cabling a camera directly back to a monitor or recorder via SDI cable, the camera’s video output can be plugged directly into a highly portable fiber transceiver or transmitter, like the AJA FiDO-TR-12G or the FiDO-T-12G. The optical fiber cable run can then be extended from the fiber converter to wherever the updated monitoring positions are on-set, without any loss in signal quality. When using fiber to achieve greater distance between gear on-set, professionals will also need to think about how they will terminate the fiber cables, with either ST or LC connectors, and fiber converter options exist for both. In recent years, fiber converters have been appearing in more OB environments, as they provide immense flexibility. ST connectors, in particular, offer a locking barrel connector that’s resistant to the rigors of OB environments. LC SFP connectors, on the other hand, are more typically seen in production and broadcast studios. Simply add the fiber transceiver or receiver converters that correspond to the production’s requirements at the other end of the fiber optical cable run to provide the desired output. With additional functionality like audio embedding/disembedding, fiber converters offer a cost-effective, hassle-free solution to creating more distance between equipment and crew on-set.

> EXPLORING VIDEO OVER BROADCAST IP ADVANTAGES
Video over broadcast IP (Internet Protocol), or the transmission and switching of video, audio, and metadata over standard network equipment such as a LAN, WAN, or the internet, offers a host of advantages to sports productions. It provides incredibly dense channel routing or switching capabilities, and the ability with some approaches to embed and disembed essences, whether video, audio, or metadata. Unlike streaming, it is also designed to handle the broader bandwidth of uncompressed video with as little latency as possible. In video over broadcast IP implementations, IP converters — like the IPT/R 10G range from AJA — are key, providing a bridge between current SDI or HDMI sources to a range of IP approaches and back as needed. Several standards can be utilized to implement IP converters, with broadcast production environments often favoring SMPTE ST 2110. The SMPTE ST 2110 standards suite is the grouping of specifications for transport, synchronization, and description of the individual elementary essence streams (video, audio, and ancillary data) over IP for real-time production, playout, or other professional applications. Typically, SMPTE ST 2110 systems afford more flexibility, as their Ethernet transmission base allows for embedding and disembedding of audio or video. NMOS (Networked Media Open Specification) control of all IP converters, for instance, can be

centralized to a remote laptop station to ensure the safety of the operating staff and allow the operator to dynamically route video and audio essences with ease. New sources can be added dynamically as needed, and just as with fiber, an existing network can be easily updated by using an appropriate IP converter at each end of the video signal chain, to provide safer distancing with high quality and low latency video. IP offers advantages to both live production for OB trucks as well as to permanent studio installations, with the ability to scale to almost any format and frame rate further down the road. IP does require much more complex installation and management than fiber.
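To put the bandwidth point above in perspective, the short sketch below estimates the raw video payload of an uncompressed HD and UHD signal and compares it with a typical compressed monitoring stream. It is a back-of-the-envelope illustration only, using simple active-picture arithmetic; a real SMPTE ST 2110-20 flow also carries RTP/UDP/IP packet overhead, so actual link budgets run slightly higher.

```python
# Rough uncompressed video payload estimate (active picture only),
# illustrating why broadcast IP fabrics are sized in multiples of 10/25/100 GbE.

def uncompressed_bps(width, height, fps, bits_per_pixel=20):
    """bits_per_pixel=20 corresponds to 10-bit 4:2:2 sampling (luma plus alternating chroma)."""
    return width * height * bits_per_pixel * fps

for label, (w, h, fps) in {
    "1080p59.94 (HD)":  (1920, 1080, 60000 / 1001),
    "2160p59.94 (UHD)": (3840, 2160, 60000 / 1001),
}.items():
    gbps = uncompressed_bps(w, h, fps) / 1e9
    print(f"{label}: ~{gbps:.2f} Gbit/s uncompressed")

# A compressed H.264/HEVC monitoring or contribution stream, by contrast,
# typically sits in the 5-50 Mbit/s range -- hundreds of times smaller.
print("Typical compressed stream: ~0.005-0.05 Gbit/s")
```

Run as-is, this prints roughly 2.5 Gbit/s for HD and 10 Gbit/s for UHD, which is why uncompressed IP transport is treated as a network-engineering exercise while streaming is not.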

> REMOTE MONITORING WITH STREAMING TECHNOLOGY
Traditional fiber and IP pose tremendous advantages in live sports production environments when used to extend distances between production stations that offer video and audio, with up to 10km of range being offered for fiber and around the globe for IP. However, for some productions, the technical requirements of remote workflows that allow production staff to monitor video content from afar extend far beyond the reach of even fiber. Unlike downloading content for on-demand viewing, a process by which the file must be transferred to a computer or device before playback can begin, streaming allows the media transmission and playback to happen simultaneously in near real-time. Streaming serves a range of monitoring needs in live sports production, whether for reviewing a live feed from the arena or stadium remotely, or edited materials that will be woven into the broadcast or OTT feed. It is easy to implement with plug-and-play streaming, recording, and encoding solutions like AJA HELO. To achieve a high-quality feed without compromising the video quality or frame rate, it’s important to look for a unit that can accept baseband video via an HDMI or SDI video input and convert the signal for live streaming to the appropriate destination. Having a suitable web browser to control the streaming device from anywhere in the world is yet another important consideration, as is the device’s ability to provide a high-quality, low-latency video signal. Streaming protocols can be used to compress baseband video for remote monitoring by any number of appropriate viewers, in most cases with the ability to record the stream locally by the viewer if required. To manage the stream, whether remotely via Ethernet and a web-based GUI served from a device like HELO, or by local push-buttons, simple configuration and control, including presets, can help simplify the process. Streaming technology also extends the distances for remote viewing to other cities and even other countries with ease, and security protocols can also be added to make remote viewing both secure and safe. A significant number of devices and technologies are available today that can help keep sports productions safe, on track, and engaging to viewers. By introducing fiber converters and using Ethernet networks alongside streaming, there’s no compromise to the quality or scope of the production at hand, and production teams can streamline operations while keeping a safe distance from one another on-set. <



WHITEPAPERS

Cloud-based Remote Editing for Sports Organizations By Karsten Schragmann, Arvato Systems, Product Manager

Traditionally, there have been two approaches to remote editing. The first is “proxy” or “low-res” editing, where editors use specialized editing clients that work with lower-resolution, and therefore lower-bandwidth, material. After the edit, projects are either sent to “craft” edit clients, such as Adobe Premiere Pro, linking back to the high-resolution material, or a new clip is created, usually by a server-side render engine based on the high-res material. This approach has many merits, especially when used for journalistic or highlight editing, which require only simple edits and/or voice-over. But it is less suitable for other workflows that are relevant to the sports market, as it offers fairly limited editing functions at lower quality, which limits the evaluation of sharpness or depth of field. Editing clients specialized in proxy editing also do not offer the look, feel, and functions of craft editing clients. A second approach, especially as high-bandwidth connections have become more widely available, has been to connect directly to the high-res storage with a craft edit client over a Virtual Private Network (VPN). However, even with high-speed broadband connections, this approach does not give users a local-like experience, since the protocols that have to be used, such as Server Message Block (SMB), Apple Filing Protocol (AFP), and Network File System (NFS), are not really designed to operate over high-ping networks. The result is high sensitivity to latency and packet loss, which dramatically degrades their performance. Even audio elements take valuable time for the client to analyze and generate waveforms from, often resulting in users disabling useful functionality just to make editing practical. An alternative approach is cloud-based editing. Several different approaches have been taken here, including proxy streaming and cloud editing with live encoding, but both have trade-offs that mitigate strongly against any advantages they confer.
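One way to see why latency, rather than raw bandwidth, tends to be the limiting factor for file-protocol editing over a VPN is to work through the arithmetic of a synchronous request/response cycle. The sketch below is a simplified, assumption-laden illustration (one outstanding read at a time, an assumed 1 MiB read size, no protocol overhead), not a model of any particular SMB or NFS implementation.

```python
# Effective throughput of a "chatty" file protocol that issues one read
# request per round trip: throughput <= read_size / RTT, no matter how
# fast the underlying link is.

READ_SIZE_BYTES = 1 * 1024 * 1024  # assume 1 MiB per synchronous read

def effective_mbit_per_s(rtt_ms: float) -> float:
    reads_per_second = 1000.0 / rtt_ms          # one read completes per round trip
    return reads_per_second * READ_SIZE_BYTES * 8 / 1e6

for rtt in (1, 20, 80, 150):  # LAN, metro, cross-country, intercontinental (ms)
    print(f"RTT {rtt:>3} ms -> at most ~{effective_mbit_per_s(rtt):,.0f} Mbit/s")

# On a 1 ms LAN the ceiling is ~8,400 Mbit/s, so storage or the NIC is the
# bottleneck; at 80 ms the same workflow tops out around 105 Mbit/s -- below
# the bitrate of many mezzanine codecs -- even over a 10 Gbit/s connection.
```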


The considerations that have to be dealt with are as follows:
Bandwidth: Since bandwidth is limited, production formats cannot be used directly for editing. While a proxy format is the obvious approach, the quality needs to be as close to the original file as possible.
Scalability: A cloud solution provides the possibility to scale streaming servers based on the number of connected clients. Moreover, a public cloud solution hosted on the major platforms like Microsoft Azure or Amazon Web Services allows users to create streaming instances close to the clients’ location, resulting in lower latency.
Training: Integrating a streaming solution into a commonly used editing application like Adobe Premiere Pro allows editors to work with well-known software and does not require them to adjust to new clients or workflows.
Security: Large cloud providers offer solutions to protect data and put a lot of effort into securing connections. Encryption algorithms, which are used in Transport Layer Security (TLS), formerly known as Secure Sockets Layer (SSL), prevent third parties from reading and modifying any information transferred. Encryption needs to be applied to secure sensitive information or content that is regulated by copyright.
Given these considerations, cloud editing based on a Media or Production Asset Management System (MAM/PAM) is an increasingly ideal fit for many sports remote editing use cases. It uses streaming servers and compressed video for playback in the editing client, but on top of that adds the resources of a MAM system to provide components such as pre-generated proxy video, metadata enrichment, and management of editing projects. One example of a remote editing workflow in the cloud built on top of a MAM is a hybrid craft-editing installation extended with remote editing, described below. The main site contains a “classic” setup of on-premise craft editing. Media assets are centrally stored on high-res storage. These files are accessed by local Adobe Premiere installations. The craft


editor imports the video assets to its bin, edits a sequence, and renders it via the local Adobe Media Encoder in order to create a new asset. All benefits (and limitations) of an on-premise solution remain. On top of this, there is a cloud solution extending the range of editing functionalities to remote locations. A project and media management solution is hosted in the cloud, which enables the local and remote editor to search and browse for centrally managed assets. The managed assets can be stored in the cloud or in the on-premise storage. The proxy, which has been created from the high-res source files, is also located in the cloud. The same applies to the streaming server. The streaming server accesses the proxy and streams it to the connected remote clients for editing, review, and so on. Whereas the architecture might vary depending on the system scaling, a solution hosting the high-res material and renderer in a cloud environment is also viable. One important factor to note is that the selection of the proxy format has a significant influence on the perceived editing experience, since resolution and compression affect the ability to determine and evaluate quality or sharpness. A proven proxy format standard that is useful here is SMPTE RDD 25. It is an AVC “Long GOP” proxy with AAC audio, originally conceived to standardize low-resolution proxies for use with low-res editors, which has seen extensions to improve resolution, bandwidth, and audio capacity. Based on the high-res source and using the Main instead of the Baseline profile, a proxy can be created that can still be encoded faster than real time (depending on the source file, up to 70 fps). The resulting H.264 stream, at 6-10 Mbit/s, 1920x1080, and with 8 stereo audio tracks, is close to the source video but meets the requirements of limited bandwidth. The proxy can be created in an MP4 container, which extends interoperability, or in a Material Exchange Format (MXF) container, which allows editing while the proxy is being generated. Ongoing codec developments mean that the H.264 component of this workflow can be further optimized. Furthermore, the quality is high enough that it can be used for the direct distribution of edited material to different platforms, especially social media. The streaming protocol used is another fundamental element of cloud editing. The subjectively perceived editing experience stands and falls with the performance and responsiveness of playback, which is mainly driven by the performance of the streaming and is a vital part of, for example, clip generation. Adobe Premiere allows for the integration of custom-made proprietary importer plugins which handle the playback of the content. A video may be divided into many files (chunks or segments), each containing only a few seconds of video at one extreme, or stored in a single unchunked file at the other. With larger chunks it might happen that a single frame is requested, but two complete chunks are transported and decoded, because the frame is within a Group of Pictures (GOP) that is split across two chunks. We have developed a proprietary protocol that utilizes a TCP connection. The implemented Premiere importer functionality has been adjusted to transport only the exact individual frames required, as they are requested by the client application. This allows fast scrubbing as well as fast-forward and playback. Producers — especially in sports and news use cases — frequently need to scrub through large amounts of video to find the elements they need for their project. When only the frames needed for decoding are downloaded, streaming latency is reduced, leading to a better user experience. A further improvement is to support asynchronous sending and receiving of packets. Asynchronous frame requests improve response times: the client can send multiple requests at the same time while receiving all returned data in parallel. The latency depends on the quality of the network and especially on the distance between streaming server and client. Therefore, the cloud region serving the stream needs to be as close to the client as possible. The large hyperscalers AWS and Microsoft Azure, with their distribution of data centers across the globe, provide scenarios where this requirement can be met. Equally, they enable the streaming server and storage to be located in the same availability zone, which is important because the cloud storage must be connected to the streaming server with sufficient random-access performance. Coupled with the TCP-based solution’s ability to transfer frames to the exact byte, this helps resolve the dichotomy between the need to utilize MXF’s growing-file editing capability and the restrictions on its use with object storage. There is no single change that will improve and accelerate cloud-based editing workflows to the extent that cloud-based editing becomes the norm in sports usage. What is needed is work in several areas, including server-storage connectivity, the use of TCP, reduction in the number of frames transferred, improved streaming protocols, better handling of audio files, smart local caching, and more. <
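To make the frame-exact, asynchronous request idea described above more concrete, here is a minimal sketch of a client that pipelines frame requests over a single TCP connection. It is purely illustrative: the wire format (a 4-byte frame index answered by a length-prefixed payload), the host name, and the port are invented for the example and do not describe Arvato's actual protocol or any Adobe Premiere plug-in API.

```python
# Illustrative asyncio client: request individual frames by index over TCP,
# keeping many requests in flight so a high-RTT link pays the round-trip
# latency roughly once rather than once per frame.
import asyncio
import struct

HOST, PORT = "proxy-stream.example.net", 9000   # hypothetical endpoint

async def fetch_frames(frame_indices):
    """Pipeline frame requests: send them all, then read the in-order replies."""
    frame_indices = list(frame_indices)
    reader, writer = await asyncio.open_connection(HOST, PORT)

    async def send_requests():
        for idx in frame_indices:
            # Request: 4-byte big-endian frame index (invented wire format).
            writer.write(struct.pack(">I", idx))
        await writer.drain()

    async def read_replies():
        frames = {}
        for idx in frame_indices:  # server is assumed to answer in request order
            # Reply: 4-byte payload length, then the encoded frame bytes.
            (length,) = struct.unpack(">I", await reader.readexactly(4))
            frames[idx] = await reader.readexactly(length)
        return frames

    _, frames = await asyncio.gather(send_requests(), read_replies())
    writer.close()
    await writer.wait_closed()
    return frames

# Example: grab every 10th frame of the first two seconds for a scrub preview.
# frames = asyncio.run(fetch_frames(range(0, 120, 10)))
```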



WHITEPAPERS

Remote Commissioning and Training For a New Broadcast Facility By the BeckTV Engineering Team, Integrating Media Solutions

It’s difficult to imagine a more collaborative or hands-on process than the technical commissioning of a new broadcast facility for live sports. In pre-pandemic days, commissioning new equipment and training personnel involved intensive, face-to-face interactions between manufacturers, operators, and engineers. Consider, for instance, a trainer and 15 facility personnel sitting together in a room for a session on a new switcher or replay system, where they could actively touch the equipment, see first-hand where the signals were going, and engage in a free-flowing question and answer session. Such a scenario is currently not possible in our new normal of social distancing, travel limitations, and the remote workflows that most media organizations have had to adopt, almost overnight. And yet, we’ve seen no slowdown in the development of new broadcast facilities and expansion/upgrading of existing capabilities. Effective commissioning and training processes have never been more important nor more in demand, which means engineering and integration firms need to take a different approach in providing comprehensive services. The good news is that today’s advanced teleconferencing and audio/visual technologies have enabled a new remote model for high-level commissioning and training that can yield benefits over both the short and long terms. This model has been field-



tested and refined to provide facilities and their personnel with a safe remote working experience that matches the quality and thoroughness of the conventional model. In this paper, we will describe the tools used to capture and transmit a variety of visuals that together account for critical elements of system usage and training — including essential components for creating a control-room-like environment for training as a remote service. We will also describe how this new model leverages various broadcast systems’ existing remote management capabilities more fully and efficiently.

> ELEMENTS OF A REMOTE COMMISSIONING AND TRAINING SOLUTION
By and large, most of the equipment necessary for adequate and effective remote training and commissioning already exists within a typical broadcast control room or media organization. For instance, most broadcast facilities utilize multiviewers, webcams, waveform monitors, scan converters, and more. In addition, nearly every modern enterprise is expanding its use of online video conferencing tools such as Zoom, Teams, and GoToMeeting. An effective remote training/commissioning setup requires a knowledgeable engineer working on-site to create the virtual environment. With slight modifications to the setup, depending on whether the task is commissioning or training, the engineer can install all of the tools required (essentially the same tools that would be needed for on-location training and commissioning). One important component for commissioning is a virtual QC station that uses a combination of webcams and multiviewers to enable third-party vendors to verify remotely that their equipment is functioning properly; for instance, whether a switcher is outputting video. Just as they once did onsite, vendors should be able to walk up to the virtual station and see what’s going on. This can be accomplished with the right mix of laptops and webcams installed onsite. The incorporation of a local QC waveform and vector scope can provide further data for troubleshooting. Again, the onsite engineer is key here to creating a local workflow by which output from all equipment can be presented as seamlessly as possible to remote personnel. One technique is a composite video of everything people need to see with their eyes that can be streamed to an open collaborative meeting platform. The engineer leverages webcams to capture any local monitor wall that is helpful to view, or to provide “over the shoulder” video of trainers or technicians as they operate a piece of equipment. All output from facility cameras and critical computers equipped with the necessary software can be combined as extracted video and input into the facility’s preexisting house multiviewers. The multiviewer output is then fed into the videoconferencing platform using a video-to-USB capture device on the hosting


workstation to facilitate a collaborative virtual meeting room environment. This robust presentation offers critical information for troubleshooting and an open forum for vendors and other personnel to discuss issues and ask questions; in other words, a virtual experience that facilitates (or closely approximates) the back-and-forth interaction that might take place in an onsite, in-person setting. The videoconferencing system, of course, is a vital link here — and the service needs to be able to support both spontaneous, continuous collaboration (think of being able to pop in and out of a meeting room with a couple of colleagues over the course of a day, in an ad hoc, unscheduled way as needs arise) and also the ability to break off into separate, smaller groups to discuss issues. Zoom, for instance, offers the ability to set up conference bridges and virtual meeting spaces for 24 hours at a time, and its “breakout room” feature enables the spontaneous gathering of smaller groups.
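As a rough illustration of the composite "virtual monitor wall" idea described above, the sketch below stitches a few local capture sources into a single grid frame that could then be screen-shared through any videoconferencing tool. It is a simplified stand-in rather than BeckTV's actual setup: the device indices, tile size, and two-by-two layout are assumptions made for the example.

```python
# Minimal "composite multiviewer" sketch: tile several capture devices into
# one frame suitable for screen-sharing in a videoconference.
import cv2
import numpy as np

SOURCES = [0, 1, 2, 3]          # assumed local capture device indices
TILE_W, TILE_H = 640, 360       # size of each tile in the 2x2 grid

captures = [cv2.VideoCapture(i) for i in SOURCES]

def grab_tile(cap):
    """Return a resized frame, or a black tile if the source has no frame."""
    ok, frame = cap.read()
    if not ok:
        return np.zeros((TILE_H, TILE_W, 3), dtype=np.uint8)
    return cv2.resize(frame, (TILE_W, TILE_H))

try:
    while True:
        tiles = [grab_tile(c) for c in captures]
        top = np.hstack(tiles[0:2])
        bottom = np.hstack(tiles[2:4])
        composite = np.vstack([top, bottom])
        cv2.imshow("virtual monitor wall", composite)
        if cv2.waitKey(1) & 0xFF == ord("q"):   # press 'q' to quit
            break
finally:
    for c in captures:
        c.release()
    cv2.destroyAllWindows()
```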

> THE REMOTE MODEL IN PRACTICE
One BeckTV client, a major global sports origination facility, launched just before the pandemic changed the world. This client’s build is technically advanced and complex, with an IP infrastructure based on SMPTE ST-2110 technologies — together with an assortment of new vendors that are still refining their own SMPTE ST-2110 and NMOS capabilities and approaches. Brendan Cline, BeckTV’s director of engineering, is overseeing training and commissioning for this project and has a unique perspective on the challenges and rewards of the remote model. “For this project, we had a double-edged challenge: commissioning a large volume of equipment based on bleeding-edge technology, plus finding a way to get it all done remotely,” he comments. “Advanced logistical planning was critical, as was asking all the right questions up front and making sure the right people were communicating and the right people were getting trained.” “As with other remote projects, this build requires an always-on and accessible virtual meeting room that provides all the tools and information needed to troubleshoot problems, as well as solid remote network connectivity to allow vendors to access all of the necessary equipment,” adds Paul Nijak, senior engineer. “Remote personnel need to be shown everything the commissioner or trainer is doing onsite, just as if they were physically sitting in the same room.”

> A PRODUCTION APPROACH YIELDS A VERSATILE AND VALUABLE RECORDING
The remote commissioning and training scenario described above is, in effect, a broadcast production — leveraging broadcast equipment ranging from cameras and multiviewers to the switchers and other equipment on which the team is being trained.

The environment, in effect, is a remote control room that enables everyone involved — from the vendors testing the equipment to the technicians and operators learning how it operates — to work together simultaneously and, in essence, “build a show.” As such, the new facility is left with an invaluable asset: a high-production-value record of the entire process that can be leveraged well into the future. Project Engineer Brock Raum comments, “After all, we’re talking about commissioning a broadcast facility and training people that work in video. Why not repurpose the gear we’re installing in such a way that serves both goals? The training session recordings are turning out to be even more valuable than the standard onsite training we’ve seen through the years. Since the recording is very information-rich, it’s a highly effective tool for training new operators down the road, or simply giving current operators a refresher on things they might have forgotten.”

> A MODEL FOR THE LONG HAUL
In the short term, a remote model for commissioning and training is the safest option. With minimal personnel actually onsite, facilities are able to maintain the necessary social distancing for their teams, without missing any key milestones for getting the facility up and producing. “In the longer term, we expect this model to be yet another silver lining of the pandemic — a proving ground for permanent remote commissioning workflows that will yield not only new efficiencies and cost savings but also a high-value record for future reference,” adds Senior Engineer Paul Kast. Whether commissioning and training happens onsite or remotely, there are several key tenets that will always apply when building out and launching a new broadcast facility. One of the most important is to select an integration partner that has a solid grounding in sports production. The right partner will spend the up-front time needed to truly understand the project objectives and the client’s individual requirements. From there, it’s critical to understand how best to approach commissioning and training in a way that does not waste vendors’ time and provides operators with the skills and tools they’ll need in the most effective and relevant manner possible. <



WHITEPAPERS

Operational Innovations in 4K UHD Broadcast Television

By Larry Thorpe, Canon, Senior Fellow, Professional Engineering and Solutions

> CHALLENGE OF FOCUS CONTROL FOR LIVE SPORTS COVERAGE
Especially on live broadcasts of major events where production directors choreograph dozens of camera feeds, the operational pressures can be considerable. The individual camera operator can, at any moment, be simultaneously actuating one or all of the four primary lens-camera operations: image framing (zoom demand), focus on the chosen subject (focus demand), operational panning and tilting, and focus tracking on moving subjects. Second-generation 4K UHD long-zoom telephoto lenses have introduced significant advances on both the operational side and the performance specifications: focal ranges were extended, 4K image sharpness was improved across the image plane, HDR/WCG functionality was added, and built-in image stabilization systems were improved [1].
4K UHD poses severe challenges to achieving razor-sharp focus in a small image format like 2/3-inch. The sensitivity of the focus control increases as the resolution of the television system increases — in terms of a rapid drop in sharpness with only a small movement of the focus control knob. The challenge is exacerbated in that 4K UHD broadcast cameras still utilize HD viewfinders — and these are very limited in size, whereas the final 4K image will typically be viewed on large 4K displays that clearly expose even the smallest shortfall in focus precision. The small 2/3-inch image format is often associated with a deep depth of field, and hence some degree of latitude in controlling the lens focus. However, this can be extremely variable depending upon the actual shooting environment and associated lens settings. Daytime or nighttime scene illumination, lens aperture setting, subject distance from the lens, and focal length setting all collectively contribute to very wide ranges of depth of field, especially in sports coverage, as typified in Table 1.


Table 1: Three long-lens shooting situations that show the variability of depth of field

For a 4K UHD lens-camera system based on the small 2/3-inch image format, the depth-of-field restrictions become more severe and the challenge of achieving sharp focus escalates. Given the special operational demands of 4K UHD 2/3-inch cameras — and with 8K UHD cameras already looming — it was believed that another level of both Zoom Demand and Focus Demand controllers would be an important addition to the existing family of broadcast controllers. These would offer heightened degrees of operational control over both zooming and focusing operations. In particular, given the growing use of large-format lens-camera systems within the sporting world, they would bring to the small 2/3-inch broadcast lens format operational capabilities that emulate the precision-focusing capabilities long established in larger-format cinema lenses.
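The depth-of-field variability summarized in Table 1 can be made concrete with the standard thin-lens hyperfocal approximation. The short sketch below is purely illustrative; the circle-of-confusion value and the shooting parameters are assumptions chosen for the example, not Canon's published figures.

```python
# Illustrative depth-of-field estimate for a 2/3-inch 4K camera using the
# standard hyperfocal-distance approximation. All values are assumptions
# for the sake of the example, not Canon specifications.
def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.005):
    """Return the (near, far) limits of acceptable focus, in metres."""
    s = subject_m * 1000.0                                # subject distance, mm
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm    # hyperfocal distance, mm
    near = s * (h - focal_mm) / (h + s - 2 * focal_mm)
    far = s * (h - focal_mm) / (h - s) if s < h else float("inf")
    return near / 1000.0, far / 1000.0

# Wide daytime shot at f/8 vs. a long-telephoto nighttime close-up at f/2.8,
# both with the subject 40 m from the lens.
print(depth_of_field(focal_mm=25, f_number=8.0, subject_m=40))    # metres of latitude
print(depth_of_field(focal_mm=800, f_number=2.8, subject_m=40))   # razor-thin band
```

Under the first set of assumptions the operator has tens of metres of focus latitude; under the second, only a few centimetres — the situation the new controllers are designed to address.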

> NEW FOCUS DEMAND – RESOLVING THE CHALLENGE OF FOCUS ACCURACY
A new Focus Demand with a built-in electronic display supports presetting a range of operational modes relating to lens focusing. The display portrays the optical focus range and, below this, the physical control range — two key programmable parameters that determine the chosen mode of operation. It takes 2.5 turns of the focus control knob to manually cover the total focus range. This, in itself, does apply a degree of vernier control of focus, which works well when the lens settings favor a reasonably deep depth of field. But now this can be considerably augmented.
Television outside broadcasts entail a wide range of shooting environments that place quite different operational requirements on the camera operator. The ability to separately preset the optical focus range and the control knob range is a significant new innovation that offers new imaging flexibilities. In the context of sports coverage, one extreme might be the need for the lens-camera system to be moved rapidly between two individuals separated by a considerable distance — requiring a significant degree of optical re-focusing to be rapidly implemented. Here, the large change in optical focus would best be achieved with a small turn of the focus control knob, which is not possible with standard controllers. An alternate situation might entail a precision facial close-up of an athlete who is a great distance from the lens.


In that telephoto setting, the ability to implement a minor optical focus adjustment with a wide (or multiple) rotation of the control knob facilitates a precision vernier adjustment that helps ensure a perfect capture of tension or emotion on that face. Two specific pre-settable focus modes (others are also available) are now described to illustrate the flexibilities offered by the new Focus Demand.

> SCENARIO ONE: FINE FOCUS MODE
The value of this mode is best illustrated by consideration of a 4K field lens-camera system located within a stadium and imaging a player or athlete a considerable distance away on the field. The director is interested in capturing a moment of concentration or emotion in a facial close-up. The lens is zoomed to achieve the desired framing of the chosen subject and the focus control is rotated to anchor the chosen subject distance. With the aid of the built-in display and the control buttons integral to the new Focus Demand, simple adjustments are now made to preset two limits around that central focusing setting — producing the desired small focus operational range. Now the range of motion of the control knob is expanded: the 2.5 turns that previously covered the entirety of the focus range of the lens are instead dedicated to that programmed restricted range of optical focus. This allows a beautifully smooth and slow zeroing in on razor-sharp focus of the chosen subject, as illustrated in capturing the concentration of an athlete awaiting the firing of the starting gun in Figure 1.

Figure 1: Principle behind the setting of Fine Focus Mode to empower razor-sharp focus on a facial close-up

By choosing the degree of the restricted range, the Focus Demand can achieve highly precise 4K image sharpness, which is especially valuable on facial close-ups. A variation on this scenario might allow the camera operator to implement precision rack focus between various athletes, adding a degree of creativity to the live production.
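A rough way to picture the Fine Focus Mode remapping is as a change in what the fixed 2.5 turns of knob travel are mapped onto. The sketch below is our own illustration of that idea, with an assumed linear mapping and invented distances; it is not Canon's firmware behavior.

```python
# Illustrative only: dedicating the full 2.5-turn knob travel to a preset
# focus window increases control resolution. The linear mapping and the
# example distances are assumptions, not Canon's implementation.
FULL_TRAVEL_TURNS = 2.5   # knob turns that cover the programmed focus range

def knob_to_focus(turns, near_m, far_m):
    """Map knob rotation (0..FULL_TRAVEL_TURNS) onto a focus distance in metres."""
    frac = min(max(turns / FULL_TRAVEL_TURNS, 0.0), 1.0)
    return near_m + frac * (far_m - near_m)

quarter_turn = 0.25
# Standard mode: 2.5 turns span the whole lens range (say 2.8 m to ~1000 m).
standard_step = knob_to_focus(quarter_turn, 2.8, 1000.0) - 2.8   # ~100 m of focus travel
# Fine Focus Mode: the same 2.5 turns span only a preset window around the subject.
fine_step = knob_to_focus(quarter_turn, 58.0, 62.0) - 58.0       # ~0.4 m of focus travel
print(standard_step, fine_step)
```

The same quarter turn that sweeps roughly 100 m of focus in standard mode moves focus by well under half a metre once the restricted range is active, which is what makes the slow, precise zeroing-in possible.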

> SCENARIO TWO: CONTROL RANGE LIMIT
In a sense, the Control Range Limit mode is the opposite of the Fine Focus Mode. This mode sets dead zones in the demand movement range and enables a greater optical focus range with a smaller rotation angle, with the focus adjustment range kept as-is. This is ideal for alternating focusing, such as at a tennis rally, where the active focus range has been preset to allow rapid refocusing on each player with a minimum rotation of the focus knob.

Focus Curves
The Focus Curve selector switch on the Focus Demand lets operators switch the focus position in relation to the focus knob position between one straight line and two basic curve modes, known as the Far Mode and the Near Mode. The Far Mode — associated with infinity — is the curve in which the focus position changes very slowly the more the knob is turned toward the infinity side. This makes fine focus adjustments easy on the infinity extremity of focus. The Near Mode is the opposite of Far Mode, in which the focus position changes very slowly the more the focus knob is turned toward the close side. This makes fine adjustments easy on the MOD extremity of focus.

New Zoom Demand
This new Zoom Demand controller has all of the operational functions of the standard Zoom Demand, including Frame Preset/Shuttle Shot/Speed Preset. However, the new Zoom Demand has additional important features: user settings can be registered and operational functions can be assigned to switches from the display screen. Preset speeds can also be set, and zoom control curves can be selected. Users can also check connection status and see whether various functions are on or off.

Zoom Control Curves
With the Zoom Demand, the zoom speed control can be programmed to have different characteristics in relation to the control thumb-ring rotation angle. There are preprogrammed control patterns within the controller that are termed the Zoom Curves. One Curve offers a faster zoom speed with a smaller thumb-ring rotation angle, making it ideal for high-speed zoom operation. A second Curve is the opposite — making it useful for operation at lower zoom speeds. A third Curve is midway between the former two.
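The zoom control curves can be thought of as different mappings from thumb-ring deflection to zoom speed. The shapes below are only a rough illustration of that idea; the exponents are arbitrary choices and not Canon's actual curve definitions.

```python
# Illustrative zoom-speed curves: speed as a function of thumb-ring deflection
# (0.0 = centered, 1.0 = full deflection). The exponents are invented to show
# the idea; they are not Canon's published curves.
def zoom_speed(deflection, max_speed=1.0, shape="linear"):
    exponents = {"fast": 0.5,    # more speed at small deflection (high-speed zooms)
                 "linear": 1.0,  # straight-line response
                 "slow": 2.0}    # gentler response for slow, precise zooms
    return max_speed * deflection ** exponents[shape]

for shape in ("fast", "linear", "slow"):
    print(shape, [round(zoom_speed(d, shape=shape), 2) for d in (0.25, 0.5, 1.0)])
```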

> SUMMARY

As discussed in a previous white paper [1], significant steps forward were made simultaneously on the operational specifications and on the optical performance specifications of second-generation long-zoom 4K UHD field lenses. Equally remarkable is the parallel development that was also ongoing in addressing the identified challenges in 4K UHD imaging; namely, the operational impediments to ensuring sharp focus in shooting environments that can be particularly difficult. The new Focus Demand and Zoom Demand controllers are innovative flanking accessories that further empower camera operators in live sports coverage. More details are available in reference [2]. <

References
[1] Canon White Paper: "Second generation 2/3-inch 4K UHD Long-Zoom Field Lens," http://downloads.canon.com/bctv/4K_Box_Lenses_White_Paper.pdf
[2] Canon White Paper: "Augmented Creative Control: Innovations in Focus Demand and Zoom Demand Controllers for Long-Zoom 4K UHD Field Lenses," http://downloads.canon.com/bctv/white_papers/Zoom_and_Focus_Controller_White_Paper.pdf



WHITEPAPERS

5 GHz Elevates Intercom Capabilities

By

Simon Browne, Clear-Com, VP of Product Management

Craig Fredrickson, Clear-Com, Senior Product Manager

> INTRO
With enhanced freedom and range, and more efficient use of the spectrum provided by digital solutions, shows are becoming more dynamic and more adventurous, with deployments in increasingly complex environments such as stadiums, crowded urban areas, and venues with architectural oddities like domed ceilings. With the number of artistic, technical, and logistics cues required for these action-packed endeavors, the use of digital wireless intercom has become even more essential. As the range of uses for untethered, full-duplex communications continues to expand through more departments and production roles every day, much has been learned about the capabilities and limitations of existing digital wireless intercom technologies operating in the 1.9 GHz and 2.4 GHz bands. These areas of the spectrum have become packed with DECT- and WiFi-based devices used for broadcast production communications, distribution, and capture — all while consumer devices continue to push for more bandwidth of their own. Even as 1.9 GHz and 2.4 GHz technologies solve a multitude of problems and fulfill mission-critical roles in increasingly complex environments, the saturation of available bandwidth, constraints of legacy transmission protocols, and issues with environmental multipathing and other interference challenges have led to exploration of the possible use of the 5 GHz spectrum.
The higher-frequency 5 GHz landscape opens up a multitude of possibilities for improvement. The increased radio bandwidth across its more than 25 channels expands data capacity, which allows for finer control, higher capacity, more robustness, flexible transmission protocols, lower latency, and improved audio quality. The benefits of 5 GHz were heavily researched and tested, and input was gathered from all ends of the production field, including architecturally challenging stadium and convention center environments, indoor and outdoor live event venues, video-wall-laden conference centers, and more. Field tests at the Canadian Broadcasting Corporation (CBC), the Montreal Bell Arena, and even Times Square on New Year's Eve are showing that 5 GHz is expanding the possibilities for digital wireless intercom technologies in particularly challenging radio environments.

> INTERFERENCE AND MULTIPATHING


In challenging environments like stadiums, crowded urban spaces, and in the presence of architectural oddities like domed ceilings, digital wireless intercom can suffer from interference caused by reflections. With 5 GHz, those challenges can become opportunities. Reflections, or more specifically the multipathing they create, can be harnessed in favor of better transmission. Through precise engineering of Orthogonal Frequency Division Multiplexing (OFDM) radio technology, the multipathing that easily propagates among 5 GHz wavelengths can be transformed into "constructive interference." As reflections help the signal propagate, the OFDM makes the transmission more robust, helping it to survive all the extra bouncing around and deliver a clear audio signal. OFDM is also used in WiFi, but whereas WiFi's priority is to maximize raw data throughput, the design priority for intercom is improved audio performance and the robustness of the radio link. The radio technology is application-specific, and therefore highly optimized for transmission of real-time audio, whereas WiFi's purpose is generic.
Multipathing can also cause limitations in propagation. So, in the field tests, engineers closely evaluated the transmission distances achieved by transceivers. In a standalone test in a large domed stadium, one transceiver covered the field and stands. In the empty stadium, with no body blocking of RF, the signal went halfway up the tunnels. In further testing during a game, four transceivers covered the stands and field for 40 beltpacks, while two transceivers were used for locker room and tunnel coverage.


> ROAMING / SCALABILITY
Compared with 2.4 GHz technologies, 5 GHz devices typically have a shorter range. But that fact allows for easier reuse of frequencies, which is ideal for high-density applications. In the relatively few cases where this might be an issue, intercom solutions that use the lower-frequency bands can be run simultaneously to form a single, unified communications system. Users have a lot of flexibility as they "engineer for range," relying on a device architecture that allows for enhanced roaming across multiple transceivers, combined with deployments that maximize capabilities across spectrums.

> FREQUENCY COORDINATION
Broadcast productions and live events have become more sophisticated, and hence more complex. This represents both technical and creative progress, and it often offers a more compelling experience for the audience, too. But it creates a challenge. A decade ago, only the personnel who were absolutely key to a production would have access to a wireless intercom. But wireless systems have proven to be so incredibly useful that everyone, understandably, wants to have one. The issue is that it is not just production teams that use wireless; there is a multitude of other users, from security to medical staff, all of which can lead to a fog of active radio channels.
Another 5 GHz benefit for large-scale communications is that it can be managed with frequency coordination for reduced interference. Unlike DECT technologies, the 5 GHz band allows users to allocate frequencies. This Static Frequency Allocation, which is coordinated with the same methodologies as in WiFi, enables technicians to dedicate channels for intercom, camera remotes, scoreboards, and other devices, improving coexistence. The remainder of the channels may then be used for WiFi. This guarantees the bandwidth required for intercom. Field trials are showing that even in situations where there are no free 5 GHz channels, systems can still share channels with WiFi without hurting its performance.
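As a purely illustrative sketch of what a static frequency plan might look like, the snippet below splits a set of standard 5 GHz channel numbers between production uses and house WiFi. The specific allocation is a hypothetical example, not a Clear-Com recommendation.

```python
# Hypothetical static frequency allocation for a venue. Channel numbers are
# standard 5 GHz designations; the split itself is an invented example.
available_channels = [36, 40, 44, 48, 149, 153, 157, 161]

frequency_plan = {
    "intercom":       [36, 40],   # wireless beltpack transceivers
    "camera_remotes": [44],
    "scoreboard":     [48],
}

reserved = {ch for group in frequency_plan.values() for ch in group}
frequency_plan["house_wifi"] = [ch for ch in available_channels if ch not in reserved]

for use, channels in frequency_plan.items():
    print(f"{use}: channels {channels}")
```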

> CONTROL
There are three additional controls and opportunities for fine-tuning made possible by 5 GHz technology: coordination, power, and directionality. Users can put their RF energy where they need it. Increased focusing capabilities allow digital wireless intercom users to reduce the likelihood that they will interfere with other users operating in the same spectrum.

They can also tailor their system further through the reuse of channels when clean channels are hard to find, or to maximize capacity in scenarios where maximum range is not needed. This means that at a major awards event, for example, where multiple 5 GHz intercoms might be used alongside various DECT-based solutions relied upon by countless outlets, it is possible to dial back the amount of radiated power and reduce interference on neighboring systems. Additionally, in cases where transceivers cannot be located near enough to the operators to use lower power, and instead high power is needed, the antennas can be swapped out for directional antennas.

> AUDIO QUALITY
The enhanced audio quality that comes along with 5 GHz is due to its broad channel bandwidth, which leaves more room for audio. The result is that an intercom solution in this band can provide up to 12 kHz audio bandwidth with a lower noise floor. This enhanced audio quality and lower latency are opening up new opportunities in live broadcast where, for example, mobile announcers can use pop-up voiceover booths and a wireless beltpack for clear, full speech-band audio commentary.

> CONCLUSION
The characteristics of 5 GHz can only be seen as a positive for the marketplace, providing an opportunity to capitalize on new levels of performance, audio quality, and customization. 5 GHz is an excellent and future-proof choice for production environment communications. The 5 GHz wireless spectrum is full of opportunity. It is a highly manageable resource, backed by mass market development and readily available deployment tools, that represents enormous potential for today and tomorrow. <



WHITEPAPERS

Bit-Rate Evaluation of Compressed HDR Using SLHDR1

By

Ciro Noronha, Ph.D., Cobalt Digital, EVP of Engineering

Kyle Wilken, Cobalt Digital, VP of Firmware

Ryan Wallenberg, Cobalt Digital, VP of Engineering

> INTRODUCTION
A number of current HDR standards include transmission of metadata along with the content. HDR encodes absolute luminance information, which may be outside the limits of what a particular monitor can display. Metadata helps the monitor map the incoming content to its capabilities. The SLHDR1 standard [1] operates this way. What is unique about it is the fact that the content is actually SDR, and the metadata allows a compatible device to map that SDR content to the original HDR, or to some intermediate level that it can support. Legacy devices with no metadata support can simply display SDR. This is similar to what happened when color TV was introduced — the transmission was a black-and-white image with color information on the side.
Video encoders and decoders are agnostic to HDR. The encoder takes the video samples and converts them to a bit stream; the decoder converts that bit stream back to video samples that are approximately the same as what entered the encoder. Neither device interprets the meaning of the video samples. Metadata, if present, may be carried along with the bit stream. In this article, we attempt to answer the question of whether or not there is a bit rate penalty when one uses SLHDR1 to transport HDR over a compressed video link. We do that by establishing a baseline with native HDR video, then switching to SLHDR1 and determining at what bit rate the quality is the same as the native HDR. This is similar to the work presented in [2].


Figure 1: Test Setup


> BIT RATE EVALUATION
Test Setup
The test setup is shown in Figure 1. As indicated, there are three test paths:
• Path 1 (in blue) is an end-to-end HDR10 path. A native SDI HDR10 signal is applied directly to the encoder, converted to either AVC or HEVC, and then decoded back to SDI.
• Path 2 (in purple) is an SLHDR1 path. The native SDI HDR10 signal is routed to an SLHDR1 encoder, which produces an SDI SDR signal with metadata, carried in the ancillary data space using SMPTE-2108 [3]. This signal is applied to the encoder and converted to either AVC or HEVC. The SLHDR1 metadata is extracted from the ancillary space and injected into the video bit stream as SEI messages. The decoder produces an SDI signal with the metadata restored to the ancillary space. Finally, an SLHDR1 decoder re-creates the SDI HDR10 signal.
• Path 3 (in red) is an SLHDR1 path that bypasses the encoder/decoder. It is used to obtain a baseline reading without compression.
In all cases, both the original and decoded signals are captured in YUV format by a video recorder. The files are transferred to a computer, where the quality metrics are calculated. For this evaluation, we selected the following metrics:
• Peak Signal-to-Noise Ratio (PSNR), which measures the absolute difference between each original and decoded frame in a sequence. It is well known that PSNR does not correlate well with perceived quality.
• PSNR_DE100: PSNR of the mean of the absolute deltaE2000 metric, referred to a 100-nit luminance.
• PSNR_L100: PSNR of the mean square error of the L component of the CIELab color space used for the deltaE2000 metric, referred to a 100-nit luminance.
The DE100 and L100 metrics were selected since they have been shown to correlate well with perceived quality [4]. The test procedure was as follows:
1. Take a baseline reading of the PSNR using Path 3. This only needs to be done once.
2. Select a target test video bit rate Br for the AVC/HEVC encoder.
3. Run the Path 1 signal and compute the selected quality metrics.


4. Run the Path 2 signal and compute the selected quality metrics.
5. Repeat steps 2-4 for other values of Br.
6. Perform the BD-rate computation for both PSNR_DE100 and PSNR_L100.
After the PSNR results were obtained, we performed the BD-rate computation [5] in order to evaluate the average increase or decrease in bit rate in the SLHDR1 case needed to match the quality metric in the HDR10 case. This process was done only for the metrics that correlate well with perceived quality, namely PSNR_DE100 and PSNR_L100.
The details of the test setup in Figure 1 are:
• SLHDR1 Encoder and Decoder: Cobalt 9904-UDX
• AVC/HEVC Encoder: Cobalt 9992-ENC
  - GOP size: 100 frames
  - Bit Depth: 10 bits
  - Chroma Mode: 4:2:0 (consumer-grade signals)
• AVC/HEVC Decoder: Cobalt 9992-DEC
Test Sequences
The tests were performed with three test sequences. All sequences had the following common parameters:
• Duration: 12 seconds
• Resolution: 1920×1080
• Color Space: BT 2020
The contents of each sequence were as follows:
• Sequence 1: "base jump" — extreme sports in mountains
• Sequence 2: "baseball" — baseball game at night
• Sequence 3: "zombie" — city scenes
Table 1 presents the quality metrics of the SLHDR1 process before encoding and decoding (Path 3 in Figure 1). The SLHDR1 process is not exact – there is a small impact on the metrics, as the image is not exactly reconstructed. Note that the YUV PSNR is provided as a reference; since these metrics are fundamentally different, the absolute values should not be compared between them.

Table 1: SLHDR1 Metrics with Encoder/Decoder Bypassed
Sequence     Path 3 YUV PSNR   Path 3 DE100   Path 3 L100
Sequence 1   59.26 dB          38.54 dB       51.80 dB
Sequence 2   58.72 dB          38.16 dB       51.53 dB
Sequence 3   59.07 dB          38.32 dB       51.67 dB

Evaluation Results
Figure 2 shows the raw PSNR_DE100 and PSNR_L100 results for the three sequences, using both HEVC and AVC encoding. The bit rate ranges used are different since HEVC encoding is more efficient than AVC encoding, so a higher range is used for AVC. Generally, Figure 2 shows that there is almost always some PSNR improvement at the same bit rate when SLHDR1 is used, with very few exceptions. In other words, even though the SLHDR1 process is not perfect (the HDR image is not exactly reproduced), when one uses quality metrics that correlate well with perceived quality, the resulting image after compression/decompression actually looks better than compressing and decompressing the HDR image directly. This matches the conclusions presented in [2].

Figure 2: Raw DE100 and L100 Results for the Test Sequences

The remaining question pertains to quantifying the bit-rate advantage. If one is seeking to achieve a certain target quality for an HDR link and has the option of either transmitting HDR10 natively or SLHDR1, which method will yield the lowest bit rate and by how much? The standard way of answering this question is by the use of BD-rate [5], which produces an "average" value over the tested range. The relevant numbers are provided in Table 2 below.

Table 2: BD-Rate Values
Sequence     HEVC DE100   HEVC L100   AVC DE100   AVC L100
Sequence 1   0.90%        -6.92%      -6.07%      -3.65%
Sequence 2   5.61%        -25.90%     -5.84%      -2.42%
Sequence 3   7.27%        -20.25%     -2.58%      -13.07%

Table 2 indicates that, for most combinations, a lower bit rate is required for SLHDR1 transport as compared with a straight HDR10 link, sometimes significantly so. This confirms the conclusions presented in [2], using a different commercial encoder. One final item for discussion pertains to the SLHDR1 metadata bit rate. When transporting HDR10, there is no metadata requirement, but SLHDR1 adds per-frame metadata that is included in SEI messages inside the video elementary stream. The maximum amount of metadata per frame is 61 bytes. Therefore, an upper bound on the SLHDR1 metadata bit rate is 24.4 kb/s for a 50fps signal, and 29.3 kb/s for a 60fps signal. This bit rate increase is negligible (less than one audio channel), and, in many encoders, is absorbed by a slight adjustment in the NULL packet rate, so the overall bit rate is unchanged.
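For readers who want to reproduce the BD-rate averaging step behind Table 2, the Bjontegaard computation [5] can be sketched as follows. The rate/quality pairs are hypothetical, and this is our own minimal illustration rather than the exact tool used for these tests.

```python
import numpy as np

def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    """Average % bit-rate difference of the test curve vs. the reference curve
    at equal quality (negative = test needs less bit rate), per Bjontegaard."""
    log_ref, log_test = np.log10(rates_ref), np.log10(rates_test)
    # Fit cubic polynomials of log-rate as a function of the quality metric.
    p_ref = np.polyfit(psnr_ref, log_ref, 3)
    p_test = np.polyfit(psnr_test, log_test, 3)
    # Integrate both fits over the overlapping quality interval.
    lo = max(min(psnr_ref), min(psnr_test))
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref = np.polyval(np.polyint(p_ref), hi) - np.polyval(np.polyint(p_ref), lo)
    int_test = np.polyval(np.polyint(p_test), hi) - np.polyval(np.polyint(p_test), lo)
    avg_log_diff = (int_test - int_ref) / (hi - lo)
    return (10 ** avg_log_diff - 1) * 100

# Hypothetical PSNR_DE100 readings at four bit rates (Mb/s) for HDR10 vs. SLHDR1.
print(bd_rate([4, 8, 12, 16], [34.2, 36.8, 38.1, 39.0],
              [4, 8, 12, 16], [34.9, 37.3, 38.5, 39.3]))
```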

> CONCLUSIONS
The impact of SL-HDR1 in compressed bit streams is a function of the content, and can be quite significant. When using quality metrics that are better correlated with human perception, it is often possible to actually decrease the link bit rate while keeping the same quality, as originally reported in [2]. <

References
1. European Telecommunications Standards Institute, "High-Performance Single Layer High Dynamic Range (HDR) System for use in Consumer Electronics devices; Part 1: Directly Standard Dynamic Range (SDR) Compatible HDR System (SLHDR1)," ETSI TS 103 433-1 V1.2.1, 2017
2. Touze, D., and Kerkhof, L., "Single-Layer HDR Video Coding with SDR Backward Compatibility," 2017 SCTE-ISBE CABLE-TEC EXPO
3. Society of Motion Picture and Television Engineers, "HDR/WCG Metadata Packing and Signaling in the Vertical Ancillary Data Space," SMPTE ST 2108-1, 2018
4. Hanhart, P., Řeřábek, M., and Ebrahimi, T., "Towards high dynamic range extensions of HEVC: subjective evaluation of potential coding technologies," Proc. SPIE 9599, Applications of Digital Image Processing XXXVIII, 95990G
5. Bjontegaard, G., "Improvements of the BD-PSNR model," ITU-T SG16/Q6 VCEG 35th meeting, Berlin, Germany, 16–18 July 2008, Doc. VCEG-AI11



WHITEPAPERS

Remote Content Creation in the Age of COVID

By Dave Simon, Diversified, Director, Technical Operations

The COVID-19 pandemic has dramatically disrupted business across all industries, with more than a third of the workforce in the United States working remotely almost overnight. Companies and their employees have had to rapidly adjust their operational models from collaborative office work to remote business via a suite of communications platforms like Zoom, Slack, and Teams. While organizations continue to implement safe social distancing measures in-house, we are seeing efforts to move job functions that were traditionally in-house to remote. As professional sports teams return and adapt to playing in empty stadiums, their respective in-house production and editorial teams are adapting as well. Content creators — many of whom work on high-performance workstations with direct access to shared storage — are now having to retool their operations and formulate new work-from-home scenarios.
Some organizations have been equipped to handle this shift while others have had to improvise with cumbersome 'sneaker-net' workflows, shuttling media on portable hard drives or transferring files through corporate VPN. Others have adopted screensharing platforms like Teamviewer and VNC to access on-premises systems from secondary computers. These methods, while functional, have resulted in fractured workflows with core resources such as shared editorial storage, tape archives, and high-end editorial systems under-utilized or sitting idle. As the pandemic continues, the question becomes how to create content in a traditionally collaborative environment while maintaining the health and safety of individuals without sacrificing the final deliverable.
Editors, graphics and visual effects artists, and content loggers are among those whose roles can be shifted offsite today through various forms of remote access. Display extender technologies like PCoIP-based KVM, virtualized desktops, and Desktop as a Service all offer viable solutions for decentralizing traditionally onsite operations. Each of these technologies relies on underlying display transport protocols, such as Teradici's PCoIP, HP's Remote Graphics Software (RGS), and VMWare's Blast Extreme, among others. Developed to deliver multiple high-resolution display outputs to end points over ethernet, what makes protocols like PCoIP and RGS unique is the way screen images are refreshed and delivered to the end user.

Instead of sending continuous full-frame video refreshes that can add encoding overhead and latency, they work by updating only the pixels that change from moment to moment. Bandwidth and connection requirements will scale anywhere from 15 to 100+ Mbps per screen, depending on available bandwidth. Bi-directional audio, USB, and serial data are carried alongside display data as a fully encrypted data stream to deliver a very viable low-latency, high-resolution, near-real-time user experience.
Virtualizing the creative workstation is another option for today's content creators through the use of virtual desktop infrastructure (VDI) platforms like VMWare Horizon. VDI has been a standard practice for task-based job functions for decades; you could even consider mainframes from the 1960s and 70s as a primitive form of VDI. Only in recent years has the technology become a viable and acceptable solution for creatives. Unlike the traditional approach of managing physical workstations, VDI enables organizations to provide high-end virtual workstations complete with 3D graphics acceleration, audio support, and multiple displays with access to on-premises resources like shared storage and archive, without the expense and overhead of supporting individual workstations. A VDI infrastructure is based around a hypervisor that provisions and manages the virtual machines, hosted on dense hardware servers or blade servers clustered to provide a pool of available resources.


An entire post department's worth of physical workstations could be collapsed down to 6-10 rack units of hardware in a data center. VDI is attractive to IT organizations as it simplifies deployment, offers greater control over security, and makes scaling beyond a fixed number of workstations easier to manage and plan for. Organizations can easily deploy VDI into an offsite data center that offers greater levels of redundancy and protection against facility outages like power and network maintenance. One of the greatest limitations of VDI is the lack of virtualized MacOS support, a favorite of creatives.
A third approach is cloud-based Desktop as a Service (DaaS) from companies like Bebop and Avid's own Avid On-Demand. Much like a roll-your-own VDI, these companies offer turnkey virtual workstations provisioned for content creation. Avid On-Demand is built around the Avid ecosystem with virtualized Media Composer workstations connected to Nexis storage workspaces hosted on the Microsoft Azure cloud. Bebop Technology offers a non-Avid option, providing virtual workstations preloaded with Adobe Creative Cloud, Cinema 4D, Autodesk products, etc. — all you provide is the software licenses and media. In the case of Bebop, teams can have their workstations provisioned in their own cloud environment. Teams interested in moving to a DaaS model do have to consider the expense of moving to a managed service where usage costs can rapidly accumulate, as well as the logistics of shuttling media onto and off of the platform. Automated transfer workflows do need to be considered when planning for a move to DaaS; fortunately, the service providers include the tools to make this relatively easy.
To access resources at home, all three solutions are based on the same core principles and protocols. In all three scenarios, users can access remote resources on both hardware and software-based clients. Software clients follow the same principle as standard screensharing technologies and require a separate computer to run the client receiver application. For organizations interested in hardware solutions, consider deploying thin clients or zero clients from one of a handful of equipment manufacturers including 10ZiG, Amulet Hotkey, and Dell. Today's thin clients and zero clients can support multiple protocols, high-resolution displays, USB devices including custom NLE keyboards, Wacom tablets, webcams, and bi-directional audio for monitoring and video conferencing. One manufacturer, Amulet Hotkey, has taken it a step further and packaged a PCoIP co-processor card into an external KVM transmitter form factor, making it possible to extend workplace-bound Mac and PC workstations, video servers, scopes, multiviewers, or any device with a DisplayPort/USB-C output to the home.

For teams with workflows and media repositories centered around on-premises architectures, the Amulet Hotkey PCoIP KVM solution may be ideal for extending beyond the bounds of the facility.
Technology aside, there are several other factors to consider when planning a shift to offsite operations. For starters, robust home internet access is a baseline requirement. Without at least 30 Mbps of downlink throughput, these solutions won't offer much in terms of a robust experience. We also have to consider whether the job function is even possible to move offsite. For instance, not every seat in the control room or production suite can be filled from home. However, with the push to decentralize operations, remote postproduction is very real and feasible today. As this is quickly becoming the new normal, organizations must rely more heavily on their employees to be able to perform their jobs beyond the controlled environment of a corporate network and out of reach of the helpdesk. Employers and employees must adjust expectations and accept that the experience is a bit different; fortunately, there are solutions to help minimize that difference. <



WHITEPAPERS

Optimizing Sports Archives for New Times, New Challenges

By

John Reuter, EcoDigital, Solutions Architect

Dusty Alves, EcoDigital, Director of Client Relations

Like many other businesses around the world, sports broadcasters and networks have had to reconfigure their operations to work safely, and often remotely, during the COVID-19 pandemic. Even more challenging has been the utter lack of live sports over the past few months. Without fresh content, broadcasters and networks have had to dig into their archives like never before to fill airtime and keep sports fans tuned in.
Digitization of massive archives already represents an immense and ongoing undertaking for many sports broadcasters and networks. In some cases, they deal with tens of thousands of hours of content and double-digit petabytes of storage. Inevitably, these storage requirements grow over time as more content is captured, often in increasingly data-intensive formats and resolutions. The pandemic has brought added urgency to these archiving projects, emphasizing the value of maintaining a cohesive archive strategy with robust, unified search across storage tiers and simple, fast access to assets across the archive.
Live sports have begun to return in a limited way, but the current crisis has underscored the importance of being able to access archived media, find moments of interest, and leverage that content to keep fans engaged. At the same time, to satisfy the requirements of day-to-day production as live sports come back, broadcasters and networks need not only searchable video archives, but also the ability to quickly restore short clips — a walk-off home run, a rush for a touchdown, etc. — for new programs and productions.

> THE SOLUTION: SOPHISTICATED CONTENT STORAGE MANAGEMENT
Sports broadcasters and networks are using content storage management (CSM) technology to optimize costs and simplify management of media archives across a scalable, consolidated storage infrastructure.


The CSM integrates directly with playout automation, media asset management (MAM), newsroom, and editing systems to bring users the content they need. Key characteristics such as scalability, comprehensive search capability, robust redundancy, remote access, seamless migration to new technologies, and — most important of all — partial file restore are critical to maximizing the value of archived content.
Scalability
When built on standard off-the-shelf IT components, the CSM can provide a distributed architecture — on-premises, in the cloud, or a combination of the two — that scales along with the user's capacity and throughput requirements. To scale up to handle a large amount of information, as in a high-priority digitization project, the CSM user can simply add compute resources in the cloud, get content processed and stored, and then dial back resources to normal operations. The on-premises environment can be likewise scaled, albeit less quickly, using a virtual environment to spin up additional instances and optimize resource use. In this case, any storage resource — robotic tape library, tape drive, disk volume, data mover, or transcoding engine — can be added or removed on the fly.
Advanced Search
Storing media assets as objects with rich metadata and using policies to automatically manage storage, the CSM ensures assets are stored on the appropriate storage tier (flash, disk, tape, cloud) for archiving, with fast, easy retrieval through advanced search functions. To accelerate searches for specific content, the CSM separates the metadata from the content itself. The content goes into deep storage, but the metadata is readily available, allowing users to get results in milliseconds. When users find what they need, they simply issue the restore command. Although content might be retrieved from one of several storage tiers, the end user sees only a single video archive.
Partial Restore
A CSM with partial restore capability gives users valuable efficiency in restoring content from the archive. Whether from tape or deep storage in the cloud, the ability to retrieve only the required sequence of frames, defined by timecode in and out points, saves sports broadcasters and networks significant time and money. Partial file restore allows users to move only the slice of content they need rather than a complete file. Retrieving content to the destination device takes just a small fraction of the time required to restore a full-length game or show — five minutes rather than three hours. With faster access to content, users also can process much more information more quickly. For content stored in the cloud, where costs are pinned to retrieval, the CSM's partial file restore capability translates to both time and cost savings. The savings add up quickly when the archive is the main source of content, as has been the case during the pandemic, with old clips forming the basis of new shows.
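As a rough illustration of why partial restore pays off, the in and out timecodes resolve to a small frame range, and only that slice is pulled from deep storage. The helper below is hypothetical (function names, frame rate, and bit rate are invented for the example) and is not EcoDigital's actual API.

```python
# Hypothetical illustration of partial file restore: resolve timecode in/out
# points to a frame range and compare the slice size against a full restore.
def timecode_to_frames(tc, fps=30):
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def partial_restore_size_mb(tc_in, tc_out, fps=30, bitrate_mbps=100):
    frames = timecode_to_frames(tc_out, fps) - timecode_to_frames(tc_in, fps)
    seconds = frames / fps
    return seconds * bitrate_mbps / 8          # megabytes to move for the slice

# A 45-second walk-off home run out of a three-hour game archived at 100 Mb/s:
clip_mb = partial_restore_size_mb("02:10:15:00", "02:11:00:00")
full_game_mb = 3 * 3600 * 100 / 8
print(f"partial restore: {clip_mb:.0f} MB vs. full restore: {full_game_mb:.0f} MB")
```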


Analytics
Taking advantage of analytics, the CSM can help users better understand how they use their archives and make faster and better decisions about migrating content. Perhaps they have a good idea of how much they need to store, or how much they generate on a weekly or monthly basis, but don't have accurate predictions around how much content they will be restoring from archives. By logging and analyzing historical data around file restoration, the CSM can assess past behavior and provide numbers specifying the archive total, the average number of restores per day, the average size of restores, and so on. This helps users determine if initial sizing of the system is on target, or if system scaling is in order. If, for example, the user is running at 80% but foresees a peak in volume, such as with a new launch or the addition of cameras to production, this information can enable proactive expansion of storage and processing resources. This information can also be useful in preventative maintenance. The CSM can alert the user to conditions such as excessive reads or writes for a drive or tape, noisy network connections, and other factors that can threaten performance or reliability.
Redundancy
The CSM can provide varying degrees of redundancy, all in an automated fashion, across multiple locations and storage tiers, both on the ground and in the cloud. Keeping multiple systems interconnected, the system can provide a federated view of all content, everywhere. If content at one location is lost, the CSM can automatically restore it to another as a background task. For end users working on day-to-day production, the process is invisible. A rules engine determines what content gets moved where, on what schedule, at what priority level, and how many copies should be made. If there is a failure on one side of a transfer, or a temporary network outage, updates are resent and resumed as soon as the connection recovers, realigning storage with preconfigured policies.
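Such policies can be pictured as a small set of declarative rules evaluated in order. The structure below is a hypothetical illustration of the idea; the field names and values are invented and do not represent EcoDigital's configuration format.

```python
# Hypothetical tiering/replication rules to illustrate the rules-engine idea.
archive_policies = [
    {"match": {"sport": "mlb", "age_days_over": 30},
     "move_to": "lto_tape", "copies": 2, "schedule": "nightly", "priority": "low"},
    {"match": {"tag": "highlight"},
     "move_to": "cloud_nearline", "copies": 1, "schedule": "immediate", "priority": "high"},
    {"match": {},  # default rule
     "move_to": "disk_cache", "copies": 1, "schedule": "hourly", "priority": "normal"},
]

def first_matching_policy(asset):
    """Return the first policy whose match conditions all hold for the asset."""
    for policy in archive_policies:
        ok = all(
            asset.get("age_days", 0) > value if field == "age_days_over"
            else asset.get(field) == value
            for field, value in policy["match"].items()
        )
        if ok:
            return policy
    return None

print(first_matching_policy({"sport": "mlb", "age_days": 90})["move_to"])  # lto_tape
```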

This very robust approach to redundancy primes the CSM for applications such as disaster recovery and true business continuity, where two full facilities replicate each other. It also supports more efficient remote access to content, a must in the current environment. Many broadcast networks now operate multiple content production and distribution sites around the globe. Moving content seamlessly between these sites increases efficiency, lowers operational costs, and enables disaster recovery strategies that support business continuity. Interconnected CSM systems can effectively form a content-sharing network, giving users a second means of access when demand is high.
Next-Gen Tech Migration
Most archives are built on multiple technologies, typically including some type of fast storage disk or cloud object storage, as well as tape and cloud storage. As the user updates these elements, the CSM orchestrates the movement of content in a seamless way. Content is migrated from old systems to new, and fresh content is routed automatically to the new system. The broadcaster or network can update its technology or expand its archive smoothly, and for end users, the experience of searching and retrieving content remains the same.

> THE RESULT: A FUTURE-PROOF FOUNDATION
A sophisticated CSM gives sports broadcasters and networks an efficient and cost-effective solution not only for digitizing and logging archived content for re-use, but also for capturing and preserving live content to support high-quality video highlights and television shows. With functions such as partial file restore supporting faster, more economical access to content, users can maximize the value of content within their archives. Rather than focus on the technical infrastructure underpinning their operations, broadcasters and networks can focus on the critical task of making great programming that keeps viewing audiences engaged. <



WHITEPAPERS

Moving Towards Live Production Anywhere in the Post-COVID Era

By Alex Redfern, EVS, SVP Solutions Architecture

Sports broadcasters today are challenged to produce more content more efficiently while coping with shrinking budgets. Traditional live production as we have known it for many decades, where all primary equipment and staff are located at the production site, is making way for alternative models to better adapt to the new realities of the industry. Remote production, for instance, is known to create more efficient workflows for the coverage of live events and is changing the way crews collaborate. And while the ability to produce an event from a distance is not new, it has gained momentum over the past few years and has become more relevant than ever since the COVID-19 outbreak.

> THE 2020 'OPERATOR AT HOME' PRODUCTION MODEL
There are different types of remote production, and the terminology can have different meanings depending on who you talk to. However, it is generally agreed that 'remote production' is the overarching term that describes a production with some elements of the production happening remotely from another element. It can be broadly categorized into different approaches, including distant remote, centralized, or distributed, depending on where the staff, equipment, and facilities are located.
Prior to the coronavirus outbreak, remote production was mostly synonymous with centralized production, sometimes referred to as 'Home-Run' or 'At-Home' production. It was the most widespread of these approaches, referring to setups where a majority of the equipment and staff are housed in a broadcast center separate from the venue where the event is taking place. Broadcasters using this model before the pandemic already reported a number of benefits: it greatly reduces travel expenses, since fewer people and less equipment are needed at the venue. In turn, less travel means a reduced carbon footprint for a more environmentally friendly way to produce live events. Additionally, remote production means more games can be covered in a shorter period of time. By spending less time on the road, operators are able to produce back-to-back games (perhaps even on the same day) or work on a wider variety of sports. This allows operators to rapidly gain new skills and experiences, leading to smoother, higher-quality productions.
When the coronavirus swept the world, broadcasters were thrown into the deep end of remote production. The travel restrictions and social distancing measures brought by the pandemic meant crews needed to be shifted away as much as possible from studios and centralized production facilities. Despite the little time they had to prepare, many broadcasters found new and creative ways to adapt their existing setups on the fly, allowing parts of their production crews to work from the safety of their own homes. By creating distributed remote workflows, businesses were able to continue producing and delivering content while helping contain the spread of the virus. Almost overnight, this 'operator at home' production model became the prevalent form of remote production, and the only way to get programs back on air safely.

> OVERCOMING THE REMOTE PRODUCTION FEAR FACTOR
It's fair to say that sports productions are among the most severely hit by the pandemic; while newsrooms and talk shows have been able to more easily adapt their formats and find solutions and workarounds to continue their programming with little disruption, it's the entire sports business model that has been affected by the blank sports calendar.



But, despite being hit harder by the crisis, the absence of live games has also given them the opportunity to take a step back and start rethinking their production models for the better. Before COVID-19, transitioning to remote workflows was an intimidating process for many, perhaps because remote workflows were not routinely used for complete productions of tier-1 or primetime live sports events but rather as an add-on or expansion of traditional production methods. In lower-tier sports, where arguably less was at stake, remote workflows were more common. Broadcasting equipment and technology also represent significant investments, and the prospect of making substantial transformations to already established production infrastructures was daunting. The widespread adoption of remote workflows during the crisis has shown the entire industry — including sports broadcasters — that remote production is well and truly a viable option and that the transition towards this model doesn't necessarily mean having to perform a complete overhaul of existing infrastructures. The core ingredients exist today for broadcasters to easily complement their current production setups to accommodate (or better support) remote workflows while capitalizing on their current investments. The fear factor has finally been removed.

> A STEPPING STONE TO LIVE PRODUCTION ANYWHERE
As we emerge from this crisis, it has become clear that it will be important for sports broadcasters to take note of the industry's recent achievements and continue to adapt their infrastructures to better support remote workflows in the long run. By leveraging IP-based toolsets, software-defined technologies, and cloud-based solutions, broadcasters can seamlessly move towards production models where location is no longer a constraint. At the end of the day, whether it's REMI, GREMI, centralized, at-home, or distant remote, what you want to achieve is live production anywhere.

As an example, the replay element of a broadcast usually represents multiple operators in a confined environment, but with replay systems that exist in the market today, operators can work from literally anywhere, provided there is a secure IP connection. They can conveniently set up the replay controller and a multiviewer either in the broadcast center or from the comfort of their own homes, connect to a server deployed at the event location, and begin working immediately. They can build their replay and highlights packages from a distance, sometimes even thousands of miles away, in a similar way to how they would back at the venue. Another possible scenario is moving the servers away from the event site, back into the main production facility. This allows the creation of live programming with minimum equipment and staff at the event site, since most operations are done from a distance. This client-server decoupling is not limited to replay systems. There are review systems (VAR), all-in-one production systems, and live switchers that also rely on this type of architecture, allowing operators to work from anywhere, at any time. Content management from different sites is also possible. With natively web-based tools, as well as other technologies such as remote desktop and PC-over-IP (PCoIP) technology, production staff can manage ingest, metadata, and clipping, as well as playout, from any location.
By further exposing the fragility of traditional approaches, the challenges brought by COVID-19 have strengthened the idea that remote production will play a central role in the future of live production. As we move forward, we can expect to see improvements in today's tools and technologies and an acceleration of innovations that will bring the future live remote production experience to the highest level, with crews collaborating live, from anywhere. <



WHITEPAPERS

Creating Schedule-Adjusted Metrics in NCAA Basketball for Differentiating Content and More Accurate Team Evaluation

By Alok Pattani, Google Cloud, Data Science Developer Advocate

> INTRODUCTION
Evaluating teams in college sports properly requires going beyond win-loss records and basic stats, as the level of competition each team faced in achieving those results should be taken into account. In NCAA Division I basketball, more than 350 men's and women's teams play schedules of highly varying quality, both in and out of their conferences. Fairly evaluating teams is not just important for creating better stats and answering basketball-relevant questions (e.g., which are the best offenses?); the idea is also crucial to the NCAA Tournament selection and seeding process. Both the men's and women's tournament fields are chosen by basketball committees made up of various executives within the NCAA basketball community. The process involves many different pieces of data collected on each team: multiple team rankings, measures of strength of schedule, counts of opponents played and defeated by difficulty, and more. Enabling team power ratings and rankings to properly account for the differing schedule contexts for team performance results in a more realistic and accurate evaluation process.

> RAW VS. ADJUSTED STATISTICS
A "raw" team stat is generally an aggregate calculation based on a team's performance over all its games (e.g., offensive efficiency), which can be measured across all teams to generate a "raw" ranking. But should a team with a great raw offensive efficiency rank very highly if they play many of the weakest defensive teams in the country? How a team performed in isolation isn't as valuable without also knowing against whom those performances took place (and where). Advanced team stats like efficiency, pace, and the Four Factors exist publicly for NCAA men's basketball on websites like KenPom, Sports Reference, and TeamRankings, but some of those stats aren't always adjusted for schedule. Also, sites like those rarely include women's basketball stats. As part of its partnership with the NCAA, Google Cloud developed a system to adjust various important basketball team metrics for schedule, taking into account each team's opponent and location (home, road, or neutral) in every game.

> SCHEDULE ADJUSTMENT METHOD
The schedule adjustment method uses ridge regression (a statistical/machine learning technique) to adjust each team's performance based on the quality of the opponent on the other side of the ball (e.g., team offense is adjusted for opponent defense). The main idea is that each stat of interest is a function of three things: a team's ability, their opponent's ability (on the "other side" of that stat), and home-court advantage. Using a very loose model representation:

game_stat ~ intercept + tm_effect + opp_effect + home_advantage + (error)

A separate model was fit for each statistic (e.g., one for pace, one for effective FG%, etc.), with the model's input data including the stat value from every game in a specific season, across all teams and opponents. The ridge regression model was fit using Python's scikit-learn package, with the resulting regression coefficients representing team effect, opponent effect, and home advantage estimates. In other words, the regression automatically does the opponent and site adjustment and produces adjusted team stats. Ridge regression is a particularly good fit for this use case to help handle multicollinearity and "shrink" coefficients (particularly in small sample size cases) — a form of regularization that gives more sensible estimates. This is a much more thorough and statistically valid technique than the typical "stat vs. opponent season average" adjustment often found in other such analysis, with the additional advantage of site adjustment.
In the basketball context, the concept behind how a team's stat gets adjusted is fairly logical: playing higher-quality opponents and doing so away from home generally results in a team's statistics being boosted by adjustment; playing lesser opponents and more home games in general leads to a negative adjustment. For example, if a team's defense often holds opposing offenses below their usual efficiency, then it will likely rate as a good defense after adjusting for schedule.
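A minimal sketch of this setup with scikit-learn is shown below. The game rows and numbers are invented for illustration, and the column and team names are our own; this is not Google Cloud's production pipeline.

```python
# Minimal sketch of the schedule-adjustment idea: regress a game-level stat on
# team, opponent, and home-court indicators with ridge regularization.
import pandas as pd
from sklearn.linear_model import Ridge

# Hypothetical game-level rows: offensive efficiency for the listed team.
games = pd.DataFrame({
    "team":     ["A", "B", "C", "A"],
    "opponent": ["B", "C", "A", "C"],
    "home":     [1, 1, 0, 0],                  # 1 = listed team played at home
    "off_eff":  [112.3, 104.8, 98.5, 109.0],   # points per 100 possessions
})

# One-hot encode team (offense) and opponent (defense) effects.
X = pd.concat(
    [pd.get_dummies(games["team"], prefix="tm"),
     pd.get_dummies(games["opponent"], prefix="opp"),
     games[["home"]]],
    axis=1,
)
y = games["off_eff"]

# Ridge regularization "shrinks" coefficients, especially with few games per team.
model = Ridge(alpha=1.0).fit(X, y)

# Adjusted offensive efficiency: intercept plus the team's own coefficient,
# i.e., expected performance against an average opponent at a neutral site.
coefs = pd.Series(model.coef_, index=X.columns)
adjusted = model.intercept_ + coefs.filter(like="tm_")
print(adjusted.sort_values(ascending=False))
```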

> EFFECT OF ADJUSTMENT ON TEAM STATISTICS
To see how adjusting for schedule can impact the measurement and ranking of teams in a particular statistic, consider the following plot. It shows each 2018-19 men's Division I basketball team's raw and adjusted version of net efficiency — net points per 100 possessions, a pace-adjusted version of scoring margin — a couple of days before Selection Sunday. Raw net efficiency is on the x-axis, adjusted net efficiency is on the y-axis (up and right is good), with each team represented by a single point. Most points fall relatively close to the diagonal gray line, where raw and adjusted efficiency would be equal. But some teams have adjusted efficiency values and rankings that are rather different from their raw ones, which suggests their competition has a substantial effect on the way they look.


2018-19 NCAA Men’s Division I Basketball Teams Adjusted vs Raw Net Efficiency (Through March 11) substantial effect on the way they look. Take the two teams highlighted on the plot: Abilene Christian (ACU), ranked 21st in raw net efficiency, and Auburn, slightly behind at 24th. While ACU performed well that season, playing a relatively weak Southland Conference and non-conference schedule dropped them all the way to 146th after adjusting for schedule. Meanwhile, adjusting for Auburn’s strong Southeastern Conference and non-conference slate of opponents boosted their efficiency and their ranking up to 13th — well ahead of ACU. This is schedule-adjusting doing its job, as the teams’ similar raw efficiency numbers belie the fact that Auburn’s results were much more difficult to achieve. Most informed college basketball observers would have rated Auburn well ahead of Abilene Christian at that time, with the NCAA committee giving a ACU a 15-seed and Auburn a 5-seed in the 2019 NCAA Tournament. The adjusted net efficiency was much more reflective of tournament performance, too, as ACU lost by 35 to Kentucky in its first game, while Auburn defeated some really strong teams (including Kentucky) on the way to the Final Four.

> PREDICTIVE VALUE OF ADJUSTED NET EFFICIENCY The preceding example illustrates the predictive value of adjusting net efficiency for schedule across teams. A much more comprehensive study of the predictive value of using schedule-adjusted net efficiency as a power rating across late-season games, including NCAA Tournament contests, shows that it is quite predictive of future performance for both men's and women's basketball. Details of that more thorough evaluation are beyond the scope of this paper, but the table below helps demonstrate this with a specific example. The table shows each of the eventual Sweet 16 women's basketball teams for the 2018-19 season, with their rankings in raw and adjusted efficiency (along with the rank difference) from the day the NCAA Tournament field was selected (i.e. before any Tournament games). Note that 14 of the top 16 teams in adjusted net efficiency made it to the Tournament's second week, compared to only eight of the top 16 in raw efficiency. And while the top teams rated very highly in both metrics, seven eventual Sweet 16 teams ranked more than 15 spots better in adjusted net efficiency (UCLA is a particularly large outlier, ranking 107th in raw efficiency). This is a small sample, but is reflective of the greater trend across many more years and games. In part because of this predictive value demonstrated over several years of analysis, adjusted net efficiency is a primary factor in the new versions of the NCAA Evaluation Tool (NET) recently adopted by both the NCAA men's and women's basketball committees.

2018-19 NCAA Women's Division I Basketball Sweet 16 Teams, Raw and Adjusted Net Efficiency Ranks (as of Tournament Selection Date)

> EXTENSIONS AND FURTHER DISCUSSION The Google Cloud schedule adjustment methodology and implementation across multiple statistics allows for much more accurate team evaluation, both overall and in a number of specific facets of the game. These metrics can be used in differentiating and insightful storytelling for college basketball fans and analysts alike. Google Cloud used some of them in notes and other content elements across media platforms during the 2019 men's NCAA Tournament, including on Tournament studio and game broadcasts viewed by millions. In 2019-20, these schedule-adjusted metrics were calculated daily for every men's and women's basketball team across the past six seasons, resulting in more than seven million rows of adjusted team stats available for all sorts of interesting analysis. This methodology was also extended to some advanced player statistics, allowing the creation of unique "all-in-one" player metrics that account for competition faced. Many of these results are found in the only public-facing dashboard with schedule-adjusted men's and women's metrics like these. While not in the scope of this paper, Google Cloud data analytics tools including Dataflow, BigQuery, and Cloud Composer were critical in the ingestion, data warehousing, computation, and orchestration necessary to calculate metrics at this scale on a regularly scheduled basis. More broadly, the idea of adjusting metrics for schedule is fairly fundamental in sports analytics, especially in college sports but also in many professional sports. Having better metrics for team and player quality has many other implications in sports broadcasting, including deciding which games to schedule or teams to cover. In this way, putting this type of methodology and implementation into play can help make sports production and content smarter, more accurate, and more insightful. <



WHITEPAPERS

Virtualizing Content Creation

By Ian Fletcher, Grass Valley, Chief Application Designer, and Chuck Meyer, Grass Valley, Chief Technology Officer

The cloud is certainly top of mind. The increasing move towards remote production heightens interest in the cloud as both a technology and a service. The pandemic has accelerated this trend. Content originators are eager to trial proofs of concept and move rapidly to on-air operations. Until now, remote production typically consisted of two teams, one at the venue and one at a central location. The pandemic challenges this model, stretching it to include the concept of production talent working from home with virtualized processing equipment in the cloud. Grass Valley anticipated this three-position model of distributed cloud production based on technology trends and was working with early adopters prior to widespread quarantine. If there is a silver lining to be found, perhaps it can be described by explaining the lessons learned with GV AMPP, Grass Valley's Advanced Media Processing Platform. AMPP is designed natively for the cloud. It is not an adaptation to the cloud, nor is it an attempt to have the cloud behave as an emulator of an on-premises facility. The simple three-position model could be considered as cameras and microphones at a venue, producers at home, and the virtualized application in the cloud. In the limit, there could be more than one of each position, in more than one geographic location. This immediately complicates the requirements on a number of levels, which can be categorized as Control, Monitoring, and Latency Management. Different producers must have an intuitive user interface, which provides them a coherent media experience and, at the same time, allows for collaboration with additional producers.

GV AMPP consists of a control plane and a data plane. An essential concept with cloud technology is to include resiliency in preference to rigidity. Strong identity principles are used throughout both control and data planes. The control plane provides operators a single point of system entry, a unique identity, and therefore authentication and a high degree of security. There can be multiple control points, located where required. These control points need not be located where video is processed. The use of strong identity ensures proper coordination of data plane resources amongst control points. Just as operators need not be collocated, it is possible to locate video processing, or data plane, resources where desired. A processing engine could be located at the venue, in the cloud, or in a producer's "home" based on system performance requirements. At the same time cloud technology is adopted, the user interface has been designed to offer a familiar look and feel. Figure 1 is a screen shot of AMPP. There are three geographic locations. There are a number of feeds, or sources. Signals are monitored and displayed in a familiar way. One nice feature is that monitoring is shown at the point of analysis, rather than pushed to a monitor wall, which can often be well downstream and therefore inaccurate. At the same time, the system is represented as a connected graph showing general location, or deployment, of sources, processing, monitoring, and feeds, as well as the signal flow. Each producer can see their necessary resources, which can include the collaborative resources of another producer. Interaction between producers is managed to ensure usability. The data plane exploits a microservice architecture. Video processing is carried out by devices, each with strong identity. Devices can be coordinated to appear as a single piece of equipment and controlled by operators as if they were a single piece of equipment. Still, the equipment is really a virtual device, often composed of numerous microservices that may be located thousands of miles away. By design, AMPP provides a user interface that is suitably responsive for human interaction even with this underlying implementation.

Providing human operators with acceptable responsiveness is a challenge. Signal propagation is measured in time. This delay, or latency, can be annoying, so much so as to render the system unusable. AMPP data plane technology targets bottlenecks in the transport stream, codecs, and receiver design to ensure the lowest latency as well as coherent signal management. In the end, the producer is provided a satisfying, very useable, albeit virtual, creative experience. In order to meet the latency requirements of a given system, certain parameters can be adjusted, conceptually trading between aspects of quality for gains in latency, or vice versa. As compute improves, so do codecs. A 1080p59 signal can be very useful for production decisions even when compressed to 5 Mbps. As available bandwidth grows, the number of possible sources at a given location will also increase. Increases in web scale, transport, and compute technology will provide for more sources, with higher quality as well as the ability to decrease latency over time. AMPP will natively scale with these developments in underlying technology.

Causality cannot be violated. As the number of functional positions and their locations increases, unique identity is an important tool. NMOS provides one scheme of using unique identity and time for each grain that makes up a flow of video essence. In the terminology of SMPTE ST 2110, each frame of video has a unique time stamp. One need only add a unique identity for each audio, video, and control essence stream. It is now possible to form time-coherent alignment of essence, as well as processing control, to achieve a determined outcome in any geographic location. Care can be taken to manage timing requirements for a video production service operating at full bandwidth, a monitoring service using compressed signals, a production decision process (which generates a control sequence), and the like. In short, any task can be carried out in a timeline and associated with the necessary processing steps to create flexible workflows meeting the human factors requirements for a given production. Because this system is cloud native, it can be easily torn down and re-configured to meet the demands of the next shoot, be it breaking news or live sports.

Figure 1 shows one representation of an AMPP system. But AMPP has any number of user-configurable views. In a second view, shown here as Figure 2, system dynamics for latency management are represented as horizontal bar graphs. This is especially important where understanding critical signal timing is essential to on-air confidence for production operators. Again, this screen shot shows the connected graph of a system along with the signals, processes, and locations used, thereby representing the system. The green arrows point out horizontal timing indicators located at the top of the signal monitors, which are really live icons. The length of the green bar represents the timing position within the timing window. Timing windows are established as part of the system configuration as built, or deployed. They are shown normalized to fit the length of the video monitor icon to simplify visualization. The green bar then represents accumulated latency for a given signal at the exact point in the system shown by the icon. In Figure 3, a close-up of the timing bars is shown. In all likelihood, the green bar length will be unique for each icon, or stream, represented. A shorter bar indicates less latency; a longer bar represents longer latency. The system operates acceptably when all the bars stay within the normalized window. AMPP manages this process. The icons and green bars exist to ensure confidence and facilitate understanding of acceptable system operation. The bars in this case are not identical and do not extend to the limit of the normalized timing window, so both streams are well within the acceptable timing window for stable operation.
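As a rough illustration of the normalization the timing icons perform, the sketch below maps an accumulated latency onto a 0-to-1 bar length within a configured timing window. The function names and the window value are hypothetical, not part of AMPP.

def timing_bar_fraction(accumulated_latency_ms: float, window_ms: float) -> float:
    """Return the bar length as a fraction of the monitor icon width.
    1.0 means the signal has used the entire configured timing window."""
    if window_ms <= 0:
        raise ValueError("timing window must be positive")
    return min(accumulated_latency_ms / window_ms, 1.0)

def within_window(accumulated_latency_ms: float, window_ms: float) -> bool:
    # Stable operation requires every stream's bar to stay inside the window.
    return accumulated_latency_ms <= window_ms

# Example: two streams with different accumulated latency in a 200 ms window.
for name, latency_ms in [("venue cam 1", 85.0), ("venue cam 2", 132.0)]:
    print(name, round(timing_bar_fraction(latency_ms, 200.0), 2), within_window(latency_ms, 200.0))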

GV AMPP certainly faced design challenges. These challenges were distilled into the design goals discussed above, which are key to meeting our customers' needs. In turn, the solutions were developed in tandem with early product adopters, ensuring their needs were met. Using cloud-native technology, including microservices, enables rapid turn-around for developing new features, new workflows, and new virtual devices. In another aspect, AMPP uses a true foundation of CORE applications or services. In this way, new applications, workflows, and devices all inherit the assurance of latency-managed, cost-optimized operations, which respect human factors. <




WHITEPAPERS

Remote Production: Why Synchronization Matters for Live Sports By Ghislain Collette, Haivision, VP, Product Management

> INTRODUCTION Nothing engages viewers like live video. Whether it's for television or corporate communications, live video can make people feel as if they are truly taking part in the event, no matter where they are watching from. For producers with tight budget constraints, planning for a live event involves tough choices between deploying remote production staff and the cost of transmitting video. Traditionally, live production of remote events requires an onsite crew of camera operators, sound engineers, and a technical director. Adopting a remote production model can reduce production costs and logistical complexity by reducing the burden of deploying expensive resources: the equipment required to capture, process, and produce at a remote venue and the field crew needed to set up, operate, and manage it. Creating greater efficiencies allows broadcasters to produce more events and deploy their best resources more effectively. Although costs can be significantly reduced by managing live production workflows from a main master control room (MCR), sometimes referred to as remote integration or REMI, the additional bandwidth typically required for transmitting multiple contribution video feeds over satellite or a dedicated network can negate the savings of having a centralized live production facility. In this white paper, we will explore how broadcasters can leverage the latest video streaming technologies to satisfy the demands of remote production workflows without the traditional costs and logistical complexities.

Figure 1: Remote production workflow with synchronized video streams

> THE CHALLENGE OF SYNCHRONIZING MULTIPLE CAMERA STREAMS OVER IP While broadcasting live events, the use of multiple cameras allows for a more engaging and dynamic viewer experience. For remote locations such as sports stadiums or concert venues, a producer needs to be able to seamlessly switch between live video feeds depending on what angle is most suitable at a given time. Typically, a single audio stream is used, as sudden changes in audio are very noticeable and can be distracting. If the video is not synchronized, switching between cameras can result in issues such as input lag and lip sync delay. At the live production facility, decoders receiving the live feeds need to be kept in sync so that a producer can immediately include any of the sources within their live playout workflow. One way to help mitigate multi-camera and audio sync issues is by multiplexing camera feeds over satellite uplink, although this can be a costly solution. Another option is to use a dedicated private network that can provide a stable level of latency and therefore the ability to manually sync video and audio feeds, although this is not always possible from remote locations. Streaming over the internet is a more cost-effective and flexible approach; however, bandwidth availability is difficult to predict and can change at any given moment. Being able to synchronize remote contribution streams over the internet resolves the dilemma between managing costs and ensuring broadcast quality. Keeping live video and audio in sync while streaming over IP networks can be a considerable challenge, especially when dealing with an unpredictable network like the internet, where round trip times and bandwidth availability can continually fluctuate. In order to ensure that all video and audio streams are in sync with each other, broadcast and network engineers need to spend considerable time manually adjusting the timing of each video decoder output. Typically, this is done using a test pattern device to calibrate audio channels with live video sources. This approach requires coordination between people at both the remote location and at the MCR and can be very time consuming. The more cameras and audio channels involved, the more complicated it becomes to synchronize everything, and the more time needed before going on air. Although with the right tools this approach can be made to work, there is a simpler and faster way.

Figure 2: How Stream Sync works in a remote production workflow

> THE STREAM SYNC SOLUTION Haivision's Stream Sync solution automates and simplifies real-time frame alignment. Stream Sync is supported by the Makito X Series of video encoders and decoders, including the new Makito X4 encoder and decoder for 4K or quad-HD video. These Haivision devices are configured to stream multiple channels of live event video that are kept in sync, accurate to within a single frame. Stream Sync works by continuously monitoring the end-to-end transit time and dynamically adjusting the internal decoder buffers to compensate. Stream Sync enables broadcast engineers and producers to capture multiple live video and audio streams from a remote venue and keep them all in sync for immediate use. Makito X and X4 video decoders ensure that live feeds are synchronized so that downstream production equipment will not experience issues when switching between video and audio sources. Stream Sync continuously monitors the characteristics of the streams and the network and applies the exact amount of buffering required to ensure smooth and synchronized playout across multiple feeds. This is done in real time based on timestamps embedded in each stream from the remote Makito X or X4 encoders. For live production, this means that any camera can be used with any audio track, with no noticeable video hits or loss of lip sync. For Stream Sync to work, cameras need to be genlocked and the Makito X and X4 video encoders synchronized to an NTP server designed for broadcast applications. This can be easily configured through the video encoder GUI, which provides a way to specify the NTP server used and ensure that all outgoing streams are timestamped in sync. Stream Sync not only benefits broadcasters, but also provides companies, nonprofits, and government agencies a way to deliver broadcast-quality live event coverage. Corporate training, executive communications, and webcasts can all be more engaging using multiple cameras with none of the distractions of out-of-sync video and audio streams. With the Makito X4 encoder and decoder pair, four HD video streams can be kept in sync with Stream Sync using only one device on location and another in the production studio.
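To illustrate the general buffering idea (this is a conceptual sketch, not Haivision's implementation): given per-stream transit times measured from embedded timestamps against a common clock, each decoder adds just enough delay that all feeds play out aligned to the slowest one.

import math
from typing import Dict

def decoder_buffer_delays(transit_ms: Dict[str, float], frame_ms: float = 1000 / 50) -> Dict[str, float]:
    """Given measured end-to-end transit time per stream (ms), return the extra
    buffering each decoder should apply so every stream is delayed to the slowest
    one, rounded up to a whole frame boundary."""
    slowest = max(transit_ms.values())
    target = math.ceil(slowest / frame_ms) * frame_ms   # align to a frame boundary
    return {name: target - t for name, t in transit_ms.items()}

# Example: three camera feeds arriving with different network delays (ms).
print(decoder_buffer_delays({"cam1": 42.0, "cam2": 58.5, "cam3": 47.2}))
# Each value is the added latency that brings all feeds to the same playout point.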

> CONCLUSION New technologies for remote production workflows over the internet — such as Haivision's Stream Sync — are allowing broadcasters to cover a wider range of live events, including sports and news gathering, without the costly overhead of deploying production teams and OB trucks to each site, or transporting video over satellite or dedicated networks. Being able to sync remote video streams over the public internet is more cost effective and flexible than using satellite or private managed networks. It enables any type of broadcaster to live stream events with multiple camera angles from any location with broadband internet access. <




WHITEPAPERS

Immersive Media Experiences With Intel True View Delivers New Reality for Sports

By

Harleen Gill, Intel Corporation, Principal Engineer

Ritesh Kale, Intel Corporation, Director of Engineering

> INTRODUCTION With advancements in volumetric media technology, Intel Sports is paving the way to deliver unique and compelling immersive media experiences to sports fans. By harnessing the power of volumetric video, Intel Sports provides leagues, teams, and broadcasters with new storytelling capabilities to engage fans through interactivity, personalization, and unbounded perspectives of the game. Stadiums filled with fans in their seats, sitting side by side, watching their favorite players and following the play up-close is a fan experience that was once commonplace, but now is vulnerable due to the coronavirus pandemic. With fans at home, social platforms have seen higher engagement and the demand for streaming content has increased as fans look to stay informed, connected and eagerly wait for sports' full return. Intel Sports with its Intel True View Platform enables fans from anyplace to experience the game from a new perspective and allows for safe, large-scale remote production for broadcasters and teams, meeting the unexpected challenges of an unprecedented time and evolving fan behavior.

> EXPERIENCES ENABLED BY VOLUMETRIC CAPTURE Immersive media is a form of media that includes non-traditional media formats such as 360-degree video, virtual reality (VR), augmented reality (AR), mixed reality (MR), and other emerging technology platforms. The Intel True View Platform enables sports fans to choose where and how they want to consume the next generation of sports content, including options like three degrees of freedom (3DoF), three degrees of freedom with some limited head movement (3DoF+), or full six degrees of freedom (6DoF) immersive media experiences. Powered by Intel True View technology, a multitude of immersive media experiences can be created using a volumetric model of the action happening on the field:
• Enhanced storytelling with Virtual Cameras: Virtual cameras can follow the ball or players of interest and look at the action from any point of view, create virtual sky-cams from any location in the field, or watch the action from a referee's point of view.
• Create AR experiences: AR experiences can be created where mobile devices (phones/tablets) can project the on-field action onto a tabletop and allow a user to navigate around the game action.
• Create VR experiences: VR experiences can be created for users ranging from 3DoF to 6DoF, placing users virtually in the middle of the action.
• Enable influencers to tell new stories: Influencers (players, commentators, analysts, etc.) can view the game with unlimited camera angles and unique perspectives to tell new stories to their fan base.
In order to create the experiences described, Intel Sports utilizes a combination of cloud technologies and traditional storytelling techniques in a remote setting away from the venue.

> EXECUTION, CREATION, AND DELIVERY OF IMMERSIVE MEDIA EXPERIENCES The Intel True View Platform and workflow starts with a venue capture system comprised of a camera array built into the perimeter of the stadium, as shown in Figure 1. High-resolution cameras are angled to capture the entire field of play, and the camera array is connected by fiber to dedicated on-site Intel Xeon-based servers. The data is then uploaded to a media processing pipeline that stores, synchronizes, analyzes, and processes the data in the cloud.

Figure 1: Intel True View Platform Volumetric Capture at a Stadium


Figure 2: Intel True View Immersive Platform Architecture and Workflow

The core immersive media processing and experiences pipeline components, as shown in the Intel True View Platform end-to-end workflow in Figure 2, are all hosted in the cloud. These cloud workloads generate massive amounts of volumetric data (up to 200 terabytes of raw data per match) in the form of voxels, which capture height, width, depth, and relative attributes that are needed for point cloud formation. Volumetric video created from the point cloud generates volumetric data that is rendered into high-fidelity 3D video in the form of virtual camera videos. In the remote production site, a variety of production tools allow creative producers, broadcasters, and the Intel Sports production team to create volumetric content focused on the most action-packed or analysis-worthy parts of the game. The virtual camera tool allows a producer to define and place a variety of virtual cameras throughout the field, including stationary, rail, and tracking cameras. This can track objects such as the ball or players during the live game, while traditional production tools continue to allow producers to curate and enhance content with graphics, audio, and telestration, as is common practice when creating highly produced content. Back in the cloud, the video renderer creates and renders the series of images based on virtual cameras defined by the virtual camera tool operator. These virtual camera streams are then sent to the video encoder software, which converts the uncompressed videos into a compressed digital video format. The system is designed to support the most common industry-standard video codecs (H.264/AVC, H.265/HEVC, and M-JPEG), plus AAC for audio, in order to support a wide range of client platforms and devices. The stream packager takes the encoded videos, converts the memory bits into consumable bitstreams, and prepares the content for live streaming using Transport Stream (TS) and MP4 file formats for HLS (HTTP Live Streaming) distribution. In order to enable business partners to control which video streams are outputted to immersive media streaming applications, the content packaging tool and the content management system (CMS) are provided. The content packaging tool is part of the final leg of creative content creation, where content can be selected, organized, and assembled into a final game package. The CMS enables control over when content is made available to clients. It stores the location of the content distribution network (CDN) streams and manages the output streams shown to end users. The CDN serves as the distribution component of the pipeline that takes game packages and streams the content to client players. On the client side, the stream de-packager reads the consumable bitstreams and converts them into a format that the decoder can understand. The client player's video decoder takes the compressed video, then decompresses the video into a series of images. The video renderer takes the series of images and renders them sequentially into what is seen on an end user's device. And finally, the client application can be designed to provide a variety of experiences enabled by SDKs.

> LEADING A NEW MEDIA FORMAT As fans hunger for the return of their favorite teams and players, the demand for sports and sports-related content has never been greater. The Intel True View Platform not only meets the needs of fans and storytellers during the coronavirus pandemic, but also sets the stage for a new reality of how sports content will be consumed now and in the future. Applying the latest technical advances in volumetric content creation and deep knowledge of operating and deploying large-scale systems, the platform enables fans to be virtually present in the stadium or on the field, alongside their favorite teams and players. Broadcasters, teams, and production crews are enabled to perform large-scale production and deliver unique content and storytelling via virtual cameras capturing perspectives that cannot be captured by physical cameras, all safely without needing to send camera crews into the venue. The Intel True View Platform empowers producers and content creators to focus on unique creative storytelling with new immersive media experiences that put their fans right in the center of the game. <



WHITEPAPERS

Increasing Value of Sports Content: Machine Learning for Up-Conversion HD to UHD

By Tony Jones, MediaKind, Principal Technologist

Following the height of the 2020 global pandemic, live sports are starting to re-emerge worldwide — albeit predominantly behind closed doors. For the majority of sports fans, video is the only way they can watch and engage with their favorite teams or players. This means the quality of the viewing experience itself has become even more critical. With UHD being adopted by both households and broadcasters around the world, there is a marked expectation around visual quality. To realize these expectations in the immediate term, it will be necessary for some years to up-convert from HD to UHD when creating 4K UHD sports channels and content. This is not so different from the early days of HD, where SD sporting related content had to be up-converted to HD. In the intervening years, however, machine learning as a technology has progressed sufficiently to be a serious contender for performing better up-conversions than more conventional techniques, specifically designed to work for TV content. Ideally, we want to process HD content into UHD with a simple black box arrangement. The problem with conventional up-conversion, though, is that it does not offer an improved resolution, so does not fully meet the expectations of the viewer at home watching on a UHD TV. The question, therefore, becomes: can we do better for the sports fan? If so, how?

> TRADITIONAL APPROACHES TO UP-CONVERSION UHD is a progressive scan format, with the native TV formats being 3840x2160, known as 2160p59.94 (usually abbreviated to 2160p60) or 2160p50. The corresponding HD formats, with the frame/field rates set by region, are either progressive 1280x720 (720p60 or 720p50) or interlaced 1920x1080 (1080i30 or 1080i25). Conversion from HD to UHD for progressive images at the same rate is fairly simple. It can be achieved using spatial processing only. Traditionally, this might typically use a bi-cubic interpolation filter (a two-dimensional interpolation commonly used for photographic image scaling). This uses a grid of 4x4 source pixels and interpolates intermediate locations in the center of the grid. The conversion from 1280x720 to 3840x2160 requires a 3x scaling factor in each dimension and is almost the ideal case for an upsampling filter. These types of filters can only interpolate, resulting in an image that is a better result than nearest-neighbor or bi-linear interpolation, but does not have the appearance of being a higher resolution.
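For reference, a conventional spatial-only up-conversion of this kind can be reproduced with an off-the-shelf bicubic resize. The sketch below uses OpenCV and is purely illustrative; the file names are placeholders.

import cv2

hd = cv2.imread("frame_720p.png")   # 1280x720 progressive source (placeholder file)
# 3x scaling per axis: 1280x720 -> 3840x2160 with a bicubic interpolation filter.
uhd = cv2.resize(hd, (3840, 2160), interpolation=cv2.INTER_CUBIC)
cv2.imwrite("frame_2160p_bicubic.png", uhd)

The result is a larger picture, but, as noted above, no genuinely new detail is created.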

> MACHINE LEARNING AND IMAGE SCALING Machine Learning (ML) is a technique whereby a neural network learns patterns from a set of training data. Images are large, and it becomes unfeasible to create neural networks that process this data as a complete set. So, a different structure is used for image processing, known as Convolutional Neural Networks (CNNs). CNNs are structured to extract features from the images by successively processing subsets of the source image, and they then process the features rather than the raw pixels.

The inbuilt non-linearity, in combination with feature-based processing, means CNNs can invent data that is not in the original image. In the case of up-conversion, we are interested in the ability to create plausible new content that was not present in the original image, but that doesn't modify the nature of the image too much. The CNN used to create the UHD data from the HD source is known as the Generator CNN. When input source data needs to be propagated through the whole chain, possibly with scaling involved, then a specific variant of a CNN — known as a Residual Network (ResNet) — is used. A ResNet has a number of stages, each of which includes a contribution from a bypass path that carries the input data. For this study, a ResNet with scaling stages towards the end of the chain was used as the Generator CNN.

> TRAINING For the Generator CNN to do its job, it must be trained with a set of known data — patches of reference images — and a comparison is made between the output and the original. For training, the originals are a set of high-resolution UHD images, down-sampled to produce HD source images, then up-converted and finally compared to the originals. The difference between the original and synthesized UHD images is calculated by the compare function, with the error signal fed back to the Generator CNN. Progressively, the Generator CNN learns to create an image with features more similar to original UHD images. The training process is dependent on the data set used for training, and the neural network tries to fit the characteristics seen during training onto the current image. This is intriguingly illustrated in Google's AI Blog [1], where a neural network presented with a random noise pattern introduces shapes like the ones used during training. It is important that a diverse, representative content set is used for training. Patches from about 800 different images were used for training during the process of MediaKind's research.
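The training loop described (down-sample UHD originals, up-convert, compare against the originals, feed the error back) can be sketched as follows in PyTorch. The toy generator and the simple L1 compare function are deliberately minimal stand-ins for illustration, not MediaKind's actual network or loss.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                                  nn.Conv2d(32, 3 * 9, 3, padding=1))
        self.upscale = nn.PixelShuffle(3)   # 3x spatial scaling towards the end of the chain

    def forward(self, hd):
        return self.upscale(self.body(hd))

gen = ToyGenerator()
opt = torch.optim.Adam(gen.parameters(), lr=1e-4)
loader = [torch.rand(1, 3, 192, 192) for _ in range(4)]   # stand-in UHD reference patches

for uhd_patch in loader:
    hd_patch = F.interpolate(uhd_patch, scale_factor=1/3, mode="area")  # derive HD source
    synth = gen(hd_patch)                     # up-convert back to UHD resolution
    loss = F.l1_loss(synth, uhd_patch)        # compare function (simplified)
    opt.zero_grad()
    loss.backward()                           # error signal fed back to the generator
    opt.step()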

> THE COMPARE FUNCTION The compare function affects the way the Generator CNN learns to process the HD source data. It is easy to calculate a sum of absolute differences between the original and synthesized images. This causes an issue due to training set imbalance; in this case, the imbalance is that real pictures have large proportions with relatively little fine detail, so the data set is biased towards regenerating a result like that — which is very similar to the use of a bicubic interpolation filter. This doesn't really achieve the objective of creating plausible fine detail.

> GENERATIVE ADVERSARIAL NEURAL NETWORKS Generative Adversarial Neural Networks (GANs) are a relatively new concept [2], where a second neural network, known as the Discriminator CNN, is used and is itself trained during the training process of the Generator CNN. The Discriminator CNN learns to detect the difference between features that are characteristic of original UHD images and synthesized UHD images. During training, the Discriminator CNN sees either an original UHD image or a synthesized UHD image, with the detection correctness fed back to the discriminator and, if the image was a synthesized one, also fed back to the Generator CNN. Each CNN is attempting to beat the other: the Generator by creating images that have characteristics more like originals, while the Discriminator becomes better at detecting synthesized images. The result is the synthesis of feature details that are characteristic of original UHD images.

> HYBRID GAN APPROACH With a GAN approach, there is no real constraint to the ability of the Generator CNN to create new detail everywhere. This means the Generator CNN can create images that diverge from the original image in more general ways. A combination of both compare functions can offer a better balance, retaining the detail regeneration but also limiting divergence. This produces results that are subjectively better than conventional up-conversion. What about interlace? Conversion from 1080i60 to 2160p60 is necessarily more complex than from 720p60. Starting from 1080i, there are three basic approaches to up-conversion:
• Process only from the corresponding field
• De-interlace and process from the frame
• Process from multiple fields directly
Training data is required here, which must come from 2160p video sequences. This enables a set of fields to be created, which are then downsampled, with each field coming from one frame in the original 2160p sequence, so the fields are not temporally co-located. Surprisingly, results from field-based up-conversion tended to be better than using de-interlaced frame conversion, despite using sophisticated motion-compensated de-interlacing: the frame-based conversion was dominated by the artifacts from the de-interlacing process. However, it is clear that potentially useful data from the opposite fields did not contribute to the result, and the field-based approach missed data that could produce a better result.

> HYBRID GAN WITH MULTIPLE FIELDS A solution to this is to use multiple fields' data as the source data directly into a modified Generator CNN, letting the GAN learn how best to perform the de-interlacing function. This approach was adopted and re-trained with a new set of video-based data, where adjacent fields were also provided. This led to both high visual spatial resolution and good temporal stability. These are, of course, best viewed as a video sequence; however, an example of one frame from a test sequence shows the comparison:

Figure 2: Comparison of a sample frame from different up-conversion techniques against original UHD

> CONCLUSIONS Up-conversion using a hybrid GAN with multiple fields was effective across a range of content, but is especially relevant for the visual sports experience for the consumer. This offers a realistic means by which content that has more of the appearance of UHD can be created from both progressive and interlaced HD source, which in turn can enable an improved experience for the fan at home when watching a sports UHD channel. <

References
1. A. Mordvintsev, C. Olah, and M. Tyka, "Inceptionism: Going Deeper into Neural Networks," 2015. [Online]. Available: https://ai.googleblog.com/2015/06/inceptionism-going-deeper-into-neural.html
2. I. Goodfellow et al., "Generative Adversarial Nets," Neural Information Processing Systems Proceedings, vol. 27, 2014.




WHITEPAPERS

The Future of Production Is Automatic, Flexible, and Remote — and "Hands Off"

By Joan Bennessar, Mediapro, Sport Intelligence Technical Director, AutomaticTV

As sports competitions resume around the world, a number of leagues (MLB, LaLiga, Premier League) are playing games in their home venues, while others — like the NBA, MLS, Canada's CEBL and CPL, and Spain's ACB Liga — have opted for the single hub model. Match productions have had to adapt to fit stringent new requirements set out by the venues, teams, and local health authorities. When covering major sports events, broadcasters typically deploy a large crew and range of cameras, from beauty cams to large cameras on tripods, Ultra SloMo, and Skycams, as well as the relevant audio and graphic systems — but such a set-up can be unaffordable or impractical for many organizers. However, there are now a range of mature automatic production solutions that allows them to produce professional coverage on restricted budgets. These increasingly sophisticated technologies permit rights holders to maximize returns while retaining quality and improving the flow of data and analysis available to referees and coaches. To decide which system to choose, the first step is to define the camera plan and decide which and how many of these cameras will be automated, in line with available budget. Each sport has its own specifications, and the master camera of an automatic system can deliver streams automatically without human intervention by using a very accurate Artificial Intelligence (AI) engine. The most cost-effective solution involves a single automatic master camera, including overlays and audio. Very few solutions offer a multicamera production — that is, at least two more cameras in addition to the master one, with switching between the different cameras automatically. Some solutions permit the integration of human-operated cameras, which enhances the production with close-up views. The switching, in this case, is manually operated.


Live streaming of Canadian Elite Basketball League 2020 Summer Series using AutomaticTV

With all these options, a broadcaster can plan whether to make a three-, five-, or seven-camera automatic multicam production. Recommended setups are a three-camera production, completely automated — i.e., without human intervention. The next level would be a five- or seven-camera position, with a director switching cameras manually. A single person can therefore manage a complete multicamera production of up to seven cameras, making it a very attractive solution for many sports broadcasts. A more advanced production can be accomplished by adding one, two, or three close-up cameras, each manned by an operator. In indoor sports, this could comprise a camera on a tripod with a large zoom, close to the master position, and one or two handheld cameras on the floor. Outdoor setups are generally more complex. One important aspect to consider is the variety of cameras used. Few automatic production solutions allow for the same brand of cameras, unmanned, either on tripods or handheld — a key factor in facilitating the color balance of all cameras, as well as the dynamic range, given that all cameras will react identically when lighting conditions change, as happens during a sports event. Another important area is the availability of tools to adjust white and black levels, exposure, contrast, and white balance of each individual camera — critical selection criteria for a producer when choosing between various alternatives. Integrated color management tools in the automatic software can also help to integrate different cameras; although, as stated before, using the same cameras across the production will result in a better broadcast product — bearing in mind that the goal is to come as close to a professional "standard" broadcasting production as possible. So far, we have described a completely unmanned production with one, three, or five positions; a single-person multicamera production from three to five positions, including replays; and a three- to five-person multicamera production with the same number of camera positions, including replays and two to four close-ups.


The crew required to operate the system is also a key point to consider. It is important to evaluate whether the team must be onsite, or whether the production can be done from a remote location. Some solutions on the market allow camera switching and replays to be done remotely, meaning only camera operators remain onsite. Implementing these workflows requires experience, as different parameters involving latency, bandwidth, and other network elements have to be taken into consideration. Integration with a broadcaster's workflow in areas such as graphics and ingesting the feed will also impose requirements on the automatic production solution. The broadcaster may decide to use their own graphics engine instead of relying on the automatic solution, mixing the signal on the new system; NDI is increasingly becoming a standard solution to achieve these integrations. In the case of the video ingest, it depends on whether the video is delivered locally, at the venue, or at an outside contribution point. If it's local and the production company still uses SDI technology, the automatic solution will have to provide the outputs as SDI, progressive, or interlaced. If the production company has already migrated to IP, NDI will likely be the preferred method. When approaching remote delivery, Secure Reliable Transport (SRT) should be taken into consideration, in addition to commonly used protocols such as Real-Time Messaging Protocol (RTMP). With SRT, video can be delivered through the public internet, thus avoiding the use of dedicated — and expensive — private networks. Several automatic production systems have evolved significantly and support many of the features discussed previously; these are becoming increasingly capable and production-friendly. Many sports now require the use of Instant Replay solutions, commonly known as Video Assistant Referee (VAR), where officials can instantaneously review any play from different angles. VAR requires all cameras to be perfectly synchronized and to deliver the feeds with low latency in real time. The five- and seven-camera solutions, described previously, are ideal candidates for use in indoor sports. An automated production system capable of delivering all the streams to both the broadcaster, for OTT streaming or through traditional channels, and to officials in the venue thus becomes very attractive. An automated multicamera system comprising three cameras gives officials access to five views: the program feed (virtual master camera), left and right cameras (the two physical cameras used to create the master), and left- and right-side cameras. In the case of a seven-camera position, the officials have access to nine views.

Coaches and analysts are keen consumers of videos and have completely different requirements when analyzing matches and reviewing practice sessions. However, there is a common requirement for both: they must be able to ingest the feeds directly into their video analysis tools. This is a vertical market with many tools, each one with its own features, but with different capabilities regarding the supported video formats and protocols. To fulfill these requirements, the automatic production solution has to implement a number of protocols and allow the configuration of different formats and encoding qualities. If the analyst cannot ingest the stream directly, they can use external conversion boxes to capture the video from an HDMI or SDI source. The most applicable automatic solutions will be the ones that do not require the usage of conversion devices, which at critical moments in a sports event can be potential sources of failures. Depending on the sport, analysts require specific views, which often are different from those of the broadcaster. In some sports, the preferred view is not from the center but from the short line, or both. A common requirement is to have all the players constantly in view, without replays, beauty shots, or images of the bench. In soccer, for instance, AutomaticTV has created a Tactical Camera view where the 20 players are always in view as well as, when relevant, the goalkeeper of the defending team — a view not delivered by the broadcaster, which, even in a wide master camera shot, only shows a maximum of 10 to 12 players. For this reason, top competitions deploy both a broadcast and an automatic production solution to produce specific signals to satisfy both viewers and coaches.

Tactical camera production generated for LaLiga match, Real Madrid v Eibar (June 2020)

Practice sessions have two peculiarities: there are no fixed rules and there are several groups of players practicing simultaneously. There are automatic production systems that can create several virtual cameras, framing each one on a selected part of the field and producing several output video streams. Such solutions are a better option than having the analyst recording the action, as they can concentrate on their primary role: analyzing players and teams. In summary, a production company or broadcaster looking to add automatic production to its portfolio of services will need to find a solution that can be integrated in its production workflow and which has complete color management and multicamera capabilities to allow for future growth. When comparing options, it is advisable to analyze camera quality as well as dynamic range, as there are good solutions using surveillance or industrial cameras that do not provide the image quality that would be expected on a broadcast production. <



WHITEPAPERS

Inserting AR Into the Remote Production Workflow

By Phil Ventre, Ncam, VP Sports and Broadcast

Augmented Reality (AR) provides numerous compelling opportunities for broadcasters to differentiate their coverage, especially in the areas of explanatory graphics and/or sponsorship opportunities. To date, however, such solutions have involved specific crew for rigging, equipment that requires precious rack unit space, and a lengthy set-up process, all of which translates into additional costs and time penalties for productions. Using first-generation technology, typically the load-out would involve one operator per camera, in addition to the camera person themselves. There was also a penalty in terms of equipment; our first-generation camera bar that was mounted onto the camera itself weighed 2.65lb (1.2kg) and the rack-mounted server units that accompanied each unit weighed 24.25lb (11kg). The camera bar was also required to be tethered to the server via ethernet, further decreasing flexibility and choice in deployment.



While not hugely problematic for many AR use cases, such as real-time pre-visualization on movie sets, the increased footprint that AR's deployment necessitates when it comes to live sports and events has seemed increasingly out of step with the meta trend in the industry towards remote production. Sport OB in particular started to accelerate its pivot towards remote workflows throughout the course of the past two years, driven by the twin motors of cost-effectiveness and increasing environmental considerations and/or legislation. Our analysis showed that what was required to keep AR in the spec sheets and at the forefront of future productions' plans in terms of the live sports market was:
• Miniaturization of equipment
• Lowered bandwidth requirements
• Remote operation
• Decrease of personnel
• Swift deployment
This development effort was already well underway before the COVID-19 pandemic added an urgency to remote production adoption. The resulting acceleration of the R&D cycle means that those issues have all been addressed and that AR can now be comfortably deployed as part of best-practice remote workflows due to impressive innovation in the following fields. Indeed, these apply to much of the equipment that will be considered by OB companies over the coming years as remote production workflows journey from mainstream acceptance to majority deployment.


> MINIATURIZATION Form factor is an issue when it comes to OB manifests, with smaller being better and enabling less vehicle transport. Weight is an issue when it comes to the number of crew required to maneuver kit and, in AR's case, in the feasibility of camera-mounted equipment. Progress here has been rapid. The Mk2 version of our camera bar, for instance, weighs a mere 0.64lb (290g), while the processor has been shrunk down to 1.98lb (900g). This means it can be camera mounted too, doing away with the previously required ethernet tether. This also makes AR deployable on wireless handheld units, Steadicam, spider cams, etc. It also gives productions the flexibility to use the same unit both in the studio and pitch-side.

> LOWER BANDWIDTH By moving many of the processing requirements of the AR workflow onto an onboard unit, the bandwidth requirements of the link from the camera have been shrunk by a factor of close to 1,000. This means that AR units can be added to the remote production workflow with no significant overhead on bandwidth requirements. Indeed, the raw data requirement for an AR workflow in this case is as low as 16KBps per camera. The heavy-lifting part of any AR workflow occurs at the rendering stage, when the data from the real-world image is combined with the graphics elements for output. This can occur at the production hub alongside other operations such as switching, EVS, audio mixing, etc.

> REMOTE CONTROL One of the goals of remote production is to have steadily less kit at the ground, and for that which is essential to be there to become increasingly controlled remotely by operators. Remote camera positions are becoming increasingly normalized, while kit that was previously considered essential onsite, such as EVS, is now routinely kept back at base. The same is true for technologies such as AR. This needs to fulfill set-and-forget requirements to be successful, with the solution engineered so it can be controlled — and trouble-shot if necessary — remotely from base or another cloud-based site.

> LOWERED FOOTFALL The obvious desired knock-on effect from this is a lowered need for physical presence at the site. Pre-COVID, this was primarily on a cost and environmental basis; post-COVID, this is a socially distanced requirement. Rigging and operation are increasingly tasks undertaken by the same crew, and equipment needs to be designed to facilitate a low headcount, in terms of both its set-up and its operation. In keeping with the remote operation requirement detailed above, once our AR solution/s are set up by an engineer, they can leave the site.

> FAST DEPLOYMENT If specialists have to be onsite, the current environment dictates that the sooner they can be offsite, the better. Set and forget is less of an appealing workflow paradigm if the set-up process takes hours and hours to complete. Solutions need to be engineered for fast and accurate setup and calibration. Progress here in the AR field has been rapid; with one broadcaster we are working with deploying our solution, we are able to have a virtual studio up and running in over 90 possible locations in under an hour. These and other developments mean that AR does not have to be considered as a pre-COVID addition to a broadcast that now has too much overhead and operational risk to be considered viable. Rather, broadcasters can not only add AR elements to their productions, they can ramp them up and deploy the technology safely under pandemic conditions — either purely informational or increasing advertising and sponsorship opportunities — without increasing onsite footprint. This is not only an immediate benefit for them now during the current pandemic as they look to gain traction for the returning leagues, but will prove a significant asset in helping ensure the success of the carbon-neutral productions that will be required in the future. <




WHITEPAPERS

Optimizing Timing of Signals in IP Remote Production

By Andy Rayner, Nevion, Chief Technologist

> TRANSFORMING SPORTS COVERAGE Remote production is transforming sports coverage by revolutionizing the economics and the logistics of live event production. Where substantial outside broadcast production capabilities and staff were once needed onsite to produce the main feed, most of the production can now be done from the central facilities. This means less equipment and fewer human resources are needed onsite, and a greater utilization of the resources in the central location can be achieved. Remote production has been making high-profile sports events, such as the Super Bowl, the FIFA World Cup, and the Summer and Winter Games, more cost-effective to produce at a time when the cost of acquiring broadcasting rights is soaring. Often forgotten, though, is that remote production also makes the coverage of events with smaller audiences (e.g. lower leagues or less common sports) commercially viable, thereby creating new revenue streams for broadcasters. For example, HDR — a service provider to the broadcast industry — used a remote production solution based on Nevion equipment and software to enable the professional coverage of Danish horse racing — not a sport that attracts a large audience, but one which can be profitably broadcast with the right production cost structure. The world-changing COVID-19 pandemic in the first half of 2020 significantly accelerated the move to remote production due to the need to keep the people involved as safe as possible.

> TIME IS OF THE ESSENCE A big and significant factor in the growth of remote production has been the availability of ever more cost-effective bandwidth that telecom service providers offer on their high-performance IP-based networks, which can be leveraged to better connect venues and central production. Only a few years ago, the high cost of long-haul transport (over terrestrial or satellite links) was such that only a few high-quality signals could be transported from (and indeed to) the venues. As a result, coverage had to be largely produced onsite with mainly the final program output transported to the central facility.


Now, however, it is often viable to transport all the flows needed to produce the coverage centrally, so remote production is replacing onsite production in many instances. One key distinction between onsite production and remote production is timing, especially latency and signal synchronization. These aspects are covered in detail in the rest of this whitepaper.

> KEEPING A LID ON LATENCY
Live coverage requires signals to be transported with as little latency as possible across the whole production workflow. When the coverage is largely handled on location (e.g. with an OB truck), production is self-contained and, as a result, latency is not an issue within the production environment. In this scenario, the transport back to the central location (contribution) needs to be prompt, but a reasonable delay can be tolerated — for example, the latency of satellite contribution with temporal compression can be several seconds. With remote production, the connectivity between the venue and the remote central gallery forms a crucial part of the production, and latency needs to be kept as low as possible, close to that experienced in a local production environment. Apart from the speed of light in fiber, every aspect of the signal transport needs to be optimized to minimize latency.
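As a rough back-of-the-envelope illustration (ours, not taken from the whitepaper), the sketch below estimates the one-way propagation delay in fiber for a few venue-to-gallery distances, assuming light travels at roughly 204,000 km/s in glass:

```python
# Rough estimate of one-way propagation delay over fiber.
# Light travels at roughly c / 1.47 in glass, i.e. ~204,000 km/s (~4.9 us per km).
SPEED_IN_FIBER_KM_PER_S = 204_000

def fiber_delay_ms(distance_km: float) -> float:
    """One-way propagation delay in milliseconds, ignoring equipment latency."""
    return distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

for km in (100, 1000, 9000):
    print(f"{km:>5} km -> {fiber_delay_ms(km):.2f} ms one way")
# 100 km -> ~0.49 ms, 1000 km -> ~4.9 ms, 9000 km -> ~44 ms
```

Everything else in the latency budget (encoding, packetization, buffering, switching) sits on top of this physical floor, which is why each of those stages needs to be optimized.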

> TRANSPORT OPTIMIZATION
Live remote-production workflows inherently involve the transport of signals over long distances, passing through edge equipment (e.g. for encoding and protection), routers, and processing equipment. The choice of technology and the configuration of this equipment can have a significant impact on the overall latency incurred.

> LOW LATENCY COMPRESSION
Despite the availability of more cost-effective transmission capacity, the demands of modern remote production in terms of the number of signals and the quality of these (4K/UHD, HDR, HFR, WCG) mean that, in many situations, some form of video compression is needed to reduce bandwidth requirements. Video compression is always a compromise between image quality, compression rate, and latency. As noted above, low latency is crucial for remote production, so traditional codecs like H.264, H.265, and even JPEG 2000, with their multi-frame end-to-end latency, are not ideal for this. New codecs have been developed that offer substantially reduced latency. For example, JPEG 2000 ULL (ultra-low latency) achieves a total end-to-end latency of around one frame, while keeping a compression ratio close to 10-to-1. TICO achieves an even lower latency (sub-frame) but offers a ratio of only 4-to-1, which is useful for squeezing 4K onto a 3G link but not as effective for bandwidth savings. Most recently, JPEG XS has emerged as an ideal candidate for remote production.


JPEG XS achieves pristine multi-generational compression with ratios of up to 10-to-1 and a latency of a tiny fraction of a frame. A commercial implementation of JPEG XS has been available from Nevion since August 2019 and has been used successfully for live production; for example, by Riot Games, which remote-produced from Los Angeles the League of Legends esports final taking place in Paris — 9,000 km away!
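To put those ratios in context, the small calculation below is our own illustration (the per-signal rates are nominal SDI payload figures, not numbers from the whitepaper); it estimates the contribution bandwidth needed to backhaul a multi-camera remote production under different light-compression choices:

```python
# Approximate contribution bandwidth for a set of camera feeds under different
# light compression schemes. Baseband rates are nominal SDI payload rates.
BASEBAND_GBPS = {"1080i": 1.5, "1080p": 3.0, "2160p (UHD)": 12.0}
CODEC_RATIOS = {"uncompressed": 1, "TICO (~4:1)": 4, "JPEG XS (~10:1)": 10}

def trunk_bandwidth_gbps(fmt: str, codec: str, num_feeds: int) -> float:
    """Total venue-to-gallery bandwidth for num_feeds signals of the given format."""
    return BASEBAND_GBPS[fmt] / CODEC_RATIOS[codec] * num_feeds

for codec in CODEC_RATIOS:
    print(f"20 x 1080p feeds, {codec}: "
          f"{trunk_bandwidth_gbps('1080p', codec, 20):.1f} Gb/s")
# uncompressed: 60 Gb/s, TICO: 15 Gb/s, JPEG XS: 6 Gb/s
```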

> SIGNAL SYNCHRONIZATION
An inherent part of live production is the requirement for signals from all sources to be synchronized. When doing onsite production, a local timing reference is distributed and used by all equipment. Signal synchronization in an IP remote production environment presents its own challenges because it involves multiple geographically diverse locations (e.g. the venue and the central ‘at home’ production location). The locations clearly need to be frequency-aligned (typically using a GNSS reference at each location) and phase-aligned, to enable the transit delays to be compensated for via appropriate buffering in the IP media edge (IPME) where the central production is taking place. Unlike native SDI and AES3, which only implicitly convey relative time, SMPTE ST 2110 has the inherent capability of defining absolute time through the RTP timestamp being referenced to (GNSS-derived) PTP. This means that each essence has a timestamp relating to the point of capture, which allows alignment to take place at any point in the workflow with total certainty — a real benefit in production. However, leveraging that advantage is currently hampered by the fact that, while some appliances like Nevion’s software-defined media node (Virtuoso) honor the maintenance of the origination timing in the RTP timestamps, many pieces of equipment don’t, meaning that this absolute reference is lost. A recent revision of SMPTE ST 2110-10 is now encouraging the practice of maintaining origination timing through the production chain, and hopefully more manufacturers will adopt this approach.
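As a simplified illustration of what PTP-referenced RTP timestamps make possible, the sketch below shows how a receiver could work out how long ago an ST 2110 video packet was captured. It assumes the 90 kHz RTP media clock used for video and timestamps counted from the PTP (TAI) epoch; a real implementation must also handle the 32-bit timestamp wrap-around and TAI/UTC offsets carefully, which are only hinted at here.

```python
# Simplified sketch: estimate the age of an ST 2110 video packet from its RTP
# timestamp, assuming a 90 kHz media clock referenced to the PTP (TAI) epoch.
# Real receivers must handle the 32-bit RTP wrap-around and TAI/UTC offsets.
RTP_VIDEO_CLOCK_HZ = 90_000
RTP_WRAP = 2 ** 32

def packet_age_seconds(rtp_timestamp: int, local_tai_seconds: float) -> float:
    # Current time expressed in RTP ticks, reduced modulo the 32-bit wrap.
    now_ticks = int(local_tai_seconds * RTP_VIDEO_CLOCK_HZ) % RTP_WRAP
    # Difference modulo 2^32 gives the elapsed ticks since capture
    # (valid as long as the true age is well under ~13 hours).
    age_ticks = (now_ticks - rtp_timestamp) % RTP_WRAP
    return age_ticks / RTP_VIDEO_CLOCK_HZ
```

Because both ends share the same absolute time base, this kind of calculation can be done at any point in the chain, which is what makes alignment "with total certainty" possible.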

> TIME DOMAIN CORRECTION
The broadcasting industry’s approach to live production has traditionally been that the production team needs to be handling signals in precisely the same time domain, with no perceptible delay between capture by the cameras and microphones and the production gallery. The previous sections of this whitepaper have also followed this assumption. However, recent tools like augmented reality (AR) have shown that live production chains can in fact deal with noticeable delays, and with different parts of a production gallery working in offset time domains. Building on this observation, some solutions are emerging that embrace delay as part of a trade-off to minimize the cost of transit bandwidth. The approach involves running the remote gallery production at home on proxy images (i.e. not using the full-resolution flows) in a delayed time domain, while the full-resolution processing remains onsite. The time-offset vision and audio controls from the gallery are retrospectively applied to buffered versions of the full-resolution signals at the origination site, with timing compensation to account for the transport and processing delays. This approach means that there is no need to transport all the full-resolution flows from the site to the central facilities, potentially resulting in substantial bandwidth (and cost) savings.
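The following sketch is purely conceptual — our own illustration rather than any vendor's implementation — of how time-offset gallery decisions can be re-applied to buffered full-resolution frames at the origination site once the transport and proxy-processing delay is known:

```python
# Conceptual sketch of time-domain correction (illustrative only, not a product
# implementation): gallery cut decisions made on delayed proxies are re-applied
# onsite to buffered full-resolution frames at the matching original timestamps.
from collections import deque

class OnsiteFrameBuffer:
    def __init__(self, depth_frames: int):
        # Holds (capture_timestamp, full_res_frame) pairs for the last N frames.
        self.frames = deque(maxlen=depth_frames)

    def add(self, timestamp: float, frame) -> None:
        self.frames.append((timestamp, frame))

    def apply_gallery_event(self, event_timestamp: float, known_offset: float):
        """Find the buffered full-res frame a delayed gallery event refers to."""
        # Compensate for transport delay and proxy processing time.
        target = event_timestamp - known_offset
        # Pick the buffered frame whose capture time is closest to the target.
        return min(self.frames, key=lambda tf: abs(tf[0] - target))
```

In practice, the gallery event would ideally carry the absolute (PTP-referenced) timestamp of the proxy frame it was issued against, which is exactly where the ST 2110 timing model described earlier pays off.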

This time domain correction approach will become more widespread over time, especially with cloud-based live production, which inevitably introduces latency into the production flow.

> TOWARDS DISTRIBUTED PRODUCTION
IP remote production is just one part of a more comprehensive move towards distributed production, in which all the production resources (studios, control rooms, datacenters, OB vans, equipment, cloud processing, and even people) are connected via IP networks and can be shared across locations to create content. Distributed production not only brings cost savings through better usage of resources, but also enables a total rethink of workflows — unrestricted by geography. In that sense, distributed production is the big prize of the move to IP. As with IP remote production, though, getting timing right is fundamental to the success of distributed production. <



WHITEPAPERS

Custom-Engineered Technologies and Remotely Managed Services for Large-Event Communications
By Carsten Voßkühler, Riedel Communications, Project Manager

Communications for today’s large-scale sporting and entertainment events are complex, and growing more so by the day. The technical challenges of in-venue production have been further complicated by the coronavirus pandemic and a greater need for physical distancing and remote work. A remote operations center (ROC) addresses all these challenges and offers a flexible solution for meeting the evolving requirements of live production. This model not only simplifies implementation of sophisticated real-time communications, but also reduces users’ operational and equipment costs. Leveraging centralized technical resources, a secure internal data center, and a network with global reach, the ROC enables provision of tailor-made technologies and supports comprehensive system management, freeing up users to focus on the match, concert, or other live event at the center of the production. Skilled operators remotely control, configure, and calibrate all system components in real time, ensuring maximum security, optimizing system configuration, and monitoring and maintaining audio and video quality. The ROC also eliminates the need for extensive travel by technicians and engineers, reducing hospitality and transportation costs — as well as the user’s carbon footprint — while enabling implementation of appropriate distancing protocols. The facility can support a single event or a dozen simultaneous events, with just one team of specialists working to maintain consistent communications capabilities and quality throughout every event. Proactive monitoring at all times ensures that systems are ready to go as the match or show kicks off.

Remote control and support for live sports applications


> A CLOSER LOOK AT THE ROC
With state-of-the-art communications and networking equipment, the ROC model is setting a new standard for the production of cost-effective, high-quality audio and video transmissions for live events. Staffed 24/7 by a knowledgeable team and equipped with the latest gear, tightly integrated to ensure reliability, efficiency, and security, the ROC has the flexibility to address multiple applications and to adapt quickly — even instantly — to the end user’s changing needs. Typical system components within the ROC include scalable, networked solutions for reliable communications and audio signal distribution, as well as integrated media signal distribution and processing. Multifunctional user interfaces that offer varied functionality (much like apps do on smartphones) give operators intuitive controls and just the right capabilities for different applications. Operators at the ROC configure, monitor, and maintain these systems along with antennae and wireless intercom systems deployed at the event venue. Centralized monitoring and control with custom-engineered intercom technologies ensures that end users enjoy flexible, crystal-clear communications from any point within the venue. Because every component is designed with full redundancy, the ROC is well-prepared for any potential system failure. Generator backup ensures that equipment at the ROC can operate independently of the power grid, in turn enabling continuous monitoring even if a power outage occurs.

> USE CASE: COMMUNICATIONS IN LIVE SPORTS
The ROC provides remote control, monitoring, and quality assurance during production for live sports such as football and motorsports. The setup enables crystal-clear communications and sharing of audio, video, and data information between the ROC and users onsite, such as referees, teams, and engine suppliers, and even allows remote access to onsite equipment such as Riedel’s Artist digital matrix intercom, Bolero wireless intercom,


and MediorNet media signal distribution and processing system. The operators in the ROC can control and configure all the system components such as aerials and radio equipment in real time, whether for one venue or for dozens. Thanks to proactive monitoring, operators are notified as soon as any piece of equipment fails. As a result, they can perform instant troubleshooting, or even make an onsite repair before onsite staff become aware of an issue. On match or race days, the operators in the ROC make a range of adjustments to achieve the best possible sound quality. By filtering unwanted noise (crowds, engines, etc.) from the signals and by accommodating the varying volumes of speakers’ voices, intercom technicians can create the perfect audio mix.

> USE CASE: REMOTE MASTERING FOR A LIVE-STREAMED CONCERT
With the benefit of a worldwide MPLS backbone, hundreds of local partners, and local peering points, a ROC can support a wide variety of remote and distributed production applications. Serving as the communications and signal transmission hub, the ROC facilitates flexible collaboration between geographically distributed sites. One such use case is remote mastering for a live-streamed concert. To unite performers, a mixing team, and a remote mastering studio, the ROC not only acts as a service provider, but also performs signal configuration, transmission, and monitoring to ensure streams move smoothly between venue and studio. Technicians at the venue can quickly connect a compact plug-and-play live event recording, mixing, and broadcasting system to the ROC via a WAN connection, then turn their focus to the music and let experts at the ROC worry about setup, communications, and signal flow. This model allows musicians, mixing and mastering experts, and communications engineers to work together smoothly, even when they are a world apart, to deliver a perfect sound experience to a global audience.

Remote signal configuration, transmission, and monitoring for live music productions

> USE CASE: CHARITY AND SPECIAL EVENTS
The coronavirus pandemic has driven visitors away from museums, zoos, and other institutions that typically welcome a steady stream of visitors.

As these organizations work to stay connected with their members and the larger public, the ROC offers a fun, simple, and economical solution: a live stream from a special event. The resources afforded by the ROC can, for example, help a local zoo share a special event via a live stream. Expecting a healthy birth from a notable resident, the zoo need only have a series of CCTV cameras installed, connect to the ROC, and let technicians monitor and control cameras 24/7 as the event unfolds. Because the ROC houses almost all technical resources, as well as a team to configure, monitor, and maintain the live stream, the burden on the zoo (or similar institution) remains small. Given the popularity of cute newborn animals, this one-time investment in a special event can boost visibility and remind people to visit in person when safety permits.

> SUMMARY
As these three use cases demonstrate, the ROC can deliver value in many different ways. Providing the systems, service, and knowledge critical to managing advanced communications and signal transmission, the ROC makes it easy for people and organizations to implement sophisticated solutions that help to drive their business. The ROC enables a full-service model that reduces the personnel and logistical production effort onsite while ensuring transmission security and high service quality. While this approach is ideal for a world in which remote work has become a must, it is valuable in any production environment that can benefit from professionally managed communications based on sophisticated intercom and signal transmission technologies. <



WHITEPAPERS

Resilience and Innovation Define the Return of Live Sports
By Michael Darer, Signiant, Content Marketing Manager

For many, the cancellation of live sports was the first sign that COVID-19 was going to be more disruptive than everyone had hoped. Though an onslaught of similar cancellations followed in the ensuing weeks, the mid-March suspension of play for the NBA felt like something of a shot across the bow for sports leagues in the U.S. and beyond, with the shock peaking with the March 24 postponement of the summer games. As the months have gone on, many prominent leagues are now returning to some level of normalcy, though what that will ultimately look like varies depending on the parameters of a given sport. Broadcasters, too, are asking themselves how they can produce live sports safely, navigating the minefield of COVID-19 and the demands of an anxious and starved audience. While each league is putting its own plan together, one theme remains common: there will be a lot fewer people present, and that includes production crews.

> THE STATE OF THE (LIVE SPORTS) UNION
After weeks of questions, the M&E industry began to get answers as to what the return of live sports would look like in early May. International leagues paved the way, with Taiwanese baseball firing up in April, the return of South Korea’s KBO League (quickly followed by a deal with ESPN allowing the network to broadcast six games a week), and Germany’s Bundesliga springing into action with seven games beginning on May 16. In each of these cases, the return strategies typified what the world has seen and heard from other organizations, including many long-anticipated conditions, such as the prohibition of spectators during games.


Although months have passed since then (and the conversation is now more heavily focused on larger U.S.-based leagues such as MLB and the NBA, both of which resumed their seasons in late July), many of the defining features of those early days remain front and center, both on the organizational side (when the NHL resumed in August, it restricted all of its games to Toronto and Edmonton, much as NASCAR limited races to two locations when it picked back up on May 17) and for broadcasting. The strategies that production teams and leagues developed at the beginning of the summer thus continue to present unique opportunities and unique challenges.

> REMOTE PRODUCTION PAVES THE WAY
While remote production has been a growing force in media, it wasn’t until this outbreak that “reality” became “necessity,” and “necessity” slowly began to look like “advantage.” For live sports, broadcasters devised strategies to shrink the number of production personnel onsite, transferred larger amounts of content to offsite teams, and added more pre-recorded content and advanced graphics, supported by dispersed teams of remote editors. Signiant has a unique lens into this, as its software products are widely used across live sports production, and the platform has seen a massive surge in usage in recent months as these remote workflows have been put in play. Implementing highly advanced sterilization tools, leveraging video conferencing, and coordinating staff to limit density and maximize efficiency, major broadcasters like Fox Sports, BT Sport, and NBCUniversal have made major strides. Faced with complications, sports and production organizations alike have gotten creative. For instance, the totally remote NFL and MLB Drafts invested in a hefty amount of pre-recorded content — including PSAs and musical acts — which kept the broadcast flowing and captivated fans despite the unorthodox circumstances. And it’s not just about making do. Many broadcasters have found that the current conditions actually allow for exciting experimentation and reveal previously untapped opportunities.


Without crowds in the bleachers, Fox Sports, for instance, discovered new camera angles that previously wouldn’t have been possible, along with using high-speed, custom drone cameras to help reinvigorate NASCAR fans. For all these innovations, leagues and broadcasters are also aware of the challenges that will no doubt arise, especially around bandwidth. During SVG’s own Sports Content Management Virtual Series, Grant Nodine, SVP of technology with the NHL, discussed the difficulties around moving content efficiently when play is restricted to limited locations. As he explains it, with three games in a day, production teams will have to be quick on their feet, moving footage of one game quickly into archives, then immediately preparing to send pre-recorded footage and relevant graphics for the following match back to the trucks. In these conditions, ordering workflows, leveraging intelligent transfer software that optimizes throughput, and staying on top of what content will be needed when will prove vital.

> INNOVATING THE FAN EXPERIENCE
As effective as broadcasters have been at addressing many of the production challenges posed by a totally remote model, there are still some questions that have yet to be answered, one of which is how the lack of fans affects the viewing experience. Although there was an established consensus among both fans and leagues that, when live sports came back, the bleachers would have to remain empty for some time, it’s likely that many underestimated the impact this could have on the viewing experience. When the Bundesliga resumed play, one of the main complaints from otherwise enthusiastic viewers was that there was something strange about seeing a sport — so long defined by its passionate, no-holds-barred fanbase — played on a pitch with completely empty stands. Recently, broadcasters have turned to a number of creative avenues to try to rectify this. Some games gave fans the option to purchase a cardboard cutout of themselves that would be placed in the bleachers (with one unfortunate man almost getting his fake head taken off by a home run ball).

A more interesting approach, however, can be seen on Fox Sports’ broadcasts, where stadiums are populated by Sim-like “virtual fans” that can alternate basic movements to better simulate a crowd. This isn’t the first time this solution has been discussed, either. Back at the beginning of the summer, OZ Sports, an Icelandic sports technology firm, came forward with an AR solution known as OZ Arena, which could add digital fans to a stadium, complete with crowd-sourced audio. OZ Arena could be deployed at a broadcasting facility, an OB truck, or at the stadium directly, giving viewers the sense that those bleachers were as packed as ever, bringing back the rush of the live sports experience.

> LIVE SPORTS PRODUCTION IS KEEPING ITS HEAD IN THE GAME
The chaos of the last few months has put a major strain on live sports leagues and the broadcasters they work with, but — as we see the return of play — it’s clear that none of these organizations is planning on just lying down. With each new challenge the pandemic has posed, production teams have pushed themselves to innovate, discovering new ways to bring content to fans while staying safe. Remote production has become the name of the game for live sports — now more than ever — and the benefits that the strategy is revealing will likely extend long after this period comes to a close. Leveraging intelligent file transfer solutions alongside a host of new and exciting technologies, these enterprises are getting back in the game with a confidence that’s as impressive as it is exciting. While it’s extremely likely that the landscape will keep shifting, it’s equally apparent that as it does, these businesses will keep innovating, constantly looking for the next new idea, the next play, the next way to bring it all home. <



WHITEPAPERS

Virtualization And Orchestration in IP Live Production Systems By

Hugo Gaggioni, Sony, Chief Technology Officer

Deon LeCointe, Sony, Head of IP Technology and Live Production Systems

Ryoichi Sakuragi, Sony, Solutions Architect

> INTRODUCTION
For a long time, the content creation and TV broadcasting industries have been on an evolutionary path of technical and business transformations. However, in recent years, broadcasters have experienced business challenges very different from those of the past:
• The growing demand for more content
• Increasing competition for viewers and, consequently, pressure on traditional revenue streams
• Rising costs of rights acquisition for premium media events
• Pressure for workflow efficiencies and an optimized production infrastructure
These challenges have resulted in broadcasters’ strong demands for managed operational costs, higher productivity, better use of resources, more sophisticated production values, and the adoption of newer technologies for higher-quality content (UHD, HFR, HDR, WCG). Solutions to these challenges are beginning to take shape with the transition from SDI to IP technology. The broadcast and media production industries are now looking forward to using products and services based on SMPTE ST 2110 and its companion AMWA NMOS standards and specifications for multi-vendor interoperability of compliant products and systems. The next paradigm shift in the IP Live evolution is the introduction of Virtualization and Orchestration. Sony is making live production more agile, flexible, and cost-effective with its end-to-end IP Live solutions based on these open standards and specifications.


The IP Live Production system optimizes the use of studio facilities, control rooms, and Outside Broadcast operations with remote production and sharing of production resources. Virtualization means that the operations of Remote Production across facilities, venues, and Outside Broadcasting (OB) can be greatly expanded through resource sharing, centralized monitoring, and device configuration and control, with a consequent increase in production activity. Orchestration for live production dynamically aggregates, configures, and schedules the usage of production resources and workflows, as well as coordinating and monitoring network resources under SDN control across multi-vendor network topologies. The purpose of this White Paper is to give an overview of Sony’s IP Live Production system, which enables end users to create shared production environments and build dynamically configurable networks under automated production workflows, along with scheduled resources and services, resulting in a full virtualization of resources on demand.

> SONY’S VISION: HIGH QUALITY LIVE PRODUCTION FROM ANYWHERE
With the continuing demand for the creation of more live content, broadcasters are facing the challenge of producing very high-quality remote events utilizing the processing capabilities of their broadcast center. This enables content creators to maximize their production efficiency and minimize costs so they can produce more events within the same budget. In addition, by extending the processing infrastructure — originally designed for the main production center — to remote venues and production sites, considerable savings can be achieved by provisioning as many sources as needed through resource-sharing operations. The figure below depicts three different scenarios that illustrate Sony’s vision of high-quality live production from anywhere, enabled by the IP Live Production system.

Figure 1: IP Live enables various kinds of solutions for Optimized Workflow Efficiency

The IP Live Production system enables studios, control rooms, and production equipment to be shared within a facility for more efficient use of production resources. In addition, it is highly scalable, IP switch vendor agnostic, and complies with international industry standards and specifications: SMPTE ST 2110, AMWA NMOS, and the EBU Tech 3371 and JT-NM TR 1001 recommendations. The combination of control room(s) and studio(s) can change dynamically under the control of an end-to-end facility management system. This system can change all aspects of the production resources (e.g., system format, resolution, booking of devices, and services under a production schedule), as well as control all network elements. Also, changes of tally and/or device-name indications are accomplished using the broadcast controller in accordance with the operational actions directed by the facility end-to-end orchestrator.

The use of IP Live for remote productions leads to a reduction of production staff at the remote site — with more events/shows produced on the same day by the staff that remains at the main broadcast center. The recent introduction of Sony’s HDCE-TX30 (IP camera extension adaptor) enables full control, returns, intercom, and tally information for the remote cameras without CCUs. This is accomplished in compliance with the SMPTE ST 2110/AMWA NMOS specifications, and with no operational compromises at the remote site. Also, full protection of the transmission flows is guaranteed using SMPTE ST 2022-7 (hitless failover) between the remote sites and the main production facility. In the case of more elaborate inter-facility productions with resource sharing, the facility management system can treat multiple production locations as extensions of the resources and processing capabilities in existence at the broadcast center. The benefits of IP Live remote production can be extended to the connection of OB vehicles at the remote venue, enabling the sharing of resources between home and away OB operations. Production facilities at the remote site can access any sources in the network, increasing the number of camera chains, slo-mo feeds, or switcher resources. This is accomplished using the network orchestration and SDN control system. The entire group of IP media streams can be routed to any destination using the broadcast controller in combination with the facility management system and the network orchestration and SDN control system.

> SONY’S IP LIVE PRODUCTION PLATFORM
With SMPTE ST 2110 and AMWA NMOS standards now fully implemented into Sony’s live production products, a new strategic partnership has been established with IP media network specialists Nevion for the creation of a new end-to-end IP Live Production Platform that fully integrates these standards with the technologies of Orchestration and Virtualization within a multilayered software solution. The resulting solution platform can be seen in the figure below. This platform consists of:
1. An enterprise-wide facility management layer – Sony’s Live Element Orchestrator
2. A broadcast control layer – Sony’s IP Live System Manager
3. An orchestration and SDN control layer – Nevion’s VideoIPath
4. A portfolio of hardware/software media creation products, all equipped with IP interfaces complying with SMPTE ST 2110 and AMWA NMOS
More details on the control and processing layers of the IP Live Production Platform are given in the following sections.

1. Facility management system – Sony’s Live Element Orchestrator (LEO)
This management layer is a powerful and customizable live production orchestration software that performs the dynamic aggregation, configuration, monitoring, workflow automation, and scheduling of production resources and services (including multi-vendor products). This layer is responsible for the Virtualization of production environments on demand, with a reduction of downtime during system changes, thus contributing to improved productivity in content production. It provides overall system management and monitoring of an IP-based production system by supporting major industry protocols, for both Sony and third-party products and services. This layer performs orchestration and execution of changes to the enterprise-wide system, with configuration and control of machine-room resources — which can be shared among multiple studios and production sites (creation of a service). The system changes can be carried out according to the program schedule or ad hoc, and communication with third-party scheduling services is also supported. <

To read this White Paper in its entirety, CLICK HERE



WHITEPAPERS

The Game is Back On: Rethinking the Sports Broadcast Workflow By

Truman Wheeler, Studio Network Solutions, Marketing Content Creator

Melanie Ciotti, Studio Network Solutions, Marketing Strategist

The world of sports suddenly froze in Spring 2020, but the thaw has begun. First came the titans of European football, then the heroes of American soccer. By August, most franchises were resuming play or planning their upcoming season with coronavirus-related restrictions. And as professional sports leagues gradually return to the ballparks, pitches, and rinks of the world, so too do the sports media professionals tasked with capturing each game. Everything is different about live sports this year, save the rulebook. Sports broadcasters and production teams are finding new ways to adapt to an ever-changing environment. Most prominent and profitable among these changes is the new fan experience.

> REINVENTING THE AUDIENCE EXPERIENCE
The goal of most sports broadcasts is to capture the experience of being in the stadium and deliver it in an authentic, convincing way. Fans watching at home expect to feel the pressure of a penalty kick, the tension of bases loaded, the euphoria of a buzzer beater.

They want to go through the same emotions as their in-stadium counterparts. But with every fan watching from home, sports broadcasters can no longer aim to give viewers a true-to-life stadium experience. A quiet, empty stadium simply isn’t the experience viewers crave when watching live sports.

> AVOIDING THE AWKWARD In 2015, civil unrest in Baltimore led to what many fans prior to 2020 would describe as one of the strangest professional baseball games, ever. Fans at home watched a quiet contest in a colossal ballpark where only the announcers’ commentary provided a sense of normalcy for viewers. It was eerie, awkward, and almost dystopian. Players and fans alike were eager for the next game in a crowded stadium. Thankfully, they didn’t have to wait long to start filling the stands once again. In 2020, that eerie emptiness has lingered for months. Tournaments and entire seasons are being played without fans. Quite frankly, the traditional live sports experience doesn’t exist this year. Instead, broadcasters and production teams need to get creative to avoid the awkwardness of emptiness.

> CREATIVE BROADCASTING SOLUTIONS
Sports broadcasters are tasked with creative reinvention of the way fans enjoy professional sports. The teams that create the most enjoyable experience for fans watching from home can expect a major competitive advantage. Digital effects are driving creative change in 2020. For example, La Liga used a digital mesh of colors to create a lively crowd effect and add ambience to its television broadcasts. While most of the downtime between innings, plays, and hydration breaks goes to advertising — which plays a huge role in offsetting lost ticket revenue — some breaks in play are too short for advertisements. Replays, game analysis, and digital effects like goal-line technology and golf-ball tracking can help an audience feel more in tune with the game and improve their overall viewing experience. Fans still want to be part of the game, too. Teams, leagues, and networks are trying myriad ways to showcase real fans during live broadcasts. The National Hockey League used large LED screens above the benches to show fans at home and produce novel picture-in-picture moments. Major League Soccer live-streamed fans on the jumbotron and used the stands for supplemental advertising space. Several sports broadcasts have featured cardboard cutouts of fans to fill their would-be sold-out seats.

Figure 1: Stadiums were left empty as sporting events around the world were cancelled, postponed, or played behind closed doors for most of 2020.



While the verdict is still out on which league, sport, or network got the mix just right, it is clear that creativity and ingenuity to enhance the spectator experience will certainly pay off in the COVID-19 era.

> THE IMPORTANCE OF STADIUM SOUND
As the industry works through this tedious trial and error, one thing has become crystal clear: sound matters. As reported by multiple sources, most professional sports leagues are sourcing stadium audio from game developers like Electronic Arts and Sony Interactive Entertainment. Both have built massive libraries of authentic game audio for their sports simulators that work great for live sports broadcasting as well. Stadium sound is not only affecting the audience. It’s impacting the game itself. When soccer returned in the U.S., referees were reminded that the sound of player-to-player contact should not affect their calls, as both the sound of impact and player reactions are amplified in a fanless stadium. In baseball, the subtle squeak of the catcher’s shoes as he repositions for a pitch — something hitters typically can’t hear in a packed stadium — may now be affecting the hitter’s decision to swing at an outside pitch. While there likely won’t be a solution to these on-the-pitch sound effects until fans can once again return to stadiums, broadcast and sports production teams can supplement the broadcast sound with high-quality crowd audio for a better at-home experience.

Figure 2: Social distancing guidelines have made the use of remote broadcast technology more commonplace.

> REMOTE WORKFLOWS FOR SPORTS BROADCASTING
Social distancing guidelines pose unique challenges to media crews covering live events. Broadcasters have to consider not just the on-screen nature of their programming, but the entire sports broadcast workflow behind the scenes as well. Commentators often have enough room in their broadcast booths to keep a safe social distance from one another. The remainder of the crew, however, may need to make big changes to adhere to social distancing guidelines. The close confinement of a production truck or in-stadium booth can spell trouble for large crews, or small- to medium-sized crews for that matter. If teams don’t have access to remote workflow technology, they may be forced to scale back their production. Fortunately, remote workflows have proven successful in sports broadcasting and allowed production teams to carry on with a much smaller and safer in-stadium footprint.

> BETTER WORKFLOWS START WITH BEING ORGANIZED
Building a successful remote workflow for sports broadcasting requires a storage infrastructure that can handle live VFX, additional graphics, and expanded audio libraries, as well as the ability for team members to access live footage to preview, edit, and play back from remote locations. Media asset management (MAM) is gaining importance as well. In the absence of a full schedule of live games, highlight reels have become a lifeline for teams, leagues, and networks with little other content to distribute to their sports-starved audiences. The better organized and more accessible these clips are, the more potential they have to add to the program’s success. For a remote workflow to be successful in this new broadcast environment, production teams need remote access to their media and their MAM at all times, both during live broadcasts and for postproduction and syndication thereafter.

The Benefits of a Remote Workflow with EVO
At Studio Network Solutions, we built remote access into our shared storage solutions to ensure broadcast, VFX, and post-production teams could continue creating and delivering content as they adapted to new protocols brought on by the pandemic. In March 2020, before the first stay-at-home orders by any state in the U.S., SNS released Nomad, a remote editing utility for EVO. Nomad allows remote editors to download proxy footage and source media from their in-facility shared storage to their remote workstations for offline editing. Proxy files are automatically generated by EVO’s transcoder as it ingests media, and Nomad retrieves these files in the same folder structure from EVO to ensure a smooth relinking process back online for the final export. By enabling access to auto-generated proxies, Nomad helps teams working with high-resolution footage spend more time editing and creating, and less time transcoding proxies or waiting for massive source media files to download over a remote internet connection.

> REMOTE ACCESS TO SHARED STORAGE
Nomad works with any remote connection to the network storage, most commonly via a VPN or remote desktop application. For those who do not have remote access set up yet, we created a secure, cloud-hosted virtual private network exclusively for EVO called SNS VPN. This service enables continuous remote access to all the workflow tools built into EVO, including ShareBrowser media asset management, the Slingshot automations engine and API, and Nomad for remote editing.

> A NEW WAY FORWARD
The sports broadcast industry — largely without work for months due to postponed and cancelled seasons — has been ramping up to an exciting end of the year. As teams, leagues, and networks all battle the continued threat of delayed seasons, broadcasters and production teams are doing their part to get us back in the game. That means more creativity with simulated crowd noise and digital fan overlays, new remote and hybrid workflows, and uncomfortable but necessary change. Whatever the new way forward is for your sports video team, remote workflow technology is at the top of its game, and we’re ready to play ball. <



WHITEPAPERS

Virtual Advertising: Cutting Through the Regulation

By

Alex Kelham, Lewis Silkin LLP, Partner, in conjunction with David Patton, Supponor, VP Business Development

This paper proposes a way forward for organizations seeking to implement virtual advertising solutions. Although there is a complex legal framework, with differing positions around the world, we believe it is useful to develop some broad principles in this emerging regulatory area to support the development of the Virtual Advertising sector. Virtual Advertising has been used in various forms for many years. For example, the digital insertion of on-field logos at sporting events has been experimented with for a couple of decades now. Only recently, though, has technology advanced to the extent that it is possible to digitally overlay advertising boards in a way that does not impinge upon the live action. It is also now possible for multiple broadcast feeds to be distributed to international markets, each displaying different virtual advertisements. As such, a number of rightsholders now exploit virtual advertising to allow tailored advertising to be sold on virtual perimeter boards to different territories or regions. This presents opportunities for advertising to be segmented and, for example, allows:


• Several same-category regional sponsors of an event to secure advertising on the broadcasts, shown in ‘their’ region only;
• A global sponsor to be offered the ability to present different messages in different territories to reflect language, cultural, or taste differences; or
• Virtual perimeter boards offered to the general market to be sold to local brands targeting their local markets only.

The result is the ability to increase advertising revenues. But, while the tech is ready to go, one of the barriers to wide adoption of these new virtual advertising solutions has been concern about regulatory issues. This is not a simple regulatory landscape. Much of the relevant regulation is old and does not anticipate this type of technology, and when taking a global approach the thought of taking legal advice in dozens of territories is eye-watering. However, by understanding the issues and the objectives of the regulation, and taking a sensible, risk-based approach to implementation, we believe a way forward can be found. There are, of course, many national variations, but broadcast regulation in respect of commercial references in editorial programming generally has four key aims wherever you look around the world: avoiding over-commercialization of editorial programming; ensuring broadcasters’ editorial independence and freedom from commercial influence; ensuring viewers are not misled as to what is advertising and what is not; and protecting viewers (or specific audience groups) from inappropriate and subliminal advertising. These aims tend to manifest themselves in rules on commercial references in programs and ‘product placement’ rules in the following ways:
• Avoiding over-commercialization: controls on ‘undue prominence’ of advertising in programs. (Note that in-program advertising does not normally count towards limits on the quantity of advertising minutes that can be shown in a period, but this should be checked on a territory-by-territory basis as there are anomalies.)
• Editorial independence: rules to ensure that the advertising is appropriate/natural in the context of the program in question, is not focused/dwelled upon inappropriately, and that producers/broadcasters retain ultimate editorial control;
• Ensuring viewers are not misled: transparency rules, banning ‘surreptitious’ advertising, and requiring notices to be given at the beginning and end of programs to highlight that product placement has been included; and
• Protecting viewers: specific bans on product placement or commercial references in certain types of programming (e.g. news or children’s programs), or in respect of certain types of product (such as alcohol, tobacco/e-cigarettes, ‘unhealthy’ food and drinks, gambling, and medicines). Subliminal advertising is also generally prohibited.
The content of advertising is controlled globally through a combination of legislative mechanisms and ‘self-regulation’ via industry codes of practice, which are adopted and enforced by advertising-industry-appointed bodies. Above and beyond that, most sports will have rules, whether at an international, regional, national, and/or league/competition level, which govern field-of-play and kit advertising. Some will deal specifically with virtual advertising. For example, in football/soccer, the “Laws of the Game” provide that “No form of commercial advertising, whether real or virtual, is permitted on the field of play, on the ground within the area enclosed by the goal nets or the technical area, or on the ground within 1m (1yd) of the boundary lines from the time the teams enter the field of play until they have left it at half-time and from the time the teams re-enter the field of play until the end of the match.” Clearly, sports rights owners will need to ensure that any use of virtual advertising complies with such rules. The situation is further complicated by the ongoing rollout of data privacy legislation and the sheer complexities of broadcasting global sport. As regulation is generally country-specific, this raises the question: whose laws apply? Is it the country where the relevant sports event is taking place? The country where the sports rights owner or agency (implementing virtual advertising) is based? The country where the relevant digital ads are inserted into the broadcast feed? The country for which a specific advert or virtual feed is targeted? In fact, this is probably the hardest legal issue to address, particularly given that the answer will vary depending on which of the different types of regulation mentioned above is under consideration, and possibly on where the enforcement action is being brought. This paper therefore proposes an approach which we believe is likely to be compliant in most jurisdictions, such that the question of ‘whose regulations apply’ becomes secondary. Below is a set of low-risk principles we would propose the industry could adopt with a view to:
• Minimizing regulatory challenges; and
• Paving the way for low-risk exploitation of the opportunities presented by virtual advertising.
Sports rights owners will take a view on the level of risk they are willing to take in each given scenario. This will be influenced by a whole host of factors including PR considerations, likely exposure to contractual liability, the type of advertising, where it will be seen, and relationships with their broadcasters. Getting a view from local broadcasters as to whether they have any concerns about the proposed use of virtual advertising is likely to be very influential.

> PRINCIPLES OF ENGAGEMENT
1. Sports-specific rules and regulations which apply to the event being broadcast should always be fully complied with.
2. If a broadcast feed is being produced for a specific country, the laws of that country should be complied with.
3. Data protection laws should be considered carefully and complied with if personalized advertising is being used in online transmissions.
4. For multi-jurisdictional feeds, the following principles are proposed with a view to ensuring compliance with the majority of broadcast and advertising laws and regulations around the world:
• The virtually inserted advertising should be inconspicuous, i.e. it should appear ‘normal’ to viewers who are used to advertising appearing in the context shown in the sporting event.
• The virtually inserted advertising should not be unduly prominent during a program where there is no editorial justification for this.
• Notices should be given at the beginning and end of the program, and between breaks, to state that virtual advertising/product placement is included.
• Advertising for cigarettes, tobacco products, e-cigarettes, prescription-only medicines, political advertising, weapons, escort agencies, pornography, and other types of advertising that are banned and/or likely to bring the event into disrepute if advertised should never be inserted into virtual advertising feeds.
• Advertising for other highly regulated products and services such as alcohol, betting, fantasy gaming, ‘unhealthy’ (high fat, salt, sugar) food and drinks, infant formula, and highly regulated financial products such as payday loans should be treated with caution. Different territories will obviously take different approaches.
• Third-party rights should be respected. A virtual ad should only be placed on to land/buildings or other third-party property: where the land/building owner has consented; where no regulatory consents would be required if the advertising were actually there (or where consent from the relevant regulator has been obtained); and, if the virtual advert is replacing physical advertising, where the advertiser has no contractual or other right to object.
• The content of advertisements included in virtual feeds should not be promotional (i.e. it should not include an active call to action to purchase, or promote sales, offers, or consumer promotions). It should generally only be brand-, slogan-, or product-based advertising.
• A contractual obligation should, wherever possible, be placed on the advertisers to ensure the adverts they supply otherwise comply with advertising law and regulation in the territories where the ads will be shown.
This is not an exhaustive list, and there are many entities who will accept greater degrees of risk for the appropriate reward. But in the absence of any international regulation it is a useful framework to work with to further establish the success of Virtual Advertising on a global scale. <

To read this White Paper in its entirety, CLICK HERE SPORTSTECHJOURNAL / FALL 2020

117


WHITEPAPERS

Achieving Low Latency in 100% Software IP for Live Production By

Pini Maron, TAG Video Systems, System Architect

> IS LIVE PRODUCTION READY FOR A 100% SOFTWARE ENVIRONMENT?
Let’s face it, we all understand the value of an infinitely scalable and flexible workflow. Imagining the days when we can configure systems, increase capacity, change location, or even change the application we are working on with the click of a mouse is what being 100% digital is all about. But broadcasting is not like other industries. Yes, we like to pride ourselves that what we do is hard, but the truth is it is hard. And where broadcast workflows are the most mission-critical ($$$$$$$$) is in Live Production. And Live Production is wicked hard to deploy on 100% software, 100% IP, on 100% off-the-shelf hardware. And for good reason. Until recently, deploying a 100% software solution on commercial off-the-shelf (COTS) hardware for SMPTE ST 2110 uncompressed Live Production workflows was not possible. However, recent innovations have overcome these challenges and opened up the full potential of IP workflows, delivering on the scalability and agility promises of IP software. This report will focus on the technologies and solutions that are enabling broadcasters and production facilities to take full advantage of 100% software IP-based workflows for live applications. In particular, the report will describe how software-based systems can enable the very low latency that’s essential for high-quality Live Production.

> SMPTE ST 2110 CHARACTERISTICS AND THE CHALLENGES THEY POSE FOR COTS HARDWARE
Two unique characteristics of SMPTE ST 2110 — the standard for moving professional media over managed IP networks — are particularly difficult for COTS hardware to handle. For facilities implementing ST 2110 in software, these characteristics must be addressed in order to establish a true low-latency IP-based workflow for Live Production.


ST 2110 Characteristic 1: Massive Bandwidth
Unlike the IPTV signals distributed over low-speed consumer network connections, full-baseband production-quality signals require vast amounts of bandwidth. A 1080i signal requires 1.5 Gb/s, a 1080p signal requires 3 Gb/s, and a 4K UHDTV signal requires up to 12 Gb/s. For software-based IP solutions, the greater the bandwidth required, the more constraints on the operating system and application software.

ST 2110 Characteristic 2: Tight Packet Pacing
The second characteristic of SMPTE ST 2110 that is difficult for COTS hardware to handle is tight packet pacing. SMPTE ST 2110 specifies a Narrow Sender that requires extremely rigid pacing of packet transmission to accommodate the smaller buffer required for low latency. This pacing is on a microsecond timescale, with very small tolerances for variance. This is readily achieved on hardware-based platforms but very difficult on COTS for reasons which we will explain shortly.
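To get a feel for how tight that pacing is, the short calculation below (our own illustration, assuming roughly 1,400 bytes of RTP payload per packet, a figure not taken from this paper) estimates the average gap between packets that a sender must hold for a single stream:

```python
# Rough estimate of the average inter-packet spacing an ST 2110 sender must hold.
# Assumes ~1,400 bytes of video payload per packet, a typical choice that keeps
# each packet under the Ethernet MTU once RTP/UDP/IP headers are added.
def inter_packet_gap_us(stream_gbps: float, payload_bytes: int = 1400) -> float:
    packets_per_second = stream_gbps * 1e9 / (payload_bytes * 8)
    return 1e6 / packets_per_second

for fmt, gbps in (("1080i", 1.5), ("1080p", 3.0), ("2160p UHD", 12.0)):
    print(f"{fmt}: ~{inter_packet_gap_us(gbps):.2f} us between packets")
# 1080i ~7.5 us, 1080p ~3.7 us, UHD ~0.9 us -- microsecond-scale pacing
```

At UHD rates the sender has well under a microsecond of slack per packet, which is why general-purpose operating systems struggle to meet the specification without help.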

> LOW LATENCY DEMANDS OF LIVE PRODUCTION
The massive bandwidth requirements and tight packet pacing typical of SMPTE ST 2110 contribute to higher latency, which is fatal for Live Production applications (uncompressed workflows). With these characteristics, it was simply not possible to process ST 2110 on standard COTS hardware. This was not just a function of ST 2110, it was also a function of how standard COTS hardware does its processing. In the next section, we will discuss these challenges and the recent innovations that now enable ST 2110 on standard COTS hardware.

> OVERVIEW OF SOFTWARE AND STANDARD COTS HARDWARE
Below, for reference, is a look at the typical software and COTS (commercial off-the-shelf) hardware stack. The NIC, or network interface card, is the bridge between the hardware and software. This drawing is here in order to define some of the key elements coming up in this report.

The Software and COTS Hardware Stack



> 3 CHALLENGES IN PROCESSING ST 2110 ON STANDARD COTS HARDWARE

Challenge 1: Lack of Buffering/Steering of RTP/UDP
Because most network interface card drivers do not deliver RTP/UDP packets into the system with the flexibility of other packet types, all UDP traffic is delivered to only one CPU core. As a result, it is impossible to use all available cores and maximize the number of streams handled. The penalties here are resource waste and latency.

Solution 1: Steer RTP/UDP Traffic to Multiple CPU Cores
TAG Video Systems addressed the lack of buffering/steering of RTP/UDP by rewriting the standard network driver that deals with UDP. A specific high-bandwidth, UDP-specific network driver component was developed that enables packets from any network interface to be routed to any core, enabling the maximum architectural efficiency for the application and minimizing unnecessary and slow data transfers between cores. In rewriting that element of the driver, it is now possible to take advantage of multiple cores to make ST 2110 work at higher density and lower latency.
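For readers who want a feel for the general idea, the sketch below is our own user-space analogy, not TAG's driver-level implementation: on Linux, the SO_REUSEPORT socket option lets several sockets (one per worker process, which in a real deployment would be pinned to different cores) bind the same UDP port, and the kernel then spreads incoming flows across them. The driver-level approach described above goes much further, but the goal of keeping UDP media traffic from piling up on a single core is the same.

```python
# Illustrative sketch only -- not TAG's driver-level implementation. It shows the
# general idea of fanning receive load for many UDP media flows across CPU cores
# using the standard SO_REUSEPORT socket option (Linux): the kernel hashes
# incoming flows across the sockets that share the same port.
import multiprocessing
import socket

MEDIA_PORT = 50000          # hypothetical RTP destination port
NUM_WORKERS = 4             # e.g. one worker per CPU core

def worker(core_id: int) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # SO_REUSEPORT lets several sockets bind the same address/port;
    # the kernel then distributes incoming datagrams across them.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
    sock.bind(("0.0.0.0", MEDIA_PORT))
    while True:
        packet, addr = sock.recvfrom(2048)
        # ... hand the RTP packet to this core's processing pipeline ...

if __name__ == "__main__":
    procs = [multiprocessing.Process(target=worker, args=(i,)) for i in range(NUM_WORKERS)]
    for p in procs:
        p.start()
```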

Challenge 2: O/S Kernel Cannot Handle Tight Timing of ST 2110
While handling the demands of the application, OS, and network interface, the platform cannot always send packets with the time precision required by SMPTE ST 2110. Because there is no room for error, applications typically trade application performance for the compute power necessary to meet timing specifications. Here again, the penalties are resource waste and latency.

Solution 2: Rewrite the Network Driver
TAG Video Systems addressed the inability of the O/S kernel to handle the tight timing of ST 2110 by rewriting the network driver. The company augmented the network driver with a TAG UDP-specific network driver element that uses low-level capabilities of the NIC hardware to relieve the CPU cores of the responsibility of maintaining packet pacing. From an architectural point of view, the network driver is the correct place to do this task (rather than at the same level as the OS and application). This approach frees the application layer from extremely tight real-time responsibilities. The remainder of the network driver thus can continue to provide the capabilities for other network traffic types.

Challenge 3: Double Memory Copy
The application traditionally writes packet data to its own application memory, and from there it is copied to a local buffer on the network interface hardware. This two-step process introduces latency and consumes resources that could be used for the application.

Solution 3: Remove the Double Memory Copy TAG Video Systems addressed the timing and latency issues caused by double memory copies by simply removing the double memory copy. The solution involves using the network driver to allow the Application User Space to be mapped directly to the Network Interface Card. This means the application writes the data once, and it’s available for the network interface to dispatch without further delay. This in turn enables TAG Video Systems software to meet the tight timing constraints of ST 2110 senders.
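
TAG’s single-write mapping of application memory to the NIC is, again, internal to its driver. A loosely related mechanism available on stock Linux is MSG_ZEROCOPY, sketched below as an assumption-laden illustration of the same "write once" idea: where supported, the kernel pins the application’s pages rather than copying the payload into a separate buffer, and it reports completion later on the socket error queue (omitted here for brevity). The address and payload size are placeholders.

```c
/*
 * Illustrative sketch (not TAG's driver): avoid the second copy from
 * application memory into a kernel buffer by sending with MSG_ZEROCOPY
 * on Linux. Where supported, the kernel pins the user pages so the
 * payload is written only once by the application; a completion
 * notification later arrives on the socket error queue (not shown).
 */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <sys/socket.h>
#include <unistd.h>

#ifndef SO_ZEROCOPY
#define SO_ZEROCOPY 60
#endif
#ifndef MSG_ZEROCOPY
#define MSG_ZEROCOPY 0x4000000
#endif

int main(void)
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    int one = 1;

    /* Opt in to zero-copy transmission on this socket. */
    if (setsockopt(fd, SOL_SOCKET, SO_ZEROCOPY, &one, sizeof(one)) < 0)
        perror("SO_ZEROCOPY (needs a recent kernel)");

    struct sockaddr_in dst = {0};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(5004);                       /* illustrative port */
    inet_pton(AF_INET, "239.1.1.1", &dst.sin_addr);   /* illustrative group */

    static char payload[1200];                        /* e.g., one RTP packet */

    /* The pages backing 'payload' must stay untouched until the kernel
     * reports completion; reusing them too early would corrupt packets
     * still in flight. */
    ssize_t n = sendto(fd, payload, sizeof(payload), MSG_ZEROCOPY,
                       (struct sockaddr *)&dst, sizeof(dst));
    if (n < 0)
        perror("sendto");

    close(fd);
    return 0;
}
```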

> ACHIEVING THE FULL POTENTIAL OF 100% SOFTWARE IP IN LIVE PRODUCTION IP has ushered in the software era for broadcasters, who now can leverage 100% software solutions supporting SMPTE ST 2110 uncompressed media workflows on COTS equipment. The driver behind this evolution is to maximize the scalability and flexibility of the broadcast workflow, thus producing the highest-quality product in the most economical manner. This shift to 100% software on standard COTS hardware will bring the broadcast industry the agility and flexibility already realized by other industries. Although ST 2110 presents nontrivial challenges for software-based solutions, recent innovations have made it possible for certain Live Production applications (Multiviewing and Monitoring) to be deployed in this manner. By solving the most challenging of these applications while maintaining Live Production’s requirement of low latency, TAG Video Systems has now paved the way for other applications to follow. <



WHITE PAPERS

Telephony in the Modern Video Production Facility By Shaun Dolan, Telos Alliance, Senior Support Engineer

> BACKGROUND

While the underlying telephony transport method has undergone a shift in the last decade, the workflow in many facilities has remained largely unchanged. This white paper explores a brief history of telephony in broadcast and production facilities to identify the challenges that are sometimes taken for granted, then moves on to the solutions provided by modern Voice over IP (VoIP), Audio over IP (AoIP), and control mechanisms.

> LEGACY WORKFLOW

Audio transported over the Public Switched Telephone Network (PSTN) has long been an afterthought in some video-centric facilities. For example, while IFB and engineering coordination are critically important links between talent and their production teams, the workflow and technology used to facilitate this communication have stayed mostly the same in some operations. Many facilities still have first-generation digital telephone hybrids in service, racked up in a far, dark corner of a technical operations center, connected to Plain Old Telephone Service (POTS) circuits. Likewise, while “phoners” and other contribution audio from the PSTN are not a producer’s first choice for getting a guest on the air, they are often the only viable solution in late-breaking situations and in the COVID-19 world; a cell phone is many times the lowest common denominator between content producer and contributor. Despite broadcasters using telephones during some of the most critical moments of a developing story, the duty of converting the inherently 2-wire POTS circuit to 4-wire audio is often relegated to an ancient telephone hybrid.

> CONTROL

The control method for legacy POTS hybrids has remained largely unchanged as well. In facilities that are still using this legacy technology, it is not uncommon to see dozens of analog phones mounted to the wall or scattered on a desk, as pictured.

Figure 1: Let’s play “Find the Phone!” Each phone is dedicated to one hybrid to allow dialing out or individually answering calls. Even in the best-case scenario, an enterprising engineer has built a switching mechanism that allows one phone to be used with multiple hybrids, but only one at a time in one physical location.

> MONITORING

What about monitoring? Again, creative engineers have come up with inventive tally systems to see which hybrids are currently in use through lightboards or other means, but these systems leave a lot of questions unanswered, such as: Is the talent in the next live shot connected to the correct hybrid? If not, which hybrid are they connected to? What are the audio levels in and out of each hybrid? How can I dial the reporter manually?

> LEGACY TECHNOLOGY


POTS circuits are still pervasive in many corners of the broadcast and production industry. While most radio and audio facilities have moved away from analog lines, some circles have increasingly resisted change. The biggest pain point with POTS circuits is cost. I frequently speak to broadcasters who pay more than $50 per month for each POTS line they have. Compounding this pain of increasing POTS cost is decreased reliability. The major incumbent telecom providers are racing to abandon copper phone networks. Even where “real” copper POTS lines are still available, these providers are running out of technicians who know how to maintain them. The notion of a POTS line being more reliable than an IP-based circuit no longer holds true. While many content producers have moved to VoIP, some still use POTS because it “still works.” While this may be true for now, the sunset is gradually and painfully coming. Many facilities have moved to VoIP for their backend, but use Analog Telephone Adapters (ATAs) to connect to their aging hybrids. While this temporarily reduces telecom costs, these facilities miss out on the better call quality, higher reliability, and modern control options offered by modern broadcast telephony solutions.

> A PARADIGM SHIFT While these telephony workflows and technologies in a broadcast facility were merely products of their time, it is important to reevaluate them in the context of modern times. New solutions are now available to remove the roadblocks inherent to legacy technology and take advantage of modern VoIP, AoIP, and control systems.

VoIP Voice over IP has been steadily increasing in popularity since the wide adoption of Session Initiation Protocol (SIP) in the mid-2000s and has reached critical mass. It is rare to find an office with anything but SIP phones on desks at this point. This same technology can bring similar benefits to broadcasters. The initial draw to VoIP, for many, is the cost savings. Broadcasters I speak with regularly save 50-75% on their recurring telecom expenses when they switch to VoIP. Not only do telecom costs drop dramatically when switching to VoIP, but reliability also skyrockets. Telecom companies are actively maintaining and growing their IP backbone. What’s more, providers can offer divergent paths and redundant circuits with automatic failover in ways that were just not possible with POTS. Another excellent feature of VoIP is the granular troubleshooting it affords. The widely accepted SIP protocol, standardized by the Internet Engineering Task Force (IETF), includes more than 40 different error messages that can tell you why a call was not completed. The SIP messaging and audio flows can be analyzed by tools such as Wireshark, allowing an engineer to see precisely what is going wrong. This gives us far more information to get the problem resolved faster; gone are the days of calling your provider with no more information than “there’s no dial tone.”
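
As a small illustration of that granularity, the sketch below (illustrative only, not part of any Telos product) maps a handful of well-known SIP response codes from RFC 3261 to plain-language meanings of the kind an engineer can read straight out of a Wireshark capture; the selection of codes is an assumption for the example.

```c
/* Illustrative only: a few of the SIP response codes (RFC 3261) that a
 * packet capture exposes when a call fails, grouped by response class. */
#include <stdio.h>

static const char *sip_response_meaning(int code)
{
    switch (code) {
    case 180: return "Ringing (provisional)";
    case 200: return "OK (call accepted)";
    case 404: return "Not Found (unknown user or number)";
    case 408: return "Request Timeout (far end never answered the INVITE)";
    case 486: return "Busy Here (callee is busy)";
    case 503: return "Service Unavailable (trunk or server overloaded/down)";
    default:
        /* Fall back to the class digit: 1xx provisional, 2xx success,
         * 3xx redirection, 4xx client error, 5xx server error, 6xx global. */
        switch (code / 100) {
        case 1: return "Provisional";
        case 2: return "Success";
        case 3: return "Redirection";
        case 4: return "Client failure";
        case 5: return "Server failure";
        case 6: return "Global failure";
        }
        return "Unknown";
    }
}

int main(void)
{
    const int observed[] = { 180, 486, 503 };   /* e.g., codes pulled from a capture */
    for (unsigned i = 0; i < sizeof(observed) / sizeof(observed[0]); i++)
        printf("SIP %d: %s\n", observed[i], sip_response_meaning(observed[i]));
    return 0;
}
```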

> MODERN WORKFLOW Multiple AoIP protocols are now baked into nearly all relevant gear in a production facility, including VoIP-to-AoIP gateways. However, what had been missing from these simple gateways was a method of unified control. Without this critical control layer, VoIP-to-AoIP gateways are used essentially within the same confines as the legacy workflows discussed earlier. Pictured is an example of a custom HTML5 user panel that allows viewing of line status, caller ID, audio levels, and audio routing. It also allows a user to dial any IFB and talk back to any IFB. In this way, full control of an entire facility’s telephony system can be realized. This is accomplished through the Telos VX Enterprise, a VoIP multi-line call-handling engine for broadcast, and Axia Pathfinder Core PRO, a full-scale orchestration platform.

Figure 2: A modern facility-wide IFB control panel

> BEST PRACTICES For anyone interested in modernizing their facility’s telephony service and workflow, we can suggest a handful of best practices that can help ensure workflow cohesion and flexibility in the future.

VoIP Provider Partner with a VoIP provider and product offering that suits your needs. If availability of telephone IFB and contribution audio is not optional for your productions, consider ordering a dedicated high-Quality of Service (QoS) SIP trunk circuit. While higher-priced than other solutions provided over the top of internet circuits, these dedicated circuits typically include an SLA and guaranteed bandwidth, and they give you the best chance of all audio and SIP messaging being delivered reliably with low latency. This type of circuit is the closest you will get to legacy circuits, such as POTS or T1, with regard to availability.

VoIP Engine Ensure your VoIP engine is well-suited for broadcast and live production. It should have smart workflows built in that can handle most of your needs automatically, as well as an API to allow custom control from third-party applications. Consider if its hardware platform has resource availability for growth. Your VoIP engine should also be priced to your needs, meaning that you are only paying for the functionality you need right now. Licensing options can allow for increased capacity when you need it. Of course, your VoIP engine should allow for compliance or compatibility with audio interoperability standards, such as AES67 and SMPTE 2110-30. For facilities where network audio is not yet the standard, ensure that the VoIP engine you choose easily integrates with network audio gateways that provide signals in your preferred format, such as analog, AES3, or SDI.

Orchestration A telephony orchestration platform should allow full integration into your VoIP engine’s functionality. Make sure it will mesh seamlessly with the smart workflows already in place in your VoIP engine by default but can allow overriding them as needed for special operations. The platform should allow you to create custom user panels, using widely compatible methods such as multitouch-friendly HTML5. Additionally, evaluate whether the orchestration platform you’re considering can integrate with a wide variety of other protocols, such as those used by your intercom, audio routing, and monitoring systems.

Hardware Control In many cases, hardware control is just as important as software control for fast-moving workflows. Find a VoIP broadcast solution that includes options for dedicated hardware controllers that allow for audio, monitoring, and basic control. Intuitive design and usability are critical here, and dedicated hardware controllers can help bridge the gap for staff members who are less comfortable using software for basic functions. Alternatively, make sure the VoIP engine and/or orchestration platform you choose allows for interfacing with generic hardware controllers, such as via IP messaging or simple GPIO. <



WHITE PAPERS

How Network Connectivity Will Shape The Future of Remote Production By Steven M Dargham, Telstra Broadcast Services, Business Development Executive – Special Events

The evolving production techniques that enable international remote production have taken on a new urgency since the start of the global COVID-19 crisis. The catchphrase we have used at Telstra Broadcast Services is ‘From Pandemic to Permanent’, an acknowledgement of the pandemic’s profound and lasting impact. What we’ve seen is a shift in attitudes: remote production workflows are now viewed as essential, instead of being an optional consideration based primarily on the cost-effectiveness of deployment. Indeed, with socially distanced production environments mandated in many territories, remote workflows are now often the only way that many large-scale live events can be covered. However, for all the production experience gathered over recent years, we are not yet at the point where this can be offered as a plug-and-play technology.

Having delivered several high-profile Tier-1 international remote production projects recently, Telstra Broadcast Services (TBS) has found that a key factor for successful remote production circuits is striking the right balance between delay, jitter, bandwidth, and buffering at an economical price point. This process has to be replicated, too. Our experience is that IP over subsea cables is the most efficient route that satisfies all the technical criteria. One of the key underpinnings of remote production is that at least two separate signal paths are necessary for diversity purposes. Shorter paths are generally preferred, but this is not always possible for a global event, for which the back-up path can often be considerably longer. For instance, for the 2019 Rugby World Cup (RWC), where TBS set up circuits between various stadia in Japan and the international production hub at IMG in London, the most direct route was via the Suez Canal while the other was via the Pacific and Los Angeles. Both routes were vulnerable to potential disruption, and both are perennial low-level concerns for all communications routed via the respective regions. It is also important to proactively plan for natural disasters. For example, less than two weeks before the start of the RWC, Typhoon Faxai, a Category 4 storm, took out two of the primary circuits out of Tokyo and caused undersea cable damage. The normal timeframe to restore such damage could be months, but with our readiness plan and dedicated efforts with our partners, we were able to get the recovery period down to days, in time for the start of the event. All in, 21 high-bitrate video feeds were sent from the IBC to London via redundant routes, terminating in two of Telstra’s London PoPs before being passed on to the IMG production facility near Heathrow. JPEG 2000 compression was used to bring the signals down to 10 Gbps. Keeping jitter low is vital in these circumstances, because productions need to maintain hitless switching between both paths in case the signal drops on one of them. Jitter for the RWC was measured to be in the microsecond range. Buffering, of course, needs to be minimized wherever possible: while merely frustrating for the home viewer streaming content, it can be disastrous for a production linked transcontinentally via talkback.


Over recent years, more and more equipment has been kept back at home base, with the onsite kit required to remotely produce a large-scale live event having shrunk from an entire OB vehicle to a mere two or three rack units. With graphics, audio, playback, and many other live production processes now taking place at the production hub, the remote camera crews need to be seamlessly brought into the workflow so that the director has complete continuity of execution. For example, the director needs to be able to tell Camera 2 to quickly pan right or left, and the camera operator has to do that instantly with no discernible lag, just as the EVS operator in the production gallery cues up a clip. This seamless switching is vital to the successful execution of remote production. Happily, it is achievable with careful routing: despite the 16,000 km length of the longest circuits in our RWC example, the transmission delay between Japan and London during the tournament was measured at just 223.1 milliseconds.

> Master Control Room in the Telstra Broadcast Operations Center in Sydney, Australia.

There are also other factors to consider in setting up these circuits, such as international data costs, which are driven by market demand. For the largest-scale events, pricing can be an issue that needs close examination and occasionally robust negotiation. This is particularly true when it comes to last-mile connectivity, which can often be both a financial sticking point at stadia when working internationally and a technical one as well, requiring detailed planning at an early stage. Telecoms providers can occasionally be seen as slow to deliver against the requirements of live sports events. As a result, when working in a territory and with a telco for the first time, a careful planning stage, preferably with an organization that has on-the-ground experience, is one of the primary keys to avoiding last-minute issues.
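
As a rough sanity check on the Japan-to-London figures above (assuming light in single-mode fiber propagates at roughly c/1.47, about 200,000 km/s), propagation alone sets a floor of roughly 80 ms over a 16,000 km path; equipment, compression, and any additional path length add to that floor, which is why careful route selection and the shortest viable paths matter:

```latex
% Back-of-the-envelope propagation floor (assumed refractive index n = 1.47).
\[
  v \approx \frac{c}{n} \approx \frac{3.0\times10^{5}\ \text{km/s}}{1.47}
    \approx 2.0\times10^{5}\ \text{km/s},
  \qquad
  t_{\text{prop}} \approx \frac{16{,}000\ \text{km}}{2.0\times10^{5}\ \text{km/s}}
    \approx 80\ \text{ms}
\]
```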

It will be instructive to see how international connectivity develops for remote production over the next few years. In Australia, we have connected all Tier-1 stadia with 60-Gbps connectivity and are looking to roll out the network to Tier-2 arenas. Such connectivity is becoming more and more a feature of stadia around the world. Equally, we are starting to see the emergence of large production hubs, often established and run by major broadcasters or broadcast service providers. These hubs can tap economies of scale from centralizing equipment and can run multiple concurrent productions from the same location, using the same centrally located teams. During the pandemic, we also saw further development of distributed remote production, where the production crew may be working remotely from both the venue and the centralized production hub, for example working from home and accessing equipment and workflows that have been virtualized. Technological advances in the next decade will make it possible to ship a single box to an event location, into which a multi-skilled crew plugs their cameras, with everything else taken care of remotely. Once low-jitter, low-latency connectivity becomes commonplace with the support of underlying infrastructure, we will see remote production evolve into genuine plug-and-play services, translating into even wider choices for production hubs and virtualized workflows. <



SPONSOR INDEX

AD PAGE

SPONSOR

LEVEL

CONTACT

EMAIL

PHONE

3G WIRELESS

Mobile

Gordon Capaccio

GordonC@3gwireless.tv

410-969-3501

ADDER TECHNOLOGY

Premier

Tim Conway

usasales@adder.com

888-932-3337

ADMIRAL VIDEO

Mobile

Paul Halsey

paul@imhd.tv

716-651-9900

AE LIVE

Mobile

Neale Connell

neale@ae.live

+44 1442 234 531

AERIAL VIDEO SYSTEMS

Mobile

Argyle Nelson

argyle@aerialvideo.com

818-954-8842

AIDA CONTENT MANAGEMENT

Mobile

Peter Flood

peter.flood@aidacm.com

201-693-5451

AJA VIDEO SYSTEMS

Corporate

Christina Oliver

christinao@aja.com

530-271-3326

ALDEA SOLUTIONS

Corporate

Daniel Gonzalez

daniel.gonzalez@aldea.tv

514-461-4136

ALL MOBILE VIDEO

Mobile

Eric Duke

eduke@amvchelsea.com

212-727-9862

ALPHA VIDEO

Mobile

Jeff Volk

jeff.volk@alphavideo.com

952-841-3311

sanjay@amagi.com

+91 98440 39275

AMAGI

Corporate Sanjay Kirimanjeshwar

AMAZON WEB SERVICES

37

Platinum

Paula Taylor

tpaula@amazon.com

503-222-3212

ANTHONY JAMES PARTNERS

Mobile

Mark Roberts

markr@anthonyjamespartners.com

609-751-4379

ARCTEK SATELLITE PRODUCTIONS

Mobile

Brian Stanley

bstanley@arcteksat.com

612-308-9079

ARISTA NETWORKS

Corporate

Bryan Obeidzinski

bryano@arista.com

917-940-0097

ARVATO SYSTEMS

Corporate

Kurt Krinke

kurt.krinke@arvatosystems.com

248-755-0676

Mobile

Chris Payne

cpanye@asgllc.com

917-807-1804

Platinum

Scott Beckett

sb9485@att.com

214-647-0486

ATEME

Corporate

Dave Brass

d.brass@ateme.com

484-860-0358

AUDIO-TECHNICA

Corporate

Mike Edwards

medwards@atus.com

330-686-2600 x2030

AV DESIGN SEVICES

Mobile

Jim Landy

jmlandy@avds.tv

609-531-2642

AVI SYSTEMS

Mobile

Craig Frankenstein

craig.frankenstein@avisystems.com

248-957-6161

AZZURRO GROUP

Mobile

Anthony Sotomayor

asotomayor@azzurrogroup.com

212-625-2372

BECK TV

Mobile

Fred Wright

fwright@becktv.com

972-505-8941

Platinum

Craig Schiller

cschiller@bexel.com

818-565-4202

Corporate

Adrian Lambert

A.Lambert@blackbird.video

+44 7905863352

ASG (ADVANCED SYSTEMS GROUP) AT&T

21

BEXEL

45

BLACKBIRD VIDEO BLACKMAGIC DESIGN

Platinum

Bob Caniglia

bobc@blackmagicdesign.com

408-954-0500 x319

BRAINSTORM

Corporate

Miguel Churruca

mchurruca@brainstorm3d.com

+34 91 781 6750

BRIDGE DIGITAL

Corporate

Richie Murray

richie@bridgedigitalinc.com

615-859-5754

BROADCAST MANANGMENT GROUP

Mobile

Josh Gallant

jgallant@broadcastmgmt.com

973-820-5847

BROADCAST SERVICES INTERNATIONAL (BSI)

Mobile

Jim Eady

jim@bsi-tv.com

905-332-2171

BROADCAST SPORTS INTERNATIONAL (BSI)

Mobile

Peter Larsson

Peter.Larsson@BSINTL.COM

410-564-2600

C360

Mobile

Evan Wimer

ewimer@c360live.com

724-940-3277

CALREC AUDIO/DIGICO

Premier

Jack Kelly

jackk@g1limited.com

631-396-0184 x102

Platinum

Richard Eilers

reilers@cusa.canon.com

609-480-6019

Corporate

Adrian Herrera

adrian.herrera@caringo.com

619-665-8153

CANON CARINGO


5

1, 65




Mobile

Greg Landa

greg.landa@es-cat.com

904-494-7532

Corporate

Jeremy Strootman

jeremy@squarebox.com

314-229-6012

Platinum

Jennifer Cleveland

jennifer.cleveland@centurylink.com

571-485-8781

CHESAPEAKE SYSTEMS

Mobile

Louise Shideler

louise@chesa.com

410-752-7732

CINESYS

Mobile

Brent Angle

brent@cinesys.io

713-272-0732

CIS GROUP

Corporate

Matt Silva

matt.silva@cisgroup.tv

954-257-9938

CISCO SYSTEMS

Corporate

Susan Friedman

sufriedm@cisco.com

908-433-6948

CLARK WIRE & CABLE

Corporate

Tom Yatabe

tom.yatabe@clarkwire.com

800-222-5348 x18

CLEAR-COM, AN HME COMPANY

Premier

Rachel Archibald

Rachel.Archibald@Clearcom.com

510-337-6676

COBALT DIGITAL

Premier

Chris Shaw

chris.shaw@cobaltdigital.com

217-344-1243

COMREX

Premier

Chris Crump

ccrump@comrex.com

978-784-1776

CORNERSTONE AV

Mobile

Matt Endicott

me@cornerstoneav.com

801-221-0099

CP COMMUNICATIONS

Mobile

Kurt Heitmann

kurt.heitmann@cpcomms.com

914-345-9292

CREATIVE DIMENSIONS

Corporate

Joel Roy

jroy@gowithcd.com

203-250-6517

Mobile

Noah Gusdorff

noah@cmsi.tv

818-847-7390

Corporate

Sergio Amatangelo

sergio.amatangelo@crowncastle.com

724-416-2710

CSP MOBILE PRODUCTIONS

Mobile

Matt Keske

mkeske@cspmobile.com

312-914-2616

CTI (CONFERENCE TECHNOLOGIES INC.)

Mobile

Ry Alford

ralford@conferencetech.com

404-352-3000

Premier

Sales

sales@daktronics.com

800-325-8766

DALE PRO AUDIO

Corporate

Eric Eldredge

eric@daleproaudio.com

212-475-1124

DELAPLEX

Corporate

Mark Rivers

Mrivers@delaplex.com

404-867-3334

DIMETIS

Corporate

Kai Bechtold

kbechtold@dimetis.de

+49 6074 3010 400

Mobile

Christopher Sullivan

CSullivan@diversifiedus.com

201-670-6548

Corporate

Don Cardone

don@dveo.com

908-998-1080

Mobile

Sam Schrade

sam@dnawebs.com

281-802-8000

Platinum

Cherylene McKinney

cherylene.mckinney@dolby.com

646-823-1523

DOME PRODUCTIONS

Mobile

Mary Ellen Carlyle

mcarlyle@domeprod.com

416-341-2022

DX3 MEDIA INC.

Mobile

Dale Smith

dalesmith@DX3media.ca

416-433-3261

ECODIGITAL

Corporate

Justin Russell

justin.russell@goecodigital.com

443-926-2673

EDITSHARE

Premier

Tracy Geist

tracy.geist@editshare.com

415-298-1290

EEG ENTERPRISES

Corporate

Eric McErlain

ericm@eegent.com

516-293-7472 x502

ELUVIO INC.

Corporate

Amy Meadows

amy.meadows@eluv.io

314-799-5564

ENCO SYSTEMS

Corporate

Ken Frommert

ken@enco.com

248-827-4440

ENCOMPASS DIGITAL MEDIA

Corporate

Joe Garzillo

jgarzillo@encompass.tv

203-965-6334

ENDEAVOR STREAMING

Corporate

Kayla Conover

Kayla.Conover@endeavorstreaming.com

516-622-8381

Mobile

Bob Hawkanson

bob@esbroadcasthire.com

407-601-6926

Platinum

Jim Scott

jim.scott@eurovision.net

973-650-9577

EVERTZ

Premier

Kyle Miobertolo

kmiobertolo@evertz.com

905-220-4738

EVS

Premier

William Walz

w.walz@evs.com

973-575-2116

F&F PRODUCTIONS

Mobile

George Orgera

GeorgeO@fandfhd.tv

727-535-6776

CAT ENTERTAINMENT SERVICES CATDV CENTURYLINK

27

CREATIVE MOBILE SOLUTIONS CROWN CASTLE

DAKTRONICS

71

DIVERSIFIED DMC BROADCAST GROUP DNA STUDIOS DOLBY

35

ES BROADCAST EUROVISION SERVICES

43




Corporate

John Agger

jagger@fastly.com

844-432-7859

FILMWERKS INTERNATIONAL

Mobile

Michael Satrazemis

michaels@filmwerksintl.com

910-675-1145

FINGERWORKS TELESTRATORS

Corporate

Bryan McKoen

bryan@telestrator.com

604-762-1477

Mobile

Dan Grainge

dan@fletch.com

312-932-2704

FOCUSRITE PRO

Corporate

Ted White

ted.white@focusrite.com

310-321-4104

FOR-A

Corporate

Adam Daniul

daniul@for-a.com

305-773-7608

Platinum

Gordon Tubbs

gtubbs@fujifilm.com

973-686-2769

FUSE TECHNICAL GROUP

Corporate

Christian Dundee

cdundee@fuse-tg.com

818-827-6057

G&D NORTH AMERICA

Corporate

Craig Abrams

cabrams@gd-northamerica.com

818-748-3383

FASTLY

FLETCHER SPORTS

FUJIFILM

25

GAME CREEK VIDEO

Mobile

Pat Sullivan

psullivan@gamecreekvideo.com

603-821-2205

GEARTECH USA

Mobile

Brad Wensley

bwensley@geartech.ca

647-295-6672

Corporate

Tim Jackson

tim.jackson@globecastna.com

310-849-3901

Platinum

Kari Szul

kariszul@google.com

212-565-0448

Corporate

Mike Kelley

mike@grabyo.com

914-584-7324

Platinum

David Cohen

david.cohen@grassvalley.com

215-837-8699

Premier

Mark Allatt

Mark.Allatt@gravitymedia.com

+44 1923 691421

HAIVISION

Corporate

Karen McCone

kmccone@haivision.com

514-993-2683

HARMONIC

Corporate

Eric Kenyon

eric.kenyon@harmonicinc.com

408-490-7665

HB COMMUNICATIONS

Mobile

Cristina Schussler

cristina.schussler@hbcommunications. com

203-747-7187

HIGH ROCK MOBILE TELEVISION

Mobile

Phil Engborg

phil@carr-hughes.com

518-584-0202

IBM ASPERA

Premier

Laura Petrosillo

laura.petrosillo@ibm.com

510-849-2386 x222

IBM STORAGE

Premier

Bill Martinson

billmar@us.ibm.com

650-400-7078

Corporate

Gates Killian

gates.killian@ibm.com

704-201-7126

IHSE USA

Premier

Dan Holland

dholland@ihseusa.com

732-738-8780

ILLUMINATION DYNAMICS

Mobile

Rich Williams

rich@illuminationdynamics.com

818-686-6400

IMAGE VIDEO

Corporate

Zach Wilkie

zwilkie@imagevideo.com

416-750-8872 x228

IMAGEN

Corporate

Nick Ashwin

nick.ashwin@imagen.io

214-418-3064

IMAGINE COMMUNICATIONS

Premier

Mary Schoof

mary.schoof@imaginecommunications. com

703-869-2042

IMS PRODUCTIONS

Mobile

Kevin Sublette

ksublette@imsptv.com

317-492-8770

INTEGRATED MEDIA TECHNOLOGIES

Mobile

Tom McGowan

tom.mcgowan@imtglobalinc.com

818-761-9770

INTOTO SYSTEMS

Mobile

Stephanie Rowen

srowen@suitelifesystems.com

310-405-0839

Corporate

Brittany Fitchett

bfitchett@ioindustries.com

519-663-9570 x242

Premier

Lynne Washington

Jacquelyn.Washington@ironmountain. com

610-831-2529

Corporate

Maria Casey

sales@jbanda.com

415-256-2800

Premier

John Cleary

jcleary@josephelectronics.com

800-323-5925

Corporate

Dan Skirpan

dskirpan@us.jvckenwood.com

724-747-9301

Mobile

Bill Kaufman

bill.kaufman@kaufmanbroadcast.com

314-533-6633

GLOBECAST GOOGLE CLOUD

23

GRABYO GRASS VALLEY

11, 69

GRAVITY MEDIA

IBM WATSON MEDIA & WEATHER

IO INDUSTRIES IRON MOUNTAIN ENTERTAINMENT SERVICES JB&A JOSEPH ELECTRONICS JVC PROFESSIONAL VIDEO KAUFMAN BROADCAST




KMH AUDIO-VIDEO INTEGRATION

59

Mobile

Kevin Henneman

khenneman@kmh-integration.com

800-590-2520

LAWO

17


Platinum

Jeff Smith

jeff.smith@lawo.com

888-810-4468

LEADER INSTRUMENTS

Corporate

Scott Cannon

cannon@leaderamerica.com

800-645-5104

LEGRAND AV DIVISION

Corporate

Gordon Wason

gordon.wason@legrand.com

516-350-2327

LEVELS BEYOND

Corporate

Donnie Gilbert

dgilbert@levelsbeyond.com

720-508-8947

LH COMPUTER SERVICES

Mobile

Doug Cole

dcole@lhcomp.com

954-752-5805

LIMELIGHT NETWORKS

Corporate

Meredith Johnson

meredithj@llnw.com

602-850-5384

Mobile

Brad Sexton

brad@livemediagroup.com

818-435-3006

Platinum

Dave Belding

daveb@liveu.tv

714-916-8275

LTN GLOBAL

Corporate

Rich Rozycki

rrozycki@ltndigital.com

646-832-1952

LYON VIDEO

Mobile

Chad Snyder

chad@LyonVideo.com

614-319-4071

MARKERTEK

Premier

Adam June

adam@markertek.com

800-522-2025 x7361

MARSHALL ELECTRONICS

Corporate

Leticia Ambriz

leticia.ambriz@marshallelectronics.net

800-800-6608

MASSTECH INNOVATIONS

Corporate

Luc Tomasino

Luc.tomasino@masstech.com

918-605-3236

MATROX

Corporate

Francesco Scartozzi

fscartoz@matrox.com

514-822-6075

MAXON

Corporate

Paul Babb

paul@maxon.net

805-376-3331

MEDIA LINKS

Corporate

Tom Canavan

tcanavan@medialinks.com

860-206-7326

MEDIAKIND

Corporate

Dan Burnett

daniel.burnett@mediakind.com

678-689-6535

MEDIAPRO

Platinum

Mario Sousa

msousa@mediapro.tv

305-357-5893

MEYERPRO

Mobile

Steve Meyer

steve@meyerproinc.com

503-638-2096

MICROSOFT

Premier

Scott Bounds

sbounds@microsoft.com

646-225-4380

MOBILE TV GROUP

Mobile

Nick Garvin

ngarvin@mobiletvgroup.com

303-542-5555

MOVICOM

Mobile

Greg Salman

gsalman@movicom.com

323-633-7033

MPE

Corporate

Neal Pilzer

neal@mpenyc.com

303-618-4423

MULTIDYNE

Corporate

Frank Jachetta

frank@multidyne.com

516-629-0373

NCAM

Corporate

Phil Ventre

phil.ventre@ncam-tech.com

+44 7581 482750

Premier

Philip Nelson

philip@nelcomedia.net

(210) 863-0360

Platinum

Susan Matis

smatis@nepgroup.com

412-423-1339

NET INSIGHT

Corporate

David Dellafave

david.dellafave@netinsight.net

201-669-2037

NEVION

Corporate

Sales

sales@nevion.com

805-247-8560

NTP

Corporate

Kurt Howell

kurt.howell@ntp.dk

813-422-1289

Platinum

Jarrett Schwenzer

jarrett.schwenzer@nutanix.com

646-660-5898

LIVE MEDIA GROUP LIVEU

33

NELCO MEDIA NEP GROUP

NUTANIX

41

OBJECT-MATRIX

Corporate Nick Pearce-Tomenius

nick.pearce-tomenius@object-matrix.com

+44 2920 382308

OPENDRIVES

Corporate

Paul Swanson

p.swanson@opendrives.com

310-659-8999

Premier

Steve Milley

stephen.milley@us.panasonic.com

770-619-1779

PIXELLOT

Corporate

Stephen Hamilton

stephenH@pixellot.tv

213-321-9554

PLIANT TECHNOLOGIES

Corporate

Gary Rosen

sales@plianttechnologies.com

334-321-1160 x540

POLYGON LABS

Corporate

Grigory Mindlin

grig@polygonlabs.us

908-403-2061

Platinum

Andrea Berry

andrea.berry@prg.com

818-252-2645

PRIMESTREAM

Corporate

Namdev Lisman

namdevlisman@primestream.com

305-625-4415

PRIMEVIEW

Corporate

Chanan Averbuch

chanan@primeview.biz

212-730-4905

PANASONIC

PRG

15

61

51




Corporate

Steve Rotz

srotz@productionhub.com

877-629-4122

PROGRAM PRODUCTIONS

Mobile

Amy Scheller

ascheller@programproductions.com

630-792-9700

PSSI GLOBAL SERVICES

Mobile

Clint Bergeson

info@pssiglobal.com

310-575-4400

QLIGENT

Corporate

John Shoemaker

john.shoemaker@qligent.com

321-956-3454

QUANTUM

Corporate

Ruth Compton

ruth.compton@Quantum.com

949-856-7793

QUANTUM5X

Corporate

Paul Johnson

pjohnson@q5x.com

519-675-6999

RCN BUSINESS SERVICES

Corporate

Sean Sullivan

sean.sullivan@rcn.net

571-623-4332

Mobile

Jeff Heimbold

j.heimbold@realitychecksystems.com

323-465-3900

Corporate

Oliver Abadeer

oliver.abadeer@ericsson.com

+44 7500 104 133

PRODUCTIONHUB

REALITY CHECK SYSTEMS RED BEE MEDIA RIEDEL COMMUNICATIONS

3, 63

Platinum

Joyce Bente

sales-us@riedel.net

818-559-6900

ROSS VIDEO

7, 57

Platinum

Kevin Cottam

kcottam@rossvideo.com

613-228-0688 x4366

Corporate

Mike Fredriksen

mike.fredriksen@rtsw.co.uk

+44 020 7384 2711

Mobile

Tim Eichorst

timeichorst@rushmediaco.com

608-850-7411

Corporate

Jim Pace

jpace@plus24.net

323-855-1404

Mobile

Mark Parikka

operations@sdtv.com

619-293-7777

RT SOFTWARE RUSH MEDIA COMPANY SANKEN/BRAINSTORM ELECTRONICS SDTV SEACHANGE INTERNATIONAL

53

Corporate

Walid Hamri

walid.hamri@schange.com

786-510-7626

SEAGATE

31

Platinum

Brandon Aleckson

brandon.aleckson@seagate.com

408-658-1852

SENCORE

Corporate

Brandon Baker

brandon.baker@sencore.com

605-978-4743

SENNHEISER

Corporate

David Missall

david.missall@sennheiser.com

774-253-1036

SHOTOVER

Mobile

Gordon Barry

gbarry@shotover.com

+64 275458777

SHURE

Premier

Rick Renner

renner_rick@shure.com

312-736-6042

SIGNIANT

Premier

Jon Finegold

jfinegold@signiant.com

781-221-4000

SKYCAM

Mobile

Stephen Wharton

stephen.wharton@skycam.tv

719-963-4150

Corporate

Thomas Gunkel

thomas.gunkel@skyline.be

+32 51 31 35 69

Mobile

Gil Cowie

gil@smartcartsvx.com

646-863-6263

SMT

Corporate

Patricia Hopkins

p.hopkins@smt.com

919-493-9390

SNEAKY BIG STUDIOS

Corporate

Stephen Moorman

smoorman@sneakybig.com

480-344-0100

SOLID STATE LOGIC

Corporate

Steve Zaretsky

stevez@solidstatelogic.com

203-253-1364

SKYLINE COMMUNICATIONS SMARTCART SVX

SONY

19

Platinum

Deon LeCointe

Deon.Lecointe@sony.com

201-930-6926

SOS GLOBAL

47

Corporate

Stephen O’Connell

soconnell@sosglobal.com

252-635-1400

SOUTHWORKS

Mobile

Bruce "Zip" Zieper

bruce.zieper@southworks.com

949-244-5691

SPARX TECHNOLOGY

Mobile

Kevin Annison

kevin@sparxtechnology.com

704-957-8198

SPECTRA LOGIC

Corporate

Hossein ZiaShakeri

hosseinz@spectralogic.com

303-449-6400

SPORTLOGIQ

Corporate

Eugene Plawutsky

eugene@sportlogiq.com

438-380-5435

SPORTRADAR

Corporate

Stephen Byrd

s.byrd@sportradar.com

847-274-3322

SPORTZCAST

Corporate

Michael Connell

mike@sportzcast.net

321-888-3800 x101

STATS PERFORM

Corporate

Stephanie Brown

sbrown@stats.com

646-324-2444

STEVENS GLOBAL

Corporate

Terry Isles

terryi@stevensglobal.com

800-229-7284

STUDIO NETWORK SOLUTIONS (SNS)

Corporate

Stephen McKenna

smckenna@studionetworksolutions.com

314-733-0551

SUPERSPHERE VR

Corporate

Lucas Wilson

lucas@superspherevr.com

310-593-1111




Corporate

Ross Hair

ross.hair@supponor.com

+44 77 8988 2431

SWIFTSTACK

Premier

Erik Pounds

epounds@swiftstack.com

415-244-7058

SYNAMEDIA

Corporate

Christelle Gental

cgental@synamedia.com

[44] 1784774333

SYNCWORDS

Corporate

Ashish Shah

ashah@syncwords.com

718-408-9191

Mobile

Dominick Tarabocchia

dominick@t2computing.com

212-220-9600

TAG V.S.

Corporate

Abe Zerbib

abe@tagvs.com

315-646-8400

TATA COMMUNICATIONS

Corporate

Utkarsh Gosain

utkarsh.gosain@tatacommunications.com

917-499-0236

Premier

Jay Russell

jrussell@tedial.com

424-645-5300

Premier

Scott Murray

scottmu@telestream.net

530-470-1306

Corporate

Richard Collins

richard@tellyo.com

+44 7799 117 269

Premier

Martin Dyster

martin.dyster@telosalliance.com

717-735-3611

TELSTRA

Corporate

Anna Lockwood

Ana.Lockwood@team.telstra.com

877-835-7872

TERADEK

Premier

Jon Landman

jon@teradek.com

818-667-4258

THE STUDIO - B&H

Premier

The Studio Team

thestudio@BandH.com

800-947-9962

Platinum

Christian Kneuer

christian.kneuer@theswitch.tv

212-239-3715

Corporate

Larry Thaler

LThaler@TheVCC.TV

212-235-7019 x720

Mobile

Brian Carr

brian@thumbwar.tv

310-910-9030

TIGER TECHNOLOGY

Corporate

Angus MacKay

angus.mackay@tiger-technology.com

514-758-0922

TSL PRODUCTS

Corporate

Greg Siers

greg.siers@tslproducts.com

301-272-0939

TV GRAPHICS

Corporate

Dan Murphy

dantvgraphics@gmail.com

303-550-7088

Premier

Ken Valdiserri

kvaldiserri@tvunetworks.com

312-316-3776

Corporate

Marcel Naef

marcel.naef@uniqfeed.com

347-401-2020

Mobile

Anne Komarovsk

anne@unisatmobile.com

310-717-7643

UTAH SCIENTIFIC

Corporate

Barry Singer

bsinger@utsci.com

917-880-5366

VARIANT SYSTEMS GROUP

Corporate

Adolfo Rodriguez

adolfo@variantsystemsgroup.com

503-567-9658

VENUE EDGE

Corporate

David Saphirstein

dsaphirstein@gmail.com

407-505-9410

VERITONE

Corporate

Logan Ketchum

lketchum@veritone.com

818-519-0054

VERIZON MEDIA

Corporate

Jeff Casey

jeffrey.casey@verizon.com

908-559-2036

VIDEON

Corporate

Todd Erdley

todd@videon-central.com

814-235-1111

Mobile

Jim Jachetta

jimj@vidovation.com

949-777-5435 x1001

VIMOND

Corporate

Megan Wagoner

megan@vimond.com

202-422-6844

VISLINK TECHNOLOGIES

Corporate

Emily Fox

emily.fox@vislink.com

+44 1442 431334

VISTA WORLDLINK

Corporate

Joshua Liemer

jliemer@vistaworldlink.com

954-838-0900 x210

VITAC

Corporate

Sales

info@vitac.com

800-278-4822

Platinum

Bryan Reksten

bryan.reksten@vitec.com

770-331-4802

Premier

Mark Gederman

mgederman@vizrt.com

401-787-0120

WORLD WIDE TECHNOLOGY

Corporate

Brandon Handley

brandon.handley@wwt.com

(610)908-5881

WOWZA MEDIA SYSTEMS

Corporate

Brad Wright

brad.wright@wowza.com

720-608-4733

WSC SPORTS

Corporate

Galit Shiri

galit@wsc-sports.com

+972 54 471 6862

XCITE INTERACTIVE

Corporate

JR Reichl

jreichl@xcite-interactive.com

800-464-9445

Corporate

Greg Dolan

gdolan@xytechsystems.com

347-746-4734

Platinum

Eric Bolten

eric@zixi.com

978-853-8482

SUPPONOR

T2 COMPUTING

TEDIAL

49

TELESTREAM TELLYO TELOS ALLIANCE

THE SWITCH

39

THE VIDEO CALL CENTER THUMBWAR

TVU NETWORKS UNIQFEED UNISAT

VIDOVATION

VITEC

29, 67

VIZRT GROUP

XYTECH ZIXI

C2



THE FINAL BUZZER

CHANGE THE GAME By Ken Kerschbaumer, SVG Executive Director, Editorial

In 1990, while at the University of Delaware, I wrote a column for the school newspaper on the hot-button topic of the day on college campuses: divesting from South African companies. My column coincided with MLK Day, and the university was closed in honor of Dr. King. But, at the same time, the school was financially supporting a nation that was in direct conflict with Dr. King’s teachings. My column had the righteous indignation that only a 20-year-old can bring to the table (and almost got me kicked off the school paper), but my professors stood by me and the point was made: feeling good by giving a holiday is not the same as doing good.

I am reminded of that column now as I look at a sports production community that is playing a big role in shining a light on social injustice, systemic racism, and much more. But, at the same time, there is a serious lack of diversity when it comes to the production, engineering, and management ranks of sports networks and digital entities. Thankfully, this lack of diversity has resulted in networks, media companies, production companies, and even manufacturers working hard to fix the problem. Diversity programs have been sprouting up everywhere, and many of the companies involved with SVG have already launched programs of their own. And, rather than simply spinning up another one, our goal is to act as a gathering point for those disparate efforts. We are calling our program SPIRIT (Sports Production Inclusion Responsibility in Technology).

In the coming weeks and months, we will launch a number of initiatives to lay the foundation for meaningful change. Those steps include:

SVG VOICES: A collection of video and written resources that tell the stories of women, minorities, and LGBTQ individuals who are already in our industry and making it more diverse. The goal of VOICES is to have those profiled discuss how they became interested in the industry, how they became part of it, and how they found success. Their stories will inspire a new generation and inform those already in the industry.

SDI (Sports Diversity Initiative): SDI is designed to connect sports production professionals with HBCUs, high schools, and other programs that can be feeders for next-generation talent into the sports production industry.

SPIN (Sports Production Inclusion Network): A network designed to connect sports production professionals who make hiring decisions with HR and Diversity Officers at key networks, leagues, manufacturers, and vendors to help both sides more effectively address the challenge of diversity and inclusion.

SVGW: SVG’s well-established women’s networking group will also fold under the SPIRIT umbrella. Active in the U.S., Europe, and Australia, it is indicative of the kind of success we can find as we expand our efforts.

How do all of those pieces come together? At a high level, our vision is that VOICES provides resources that our SDI members can share with students and others so that they can be inspired to want to be a part of the industry. Diversity cannot happen if those we want to enter the industry never know what opportunities are available. VOICES and SDI will change that. And SPIN will do the important work of making sure HR and Diversity Officers at networks and leagues can connect with those closest to the job opportunities.

A more diverse workforce will benefit all. It provides more perspectives to programming and production strategies that can drive viewership. It expands the scope of a future workforce that will need to arrive in the next three to five years to replace those who will retire.

There is much work to be done. It will take energy. It will take commitment. And our hope is that in the coming months, you and your company will want to become part of SPIRIT. <


PUBLISHED BY SPORTS VIDEO GROUP 19 West 21st St., Ste. 301 • NY, NY 10010 Tel: 212.481.8140 • Fax: 212.696.1783 www.sportsvideo.org EXECUTIVE DIRECTORS PAUL GALLO, Executive Director

paul@sportsvideo.org | 212.696.1799

MARTIN PORTER, Executive Director

marty@sportsvideo.org | 516.446.2029 EDITORIAL KEN KERSCHBAUMER,

Executive Director, Editorial kenkersch@sportsvideo.org | 646.205.1810 JASON DACHMAN, Chief Editor jason@sportsvideo.org | 646.861.2373 BRANDON COSTA, Director of Digital brandon@sportsvideo.org | 646.861.2370 KRISTIAN HERNANDEZ,

Associate Editor and Social Media Manager kristian@sportsvideo.org | 646.880.4902 SUSAN QUALTROUGH, Copy Editor susan@sportsvideo.org RIVA DANZIG, Art Director riva@sportsvideo.org SVG SERVICES KAREN HOGAN KETCHUM,

Director of Production karen@sportsvideo.org | 646.559.0434 KATIE CHAMPION,

Production and Operations Associate katie@sportsvideo.org | 646.524.7497 ALICIA MONTANARO,

Meetings and Events Manager alicia@sportsvideo.org | 646.880.4901 ANDREW LIPPE,

Membership & Client Services Manager andrew@sportsvideo.org | 212.481.8133 CRISTINA ERNST, Event Operations Director cris@sportsvideo.org | 917.309.5174 SPONSORSHIP ROB PAYNE, Managing Director,

Worldwide Sponsor Development rob@sportsvideo.org | 212.481.8131 ANDREW GABEL,

Director, Sponsor Development agabel@sportsvideo.org | 646.998.4554 DYLAN DAVIDSON, Sponsorship Coordinator dylan@sportsvideo.org | 646.559.0435 SVG EUROPE CONTACTS JOE HOSKEN, General Manager

joe@sportsvideo.org Tel: +44 74290 90134 ABOUT SVG

The Sports Video Group was formed in 2006 to support the professional community that relies on video, audio, and broadband technologies to produce and distribute sports content. Leagues, owners, teams, players, broadcasters, Webcasters, and consumer-technology providers have joined SVG to learn from each other, turn vision into reality, and implement innovations, while sharing experiences that will lead to advances in sports production/distribution and the overall consumer sports experience.

SportsTech Journal is produced and published by the Sports Video Group. SportsTech Journal © 2020 Sports Video Group. PRINTED IN THE USA.

