TM Broadcast International 79, March 2020






NRK: Moving from a traditional broadcast perspective to a personalized user universe

Super Bowl LIV


Welcome to one of the greatest sport shows that can be offered for live TV


ENCO enCaption

El Ranchito: Filling the most spectacular series with magic


Editor in chief Javier de Martín

Creative Direction Mercedes González

Key account manager Susana Sampedro

Administration Laura de Diego

Managing Editor Sergio Julián



ISE 2020

Test Zone:


Oscars: This is how production of the year’s most anticipated awards ceremony was carried out


TM Broadcast International is a magazine published by Daró Media Group SL. Centro Empresarial Tartessos, Calle Pollensa 2, oficina 14, 28290 Las Rozas (Madrid), Spain. Phone: +34 91 640 46 43. Published in Spain. ISSN: 2659-5966

EDITORIAL

The most relevant news of the month, and perhaps of the year, does not surprise us. Still, we are sad to confirm it: NAB Show 2020, a reference trade show for the global broadcast sector, will not take place in April. This has been the message shared by the organizers of the show, who are currently considering a number of options ranging from postponement to cancellation.

During the last weeks, established companies such as Ross Video, AJA, Avid, TVU, Adobe and Nikon had stated that they were not going to attend the event. These withdrawals were not representative of the state of the show in absolute terms, as 96% of exhibitors still planned to attend. However, there was one issue that we are sure has been decisive in the final decision: attendance. We must face reality: confirmed cases of Covid-19 continue to spread across countries around the world, and authorities recommend that their citizens not attend events with large crowds of people. Another important factor is the travel restrictions announced by the United States for citizens of the European Union.

The fact that NAB, like other shows such as Cabsat or BroadcastAsia, won’t take place as planned is not good news. Both the magazine and the Daró Media publishing group defend these events as key market drivers to create new links, close agreements, share knowledge and elevate an industry in continuous evolution. But we must not lose perspective: the show must go on. Many companies were going to unveil their strategies for 2020, and that’s what’s going to happen. At TM Broadcast International we will strengthen our efforts to share all the information about new products and spread each and every one of your projects. We are proud to be a loudspeaker for all of you, and we reaffirm our commitment to remain a valuable, objective and committed media outlet for the whole industry.

In these times, we must look to the future with calm and integrity.
It is very difficult to foresee how events will unfold. However, the industry has overcome numerous challenges in the past. We are sure that with work and diligence the sector will overcome the impact that Covid-19 is causing.



The Vizrt Group presents VizrTV and NewTekTV, two brand new communications platforms

The Vizrt Group, parent company of the Vizrt, NewTek and NDI® brands, has announced the launch of VizrTV and NewTekTV. These two “digital-first” communications platforms are designed to deliver key product awareness, innovation, and support messages to help customers and partners better adapt to changing market environments. The new digital-first platforms will enable the Group’s three brands to communicate their key product innovation messages by utilizing IP-based, software-defined visual storytelling (#SDVS) on a global basis. For the tradeshows, the group will be deploying its local teams from Vizrt, NewTek and NDI, who will be ready to welcome any customers in attendance and discuss how software-defined visual storytelling tools and solutions can help them enhance their live content production capabilities.

“Our customers play a vital role in keeping their audiences informed during these dynamic and rapidly changing times,” said Michael Hallén, CEO of the Vizrt Group. “Because we are leading the industry with adaptive and software-defined visual storytelling products, we are uniquely positioned to use VizrTV and NewTekTV to deliver our message, using our own tools in a very tailored way, to a screen that is convenient to our customers.”

“Vizrt Group’s primary concern is for the health of its people, be they customers, partners, colleagues, or family. As such, Vizrt Group is actively monitoring local and global health authority advice and guidelines, and these will be rigorously followed. As the public health situation continues to develop, Vizrt Group is committed to supporting our customers and partners as they adapt, and IP-based software-defined visual storytelling is our primary communication platform. This will be supported with locally focused tradeshow and exhibition participation,” said the company in a press release.


Blackmagic Design releases Video Assist 3.1, Blackmagic RAW 1.7 and DaVinci Resolve 16.2 Blackmagic Design has just published three software updates for three flagship products: Video Assist, Blackmagic RAW and DaVinci Resolve. Blackmagic Video Assist 3.1 and Blackmagic RAW 1.7 add support for Blackmagic RAW recording from Panasonic EVA1 and Canon C300 MK II cameras on the new Blackmagic Video Assist 12G models. This update lets customers take advantage of Blackmagic RAW workflows and allows “dramatically improved quality and creativity in the post production process”, according to the press release. Blackmagic Video Assist 3.1 also adds support for metadata entry allowing shots to be logged and tagged on set for better management of media for large productions. Both updates are available now for

download from the Blackmagic Design website. At the same time, the company has announced DaVinci Resolve 16.2. The new update includes major Fairlight updates for audio post production and improvements for color correction, editing and more. These are the new features of DaVinci Resolve 16.2:

- Improved usability in Fairlight timeline editing.
- Improved Blackmagic Fairlight sound library.
- New automatic sorting of effects and plug-ins.
- Improved immersive 3D surround sound bussing and monitoring.
- Fairlight audio editing track index improvements.
- Improved AAF import and export.
- Improved import of older legacy Fairlight projects.
- Multiple improvements in audio mixing and FairlightFX filters.
- Major improvements in the Fairlight console audio editor.
- Improved transport control on the Fairlight console audio editor.
- New editing features including loading and switching timelines.
- Improved media pool with faster copy and paste of clips.
- New color grading features including smart filters and more.
- Improved file format support for new cameras and standards.

Blackmagic Video Assist 12G HDR, a professional monitoring and recording device that can benefit from these updates.


Calrec releases the virtual mixing console Type R for TV

Calrec Type R for TV

Building on the success of its Type R for Radio, Calrec has launched Type R for TV, a virtual mixing console whose native IP core integrates with station automation systems. Type R for TV meets the rising demand for automated audio consoles in smaller news operations, which is becoming more prevalent according to Calrec’s Director of Product Management, Henry Goodman. “More often we’re seeing the console surface not used at all, which is driving the demand for broadcast-level virtual consoles,” he said. Type R for TV is compatible with popular station automation

systems including Ross Overdrive, Sony ELC and Grass Valley Ignite. Furthermore, it provides fully automated programming with real-time adjustment of unpredictable external factors through a standard web browser. Type R’s hardware elements can be added if desired. Powered by standard PoE switches, Type R has three different panel options. If broadcasters prefer a physical surface, then banks of 6 x faders can be added, or adaptable soft panels like Calrec’s Large Soft Panel (LSP) and Small Soft Panel (SSP) can be used. Furthermore, with Calrec’s Assist web UI, users can

access the virtual desk remotely. Assist works over TCP/IP and has minimal control lag. Goodman continued, “The move towards virtual mixing is gaining momentum, particularly in the US market. Calrec’s Serial Control Protocol (CSCP) has already provided a very successful transition to this model. It allows the audio console to be configured and driven entirely from the production control room’s vision switcher and remote fader panel, and Type R for TV builds on that legacy while maintaining our reputation for resilience and redundancy.” 


Net Insight’s Nimbra Edge is now available on Microsoft Azure

Net Insight is extending its Nimbra Edge offering to include the service on top of Microsoft Azure. This will enable media companies that want to run on hyperscale cloud infrastructure to leverage the global infrastructure and scalability of Microsoft Azure. The media transport solution is now available through the Azure Marketplace. Nimbra Edge offers a hybrid cloud infrastructure that enables media companies to “easily connect, manage, and consume low-latency, high-quality videos anywhere”. Current Nimbra customers can now, with a software upgrade, connect existing appliances to Nimbra Edge.

Mo-Sys unveils TimeCam, a remote camera solution that “eliminates latency” Mo-Sys Engineering, developer of real-time camera tracking and camera remote systems, presents TimeCam, a solution to remotely operate cameras “without perceived delay”. According to Mo-Sys, TimeCam represents a triple benefit to production companies: “First, there is the saving in cost and environmental impact in sending camera operators to site. Second, it means that the most in-demand operators can be much more productive, providing excellent coverage at a live event each day rather than losing time through travel. Third, it means that you can add cameras to your coverage without adding headcount: for instance, a downhill ski race might have eight cameras along the course, with one operator controlling cameras 1, 3, 5 and 7 and a second controlling 2, 4, 6 and 8”. “’Traditional’ remote production puts the control back at base, but still needs camera operators to travel to the location,” explained Mo-Sys CEO Michael Geissler. “By compensating for latency in transmission and compression/decoding, TimeCam means that operators too can stay at base and be much more productive by operating on several events, when normally they could only be on one.”



Riedel solutions played fundamental roles in the first-ever live stream of Edinburgh’s Hogmanay 19

Once again, Riedel Communications supplied a comprehensive signal transport and communications backbone for the spectacular New Year’s Eve Hogmanay celebration in Edinburgh, Scotland on Dec. 31. Riedel’s MediorNet provided real-time transport, processing, and routing for all video and audio signals and also enabled critical IP-based systems such as CCTV, Internet access, and weather monitoring. Riedel’s Artist digital matrix intercom system and Bolero wireless intercom were tightly integrated with MediorNet to ensure communications for the production crew across five stages.


Billed as one of the world’s largest New Year’s Eve celebrations, Edinburgh’s Hogmanay 19 culminated in a giant street party on Dec. 31 with international, U.K., and Scottish performers and a huge midnight fireworks display. Award-winning DJ and producer Mark Ronson headlined the concert and provided a live, custom soundtrack to accompany the fireworks. For the first time in Hogmanay history, the festival was livestreamed to Facebook, YouTube, and the

Hogmanay website. Produced by Underbelly, a U.K.-based live entertainment production company, the multicamera webcast allowed viewers anywhere in the world to enjoy the entire 5.5-hour event from their phones, tablets, or laptops. “Since Underbelly first produced Edinburgh’s Hogmanay in 2017, we have installed a city-centre-wide Riedel network. Over the three years, we have significantly increased the utilisation of the system and it has now become an essential and integral part of the event. The network facilitates not only the creatively world-class show, but is also essential for the crowd safety management and operational success of the event,” said David Watson, Production Director, Underbelly. “Riedel’s world-class networking and comms


technology, coupled with the onsite configuration and event expertise of James Mitchelmore (Direct Control UK), Ed Lawlor (Visual Certainty), and the Riedel team, was absolutely essential for the smooth running of the festival.” Operating over ten kilometers of fibre, Riedel’s MediorNet formed a massive signal transport and communications backbone

for Hogmanay 19, with 16 MediorNet Compact Pro stageboxes deployed at sites throughout Edinburgh. The network was anchored by a MetroN core router, with additional MediorNet MicroN high-density media distribution network devices enabling many different modular configurations. In addition to audio distribution for all stages and video distribution to all

big screens, MediorNet provided an IP tunnel to facilitate CCTV, Internet access, weather monitoring, power monitoring, and lighting control. To facilitate crew communications, two Artist64 digital matrix intercom systems supported 33 Artist panels and 35 Bolero wireless beltpacks, with intercom signals provided by 19 Bolero antennas. 


VRT’s annual charity fundraiser “De Warmste Week” benefits from Quicklink Studio (ST55)

In addition to De Warmste Week, VRT is using the Quicklink Studio for audio calls during radio broadcasts.

VRT (Flemish Radio and Television Broadcasting Organisation), the national public-service radio and television broadcaster for the Flemish Community of Belgium, has recently used the Quicklink Studio (ST55) solution for coverage of VRT’s annual charity fundraiser De Warmste Week (The Warmest Week). The Quicklink Studio was provided to VRT by VP Media Solutions, a media technology solutions and services provider based in


Brussels, Belgium.

Studio Brussel, which is part of VRT, is a Dutch-speaking radio station that is aimed mainly at a youth audience. During De Warmste Week, Studio Brussel used the Quicklink Studio (ST55) for its radio marathon. Every morning, Studio Brussel held a video call with Dieter and Kevin from ‘Down The Snow’ to follow them along their hitchhiking adventure from Russia to Belgium. The VRT fundraiser De Warmste Week raised over €17.5 million for charities, breaking all previous records.

In addition to De Warmste Week, VRT is using the Quicklink Studio for audio calls during radio broadcasts. In the near future, VRT plans to use the Quicklink Studio to integrate video calls with artists into their broadcasts, enriching the video streams and producing more online content.


W!ld Rice theatre company integrates a Clear-Com communications system in its permanent performance venue Singapore’s W!ld Rice recently established a new performing arts complex with an environment fit for performance experimentation, and the theatre company chose Clear-Com for its communications system. After drawing more than one million people to performances held in rented spaces since its founding in 2000, W!ld Rice decided it was time for a permanent home. The company’s spacious facility was recently completed and is located in the Funan Mall, in the heart of Singapore’s Civic and Cultural district. The new space features over 18,000

square feet of performance space and a 380-seat theatre, the first dedicated performance venue in Singapore to be fully designed, managed and programmed by a theatre company. Once it had the resources to handle a more rigorous schedule of productions and events of all sizes, the W!ld Rice team was able to act on its plan for enhanced communications for production crews. Key requirements were a two-channel system at the stage manager’s desk and the ability to expand seamlessly to keep pace with future growth. The new system, introduced by consultant Radian Acoustics, includes a Clear-Com HelixNet® All-Digital Networked Partyline Intercom System combined with the FreeSpeak II Wireless and Encore Analog Partyline Intercom Systems.

These technologies are making crew communications during shows and rehearsals “more efficient and streamlined”, according to the press release. “We’ve been able to secure a Clear-Com solution suited to our needs and unique challenges,” said W!ld Rice Technical Director, David Sagaya. “FreeSpeak II provides the performance and reliability we require for our caliber of productions, and HelixNet is easily scalable to whatever communications need arises.” “The combination of HelixNet and FreeSpeak II was a great solution for this space,” remarked Consultant Tan Suan Wee of Radian Acoustics. “The robustness, audio quality and scalability of the system mean that the theatre will enjoy seamless usage for years to come.”


ESL benefits from Ross Graphics and Control solutions at ESL One Hamburg event

For the third year in a row the Barclaycard Arena in Hamburg, Germany, played host to the recent ESL One esports event. Once again, Ross Video’s XPression graphics suite and DashBoard control system were chosen by the organizers to drive the live feed and in-stadium production.

“We’ve used XPression for a number of years now, and we’ve consistently found it to be a very powerful and flexible tool”, notes Roman Erber, Head of Broadcast CG at ESL. “At this event, we used three channels of XPression; one channel for all of the show’s editorial content, a second channel for

sponsor logos and other content we needed to rotate in-game, and a third channel for the animated breakscreens that were aired during ad segments”. ESL has now chosen an XPression + Datalinq solution for several of their events. This reflects the fact that esports generates enormous


amounts of data that needs to be presented to audiences. According to Erber, this data is crucially important. “Every esports tournament we manage is different in terms of the games and the way that passages of play unfold. Many games are very fast-moving, with multiple decisions being made and tactics changing by the second. On top of that, every game has a different way of outputting data; XPression + Datalinq enables us to capture, interpret and present game data in a very consistent and repeatable manner. What we do is obviously quite different from a traditional sports broadcast and we want to ensure that we present the games in a way that satisfies both the hardcore and more mainstream/casual fans”. The sheer amount of data generated around an esports tournament means that some degree of automation of data gathering is needed. “We can be pulling information from multiple sources at any given time

(e.g. pre-game betting odds) but there are times during the tournament when we want to overrule the automated pull and let our graphics operator decide what to display. Similarly, there can be hot phases of a game when it’s important to show key information (e.g. highest kill streaks, overall kill/death ratio, etc.) and that data has to be pulled automatically because the pace of the game makes it impossible to collect and gather all this data manually. Fortunately, XPression makes this process painless and efficient”. DashBoard, Ross Video’s open-platform control system, was also an integral part of the Hamburg event. “DashBoard has allowed us to create custom control panels for our teams at various events, and this has really helped to simplify the control of our productions”, comments Erber. “Our CG editorial position in Hamburg used DashBoard extensively, enabling us to pull content from various social media platforms

and feeds into our production. The ability to build custom control panels means that this operator doesn’t need to be a trained XPression operator – they’re working with a panel that has been made just for them and does exactly what they need”. The XPression operators in Hamburg also found DashBoard highly valuable. “Esports broadcasts are becoming increasingly complex and we have important sponsor obligations to fulfil. We have a tracking and time system that is triggered by DashBoard, ensuring that our sponsors get the visibility (time on-screen and in the right segment) that we’ve agreed. With one hot button in DashBoard we can meet all our sponsor commitments – everything is preprogrammed on the back end – and this helps make our productions much less stressful and more efficient to manage. We couldn’t do it without Ross!”


WLVT-TV upgrades its control room with the new FOR-A HVS-2000 switcher PBS39/WLVT-TV, part of Lehigh Valley Public Media based in Bethlehem, Penn., recently updated its control room with a FOR-A HVS-2000 HANABI video production switcher and a ClassX broadcast graphics system. According to FOR-A, the new equipment has improved the “on-air look of the station’s original programming as well as its live production workflows”. For example, the HVS-2000 can load and play back graphics from the ClassX system, so the TD can access them during live productions. As a result, PBS39’s graphics designer does not have to be in the control room for “routine” productions. “When I saw how well the two integrated together, plus the price, it seemed like an obvious choice,” said Andrea Cummis, chief technology officer.

WLVT TV director of production Javier Diaz demonstrates the station’s FOR-A HVS-2000 production switcher.

PBS39 equipped the new switcher with 32 inputs (expandable to 48) and a 3 M/E panel with 6 M/E performance, plus 4 MELite™ buses, which transform traditional AUX buses into fully functional M/Es. A “typical” PBS39 studio production is a four-camera shoot, but it is not unusual to have a 10-camera shoot that uses both studio and PTZ cameras. “We do surprisingly big shows here – and with the FOR-A switcher, it’s no problem at all,” Cummis said.

Furthermore, the ClassX broadcast graphics system also supports the use of RSS feeds for updating election results and other data in graphics, a feature that was not supported by the station’s older graphics system. Cummis said the system “worked great” for local primary election coverage last year, but she is looking forward to using the system to its full potential for Election Night in November.

FOR-A is the exclusive distributor of ClassX in North, Central, and South America.


Brainstorm and Spidercam join forces to provide enhanced AR for sports and live production events

Brainstorm, manufacturer of real-time 3D graphics and virtual studio solutions, will collaborate with Spidercam to develop augmented reality content for sports and other events in large spaces. InfinitySet, Brainstorm’s virtual set and augmented reality solution, “performs at its best” by using the camera view plus the tracking data provided by the different Spidercam systems.

“Brainstorm seems the perfect combination for Spidercam,” says Jan Peters, CEO of Spidercam. “The camera angles and flies Spidercam provides can also be enhanced with the real-time 3D augmented reality graphics by InfinitySet, providing enhanced storytelling to impress an increasingly demanding audience.” Brainstorm and Spidercam enjoy a considerable track record of mutual co-operation and have successfully deployed installations in events such as coverage of the WWE season, including Royal Rumble and other events.

According to Miguel Churruca, Marketing & Communications Director of Brainstorm, “Working with Spidercam has always been a pleasure, because the way they track big spaces allows our augmented reality elements to stand out thanks to the wide camera movements they can provide. The willingness and vast experience of their staff make the collaboration seamless on a human level. At Brainstorm we are always pushing the envelope when it comes to technology, and integrating with Spidercam seems only natural since they’re on the cutting edge of robotics camera technology”.


Ncam appoints Robin Shenfield as Chairman Ncam Technologies has announced the appointment of Robin Shenfield as Chairman. In this strategic role, Shenfield will apply his extensive experience in VFX and post-production to the changing world of real-time VFX, and work in collaboration with the Ncam senior management team and investors to achieve the company’s aims for sustainable global growth. Shenfield is the co-founder and former global CEO of The Mill, one of the world’s most awarded companies in advertising visual effects and production. Nic Hatch, CEO, Ncam, said, “Robin’s reputation as a highly respected and trusted individual precedes him. His deep understanding of the creative industries, combined with his experience of successfully building and running an investor-backed business and his eagerness to embrace new technologies, make him the ideal person to help us fulfill and exceed our global ambitions.” 


Dalet announces the appointment of Bea Alonso as Director of Global Product Marketing Dalet has announced that Bea Alonso is taking the lead of the company’s global product marketing group. Bea will oversee Dalet’s expanded product marketing team and technology partnerships. In the same vein, Bea will collaborate with Dalet’s product management, sales and corporate marketing groups to define and execute all go-to-market strategy plans across industries and verticals. Prior to joining Dalet, Bea spent the last two decades with technology vendors Avid and Grass Valley, joining Ooyala in 2016 to spearhead the media logistics business across the Asia-Pacific region and ultimately leading the company’s product marketing team. Deeply involved in the media community, Bea serves as an Advisory Board member for the DPP and RISE, an advocacy group for women in broadcast. Bea is also part of the organizing team for the Barcelona Video Tech community and an IABM board member.


Morten Aass joins NEP Norway as Managing Director NEP Norway has announced that Morten Aass has joined its organization, based in Oslo, as managing director, effective immediately. NEP Norway is a division of NEP Group. Aass, an industry veteran, will oversee all of NEP’s Norwegian operations, client relationships, broadcast production facilities and a team of more than 100. His primary focus will be to ensure the position, continued growth and success of the Norwegian company. Aass takes the helm at NEP Norway as Lise Heidal, former managing director, transitions to a new role

as NEP’s SVP of Global Media Solutions. In addition to being the managing director of NEP Norway, Aass will also be part of NEP’s European management team. Prior to joining NEP, Aass most recently served from 2018 as senior strategic advisor and content board member at

Nordic Entertainment Group, and from 2017 to 2018 as country manager for MTG (later Nordic Entertainment Group), where he was responsible for the TV3 and P4 group of channels, plus national streaming operations. Before that, from 2014 to 2018, he held several executive-level roles at Nice Entertainment Group, including as president and CEO for the company’s operations in 16 countries and across 28 companies. Throughout his 30-year career in the industry, Aass has demonstrated an entrepreneurial spirit and focus on driving growth through innovation.

MOG and SimplyWorks join forces to deliver end-to-end media technology solutions MOG Technologies has announced a new partnership with SimplyWorks covering the Middle East and Europe. The new partnership between MOG and SimplyWorks will focus on helping customers find the best solutions for their broadcast ecosystems, enhancing them with video cloud services from preparation to distribution, turnkey video appliances and SaaS solutions, which will open doors for mutual expansion.



MOVING FROM A TRADITIONAL BROADCAST PERSPECTIVE TO A PERSONALIZED USER UNIVERSE

TV stations must transform their strategies in order to adapt to ever-changing viewer conditions, as well as to remain indispensable agents within society. This axiom is embraced by public corporations such as Norsk Rikskringkasting (NRK) in order to deliver on its commitment to the Norwegian population. Television has undergone significant transitions since its inception and, over the next four years, will face one more: taking television viewing toward the hyper-specialization and customization increasingly widespread in the environment of OTT platforms. We wanted to know how this project is being tackled by NRK, so we got in touch with Heidrun Reisæter, Technology Director at NRK, who was kind enough to answer our questions.

By Sergio Julián. Photos: Ole Kaland, NRK





Interview with Heidrun Reisæter, Technology Director at NRK

NRK is a broadcaster that has been operating since 1954. If I’m not mistaken, you have always been audience leaders, even after the arrival of private broadcasters. How can you explain this amazing success? Do you think that being at the technological state of the art has helped you achieve your public service goals?

State-of-the-art technology has never been a driver for NRK, but it has been a very important means to reach our creative and ambitious content goals. In the early days of television that meant building our own mixers and microphones. These days it means developing a world-class offering for our audiences in the form of our online products. In the end, it is the ability to cooperate on both strategy and operations across content production, publishing and product development that is key. Everyone needs to

have a common vision of what we serve our audience and how we reach them in a changing media landscape.

What is NRK’s vision regarding the technical side of broadcasting? Do you continuously study both users’ needs and technology’s evolution to find that always-sought balance?

Yes, we find it important to seek this balance. Our

choices are based on user needs and strategy, not solely on new possibilities in technology. The mindset in NRK has shifted a lot through the last years, from being an organization dependent on the possibilities of what system vendors could offer, to an organization anchoring needs and values to the public’s needs before we decide on solutions. NRK is moving away from a project organization to a product organization with agile processes and continuous improvements.

I took a look at your NRK Corporate Strategy 2019 – 2024. How does this purpose translate into technical production elements?

As our users are moving to a world of on-demand content, we need to develop both how we produce content and how we publish it. Thus our strategy translates into more priority on non-linear content and how NRK can move from a traditional broadcast perspective to a personalized user universe. Content atomization and object-based publishing are the subject of ongoing innovative processes. For instance, lately we have developed new systems for podcast publishing, and we continually develop our OTT platforms.

How many production facilities does NRK have? What kind of productions do you do there? How is your main production center?

NRK is producing content from more than 14 sites daily, both from our head office and our regional offices. We have divided our production center into three areas: drama and field production; studio production, including radio and podcast; and OB and event production. NRK’s headquarters in Oslo does a lot of in-house production like News, Sports, Feature, Culture, Drama, Documentary, Education, Entertainment and Children/Youth shows. Our regional offices do mostly News, but the biggest offices have certain shows like nature, entertainment and children’s content.

At NRK’s headquarters in Oslo we have both studio and OB facilities. There are two main studios of 550 m² each: one is used for multipurpose and topical entertainment productions throughout the week, the other for block productions such as drama or entertainment. In addition, we have smaller studios specially designed for daily news and sport, and other smaller studios for children, youth, culture and so on. We have two bigger OB trucks mainly used for bigger live events such as concerts, entertainment and sports productions. We also have a flight-based OB unit used when suitable, especially when we do productions over time. In addition, we have three smaller OB trucks used for smaller events and sport productions, and three sound trucks.

What is your production standard currently? Do you offer 24/7 programming of HD content? Our production standard format for television is currently Sony XAVC 1080i25 for all productions except the Drama and Nature categories. Drama and Nature content is often shot in 4K or 6K. The mastering format spans from HD and UHD to UHD HDR, as in our latest nature series “Snow How”, which is a Nordic cooperation.

Back in 2016, Bjarne Andre Myklebust, Head of Distribution at NRK, stated that NRK was doing some drama productions in 4K, but that regular broadcasting in 4K was not planned. Has this changed? Most of our premium productions are produced in 4K and above; our playout quality, however, is broadcast in HD. We are making select drama series available in 1080 and are testing both 4K and HDR in small-scale tests on our OTT platform, NRK TV. “Snow How”, which is a nature series about snow, is shot in 4K and mastered in UHD HDR using the Avid DNxHR HQX UHD 50p format, and brought to our audience in 4K HDR p50 as a test of multiple qualities. This is our first 4K HDR test, which is available to our audience via our OTT service, NRK TV.

What technologies live in the NRK core? Do you trust a specific manufacturer for video capture, distribution, etc.? Video capture is in general done by Quantel sQ1800 servers, and EVS servers for studio and sports events. For playout in our main transmission we also have Harmonic Spectrum servers. NRK’s playout automation system, Abit, is up for replacement and we will start a replacement process in Q2 2020.

AR and VR have changed the way we understand content, especially when it comes to newsroom productions. Are you deploying these technologies? What graphic system do you use? NRK’s graphics have for many years been powered by Vizrt, which still delivers our most demanding AR and VR news graphics. We have been looking into the potential of using HTML graphics and have developed our own graphics product called NORA. We have also been exploring products such as Unity combined with Stype kit tracking for AR and VR graphics in our news and live sports productions when suitable. The same system is also used, when planned for, in bigger entertainment events.

In addition, today broadcasters have to handle massive amounts of data and information. That is where MAM systems arise to help technicians. Two questions: what MAM system did you choose, and are you applying an AI solution to help you streamline your workflow? Our chosen MAM system is Tedial, and it is in the final phase of implementation. We have not yet used AI to streamline our workflows, but it’s on the backlog of areas to investigate in the near future.

Now the big question: the IP transition. Do you plan to move from SDI technology to an IP infrastructure? Do you already use some NDI-type technologies in your workflow? NRK is in the planning stages of a transition from SDI to IP. We are planning to relocate our headquarters and are exploring the possibilities of moving to IP in that context. Our aim is to start with a small installation in our current headquarters, to build editorial, creative and technical experience. Currently we have smaller initiatives which use NDI, but we have not standardized on NDI or any other technology yet. One of our regional offices was used as a sandbox to test and establish an IP-based infrastructure and workflows, but our goal of using 2110 on the video side proved immature, with incompatible interpretations of the standard between vendors. Our radio IP technology runs on Lawo/VSM and DAVID Systems solutions. Our aim is to roll this out to the rest of our locations.

I’m sure that outside broadcasting must be quite a challenge given Norway’s orography and size. What systems do you use to guarantee these coverages? Do you trust OB trucks or vans? Are you deploying transmission “backpacks”? For news gathering we mainly use LiveU. For OB and live event production we use LiveU when possible and SNG when we have no other options. Access to fiber can be challenging, but it is the preferred way to get the signal home.

What’s your vision regarding remote production? Have you already done any tests? Are you currently producing content remotely? Our production from the Summer Olympics in London back in 2012 was produced remotely from Oslo as our first remote production. After that we produced a lot of smaller remote productions locally in Oslo. For radio productions we do weekly remote productions in the field, controlling the hardware through web UIs, using the Quantum XL as an extension of the Lawo audio mixer and mostly communicating over a Viprinet mobile connection.

I would like to know more about your OTT VOD platform. It cannot be accessed from foreign countries, so I didn’t have the chance to delve too deep. What’s your goal and what are you offering? What technological systems power this platform? Our goal is to deliver a streaming platform that is both able to gather the population for big live events and to recommend relevant content to each individual with a great user experience. On the platform the user can access all new programming but also a large archive of content from earlier years. Much of our programming is published on demand first. We have built most of our OTT platform ourselves; it runs on Azure infrastructure, with modules from the Microsoft portfolio, but the presentation, CMS and back-end integrations are built by NRK.

NRK is also a world reference regarding transmedia productions. The most relevant example I’ve heard of is, of course, SKAM. Do you still embrace these kinds of productions where technology, interactivity and narrative join forces? Yes, after Skam we have produced similar “real-time” dramas for the young target group, and we use these methods combining video, social media and user engagement when it serves our purpose. At the moment we are developing new and more interactive services to increase user engagement around TV shows like the Eurovision Song Contest and, at the same time, recruit logged-in users for our services.

What other digital services are you working on at NRK? Our main products that we develop for the audience are TV and Radio players for web and app, our news website, a kids’ player, and a weather service which is the world’s 5th largest. We also experiment with new services like atomized news briefs for smart speakers. With rapid changes in technology and user needs, we must constantly monitor what service development is needed to reach the public in the best way.

Two questions left. First of all, Slow TV, quite a fascinating concept. Could you tell us the main technical challenges you face when producing this type of content? Due to often very remote locations, it takes hard work and a lot of planning to get the signals home. The nature of these productions is also the continual changing of locations, which makes it difficult to maintain a stable connection. The result can be some signal dropouts, but I think our audience is forgiving on this matter; the most important thing is the live production and the local presence during the Slow TV productions. When planning the locations and spots, and when we know we are moving into difficult signal areas, we can send pre-produced content as a backup. The Slow TV productions are also a sandbox for concepts and solutions, like the backpack production unit for live TV we used when walking in the Norwegian mountains in 2018. We also tested MESH network technology to ensure enough bandwidth and live coverage.

What is the future of NRK? What are your next technological moves? Where are you heading? I guess the only thing we know for sure about the future is that technology, the market we operate in, and the expectations of our users will continue to change. At NRK we will do our best to adapt, both when it comes to organization, content development and the use of technology.


Welcome to one of the greatest sports shows that can be offered on live TV:



By Anto Benítez, Professor in the Communication Department at Carlos III University of Madrid and sports director



This year the opportunity to show excellence as a sports producer was given to Fox. Some circumstances made the event special: it was the centennial of the NFL, and Miami was the venue, which especially attracted a Latin audience. Fox Sports paid great attention to this aspect, scheduling for halftime stars such as Pitbull, Shakira and Jennifer Lopez; the one chosen to perform the US anthem was Demi Lovato. The people in charge at Fox were fully aware of what was at stake: this event has traditionally been considered, together with the Olympic Games, the Soccer World Cup and the Rugby World Cup, as a landmark ‘setting the trend for live sports’. This is evidenced by the deployment of more than one hundred cameras, which enabled producing special programming during the prior week from a set located in South Beach, in addition to a pre-game feature lasting over 4 hours, the halftime show and the full game coverage.

Super Bowl LIV could be seen through conventional networks: through the terrestrial network under the 720p standard in SDR and, where available, by cable and satellite in 1080p. And on this occasion Fox took good care to especially court streaming

audiences by offering a 4K HDR signal. This way, they were looking to give viewers a marked improvement in the visual experience, making them feel as if they were down on the field, something that had never been seen in a TV football game like this, even though Fox had already been testing this system throughout the whole season and, prior to this, had also offered MLB games in this quality. Their calculations, however, only contemplated 50 thousand users being both able and willing to use this premium connection, and comments posted on the net after the event showed astonishment and disappointment in equal measure**. Furthermore, this is not the first time the version offered on the internet clearly exceeded the quality of the traditional broadcast version, but it is definitely a breakthrough given how many media were focused on the event in this instance; this in addition to being streaming instead of Video on Demand, which points towards the future of distribution and the new bandwidth and latency capabilities that 5G will create.

Super Bowl audiences had been slightly decreasing over the past few years. Fox should be happy: as a result of the game expectation and the halftime show combined, little else could be asked of the development of the game itself, with excitement to the brim right to the last minute and magnificent individual performances, as if events themselves had been the work of good fiction scriptwriters.

And audiences rose to the challenge: amid a general trend of diversification of audiences and screens, as viewers flee from traditional TV, the Super Bowl keeps setting records in concurrent audience. The figures given by the people in charge of the channel themselves indicate more than 102 million viewers in the US between Fox, Verizon and the NFL platform combined, of which 100.7 million used the Fox network (99.9 million during halftime). Including the streaming figures, this is the first rise in the total figure in five years.



On this occasion, Fox’s policy reflected the comments posted by spectators: the event was prepared with the promise of one commercial fewer per block, while increasing rates to 5.6 million dollars. Even so, it seems that profits have followed suit and beaten last year’s amount, which reached US$335.5 million.

Deployment for the game

The data provided by the technical managers of Fox Sports for the game are overwhelming. Both Michael Davies, Senior Vice President, Field & Technical Operations, and Michael Drazin, Broadcast Operations and Engineering, made countless appearances in the media and explained that the compound deployed at the stadium included 13 mobile units and over 54 kilometers of wiring. The production standard was set to 1080p and HDR under the HLG BT.2100 (hybrid log-gamma; see the end of this article) format developed by NHK and BBC technicians. Video sources with lower capabilities were converted using AJA FS-HDR units before reaching all mixers and video replay devices. It has been mentioned that 70 cameras were used, manual and robotic cameras combined; 20 of them exclusively for the pylons and dedicated to the end zones; 24 high-speed cameras ranging between 180 and 1,800 fps. Cameras of very high resolution were also used, 3 of them 8K, of which one broadcast the whole pitch at that resolution for the first time ever, while the other two were used for benches and sidelines, with the capability of up to 12x zoom for replays. Additionally, 8 high-speed cameras reached 4K resolution and were dedicated to the end zones, aimed in particular at the high end zone and down the line. Two SkyCams were used, one of them overlooking from the stadium’s rooftop. 7 cameras were wireless, some of them mounted on a Steadicam or a mobile gimbal, and some others digital cinema cameras. As for the sound, 72 microphones were deployed on the pitch, among them wireless


microphones for the players. Several sub-mixes were made, the most complex one in 5.1 surround, to go along with higher-bandwidth signals in distribution. For graphics, resources from SportsMEDIA Technology were used to apply Virtual Reality (or Mixed Reality, if you prefer) to several cameras, including the goal-post robo-cams and one of the SkyCams. The graphics engine used was Epic Games’ Unreal Engine. The shots thus offered were really useful in the pre-game feature, most especially in the touching video paying tribute to many of the greatest players in these 100 years of competition. Said video was developed by inserting compositions previously prepared with the graphics engine, alternating with others, apparently live, taken from cameras deployed on the pitch.

The production

By looking at the screen it would seem, in a quick analysis, that the goal of a football director is an extremely simple one. As soon as the scrimmage is in place, and therefore a down is about to start on the pitch, the ball must be in view all the time until play stops. First of all, the action unfolds for the director at the same time as for viewers, thus configuring that particular narrative line of the live broadcast. A general view is shown taking in the two teams in full (22 players on the field) and the virtual line that must be reached in order to get a new down. The take gradually closes on the ball and focuses on the nearest players as soon as play commences. An average down can last between 6 and 8 seconds. The take is maintained in order to appreciate how the action was resolved. Then the play is explained through close-ups showing the reactions of the main players and, if applicable and provided there is enough time, replays are shown from different angles to analyze the elements that took part in the sports action, either the last play or the most important recent or related action. Then close-ups of the protagonists, or takes of the coaching staff involved or of the huddle, while preparing for the next down. And all over again.



This description seems very simple, but getting everything right is not. In the first place, the actors involved may be quite a few: players on the field, technicians and players on the bench (a team may have up to 53 players ready for action), spectators and relevant people in the bleachers, referees and members of the organization. And it may be the case that every action on the pitch draws a reaction from each protagonist. Furthermore, consider that in a normal pass the ball can travel 30 yards away from the QB, and the player receiving it can then run with the ball. This would cause the two main players in the action to have simultaneous reactions while being 50 meters apart. Following this logic, more than 100 evident and identifiable reactions can be counted, of which surely a dozen would be directly relevant after an initial review; this gives rise to a scenario of difficult decision-making within tenths of a second. And this difficulty is taken to the screen as

deciding on the point of interest at every instant also means limiting the audiovisual space: the framing and focus of each one of the human-operated cameras. These are decisions that need to be taken live, while others involve negotiating upon reviewing recorded material. In a game featuring so many cameras (about 70) at least 40 will be used only for replays. The footage* that remains undisclosed is dramatically greater than what is shown on the screen at a given time. Surely the director is under the relentless pressure of having to discard 35 fantastic shots offered by his team about a reception and a rush that are an incredible display of elasticity, power and coordination by the runner, and of technical skill, composition and aesthetics by the camera operators. Based on the example described above, only two make it to the screen, one of them recorded at high speed, so a three-second action has taken at least nine seconds of screen time. They had better be the best that can be displayed. In this fashion, producer and director reach agreements on work procedures for videos and cameras, and the director then designs a plan that is conveyed to collaborators. In the first place, in view of his experience and knowledge of the game, a



director must be able to translate the indetermination of an apparently unpredictable sport into a set of situations that can be planned. He must conceive an ideal script and allocate a task to each individual camera operator for each of said situations, bearing in mind that variables such as the spot where the action takes place (for example, close to the end zone), the player involved (a rookie, a star) or the time in the game (the end of a close game) will have an impact on decision-making. One possibility could be to allocate, while the ball is in play, tactical missions to a number of operators with joint framing (all backs during the whole down, for instance) or technical-aesthetic missions with takes carrying out individual (isolated) monitoring of each player. The usual practice is that whenever the ball is still, missions are allocated so as to be able to show close-ups with reactions of the different protagonists. A protagonist may have a proper name or simply play a role: the player running with the ball, the tackling player, the one who missed, the one who scores, the one in the Tight End position, the blocker... Likewise, the producer and the director must have outlined and agreed on a plan that allows for


quick decision-making on which of these missions are to be recorded and how to choose the best takes that can explain, analyze or give rise to comments on the play; usually, plans are also drawn on how to create, under various criteria, playlists to be played out during the game, or to organize unseen material for later highlights. What is clear at this point is: the larger the deployment, the higher the number of decisions to be made by more people under pressure. It is therefore necessary that the goals set for each individual member of the team are as clearly defined as possible and that potential problems are tackled in a common way. In other words, the director must have designed, based on his catalog of foreseeable situations or script, a set of instructions for collaborators: ‘If, in a given situation, the player being followed goes down, then...’. And, for anything that may deviate from an algorithm-based decision-making system, another one must be set up for heuristic decision-making, capable of instantaneously discarding unfeasible or overly complex solutions but also allowing the taking of risks which, if successfully overcome, may render exceptional results.
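The “script of situations” described above can be imagined, very loosely, as a rule table mapping game situations to camera missions. The sketch below is purely illustrative: the situation fields, camera names and rules are invented for this article, not Fox’s actual protocol.

```python
# Hypothetical sketch of a director's "situation script": each rule maps a
# game situation to per-camera missions. All names and rules are invented
# for illustration only.
from dataclasses import dataclass

@dataclass
class Situation:
    near_end_zone: bool
    ball_in_play: bool
    star_player_involved: bool

def assign_missions(s: Situation) -> dict:
    """Return a camera -> mission mapping for the current situation."""
    missions = {"cam_main": "wide shot of both teams and the virtual first-down line"}
    if s.ball_in_play:
        missions["cam_iso_1"] = "isolate the ball carrier"
        missions["cam_tactical"] = "joint framing of all backs during the down"
    else:
        missions["cam_iso_1"] = "close-up reactions of the last play's protagonist"
    if s.near_end_zone:
        missions["cam_pylon"] = "fixed on the goal line for possible review"
    if s.star_player_involved:
        missions["cam_iso_2"] = "stay on the star player, even off the ball"
    return missions

# Example: a down starting near the end zone, no star player flagged.
plan = assign_missions(Situation(near_end_zone=True, ball_in_play=True,
                                 star_player_involved=False))
print(len(plan))  # 4 missions allocated
```

The point of such a table is exactly what the text describes: turning an open-ended sport into a finite set of pre-agreed decisions, leaving heuristics only for what falls outside it.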




The game

The human team in charge of putting the game on screen was well coordinated and had long experience. For producer Richie Zyontz this was his sixth Super Bowl, while for director Rich Russo it was his fourth. It was very clear to both that the teams fighting for the Super Bowl were capable of moving at great speed. Some of the weapons of both teams quickly became noticeable as forecasts and expectations were met. But something changed in the final quarter. The Chiefs were trailing on the score and decided to step up the pace, as time (9 minutes to go) was running against them. They were forming scrimmages so quickly, rushing to put the ball in play again, that there was no time for replays. On several occasions the TV narration was forced to get back hurriedly to live play from the explanations being given (for example, at 1st down and 10 with 7 minutes to go), and not even the 1st and 10 system had time to sync and display the new position on screen. Adapting to this new sports situation was required: a change of plans.

In this new scenario, the Niners felt the pressure applied by the Chiefs, who came within a three-point gap and, with a bit over five minutes to go, retrieved possession of the ball after a good succession of defensive stops that forced the Niners to kick the ball away. The Chiefs pushed in several drives and reached 3rd and goal with nearly three and a half minutes to go.

And here came the play for which the game will probably be remembered: Mahomes launched a five-yard pass to the right to Damien Williams, who rushed and, with the opposition of Richard Sherman, crossed the corner of the end zone. The referees closest to the action called a touchdown. The sequence is as follows: a general take on the play is shown; then it closes up on Williams, who receives the ball, is pushed outwards at the corner and scores; the frame isolates the running back as he goes back through the end zone celebrating with some dance steps, when his teammates come running to embrace him; then, a take from the Steadicam positioned just by the celebrating bunch. After the replays nothing is yet clear (see the end of this article). The referees consult each other and also with the Art McNally GameDay Central and, not being able to find any indisputable evidence against the call, they confirm the touchdown. After scoring the extra point, the Chiefs are now 4 points ahead with 2:44 minutes to go, thus forcing the Niners to score a touchdown to win.

Fox Rules Analyst Headquarters

In summary, Super Bowl LIV was a TV show of the highest level that lived up to the expectations raised by the anniversary being commemorated. With overwhelming figures and some pioneering gestures towards technologies that will likely be integrated into production routines in the near future, such as HDR and the usage of graphics engines for generating mixed-reality and live-image combinations, just contemplating this deployment was a real show in itself. There is no question that the effort made by Fox Sports will not be limited only to the 50 thousand screens that were foreseen to connect via top-quality streaming; surely this will have fueled the sale of 4K devices and sticks as well as other streaming connection devices. The NFL has turned 100: long life to the Super Bowl too!


What does broadcasting in 4K UHD HDR, HLG BT.2100-2 mean?

Announcing that Super Bowl LIV is offered on a conventional network in 720p means that the resolution being broadcast is 1,280 pixels in width and 720 pixels in height, so the total display area is 921,600 pixels. If 8 bits are used to display color on each pixel, the amount of bits needed per image is 7,372,800. The standard involves 60 images per second, so the amount of bits per second is 442,368,000, which translates into a required bandwidth of approximately 55.3 MBytes per second. For 1080p, frame resolution is 2,073,600 pixels; at 8 bits per pixel: 16,588,800 bits per frame; 995,328,000 bits per second at 60 fps, which makes a total of 124.4 MBytes per second.

Now, when it comes to the 4K UHD format, this means 3,840 pixels wide by 2,160 pixels high, which makes 8,294,400 pixels per frame. In this case, the information reserved for color, this being one major difference, can reach 10 or 12 bits. Assuming 10 bits, in order to meet the minimum required, the system would have 4 times as many values for displaying the different colors as when working with 8 bits. With 10 bits, 82,944,000 bits must be reserved for displaying each image. But 60 frames per second are needed, so the calculation amounts to 4,976,640,000 bits per second, more or less 622 MBytes/sec: 11 times the bandwidth of the 720p standard (not taking into account compression schemes or color subsampling, which can dramatically decrease this figure without noticeable quality loss).
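The arithmetic above can be reproduced in a few lines. Note that this follows the article’s simplified model (a single bit-depth value per pixel, no chroma subsampling or compression), which overstates real delivered bitrates:

```python
# Raw (uncompressed) video bandwidth, per the simplified model used in the
# article: one bit-depth value per pixel, no chroma subsampling, no compression.
def raw_bandwidth(width, height, bits_per_pixel, fps):
    """Return (bits_per_frame, bits_per_second, mbytes_per_second)."""
    bits_per_frame = width * height * bits_per_pixel
    bits_per_second = bits_per_frame * fps
    return bits_per_frame, bits_per_second, bits_per_second / 8 / 1_000_000

print(raw_bandwidth(1280, 720, 8, 60))    # 720p:  (7372800, 442368000, 55.296)
print(raw_bandwidth(1920, 1080, 8, 60))   # 1080p: (16588800, 995328000, 124.416)
print(raw_bandwidth(3840, 2160, 10, 60))  # 4K/10-bit: (82944000, 4976640000, 622.08)
```

The 4K/10-bit figure divided by the 720p figure gives the “11 times the bandwidth” ratio quoted in the text (622.08 / 55.296 ≈ 11.25).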

Taking into account pending challenges and potential obstacles, Fox Sports made the following assumptions:

- There was not a large audience able to watch Super Bowl 2020 at the highest quality, 25 Mbps, on screens supporting 4K and HDR; in fact the estimate was 50 thousand users.
- Available bandwidth was not enough and, in this instance, broadcast had to be done with much lesser resources (divided by four in the best-case scenario).
- According to the experience of its technicians, there is a relationship between spatial resolution (number of pixels per frame) and temporal resolution (number of frames per second), and they concluded that movement in 2160p at 60 frames per second is seen with motion blur due to an undesired pixel-offset effect.
- Fox Sports had been performing tests all season long, with enough time for experimenting and satisfactory results.
- They chose to produce in 1080p; broadcast in 720p SDR (Standard Dynamic Range) over the air, in 1080p for cable or satellite where capacity was available, and in 4K HDR where streaming bandwidth sufficed.

And in order to achieve its goals, especially HDR, without an impact on its normal production routines, the following workflow had been foreseen:

- Video sources without native capability go through AJA converters.
- All signals, whether from HDR sources or not, are converted as a minimum to the HLG BT.2100 standard, even for distribution to replay devices and mixers within the OB van.

The HDR in question means High Dynamic Range: a much greater dynamic range is available, many more values, for representing the various colors. In practice this means that many more details can be displayed on the screen, distinguishable both in high and low lighting conditions (these being the ranges most flattened by

traditional display systems, not only on television but also in photo-chemical cinema), and also that some hues can be achieved that had not been possible until these technological resources became available. The gamut (the set of color hues that can be obtained and displayed) of HDR is greatly extended compared to the traditional range.

Recommendation ITU-R BT.2100-2 (see!!PDFE.pdf) specifies the valid image parameters for use in production and exchange of TV programs in HDR. It defines two methods: PQ (Perceptual Quantization) and HLG (Hybrid Log-Gamma). The former is conceived for direct application to image production or acquisition processes, to cameras, while the latter has been designed for activation at the end of the process, on the screen.

Based on the fact that only one in every four pixels is genuine, user responses have been very uneven: some opinions declared this had been the best Super Bowl they had ever seen from a visual standpoint, as intended by Fox Sports; others, in a similar percentage, said the effort was not worth the trouble. At any rate, all initiatives planned with sound judgment and aimed at enhancing the visual or audio experience of viewers using the available technology should be welcome. And, to make them more useful, it would be fitting that their promoters shared their conclusions honestly, so that we can all win in the development of better windows for the audience.
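For the curious, the HLG curve referenced above can be sketched from the constants published in BT.2100. This is a simplified illustration of the OETF alone (the full HLG chain also involves the OOTF and EOTF, applied per color channel):

```python
import math

# Sketch of the HLG opto-electrical transfer function (OETF) from ITU-R
# BT.2100: normalized scene light E in [0, 1] -> non-linear signal E' in [0, 1].
# Constants are the values given in the Recommendation.
A = 0.17883277
B = 0.28466892  # equals 1 - 4*A
C = 0.55991073  # equals 0.5 - A*ln(4*A)

def hlg_oetf(e: float) -> float:
    """Map normalized scene linear light to the HLG signal value."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)          # square-root segment (SDR-compatible)
    return A * math.log(12.0 * e - B) + C  # logarithmic segment for highlights

# E = 1/12 maps to 0.5 and peak E = 1 maps to approximately 1.0, which is
# what makes the lower part of the curve compatible with SDR gamma displays.
print(hlg_oetf(1.0 / 12.0), round(hlg_oetf(1.0), 4))
```

The square-root lower segment is why HLG pictures remain watchable on legacy SDR screens, while the logarithmic upper segment carries the extra highlight detail HDR displays can reproduce.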



The play of the game

This is the most exciting moment in the whole game. On screen, the figure of Damien Williams, a graphic showing his name and his face sketched by a graphic artist, and a Steadicam shot. On the take offered by the Steadicam, his teammates form a circle around him, celebrating the touchdown that may win for the Chiefs their second NFL title in the franchise’s history. But those who saw the action remain in doubt: did he succeed in getting part of the ball beyond the goal line’s vertical plane before stepping out of the field with his right foot, or did he actually step out before crossing the line? Viewers eagerly await the replays, and several are provided. From a high-speed camera aligned with the end line at the corner (the one closest to the main cameras), one can see that Williams got the ball past the goal line, but this take does not allow checking whether this was before or after stepping out. The pylon’s fixed camera shows Williams’s feet and the effort by Richard Sherman to push him out, but this take misses the ball in frame. The take from the SkyCam, behind the quarterback, lets us see the tactics for the down perfectly, but does not allow for conclusive details. The high-speed replay offered by the camera on the left-hand line covers the action from the back of the Chiefs’ attack, allowing us to see very well that Williams stepped on the outside line as well as to ascertain that the ball crossed into the end zone, but not to see which happened first. Also at high speed, another camera placed in the bleachers opposite the end zone shows the action again without clearing any doubts; but it also shows that the operator of the camera aligned with the end line is aimed at the pitch, probably focusing on Mahomes*, and also that the official is perfectly positioned, better than anyone else, to appreciate the action. But what is required of him is perhaps impossible for a human being: being able

to establish, live and instantaneously, the position of the football by millimeters on the length and width axis on the plane of the field as well as the position of the foot stepping on the line. In spite of the referee having his own perception and quickly calling it a touchdown, the tool assisting in decisionmaking is based on video: instant replay. NFL rules establish that reversing a decision made by the referees on the pitch is only possible “when the Senior Vice President of


composition of synced cameras showed at the same time both lines and Williams's foot (on the pylon, for instance), although it must be taken into account that obstacles -although not impossible to overcomemight exist to sync each frame in the footage of cameras recording at different speeds. Kansas City Chiefs running back Damien Williams (26) runs for a touchdown during the NFL Super Bowl LIV football game against the San Francisco 49ers, Sunday, Feb. 2, 2020 in Miami.

In any event, NFL, since the new pylon cams began to be used, warns in its website

Officiating or his or her designee determines that clear and obvious visual evidence warrants a change” (see the-rules/2019-nflrulebook/). Certainly the takes shown by Fox Sports do not clearly establish that the decision made by the referees should be reversed. Therefore, Senior Vice President Al Riveron was not able either to see anything definitive at the Art McNeally GameDay Central, where all queries –as well as the audienceare centralized.

Many viewers have voiced their disappointment in social media, as even though a huge number of cameras were deployed for the game –and most specifically at the end zones- it has not been possible to clearly show how the action really was. From the author’s point of view, based on the available resources, only two solutions were possible: That the Sky Cam were exactly positioned on the pylon on the corner and recording at high speed, or that a

( /the-game/technology/nflpylon-cameras/) to whoever may want to read it: ”The new angles mostly enhance the viewer’s experience, but are unlikely to be the panacea some want in the instant replay process. While a player’s feet may be shown or if the football crossed the goal line, the pylons may not do both at the same time, which is what is required in the replay process”.  45
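The frame-sync obstacle mentioned earlier can be made concrete with a small sketch. Assuming the cameras share a common time zero (an assumption; genlock across such a mixed camera fleet is not something the broadcast confirmed), mapping an event seen on one camera to the nearest frame of another is just a timestamp conversion. The real difficulty the arithmetic exposes is that, at hypothetical rates of 240 fps versus 59.94 fps, roughly four high-speed frames collapse onto each frame of the slower camera, so the decisive instant may simply not exist in the slower footage:

```python
from fractions import Fraction

def nearest_frame(event_frame: int, src_fps: Fraction, dst_fps: Fraction) -> int:
    """Map a frame index from one camera to the nearest frame index of another,
    assuming both cameras share a common time zero (i.e. they are genlocked)."""
    event_time = Fraction(event_frame) / src_fps   # seconds since time zero
    return round(event_time * dst_fps)

# Hypothetical rates: a 240 fps high-speed camera and a 59.94 fps pylon cam.
hi_speed = Fraction(240)
pylon = Fraction(60000, 1001)   # 59.94... fps, the US broadcast rate

# Frame 1000 of the high-speed camera (about 4.17 s in) lands nearest to
# frame 250 of the pylon cam:
print(nearest_frame(1000, hi_speed, pylon))  # 250

# Frames 998-1001 of the high-speed camera collapse onto pylon frames 249-250,
# so three distinct high-speed instants map to one and the same pylon frame:
print([nearest_frame(f, hi_speed, pylon) for f in range(998, 1002)])  # [249, 250, 250, 250]
```

If the cameras are free-running rather than genlocked, a per-camera time offset has to be estimated first (from a common visual event, for instance), which is exactly the kind of obstacle, not impossible to overcome, that the text alludes to.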




Game of Thrones, The Mandalorian, Lost in Space, Jurassic World: Fallen Kingdom… The portfolio of the production company El Ranchito, established in 2004, is impressive. Dedication, talent and a broad team of professionals have enabled them to tackle multiple audiovisual projects; so much so that recognition has not taken long to come, in the shape of numerous awards: several VES (Visual Effects Society) awards, one HPA (Hollywood Professional Association) award and even one Emmy. We interviewed Gonzalo Carrión, its General Manager, who showed us around this exciting world.



The Mandalorian

How was El Ranchito born and how has the company developed? El Ranchito was born in 2004, the product of a group of professionals with long and proven experience in the visual effects sector. Right from the outset, El Ranchito took part in domestic projects with a clear international projection, such as “Fragile: A Ghost Story” or “Agora”, films

that gave the company its first two Goya awards. In 2012, thanks to “The Impossible”, El Ranchito got its first international prize, a VES award from the Visual Effects Society. This was the beginning of an unstoppable international track record. The company started collaborations with production firms such as HBO or Netflix in projects such as “Game of Thrones” or “Lost in Space”.

Nowadays, El Ranchito boasts one Emmy award, five VES awards, eight Goya awards, three Gaudí awards and an HPA award from the Hollywood Professional Association and has created visual effects for productions such as “The Mandalorian” and “Jurassic World: Fallen Kingdom”, and for commercials for companies such as El Corte Inglés, Zara, Coca Cola or Mercedes Benz.


El Ranchito was created in 2004. How would you analyse the development of the broadcast scene since then? Have you identified an increasing need for post-production work in TV shows? In the field of visual effects, which is what we can assess, the trend is clearly upward. Nowadays hardly any productions for the film or TV industry are

made without some kind of visual effects. VFX allow things to happen that would be just unthinkable or very hard to create physically; however, they also help to boost the result of real special effects such as fire, water or explosions, alter streets and cities in order to adapt them to the action's time setting and, most especially, reduce costs, as thanks to VFX there is no need for

building large sets or hiring hundreds of extras for the sequence of a battle or a sports stadium full of spectators.

How did the first international opportunities come up? Are there any big differences in workflows between a domestic production and a production from abroad? Although before 2012 we


had already taken part in some international coproductions and productions such as "The Expendables 2" or "Che", the real opportunity to become an international VFX provider was given to us by "The Impossible". From then onwards we began to be called to take part in international productions. The first one was “Cosmos: A Spacetime Odyssey” and shortly

Lost in Space.


El Ranchito boasts one Emmy award, five VES awards, eight Goya awards, three Gaudí awards and an HPA award from the Hollywood Professional Association and has created visual effects for productions such as “The Mandalorian” and “Jurassic World: Fallen Kingdom”


afterwards came “Game of Thrones”. As is to be expected, there are differences between domestic and international productions, but to an increasingly lesser extent because, fortunately, platforms such as HBO, Netflix or Movistar and TV stations such as TVE, Antena 3 or Telecinco are producing series with budgets that a few years ago seemed

inconceivable. For El Ranchito there is a clear difference: international projects normally rely on many post-production firms for a project and we are just one more piece in the jigsaw, while domestic projects usually rely on a single post-production firm and we take care directly of VFX supervision. However, this is changing as productions become more complex

and we are getting increasingly closer to the international model.

You work on all kinds of productions. Focusing on two of them, what are the main differences between working on a series and on a film? The basic differences are time, demands and volume. Films normally allow more time but are more demanding. Series



afford less time but normally carry a large workload, which results in a much quicker, less demanding production, although quality in certain series is already quite close to that achieved in films. In advertising, finally, times decrease dramatically, but work volumes are relatively small.

We recently held an event about production at which the issue of the increasing importance of post-production from the initial stages of pre-production came up. Is this true? Do you get involved in the processes right from the start? Of course. Some of the services offered by El Ranchito are filming supervision, previsualization and concept art creation. If we supervise a project we are involved in it from the beginning, but we have also been present at the filming of projects such as “Game of Thrones” as one more post-production firm.

What has been, in your opinion, the most

The Impossible

relevant technical innovation in the field of post-production in recent years? Regarding visual effects, there have been several. For example, the whole issue of digital de-aging, so fashionable these days because of “The Irishman”; and most

especially the creation of virtual actors, which will enable films to be made with already deceased actors. This is a reality that has begun to transform the cinema industry. Also very important is the use of real-time rendering for creation of hyper-realistic CGI sequences in Virtual Production. These kinds of


applications dramatically reduce post-production times and, accordingly, decrease costs.

How do you manage your work? What are your premises like? What resources do you have available in your daily operations?

We have offices in Madrid, Barcelona and Tenerife and commercial

branches in Los Angeles and Mexico. We also have a film set that we mainly


use for motion capture. Then we have a production department in charge of managing workflows, a compositing or 2D department, a 3D department, an editorial department and departments for matchmove, matte painting, animation, capture, rigging, crowds, FX, IT, and research and development, as well as an advertising department. In total we are a team of 140 people. Furthermore, we have a render farm of some 150 machines.

Do you have any software of reference or do you just adapt to each production requirement? Which are they?

We use Nuke for compositing, Maya and Substance Painter for 3D, Houdini for FX, Photoshop for matte painting, Silhouette for rotoscoping, DaVinci Resolve for editorial, Golaem for crowds and Shotgun for production, amongst a wide array of other applications. We have also begun to use Unreal in some projects.

What are the hardest technical challenges you usually face? They vary from one project to another. The first season of “Game of Thrones” in which we worked was quite a challenge. Yet, each season has exceeded the previous one in

complexity! In “The Impossible”, water was also a challenge because it is, probably, the most complicated effect. Creatures in projects such as “A Monster Calls”, “Cold Skin” or the alien buffaloes of “Lost in Space” were also a complex challenge. Each project means a challenge and that is the most exciting part of our job.

Do you currently use cloud-based solutions, either for remote work or for sending assets? We have several cloud-based solutions integrated in our pipeline, tailored to the needs of each project.

John Wick Parabellum.



How do you solve the demands from current OTT platforms to closely monitor the final outcome for each episode in a series? By having a daily conversation with them and by showing them the development of our work. We work side by side with our clients, providing creative solutions in an extremely challenging and complex environment.

What has been the biggest challenge you have faced at El Ranchito? We have faced many, but most probably the biggest one was the first season of “Game of Thrones” in which we took part: the famous battle in the snow between men and white walkers in the “Hardhome” episode of season five. It is not that this was our most complicated work, as we have done more difficult things for the series since; but it was our first time, and when everything is new the challenge is always greater.

In your opinion, what is the future of post-production? I think the future will be shaped by an increasingly smaller difference between cinema and television. And I think that the use of actors virtually created from a real actor – either

deceased or alive – will completely transform the outlook, as determining who holds the rights to exploit the image of a virtual actor will become a major issue. The story told back in 2013 in Ari Folman's film “The Congress”, about an actress signing a contract so that a studio could make films with her in them without her actually being required to be physically present at filming, could eventually become real. Maybe the winner of the best acting awards of the future will be an animator and not an actor or an actress!

Jurassic World: Fallen Kingdom



This is how production of the year's most anticipated awards ceremony was carried out

23.6 million TV viewers in the US alone witnessed the Oscars 2020. The biggest event celebrating global cinema tries to renew itself every year by implementing new technologies and adding spectacle. The aim is clear: to capture the attention of spectators during the three-hour event. The broadcast involves many challenges: from the unpredictability of a live show to the distribution of the signal across the globe, as well as the management of the massive amount of audiovisual resources used throughout the ceremony. TM Broadcast was able to get an exclusive interview with Tim Kubit, Engineer in Charge, who explained to us in detail how every single challenge was successfully overcome.

What is the main technical challenge of the Oscars? One of the primary technical challenges in design and implementation of technology for the Academy Awards is finding ways to custom build or modify existing products

to use them in ways they were perhaps not designed to be used. Some examples include building custom through-the-lens teleprompters to operate either in a very confined space, or so exceptionally large that they can be read from a great distance away. Custom complex curved track for use with manned and robotic cameras, extremely low-latency video and audio transmit and receive paths for remote Orchestra performances, and managing latencies in mixed-format cameras and screen surfaces are some


of the others. Pushing the edge of what can be accomplished with existing technology is always at the forefront in the design.

It is, probably, one of the biggest live productions of the year. How do you manage to distribute the signal worldwide? As is always the case in any large endeavor like the Oscars, it doesn’t


happen without teams of Engineers and Logistics folks herding it along, every step of the way. Most folks are aware that it airs domestically in the United States on The American Broadcasting Company's system, but it is also carried by over 200 International distributors. This happens with a combination of terrestrial fiber and extra-terrestrial satellite feeds to reach every corner of the globe. No matter where you are during the broadcast, you can most likely put an antenna in the air and tune in. The Main domestic show is produced from the Main broadcast truck at the Dolby Theater. The International feed mobile unit is located in another lot, some distance away from the main show truck, supported with fiber signal paths of the Main show and various cameras providing “cover” shots for areas within the production that are Domestic sponsored segments, not for International distribution. The output of the International truck is

actually what feeds the rest of the World.

How much time do you need to prepare the coverage of The Oscars? The Oscars planning for the next year pretty much starts the morning after the event. The pace rapidly accelerates about 90 days out from the show. That acceleration continues to the morning

of the Nominations announcements when the Production team finds out who and what has been nominated. After the Nominations have been announced, things really start moving with trying to book musical talent for the nominated songs, and production elements to support groundbreaking features of the Nominated films for that year.


In addition, many graphics are implemented during the broadcast of the ceremony. What technology have you chosen? In the Oscars, a lot of video content is deployed during the ceremony. What systems do you deploy to edit these assets and finalize the content? There are a voluminous number of clips on the show. Although some people believe that the Production team is aware of the winners before the broadcast, they really aren't. We find out who won at the exact same time as the rest of the World. Since that is the case, every winner clip and every package has to be built out completely, multiplied by however many people are nominated in each category. That workload, combined with the required clips to support the broadcast, came to 40 Terabytes of storage this year. We have built a direct fiber loop to the Post house working 24/7 on renders and re-rendering of content, which is then transferred to us at the Dolby Theater on a 10Gb network link. Content is reviewed using an asset approval system, a secure proprietary system that allows Producers to make notes about changes that are stored in the Metadata of the clip so it can be turned around as quickly as possible.

There are actually multiple versions of graphics. There are the graphics built into the clips for the screen content, which are managed by a series of servers; then there are the electronic broadcast graphics used for the lower thirds, which come from a team of Electronic Graphic operators. Aside from the graphics used in show,


The Oscars ceremony for 2020 utilized 21 cameras: a selection of Studio cameras, handhelds, RF Steadicams, Technojibs, Towercams, hi-speed winch cameras, and a Moviebird platform.

there are also people dedicated to live effects. The 5-window boxes that reveal the individual nominees and resolve to the winner are a product of that department.

People are more and more accustomed to 4K/HDR or even Dolby Atmos-like technologies when they enjoy content at home. What transmission standard will you deploy for the 2020 coverage? The show airs on the ABC Network, which has an HD Broadcast standard of 1280x720 progressive at 59.94 frames per second. As such, the show is transmitted in that

format. Although UHD and greater video resolutions, High Dynamic Range and three-dimensional audio experiences like Dolby Atmos are all emerging technologies, a substantial amount of the viewing audience has not yet adopted them. Although there are many emerging technologies used in the production of the show, it remains in the native 720p 59.94 transmission format for the US Domestic markets. The International markets also receive the show in 720p 59.94 and trans-convert to the standard of their native countries for broadcast there.
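As a rough illustration of what that trans-conversion implies, the sketch below maps each output frame of a hypothetical 50 fps European feed to the nearest frame of the 59.94 fps source. This is a naive nearest-frame model, not a description of any converter ABC's partners actually use; it simply shows why roughly ten source frames per second can never reach a 50 fps viewer:

```python
from fractions import Fraction

SRC = Fraction(60000, 1001)  # 59.94 fps, the 720p US transmission rate
DST = Fraction(50)           # hypothetical European 720p50 output

def source_frame_for(output_frame: int) -> int:
    """Nearest-frame standards conversion: the 59.94 fps source frame a naive
    converter would show for a given 50 fps output frame."""
    return round(Fraction(output_frame) / DST * SRC)

# Over the first 50 output frames (one second of output), 50 distinct
# source frames are shown...
used = {source_frame_for(n) for n in range(50)}
print(len(used))  # 50

# ...out of the roughly 60 source frames covering that interval, so about
# 10 source frames per second are simply dropped:
print(len(set(range(60)) - used))  # 10
```

Real converters interpolate between frames rather than dropping them outright, but this arithmetic is why motion never looks quite identical on both sides of a 59.94-to-50 conversion.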

The Oscars is a huge event. How many cameras are used for the ceremony? The Oscars ceremony for 2020 utilized 21 cameras: a selection of Studio cameras, handhelds, RF


Steadicams, Technojibs, Towercams, hi-speed winch cameras, and a Moviebird platform.

What other technologies are part of the production (mics, intercom system)?

As you can imagine, with the variety of performances needing audio support and the challenges that always exist in finding a balance between the needs for a good audio mix in the

house compared to that of a good audio mix for the home viewer, we are constantly working with the latest RF microphones and PA equipment. It goes without saying that these RF systems require

The 92nd Oscars® at the Dolby® Theatre in Hollywood, CA on Sunday, February 9th, 2020. Credit: Terekah Najuwan / ©A.M.P.A.S.


Regina King presents the Oscar® for Actor in a Supporting Role during the live ABC Telecast of The 92nd Oscars® at the Dolby® Theatre in Hollywood, CA on Sunday, February 9, 2020. Credit: Blaine Ohigashi / ©A.M.P.A.S.

constant adjustment and very careful RF Coordination between all of the shows/venues. Intercom is also an ever-expanding department as, without good, reliable communications, even a very well-rehearsed show will fall apart in seconds. In the case of the Oscars, there are 4-wire communications established between the Main show, Pre-Show, Web

show and International trucks. In addition to the 4-wire interconnects, there is a myriad of RF Communication packs that all need careful RF coordination to keep in working order.

Do you have any technology partners involved in the production? Throughout the years, there have been many technology partners.

Many manufacturers are desirous of highlighting their products within the body of the show. In each case, there can be challenges in implementation that often require us to “think out of the box” for solutions.

What technologies will be deployed at control room? Are you using an OB Van/truck for the main control room of the production?


The entirety of the project requires several sets of OB Vans. 7 in total for this year, to cover production needs for the Pre, Main, Web, and International shows.


Have you ever considered remote production for future ceremonies? Although Remote Production as a concept and in the practical sense will continue to evolve

and become more and more popular, a show like the Oscars evolves and flows so quickly that without the engineers, Production teams and operators being in close proximity to each other, it would be a much more difficult task. The concepts of Remote Production in Sports coverage and other similar projects are, in my opinion, closer at hand than that of a large-scale entertainment show. The day will come when that is no longer true, but at present there is a whole lot of the creative process that involves close interaction between all of the participants – more so than dependable intercom can provide.

In your opinion, which state-of-the-art technologies could be deployed in future editions of the Oscars? There are many ways the Production can benefit from emerging technologies. Certainly, UHD video resolutions or greater, increased color gamuts in High Dynamic Range systems, High Frame Rates and 3-D audio will all play into future broadcasts in due time, as market penetration for those technologies increases. At the Oscars, we are always looking to push technology limits while closely monitoring the overall risk and reward for that effort.




Cybersecurity, “nothing” new in our industry

Cybersecurity is already having an impact on professionals from nearly all industries, including broadcast. And the solution is not always a technological one: implementing the required protection without the right policy and adequate procedures in place will be altogether ineffective. By Yeray Alfageme, Service Manager, Olympic Channel



Our broadcast industry has gradually been relying less on dedicated hardware and has widely moved to standard IT infrastructure, which in turn has opened the door to the reality of cybersecurity. At the same time, and mostly due to

the high level of performance we demand from our equipment, implementing cybersecurity systems becomes really complicated, and not only because of antivirus software. From manufacturers – which, as it had never been a requirement from their clients, had until then made no development effort targeting security – to the freelance operators who handle the systems in each production, awareness across the board is required to turn IT protection into an everyday, indispensable element.

There are countless alternatives for media management software – whether editing systems, PAM or MAM – or hardware configuration systems for use on a more limited scope, and hardly any time has been devoted to making sure they are secure. But manufacturers are not the only ones to blame for this oversight: clients, broadcasters and producers have never demanded secure systems. In some instances the network protocols used by the relevant software are not even documented, which makes it impossible to detect whether an existing network communication is legitimate or not.

Let's look at a practical example well known to many: an antivirus running


on a video editing station. One of the main functions of antivirus software is the permanent monitoring and scanning of files being used on a computer, which slows down operations in some applications. The reaction of any editor who detects an antivirus will be

to deactivate it straight away, no questions asked.

This example, not far-fetched at all, should give us pause for reflection. We could have the best firewalls and antivirus software available but, if there is no clear culture and strict, well-defined policies in place, and a shared responsibility in preventing risks, it will be to no avail. In many mobile units, any operator is free to bring a removable drive from home and plug it into the systems to download their homemade setup with no prior scanning. Had the pendrive been used for any other purpose before? Had it been exposed to any viruses in the past? What if it infects the whole mobile unit or the entire TV Compound, as all trucks are connected? What would happen to the event? If we think that way, it turns out that having an antivirus in place is not such a nuisance after all, and forcing operators to scan their devices in a standalone machine to make sure everything is clean seems a reasonable way to prevent risks, right?

It is starting to become a common practice that broadcasters running and producing remote operations require suppliers to observe strict cybersecurity conditions. Nothing really complicated is needed, as guidelines in this regard already exist and


complying with them is absolutely a routine matter in purely IT or business environments. Therefore, adapting them to our industry is not a complex issue.

It is obvious that scanning terabytes of content on the same day operations are to take place is not really feasible. Therefore, the creator of the relevant content must take responsibility – even from a legal perspective – and make sure that the storage media, either a hard disk or a pendrive, are free from viruses.

And all this becomes much more complex when dealing with Cloud environments. The thing is that, in a standalone mobile unit or in a unit connected to the TV Compound, our network ends right there, in the stadium. In remote operations, the network is monitored and closed when connecting the stadium with the production centre, nothing else. But in the Cloud we are actually connected to the internet. This would be like leaving the door of the OB van open and the TV Compound unfenced, with no one to enforce security during the 100-metre dash final: quite risky indeed.

As an internet connection is used to access the Cloud, the computers being used for production can easily end up being used for checking emails, surfing the Web or even for personal matters if they are not duly protected. In many projects in which work is carried out on a Cloud-based system, the IT department itself will forbid suppliers to remove must-have protection layers due to the obvious risk, even if this might jeopardize the project's feasibility. On top of this, engineering itself sometimes favours removing these restrictions in order to let the project move forward – something that is wrong no matter how you look at it.
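The "scan on a standalone machine" policy described earlier can be complemented by an even simpler, technology-light gate: verifying the contents of a removable drive against a known-good manifest before it ever touches a truck. The sketch below is illustrative only – the manifest format is invented here and a hash check is no substitute for a real antivirus scan – but it shows how little code such a gate requires:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large media files need not fit in RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_drive(mount_point: str, manifest: dict) -> list:
    """Return every file on the drive that is missing from, or does not match,
    a known-good manifest mapping relative paths to SHA-256 hex digests."""
    root = Path(mount_point)
    suspects = []
    for p in sorted(root.rglob("*")):
        if p.is_file():
            rel = p.relative_to(root).as_posix()
            if manifest.get(rel) != sha256_of(p):
                suspects.append(rel)
    return suspects

# Anything verify_drive() returns - an unexpected executable, a tampered
# settings file - is quarantined before the drive goes anywhere near production.
```

The point is less the code than the procedure around it: the manifest is produced by whoever prepared the drive, and the check runs on an isolated machine, exactly the shared-responsibility model the column argues for.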


If we in the industry are moving towards implementing an increasing number of network-based technologies such as NMOS, PTP and similar ones, which have the potential of being managed remotely or even from the Cloud, thus becoming connected to the internet, we must change our mindset. Implementing the proper protection technologies is no longer enough. We must be aware that a fence around the TV Compound is as necessary as a firewall in a network or an antivirus in the editing room. Let's switch from accepting, and even promoting, simple-to-use systems to demanding that they comply with appropriate security guidelines.

It was only a few months ago that I stopped seeing PCs with Windows 7 – an operating system over 10 years old – running in broadcast stations... And unfortunately, I am sure that there are still some computers running Windows XP, an operating system that has now come of age. Crazy...

Let's imagine for a second that our channel's flagship programme could not be broadcast because of a computer virus. In fact, we do not need to imagine it: it has already happened. How much money was lost then? And, even more, what was the reputational impact for the channel, producers and associated advertisers? Seen this way, waiting for a hard disk to be scanned, not letting anyone connect anything to our vision mixer or using two layers of firewalls in our network are investments with an outright return if they can prevent a programme from not being broadcast. Moreover, some cybersecurity measures have no relationship whatsoever with technology, such as locking our phone and computer or leaving our desk free from papers at the end of the day.


ISE 2020

A bitter farewell to Amsterdam

What has happened this year to the ISE (Integrated Systems Europe) trade show? Consternation has been widespread in the audiovisual sector. The corridors that in the 2019 edition were crammed with souls closing deals and discovering opportunities were not as crowded, not by a long chalk. Some booths in the smaller showrooms were simply empty, waiting for a visitor to arrive. As for figures, the slump in visitors is devastating: the fair lost more than 35% of its visitors compared with last year (81,268 in 2019 versus 52,128 this year).

It is inevitable to point to the Coronavirus as a major driver behind the event's failure to meet expectations, in view of the sustained growth it had experienced since its inception. The public alarm raised by COVID-19 became the main topic of discussion amongst visitors. Authorities, including the WHO itself, did not see a reason to cancel the event, a guideline observed by AVIXA and CEDIA, the organizations in charge of the show. LG was the first manufacturer to cancel, and 50 Chinese companies and 20 from other countries followed

suit. Fortunately enough, the number of exhibiting companies reached 1,138, up 17 from 2019.

A call to stay calm and a wide array of health-safety measures, including the distribution of hand sanitizers throughout the site, could not avoid a large decrease in visitors. Yet whoever did attend made the most of the occasion. In spite of the general fall in visitors, meetings were frequent; handshakes and hugs were seen more often than face masks. ISE mobilized the AV universe and served as a business hub, as has been the case during the past few years. Therefore, the event cannot be termed a failure.

However, the sentiment should have been a somewhat more positive one. We were all hoping for it. ISE 2020 had to be a triumphant farewell and set new record figures for both visitors and exhibitors, which would have been unquestionably good news for the industry, even more so if we consider the leap of faith that organizers will take in relocating to Barcelona. But such a big celebration did not quite




come – and neither did many of the flights that should have landed in Amsterdam in the previous days, which storm Ciara had taken care to “cancel”.

A show without big novelties

The audiovisual industry, judging by what we witnessed down the RAI hallways, finds itself in a time of technological consolidation. Apart from a few rare exceptions, ISE did not operate as a big event for showcasing novelties. Quite the opposite: it allowed us to see how notions introduced in previous years are now understood as basic starting points to promote an inevitable digitalization. As for displays, 4K resolution is now an established standard, as are modular LEDs. In parallel, alternatives for the future such as 8K options keep gaining ground. In the world of projectors, life goes on: power is making it into a wide range of equipment through 4K resolutions, and devices multiply their lumen capacities as they

become more and more compact in size. Laser, of course, is already the present. As for audio, the trend is largely influenced by full diversification in order to cover each and every AV area. Simultaneously, aesthetics stay on top so as to respond to the concerns of the HORECA channel; and compatibility with IP solutions is becoming increasingly standard. Something similar happens with control systems: companies favour offering complete alternatives targeting cloud environments so they can be remotely operated. At the end of the day, everything revolves around two notions: simplicity and effectiveness over showiness. Technology is a must-have supporting element, but creativity is the true input adding real value to integration. But then, what does the “WOW effect” come down to? Quite a few companies offered demonstrations of 8K, mobile panel structures or robotisation (and digitalisation) of

processes. However, we had already had the chance of discovering most of these capabilities. At any rate, this does not mean that the industry is not showing any progress. It is active and has multiple stimuli to continue exploring in the future. Even so, we cannot help feeling that we are in 2020 undergoing a transition period.


In spite of this widespread impression, we were able to see some signs of what tomorrow will bring: the VR area will undoubtedly grow, eSports will be back with surprising formats, and the companies in small showroom 15, most of them with numerous original alternatives, will gain in importance.

Where is AV heading?

Consolidation of trends allowed us to glimpse a uniform picture of the main areas comprising the show. In education, for instance, the transition to interactive boards is now complete. Most manufacturers that were engaged either in projection or directly in displays have chosen to embrace these systems, which are increasingly required in classrooms and compatible with all kinds of third-party services. An interactivity solution bound to stay with us. As for corporations, we can see that, in addition to unified control solutions, we are heading for mass collaboration between organizations. Intercommunication systems, represented by large-format screens, integrated communication proposals, smart audio capture for meeting rooms, or IP models engage in a perfect dialogue with market-leading communication providers. All solutions are designed to facilitate integration into the company's day-to-day operations by embracing the new collaborative work models. In retail, digital signage takes a leading role. Aside from a few proposals for translucent LED screens with multi-touch capabilities, the offering becomes simpler in regard to resolution and pixel pitch. Manufacturers compete to deliver robust image quality and a simple installation procedure. We hardly ever see those holographic systems that were so popular some years ago, or interactive options such as movement-responsive floors: presenting the product in a clear-cut fashion is what matters now. At this point, of course, the numerous IT resources available come into play, either in the form of control software or servers feeding the graphics supported by these media.

Many other sectors were in one way or another present at the event. As for museums, projections are still on top, with the addition of systems featuring optics that are either ultra-short or easy to adapt to confined spaces; in auditoriums, infrastructures progress towards solutions that are compatible with IP technology, searching for versatile consoles; in large formats, everything is about big projections; and in sectors such as mass infrastructures, all revolves around LED panels, whose price falls as quality improves.

In the restaurant sector, beyond ‘invisible’ audio systems and proposals for projecting sound so as to adapt to the various areas within an establishment, we could see how screens are there not only for showing menus in an attractive way, but also to help clients finalize their orders. As for hotels, on the other hand, the goal is (again) providing guests with all comforts. This translates into choosing TV sets that allow direct access to their favourite OTT platforms. By the way: no sign of the virtual assistants that seemed bound to integrate easily in all parts of the industry.

ISE 2021: 2-5 February at Fira Barcelona

Fira Barcelona will foreseeably be the venue for ISE until 2023. According to official sources, the growth of the event was unsustainable at the RAI. For this reason, the right move was to relocate to a venue offering more showroom space: an extra 20%, to be precise. The frequent renovation works being undertaken at the Amsterdam facilities, which could take place again over the coming years in view of the venue's age, might have been a cause for this move too. References to this new era in the Catalan capital, beyond the (non-digital) signage found at the exit of the Dutch site, were limited to two small events and an area called “¡Hola Barcelona! Lounge”, comprising two audio systems playing flamenco music and two small booths offering information on lodging options. In 2021, ISE will get back to relative normality. It will have to face many challenges, including full relocation to a completely new venue. Furthermore, it will have to regain the trust of exhibitors who have experienced a sadly watered-down edition. Even with these challenges to overcome, we are sure that the scene of technology trade shows will move on. The decade just beginning will be marked by digital transformation, and we are confident that events of this kind will continue playing a key role in driving forward a world as wide as it is exciting.

Novelties for consolidating world trends

As we said before, ISE 2020 was not an event filled with big novelties: no breakthrough technologies, the kind that mark a turning point, were to be seen. However, an ample number of brands certainly showcased some interesting novelties that deserve to be taken into consideration. One of the event’s most spectacular booths was Panasonic’s. This technology company stressed the ‘Freedom to Create’ concept in order to show how versatility is at the core of its proposal. In this regard, worth highlighting is the Solid Shine range of projectors, which offers over 30,000 lumens and is a strong alternative for the rental market, which favours a combination of compact size and power; the PT-RZ790 projector series for museums and exhibition spaces; the PT-FRZ60 family, more projectors for environments that have not yet made the move to digital boards; the SQ1H series, new displays for digital signage that are ready for integration of different solutions; and a new solution for unified wireless presentations. Worth mentioning also is the eSports stadium deployed for the occasion, in which the brand was able to show its reliability in projection, mapping and production solutions.

Crestron was another major player in the trade show. In every edition this firm manages to draw the industry’s attention thanks to the wide versatility of the solutions offered, with dozens and dozens of products exhibited at the disposal of visitors. The brand showcased multiple solutions that could very well deserve a detailed article. Amongst the most coveted products, however, was the DM NVX® AV-over-IP platform, which came with new capabilities, most especially compatibility with the AES67 standard; new solutions in Intelligent Workplace, amongst which worth noting is the latest enhancement in the Crestron XiO Cloud™ platform; and a cooperation with Logitech to combine the Crestron Flex C Series with Logitech MeetUp, Logitech Rally or Logitech Rally Plus.

Matrox, a firm well known for its approach to the broadcast world, presented several novelties that deserve close follow-up. Worth noting are its D-Series video wall graphics cards, capable of managing 4Kp60 outputs and of integration with the QuadHead2Go solution. The firm also showed the power of its Extio 3 IP KVM extender, and introduced interesting collaborations with brands such as BrightSign, NEC or Panasonic.

We also enjoyed our visit to the disguise booth. The brand demonstrated once again the adaptability of its systems, not only with regard to shows or video mappings, well known for their power, but also in the retail sector, where promising solutions were anticipated. After all, its servers' only limit is user imagination. In a more specific area, worth highlighting were the new additions to its vx 4 and gx

2c solutions, as well as their installation in cooperation with Novaline and Carbon Black.

Sennheiser once again focused its efforts on stressing the benefits offered by its TeamConnect Ceiling 2, a solution that keeps surprising us with its versatility each time we witness a demonstration. Additionally, the brand focused not only on innovative experiences, but also on their adaptability to a highly demanding communications and audio systems environment: universities and campuses. In this regard we had the opportunity to take a detailed look at its MobileConnect assistive streaming system, and at its SpeechLine Multi-Channel Receiver, targeting the SpeechLine Digital Wireless microphone system.

Bose Professional, on the other hand, carried out a demonstration of its power by introducing a new corporate area, thus completing a determined move towards diversification. As the person in charge of the Bose Work product line remarked, this range will see a strong boost in the coming years. Amongst the proposals of the Bose Work family, worth

mentioning are the 700 UC headphones with noise cancellation and the Bose VB1 Videobar, which features a 4K UHD camera.

Other important firms visited ISE 2020 and showed their mettle, although they did not showcase any major novelties. Canon, for instance, offered a small demonstration area for its XEED 4K6021Z projectors; Sony introduced its ceiling-mounted microphone MAS-A100 for academic environments, 20 new options for work environments with its TEOS 2.2, and the 13,000-lumen VPL-FHZ131L installation projector; and Epson put to work, by means of a powerful deployment, its EB-L30000U, a solution that had already been preannounced at ISE 2019. Additionally, this company showcased its Ultra Short Throw lens (ELPLX03) for 30,000-lumen and 25,000-lumen projectors, as well as small projectors for digital signage such as the EB-W70 or EB-805F.


In the projection scene, worth mentioning is the powerful effort by Digital Projection to showcase a groundbreaking proposal. The company showed in operation, after several closed-door previews, its scalable Satellite Modular Laser System (MLS), which moves the light source to a remote location, thus enabling a marked reduction in the size of the projection head. Simultaneously, the company showcased other solutions, such as its INSIGHT Laser 8K projector.

Each brand adopted a different strategy at ISE 2020, although we could see two definite approaches. The first was showcasing small novelties to consolidate technologies already known. The other was, in order to bolster the good job performed over the previous months, highlighting progress made and celebrating success cases.

Riedel, for instance, falls into the second category. This brand keeps strengthening the widespread implementation of its Bolero wireless communication system, extended both in production and in the integration of environments such as theatres or auditoriums, and of its 1200 Series SmartPanel. Additionally, the company premiered a promising implementation of the Embrionix portfolio.


Leyard opted for showiness by means of appealing installations that demonstrated the company’s power in the implementation of microLED technology systems. Amongst its novelties, worth noting is the 8K Micro LED screen featuring a 0.6mm fine pitch, a device that attracted the attention of every visitor passing by. But this was only one of many novelties, amongst which we can mention the DirectLight® X LED Video Wall System and the TVF Series LED video wall line.

Absen, on the other hand, continued to insist on its commitment to MiniLED products by showing the power of its main product ranges: Acclaim (A27 Pro), Aries (AX), Control Room (CR) and the new HC Series for mission-critical environments. And of course, novelties came in the shape of the Venus LED series for concerts and festivals, and the AW Series targeting the Digital Out-Of-Home (DOOH) advertising market.

Datavideo showcased a large number of novelties to commemorate its 35th anniversary. Amongst them, we were able to discover the efficient operation of its TP-800 teleprompter for conferences; the comprehensive production bundles BDL-1601 and BDL-1602, based on the HS-1600T MobileCast portable studio and PTZ cameras; the TPC-700 controller for production in corporate environments; and the KMU-200, a promising 4K production system.

AJA was also present at the event with its HDR Image Analyzer 12G, a much-anticipated solution that supports material in 8K format, developed in collaboration with Colorfront. Similarly, we saw the latest developments of its Ki Pro GO multi-channel H.264 recorder, as well as well-established proposals such as its HELO H.264 streamer and recorder.

And, of course, we could not fail to mention Black Box, as its solutions are certainly in much demand in the area of visualization and distribution of 4K AV content. Its main strategic novelty was joining the SDVoE Alliance. In the meantime, its technical proposal was based on the continued showcasing of the benefits offered by the iCOMPEL digital signage platform, its MCX Multimedia Management System, and the Radian Flex software-based video wall processing platform, amongst other solutions.


ENCO enCaption: Fully automated subtitling

Although we are by now used to performing tests and laboratory analyses on novel products, HDR processors, new 4K cameras or IP-based systems, mundane issues such as subtitling still give headaches to many of us. This is where ENCO comes to our aid with a powerful, versatile automatic subtitling system. By Jeray Alfageme

Subtitling of live programs has always been a problem for broadcasters around the globe. The cost of doing it manually, and the experience and quality of operators, were crucial for achieving a good standard of accuracy in subtitling. As soon as the first automatic subtitling systems appeared, they were swiftly adopted because of all the problems they claimed to solve, but their inaccuracy and integration problems within the broadcast chain put a stop to their mass implementation. ENCO has always been a reference in these kinds of systems, and its new enCaption 4 includes a really complete feature set and quite high accuracy in operation. Let’s see this in more detail.

All inputs and some more

The automatic subtitling unit we tested, a one-rack-unit system (although a more powerful version comprising three rack units is also available), features a Blackmagic DeckLink video capture card with HDMI and SDI inputs. Both SD and HD signals in multiple resolutions and standards are supported, so the video input signal will not be an issue when it comes to integrating the equipment into our broadcast chain. A welcome surprise here is the inclusion of NewTek's NDI standard as a supported IP video signal format. We were not able to test this feature as we did not have a valid NDI signal, but this implementation promises easy integration with IP environments. What is not so convincing to me is how a traditional subtitling signal such as the one generated by the ENCO enCaption unit fits within an IP-based workflow, as simpler and more user-friendly solutions are available for the task of including subtitles in IP signals, although an automated subtitling solution is always interesting regardless of the standard being used.

In addition to video inputs, either baseband or IP, enCaption is capable of reading video files and creating the relevant resulting subtitle file. This is the so-called offline model, which is very powerful for documentary databases and deep filing systems. Transcription speed in offline mode exceeds 10 times real time, which is really fast.
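To put that figure in perspective, the only assumption taken from the review is the quoted "10 times real time" speed; the sizing below is simple arithmetic on top of it:

```python
# Estimate offline transcription time, assuming the >10x real-time
# figure quoted for enCaption's offline mode (illustrative only).

def offline_transcription_minutes(video_minutes: float, speedup: float = 10.0) -> float:
    """Return the estimated processing time in minutes."""
    return video_minutes / speedup

# A 90-minute documentary should transcribe in roughly 9 minutes.
print(offline_transcription_minutes(90))            # 9.0
# A full day (24 h) of archive footage: about 2.4 hours of processing.
print(offline_transcription_minutes(24 * 60) / 60)  # 2.4
```

At that rate, even deep archive digitization projects become feasible overnight rather than over weeks.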

And now, what should I do with my subtitles?

The software automatically and effortlessly generates subtitles as soon as a video signal is fed in. However, it is true that in specific environments some words may not be found in enCaption’s dictionary. In that case, the system allows for manual insertion of said words into a dictionary, thus optimizing voice recognition and improving subtitle output. It is worth noting that enCaption supports more than 25 languages. This is truly remarkable as, in addition, the system is capable of distinguishing between different voice tones and automatically assigns different colours to different voices. By combining enCaption with enTranslate (ENCO’s software for translating subtitles into other languages), the subtitling capabilities available using only these two systems are impressive.

Once subtitles have been generated in any of the 25 available languages, we have at our disposal several options for further integration within the production chain. If the signal will be broadcast live, we have serial RS-422 connections for linking enCaption with our decoder and baseband video subtitle inserter. This is the most widespread standard, but also the oldest and most limited one. Another option is an IP Telnet connection, which is also widely used in systems of this kind and makes good use of the network infrastructure without compromising reliability. Lastly, we have the MOS protocol for NRCS. This is a standard protocol in the industry for the exchange of information, events and rundowns amongst news systems, and it also handles subtitling information. My recommendation is to use IP Telnet or MOS if possible, as these are the most versatile options and use the existing infrastructure, thus avoiding dedicated wiring for subtitle insertion. If, instead of going forward with our broadcast, we would like to save these subtitles into a file for storage, enCaption has export capabilities for most subtitle file formats. This feature, in combination with the offline operation option (subtitling of video files), offers us a very powerful and fast file subtitling system.
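The exact list of export formats is not detailed here, but to give an idea of what file-based export involves, the sketch below writes timed caption segments to SubRip (.srt), one of the most common subtitle file formats. The caption data is invented sample material, not enCaption output:

```python
# Minimal SubRip (.srt) writer: a sketch of file-based subtitle export.
# The segments below are invented sample data for illustration.

def srt_timestamp(seconds: float) -> str:
    """Format seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def to_srt(segments) -> str:
    """segments: iterable of (start_seconds, end_seconds, text) tuples."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, start=1):
        blocks.append(f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n")
    return "\n".join(blocks)

captions = [
    (0.0, 2.5, "Good evening and welcome."),
    (2.5, 5.0, "Tonight: automatic subtitling."),
]
print(to_srt(captions))
```

A file written this way can then be archived alongside the source video or reloaded by most editing and playout systems.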

Streamlining the process

In order to get enCaption to work properly, a number of functions are provided in the system that we should explore in order to improve the subtitles produced, regardless of the output format we decide to use.

Word library

Although enCaption has a huge vocabulary database for each of the 25 languages supported, some words may not be included by default. This is why there is the possibility of creating our own word library, or dictionary, for any words we may spot that the system is not able to recognize automatically. I find this option particularly helpful when it comes to including words from another language in a specific language library.
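How enCaption stores and applies its user dictionary internally is not documented in this review; the general idea, though, can be sketched as a correction pass over recognized text, with invented example entries (foreign proper nouns the recogniser tends to mishear):

```python
import re

# Sketch of a user word library applied as a post-recognition pass.
# The entries are invented examples; enCaption's internal mechanism
# and format may well differ.
WORD_LIBRARY = {
    "end caption": "enCaption",
    "i s e": "ISE",
}

def apply_word_library(text: str, library=WORD_LIBRARY) -> str:
    """Replace each misheard phrase with its corrected spelling."""
    for heard, correct in library.items():
        text = re.sub(re.escape(heard), correct, text, flags=re.IGNORECASE)
    return text

print(apply_word_library("Welcome to i s e, powered by end caption."))
# Welcome to ISE, powered by enCaption.
```

The same pattern extends naturally to brand names, place names or loanwords absent from the default vocabulary.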

Filters

As opposed to a word library or dictionary, there is also the possibility of preventing the system from recognizing certain words by using prior filters. In this instance, usage is somewhat more complex, although it could be useful for names of people or places that could be confused with words from other languages, for example.
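A filter works the other way round from a word library: instead of adding vocabulary, it keeps listed words out of the output. The sketch below shows the idea with invented entries; whether enCaption acts on the recogniser's vocabulary or on its output is not specified in the review:

```python
# Sketch of a filter list: words the engine should never emit.
# The entries are invented examples; ENCO's actual mechanism may
# differ (e.g. acting on the recognition vocabulary itself).
FILTERED = {"umm", "uh"}

def filter_words(words):
    """Drop any word present in the filter list (case-insensitive)."""
    return [w for w in words if w.lower() not in FILTERED]

print(" ".join(filter_words(["Umm", "the", "show", "uh", "starts", "now"])))
# the show starts now
```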

GPI

As is the case with other traditional broadcast systems, enCaption comes with GPIs as well. This feature is provided through a USB adaptor that converts a serial USB port into physical GPI pins. Basic operation functions such as start, stop, mute and the like are the only ones available. Not very advanced, but useful nonetheless.
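The actual pin assignments of the USB adaptor are not documented here, so the sketch below uses an invented command-to-pin table and a stand-in transport object; with real hardware, a serial port's control lines (for instance via pySerial) would take the place of the fake transport:

```python
# Sketch of driving basic GPI functions (start/stop/mute) through a
# serial adaptor. The pin mapping and FakeTransport are invented for
# illustration; real hardware would sit behind the same interface.

GPI_PINS = {"start": 1, "stop": 2, "mute": 3}  # assumed mapping

class FakeTransport:
    """Records pin pulses instead of touching real hardware."""
    def __init__(self):
        self.pulses = []

    def pulse(self, pin: int):
        self.pulses.append(pin)

def trigger(transport, command: str):
    """Pulse the GPI pin associated with a basic transport command."""
    try:
        transport.pulse(GPI_PINS[command])
    except KeyError:
        raise ValueError(f"unsupported GPI command: {command}")

port = FakeTransport()
trigger(port, "start")
trigger(port, "mute")
print(port.pulses)  # [1, 3]
```

Keeping the transport behind a small interface like this also makes the automation logic testable without the adaptor plugged in.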

Conclusions

I have always been reluctant to use automated systems for critical operations such as this one. Having said that, enCaption has pleasantly surprised me. Both the voice recognition achieved and the features implemented comfortably deliver what the system promises and, in some instances, with a positively surprising result. Subtitles may seem a somewhat outdated requirement but, both due to broadcast regulations and for accessibility and functionality, they are something that must be offered in all our broadcasts, whether traditional or digital, OTT or VOD. If enCaption were capable of inserting subtitles into a baseband video output, it would be the perfect equipment for inclusion as a subtitling processor in our broadcast chain, thus saving a lot of time and cost in multilanguage subtitling, which is always expensive and time-consuming. Its file subtitling capability is really powerful and can be a highly valuable tool for deep file transcriptions in a very efficient manner.