TM Broadcast International #88, December 2020


EDITORIAL

2020 is coming to an end. Some voices in the broadcast industry will boisterously celebrate the passage to a new decade, eager to bury deep down the months marked by uncertainty. Others will take stock (perhaps more sensibly) and remember these times for what they were: the biggest booster the broadcast industry had seen in a long, long time. Concepts such as remote contribution, virtualized production and cloud-based collaborative work were latent. The current circumstances, however, have been the spark that set off their regular implementation. Whoever has tried these new ways of working will stick with them.

What is awaiting us in 2021? We can envisage two industry trends, already noticeable in recent years, that will spread in the near future. First, companies will continue striving for partnerships with other industry players, either in an attempt to expand their markets, strengthen certain business areas or simply to better cope with the pressures borne in recent times. This extremely competitive industry (like many others) requires mergers and acquisitions to further progress, so the playing field is dominated by large manufacturers that provide all kinds of solutions. But how can one compete against companies with such plentiful resources? A clear alternative is to match the strength of these massive companies by merging with others, but this is not the only way out: small and medium-sized enterprises can also stand out thanks to innovative solutions, flexible business models or extremely dedicated customer support.

Secondly, the subscription-based digital model will keep spreading to more and more solutions on the market. Both the quest for adaptability and the constant evolution of technology seem to lead to a business model in which choosing a solution that will last five or ten years makes no sense, one that instead encourages going for the solution that is most convenient at a given time. Almost an industry standard in certain services already, this will gradually reach more and more areas.

In the past few months, the industry has faced challenges unprecedented in our recent history. But far from choosing a conservative approach to tackle them, progress was the way. We can be proud of having ensured both information and entertainment for society in a competent, professional and determined fashion by relying on the most recent innovations available on the market. We will continue down this path in 2021. Thank you for walking along with us.

Editor in chief Javier de Martín

Creative Direction Mercedes González

Key account manager Susana Sampedro

Administration Laura de Diego

Managing Editor Sergio Julián


TM Broadcast International is a magazine published by Daró Media Group SL Centro Empresarial Tartessos Calle Pollensa 2, oficina 14 28290 Las Rozas (Madrid), Spain Phone +34 91 640 46 43 Published in Spain ISSN: 2659-5966




22 ORF:

A television ready for the future

TM Broadcast International contacted Michael Götzhaber, technical director at ORF, to delve into the technological strategies embraced by this media corporation.

34 Jakob Ihre,

FSF (Chernobyl): Light, atmospheres and talent

A few years ago this Director of Photography had not yet become professionally interested in television. However, after shooting acclaimed productions such as “Thelma” (2017), “Louder than Bombs” (2015) or “The End of the Tour” (2015), his next project was taking charge of the cinematography in what eventually became the breakthrough fiction of 2019: “Chernobyl” (HBO), created by Craig Mazin and directed by Johan Renck.



40 Wireless camera systems This category comprises very different technologies, workflows and purposes that must be properly distinguished in order to choose the most suitable option for each scenario and circumstance. Our interest will focus on the technology and the main alternatives available.

52 The Good Film Company & Remote Filming We got to know more about the proposals brought by both firms from Yanina Barry, founder of The Good Film Company and co-founder (together with Alex Seery) of Remote Filming.


First broadcast of 8K UHD signal in DVB-T2


Long-term industry changes demanded by 2020, by Vizrt 5


Magewell’s Pro Convert for NDI to SDI and Pro Convert H.26x to SDI now available Magewell has begun shipping the newest decoder models in its Pro Convert family of low-latency video-over-IP converters. Available immediately, the Pro Convert for NDI® to SDI and Pro Convert H.26x to SDI transform live IP media streams into SDI outputs for connection to monitors, projectors and legacy production equipment. Magewell’s Pro Convert products bridge

traditional video signals with IP-based production, distribution and streaming workflows. Pro Convert decoders are suited for applications including remote production, multi-site video distribution, video walls, digital signage, image magnification (IMAG), surveillance monitoring and bridging legacy and next-generation media infrastructures. The Pro Convert H.26x

to SDI decodes live H.264 (AVC) or H.265 (HEVC) compressed video streams up to 1080p60 for output via a 3G-SDI interface. The low-latency device supports a wide range of input streaming protocols, including SRT (Secure Reliable Transport) – developed and open-sourced by Haivision – as well as RTSP, RTMP, UDP, RTP, and HLS. The Pro Convert for NDI to SDI offers all of the same functionality plus support for NewTek’s


NDI® technology for production-grade media transport, including both full-bandwidth NDI and the high-efficiency, lower-bitrate NDI®|HX mode. The plug-and-play decoders feature DHCP-based network configuration and high-quality, FPGA-based video processing. The compact devices measure 3.97 by 2.4 inches (100.9 by 60.2mm) with a height of 0.92 inches (23.3mm), and

can be powered via an external adapter or Power over Ethernet (PoE). Users can specify source streams and control the decoder’s advanced settings through a browser-based interface, a connected keyboard and mouse, or on-device buttons that overlay an intuitive menu on the output. With the release of the new models, Magewell now has three Pro

Convert decoders available with SDI output. The third, the Pro Convert for NDI® to AIO, offers NDI, H.264, and H.265 decoding with simultaneous HDMI and 3G-SDI outputs. The decoders can be paired with Magewell’s Ultra Stream live streaming encoders, Pro Convert NDI encoders or third-party encoding products.


Aviwest to release 5G field transmitter and encoder series in December Aviwest, a global provider of video contribution systems, has announced its new PRO3-5G and AIR-5G Series solutions, which are to be released in December. Using the 5G versions of Aviwest’s bonded cellular transmitters, video professionals will be able to deliver live news, multicamera sports and events coverage “with greater efficiency and higher quality”, according to the press release. “5G networks are expected to bring many improvements to the current bonded cellular transmission scenario,” said Ronan Poullaouec, chief technology officer at Aviwest. “Gigabit speeds and reduced transmission latencies are the initial advantages broadcasters can realize from 5G. Beyond the new capacities and services that 5G networks enable, our Aviwest field units will help to maintain a consistent quality of service and make the best

AVIWEST will release its new PRO3-5G and AIR-5G Series solutions in December.

use of 5G network performance.”

Aviwest’s flagship camera-mounted and backpack PRO3-5G unit features six embedded 3G/4G/5G modems that are compliant worldwide, along


with a high-efficiency, patented custom antenna array. The AIR-5G also features a long-life internal rechargeable battery and a large set of audio and video interfaces. Both mobile transmitter series leverage Aviwest’s SST (Safe Streams Transport) protocol, which aggregates multiple IP network

connections and dynamically adapts video bit rates according to network bandwidth fluctuations. Aviwest has been involved in multiple field tests of 4K/UHD and full HD/HEVC video delivery over 5G since 2017. Teaming up with major international broadcasters and telcos enables Aviwest to test

the real-world impact and performance enhancements of 5G on live broadcast video production using the PRO3-5G, AIR-5G and HE Series. The company has recently demonstrated this at a major tennis championship at Roland Garros in France and as part of the IBC Accelerator Program in Amsterdam.


LiveU made possible first 5G smartphone sports broadcast produced by Sky Deutschland LiveU has powered the first 5G multi-cam sports broadcast produced by Sky Deutschland with its LU-Smart smartphone kits. The live production took place at the Flens-Arena in Flensburg, Germany, where Telefónica installed a dedicated 5G network to live stream a first-league handball match. Fans at home watched the entire game in high quality with close-to-zero delay on Sky’s online platform. Supported by netorium, LiveU’s longstanding partner in Germany, Sky was able to capture the live action on the handball court and simultaneously transmit the live feed. The German TV broadcaster deployed several smartphones equipped with LiveU’s proprietary software for real-time encoding and upstreaming to cover multiple camera angles. Leveraging Telefónica’s high-performing 5G network on the production site, the


video data was transmitted “with maximum speed and bitrates”. Alessandro Reitano, SVP Sports Production at Sky Deutschland, said: “Reinforcing our reputation as an early technology adopter, Sky successfully completed the first end-to-end 5G transmission in Europe. With LiveU’s latest IP transmission technology and the great support of

netorium, we were able to produce a solid 5G stream of the handball game and to let our online audience become part of a dynamic and highly engaging viewing experience.” The Sky video stream benefited from LiveU Reliable Transport (LRT™) technology. The live feeds were received via physical LiveU servers located in Sky’s mobile control room outside the arena. With


each server providing four SDI output channels, the camera feeds could be fed into a hardware mixer and cut on the production site. LiveU’s IFB feature (Audio Connect) served as the audio backbone for smooth, two-way communication between the control room and the field.

Zion Eilam, VP Sales for EMEA at LiveU, added: “This joint venture demonstrates the disruptive power that emerges when four technology leaders join forces – with LiveU and netorium bringing the IP video transmission expertise, Sky Deutschland contributing outstanding live production skills and Telefónica providing the O2 cellular network of tomorrow.”

Peter Frantz, CEO at netorium, commented: “With this path-breaking project, we accomplished

our mission to provide a best practice reference case for 5G streaming. Thanks to LiveU’s superior technology, the days when you needed an extensive and expensive tech setup to deliver a broadcast-quality stream are definitely a thing of the past.” The live production in Flensburg was implemented in compliance with the current COVID-19 safety measures.


Extreme E teams up with Red Bee Media

Extreme E, the new electric off-road racing series, has teamed up with Red Bee Media, which will support its broadcast package worldwide. Red Bee Media will provide satellite distribution services to major broadcasters around the globe, picking up live signals from Extreme E’s production gallery and transmitting these through resilient global satellite networks via its distribution hub in Hilversum in the Netherlands. In addition, Red Bee Media will also provide a digital distribution hub/network which will be used to transcode and live stream Extreme E’s race content on all of its digital platforms. Finally, Extreme E will utilise Red Bee’s OTT platform for global live streaming on the championship’s website and other digital assets.

Ali Russell, Chief Marketing Officer at Extreme E, said: “This support from Red Bee Media adds up to Extreme E being able to reach a truly global audience. In line with the championship’s environmental and sustainability missions, there will be no spectators on site, so it is imperative the broadcast product is right, and shared with as many screens as possible. I’m delighted to be working with Red Bee Media, a company that has a track-record in delivering content far and wide.”

Extreme E is a sport for purpose, and one of those purposes is the environment. In Season 1, the series will visit five remote locations in Saudi Arabia, Senegal, Greenland, Brazil and Patagonia, all of which have been negatively impacted by the climate crisis. By visiting these locations, the championship looks to highlight these issues and encourage change. In a bid to reduce carbon emissions, numbers on site will be kept to a minimum and this includes spectators, so the only way the championship can tell its story is through the screen, making its broadcast product particularly important.


Sennheiser provides all microphone needs for Latin Grammy Brazilian ceremony In 2020, the Latin Recording Academy hosted its first-ever Premiere ceremony for Brazilian audiences. Broadcast before the 21st Latin GRAMMY awards on November 19, the Premiere ceremonies were openings to the “Biggest Night in Latin Music”, combining exciting music programmes with the announcement of a large majority of the winners. For its Brazilian Premiere ceremony, live-streamed via Facebook, the Academy partnered with Sennheiser to provide all microphone needs for the music performances. “Sennheiser was a key partner for us to deliver at the same level as the Grammy Awards in its other editions,” said Dilson Laguna, artistic director of the Brazilian Premiere for the Latin GRAMMY 2020. The Brazilian edition of the Premiere ceremony, where all winners in the Portuguese-language categories were honored, was hosted by model Lais

Ribeiro, who presented live from Los Angeles. The event featured two compelling music acts: a duet from Emicida (who won in the Best Portuguese Language Rock or Alternative Album category and was nominated for Best Song in Portuguese) and singer and composer Marcos Valle, plus a performance by the band Melim who sang “Eu Feat. Você” from their eponymously named album, which was nominated for Best Portuguese Language Contemporary Pop Album. Sennheiser provided equipment for both performances. Melim performed in a rooftop setting. The band and two guitarists, who also provided the backing

vocals, all used evolution wireless IEM G4 monitors. While Melim’s vocalists sang with wireless SKM 9000 handhelds with MD 9235 heads linked to an EM 9046 receiver, two wired E 945 were used for the backing vocals. In addition, two MKE 600 shotgun microphones captured the ambience of the rooftop location. For their duet, Latin Grammy winner Emicida and Marcos Valle chose a studio. Both artists used ew IEM G4 monitors and sang with SKM 9000 handhelds topped with super-cardioid MD 9245 capsules. Their mics were linked up to two EK 6042 dual-channel camera receivers.


BLAST TV chooses EVS solution for its new esports production flypack BLAST TV, a global esports media network, has selected EVS’ VIA and Media Infrastructure solutions as the backbone of its new esports production flypack. The mobile solution is designed so that BLAST TV can produce its own live gaming events in-house. The first of these events was the BLAST Premier Fall Series, which ran from October 26th to November 4th and featured the world’s elite-level Counter-Strike teams. Live coverage was distributed via various OTT and linear TV channels including Twitch and YouTube, as well as broadcasters such as TV 2 Denmark and TV 2 Norway. The next stage, the BLAST Premier Fall Showdown, was held last week, and the BLAST Premier Final competition takes place in December. Powering coverage of all BLAST TV Premier


Tournaments’ live action is EVS’ XT-VIA and XS-VIA production servers. The XT-VIA servers will help BLAST TV record and produce live replays and highlights from the players and observers, while the XS-VIA servers provide ingest and delayed playout of the TX feeds for betting requirements. The XT-VIA solution includes 36 Full HD 1080p channels recording all the player feeds and in-game action. EVS’ IP media sharing network is at the heart of the system. Overseeing this infrastructure is Cerebrum, EVS Media Infrastructure’s broadcast control and monitoring system. Deployed by BLAST TV’s crew to configure and manage all SDI workflows within its production environment, the EVS system manages all devices in the flypack, including the router, multiviewer, vision mixer and audio desk. Cerebrum also handles UMD and tally

for the entire system, including the XT-VIAs’ built-in multiviewers. It also provides customizable panels that can be designed to BLAST TV’s specific needs, with production configurations that can be saved and reloaded for various types of events. EVS’ all-new IP-based replay and highlights system, LSM-VIA, will create super-slow-motion in-game replays and support studio analysis. By providing direct access to all the content on


the network, LSM-VIA will streamline BLAST TV’s workflows. Furthermore, the BLAST TV team opted to use EVS’ live production asset management (PAM) solution, featuring applications for live media content browsing, control, editing and playout. The team has also chosen to use EVS’ web browsing tool, which extends its capabilities to access, browse and select content regardless of location, as well as publish multiple in-game clips for its engaged social media fans. Part of the workflow’s speed is also due to certified integration with

Adobe Premiere Pro, allowing BLAST TV editors to access clip elements made available by EVS’ Live PAM suite. The backend resources of the EVS Live PAM solution are hosted on EVS’ virtualization platform. The EVS solution also offers extended file exchange, backup and transcoding, which is convenient when publishing clips and highlights to Twitch and social media. “Our priority is delivering world-leading esports entertainment to our audience, in both pandemic and non-pandemic times, so

partnering with a cutting-edge technology provider like EVS makes perfect sense,” said Andrew Haworth, Director of Operations & Production at BLAST TV. “EVS’ approach to this project has been incredible, understanding our unique challenges and then working through detailed technical workflows to allow us to produce however we want and beyond.” “BLAST TV was faced with a unique combination of challenges. These required it to adapt to new ways of producing large-scale live esports events while still providing the high level of quality and engagement its fans expect,” added Nicolas Bourdon, Chief Marketing Officer at EVS. “Our high-end live production solutions along with our new range of control and monitoring tools raise BLAST’s operations to the next level with greater speed, control and flexibility.”


The Turkuvaz Media Group updates its media center facility with Lawo Turkuvaz Media Group, a major Turkish media company with TV, radio, newspaper and magazine assets, has commissioned a newly-updated media center facility in Istanbul with Lawo as a premier technology supplier. System integration services for this key broadcast installation are supplied by Lawo’s partner in Turkey, Radikal Elektronik Ltd. Sti. The Turkuvaz Media Center is Turkey’s biggest media campus, comprising a Plaza, a TV block and a Printing block, an area of 50,000 square meters that hosts 160,000 square meters of production space. The company operates nine TV channels and four radio channels, with 12 12-meter-tall studios measuring from 200 m² to 1,500 m², and five 6-meter-tall studios between 100 m² and 350 m² in the TV complex. The TV buildings also contain ten

studio control rooms, six producing in 4K and four in HD. Turkuvaz’ IP infrastructure includes several networked Lawo mc² mixing consoles and crystal radio consoles linked with a central routing system, as well as a V__matrix video processing system equipped with Lawo C100 software-defined processing blades. In addition to renovating the broadcast studios, a new OB van equipped with Lawo IP networking was rolled out in November 2020, with the overall project completed in December. The newly-modernized studio control rooms are customized for their individual tasks: news, 4K video post-production, HD sports programming, and a shared HD SCR, integrated by an IP infrastructure built upon Lawo systems. This major

renovation allows for flexible IP workflows, including remote production capabilities for future-proof operation. Lawo equipment installed during the upgrade includes eight new 48-position mc²56 mixing consoles fitted with the Dual-Fader option to provide 80 total faders, giving Turkuvaz operators direct surface access to large numbers of audio channels. Audio connectivity comes by way of DALLIS stageboxes; a central routing matrix with 8,192


x 8,192 mono crosspoints was built using a Nova73 HD router. In addition to an mc²36 console providing “all-in-one” functionality, the audio setup includes six radio studios with Lawo crystal mixing consoles and companion Compact Engines. The media center’s video and audio processing is handled with eight V__pro8 video processing units and Lawo’s V__matrix software-defined IP Core Routing, Processing & Multi-Viewing platform, fitted with two C100 processing blades to perform SDI-to-IP encapsulation and de-encapsulation. The networking fabric of the new Turkuvaz facilities

connects the DALLIS units to the Nova73 central router via MADI, with RAVENNA linking router and console cores. AES67/ST2110 networking between the console cores, router and video devices is also implemented. “Before we even started planning our new facilities, our project team analyzed the market situation to find a coherent, future-proof solution for an IP infrastructure,” states Yavuz Nart, General Technical Manager at Turkuvaz. “With Lawo, we found a company with all of the latest IP broadcast technology under one roof. Lawo IP technology has long been established in the market, and they

are known as a technology leader with high-quality products and extraordinarily high expertise in broadcast. And who better to install this new system than Radikal, who have been a trusted audio partner for many years.” “As an audio engineer and systems designer in broadcast, you must of course have faith in the quality of the products you offer to your customer,” adds Seha Akbaş, Systems Engineer at Radikal. “With Lawo, we are working with a manufacturer that delivers the very highest quality standards, paired with innovative solutions and fast, expert support. We are very happy to be involved in what has turned out to be one of the biggest projects in Radikal’s history, providing Turkuvaz the best broadcast solutions available on the market, with the most modern IP infrastructure found anywhere in Turkey today.”


Broadpeak powers new StarHub TV+ service in Singapore

Broadpeak®, a provider of content delivery network (CDN) and video streaming solutions for content providers and pay-TV operators worldwide, has announced that it is powering StarHub TV+, a new converged IPTV and OTT TV service offered by StarHub, one of Singapore's telcos. Using a complete CDN, content packaging, storage, and origination solution from Broadpeak, StarHub is able to deliver OTT live TV and VOD content on all screens, ensuring a superior quality of experience (QoE) for subscribers, with reduced

OTT playback latency and buffering. "The primary goals of revamping our OTT platform were to deliver better quality, and to offer more services and advanced features such as start-over and instant availability of 24-hour time-shift TV for our customers," said Ong Bee Lian, Vice President, Interactive TV and Media Engineering at StarHub. "Broadpeak also provides us with a flexible and future-proof solution with real-time packaging that delivers substantial storage savings, allowing for a smoother viewing experience."

Broadpeak's CDN solution optimizes bandwidth use, reducing StarHub's overall content delivery costs. The StarHub TV+ service also uses Broadpeak's BkS350 Origin Packager to securely deliver live, time-shifted, and VOD services to any device. Combining an on-demand packaging feature with a built-in cache mechanism, the BkS350 Origin Packager reduces the need for encoding and storage resources while also providing StarHub with a high-throughput capacity to generate additional cost savings.


Ross Video acquires Image Video Ross Video has announced the acquisition of Toronto-based Image Video. Image Video was founded in 1974 and is best known for its TSI tally control platform, used by major broadcast network providers, sports venues, corporate video facilities and houses of worship. This acquisition – Ross Video’s sixteenth since 2009 – will see Image Video’s product team, led by Zach Wilkie and David Russell, along with their R&D and technical support teams, all transition over to Ross. Zach Wilkie, now Product Manager, Ross Tally Systems, is excited at the prospect of life as part of Ross Video: “As a fellow Canadian company that was founded in the same year, we’ve obviously

grown up alongside Ross Video and we’ve seen the impressive growth and expansion of the Ross business, especially during the last decade. We’re very pleased to become part of the family and this acquisition will help us reach new international markets and scale much more effectively.”

David Ross, CEO, reflects on the personal significance of the acquisition. “Back in 1973, my father was in hospital after breaking his leg. Jim Leitch, the founder of Leitch Video (now Imagine Communications), visited and advised my father to start his own company. He said that there was a Leitch Video and an Image Video in Canada, so there should also be a Ross Video. It’s possible that if Image Video didn’t exist, my father may not have had such role models to show the way, and even if Dad did start our company, we may not be named Ross Video today! All these years later, Image is now part of Ross, something that my father agrees is quite remarkable. Image’s TSI tally platform is a great complement to our existing range of solutions and enables Ross to bring even more choice to the live production market.”


Qvest Group acquires OnPrem Solution Partners LLC Headquartered in Los Angeles, with additional offices in New York and Austin, OnPrem is an end-to-end solution provider that strategizes, advises, designs and develops technology solutions. The acquisition brings to the Qvest Group a highly experienced team of more than 250 consultants delivering solutions in areas such as Digital Media Supply Chain, IP and Rights Management, Data and Analytics, Customer Experience Design, Salesforce and Program Management/PMO. OnPrem has built dynamic partnerships with leading technology providers and platforms such as AWS, Salesforce and Microsoft. The Qvest Group, headquartered in Germany, has offices across Europe, the Middle East, APAC, and Australia. The company has been accelerating the digital transformation of its clients, offering capabilities in technology design consultancy and systems integration, software development, cloud engineering, and technical infrastructure operations. As part of the acquisition, the Qvest Group acquired 55% of the shares in OnPrem. The contract was signed on December 2, 2020. The partners at OnPrem – Frank Leal, Candice Lu, Christophe Ponsart, Jon Christian, and Vanessa Fiola – along with leadership, will continue to build on their long-standing, valued relationships with employees and clients.

Densitron announces new reseller partnership with Primecast Technical Solutions Densitron has announced a new reseller in the form of Qatar-based Primecast Technical Solutions. Located in the city of Al Wakrah, the company will serve as reseller for Densitron's innovative products, including the IDS (Intelligent Display System) solution, in Qatar and other key territories in the Middle East. Part of the S'hail Holding Group, Primecast Technical Solutions provides systems integration, design, consultancy and equipment supply services to the broadcast and media industries. In recent years it has undertaken flagship design and integration projects for leading regional broadcasters including Qatar Media Corporation, Al Rayyan Satellite Channel, beIN Media Group and Al Jazeera Media Network - the last-named project involving a close collaboration with Densitron that entailed the extensive specification of the IDS solution.


Bridge Technologies to collaborate with Broadcast Solutions Bridge Technologies has announced a partnership with systems integrator Broadcast Solutions. Founded in Germany 17 years ago, Broadcast Solutions maintains subsidiaries in Europe, Asia and the Middle East, and has planned and implemented high-level broadcast projects around the globe. With the majority of its business activities focused on assisting in the creation of both mobile and studio-based production centres, the integration of the VB440 into Broadcast Solutions’ product offering will bring monitoring and analysis of ST2110 and ST2022-6 high-bitrate broadcast media traffic to production studios, master control centres and outside broadcast vehicles and venues. Furthermore, Bridge’s full product list will also be available to Broadcast Solutions.

“We’re extremely pleased to be working closely with Broadcast Solutions,” said Simen Frostad, Chairman and joint founder of Bridge Technologies. “Not only do they represent experts in the field of designing and implementing future-proof production environments – meaning that we can be confident of their technical expertise in explaining, implementing and coordinating our products within complex set-ups – but they also echo our philosophy of fostering close relationships with customers. The level of support they provide before, during and after implementation is really important to us.” Petteri Pulkkinen, Sales Manager of Broadcast Solutions Finland, said: “Bridge Technologies’ products are a natural addition to our product range offering, representing as they do

cutting-edge, future-proofed, IP-based monitoring solutions. But as a company, they also represent a natural partner – driven by a philosophy of innovation and ongoing improvement, and focused on delivering excellence in terms of customer experience and support. We look forward to a prosperous ongoing relationship.”



A television ready for the future





Four TV channels, 12 radio stations and a good number of web portals make up the audiovisual universe of ORF, the Austrian Broadcasting Corporation. The public corporation, headquartered in Vienna, is Austria’s largest media provider. Nowadays, 65 years after its establishment, it still maintains considerable significance in the country: 49% of Austrians view TV programs every day, a figure that rises to 63.2% of the population for radio. ORF has been getting ready for the future for some years now. It has carried out a great deal of revamping in many key areas in order to adopt innovative workflows and technologies that will enable it to keep offering cutting-edge formats over the next decade. TM Broadcast International contacted Michael Götzhaber, technical director at ORF, to delve into the technological strategies embraced by this media corporation. By Sergio Julián Gómez, Managing Editor at TM Broadcast International

Michael Götzhaber, technical director at ORF

ORF has been on air for more than 50 years. What have been the biggest technological challenges the station has overcome in these years?


ORF center

The challenges are always at the transitions between technologies – from film to magnetic tape, from tape to disc, from disc to files, from files into the cloud.

Change management in processes, culture and people is often more difficult to overcome than engineering problems.

ORF has a decentralized structure. Could you delve into this? What are your main production centres?

Federalism is a very important concept in the Austrian constitution and its governing principles.


As such, we operate a national TV operations center in Vienna and a separate regional studio for the city of Vienna. We also operate a combined Radio/TV/Online regional studio in each of the other 8 provinces, with around 100 employees each – and only 15 of those are technical. In addition, we build and support the broadcast and IT installations for our 25 correspondents all over the world.

Is there “technological coherence” between your production centres? Are they all on the same page?

The coordination and management of technical standards, infrastructure and projects across all the production centers, for which the technical directorate in the national ORF operations center is responsible, is very important. The coherence is very effective. We renewed all the radio and TV production galleries in the regional studios in the exact same way, we use the same video and audio infrastructure and software, and the IT security is aligned, as well as user, licence and contract management.

What’s the role of broadcast technology in ORF? Do you consider trending innovations like HDR or 4K vital to your future? Broadcast technology is the enabler for our success in content distribution and delivery. We have to deliver reliable quality to legacy households and at the same time create new experiences. As such, UHD and HDR are part of our technology and architecture strategy.


Is your current production 100% HD? Are there any areas waiting for an update? And what about UHD? Will it come to ORF in the next few years? We achieved 100% HD production, playout and distribution four years ago, so there are no areas waiting for an update. We have already established a UHD/HDR delivery chain for online streaming, and we will update our production and playout infrastructure along the renewal timelines over the coming years.

What’s your bet on AV capture? Do you have a camera / audio manufacturer? We are aligned for the current camera technology on Ikegami for studio and OB vans, and Sony for ENG. We are aligned on Lawo for Audio production and DHD for Radio continuity. However, all our new projects have a procurement and tender phase before the actual ordering, so we are always open to new manufacturers and concepts.

Let’s walk into the control room of one of your main studios. What’s the technology we could find? Mixer, replay system and so on… Video switchers are typically Sony MVS 8000 and XVS series, audio mixers are Lawo MC2 66 with Lawo Nova 73 matrices, and video servers are EVS 16-channel.

Graphics are becoming more and more important on television. What system are you currently using? Are you implementing any AR/VR/MR experiences? Graphics and lower thirds are handled by Vizrt Trio and Vizrt Pilot installations. We use some AR graphics overlays for effects in TV sports shows, but there is no AR/VR/MR-specific content production.

Moving to your playout system.... What’s your bet?

Our playout systems at the moment are from Grass Valley and from Stryme, with EVS playing an important part in short-form playout for sports and entertainment. Our playout center is in a tender process at the moment, and I can say there are many new interesting solutions, such as “channel-in-a-box” concepts, coming at us.

Is ORF considering a future transition to an IP-based production infrastructure? Have you currently tried any systems based on virtualized production solutions? Yes. We are working intensely on IP production and have had our first full-IP OB van operational for a year already. We will continue to plan our new installations on an IP base along the timeline.


Virtualized production solutions are used for small trials in the field of sports events, as well as for some experiments in the context of Covid measures.

It’s time for an external coverage! Austria has a complex orography. How do you manage remote contribution? Do you rely on OB vans or do you bet on mobile transmission systems? You are right that good 4G coverage is not available at all locations across the country. As such, we use a combination of 4G/LTE multi-SIM solutions such as LiveU and Mobile Viewpoint, together with legacy SNG contribution, IP-SAT solutions and dedicated fiber connections to the most important outside locations. It is all about “bonding” of all the bandwidth which is available at any given location, and our new award-winning multimedia reporting vehicles have all the connection technologies on board to accommodate this concept.

About this, 4G is pretty extended. What about 5G? Have you done any tests on this technology? Are you planning to implement it in your daily workflow? 5G is a buzzword for many different aspects of the broadcasting chain. 5G for ENG contribution will be usable very soon; for general production purposes, such as in-ears or wireless microphones, we are looking at a much longer timeline. 5G for distribution is in our focus within a dedicated EBU Media Action Group, and will depend on the cooperation of mobile network operators, handset manufacturers and regulators. We also have to wait for Release 17 of the 3GPP specifications for the details of what can be implemented.

A national television such as ORF surely needs to manage a large amount of assets. What MAM system are you currently deploying? Does it integrate with your newsroom? Yes, of course. We are very proud that we managed to use the same MAM system across all TV production and genres, from newsroom, sports and entertainment up to the archival stage. We fully deployed Ardome in 2016, which is being upgraded to Viz One at the moment.

What about storage? Do you go for a local solution with LTOs / Optical Disc? Are you using a cloud-based system? Maybe a hybrid one?

Our working storage for editing and production within 30 days is on hard disk raids, of course. Our entire video archive was migrated to full hard disk technology two years ago, with great success, and we are quite unique in the world with this. We do not use a cloud system because it is more expensive than our on-prem system. We run an additional offline tape safety copy offsite as an emergency copy.

OTT and VOD will play a great role in the future (and even the present) of television. How is ORF developing this area?

The ORF has a very strong standing in Austrian online consumption, with the biggest news site “” and the most used OTT platform, “TVthek”. As such, we continue to strengthen the position of those services and will develop a highly integrated video platform called “ORF Player” over the next months.

What’s next for ORF? Are you going to update any of your infrastructures? What’s your technological future? Most of the important points have been mentioned already, such as the upgrades to IP and UHD/HDR, the new playout center, the ORF Player, etc. Key for success will be to integrate all those projects in a common architecture, use cloud, microservices and containers wherever possible, align the formats between all production areas and, at the same time, guarantee IT security, quality and reliability. Above all, as we are dependent on public licence fees, we have to be very careful what we spend the money on and operate as efficiently as possible, in order to create the best value for consumers in Austria.


Light, atmospheres and talent A few years ago this Director of Photography had not yet become professionally interested in television. However, after authoring acclaimed productions such as “Thelma” (2017), “Louder than Bombs” (2015) or “The End of the Tour” (2015), his following project was taking charge of the cinematography in what eventually became the breakthrough fiction of 2019: “Chernobyl” (HBO), created by Craig Mazin and directed by Johan Renck. The possibility of completely developing – with the same team – the visual finish of the five-episode miniseries was the final push that this creative Swede needed to get involved in the world of series. His brilliant participation – it could not be otherwise – has paid off: the project has boosted his international recognition even further and won him an Emmy award in the ‘Outstanding Cinematography for a Limited Series, Movie or Special’ category. TM Broadcast had the opportunity of chatting with Jakob Ihre to delve into his views about the profession and some of the creative keys to the ‘Chernobyl’ universe.





First, how would you define your work as a DoP? Are there any common elements among your works? I am a cinematographer telling stories, expressing emotions with moving images. Hopefully, the projects I do are so diverse that you shouldn’t really be able to recognise who is behind the camera.

Technology is critical to DoPs. How is your relationship with technology? Does technology help or limit you? Every project needs an enormous amount of technology, but it shouldn’t be an obstacle or take up space; rather, it should be something so absorbed that it becomes a part of your own intuition and creativity.

“Chernobyl” is your first TV experience. Why didn’t you decide previously to get involved in this area?

I have always focused on feature films, where a fixed group of filmmakers is part of the entire journey of a film. In that sense, it is something very personal for all of us. You are giving birth to and raising a child. Chernobyl felt in many ways like a feature film. The construction of Chernobyl as a miniseries made it possible to have the same team (director, production designer, DoP, 1st AD, etc.) throughout the shoot – one voice.


You just said you understand “Chernobyl” as a “long feature film”. But, isn’t it true that the barriers between contemporary TV dramas and feature films are increasingly blurred?

The cinema theatre makes the big difference. To present your work on the big silver screen to an audience is the ultimate presentation. It’s a temple in the dark, where a ray of light travelling over the audience projects an image larger than life. TV can’t beat that.

What was your camera + lenses package for “Chernobyl”?

ALEXA Mini and Cooke Panchros.

Due to the realistic depiction of what happened in “Chernobyl”, it seems you replicated stock footage. What was it like working on the reproduction of those videos?

We never wanted to replicate the stock footage. The research material helped us to understand the world of Chernobyl. The amazing art department, with production designer Luke Hull and costume designer Clair Levinson Gendler, set out to make our world believable. But still, in their work they make their own decisions on how they see Chernobyl. It’s not a 1:1 replication. As for the cinematography, we took our own stand and wanted to portray the Soviet Union our way, based on facts, but also on artistic and creative preferences.

The main characters in the show are not heroes, but people in difficult and on-the-edge situations. How does it translate into your work? Did this drive any creative decisions? The cinematography sought to portray real people – to be honest and truthful in that approach. No need to enhance the protagonists with heroic lighting or framing. Just honesty.


Did you have any visual references to the TV world when creating the “Chernobyl” cinematography? What’s Jakob Ihre’s visual imaginary? When shooting a film for 8 months about the Soviet Union in Lithuania and Ukraine, once part of the former USSR, there is an urge to know more about the period. We all studied the history and politics of the era, and you get, of course, interested in studying the cinema of the time. Films like “Andrei Rublev”, “Stalker” and “Come and See” inspired us all in making the film.

“Chernobyl” presents both an oppressive feel and interesting work on particles and how light transforms them. How did you work on both fields? We introduced the sun in many scenes. Rays of light are often present in the frame, often overexposing parts of the witnesses’ and victims’ faces and bodies. We had from early on made the decision to portray the radiation with beams of light. The more radiation, the more overexposure. Also, hearing accounts from Chernobyl witnesses that some thought they were seeing the radiation in the dust and particles when backlit by the sun gave us the motivation to use the sun as something foreboding and oppressive.

Post-production and VFX are very important in “Chernobyl”. How do you get involved in these processes?

The main approach is to not see it as post-production (something you solve later), but rather to try, early in the process, to visualise how you see the result. VFX and SFX work should go hand in hand with the camera on set. The VFX/SFX supervisors are, of course, as important to you as your gaffer and camera crew, and you should develop a very close collaboration with them. They should not be seen as something remote that speaks another language.


Moving on to a broader view of the DoP world… What technological solution would you like to see developed in the future? If it is physically impossible, through new technology, to create a laboratory that will develop motion picture film faster, cheaper and closer to the location where you are filming, then I hope that one day we can reproduce the texture and quality of film emulsion with a digital camera. So far, there is no digital camera that can resemble the glory of something shot on 35mm film.

Finally, what’s next for you? Will you continue to delve into TV shows beyond “Chernobyl” and “Dispatches from Elsewhere”? Johan Renck is developing the TV adaptation of “The Last of Us”. Are you going to work with him again on this project? I am trying to stay away from longer projects in these trying times. It would be amazing, of course, to team up again with Johan Renck (“Chernobyl”) and with Jason Segel (who shot the pilot of “Dispatches from Elsewhere”). I had such an inspiring time filming with them both.



SHORTENING TIME AND BRIDGING DISTANCES How to reach our viewers quicker and better Text: Luis Pavía





This deceptively simple title comprises very different technologies, workflows and purposes that must be properly distinguished in order – as usual – to be able to choose the most adequate option for each scenario and circumstance. Without doing an exhaustive analysis of all devices and cameras on the market, our interest will focus on the technology and the main alternatives that are available. For many decades now, television transmission from the emitter to viewers – the traditional ‘broadcast’ environment, in a wireless format – has been the basis for traditional TV. It is something we are all quite familiar with, and it has gone through different stages and technology leaps along the way. But the area that is drawing our attention today is exactly the other side of the broadcasting station: the one in charge of conveying signals, from camera capture to the production and direction centers.

Also on this side of the production chain we are acquainted with its usual possibilities, with well-known deployments for live broadcasts, especially for major sports or cultural events such as the Olympics or concerts. To make it simple, we could sum it up by saying that, in these instances, the usual way of implementing such deployments has been through mobile units that would take most production and direction tasks to the event’s location and relay to the station a single signal, already mixed, by means of a sophisticated and very expensive satellite link or a dedicated line, thus turning the broadcast centre into little more than a relay in charge of disseminating the relevant signal. We insist on the fact that this is a rather simplified view of things, as on some occasions these productions are extremely complex and require the participation of hundreds of excellent professionals in perfect coordination, both in mobile units and in the various broadcast centers. Another usual scenario, such as news or stories, is that in which a single reporter or camera operator sends content to the agency or station, and from there said content goes through the traditional ingest/editing/production procedures. But this scheme has normally been limited by the time required for sending content of sufficient quality through the available media. Even in today’s Internet times, until very recently it was necessary to devote a significant amount of time to uploading files to some kind of server. This was due to the volume of the files and the limited capacity of transmission lines. The further away the location where news or events were taking place, the more noticeable the limitations. But for a few years now, thanks to the inclusion of new features in cameras, more efficient compression algorithms and higher bandwidths in transmission channels, doing live broadcasts has become a much easier task. And not only with regard to simpler operations, as the real boost has been to see costs drop to nearly negligible levels. Now things are really easy and affordable. And mainly for this reason, these novel ways of operating are now a reality that opens up a whole range of new options and, in view of the doors bound to open in the near future, such possibilities will mean yet another transformation in the way of creating and distributing content. But what are we specifically talking about? Actually, we are referring here to the various technologies, concepts and capabilities that converge so as to create a new horizon: cameras that come with built-in connectivity capabilities, or that have them added afterwards; compression systems that succeed in keeping the same quality and resolution with a much lower data flow; transmission methods over different kinds of networks – Wi-Fi, 4G, 5G – and structures such as ‘bonding’; and even new possibilities such as cloud directing.


So let’s gradually delve into these camera systems that give our piece its title and which, as we will shortly see, sometimes are not actually cameras. In order to properly understand what they do and what possibilities they offer, we must first analyze what the purpose of these systems is and what means each of them uses to achieve said purpose. The basic idea is very simple: make the flow of binary data generated by the camera during capture reach the ingest/mix/edit/directing unit in real time, without requiring any physical media for transport. Actually, this is similar to what we do when we watch videos on any platform on our mobile phones. The big difference here is that said content is already recorded, and our mobile phone can gradually download and play it from a cache memory with just a few seconds’ delay, so as to compensate for any fluctuations in data transfer speed through the relevant communication medium, thus ensuring smooth playback. When capturing, this is unfeasible, and even more so if we want to sync content from different cameras. Therefore, our channel must ensure a sustained transfer capacity per second over lengthy periods of time: a stable “bandwidth”, sufficient for conveying the huge volume of data generated by the camera, especially when dealing with content with high quality requirements – resolution, dynamic range, color depth, frame rate, etc. – that cause the volume of information to be transferred to grow steadily.
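To get a feel for the orders of magnitude involved, a rough back-of-the-envelope calculation helps. The figures below (1080p50, 10-bit, 4:2:2 sampling, and the H.264/H.265 contribution bitrates) are illustrative assumptions for this sketch, not values from any specific camera or standard:

```python
# Rough, illustrative arithmetic: uncompressed bitrate of an HD video signal.
# All parameters are assumptions for the sketch, not broadcast mandates.

def uncompressed_mbps(width, height, bits_per_sample, samples_per_pixel, fps):
    """Raw video bitrate in megabits per second."""
    bits_per_frame = width * height * bits_per_sample * samples_per_pixel
    return bits_per_frame * fps / 1_000_000

# 1080p50, 10-bit, 4:2:2 (on average two samples per pixel: Y plus shared Cb/Cr)
raw = uncompressed_mbps(1920, 1080, 10, 2, 50)
print(f"Uncompressed: {raw:.0f} Mbps")   # ~2074 Mbps: far beyond any mobile link

# Assumed contribution bitrates (Mbps) for good-quality HD with each codec
h264, h265 = 20, 10
print(f"H.264 ratio ~{raw / h264:.0f}:1, H.265 ratio ~{raw / h265:.0f}:1")
```

Even with these toy numbers, the signal must be squeezed by two orders of magnitude before a 4G channel becomes a realistic transport, which is why codec efficiency and stable bandwidth dominate this whole discussion.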

In this regard, a traditional Wi-Fi network, featuring a standard 54 Mbps bandwidth and limited reach in distance, can turn out to be insufficient for high-quality content. With the generic specifications of our phones’ 4G-LTE networks we should have a bandwidth of up to 1 Gbps, which can drop to a maximum of a mere 100 Mbps when the mobile phone is traveling at speeds of up to 200 km/h. With such an available bandwidth we would already have a viable channel, although it

may actually fall short as well – not because a 1 Gbps network is not enough, but because of fluctuations in speed or network congestion issues at certain times. In order to sort out these limitations we have two main alternatives. The first one is based on increasing the number of available connections by resorting to a technique known as ‘bonding’. By this method, the signal is distributed along several channels that allocate the volume of information being transferred based on availability, with sufficient margin. This ensures availability of the required bandwidth by dividing the signal between several mobile data lines working in parallel. The second one is decreasing the volume of data to transfer. In order to achieve this without losing quality, different compression algorithms have been developed which, based on the perception of the human eye, place the emphasis on ensuring the quality perceived by viewers. Thus, for example, with less than half the volume of data and bandwidth, the H.265 algorithm provides a perceived quality that is clearly higher than the one achieved by H.264. It must be borne in mind that the various algorithms yield different outcomes depending on the type of content to be compressed. Although compression itself deserves a special


article, suffice it for the time being to get the basic idea: better compression algorithms do decrease volume without harming quality. By combining both techniques – ‘bonding’ and more advanced algorithms – we can uncompromisingly ensure quality levels in transmissions. We already have 5G around the corner, which will obviously mean a whole new quantum leap in bandwidth, although this will go hand in hand with the creation and distribution of content with higher requirements, such as 4K, greater color depth, HDR, HFR, etc. This will keep the wheels of growth in technical possibilities rolling in perfect harmony with requirements. And we still have a couple of icings for our cake that we must not lose sight of under any circumstances. One: the destination of our cameras’ connection will not necessarily be a traditional station or a mobile unit as such. In addition to these connections, it is nowadays also feasible for content to be managed from virtual production systems, allowing us to perform live direction in real time by operating virtual mixers and broadcasting content simultaneously through streaming platforms, even from the cloud. And another particularly interesting one is when the camera uploads all content in real time, although with just enough quality for a first broadcast – but along with all the metadata. What makes the system different is that all content can be recorded for further editing, and then only the necessary fragments are requested from the camera in maximum quality, so as to offer the best content while having moved only the minimum amount of data strictly necessary. Before moving on to outline some of the options available, we would like to remind you that, due to the


international scope of our publication, it could be the case that some devices or public data network capabilities may differ from the ones described in this article. Starting with the cameras, and as long as we stay within the field of


handheld cameras, multipurpose cameras, ENG or even digital cinema cameras, nearly all manufacturers have multiple models featuring wireless connectivity and various technologies. Amongst these technologies, worth noting are the two best-known ones: Wi-Fi and mobile data. The purpose of having Wi-Fi in a camera is to enable the possibility of establishing connection

through an already existing router or to utilize the user’s own mobile phone as a communications gateway. This can also be done by means of autonomous WiFi+4G routers that carry out the same mobile gateway role, but without needing the phone. Distances to cover will be short when using Wi-Fi, and significantly longer for 4G-LTE networks. And careful here with the possibilities provided

by each firmware release of each camera version, as in some of them Wi-Fi may be usable only for remote operation, but not as a means for sending content. It would not serve our purpose in this case. Regardless of said Wi-Fi connectivity, some cameras also have the possibility of directly using a USB port for connecting the typical dongle (wireless USB modem) that holds a mobile carrier’s network card, for using mobile data networks as the means of transfer. In this case, keep in mind that several USB ports may enable camera bonding, or may be restricted to specific functions. Both autonomous routers and dongles have the advantage of allowing use of the same devices in different countries and markets, or even with different carriers within a country based on the various coverage maps, just by changing the telephony carrier’s network card and without blocking the use of the carrier’s mobile device. Additionally, thanks to the imminent availability of 5G networks, just by changing devices we will get the performance of the new network by means of a minimal investment. In view of the broad product portfolios offered by manufacturers such as Canon, JVC, Nikon, Panasonic, Sony, etc., and the huge number of models in their various ranges, any enumeration would be incomplete. Furthermore, considering the new functionalities that are normally added by successive firmware updates, we recommend referring to the updated specifications of each manufacturer whenever we need to make sure that the features of the relevant model and version suit our needs. Let’s move on to external devices, which work from standard video connections of the camera or any other source, such as SDI signals in their

different variants, or HDMI. In this instance the source is freed from network configuration, and all parameters relating to compression, data rate, network set-up, etc. are passed on to the encoding/broadcasting device. These devices offer the advantage of being functional for all kinds of cameras or sources and, although they naturally increase the weight, volume and power requirements of the whole deployment, they are more efficient, as they allow more flexibility in operation, an increased number of ‘bonding’ channels, and even streamlined control of data traffic. All these video flows that have been fed into a data network are assembled back into a conventional video signal by the relevant decoders. We have actually replaced the physical cable that



connects the camera – located anywhere on the globe – with its master input on the production mixer, located in the mobile unit or in the station itself, anywhere else in the world. Once the broadcasting and receiving devices are synced, they are capable of managing all compression/decompression and network parameters in order to distribute traffic among the available channels in a seamless fashion for operators. In these two latter groups of devices, as in many instances they will work as broadcasting/receiving pairs, two major operating styles can be distinguished. On the one hand, equipment making a point-to-point wireless connection through a proprietary radio link, with a sightline between antennae and a reach of a few hundred meters, such as those from AbonAir or Teradek. And, on the other hand, systems supported by data networks that are normally accessed through network operators or mobile carriers, which offer broadcaster-receiver

reach over practically unlimited distances. In this second instance, we would be dealing with systems such as those provided by TVU Networks or U-Live. But these external devices do not necessarily have to operate in pairs. This is because, if our intention is not broadcasting from a traditional broadcast antenna but creating content to be exclusively distributed through streaming channels, it is feasible to use only the broadcasters within the second group to make the conversion and manage the whole production in virtualized systems in the cloud, such as those from TVU Networks. Therefore, once the camera’s signal is on the data network, all production, composition, forwarding to the streaming platform and distribution to clients is performed without leaving the network. Obviously, the cameras can be deployed in different places across the globe, the director in a different location, and the clients spread throughout the world, with no geographical limitations other than the reach of the data networks operated by the various telecommunication carriers. We even have the possibility of generating the streaming flow for direct forwarding to the platform from a single broadcast device, which could be the camera itself or any of the above-mentioned devices. Last, and offering the utmost versatility and efficiency – as it can be used for live broadcasts and subsequent consolidation of top-quality content – Sony’s XDCamAir system is based on an advanced functionality in certain cameras, combined with specific servers to which content is uploaded in real time during capture, in quality up to HD. Content is sent along with all metadata, but with a very low data rate, in order to make it viable through remote or limited network infrastructures.
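The economics behind this “proxy first, full quality on demand” approach are easy to sketch. The clip durations, edit decision list and bitrates below are invented purely for illustration; they are not figures from Sony or any other vendor:

```python
# Hypothetical sketch of a proxy-first contribution workflow: edit against
# lightweight proxies, then fetch only the used fragments at full quality.
# Clip list, EDL and bitrates are invented numbers for illustration only.

PROXY_MBPS, FULL_MBPS = 3, 50            # assumed proxy vs full-quality bitrates

clips = {"cam_a": 3600, "cam_b": 3600}   # two one-hour recordings (seconds)

# Editor's cut: (clip, in-point, out-point) in seconds
edl = [("cam_a", 120, 150), ("cam_b", 400, 420), ("cam_a", 900, 960)]

def megabits(seconds, mbps):
    return seconds * mbps

naive = sum(megabits(dur, FULL_MBPS) for dur in clips.values())
proxy = sum(megabits(dur, PROXY_MBPS) for dur in clips.values())
fragments = sum(megabits(out - start, FULL_MBPS) for _, start, out in edl)

print(f"Send everything at full quality: {naive:,} Mb")
print(f"Proxies + used fragments only:  {proxy + fragments:,} Mb")
```

Even in this toy example, moving the proxies plus only the fragments actually used in the cut transfers well under a tenth of the data of sending everything at full quality, which is precisely what makes the approach viable over remote or limited networks.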

This gives editors the possibility of setting up a program relying on instant access to the entire material. The interesting thing about this notion is that, once the program is assembled and validated, the editing system just needs the remote camera to send only the fragments that are necessary for consolidation of the final program at maximum quality. This only requires the camera to be on and connected to the data network; thus, with no need for the intervention of a camera operator – and even if the latter remains in a place with limited connectivity – all necessary content can be made available to the station with maximum efficiency. To sum up, the current panorama of wireless camera systems allows us to configure different workflows, spanning: - Cameras that themselves take charge of



getting connected to the data network, via Wi-Fi and 4G/5G. We will need to configure the video and network parameters in the camera.
- Devices that convert and forward SDI or HDMI video signals and manage them through data networks. We will need to configure the video and network parameters in the encoder/broadcasting device.
- Cameras and broadcasting devices capable of generating a streaming flow that is directly sent to distribution platforms, with no need for any elements in between.
- Reception systems that reconstruct a traditional video signal coming from data networks, for injection into conventional mixers. We will need to configure the video and network parameters in the receiver/decoder.
- Directing platforms in the cloud, which gather video flows from several sources, process them and directly generate streaming flows that reach distribution platforms.
- Servers capable of receiving a limited-quality signal – but containing all metadata – making it available to editors for full editing and, based on the editing metadata, interacting autonomously with the camera to download at maximum quality only the fragments of a recording that are necessary for the final program.

It is, however, a bit paradoxical that nowadays anyone equipped with a camera or a mere mobile phone can broadcast live content at nearly no cost, reaching an audience that the world’s biggest broadcasters could only dream of just a decade ago. As we can see, with all these possibilities available, and if we additionally combine them with the elements at our reach, we have the resources to face nearly any project with the highest chances of success, reliability and efficiency possible.
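As a final illustration, the ‘bonding’ mentioned throughout this piece boils down to spreading one video stream over whatever links are available at the location. The sketch below uses invented link names and capacities and a naive proportional split; real systems adapt the allocation continuously, packet by packet:

```python
# Minimal sketch of channel bonding: split a required video bitrate across
# several mobile-data links in proportion to their current capacity.
# Link names and capacities (Mbps) are invented for illustration.

def bond(required_mbps, links):
    """Allocate required_mbps across links proportionally to capacity.
    Returns None if the aggregate capacity is insufficient."""
    total = sum(links.values())
    if total < required_mbps:
        return None                      # not enough aggregate bandwidth
    return {name: required_mbps * cap / total for name, cap in links.items()}

links = {"sim_1": 12.0, "sim_2": 8.0, "sim_3": 20.0}   # fluctuating 4G links
plan = bond(30.0, links)                               # a 30 Mbps HD feed
for name, share in plan.items():
    print(f"{name}: {share:.1f} Mbps")
```

In practice, when the aggregate capacity drops below the required bitrate, a real encoder lowers its target bitrate rather than giving up, but the proportional-allocation idea is the same.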




THE GOOD FILM COMPANY & REMOTE FILMING
MAKING PRODUCTIONS HAPPEN IN COVID-19 TIMES
The UK media industry has always been characterized by relentless activity. Dozens of productions are undertaken every day in a keen effort to meet the expectations of a valuable industry and make the most of the iconicity of British locations. The Good Film Company has, since 1988, been offering a wide choice of production services to companies around the globe. Throughout its history, the company has gained the trust of major players such as Amazon, Netflix, Sony Pictures and the BBC. But far from halting production activities in the light of the current context, it has remained active and guaranteed productions, either on location or thanks to Remote Filming, a recent initiative with plenty of possibilities. We got to know more about the proposals brought by both firms from Yanina Barry, founder of The Good Film Company and co-founder -together with Alex Seery- of Remote Filming.



First, could you tell us more about The Good Film Company and Remote Filming?
Based in London, Good Films was founded by Yanina Barry in 1988 and is widely recognised as the UK's leading Production Service Company, with a truly global client base. Yanina is Executive Producer at Good Films, which has produced a wide variety of films, including TVCs, Drama, Movies, Music Videos, Corporate Films, BTS, Documentaries, Digital & Branded Social, CGI and Animation, and Stills. Good Films works extensively with Celebrities - Film Stars, Sports Personalities, Actors and Musicians. As a result of her international work, Yanina recognised the need and potential for remote streaming from UK shoots to clients overseas who were unable to travel. In September 2019, Yanina co-founded Remote Filming, developing an easy-to-use, encrypted and zero-latency streaming service for Production Companies, Agencies and Clients, enabling them to livestream their shoots from anywhere in the world, to viewers anywhere in the world. The two companies are separate, but Good Films’ wide experience in film production has informed Remote Filming immeasurably. Remote Filming’s other principal Partner is Alex Seery. Alex is founder of Hijack, a highly-regarded provider of a range of on-set and post-production services that simplify digital workflows and support creativity. Hijack provides DITs, lab services, colour grading, audio post, mastering and deliverables. Alex is hugely experienced both on set and off – which makes him unique in understanding the technical and practical demands of the industry’s needs in remote streaming, having worked on commercial, movie, and TV productions

nationally and internationally. Remote Filming’s development was extremely timely; the remote streaming technology, now winning acclaim from so many Producers, was originally created to reduce the need to travel for shoots, saving money and benefitting the environment. Now, in this Covid era, the technology has helped facilitate safe shoots on six continents.

What kind of services do you offer?
Good Films is a Production Service Company, supplying a fully set-up production partner infrastructure for clients from overseas and locally to shoot in the UK. We provide the total package to facilitate productions, with a hugely experienced production team. Services include casting, studio, location, special effects, motion control, product/pack shots, animation, animatronics, CGI, AR, and stunt driving. We have a UK location




library of over 100,000 images and a specialist car film unit. Remote Filming, at its simplest, enables Production Companies, Agencies and Clients to participate in a shoot anywhere in the world, from anywhere in the world, without the need for expensive hardware or specialist crew, and with no need to install complicated apps or software. Remote Filming is new and unique technology allowing encrypted, lag-free remote viewing of single or multi-camera shoots. Uniquely, Remote Filming works for all stages of film production, from prep to post – and is perfect for stills shoots.

What’s your vision on technology? Do broadcasters and production companies often request the latest innovations in the industry?
Having the latest and most advanced system isn’t top of their wish list – what they really want is a reliable solution that they know will work well and won’t hold up a production that is often tight on deadlines and budget. But some projects do need innovation – they need something that hasn’t been done before. It’s exciting to be a disrupter. Innovation doesn’t have to be expensive for them – that is where experience comes in, applying it to the brief and finding a new way around. That’s how Remote Filming came into existence: applying experience and creativity to the environmental problem of so many people flying from one end of the world to the other for a shoot, and spending such a large proportion of a budget on travel and hotels. There were remote filming options around already, but they can be clunky and complicated, often with poor image quality and high latencies. Remote Filming disrupted all that and made it easy for productions to transmit and viewers to see.

Would you say UHD is currently an industry standard or are we still living in the HD era? Do your customers demand 4K and HDR workflows?
Clients in all fields are certainly preparing for UHD and so are we. As the technology becomes more universally available, costs will adjust to suit future budgets. The confluence of reducing costs combined with consumer demand will encourage UHD production as a first choice.

What have been your latest broadcast / VOD projects? Could you name some of the clients who have trusted your services?
We have signed so many NDAs with clients that it feels like it’s an extra level of encryption. It’s completely understandable and makes perfect commercial sense. It applies to Good Films, and even more so to Remote Filming. I can say we have worked with Amazon, Netflix, Sony Pictures, the BBC, and a host of independents.

Technologically speaking, what are the solutions or standards most demanded by your customers?
Our clients need all that Remote Filming offers – stability, reliability, near-zero latency and 100% encryption. We are working constantly to ensure that Remote Filming is ahead of the curve for encryption and security protocols without losing any of the ease of use or stream quality. In addition, we can react quickly to finesse Remote Filming as we get feedback from users ‘on the ground’. With Good Films’ 33 years of film experience and our Remote Filming Partners, we have designed the easiest system to set up and to stream. It takes a lot of technology and development to make something look easy!

Do you have technology partners? Which manufacturers did you trust for your production equipment?
For Remote Filming, our in-house developers allow us to harness the often creative genius of development partners, working together on a vision of how the industry will develop and what it will value. Good Films values experience, creativity, and professionalism above all else within the industry. The relationships we have with crew, casting, equipment hire, location teams and art departments make it no surprise that suppliers trust us and have a genuine friendship with us. Of course, technology is playing an ever-increasing role in our work and our relationships allow us to quickly identify and understand developments.

Remote production is a big deal, especially in today’s context. What technologies are part of the workflow you offer to customers?
Communication is key to the success of Remote Filming. Remote Filming is unique in the level of support we give our clients throughout. We give 24/7 live remote telephone support and can remotely access the shoot immediately to assist our clients with any problems. Our cloud-based solution runs on the most stable networks, and our network is triple secured in terms of load bearing.


At Remote Filming, are you currently implementing IP – Cloud – Virtualized production solutions? Do you consider that the industry is heading towards a production paradigm in which an important part of technologies + human resources will be delocalized?
This shift is undoubtedly happening as a result of the Covid-19 pandemic. Our brilliant industry has had to adapt to the challenges and difficulties that have impeded travel and shoots. People forced to work remotely are now enthusiastic adopters, clearly seeing the advantages. Good Films has always worked remotely with clients up to the point they arrive – we have been conducting remote international pre-production for many years. So, for us, nothing dramatic really changed in that sense. Remote Filming was developed pre-Covid as a direct response to the challenges of budgets, schedule workflows, and the money and time costs of international travel. Our clients were looking to make savings by flying fewer people across the world, and others were also looking at the environmental costs of travel. One of our huge advantages has been the involvement of international producers and crew in the design and development of the system. It has been developed by people who do the job, not people with limited experience of what our day-to-day life is really like. Feedback from clients has been enthusiastic about long-term adoption. Production offices have it ‘on the wall’ so they can keep ‘in touch’ with shoots even when they cannot be on set. Main directors can participate in 2nd-unit shoots that they would normally be unable to attend. Art departments can keep ‘eyes on’ sets whilst they are working on the next set-up. Clients who don’t

need to be on set all day can easily ‘drop in’ to make key decisions. These situations are hugely advantageous, saving time and money.

How has Covid-19 transformed your workflows?
Covid-19 has certainly made it essential to prepare for every shoot meticulously, but experience and training have allowed us to incorporate the necessary protocols without too much fuss. Our production teams have also worked primarily remotely – but there are so many communication tools available that we have


found it easy to work together as long as we have decent internet. Our brilliant trade associations – for example, Steve Davies at the APA and Dawn McCarthy-Simpson at PACT – have been tireless in supporting producers and crew by creating internationally respected protocols, clearly communicated, to allow us to carry on working. Our production team are all qualified Covid-19 supervisors – the buck always stops here! So far, nothing has been too challenging – we have been able to provide remote casting and location scouting as well as filming and stills.

Regardless, despite Covid-19, the AV industry has not stopped, as VOD and OTT customers keep demanding more and more entertainment. Do you consider that the AV industry is active nowadays?
Undoubtedly, yes. It’s been clear throughout this Covid era that many productions have been delayed, but most have found a new way to work within the guidelines. The business in the UK is extremely busy and shows no signs of slowing down. Through Remote Filming, we are also aware of the high level of international production. With VOD, OTT and other online channels crying out for more and more content, the sector is set to grow further. Part of that growth will depend on technologies like Remote Filming.

What’s the future of The Good Film Company and Remote Filming? Do you have any technological refurbishment planned? Which technologies do you think will be especially significant for the future of your area?
For Good Films, we are looking for more ways to support our international clients. We are constantly reviewing our systems and responding to our clients’ demands. We have seen an increase in demand for very small drop kits and contactless production solutions for remote video and photo capture. However, this may just be a result of the Covid-19 restrictions. Our primary focus is providing the very best service to our clients at whatever scale of shooting they require. For Remote Filming, we are brimming with ideas and new possibilities – but we won’t share them until we launch! Our in-house and international developers are a demanding bunch – they are always looking over the next horizon.



8K UHD SIGNAL IN DVB-T2
Text by José Manuel Menéndez, Professor of the Higher Technical School of Telecommunications Engineering and Director of RTVE’s Chair at UPM

8K in DVB-T2. But, is it possible?
The Chair of the state-run Spanish TV and radio corporation Radiotelevisión Española at Universidad Politécnica of Madrid presented on 21 October 2020 the world’s first pilot broadcast of an Ultra-High Definition signal – 8K UHD – in DVB-T2. The following day, the international written press echoed the news, commenting in their opening lines that “...they have carried out what they call the world’s first pilot broadcast of 8K signal in DVB-T2”. So, how do we know this is actually the world’s first full pilot? It is very simple. When the Chair initially considered carrying out this pilot project, they did so as usual: by undertaking an actual broadcast with commercial broadcasting equipment, and also using real equipment for reception to complete the pilot. A broadcast without a receiver to check and ensure that all steps followed are correct does not make sense. So, as on other occasions, we got in touch with the major TV screen manufacturers and requested their help. Again, manufacturers warmly welcomed our


initiative and offered to provide us with screens. Once the screens were received, a test broadcast was performed in the RTVE Chair’s Labs at the Higher Engineering School, UPM, and we found that the screens were not capable of displaying the signal. We contacted once more the manufacturers, who in turn got us in touch with

their developers at their own headquarters, and the response we received was unanimous: “8K in DVB-T2 is NOT possible.” Or is it? To show them it actually WAS POSSIBLE, they were forwarded a short fragment of the Transport Stream, and they got down to work with it in order to modify their screens' firmware. In a matter of a few weeks, and after several firmware update iterations performed on the screens, reception was possible and the signal was displayed without issues. We then realized that, thanks to the cooperation from manufacturers, we had the only TV monitors in the world capable of receiving and displaying 8K via DVB-T2. We searched the available records and found no previously published broadcast of an 8K signal in DVB-T2 anywhere in the world. Therefore: yes. What “we called” the world’s first full pilot broadcast of an 8K signal in DVB-T2 can apparently be regarded as the first one worldwide.

A bit of history
The RTVE Chair at UPM was born out of an agreement signed in January 2015 between Radio Televisión Española and Universidad Politécnica of Madrid through the Higher Technical School of Telecommunications Engineering. This agreement has enabled the establishment of a strategic partnership between both organizations with the aim of carrying out training, research, academic and dissemination activities within the field of new signal formats (from an engineering standpoint) and these signals’ new processes for transport and dissemination to users. The Chair has carried out several events associated with the dissemination of UHD signals, always in cooperation with other companies and entities in the sector. Some of these events are:
• 21/05/2016: Full, live broadcast of the Parsifal opera in 4K UHD. The 4K UHD signal, with a refresh rate of 50 frames per second, was produced by RTVE at Teatro Real strictly live and was broadcast via the Hispasat satellite. Four reception points were available: the Kinepolis movie theatres, the Prado del Rey

auditorium, the ETSIT-UPM university school and the Collserola Tower in Barcelona. The signal received at ETSIT-UPM was also broadcast on the fly by the RTVE Chair under the DVB-T2 digital terrestrial television standard, with coverage for a wide area around the Ciudad Universitaria university facilities in Madrid.
• 01/07/2017: Spain's first

live broadcast of the changing of the guard at Palacio Real in Madrid, featuring 4K UHD signal and HDR technology in Dolby Vision and HDR-10, with dissemination via DVB-T2 for Madrid, Barcelona and Seville.
• 09/10/2018: First production and broadcast in Spain through DVB-T2 of UHD-1 Phase 2 in full, that is, 4K UHD signal, including high frame


refresh rate, high dynamic range, expanded colour space and next-generation audio. The signal was broadcast for Madrid, Barcelona, Seville, Málaga and Santiago de Compostela.

8K UHD signal pilot in DVB-T2
As on other occasions, this pilot has been possible thanks to the cooperation given by all members of the Chair’s Advisory Committee on technological aspects: Cellnex Telecom, Dolby, Sapec and Corporación Televés (Gsertel, TRedess and Televés), as well as Abacanto as a collaborating entity. For signal capture, reliance was placed on the cooperation provided by SONY, which lent an F65 camera to RTVE for a

few sessions. In view of the camera’s busy international bookings agenda, it was not possible to arrange for outdoor production as initially planned, and recording was made indoors as well as in the outdoor gardens at Prado del Rey. Given the fact that this was a technical pilot operation, requirements for the signals so captured were very strict in terms of contrast and backlighting (so as to test the camera’s and the display screen’s high-dynamic-range capabilities) and colorimetry (in order to make the most of the colour gamut offered by recommendation ITU-R BT.2020). For signal post-production, reliance was placed on the collaboration of Abacanto and the work contributed by SGO with their Mistika editor. Display of the signal during the event offered on streaming was entrusted to Samsung, which set up a special Samsung 8K QLED model for the occasion.


In view of the current circumstances marked by the COVID-19 situation, and following the relevant recommendations and safety measures, an event was organized in which attendees were able to follow the action remotely thanks to the streaming signal provided by the RTVE Official Institute, the venue where the event was held. In order to make the event more lively and entertaining, the chosen format was that of a programme featuring interviews, in which the moderator (a role played by Carlos Garrido, Head of Media Relations at RTVE) held interviews in succession with a selected number of participants, who explained the broadcast pilot's technical features. Taking part in the event were Federico Montero, Corporate General Manager, RTVE; David Valcarce, Director, Televisión Española; Adolfo Muñoz, Director of Broadcasting and Networks, RTVE; José Manuel Menéndez, Professor of the Higher Technical School of Telecommunications Engineering and Director of the Chair at Universidad Politécnica of Madrid; Miguel Ángel Doncel, CEO at SGO; Miguel Ángel Cristóbal, CEO at Sapec; Javier Foncillas, Vice President Commercial Partnerships Europe at Dolby; Miguel Ángel Bona San Vicente, Head of Planning for DTV and Satellite Platforms, RTVE; and Álvaro Llorente,

researcher, Visual Telecommunications Implementation Group, Higher Technical School of Telecommunications Engineering of Universidad Politécnica of Madrid, and member of the Chair. The 8K UHD signal broadcast has a spatial resolution of 7680x4320 pixels -four times the resolution of 4K UHD- and includes all technologies inherent to the so-called UHD Phase 2, such as:
• High Dynamic Range


(HDR), including the HLG (Hybrid Log-Gamma) transfer function as approved under the STD-B67 standard by ARIB (Association of Radio Industries and Businesses).
• High Frame Rate (HFR) of 50 images per second in progressive format.
• Wide Colour Gamut (WCG) consistent with the colour space from recommendation ITU-R BT.2020.
• 5.1.4 multi-channel audio.

• 10 bit/pixel depth and 4:2:0 subsampling for dissemination.
Sapec was in charge of performing the video signal compression by means of its HEVC encoder. Broadcasting an 8K signal via DVB-T2 entails a significant challenge, as the following complexities had to be tackled:
• Compressing the 8K UHD signal, which involved over 33 Gbps of video, into less than 33 Mbps while maintaining the video quality required by a signal having these features. This gives a compression rate exceeding 1000:1. To this end, all parameters improving HEVC encoding had to be perfectly tweaked and optimized.
• Handling the enormous amount of bits generated by the 8K UHD signal: 50 images per second, 32 Mpixels each (about 32 times more than HD).
• The necessary computing capacity. This capacity was estimated to exceed 6 times what would be needed for encoding 4K UHD, so ways of running processes in parallel were sought.
In principle, and according to the existing technical literature on this topic, using the HEVC standard for 8K signal requires more than twice the bit rate in order to maintain the same quality as compared to 4K resolution video. The bit rate needed for broadcasting 8K live video with optimal quality


would range between 60 and 80 Mbps. However, through a proper adjustment of compression parameters, video compression was achieved at an average bit rate of 30 Mbps, thus preserving high-quality viewing; once multiplexed with the audio channels and DVB tables, it was kept just below 32 Mbps, which in turn allowed performing the broadcast over the DVB-T2 channel. Dolby was in charge of encoding the 5.1.4 multichannel audio signal by making use of the Dolby AC-4 standard, which offers maximum efficiency and excellent sound quality. The content was broadcast from the TV headend of the Higher Technical School of Telecommunications Engineering at UPM, located in Ciudad Universitaria in Madrid, on channel 44 UHF, assigned on a temporary basis to the Chair of RTVE at UPM by the State Secretariat for Telecommunications and Digital Infrastructures (SETID) for test broadcasts. The 8K signal broadcast was also carried out on the UHD channel from Cellnex Telecom from the broadcast centres at Torrespaña and San Fernando de Henares in Madrid, channel 36; Collserola and Baix Llobregat in Barcelona, channel 45; Valencina, in Seville, channel 36; Mijas, in Málaga, channel 26; Monte Pedroso, in Santiago de Compostela,

channel 33; and La Muela, in Saragossa, channel 23. From a technical point of view, the resulting bit rate for the audiovisual content was around 32 Mbps. The parameters for the COFDM modulation used for broadcasting via DVB-T2 from UPM were: 32K extended, 1/128 guard interval and PP7 pilot pattern. A set-up with a single PLP was used, with 64QAM modulation and 5/6 FEC. With this configuration, a maximum usable bit rate of around


36 Mbps was achieved, which was valid for broadcasting the content without any issues and also for receiving the relevant signal at the RTVE Institute. On the other hand, Cellnex Telecom broadcast at its various centres with the following DVB-T2 parameters: 32K Ext., 256QAM, 2/3 FEC, GI 1/8, PP2, SISO. This is also a configuration suitable for supporting the bit-rate requirements while remaining committed to maintenance of coverage at a wider scale and to resilience of the service. For ongoing monitoring and supervision of the signal, Hexylon portable equipment was used alongside an RCS-100 unit, both from Gsertel (a company within Corporación Televés), which allowed a real-time analysis of all relevant parameters of the digital channel, both from the radio-frequency point of view and in regard to content features.
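As a side note, the headline figures quoted in this article can be sanity-checked with a few lines of arithmetic. This is a hedged back-of-envelope sketch: it assumes 20 bits per pixel for the raw signal (10-bit samples at 4:2:2, which reproduces the "over 33 Gbps" figure; the broadcast itself used 4:2:0), and the per-cell comparison deliberately ignores pilots, L1 signalling and frame overheads, so it will not land exactly on the quoted 36 Mbps.

```python
# Back-of-envelope check of the article's figures (assumptions noted inline).

# 1) Compression ratio: raw 8K UHD bit rate vs. the ~32 Mbps multiplex.
#    20 bits/pixel assumes 10-bit samples with 4:2:2 subsampling.
width, height, fps, bits_per_pixel = 7680, 4320, 50, 20
raw_bps = width * height * fps * bits_per_pixel
ratio = raw_bps / 32e6
print(f"raw ~ {raw_bps / 1e9:.2f} Gbps, compression ~ {ratio:.0f}:1")
# raw ~ 33.18 Gbps, compression ~ 1037:1

# 2) Net bits per OFDM cell for the two DVB-T2 configurations,
#    ignoring pilot, signalling and framing overheads.
def net_bits_per_cell(constellation_bits, fec_rate, guard_interval):
    # Useful-data fraction of each symbol is Tu / (Tu + Tg) = 1 / (1 + GI)
    return constellation_bits * fec_rate / (1 + guard_interval)

upm = net_bits_per_cell(6, 5 / 6, 1 / 128)    # 64QAM, FEC 5/6, GI 1/128
cellnex = net_bits_per_cell(8, 2 / 3, 1 / 8)  # 256QAM, FEC 2/3, GI 1/8
print(f"UPM ~ {upm:.2f} bits/cell, Cellnex ~ {cellnex:.2f} bits/cell")
# UPM ~ 4.96 bits/cell, Cellnex ~ 4.74 bits/cell
```

On this crude measure the two set-ups end up within a few percent of each other: UPM trades constellation density for a stronger code rate and a shorter guard interval, while Cellnex's longer guard interval favours wide-area coverage and service resilience, as the text notes.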

Reception and viewing of content during the event was made possible by the participation of manufacturer Samsung, which adapted one of its commercial Samsung QLED 8K models for completion of the broadcast-reception cycle, thereby making it possible to view the 8K signal being broadcast. In order to let anyone who wished see the live broadcast-reception for themselves -also after the event- an LG NanoCell screen, especially adapted for the event and provided by manufacturer LG, was deployed for continued operation. This pilot was one of the activities carried out by the Chair of RTVE at Universidad Politécnica of Madrid, created in January 2015, which has taken part in developing a wide range of experience in the field of UHD signal production and broadcast over the past few years.


LONG-TERM INDUSTRY CHANGES DEMANDED BY 2020
By Scott Carroll, Director of External Communications for the Vizrt Group

Looking at the lessons we as an industry learned this year, we quickly discover that we must do more than simply opine on what emerging technologies are coming our way. Rather, 2020 has issued a demand: that we take action in how we operate as an industry for the long term. This demand was first felt drastically in the early months of the year as the COVID-19 pandemic forced us out of our studios and offices and into our homes. Our response then was to emulate the environment we’d lost. And, largely, we found a way to do that. Broadcast news was

among the first to make the shift. Journalists continued to report the news – it just began to happen increasingly from kitchens, living rooms, home offices, and garages. Sometimes it had a virtualized background – and sometimes not. But did viewers seem to mind? The answer appears to be no. Sports broadcasts then began to adapt as live competition returned. Announcers were increasingly working from home, commentating on matches from afar. Empty stadia filled up with virtual fans – be it with computer generated overlays or with webcam

images of supporters from their homes.

What isn’t going away
Not everything will stick around, of course. While late night programming originally went into the presenter’s home, the desire to return a live studio audience – eventually, when it is safe – will never go away. Sports are already working within reduced capacity to return fans to the stands. However, as we returned to the studio under social distancing guidelines, we realized something critical: choosing between the way we “used to” do production and the new ways we discovered in 2020 didn’t need to be an either/or proposition. Rather, we could implement new techniques into standard workflows. Simply put, these new capabilities won’t just vanish. They will likely adapt, improve, and augment our content offerings. Here is what we are likely to see going forward as a blend of content.

Remote connections have gotten better

The most obvious shift will be toward more remote participants in our productions. This was, of course, a common occurrence before the pandemic. However, a new professionalism has come to video calls. As participants have become increasingly comfortable in front of the camera – and those cameras have gotten better – we can suddenly bring them into productions more seamlessly. A major enabler here is the ability to add either PTZ cameras to the home environment or to utilize high-definition cameras from our mobile devices. In both situations, the camera can be brought into common video conferencing software and then into live production


systems on the receiving end.

The flexibility of home studios – and home control rooms
This addition of “more professional” cameras in the home gives way to the next obvious solution: bringing broadcast equipment into the home. While it’s unlikely it will ever become common for a full studio build-out to exist in home environments – although it’s very likely some television personalities have flirted with the idea – there is now the capability to set up studio lights, green screens, and cameras to create a

“studio lite” environment for talent. More impressive, however, is the seamlessness with which we’ve seen production systems offer remote capabilities for directing, switching, graphics, and other real-time capabilities. This move of production tools into the cloud – and, perhaps, blended with on-prem tools – will only increase going forward. If not just for home use, to enable access to production tools anywhere with a network connection!

We’ve gone next level with graphics
Graphics capabilities

have taken a monumental step forward as a result of the pandemic. As mentioned previously, the sporting world seized upon the strength of graphics early on to add fans to an empty environment. Later, live fans were brought in individually and added to video walls within stadiums. In both scenarios, necessity was the mother of invention. And the ability to use virtualization and graphics was key. This creativity is likely to continue going forward. In fact, if you look at the U.S. election, you may have seen new capabilities in augmented reality and


virtual sets. Soon, it is likely we will see real time virtualized spaces where two presenters appear on the same set – despite being physically located time zones apart!

Fault tolerance will never be the same
Of course, all of this has an underlying benefit. While these new capabilities are a boon to creative storytelling, the ‘software-defined’ nature of it all allows for a new form of fault tolerance. And that software-defined component is critical. If we can increasingly move to software, we greatly improve our ability to

respond if physical studios fail again. While fault tolerance used to apply to having additional hardware to replace a failed box, it now has to consider what happens if we can’t physically go into work. Even in a scenario where we ignore the creativity these tools enable, the fault tolerance component alone should be more than enough to demand we change the way we work!

Embrace the change
Software-defined visual storytelling enables all of the capabilities outlined above. SDVS is the answer to the demands of 2020. It

is how we become more creative in our storytelling. It is how we plan to never be caught on the back foot again. Those who embrace the scalability, movability, flexibility, and additional storytelling capability of software-first mindsets will thrive going forward. So, look for ways to continue what you’ve learned over the past year. Seek opportunities to continue the innovation. And, most importantly, keep the creative tools in your toolkit. You never know when they might shift from creative solutions to critical solutions!