TM Broadcast International 72, August 2019


Summary

That’s how the Champions League was made: Step-by-step organization of a Champions League final, by Óscar Lago (Mediapro) ..................... 24
An HDR Champions League final with BT Sport ..................... 34
The world of 12G 4K/UHD processing, by LYNX Technik AG
Vizrt Case Study: La Chaîne L’Équipe
BBC Click: Technology in production and broadcast of news programs. Interview
Big Data: With Qligent and Imagen
Test Zone: Canon XA 55

Editor in chief Javier de Martín

Creative Direction Mercedes González

Key account manager Susana Sampedro

Editorial staff Sergio Julián

Translation Fernando Alvárez

Administration Laura de Diego


TM Broadcast International is a magazine published by Daró Media Group SL Centro Empresarial Tartessos Calle Pollensa 2, oficina 14 28290 Las Rozas (Madrid), Spain Phone +34 91 640 46 43 Published in Spain ISSN: 2659-5966

EDITORIAL

If there is a kind of broadcast that challenges the expertise, technological ambition and solvency of an audiovisual production company, it is sports broadcasting. Throughout this year you have been able to gain an insight into events of world relevance such as the Super Bowl, the FIS Alpine World Ski Championships and MotoGP. In this issue we deal with one of the most-watched TV events of the year: the 2019 UEFA Champions League final, played between Liverpool and Tottenham. We offer you an overview of this event through a feature article by Óscar Lago, Production Manager at Mediapro, the company in charge of producing the final, and an interview with Rob Levi, the match director for the British channel BT Sport. We also tackle the future from two different perspectives: first, with a series of articles on Big Data, a reality that is deeply influencing the world’s technological trends; and second, in a conversation with Simon Hancock, Editor at BBC Click, diving into this innovative British programme, well known for bringing formulae such as virtual reality and content interactivity into its broadcasts. Lastly, we would like to say a word about the IBC Show, a European reference in the broadcast field. Our newsroom is receiving an increasing number of press releases, an unmistakable sign of how the industry is preparing for this event. So far we have seen major innovations that will set the path for the future of our industry, but we are sure the best is yet to come. A broad team from our magazine, a media partner of this event, will travel to Amsterdam to tell you everything about this must-attend appointment. Are you coming with us?

4 AUGUST ‘19


Ross Video launches XPression Graphics V9.0

More than 40 new features have been added to Ross Video’s XPression Graphics software in version 9.0. The real-time motion graphics platform gains innovations such as extended font management, content acquisition and playout control across multiple XPression graphics engines, the ability to import files and layers from Adobe Photoshop, and improvements to the data-driven graphics content tool DataLinq™. “Each new release of our XPression software has a big impact on customer workflows,” said Patrick Twomey, Director of XPression Product Marketing. “Version 9.0 introduces powerful new workflow tools for almost everyone who touches XPression in their daily tasks, no matter how remote.” A new edition of the XPression Remote Sequencer is also part of the release. A few of the new features are: - Extended playout control of rundowns at the level of individual channels and

Take IDs - Possibility of modifying Take Items’ content and channel assignment directly, before going to air - Seamless integration of the XPression Project Server and Scene Manager for adding new Take Items at the last second. Ross Video has announced that customers with an active software maintenance contract should contact Ross Video Technical Support to schedule their upgrade to XPression V9.0.


Cinegy Air 14 is now available

Simplifying challenges: that’s the main objective of the multi-channel playout server and broadcast automation software Cinegy Air, whose version 14 has just been released. The integrated solution supports multiple features in a single box and delivers Dolby® Digital encoding, EAS, Nielsen watermarking and Cinegy Titler channel branding. It also has new features that the brand has described as “major changes”: - 8K support via IP and SDI (e.g. BMD DeckLink 8K Pro) - SRT encapsulated IP input and output - More performance optimisation for Cloud + GPU (up to 50% less CPU) - WDM and webcam input devices as sources - Item Type customisable colour categories - Playout Engine as a Windows service


Imagine Products® updates ShotPut Pro® and introduces its new Imagine HQ™ iOS App

The ecosystem of Imagine Products solutions expands with the new Imagine HQ iOS App, which provides remote access and real-time status reports as long as both the computer and the user’s iPhone or iPad are connected to the internet. The iOS app can also link multiple workstations together, and it’s available with any ShotPut Pro license type. Imagine Products has also released its ShotPut Pro Mac 2019.2 update, which enables reliable offloading of media. These are the new features of the software: - Real-time status updates via the Imagine HQ iOS

app for iPhone and iPad - Support for ARRI .ARI file offloads with Codex HDE integration, perfect for the soon-to-be-released ARRI Alexa Mini LF - A redesigned simple mode for quick and efficient offloading - Media Hash List (MHL) reporting


- More PDF reporting options, including the ability to designate first frame versus percentage sampling - Thumbnails for Codex .ARX RAW frames - A new RED dropped-frame flag - Support for Blackmagic RAW and Canon RAW formats - An updated RED SDK for

the most current metadata retrieval Michelle Maddox, marketing director at Imagine Products, talks about both systems and anticipates a cloud solution: “People are moving faster than ever, and workflows need to be more flexible and accessible. Our new iOS app transforms ShotPut Pro users’ day-to-day work

experience, bringing them much greater mobility while giving them peace of mind with real-time data and notifications. Going forward, we anticipate that the integration of these two solutions with our forthcoming cloud solution will have the potential to change workflows forever.”


disguise gx 2 powers Billie Eilish’s “When We All Fall Asleep” world tour

The rising alt-pop star Billie Eilish has decided to implement disguise gx 2 technology for her first full world tour, “When We All Fall Asleep World Tour”, through her content design partner Comix. Tom Brightman, Creative Director of the company, explains the singer’s needs: “Cour Design tasked us with coming up with visuals based on a nighttime journey through Billie’s dreams and nightmares. We set to

work developing individual treatments for the songs. The show would begin by introducing a nightmarish landscape, and then each song would be represented by its own dream world, from the inner demons of ‘Bury a Friend’ to the doppelganger-infested forest of ‘Copycat.’ We were excited; we knew Billie’s reputation for pushing creative boundaries, combined with this interesting concept,

and could tell this was going to be a great experience.” The tour kicked off at the Coachella festival. That performance required the gx 2 to provide two 2K outputs and three full HD outputs for an upstage HD LED video wall, a 4K LED floor and two HD IMAG screens. For the following shows, in concert venues, the company opted for one 4K and four 1080 outputs. Lewis Benfield, video


director, explains why disguise’s solution best suited Billie Eilish’s creative requirements: “disguise has a lot of advantages when handling a show like this. Its versatility is perfect for a camera-heavy, time-coded show: from its pinpoint accuracy when playing back large-format content with timecode, to Notch integration and the smooth capture and playback of cameras, all of which were paramount to this show. We couldn’t afford any lag anywhere in the system, and it was all going to be timecoded, so it made perfect sense to use disguise as our playback system.” According to Lewis, the gx 2 provides the necessary stability for this world tour, which will end in Mexico City in November 2019: “We wanted to get as much GPU performance as we could for Notch, as we have numerous looks that we designed, and disguise’s gx range is the best when it comes to running Notch. Not to mention the amazing I/O, which fits our every need: timecode, cameras, outputs. The ability to easily tie in other departments is also brilliant.”


Matrox Monarch EDGE and SBG Sports Software’s Focus: Selected by International and Top-Tier Football and Rugby Teams

Monarch EDGE encoder teams up with SBG Focus to deliver a combination of multi-HD encoding power and easy-to-use video playback for match-winning results.

Football is the most-watched sport in the world, with global viewership for the 2018 World Cup final having reached a record-breaking 1.12 billion people. For football fans, the sport is in their blood, and tuning in for match day is as essential as oxygen. What fans are not always aware of is exactly what goes into their favorite teams’ preparation. Few might guess that sports video technologies play a key role in training regimens. One powerful combination, Matrox Monarch EDGE and SBG Focus, provides best-in-class video quality and highly interactive performance.

Matrox Monarch EDGE and SBG Focus provide best-in-class video quality and highly interactive performance.

Monarch EDGE’s multi-HD encoding capabilities, paired with Focus’ instant replay and live-clipping software, allow analysts, coaches, physiologists, and doctors to access and engage with pristine video

streams from the sidelines of any match, and from multiple angles. Once the video has reached the team staff, any number of operations can take place. Coaches can check instant replays and tag moments


in the match, review the tactics and pinpoint the weaknesses of the other team, make adjustments within their own team, and go on to perform better throughout the rest of the game. If an incident occurs during the match that results in injury, doctors or physiologists can see which body parts have been affected and come up with treatment plans based on what they have analyzed. With the ability to use a variety of streaming protocols, Monarch EDGE can deliver four independent RTSP live feeds at 1080p50 to Focus servers while keeping data rates extremely low. Even 1080i25 SDI feeds can be upscaled to 1080p50 to provide the most temporal information possible for review by team video analysts. For team staff using Focus, it is easy to create a unique streaming channel for each event, where resolutions and bitrates can be mixed and matched. Replay operations give users full control over searching for relevant actions within the catalog of video, and tagging capabilities allow users to expertly mark these actions for later analysis.

Many leagues and competitions provide multiple SDI feeds from the broadcast provider to the coaching technical areas. These can be combined with analysts’ fixed cameras, all of which are fed to Monarch EDGE’s SDI inputs. All angles are then streamed to SBG Focus’ server, where they are recorded, displayed and made available for instant replay and clipping.

The winning lineup of Monarch EDGE and Focus has set a new, game-changing standard in terms of capability and form factor. Having been rigorously tested in a live environment, side by side with alternative products, the combination is clearly the preferred solution for those who want the best of the best. Of the encoders trialed by these teams, Monarch EDGE was the only one that could deliver multi-channel synchronized video feeds to Focus at the low latencies team staff required for analyzing tactics, reviewing injuries, and delivering productive half-time team talks.

“Our customers are always looking for ways to improve their workflows, reduce turn-around times and increase the quality of their deliverables,” said Simon Cuff, commercial director at SBG Sports Software. “Matrox Monarch EDGE addresses all three aspects. The quality of the picture, even at low bitrates, is the best we’ve seen, and the unit is so easy to set up and configure that it takes all the stress out of match-day setup.”


NEP UK implements Calrec consoles and routing technology at Wimbledon 2019

15 Calrec consoles, alongside Calrec networking and routing technology and I/O boxes, were deployed by NEP UK for its coverage of the renowned tennis tournament Wimbledon 2019. In addition, three extra consoles were used in NEP OB trucks for the Centre Court, Court One and Court Two coverage.

The host broadcaster of the tournament was Wimbledon Broadcast Services (WBS). The video network was all-IP (SMPTE ST 2110) and eleven Calrec consoles (10 Summa and one Brio) were implemented, including consoles for the Wimbledon Channel and the media centre. Furthermore, NEP

collaborated with a major US broadcaster for their coverage, including streaming, using three Calrec consoles – two Artemis and one Brio – in their three control rooms. Jimmy Parkin, Sound Engineer at NEP Broadcast Services, gives more details about the Calrec technology used at the


tournament: “We have worked extensively with Calrec over the years. The networking flexibility and easy scalability of their technology were central to the success of this Wimbledon 2019 project. We needed audio operators to be able to easily access any source from any court without having to move anything or plug anything in; any console can “talk” with

any other console on the network. We also needed to integrate IP networks with Calrec’s Hydra 2 networking technology, and using their interfaces made this very easy. Pre-event offsite planning, using Calrec’s H2O GUI for the Hydra 2 network, meant that we could accurately name and label ports before any hardware was connected.”

Parkin adds: “It worked seamlessly. The way that Calrec’s technology allows resources to appear on a network, and then how effortless it was to move those resources across the network, made our life so much simpler. Adding additional I/O boxes was really easy too. This was a very complex project, but we were working with good people and great technology.”


EVS technology will be used in a major multi-sports event of 2020 in Asia

The host broadcasters of a “global multi-sports event” that will take place in 2020 in Asia have chosen EVS technology for the centralized ingest, management and distribution of its content.

In addition, EVS will provide a comprehensive media center and live production solutions for the creation of high-quality replays, slow motion and highlights. These services will allow

the production teams located at the various venues and within a central international broadcast center (IBC) to create a massive amount of content.


Some of the EVS technologies that will be used are the new centralized video content distribution platform Media Hub and EVS’ VIA Flow, a central workflow engine that enables users to exchange content between the venues as well as monitor the flow of media throughout the EVS systems. Dr. Pierre De Muelenaere, Chairman of the Board of Directors and CEO ad interim of EVS, said: “This deal provides yet another example of a host broadcaster turning to EVS for support at a major international event that is going to be watched by billions of fans globally.” Xavier De Vynck, SVP, Big Events & Business Development at EVS, added: “Our solutions provide the tools needed to enrich the viewing experience for audiences around the world, enabling the quick creation and distribution of content that encapsulates the drama and elation experienced during such high-profile live events.”


Prague’s Univerzita Karlova, the 15th oldest university in the world, adopts IP video with NDI and NewTek

RTL, the Univerzita Karlova TV studio, has decided to renew its equipment with NDI IP technology. The educational centre of Prague, founded in the year 1348 by decree of Charles IV, King of Bohemia and future Holy Roman Emperor, is the 15th oldest university in the world. Jan Peml, head of the Radio Television Lab in the Communications Sciences and Journalism department of the Faculty of Social Sciences, discovered NewTek’s solution at IBC 2017: “We could see integrating NDI through our Wi-Fi network; using it as a universal route for AV sharing in-studio, connecting all possible screens and sources; and using tunneled UDP streams to simulate an external OB van input. We could also use NDI Monitor for full-screen output from Adobe Premiere.”

The RTL staff evaluated several systems but finally chose the NewTek VMC1: “We can capture a signal from the VMC1 in different places around the studio and in the classroom. We can ask students to edit using the same sources their colleagues are using in the editing room.”

In addition, Jan and his team have found different ways to integrate the system for use in the classroom: “Now we use the instructor’s computer as a source, with 15 workstations at the students’ desks connected using NDI, viewing the source at their desktops. The opportunity to have, in front of each student, the same screen that the teacher is working on is great for us.”

Jan’s students can also create virtual studios, include graphics or connect the NDI system with computers: “Now I can just put NDI on any computer I want, instead of a separate computer and a scan converter, bring in the Skype call and send it wherever I want.”


The Rebel Fleet chooses Quantum storage for post-production in remote film environments

Quantum Corp.’s StorNext storage solution has been chosen by The Rebel Fleet, a digital workflow service provider for television and film production crews. The Quantum Xcellis workflow storage and QXS hybrid storage solutions have allowed the company to support real-time processes involving high-resolution content in the field. Both systems

were deployed alongside multiple Apple Mac Pro computers for ingesting data, calculating checksums, archiving to LTO tape and color grading. In collaboration with The Rebel Fleet’s partner Factorial, the company was able to scale the ingest volume, doubling it from 200 to 400 TB, without interrupting the filming schedule. Michael Urban, founder

and managing director at The Rebel Fleet, talks about the experience: “With the Quantum solution, we were able to crank out 150 frames of footage per second in three different formats while also supporting real-time color grading. Overall, this online environment delivered 6 GB per second of throughput. It was very impressive, and it helped us meet tight turnaround times.”


The world of 12G 4K/UHD processing

By Varun Patel, Product Manager, LYNX Technik AG

Technology moves fast and when it comes to audio and video broadcasting, there is always room for improvements in audio and video quality and ease of production. One of the main goals of broadcasters and content providers today is to create an immersive experience for the viewers, giving them the feeling of being part of the viewed content. The 4K/UHD buzz has been in the consumer world for some time but what does that actually mean for the content producer?

What is 4K/UHD?

In 2012 the Consumer Electronics Association (CEA) introduced the term Ultra High Definition, which in part defined a resolution of at least 3840 x 2160 pixels. With four times as many pixels as HD, 4K takes visual detail to a completely different level. Consumers are, however, always wary of new television technologies (thanks to 3D) from companies that try to entice them to part with their hard-earned

money. But when it comes to 4K TVs, the initial price has dropped significantly making the technology affordable to many.

Then why the slow adoption of 4K?

4K has become an established format in the production sector, as major over-the-top (OTT) players like Netflix and Amazon demand content in UHD. Yet even though there is huge demand for 4K content, adoption of 4K along the broadcasting value chain has been slow. There are significant challenges for content producers in adopting 4K in the media production workflow. Major technical factors include: - The sheer volume of digital data. - Post-production challenges such as clean-up work, computing power and hardware speed. - Broadcast bandwidth requirements:


Delivering high-quality 4K video content from point A to B over multiple existing cable types, single new higher-bandwidth cable types, or fibre. - Interfacing between various production standards (SQD, 2SI, 12G, IP SMPTE 2022 and 2110). As the broadcast and professional AV industries face the task of integrating new standards and multiple formats, LYNX Technik AG continues to offer advanced, innovative and reliable 4K solutions to overcome the challenges of ever-increasing complexity in acquisition, production, post-production and signal distribution environments. Building on its worldwide reputation for no-compromise German design, manufacture and support, LYNX Technik AG presents a range of yellobriks® and the multi-award-winning greenMachine® titan (the customer-configurable ‘Super Brik’ modular processing platform) for 4K/UHD broadcast and professional AV signal

conversion, fibre transport and advanced processing. LYNX Technik 4K products are designed and manufactured to solve the problems that are currently faced by content producers and broadcasters as they migrate to 4K/UHD. They reduce the complexity, cost of ownership and production resources, while providing state of the art video and audio quality.

greenMachine® - multi-purpose processor

The flagship greenMachine® titan hardware has been designed for 3G HD and 4K/UHD requirements. The hardware is a powerful processing platform for video, audio and metadata that can perform many different functions using one of the available greenMachine® titan constellations (preconfigured application sets). 3G HD and 4K/UHD (12G-SDI) constellations for greenMachine® include: static and dynamic HDR <> SDR conversion, audio


titan Functional Diagram

embedding/de-embedding (including MADI and Dolby E®), video and colour adjustment, frame synchronization, up/down/cross conversion, ROI scaling, audio/video test signal generation and a bi-directional quad transport system. The functionality of the greenMachine® titan can be switched simply by deploying a different constellation and assigning a license via the free greenGUI control software. The perpetual licenses can also be moved between machines, making them ideal for reconfigurable OB, fly-away or temporary system applications. greenMachine® titan is a future-proof, reconfigurable ‘toolbox’ hardware base, ideal for studios, OBs, fly-away and remote production, professional AV and other media facilities that require 12G 4K/UHD AV signal processing, or that are currently working in 1.5G-SD or 3G-HD and are looking for a cost-effective, software-license-only upgrade to 12G in the future.

yellobrik® - application specific modules

Alongside the trusted 3G modules, LYNX Technik AG continues to expand its yellobrik® range of application-specific modules into the 4K arena. The compact size and the innovative rackmount and redundant power supply options have found favour with users for whom quality and reliability over many years and in diverse environments are prerequisites. New 12G 4K/UHD models include:

4K Format Conversion: The CQS 1441 yellobrik® provides a bridge between 4K UHD quad-link (2SI) devices and 12G SDI single-link devices. Conversion modes include: 12G SDI single link to 4x 3G quad link (2SI); 4x 3G quad link (2SI) to 12G SDI single link; 6G SDI single link to 4x 1.5G SDI; and 4x 1.5G SDI to 6G SDI single link monitoring. A 12G fibre in/out SFP option is also available.


12G 4K/UHD Distribution Amplifiers: The DVD 1417 and DVD 1423 yellobriks® are re-clocking coax distribution amplifiers for up to 12G 4K/UHD, allowing users to input any signal, whether SD/HD/3G or 12G SDI, and feed multiple re-clocked outputs to devices across a facility.

12G 4K/UHD Fibre Transports: The OTX 1410 and ORX 1400 single-channel 12G coax transmitter and receiver models are supplied with SFPs fitted, or users can select from a range of CWDM types for multiplexing or up to 80 km range. For the transport of either 12G in 4x 3G coax format (2SI or SQD) or four discrete 3G signals, the OTR 1441 and 1442 offer a single-box (at each end) solution. Modules may be ‘daisy-chained’ to transport either 8x 3G or 2x 4K/UHD over a single fibre, or to add Ethernet or RS232/RS422/RS485 signals to the multi-3G or 4K/UHD fibre link.

12G 4K/UHD HDMI Monitoring: The PMV 1841 Quad-Split yellobrik® is a compact, portable processor with an integrated down converter, for applications that require low-cost, reduced-resolution confidence monitoring or floor monitoring of 4K/UHD 12G or SQD quad-link streams on an HDMI display at 1080i resolution.

All yellobriks® and greenMachines® come with the intuitive, free-of-charge yelloGUI or greenGUI software, which provides access to all advanced settings and adjustments within the units.


Step-by-step organization of a Champions League final by Mediapro





By Óscar Lago, Production Manager, MEDIAPRO

In September 2017, UEFA appointed the Metropolitano de Madrid stadium as the venue for the final of the UEFA Champions League, to be held on 1 June 2019. With a little less than two years to go until kick-off, the machinery started rolling. Although UEFA had not yet decided who would be awarded production of this event, we at MEDIAPRO set to work immediately. For many of us it would not be our first UEFA Champions League final: technical managers, producers, video and sound technicians and operators had already been in charge of the final of Europe’s most important club competition, at the Santiago Bernabéu Stadium in 2009 and in Lisbon in 2014. However, the fact that the one at the Metropolitano was our second or


third final never meant the project would be designed any more calmly; on the contrary, the experience of having gone through an event like this increased even further the feeling of giddiness and the sense of responsibility. It is only once the event is over that one becomes fully aware of the importance of the biggest one-day sporting event in the world. Only ten years have passed since the first final


in which we were involved (back in 2009), but it feels like ages in the TV broadcasting world. A new language, new technologies, new tools. Hardly anything of what we did in 2009 is valid today, as developments are frantic. But there is one thing in common: both occasions were our chance to show the world that we, in this corner of the world, are fully capable, and have the talent, to face the challenge entrusted to us by UEFA. In January 2018 the timer started ticking and we held our first meeting with the UEFA team in charge of the final. There is something that has not changed over this decade: UEFA always maintains absolute respect for, and keeps full trust in, the people in charge of the broadcast. We could say that their motto is ‘maximum cooperation, minimum intervention’. All is based on a relationship of mutual trust. We know each other and we have been working together on a weekly basis, broadcasting the UEFA Champions League and UEFA Europa League



seasons for more than 10 years. We know the procedural demands required by UEFA, and they know our work methods and what we can offer them. The preparations for each UEFA Champions League final require quite an exhaustive job by UEFA. At this stage there are still 18 months to go, and we already have the schedule for all the visits and meetings to be held at


the Metropolitano stadium: five in total between January 2018 and March 2019. The meetings are genuine proof of the work and the supervision of every detail required by UEFA. On one side there is us, the MEDIAPRO team that will be in charge of TV production alongside the UEFA team, but there are many parallel meetings taking place in adjoining rooms covering a wide array of aspects surrounding the final:

security, marketing, power supply, press, etc. Faces show a mix of enthusiasm and responsibility. In retrospect, the sporting event itself only lasts a couple of hours, or five if we count the run-up, so if anything goes wrong there is no time to react. Everything must be foreseen, including incidents and the contingency plans for sorting them out.

TV approaches for the UEFA Champions League finals held in the past four years have all been quite uniform. UEFA sets minimum requirements to be complied with, but from there on there is room to propose positions and types of camera we feel comfortable with. The final held at the Metropolitano was, technically speaking, the second time that 4K and HD production was carried out with the same cameras. The camera deployment is well known to all: 8 SuperSlow units, 6 Ultra-Slow units, 2 travelling cameras, 2 pole cams, an aerial camera, the helicopter camera, 2 beauty cameras on cranes outside, 7 cameras for covering team arrivals (including 2 Steadicams on quads following the buses from outside the stadium to the inner tunnel), plus 5 cameras for covering the moment the players come out onto the pitch from the tunnels. However, this year we wanted to take a risk and introduce some changes in order to improve this


approach. The most personal of these was replacing the central field-level camera with two cameras located 20 metres from each corner flag. This has become usual in Spanish football productions in recent years, but is hardly ever seen in the rest of Europe. At MEDIAPRO we believe it provides a clear improvement on the images of the players involved when an action ends. On the one hand, we always have two options, as both cameras keep a close-up view of the player who carried out the action once it is over; this allows us to choose the best angle based on where the player is looking and on which players stand between the camera and the relevant player. On the other hand, as these cameras are away from the technical area, we avoid the constant obstructions caused by the movements of the two coaches and the assistant referee around the central camera. This is why most producers from other countries discard this close-up view from a low-positioned camera, with the spectators out of focus in the background (a shot with high aesthetic and narrative value), and resort instead to a high camera whose background is the pitch itself. This latter shot has much lower aesthetic and narrative value: it seems isolated from the context and, being an oblique angle, makes it much harder to see the player’s expression.


Sport production in 2018

For the TV production of the grand final we had 42 cameras. This may seem a lot, but the most important thing is not quantity but knowing how to use them. It may seem trivial, but what does this actually mean? The key in production is knowing how to assign each camera a specific role based on each action in the game, and then to hold all this in your mind so that, in a matter of tenths of a second, you know the relevant action will be seen much better on one camera or another, without needing to look at or review the image. Everything goes very fast and there is no time for hesitation. Our production team for this final had a large advantage on its side: for three years now, the TV production model for LaLiga, the Spanish league championship, has been set at a very high standard: 23 cameras in two games every week, and about 30 cameras in the Clásicos, the matches between Real Madrid and FC Barcelona. This left us in a very comfortable position when facing a match involving 42 cameras... which actually turned out to be 43! We held a lot of meetings with a very strict level of foresight, but with just 10 days to go before the match and assembly already in progress, we decided to include a camera providing a view of each coach with the fans in the background, so that if a goal

32 AUGUST ‘19

were scored, viewers could see -in the same open view- the celebrations of both the relevant fans and coach. A change that was welcomed by UEFA as they –like us- felt this would substantially improve match production. A special note deserves production and execution of the opening ceremony. In recent years this ceremony has become a kind of mini-concert lasting about 7 minutes

and this entails a series of technical requirements and a very high standard of production that Daniel Lozano and his team succeeded in achieving in brilliant fashion. This has been a very complex and demanding job, but a highly rewarding one for the whole MEDIAPRO team. As for the quality of the football played by both teams, we would rather not say anything. A pity, although we are sure the Liverpool fans did not leave Madrid greatly concerned about that...







Interview with Rob Levi, TV Director at BT Sport

The word on the street is that HDR is the technological future of sports broadcasting. Manufacturers and viewers agree with this statement, as do broadcasters such as BT Sport. The leading sports TV channel in the United Kingdom decided to offer a Champions League final in HDR for the very first time. We wanted to know more, so a few hours before the event we met Rob Levi, BT Sport’s TV director for the final, in the lobby of a renowned hotel in Madrid.



You are about to direct the Champions League final. Is it your first Champions League final as a director?

I have worked on a few Champions League finals. I think my first one was in 1999, just as a junior assistant producer. In terms of directing, I directed the presentation for BT Sport in Cardiff two years ago, which was a major operation because we were the host broadcaster. But, in terms of the actual match, this is the first one. It will be in HDR; we are saying it is a world first. My bosses keep telling me, “You’re making history!” Yes, it is quite exciting. People ask me how HDR affects me as a director. To me, it is the same thing. The change is the experience it offers the viewer. They can see, and we can see in the truck as well, the amazing picture quality. The definition, the colors, everything is amazing. In terms of directing a game of football, it is business as usual.

Could you summarize your experience in this type of event for our readers?

In previous Champions League finals I have worked alongside UEFA TV. You know there are specific requirements and protocols you have to follow for the Champions

League and any UEFA event. I’ve worked on various World Cups, European Championships and a great deal of English Premier League coverage; many different tournaments with different TV organizations. I understand what their needs are, what is


required. We have to follow a specific protocol in the game. We have to mirror the image and be coordinated in terms of all the graphics and timings. In addition, there are specific running orders that we follow. So, I am very familiar with all that because I have done all

the Champions League games in Manchester, right up to the quarter-finals.

What peculiarities does a final have? What are the differences compared to a first-phase match?

It is just the grandiosity of the event. It’s a very

special Champions League for us. Having two English teams in Madrid has a different feel. There is an opening ceremony with Imagine Dragons: we were watching those guys rehearse yesterday and visualizing what we could do with the cameras. There are extra things. There is a helicopter, so you are covering the arrival of the teams… It is a real grand cup final, so it is all about the whole event, the whole ceremony. When you get to the match, you go into game mode; it is just a game of football. You have to tell the story of the game, but also everything around it: all the color and the atmosphere of the event. We are just trying to make the viewer experience what the fan in the stadium is experiencing.

BT Sport is one of the European references when it comes to sports broadcasting. We assume no expense was spared on the production of this event. Do you have your own on-site cameras,


apart from the international feed?

Yes, we do. Basically, the host broadcaster provides 40 cameras, but we have brought in 20 additional cameras for the HDR, because all the cameras have to be specifically HDR-ready. We have a full crew, OB trucks, our own graphics and our own sound department. Basically, we have our own standalone outside broadcast covering that game for the HDR.

They keep saying that remote production is the future, but here you are in Spain to direct the final, and you even brought your own OB truck. Is travelling mandatory for a large event like this, or do you think finals like these will be remotely produced in the future?

I have experienced a little bit of these remote productions and I think that, for certain events and productions, things will definitely be moving in that direction. It is quite exciting and, at BT Sport, you know, I am sure they have been looking at it. It is a very exciting avenue to explore. I have read a lot about it and it looks great. Sometimes you have to be at the stadium, depending on how big the event is, but I would say that it will definitely be like that in the future.

Do you think being in the stadium really makes a difference to you as a director?

It depends. Sometimes you can go to some stadiums 10, 15 or 20 times a season, so you have got the picture in your head already, obviously. But if you are in a stadium for an event like this, it absolutely does make a huge difference. Example: Imagine Dragons. They did three rehearsals. For the first one, I sat in the stadium. I didn’t want to watch it through a lens; I wanted to watch it in my mind. I had a peripheral vision of what was going on, where the performers were coming from. Then I went to the truck and rehearsed it there. Yes, there are benefits




to certain events. It is really helpful to be on-site. But, at the same time, remote production gets you to the event because it makes some productions viable. You can go to an event you would probably never get to because of the cost or budget. That gives you the opportunity to get there: you can get the cameras in there and your presentation team in there, without the cost of all the rest of the broadcast. If anything, remote production is an opportunity to get to more events. That is how I see it.

The final will be produced in 4K HDR. Does this affect your work as director?

It’s the same. For me, it’s the same monitor. For all the technical people it is more work, but for me, as a director, it is business as usual. There is no delay; there are no challenges for me. The challenges fall to all the brilliant technical people who somehow make it work. The results are amazing.

Can you see that HDR final image in your OB trucks?

Yes, the truck is HDR-ready. We have that.

Football broadcasts are getting more and more complex: VAR technology, Polecam, spider-cam, slow motion, helicopter… Will these cameras be used for the broadcast of the match?

The host broadcaster is implementing all of that. For us, literally because of space, we can’t have all those cameras,


but what we can do is access some of those cameras from the host broadcast and use them in our coverage if we absolutely have to. We can use some of the angles for replays in case we need to clear something up editorially, but the plan is to keep everything on the HDR cameras and not to use any of the others. But we’ve got access to those: spider cams, rail cams… They bring everything for the Champions League final.

Have you worked on other sports broadcasts besides football?

Yes. At BT Sport I have done a bit of hockey and, in the past, I worked on tennis at the London Olympics. I have done rugby, darts, snooker and even shooting at the Commonwealth Games, which is a challenge.

As a professional, we assume you keep a close eye on the trends of the industry in, you name it, the NBA, the NFL, etc. Which solution that has not arrived yet would you like to see implemented in football broadcasting?

I always try to watch the Super Bowl every year because I wait to see what they are going to show, what the latest innovation is. I guess… I don’t know! I think an interesting one is having your main camera angle on a track. It is like the one you have on a PlayStation or something. It moves up and down. I think Spanish broadcasters tried that in a Clásico, maybe a few years ago. Otherwise, they are always trying to find the newest ones. Polecams have been a great addition to the coverage. A tracking camera is something that I also find very interesting. But that depends a lot on the type of stadium you are in.

Let’s look at the future. Should sports broadcasters keep abreast of innovations?

I think we always have to be exploring the next level. That is kind of where we are at the moment with the HDR for the final. I am right behind that: moving on to the next level and whatever the next innovation is going to be. We have to go with it. The future is the future. You look back five years and things have moved on so much in that short space of time!


French TV channel La Chaîne l’Equipe uses Vizrt AR graphics in the 2019 Women’s Football World Cup



Broadcasters around the world continue to find new and interesting ways to leverage analytical graphics to help tell better stories. During the recent Women’s Football World Cup, French sports TV channel La Chaîne l’Equipe used highly innovative augmented reality (AR) graphics of the football pitch to help explain important plays during the various matches.

During a roundtable discussion on the channel’s highly rated “L’Equipe Du Soir” show regarding the match between France and the United States, as the group discussed key plays, a 3D football pitch appeared on screen between the commentators (one commentator snapped his fingers and a 3D representation seemingly

fell from the sky). They were then able to look directly at live-action replays from above, and stop and start play to make a point during their discussion. On-set visual tools like this help viewers get a better understanding of the discussion and the subject matter at hand. “We wanted to find an innovative visual support tool to enrich our debates,” said Jérôme


Aubin, Head of Production at La Chaîne l’Equipe. Olivier Ferrand, Head of Broadcasting, added: “The editorial goal was to give the various speakers, journalists and consultants the ability to control the action, analyze the key moments and choose the best line of sight without being dependent on the footage from the broadcaster.” Each highlight was available approximately 30 minutes after the live event took place. The effect was created with Vizrt’s AR graphics systems, Viz Virtual Studio with Netventure NQuad, using a design from local system integrator Post Logic, 3D footage from 3Dreplay and new software plugins from Erizos called ezMesh. The Viz Virtual Studio system was used to process the highly accurate camera tracking and software-generated information, while the Vizrt platform, Viz Engine,


rendered the entire scene in real time. 3Dreplay supplied the replays, recreated in 3D, used for the AR effect. 3Dreplay has developed a new 3D export process for its own proprietary tools that tightly integrates with the Viz Engine. Post Logic then merged the live action and computer-generated graphics in order to create an augmented reality 3D replay. The state-of-the-art AR

tabletop design offered the commentators the ability to play and replay an event, and choose the playback speed, while the director simultaneously changed the viewing angle. The commentators were also able to add visual information to the replay displayed, such as visually identifying on screen the offside line, player highlights, trajectories, ball speed and player stats.


Emmanuelle Jaïs, Sales Manager at Post Logic, said that the Erizos team modified its 3D import plugin in order to manage and animate the 23 characters on the virtual pitch at HD resolution, while letting the Viz Engine power the display of other graphical elements, such as the environment, data, projectors and drop shadows, with additional realism added through ambient occlusion using the ezPostFx plugin by Erizos. “It was the close collaboration between Post Logic and 3Dreplay that gave birth to this project,” she said. “In parallel, we pushed the limits of the Viz Engine to get an optimal rendering. In terms of quality, we wanted to offer l'Équipe high-end rendering quality with smooth and realistic players, shadows, ambient occlusion and even depth of field. I think we accomplished that very effectively.”

“This project by Post Logic shows what is possible using Vizrt, the plugins provided by Erizos, and some ‘out-of-the-box’ thinking,” said Ronny van den Bergh, CEO, Erizos. “The results are absolutely amazing. This is a great example of how we enjoy working with customers and providing them with tools to push the boundaries.”

It is new Vizrt graphics technologies like these that are helping broadcasters such as La Chaîne l’Equipe increase viewer ratings and keep viewers coming back for more. The overall audience for the Women’s World Cup coverage on the channel was “very good,” according to everyone involved, and enhancing it with AR graphics helped distinguish its coverage from the competition.





It is hard to establish how a sector as changeable as news program production will develop, influenced as it is by the evolution of content management technologies, its inevitable ties to telecommunications companies for signal broadcasting and, of course, the ever-shifting communication habits shaped by consumers themselves. However, the TM Broadcast Informative Breakfast, hosted once more by the renowned Santo Mauro Hotel, offered quite a faithful picture of the lines that will ultimately shape the trends of the coming years.

Text: Sergio Julián. Photos: Pedro Cobo



Víctor Sánchez, Technical Area Director, Televisión Española

This was achieved, of course, thanks to the knowledge shared by the experts who attended this breakfast, which spanned over two and a half hours. Representing the broadcasters, we were honored by the presence of Amadeu Gassó, Operations and Engineering Director, Corporació Catalana Mitjans Audiovisuals (TV3); Alberto Alejo, Head of the Software Engineering Service, Corporació


Jesús Sánchez Villalba, Technical Manager, Castilla La Mancha Media

Catalana Mitjans Audiovisuals (TV3); Víctor Sánchez, Technical Manager, Televisión Española; and Jesús Sánchez Villalba, Technical Manager, Castilla La Mancha Media (Castilla La Mancha TV). Attending on behalf of the companies that made this event possible were David Martínez, Operations Director, Datos Media; Rafael Zapardiel, Sales, Datos Media; José Antonio Bolós, Key

Account Manager, Professional Solutions Europe, a division of Sony Europe B.V.; Miguel Ángel Sánchez, Senior Solutions Architect, Professional Solutions Europe, a division of Sony Europe B.V.; Juan Dorrego, Technical Manager, Ges-It; and David de No Coma, Commercial Director, Ges-It. Luis Sanz moderated the discussion, in which Javier de Martín, CEO, Daró Media Group, also took part.


INTEGRATED NEWSROOM

After a few minutes of relaxed conversation and the opening welcome by Javier de Martín, consultant Luis Sanz took on his role as moderator by presenting the first topic for discussion: integrated systems, and how the same tool can be adapted to various types of release. Amadeu Gassó, Operations and

Engineering Director, Corporació Catalana Mitjans Audiovisuals (TV3), opened the discussion by confirming this notion is a “driving idea” in his organization: “We have three semi-independent newsrooms: the one in TV3, the one in Catalunya Ràdio, and Digital Media”. At present, they have three distinct media asset management (MAM) systems: “Digition (developed in-house) for TV, Dalet in radio, and

Amadeu Gassó, Operations and Engineering Director, Corporació Catalana Mitjans Audiovisuals (TV3)

Deliverty in digital media”. Contrary to how it may seem, as Alberto Alejo, Head of the Software Engineering Service of the Corporació Catalana, remarked, the connection between the three is permanent: “We have worked on making it easier to share content between the various MAMs, as well as on improving workflows”. However, progress lies in complete unification: after

Alberto Alejo, Head of the Software Engineering Service, Corporació Catalana Mitjans Audiovisuals (TV3)



Luis Sanz

having made the decision to implement the Galaxy version of Dalet in radio broadcasting, they are considering whether it could eventually work in the TV newsroom as well. The vision expressed by Víctor Sánchez implies maintaining the particular language used in each area, which requires “specialist tools” accordingly. The Technical Area Director at Televisión Española noted


Javier de Martín, CEO of Daró Media Group

that nowadays there is no single alternative covering all their needs, something the current market trend does not favor: “We see it heading toward specialization and focusing on doing one specific thing well”. However, he admitted that such a multi-disciplinary approach is being pursued with MAM systems. On the other hand, Jesús Sánchez Villalba, Technical Manager,

Castilla La Mancha Media, said such unification of newsrooms is a reality in his environment: “There are three of them: news programs, radio and a digital one. All of them are very tightly knit at management level, which makes everything flow quite smoothly”. Technology has contributed to enriching these conversations: “We were pioneers on the integrated digital newsroom side, so there is a


broad culture of integration. Furthermore, with the updating of our Avid system, all content reaching our station is shared across the three newsrooms”. The vision of broadcasters on this topic is crystal clear. But what about manufacturers? First to speak was Miguel Ángel Sánchez, Senior Solutions Architect, Professional Solutions Europe, a division of Sony Europe B.V. Beyond advocating

the relevance of integrated solutions and underlining that Sony’s vision lies in increasing the “flexibility” of being able to integrate various systems, he admitted that at present there is no perfect solution catering in full to the needs of TV stations. On the other hand, he endorsed the notion of “story-centric” working, based on the principle that each story is a container to which journalists gradually

David Martínez, Operations Director, Datos Media

contribute content from various sources as the story unfolds, thus allowing the creation of various types of content, from different angles, adapted to the various destinations for release. Later in the conversation, he returned to this issue to stress that “it is all the same for journalists where content is stored; what they really want is access to it”, further underlining this line of flexibility.

Rafael Zapardiel, Sales, Datos Media



David de No Coma, Commercial Director, Ges-It

Representing Avid, David Martínez, Operations Director, Datos Media, remarked that this is the direction his company has been moving in for years now: “What matters is the news, regardless of the medium where it is to be released; it is about content, looking for different angles”. In the end, present-day newsrooms require “information to be available” for all areas. On this issue, Amadeu wanted to


Juan Dorrego, Technical Manager, Ges-It

advocate journalist specialization, in contrast to this notion: “In the offices, the idea that a journalist currently working in radio could switch to TV within a week is idealized”. David Martínez went a step further: “Integration of radio and TV is technically awkward, but it is even harder from a human point of view”. In this area, he wanted to remark that maybe the road to simplicity should include

integration of third-party applications for managing specific issues, as it is the case with Avid’s Cloud UX.

UNIFIED SYSTEM, CUSTOMIZED TOOLS

“What we are definitely witnessing is that small broadcasters prefer a unified tool, while the biggest ones, having separate departments, require more specialized tools.” This thought from Miguel Ángel opened a


José Antonio Bolós, Key Account Manager of Professional Solutions Europe, a division of Sony Europe B.V.

new topic for discussion. Víctor ratified his point: “Each medium has its own peculiarities, and what they demand is a specific tool for each of them”. Jesús shared this opinion: “In my company people move a lot between departments because, with the exception of radio, which has a small specific tool, all the rest is the same. What changes is the way of telling the story”.

Miguel Ángel Sánchez, Senior Solutions Architect, Professional Solutions Europe, a division of Sony Europe B.V.

David Martínez broadened the scope of the issue further by asking two questions: “Does content have to be moved around social media in order to ‘heat up’ a specific news topic? And who coordinates that versus what will be placed in a news program?” After some puffing and sighing, by those who could not come up with answers, Amadeu broke the tension by saying “There are some battles…”,

followed by a general roar of laughter. Alberto replied on behalf of the Catalan broadcaster: “Nowadays, TV is the real big player. Although it is true that a couple of years ago this debate did not exist, it is now a hot topic. Workflows are constantly changing, but the problem there is not the tool, as it will be adapted as needed”. In this regard, the Head of the Software Engineering Service deems it necessary to distinguish between two



layers in regard to content: “Although it must be available in any newsroom, in each of them it receives a different treatment, adapted to the medium being targeted. For example, videos on social media include text, as in many instances they are played with the sound off”. Rafael Zapardiel, in charge of Sales at Datos Media, closed this section by highlighting a simple example of the peculiarities of each medium to which information is channeled: “Graphic content is of no use in radio, as it would not make sense, while on TV and, above all, in social media, it is key to highlighting identity”.

NEW POSSIBILITIES OF MAM SYSTEMS: FLOW CREATION, AUTOMATION, AI…

The development of MAM is unstoppable, as Luis Sanz remarked when introducing the next topic of the morning, with issues such


as "automatic indexing" or the various "artificial intelligence" applications. These implement new solutions catering to the needs arising in news programs, but… to what extent do broadcasters actually benefit from them? Jesús recognized that in

this regard they still have some way to go, although they find it essential: “We have a MAM that is not even such a thing: it is an archive that works the way it does. There is a project for improvement that we would like to start as soon as possible, which I think is absolutely feasible and I hope we can have it finalized within a year. We are still at the project-definition stage”.

Víctor, on the other hand, makes the most of the possibilities on offer: “We are working quite a bit in this area. At documentary archive level we are going through a conservative renovation project, but one which includes improved equipment, flows...” His gaze is set on artificial intelligence applications: “In our innovation areas we are working on speech-to-text for subtitling; work is also being undertaken in sports recognition and changes of takes”. Víctor sees two stages in which it can be implemented: “At the start of news generation, with the extraction of metadata enabling the journalist to put the news together (keywords, who is shown, changes of takes), and at the final storage of the finished piece of information”.

TV3 is also committed to making the most of its full potential: “For example, in the Catalan Process trials, all audio was produced in Spanish, but as soon as it entered the system it was translated into Catalan in order to enable access in that language. On the other hand, we are currently focusing everything regarding automated indexing and catalog creation in the documentation department”. In this area, as pointed out some months ago in the TM Broadcast Breakfast on documentation, professionals are becoming “coaches and validators” as machines show their capabilities in the



field of "detection of people by means of face or voice recognition” through novel automations. Amadeu ratified his colleague’s words: “Documentation tells us: we humans do the high-level cataloging, and automation enriches it”. Rafael Zapardiel highlighted how advanced the AI field is in regard to sports, and how content is enriched almost in real time, as achieved by IBM’s Watson. This was ratified by José Antonio Bolós, Key Account


Manager, Professional Solutions Europe, a division of Sony Europe B.V. At the same time, José Antonio used his intervention to underline that although the future lies in automation, the human factor ensures and certifies it. At this point, Víctor wanted to highlight that the technology still has some way to go. His example again concerns speech-to-text: “If just one person speaks on the same content and the machine is trained, it can perform reasonably well. But if, in a five-person discussion, each of

them speaks and the topic is a wide one, you manufacturers have a complicated task there”. Juan, from Ges-It, took over, replying to the RTVE senior executive to argue that AI can also be applied to production: “Machines are capable of deciding what take you will get or who will be monitored. It is a virtual director. We come from the IP world and this format is already a reality. I agree with you that machines should be given some time to learn, but once they have achieved this, they work smoothly.


The logical course of development for manufacturers is providing increasingly improved functionalities, including assisted production”. Along this line of reasoning, Juan remarked that, with the aim of speeding up processes, metadata insertion from the source could be enabled. However, Miguel Ángel was cautious on this: “We agree about using artificial intelligence as an automatism facilitating flows, but I do not know what the approach would be on how to enrich information on content before sending it for filing”. As they themselves recognize, “nobody has Google’s translation capabilities”, so the alternative lies in getting those products integrated with their systems and services. This was recently done by Sony with a Swiss-based company which, after developing an algorithm and implementing machine learning, created an app for translating from Swiss German into German. Víctor agreed: “At

broadcaster level, what we want is for production tools to be easily integrated with those from third parties”. As David noted, Avid takes such an “integration” route through the creation of a “middleware” layer, as companies such as Amazon, Microsoft or Google “will not undertake such a process”.

RESOURCE MONITORING AND SEARCH

Beyond the possibilities of verifying the truthfulness of content, a crucial matter is monitoring the number of times it is used, so that the discourse of a TV station is not impacted. Is it limited to online storage? Is there full control in place over the reuse of pieces? Víctor was the first to speak: “The whole archive is unified in the documentary fund, although each area has a local archive that would enable it to operate in the event of a general disaster. (...) There are thousands of files

traveling through the areas at documentary fund level, but I do not know about the levels of repetition”. Jesús shared the view of the RTVE senior executive and contributed a real-world case: “In an afternoon news program comprising about 60 pieces, between 90 and 100 videos come out of the archive. There is more archived content than live program”. On the other hand, Amadeu agreed that for them it is an “absolutely essential tool”: in terms of online storage, according to his colleague Alberto, they have 1.5 petabytes. David, from Datos Media, pointed out that although MAMs report the number of times a file is downloaded, “downloading content does not mean it will be used”. However, solutions such as artificial intelligence could in the future be capable of “analyzing the images broadcast” in order to “provide the archive with feedback”. At that point, Miguel Ángel threw a question to



all attendees: “What tool does a journalist use to do an archive search?” Alberto noted that Digition presents several modules (broadcast, current topics, programs and documentation), access to which is via a “Google-like” search engine, although then “you have to filter by terms”. They are working on improving the search tool. In the end, the documentation department “is still being used” for more specific searches.

THE CLOUD AS REPOSITORY, MANAGEMENT OF APPLICATIONS AND BUSINESS MODEL

Funnily enough, “the cloud is actually in a basement”, as Luis Sanz commented to attendees. Jokes aside, the truth is that the notion has meant a revolution, not only in regard to storage or production tools, but also in the business model, as Miguel Ángel remarked as he took over: “The major news agencies are shifting from investing in their own infrastructures


to a cloud-based service model. That is the trend we are seeing”. For the Sony Europe representative, this provides flexibility and mobility, above all to corporations deciding to move their central headquarters. He also presented XDCAM Air: “What this does is that all camcorders featuring wireless backpacks contribute live content to the cloud, and this content is directly shared and made accessible to the various journalists at any time”. Additionally, “it allows both live streaming and file transfer” and, according to Miguel Ángel, “it is not costly if transport and the number of people are considered”. José Antonio added that the system allows “remotely controlling the camera”, which enables a single journalist to travel and cover news with the equipment; and “sending the planning metadata directly to the camcorder, so this content, when coming back to you, is already being managed within the system”.

For his part, David noted that in regard to the cloud, Avid has launched Nexis Cloud, a solution that adapts to nearline models and enables sharing several workspaces with different features. It also allows “migrating all tools to the cloud, even under a hybrid model, with the possibility of using an on-premises volume to migrate to the cloud little by little”. Furthermore, “many Avid third-party tools such as NetTV or Wildmoka are already based on cloud systems”. In this regard, Juan added that backpacks allow “working in the cloud”, with data uploading to a “web browser” accessible to users, although at present steps are being taken toward “the possibility of working with said content in live production as the signal is received”, something that, to Miguel Ángel, is similar to XDCAM Air. According to Víctor Sánchez, each and every comment made so far


about the cloud "makes sense": "All of you manufacturers tend to have the streaming signal available from the cloud in order to provide even faster release functionalities. And then, at production-systems level, I agree that large, highly global corporations are moving into a cloud-based system". The problem lies in the large amount of data being moved, as that is precisely one of the pricing schemes of cloud providers: "Then you have to run some numbers on how much doing it in the cloud or locally may cost". Faced with this, and even though he admitted that price is strongly determined by the media manufacturer, the IP communications provider and the storage provider, the Director of the Technical Area of Televisión Española favored a mixed approach: "having a certain amount of online storage locally or semi-locally and using, as David said, some applications that generate very specific content". On this issue, Miguel Ángel pointed out that, even though he agrees with his colleague, "one must take into account how much having this infrastructure in your headquarters will cost": items such as staffing, infrastructure, space, air conditioning, power supply…
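The "run some numbers" exercise the panel describes can be sketched in a few lines of Python. Every figure below is an illustrative assumption for the comparison, not a price quoted at the table:

```python
# Hedged sketch of the cloud-vs-on-premises comparison discussed at the table.
# All figures are illustrative assumptions, not real vendor prices.

def on_prem_annual_cost(capex, amortization_years, staff, power_cooling, space):
    """Amortized hardware plus the recurring items raised at the table."""
    return capex / amortization_years + staff + power_cooling + space

def cloud_annual_cost(tb_stored, tb_egress_monthly, eur_per_tb_month, eur_per_tb_egress):
    """Storage plus data transfer - the volume-based pricing the panel flagged."""
    storage = tb_stored * eur_per_tb_month * 12
    egress = tb_egress_monthly * eur_per_tb_egress * 12
    return storage + egress

local = on_prem_annual_cost(capex=300_000, amortization_years=5,
                            staff=80_000, power_cooling=25_000, space=15_000)
cloud = cloud_annual_cost(tb_stored=500, tb_egress_monthly=40,
                          eur_per_tb_month=20, eur_per_tb_egress=80)

print(f"on-prem: {local:,.0f} EUR/year, cloud: {cloud:,.0f} EUR/year")
```

With these made-up inputs the two models land in the same order of magnitude, which is exactly why the panelists insist on doing the arithmetic for each organization rather than assuming either option is cheaper.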

CURRENT USE OF THE CLOUD

Alberto Alejo confirmed that the cloud is, of course, an important solution for TV3, but at present he is not focusing on it "as a repository", as they have "a history, staff, machines, amortization of robots…". His efforts are instead directed to contracting "cloud-managed services" such as "live transcoding and VOD for digital media (web, mobile phones, tablets and smart TVs)", in view of the "elasticity provided by the cloud, the assurance that you have machines available and you will not run out of them". Víctor confirmed that RTVE focuses its use on "added services" such as "uploading content with connected cameras" or "servers for specific content, such as subtitling". He is developing a project aimed at achieving a hybrid technology that will be geared largely to "services focusing on production and journalists". "It will be the first approach to be made at RTVE", and it will be performed taking special care with "data production" issues. This topic was closed by


Jesús Sánchez, who noted that Castilla La Mancha Media is using the cloud for full management of its OTT service. Additionally, he uses the Dropbox platform to speed up processes with 20 collaborating producers scattered throughout the region: "It is quick, responsive and offered at a very low cost. A very simple way of managing and monitoring the work being carried out by these producers. (…) They finish up their work, drag it down and it is automatically managed". Miguel Ángel raised two concerns, regarding metadata management and data-protection guarantees. In regard to the former, Jesús noted that, as pieces of work are finished, they are supplemented by scripts that are sent to documentation by email. In regard to the latter, he added, "the same as with Azure: there is a confidentiality agreement in place when you purchase the service and we assume that everything coming in is protected, that access is exclusive…"

ELECTIONS: BIG DATA AND GRAPHICAL SPECTACULARITY

In recent times, the Spanish political and social agenda has been deeply influenced by election calls, events that news programs have covered with staging featuring striking visual content and a high presence of cross-section data. Alberto explained that TV3 has been researching big data since 2015, which enabled them -four years later- to make a powerful deployment in this regard: "We had been gathering momentum. News programs were intending to do more stuff, as usual. We offered them the ability to load social and demographic data together with working data, so as to bring them together with election outcomes and enrich the electoral information. We already had processes and workflows in place for collecting elections-related information, validating and publishing results in the various media, including the automatic generation of headlines for broadcast on TV and digital media. We put social and demographic information together with election-related outcomes. Journalists were able to derive richer information than the information we generated during the days following the call for voting and, therefore, to find patterns in election results." "In the end, they have so much time to discuss these topics that the more they can get in, the better for them", concluded Amadeu.

Televisión Española was "very much focused" on the making of special election formats, as well as on the monitoring of information regarding the election calls. "Production used augmented reality, and the graphic displays behind the anchors were used for integrating graphics", commented Víctor, who finds AR "increasingly more stable and perfect" thanks to the new rendering systems and their implementation, which provide "value" to viewers: "We have made use of nearly all technology available in the market: Brainstorm, Avid for one issue, Vizrt and ChyronHego. All of them have better and not-so-good things, but the general comment is that the final result achieved is quite a good one". To Víctor it is clear that, beyond their current efficiency, the survival of this kind of graphics is up to viewers: "Some years ago a move was seen from virtual to real; now we are in augmented reality. This is the trend. It is not a matter of being better or worse, but of what we are dealing with right now". On this issue, Amadeu believes it necessary to introduce these resources at the right time, as sometimes "technology just for the sake of it turns out to be a bit of a disaster". Castilla La Mancha Media "sticks to the model implemented in TVE": "We use a graphic style that is worked out in detail. We have been relying on Virtualia for a long time. They use Brainstorm and undertake the whole technical side for us. We also use augmented reality, but we basically focus a lot on design, presentation". Quite the opposite of other stations, Jesús and his team favor an approach of reducing on-screen data and going for clarity, in view of their profile of viewers, who live in the "countryside" and



are of "older" age. David had the opportunity to share his experience in this matter alongside Telemadrid, as his company was in charge of developing both augmented and virtual reality for the regional, local and European elections. Although, after being awarded the job, they still had three months to implement their deployment, the call for general elections tightened the deadlines. "With just one month to go, we had to do crazy things, but things turned out quite well actually". On the other hand, the Operations Manager at Datos Media reflected on the standardization of these techniques for day-to-day activities, as other players such as Atresmedia, Televisión Española or Telemadrid are starting to do. "That is the real challenge of graphical systems: how to produce content in a smart manner, consistent with the development of current issues".


5G TECHNOLOGY AND MOJO

Although backpack standardization brought about a technical revolution in live news coverage, the arrival of 5G promises to provide new solutions, whether in terms of stability, low latency or the volume of data conveyed. Víctor and his team have been studying this connectivity for some time now: "We are already doing tests. This will allow us to generate content anywhere with fast connectivity, as well as editing and assembling from any spot". However, he does not hesitate to state that success will depend on its degree of coverage. Juan, from GesIt, shared this view and added the complications brought about by the lack of a global standard: there is no standardized 5G protocol for every country, and it will move to the 700 MHz band, the frequency of digital terrestrial TV. In spite of this, as the DTV network will be used, he finds there are reasons to be optimistic, as "the infrastructure is already in place and investments have already been made". Amadeu voiced yet another concern for journalists working with these machines: "We may have a problem here: our journalists were used to not having latency in connections. It was assumed to be 2-3 seconds under 4G. What will happen when we have a mix of one and the other?" José Antonio also added the problem of access: "Will you have your own connection, or will your signal be shared by other services?" In the end, the answer lies in the set-up of APNs or the use of a dedicated network in view of potential crowding, as pointed out at the desk. There is some way to go before the arrival of a real 5G network: Juan reckons it will take about three years. That, for Víctor, will be when the technology becomes useful.


WHAT IS MISSING IN NEWS PROGRAMS?

Apart from all the topics covered, Luis offered the attendees the opportunity to state the needs existing within the scope covered in this session. In this regard, Víctor found necessary "an improvement in product integration by manufacturers". "They are really hard to configure and update, while the trend in phone apps is that you install them and then they are automatically updated until the device becomes obsolete". According to the representative of TVE, nowadays these systems are "a kind of jigsaw that, from an engineering point of view, is hard to manage". All broadcasters present agreed with this, as well as with the following proposal from Amadeu and Alberto: "de-localized work". In the words of the Head of the Software Engineering Service: "Journalists want to be able to work anywhere", a statement ratified by Jesús Sánchez: "Everything would be much easier and quicker". These last words put an end to the TM Broadcast Informative Breakfast devoted to technology for the production and broadcast of news programs. Other highly interesting meetings are scheduled, with topics to be announced shortly. We at TM Broadcast would like to express our appreciation for the cooperation shown by all attendees, which made this interesting event possible.





BBC Click

1000 episodes full of technological innovation
Interview with Simon Hancock, editor

If there is a program in the United Kingdom that clearly reflects current technological news, that is BBC Click. Since its birth in 2000, it has not only anticipated trends in the sector, but has also chosen to follow a path marked by innovation. Thus, some of its programs have been broadcast in 360 degrees or edited directly from mobile applications. Its next step? We found out by interviewing Simon Hancock, editor of a television format that is about to reach its 1000th episode.



First of all, could you briefly explain the evolution of BBC Click? You’re almost 20 years old and have made 1000 shows! Yes - we can’t really believe it! The programme started in 2000 really to chart the dotcom boom and we have been on air every week of every year since. It is exhausting, but there is just so much going on in the world of tech and innovation and it’s a privilege to cover it. We aim to really capture some of the amazing work people are doing, to create “oh wow” moments for our audience and also to inspire the next generation. Of course we also look at some of the darker sides of tech as it comes to be so relevant, and so dominant in the world. Overall though, we aim to chart how tech is changing people’s lives for good and ill.

It has been said that BBC Click "pushes new tech" to its limits. Do you share this vision? What is the purpose of BBC Click?

Simon Hancock

Absolutely. We love reporting on the amazing world of tech and its innovations, but it's also hugely important to me that we are also innovating ourselves. We're all quite geeky, and to this end, within the team, as well as people with production and tech journalism skills, we have computer scientists, coders and an engineer. This allows us to do some really interesting projects: about 5 years ago we were the first programme ever to entirely shoot and edit on mobile devices (which was a nightmare, but fun!); a few years ago we were the first programme to make an entire episode in 360 video (long before the kit was ready, so we had to hack it all up ourselves); and then more recently we built an AI just to show how all that works. Now, for our 1000th episode, we're doing the entire episode in Object Based Media, which is incredibly ambitious but also great fun to experiment with.

360, AI, Big Data, user participation… Which implementation have your viewers received best?

I think the biggest reaction we had was probably for the show we edited and filmed on mobile devices… it was just so hard, but something people could relate to. This was about 5 years ago, but even then it showed just what could be done with smartphones. I was really happy with the 360 show, as it introduced a huge number of people to what at the time was a really novel new medium.

Who have been your technological partners for your special TV programmes? Do you do in-house development?

Generally we tend to do



most of the work ourselves - we have a kind of start-up mentality on the programme, and everyone, from me downwards, gets stuck into all aspects of production. We have an amazing multi-skilled team who love learning new approaches and skills, so we find we can generally do most things ourselves; being a bit smaller seems to mean we can achieve more. But occasionally, as with our collaboration with BBC R&D this time, it's great to work with other people who bring their knowledge and approach.

How is the process of creating a new special episode of BBC Click?

For this 1000th episode, made in this Object Based Media environment, we are having to invent and learn a completely different workflow for production - it's very exciting, but also pretty mind-bending at times, and we are swimming in flow-charts! People watching the episode online will be able to interactively explore the content we have made through the choices they make. It's a bit more like playing a video game than watching normal video. The thing is, because we


are creating this world with tons of optional content, we are having to gather maybe the equivalent of 5 episodes of the programme to make just this one special…it’s a lot of work but we hope it will be worth it.

Some would say that the kind of experiences you create are standalone and cannot be applied to long-lasting TV formats… What do you think? Does BBC Click intend to anticipate the future of TV?

Our purpose when making shows like these is just to show what can be done with technology and new production

techniques - it's not really our aim to develop long-lasting formats. That said, we often find while making the programmes that we discover things which we then incorporate back into our regular production - such as blending 360 shots into our normal show. That said, I do think that an OBM approach to content-making has a huge amount to offer, and I genuinely believe it will over time become something used to make video content.

What is BBC Click preparing for the following months? After we recover from making this 1000th episode, we are making

two programmes to tie in with the 50th anniversary of the moon landing - one looking back at what was achieved and one looking forward to where space exploration is heading. We're also making a documentary about Phoenix, Arizona, which has become the world's de facto capital of self-driving cars. We always have a lot going on, to be honest.

Finally, we would like to have your vision of the TV industry. In your opinion, what will be the next great technological innovations applied to broadcasting?

I think the speed 5G offers and the way lower latency will create huge possibilities. It will be possible to do so much more in a live environment in terms of quality, and I think the interconnectedness will potentially lead to some new and interesting formats. I wonder if it could also finally make VR a bit more compelling, if it allows for a more social experience.


Discovering Dalet Unified News Operations
Interview with Enrique Lafuente, General Manager, Spain

Dalet has an extensive career serving broadcasters from around the world. What does Dalet have to contribute to the newsroom field?

Dalet news solutions drive operations in many of the world's largest newsrooms for major broadcasters. The solution enables the unification of traditional newsroom components and the orchestration of resources, staff, systems, and content to create and distribute news stories across multiple platforms. We call it Dalet Unified News Operations (UNO). This approach better serves modern newsrooms, where remote collaboration and multi-site workflows are commonplace and efficient multiplatform content production & distribution are key.

Collaboration spans across the entire workflow: from planning through scripting and production all the way through broadcast and distribution. All of these activities are served by a single, unified platform and accessible from multiple devices. Chat rooms allow users to open up news stories quickly, using ad-hoc groups to exchange media, scripts, and ideas.

Dalet Unified News Operations brings together almost all the elements that need to be taken into account in the workflow of a newsroom. Is that the greatest virtue of Dalet UNO? The Dalet Unified News Operations solution allows


journalists, editors, and producers to collaboratively plan, create and deliver news. The intuitive desktop, web and mobile tools for planning, ingest, scripting, editing, production, playout, analysis, and archive are geared for busy multimedia newsrooms, with digital workflows, social media and audience engagement at the core of their operation.

What difference does Dalet Unified News Operations offer compared to other alternatives?

There are five key standout Dalet Unified News Operations offerings:

1. Dalet Remote Editing - a seamless remote editing framework that enables in-the-field, remote users to collaborate and break news faster and better, even in low-bandwidth situations.

2. Dalet Social Media Framework - robust social media capabilities that empower storytellers to leverage social media for their evolving narratives.

3. Dalet Media Cortex - a powerful AI service platform that automates content tagging and aids in news story development with an embedded smart assistant that surfaces content suggestions to your creative and editorial teams.

4. Dalet OnePlay - next-gen automated studio production with orchestrated multiplatform distribution to open up new forms of audience engagement and revenue opportunities.

5. Dalet Galaxy five cloud-based environment options - enabling newsrooms to expand at will and bring down geographical barriers, allowing ubiquitous access to resources and full mobility for your staff.

Powerful remote editing workflows

Dalet Remote Editing is a highly scalable framework that brings full-featured multimedia editing capabilities and speed to editors working in the field or at remote offices, without requiring a PAM/MAM at every location. Media organizations in fast-paced production markets such as news, sports and reality TV often have extensive in-the-field production needs that are critical to their business. Unmitigated access to content located at the central hub, and accelerated media exchange from the field to home and back, are paramount for success. The Dalet Remote Editing framework securely connects journalists, producers, editors and other content creators to the central content hub, enabling in-the-field/remote users to edit, assemble, collaborate and quickly submit packages or download high-resolution media to finalize locally, even in low-bandwidth situations.

Fully Integrated social media capabilities

Broadcasters can also benefit from the Dalet Galaxy five's enhanced Dalet Social Media Framework, which enables fast-paced workflows like news to treat social media as an integrated part of their overall operation. Journalists can harvest, analyze, produce and deliver fast-paced news on social media platforms alongside traditional outlets. The story-centric workflow offers familiar indicators such as the number of views, likes, shares, as well as audience comments and threads. Visual engagement data lets journalists know how their posts are performing with their audience and discover new angles audiences are expecting.

Content Intelligence at your fingertips

An extension of Dalet Galaxy five, Dalet Media Cortex is a SaaS service platform that enables consumption of cognitive services on demand in a pay-as-you-go model. Taking operations much further than the basic benefits of standard speech-to-text, face recognition and sentiment analysis, Dalet Media Cortex orchestrates advanced automatic metadata enrichment to provide deep content insights and discovery, and eventually surfaces the results at various levels of the Dalet application stack to provide actionable insights and real value to the users and to the organization.

A new generation of studio automation for any show

An extension of Dalet Galaxy five, Dalet OnePlay not only automates the control of all devices in the studio, it also fully leverages the MAM, NRCS and workflow orchestration capabilities of the platform to open up new forms of audience engagement and revenue opportunities, all while optimizing the costs of the entire operation. Dalet OnePlay benefits any production of scripted shows, making it an ideal solution for newscasts, sports magazines, and live and live-to-tape studio shows.

High-performance and scalable cloud workflows

Dalet Galaxy five features advanced integrations with AWS infrastructure services, enabling hybrid scalable architectures that introduce more mobility into the user experience. This allows new operation models and enhanced content security, and can help minimize content handling costs. With auto-scalability capabilities and native support for S3 and Glacier, new workflows can be enabled, such as multi-site federation & content sharing, elastic disaster recovery and business continuity, or simply the migration of media archives to the cloud.

The multiplatform distribution is one of the highlights of the service. Can you tell us a bit more about this?

Whether on TV, on VOD or on social media, audiences expect content to be available when and wherever they want it.


Multi-screen usage is even more important for audiences looking for timely and detailed information, such as news and sports. With Dalet Galaxy five, the process of publishing multimedia content on digital platforms is fully automated. When a story is created, Dalet can automatically produce both a television and a digital story, and journalists immediately access all working assets associated with either platform. Because digital publishing is greatly simplified, broadcasters are able to move it upstream, with journalists now publishing their own stories to digital platforms. It's all about empowerment! Cross-platform content experience is especially important for shows, which are becoming more sophisticated, driving complexity in the production processes. As a response, Dalet introduced Dalet OnePlay as part of the Dalet Unified News Operations

solution, an innovative new generation of studio automation designed to simplify, modernize and transform live show production. It helps production teams work more efficiently, collaborate better with the newsroom and easily build multi-platform experiences for the audience around any show. For graphics, Dalet Cube is a comprehensive suite of tools - integrated in the Dalet UNO solution - to design, manage and play out high-quality broadcast 3D graphics. Dalet Cube allows journalists to seamlessly repurpose television-formatted graphics for digital platforms, which now all feature graphics optimized for digital platform viewing.

Renowned companies have already implemented Dalet Unified News Operations. Can you name some examples? Definitely! In 2018, Euronews completed a

strategic transformation with Dalet that puts content customization for local audiences at the heart of its operation. The pioneering multicultural news provider replaced its legacy broadcast infrastructure with the Dalet Unified News Operations solution, transitioning from a single multi-language channel to 12 separate cross-platform channels tailored to local audiences, all while incurring no additional headcount. The 500 users across


Euronews’ four major offices in Athens, Brussels, Budapest and Lyon are connected and collaborating on a single environment, relying daily on the power and the agility of the Dalet Galaxy platform for the production, orchestration and delivery of their stories. With Dalet, Euronews is now producing more than double the amount of television content that it did under its legacy infrastructure. Additionally, by

synchronizing the production of content between its TV and digital platforms, Euronews is producing 20% more digital platform content than before the adoption of the Dalet Unified News Operations solution. Another interesting customer relying on UNO is CNNMoney Switzerland, which uses the solution to drive end-to-end content production and distribution at its premium business news channel. The entire content production and distribution operation of the new Zurich-based facility uses the full feature set of Dalet news solutions, powered by the robust Dalet Galaxy Media Asset Management (MAM), Workflow Orchestration and Editorial platform. Other customers that have implemented UNO include NEXTRadioTV (BFM, BFM Business, etc.) and CVMC: http://www.tmbroadcast.es/index.php/a-punt-television-valencia/

What will be the future implementation of the service? Are you currently working to enhance the possibilities of Dalet Unified News Operations?

Visit us at IBC 2019 on stand 8.B77 to see the latest!


OTT, AI and the Big Data Connection
By Ted Korte, Qligent

The media delivery business has become a game of seconds. The lines have blurred between broadcast and other IP-related services for delivering media, while content creation has grown from original TV series and movies to how-to videos and social media posts. Access to content for consumers seems limitless, with digital audio and video now the preferred media for nearly all of our daily activities. With so much content being consumed for a wider variety of purposes, viewing time and audience attention spans have grown shorter, making every second count. The main contributor to the rapid expansion of content creation and consumption has been the


emergence of Over the Top (OTT) delivery, made possible by broadband connectivity to a wide range of “connected” devices. This model gives consumers access, convenience and value that wasn’t available via traditional linear services. However, Cable, Satellite, IPTV and Over the Air (OTA) delivery will not completely disappear;

each will find their place in this new media delivery ecosystem. For these providers, competing in this fragmented landscape will require a mix of traditional linear services alongside new OTT services, combined with a strong data-driven approach. Content owners maintain very little control upon

turning their product over to CDNs and OTT service providers for delivery. To complicate matters, they lack insight into the viewer's quality-of-experience (QoE), as more and more third-party services become part of the end-to-end solution. This vacuum of information begs for new methods that ensure a quality experience and



proper measurement of viewer engagement. The aggregation of quality-of-service (QoS), QoE, and viewer behavior data produces extremely large but trusted data sets. By harnessing sophisticated Machine Learning (ML) and Artificial Intelligence (AI) technologies to process this data, media enterprises can glean the valuable insights needed to improve the viewer experience. Significantly, these techniques can be used to predict – in turn allowing operators to prevent – customer-impacting problems before they occur, which is invaluable in minimizing subscriber churn.

The OTT Challenge

OTT has tremendous growth potential for the media and entertainment sector, with growth projected to exceed 158B worldwide by 2025. OTT delivery can provide consumers with one-to-one, personalized experiences while offering providers the ability to collect immediate feedback. To maximize this opportunity, content creators will need to determine the right content, right duration, right time and right platform to reach their audience in real time. Regardless of the end goal, though, the first question in any decision tree should be "Is the quality great?" Studies consistently place poor quality in the top four reasons why viewers abandon OTT video. And with short-form content consumption on the rise, even relatively brief problems become very noticeable - for example, imagine a five-second delay in a four-second pre-roll ad.

To complicate matters, OTT is extremely difficult to control end-to-end. OTA broadcasters controlled the entire chain through to their transmitters, while Cable, Satellite and IPTV distribution offered a single handoff, both technically and commercially. The picture is quite different for OTT. Playout is moving to the cloud via third-party providers, as are streaming service functions including transcoding, packaging and DRM. Meanwhile, multi-CDN and multi-ISP solutions are becoming the norm for reliable delivery and reaching consumers on-the-go. This approach enables incredible scale and speed-to-market, but comes with a cost: loss of control. There could potentially be several hand-offs between separate third-party service providers, thus making a holistic, end-to-end data aggregation, monitoring and analytics system a "must-have" for a successful OTT channel. The best way to optimize OTT-delivered content is to start with high-quality delivery to a target audience, and respond to feedback in real time. To achieve this, many are looking toward new technologies - most notably, AI.

Enabling Technologies

AI has been talked about for decades, but adoption and useful results have been a rollercoaster ride. It didn't really become a practical reality until the cloud, big data, and IoT enabled the capture, storage and processing of vast quantities of data. Large datasets can hold a lot of potential value, but it is challenging to find patterns, trends, and anomalies within them. Methods and approaches from computer science, mathematics and statistics have been joined together to extract and interpret knowledge. Approaches vary from Data Warehousing and Online Analytical Processing (OLAP) to Data Mining and Machine Learning (ML).

[Figure: Churn Reduction Impact of Quality Improvements]

Data Mining is defined as


the process of discovering patterns in data, either automatically or semi-automatically. This process is supported by tools and practical techniques, also known as Machine Learning, which are used to identify the underlying structure of the data. Data Mining techniques can be used to predict future outcomes based on historical data - for example, identifying customers unhappy with their OTT service and predicting the likelihood of them cancelling their subscription. Machine Learning can support this analysis by, for example, using clustering methods to categorize customers based on their consumption habits. There are numerous AI methods and approaches that can be used in Data Mining applications, depending on the characteristics of the available data and the questions to be answered. It is critical to pick the right set of tools and techniques. With the help of a Data Scientist, the

project goals can be decomposed into subsequent tasks that can be solved by certain Machine Learning techniques. Selection of the proper model or approach requires investigation of the data, which must first be cleaned, transformed and properly ingested into the system. The path to an optimal Data Mining solution may involve iteratively exploring, building and tuning many models. Various off-the-shelf software tools offer graphical and conceptual support for all phases of the knowledge discovery process. This eases the daily work of Data Mining experts and allows a growing number of nonexperts to start knowledge discovery projects, but since every use case is unique, you will need to understand how to properly use these components. There are always factors such as exceptions to rules and errors in data that require further analysis of data

and fine tuning of the models.
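The clustering idea mentioned above can be sketched with a toy k-means pass; the subscriber data, the choice of features and the cluster count below are invented purely for illustration:

```python
# Toy k-means sketch: cluster subscribers by (weekly viewing hours,
# support calls per month). All data points are invented.
from math import dist

subscribers = [(20.0, 0), (18.5, 1), (22.0, 0),   # heavy, satisfied viewers
               (2.0, 4), (1.5, 5), (3.0, 3)]      # light viewers, many complaints

def kmeans(points, centroids, iterations=10):
    """Plain k-means: assign each point to its nearest centroid, recompute."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [tuple(sum(c) / len(pts) for c in zip(*pts)) if pts else cen
                     for pts, cen in zip(clusters, centroids)]
    return centroids, clusters

centroids, clusters = kmeans(subscribers, centroids=[(0.0, 0.0), (25.0, 0.0)])
print(centroids)
```

The low-engagement, high-complaint cluster is a natural churn-risk group to hand to a retention team; a production system would use many more features and a principled choice of cluster count.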

Big Data and AI in Action

An example of the use of AI and ML to turn Big Data into actionable business insights is a project that Qligent deployed with a large-scale provider. Their primary objective was understanding, preventing and reversing subscriber churn, but to do so they needed a better understanding of their end-customers’ experiences and consumption habits. Working with Qligent, the provider deployed an intelligent analytics system that supplemented data collection and mining with controlled “Last Mile” probes and end-user IoT probes. A Big Data architecture was designed to process the new and legacy data in real time, and a workflow sequence was created to process the data. Key Performance Indicators (KPIs) and Key Quality Indicators (KQIs) were developed to create both predictive and prescriptive analytics. The complex analytical computations behind the KQIs were modeled to indicate service availability.

To simplify the understanding and use of the results, the KPIs and KQIs were broken down into three topological domains – the headend, the network and the subscriber – and designed such that any output metric lower than 95% would trigger corrective action. By leveraging these insights, the provider realized quantifiable improvements in quality and viewer engagement while reducing support calls and churn.

The headend and network KQIs were initially already above their minimum target of 95%, but increased another 1.4% for the headend and 1.7% for the network over the first six months with the help of analytics-driven corrective actions, and continue to grow. Interestingly, this seemingly modest improvement in quality was followed by an increase in concurrent subscriber usage. The provider was subsequently able to correlate that the service quality improvements attracted more concurrent viewers and longer average viewing times.

The Qligent analytics system currently generates approximately 20,000 predictive tickets each week across all KQIs in all macro-regions. The number of tickets is expected to drop continuously as the provider’s first-line and second-line support teams use this information to optimize the performance and reliability of their network.

Between 150 and 300 subscriber-related predictive tickets are generated by the system daily per macro-region, each representing an individual or small group of subscribers predicted to be affected by a critical fault in the next three to five days. The second-line support team investigates each predictive ticket, with the goal of preventing the fault from happening. As a result, the first-line support team saw a 6.6% decrease in the number of incoming customer problem reports. Even more impressive has been an astounding 93.8% decrease in repeat calls from customers about the problems detected and sent for investigation by the analytics system. Similarly, the second-line support team saw an 86.2% decrease in customers calling multiple times about the same problems. This confirms the benefits of quickly determining the root cause of any issues.

The analytics results also enabled the provider to create a prioritized “churn prevention” list for customer service agents to proactively contact. Initially, the weekly generated list had a large number of subscribers to call, with roughly 35% of them predicted to have a very high probability of leaving the service. After six months, the list was reduced by over 80%. Furthermore, by arming service representatives with analytics about subscribers’ preferences and past technical problems, the agents were able to demonstrate the provider’s commitment to customer service when speaking with the subscribers. Having this personalized knowledge before the call proved far more successful than generic questionnaires or robo-calls.
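In sketch form, the 95% trigger rule described above reduces to a simple comparison per topological domain (the KQI values and ticket fields here are made up; the deployed system is of course far richer):

```python
# Sketch of the threshold rule: any domain KQI below 95% opens a
# corrective-action ticket. Domain names and values are illustrative.
TARGET = 95.0

kqis = {"headend": 97.2, "network": 96.8, "subscriber": 93.4}

def corrective_tickets(kqis, target=TARGET):
    """Return one ticket per domain whose KQI falls below the target."""
    return [{"domain": d, "kqi": v, "action": "investigate"}
            for d, v in kqis.items() if v < target]

tickets = corrective_tickets(kqis)
print(tickets)   # only the subscriber domain breaches the threshold
```

The value of the real system lies not in this comparison but in the predictive modelling that produces the KQI values in the first place.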

Conclusions

OTT enables an array of compelling new business models, including personalization at a global scale. It changes everything from the size and type of content to how content is measured and monetized. This trend has also introduced new players into the media and entertainment landscape, many of whom were early pioneers in the use of the cloud, Big Data and AI. Now, new and traditional players alike are looking toward these technologies to gain a competitive advantage. Bringing these technologies together can provide media organizations with valuable insights they can use to improve their subscribers’ experience. Most importantly, the use of AI and ML enables providers to predict problems before they actually occur, and thus correct them before they impact viewers. As seen in our case study example, early project results demonstrated a direct correlation between quality improvement and end-user engagement. More viewers tuned in, watched longer, and were less likely to cancel their service knowing their provider was staying on top of QoS and QoE issues. The only thing worse than not addressing quality problems quickly enough is not knowing about them at all.


Gaining Value from Big Data

By Tim Jobling, CTO, Imagen

Broadcasters are dealing with big data on a constant basis, primarily in the form of video content, which is naturally a larger data source than other media. The advent of the internet and the resulting volume of data that needs to be managed, stored and analysed has led to the emergence of modern-day big data technologies.

However, in broadcast, although over-the-top (OTT) and streaming have been around for a while now, the boom of upcoming OTT services such as Disney+, Apple TV+ and HBO Max means that the volume of content the consumer now has access to is far beyond anything broadcasters were previously equipped for. In addition, broadcasters are now handling more information on their audiences than ever before. Consequently, broadcasters need to find a way to store and manage this big data in a suitable way, not only so it is easily accessed, but also so they can gain value from it.

The volume of data owned by broadcasters continues to increase as they create more content and generate more data on customers. With the rise of streaming platforms, broadcasters are gaining increasing amounts of data on their audiences. Historically, broadcasters were unable to access this information and instead relied on third parties, such as Nielsen, or on surveys to provide the insights they needed to understand viewing habits. But as broadcasters are now able to capture data on viewers (and most want to gain as much as possible), they are also required to store and manage this big data, particularly if they want to maximise its value to inform business models and drive engagement.

[Figure: Data analytics]

This is even more prevalent for the likes of Netflix or YouTube, which use algorithms to keep users engaged with their content; by using big data to recommend other things they might like to watch, they are able to retain the consumer’s attention and ensure they stay on the platform. In this scenario, Netflix is using big data to great effect, with its recommendation system reported to influence 80% of the content watched on the platform. The streaming service is also using big data to inform which shows to commission and for how many seasons. For instance, Netflix used big data to form its decision to commission House of Cards for 26 episodes despite not having seen a pilot, basing it on information such as the subject matter, the fact that there was an existing fan base thanks to the original British series, and the appeal of the actors and director attached to the project. It has continued to use this model with the film Bird Box, which seems to have paid off, with the film being watched by more than 45 million accounts during its first seven days. Big data, therefore, plays a valuable role in the decision-making process within broadcasters.
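Production recommendation engines are vastly more sophisticated than this, but the underlying co-viewing idea can be sketched in a few lines (the viewing histories below are invented; the titles are borrowed from the text):

```python
# Toy item-to-item recommendation: suggest the titles most often co-watched
# with what a user has already seen. Viewing histories are invented.
from collections import Counter
from itertools import combinations

histories = [
    {"House of Cards", "Bird Box", "The Crown"},
    {"House of Cards", "The Crown"},
    {"Bird Box", "Stranger Things"},
    {"House of Cards", "Bird Box"},
    {"House of Cards", "Bird Box"},
]

# Count how often each pair of titles appears in the same viewing history.
co_watch = Counter()
for h in histories:
    for a, b in combinations(sorted(h), 2):
        co_watch[(a, b)] += 1

def recommend(seen, co_watch, n=1):
    """Score unseen titles by co-occurrence with the user's seen titles."""
    scores = Counter()
    for (a, b), c in co_watch.items():
        if a in seen and b not in seen:
            scores[b] += c
        elif b in seen and a not in seen:
            scores[a] += c
    return [title for title, _ in scores.most_common(n)]

print(recommend({"House of Cards"}, co_watch))
```

Real systems add implicit signals (watch time, abandonment), content features and model-based ranking on top of this basic co-occurrence idea.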

Historic vs current data

As Disney+ prepares to launch, the sheer scale of data it has at its disposal could be seen as an advantage. After all, the first Disney film, Snow White and the Seven Dwarfs, was released in 1938 – almost 60 years before the launch of Netflix and 68 years before Amazon Prime Video. During this time, Walt Disney Pictures will have gained huge amounts of data about which films and programmes are the most successful, which themes resonate the most, and which demographics they should aim at. All of this data will have been extremely valuable to the organisation when making decisions such as which new films to commission or which films should have a sequel.

However, despite this, the data owned by Disney may not be as advantageous and useful when it comes to its new platform, owing to the fact that it will have aged substantially. In this situation, Netflix and Amazon will have the upper hand, as the data they possess is based on current users, so they are able to utilise this information to personalise the platform for each individual user. With data on audiences spanning the past 81 years, Disney+ will be unable to do this – at least not immediately. Consequently, in some ways, Disney+ will have to start afresh when it comes to capturing data on its viewers and build up this information over months and years. That is not to say its existing data will go unused; instead, it will have to be combined with new and up-to-date information to provide any real value.

Gaining value from big data

By 2022, it is predicted that there will be 777 million SVoD (subscription video on demand) subscribers globally, rising from 474 million in 2018. This increase will lead to broadcasters creating even more content to cater to the differing tastes and viewing habits of such a vast audience. Additionally, the amount of data captured on viewers will grow exponentially as viewer numbers increase. As a result, it is vital that broadcasters are able to store and manage this data effectively in order to derive as much value from it as possible. As competition increases, this will be key to enabling broadcasters to get ahead and gain a larger share of the SVoD market.



CANON XA 55

Holding the XA55 for the first time, one could be misled into believing it is an advanced home camera. However, this impression soon vanishes, as it is actually a truly complete and compact professional camera. Let us look at it in more detail. By Yeray Alfageme, Business Technology Manager, Olympic Channel

The Canon XA55 has been launched onto the market along with its twin sibling, the XA50. The only difference between them is the 3G-SDI output featured on the XA55, compared to the 1.5Gbps output on the XA50. This enables the XA55 to deliver a linear video signal up to 1080p50 at 10 bits, while the XA50 only reaches 1080i50 at 8 bits. Aside from this small difference, all other features are just the same.

Body

Since we have already hinted at it, let us discuss the body of the XA55 in detail. Canon’s XA range has always boasted a truly compact design and, with the leap into UHD, Canon has not left this important feature aside for its most compact product family. The XA55 weighs less than a kilo; yes, less than one kilo. And if the small battery and the upper handle (which provides the connectivity and audio controls, as well as additional recording controls) are added, it hardly reaches 1,400 grams. Really light stuff. Its matt finish and resistant plastic build provide a high degree of durability against abrasion and scratching. All controls, from menu buttons to zoom controls or power knobs, have a really professional, reliable feel. One compromise Canon had to accept to keep such a compact format is that the camera has just one control ring around the lens, which controls either focus or zoom, but not both at the same time. A small selector on the side of the ring lets us choose its function.


Lens

The XA55 uses a compact Canon lens featuring a 15x zoom that ranges from 25.5mm to 382.5mm (35mm equivalent) and an aperture between f/2.8 and f/4.5. Although this is not particularly impressive, it shows that Canon is an experienced lens manufacturer, as no aberrations or errors can be noticed in the image, either at the widest aperture or at the longest end of the zoom range. Ultra-wide-angle and 1.5x teleconverter accessories are also available, making the camera even more versatile. Furthermore, the digital options offered by the software are very flexible. Although it is true, mostly in 4K of course, that the 300x digital zoom is not really recommendable, the 2x doubler does work as a useful tool. In HD, the loss of quality is hardly noticeable, mainly due to the interpolation technique from UHD used by the camera to generate an HD signal, as we note below.

A great detail is the camera’s touch screen, which enables controlling the focus just by selecting a point in the image. This feature was already present in most cell phones on the market, but including it in a reliable manner in a professional camera was quite a challenge, one which Canon has more than successfully overcome.

Image quality

The XA55 belongs to the first family of UHD cameras in Canon’s XA series, along with the XA40 and the above-mentioned XA50. As opposed to earlier models in the range, the sensor of the XA55 is a 1.0-inch CMOS. The XA range had until now been equipped with a 1/2.84-inch sensor, but the sensor in the XA55 features a larger area, about 6.8 times bigger, and therefore increased sensitivity and less noise, a must-have in UHD. As this bigger sensor delivers stronger bokeh through a shallower depth of field, it provides the image with a much more polished aesthetic, which, coupled with the autofocus feature, allows for great possibilities in terms of image composition.

As noted before, HD signal generation deserves a separate comment. Canon does not discriminate between pixels when generating the HD image from the UHD sensor; instead, it captures a full image with 4K definition and interpolates an HD image by discarding extraneous information. This is noticeable mostly in high-contrast scenes, where HD image generation is really faithful to reality.

The XA55 features a 5-axis image stabilizer resulting from a combination of optical and electronic stabilization. Autofocus is offered by means of a proprietary technology developed by Canon, the so-called Dual Pixel CMOS AF. This technology offers continuous autofocus during recording that covers 80% of the center of the image. As we have already mentioned, touch control through the focus screen blends perfectly with subject tracking, which makes the focus follow a subject as it moves across the image. An incredible, really easy-to-use effect. Last, the device features face detection and tracking and an advanced focus system just for faces, which again makes focusing easier, something that is critical in 4K UHD scenes.


Recording formats

The XA55 is a UHD-native camera that supports the following recording modes: • UHD: 3840 x 2160, 50P, YCC 4:2:2 (8 bits) • HD: 1920 x 1080, 50P, YCC 4:2:2 (10 bits) And their 30FPS and 60FPS equivalents, respectively. As for recording formats, it offers both MP4 and XF-AVC (MXF) with the following qualities: • MP4: 3840 x 2160 (150 Mbps) / 1920 x 1080 (35/17 Mbps) • XF-AVC: 3840 x 2160 (160 Mbps) / 1920 x 1080 (45 Mbps)
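From the bitrates above it is easy to estimate how much footage fits on a card. The sketch below ignores audio and filesystem overhead, so real-world figures will be somewhat lower:

```python
# Rough recording-time estimate from video bitrate and card size.
# Audio and filesystem overheads are ignored, so real figures are lower.

def recording_minutes(card_gb: float, mbps: float) -> float:
    """Minutes of footage a card holds at a given video bitrate."""
    card_megabits = card_gb * 1000 * 8          # decimal GB -> megabits
    return card_megabits / mbps / 60

# XF-AVC UHD at 160 Mbps on a 128 GB SD card:
print(round(recording_minutes(128, 160)))   # about 107 minutes
# MP4 HD at 35 Mbps on the same card:
print(round(recording_minutes(128, 35)))    # about 488 minutes
```

The same arithmetic explains why dual-slot recording in both formats at once, discussed below, is practical: the lighter MP4 copy barely dents the second card.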


After testing both formats, we can say that the compression quality of MP4 is nearly the same as that of XF-AVC, so the codec choice will be determined by the intended use of the images. If the production targets ENG or social media, the MP4 format will no doubt offer speed and higher compatibility, while XF-AVC offers greater integration with advanced post-production and editing environments. As the camera has two recording slots for SD cards, dual recording in both formats can be achieved if required. This offers both extra assurance and the possibility of having not only a backup copy of the material but also the same material in both formats, which is very useful if the end use is not yet known for sure.

As for audio, the XA55 offers two AAC channels or four 16-bit LPCM channels. With the standard accessory included, there are two XLR connections in addition to the built-in stereo microphone on the body. Each of those external connections has a microphone preamplifier with adjustable or automatic gain, including 48V phantom power. In addition to the already mentioned SDI output (3G on the XA55, 1.5G on the XA50), the device has a headphone jack and a microphone mini-jack input directly on the body, plus an HDMI output.

Conclusions

Incredible as it may seem in a body weighing less than one kilo, the XA55 has all the features that are to be expected of a professional camera. Its small size also makes it possible to reach up to two hours of uninterrupted recording on a single charge of its small battery. The XA55 is a professional camera within the body of a compact one, which is really surprising when we see its recording quality and the countless possibilities it offers. The arrival of UHD to the XA range is good news, as it allows news reporters to make small productions and even documentaries. An option to really bear in mind when size matters.