TM Broadcast International #108, August 2022




EDITORIAL

In our industry, the content we produce is worth millions. An important part of our job is to preserve and protect it so that we can distribute or store it. What is the best way to do this? Traditional methods relied on the creation of a robotic library located in large, expensive facilities with very high energy costs. Today there is an alternative: the cloud. In an effort to learn about the current state of technology and the capabilities offered by this cloud infrastructure, we spoke with Symon Roue, Managing Director of VIDA. This company is part of Visual Data and the fruit of its work is a platform for organizing, distributing and monetizing the valuable content we produce. Of course, the cloud also entails costs —often high, it must be said— but it provides a clear advance in the agility and flexibility of sharing our resources. Alternatively, we can always store it in an old shack with a leaky roof and pray for the best preservation conditions.

The final of the UEFA Europa League is a football event that brings together millions of fans through traditional broadcast media and new digital media. The last edition was produced by Mediapro Group, an association of audiovisual companies covering all steps of the content production chain, with international presence. We have interviewed Óscar Lago, director of the match, to learn all the details of his work on this event.

Did you know about the application of virtual reality techniques to animation processes? Can you imagine a 3D artist creating animated characters and environments while "drawing" them immersed in a VR environment? Dada! Animation is a French 3D animation studio that aspires to incorporate these recent technologies into its work. We spoke with Quentin Auger, Head of Innovation, to find out how they are doing it thanks to the power of game engines.

On the other hand, Gravity Media and TVN Live Production, two companies with international capabilities dedicated to providing production services, have shared with the editorial staff of TM Broadcast International the capabilities of their fleets of mobile units. Here you will find all the details and experience of these companies in the Outside Broadcast area. Last but not least, our contributor Yeray Alfageme has dedicated his space in this issue to discovering and making us understand the characteristics of another of the constant transitions that broadcasting is undergoing: producing live in UHD.

Editor in chief: Javier de Martín (editor@tmbroadcast.com)
Creative Direction: Mercedes González (mercedes.gonzalez@tmbroadcast.com)
Key account manager: Susana Sampedro (ssa@tmbroadcast.com)
Administration: Laura de Diego (administration@tmbroadcast.com)
Editorial staff: press@tmbroadcast.com
Published in Spain
ISSN: 2659-5966

TM Broadcast International is a magazine published by Daró Media Group SL, Centro Empresarial Tartessos, Calle Pollensa 2, oficina 14, 28290 Las Rozas (Madrid), Spain. Phone: +34 91 640 46 43



SUMMARY

NEWS

UEFA Europa League Final: An interview with Óscar Lago, producer of the Mediapro Group

Dada! Animation: Game engines, AI and VR in 3D animation

Outside Broadcast with Gravity Media: Gravity Media is a technical and human services provider focused on broadcast and multimedia event production that operates worldwide.

TVN Live Production: TVN Live Production always wants to ensure the best quality and the highest security for its customers. This is one of the key reasons why they continue to rely on SDI technologies.

Live production in UHD: More definition, more reality

Delivering high resolution live television, by Caton Technology

VIDA: a cloud-native SaaS platform delivering efficiencies for managing, migrating, distributing and monetizing library content. It works as a content operating system. The solution runs in the cloud across multiple cloud providers and uses the latest tools and serverless multi-cloud infrastructure.

Case study: Clip Factory Pro for editing, clipping and exporting content to new media platforms for RTV


NEWS - PRODUCTS

Telestream releases version 3.3 of its PRISM waveform monitor software

Telestream has announced the latest software for its PRISM family of hybrid IP/SDI waveform monitors. Version 3.3 adds new video format support for post-production applications, faster switching of IP inputs and Dolby ED2 metadata support, among other features. PRISM is a solution designed to assist users across these steps of the production chain: video engineering, operations, live acquisition, event production, and post-production. It includes compatibility with today's wide color gamut, high dynamic range, and high-resolution formats. With faster switching of IP inputs, at around half a second, camera shading operators can make comparisons regardless of whether they are working in an IP, SDI, or hybrid facility. Version 3.3 now supports DCI 4K post-production resolutions up to 4096×2160 and refresh rates from 23.98p to 60p.


The latest version also offers Dolby ED2 metadata decoding, which is used for Dolby Atmos, and guard band measurement for comprehensive audio monitoring. PRISM is a software-defined instrument family, so users can add new features and capabilities with an upgrade. The system also features AV delay measurement for SDI, ST 2022-6 and ST 2110-20/30/31 using the AV delay test signal from Telestream Sync Pulse Generator products; a Safe Area display with AFD graticule; a colorized RGB Parade waveform/Stop display; and a larger floating timecode display.

"As the dividing lines between film and TV production blur, we are adding capability to the PRISM platform to extend the format capability in SDI and IP to meet these needs for a broad range of customers doing higher-end post work," says Charlie Dunn, SVP of the Video Test, Synchronization and Quality Assurance Business Unit. "Now that ST 2110 can be considered mainstream, we are adding the expected capability to measure AV delay and Dolby E/ED2, and enable a seamless switch so that the benefits of an IP infrastructure can be realized without any operational compromises."



Atomos introduces cloud-based products from the Connect and Atomos Cloud Studio lines

The company Atomos has announced that its cloud-based project ATOMOS Cloud Studio is underway and that three of the first products in the Connect range are now on sale worldwide. The combination of these network and cloud-connected devices establishes an ecosystem for collaboration and remote working in video production environments. The NINJA V/NINJA V+, SHOGUN CONNECT, ZATO CONNECT, ATOMOS CONNECT for NINJA and ATOMOS Cloud Studio products have been announced.

SHOGUN CONNECT is an integrated device that combines HDR monitoring and RAW recording, as well as advanced network and cloud workflows. It has a 7-inch HDR display that delivers 2000 nits.

ATOMOS Cloud Studio is a collection of new cloud-based video production services. It will launch with ATOMOS Stream and soon after with ATOMOS Capture to Cloud, which will introduce C2C workflows. At launch, ATOMOS Stream will include free support for delivering content directly to a single platform, including Facebook Live, Twitch, YouTube, and more.

ATOMOS CONNECT is an accessory for the NINJA V and NINJA V+. When attached to the NINJA V/V+, the ATOMOS CONNECT transforms professional film, mirrorless and DSLR cameras into cloud-integrated devices.

ZATO CONNECT has been designed to be a more compact and easy-to-use gateway to ATOMOS Cloud Studio. The device supports HDMI and USB UVC sources for live streaming to Facebook Live, Twitch, YouTube and other social networks. It can also be used to create a webcam feed for conferencing software. With support for WiFi 5 and Gigabit Ethernet (via USB-C), the solution can convert HDMI sources to webcam signals and includes the ability to merge sources.

Later this year, ATOMOS will launch Live Production, a cloud-based control room for live video and remote collaboration.




NEWS - SUCCESS STORIES

TV2 and DR conduct a 5G Broadcast proof of concept with Rohde & Schwarz and Qualcomm in the context of the Tour de France

Rohde & Schwarz has recently taken part in a public demonstration of 5G Broadcast mobile content delivery. The proof of concept was carried out by Danish broadcasters DR and TV2, with special mobile devices provided by Qualcomm. The first stage of this year's edition of the Tour de France was held in Denmark; the tour started with a time trial in the capital, Copenhagen, on 1 July. TV2, the host broadcaster, offered specially tailored content over 5G Broadcast, looking to increase audience engagement. The advantage is that the content was not streamed over large numbers of one-to-one connections, but broadcast to devices which do not even need a SIM to receive it, eliminating network congestion and allowing broadcasters to retain the transmission rights.


For the TV2/DR demonstration, the transmission used a Rohde & Schwarz TMU9evo medium-power UHF transmitter installed in Copenhagen for the project. Content was prepared by TV2 as part of its host broadcaster productions. Viewers saw the live, low-latency broadcasts on smartphone-like devices from Qualcomm Technologies.

"5G Broadcast is an exciting new way for consumers to connect to live video broadcasts wherever they are, adding to the experience and engagement of events like the Tour de France," said Manfred Reitmeier, VP broadcast and amplifier systems at Rohde & Schwarz. "This was part of a continuing programme of demonstrations in association with Qualcomm to showcase what this technology can deliver, and we are grateful for the participation of DR and TV2." Lorenzo Casaccia, VP technical standards at Qualcomm Europe, added: "This was a great demonstration of the way that 5G Broadcast can bring a new viewing dimension."



FACEIT relied on Panasonic solutions to save its esports season during the worst stages of the pandemic

FACEIT is an independent competitive gaming platform for online multiplayer PvP players. In addition, the organization has its own television and media production unit that provides professional gaming coverage of some of the biggest esports events in the world. Like companies all around the world, FACEIT saw the pandemic affect its working models. For example, the emergency situation prevented FACEIT's production team from traveling to the United States to host their Counter-Strike: Global Offensive Flashpoint Season 2 tournament.

They changed plans and moved the production site to Twickenham Stadium in the UK, in addition to using Panasonic's PTZ camera and robotics solutions. "We had worked with Twickenham in the past, and as a venue it is suitably secure and locked down," said Andrew Lane, Director of Broadcast & Production at FACEIT. The company decided to go ahead with the event, but with all the players competing remotely and a studio created at Twickenham. The team used a PTZ camera-based system featuring the Panasonic AW-UE150 camera, for three purposes. They could control these cameras over a network, so that the team could operate in isolation, from their hotel rooms, in the worst-case scenario. "We had access to the full hotel network, and had patched this to make it our own. This meant that everyone from the bubble had a wired network connection to their room from day one," assures Andrew.

The production team's system included six Panasonic UE150 cameras, an AW-RP150 controller, and a Tuning rail system with an elevated column – all supplied via ES Broadcast. The Panasonic UE150 cameras and Tuning system fitted in the environment because they didn't require proprietary cables, and could be controlled from a simple IP network. The wider production system was very Blackmagic-based and included an ATEM Constellation mixer and RIEDEL comms. The team also used an infinity green screen and Augmented Reality effects on the live show and behind the scenes, together with the Panasonic UE150 cameras, thanks to their compatibility with the FreeD protocol.
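FreeD, referenced above, is a simple and widely supported protocol in which a camera or PTZ head streams its pan, tilt, roll, position, zoom and focus to a graphics engine, usually as small fixed-size UDP packets. Purely as a rough, hypothetical illustration (the listening port and field scaling below are common defaults, not details of FACEIT's setup), a receiver for FreeD "Type D1" packets can be sketched in a few lines of Python:

import socket

def signed24(chunk: bytes) -> int:
    # Interpret 3 bytes as a big-endian signed 24-bit integer.
    value = int.from_bytes(chunk, "big")
    return value - (1 << 24) if value & 0x800000 else value

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 40000))  # hypothetical listening port

while True:
    data, _ = sock.recvfrom(64)
    if len(data) < 29 or data[0] != 0xD1:  # Type D1 = camera position/orientation
        continue
    cam_id = data[1]
    pan = signed24(data[2:5]) / 32768.0    # degrees (typical FreeD scaling)
    tilt = signed24(data[5:8]) / 32768.0
    roll = signed24(data[8:11]) / 32768.0
    x_mm = signed24(data[11:14]) / 64.0    # millimetres (typical FreeD scaling)
    y_mm = signed24(data[14:17]) / 64.0
    z_mm = signed24(data[17:20]) / 64.0
    zoom = int.from_bytes(data[20:23], "big")
    focus = int.from_bytes(data[23:26], "big")
    print(f"cam {cam_id}: pan={pan:.2f} tilt={tilt:.2f} roll={roll:.2f} "
          f"pos=({x_mm:.0f}, {y_mm:.0f}, {z_mm:.0f}) mm zoom={zoom} focus={focus}")

A graphics engine ingesting data like this can lock virtual elements to the real camera's movement, which is what makes AR overlays with tracked PTZ cameras possible.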



The latest addition to NEP's fleet features Cobalt Digital technology for signal processing, conversion and distribution

Supershooter 6, NEP's newest Outside Broadcast vehicle, has incorporated twelve frames of Cobalt Digital's signal processing, conversion and distribution gear onboard. Cobalt provided NEP with 12 HPF-9000 20-slot frames packed with openGear form-factor cards and Cobalt's OGCP-9000 Ethernet remote control panel optimized for live color correction applications. The configuration comprises 4K- and HDR-capable cards and supports up/down/cross conversion, audio processing and color correction. The system maximizes space with capabilities such as multi-rate distribution, 3G/HD-to-SDI and analog down conversion, high-density coax-to-fiber conversion, and multi-path up, down and cross conversion, all leveraged from a compact footprint.

Other highlights of the elaborate system of Supershooter 6 include over 100 Cobalt openGear Distribution Amplifiers from several of the company's card families, including the 9910, 9501, 9502, and 9410. Functionality includes dual- and multi-rate reclocking, analog video looping distribution with EQ, and downconverting and output crosspoint features.

The 53' mobile unit has numerous Cobalt 9905-MPx software-defined audio/video processing cards installed. Supershooter 6 also has the 9902-UDX 3G/HD/SD-SDI up/down/cross converter, frame sync and audio embedder/de-embedder with four analog audio and composite analog CVBS video inputs/outputs. In addition, Cobalt loaded a sizeable number of options onto the cards for NEP, including presets to convert SDR to HDR and back, 3D LUTs, 4K color corrector software, RGB color correction with gain, lift and gamma, and chroma and luma clipping.

"NEP was Cobalt's first customer when the company launched 25 years ago and we're proud that they still come to us for the most advanced and reliable solutions for their fleet," said Suzana Brady, Senior VP of Worldwide Sales and Marketing for Cobalt. "NEP is globally renowned for bringing content to life, and those events must be flawlessly delivered for an exceptional viewer experience. There is no room for failure or compromised quality, and they've trusted Cobalt to deliver market-leading and technologically advanced solutions they can count on for over a quarter of a century."


Ideal Systems creates a virtual studio for Malaysian network SUKE TV based on NDI

Ideal Systems has recently built a TV studio in Kuala Lumpur for SUKE TV. This broadcast network was launched by Dato' AC Mizal and it aims to deliver original content of all genres for Malaysian TVs and digital mediums. It was launched in April 2022 and is a Free-To-Air home shopping and general entertainment channel. SUKE TV airs on the Malaysian national Digital Terrestrial Television (DTT) platform as well as streaming. Founder and Chairman Dato' AC Mizal said: "It's been my dream as an artist to have my own TV station."

The studio was designed to create and produce local programming and entertainment content and to present opportunities for local celebrities, public figures, social influencers and entrepreneurs, as well as micro sellers, to showcase their products on TV.

Ideal Systems was selected by SUKE TV because the company could provide a one-stop shop for the design and build of the TV studio, including all of the TV network's broadcast and transmission systems, as a single turn-key project. The infrastructure is based on NDI: cameras using this protocol feed a TriCaster TC1 from NewTek for switching, recording and running the virtual studio, as well as the latest Kslim LED video wall from Unilumin, which measures 4m x 2.5m. Ideal has deployed Xplayout for playout, Xingest for ingest and YouPlay for studio playout, all from Axel Technology, and also uses Aximmetry virtual studio and 3D graphics. This solution has an option to run Unreal Engine, which is also NDI-enabled. For transmission, encoding and decoding for contribution and streaming, the system will use Kiloview SRT and NDI encoding systems.

Sofiyant Neo, Director Media & Creative Content for Ideal Systems, said: "I have been working very closely with our Technology Director Updesh Singh and his team to leverage the latest technologies to help Dato' AC Mizal realise their dreams of creating great content and ensuring the market impact and success of SUKE TV".


Gravity Media launches OB truck named Sirius for French and European markets

Gravity Media has officially launched the Outside Broadcast truck Sirius, based in France. It is a custom-built outside broadcast vehicle which supports 18 cameras, with up to three vision operators, two replay operators, and an OB supervisor position. It can accommodate 14 staff. This truck has been designed specifically for the French and European broadcast market. The OB truck has already been successfully deployed at the ARES Fighting Championship, the International Outdoor Meeting of Athletics and the Archery World Cup.

Furthermore, Sirius adds to Gravity Media's extensive fleet of OB trucks, mobile units and DSNG trucks across the UK, Australia, the USA, France, Qatar and Germany. Solène Zavagno, General Manager at Gravity Media in France, states: "With an exciting summer of sport ahead, this is the perfect time to launch our new French-based OB truck into the European market. We are thrilled to serve our clients further, adding another card to the already existing wide range of services. Sirius is readily available for world-class events."

Ed Tischler, Managing Director at Gravity Media, commented: "Gravity Media is delighted to be extending our range of OB trucks, mobile units and DSNGs as we continue to innovate and tailor our broadcast and technology solutions for our clients' needs. We operate every day across all tiers of world sport and entertainment – at the heart of our business is a rich tradition of delivering high-end content and live production across a wide variety of world brands. With the arrival of our latest OB truck Sirius, we can ensure we can support yet more projects in France and across Europe."



NEWS - BUSINESS & PEOPLE

Broadcast Pix celebrates its 20th anniversary

The company Broadcast Pix, specialized in the development of integrated video production systems, is celebrating its 20th anniversary this July. Broadcast Pix was one of the pioneers in the integrated production systems industry and established its reputation for developing toolsets for live video content creation. These tools continue to be enhanced, and user interfaces refined, through two decades of listening and reacting to customer feedback.

Today, the product line has evolved into format- and resolution-independent software that runs on consumer-available hardware. "From StreamingPix to the production switcher, the GX Hybrid, the company has a product for every application: the corporate event, the classroom, or the OB truck," the company assures.

"With 20 years' worth of accumulated knowledge and user interface design, Broadcast Pix products are the most comprehensive, yet easy-to-use solutions on the market. As we continue to grow, we want you to know that one thing will never change: our excellent, people-driven support," says Graham Sharp, CEO, Broadcast Pix.



X-Pert Playout now integrates with Caliope Media Software's OnAir TV scheduling system

X-Pert Multimedia Solutions has been integrated with OnAir TV, the scheduling system of Caliope Media Software. OnAir TV is a TV scheduler that replaces the well-known manual drag & drop playlist generation process. It creates a playlist based on a set of rules and templates. Once it is created, it can be changed manually if needed before being exported to X-Pert Playout. Besides the playlist scheduler, OnAir TV includes a Media Asset Management system (MAM) and an EPG generator.

Customers can take advantage of the integration with OnAir TV regardless of whether they have a perpetual licence of X-Pert Playout or X-Pert Channel-in-a-Box or rely on a subscription-based model. "OnAir TV gives additional value to our products. Now we can guarantee a 100% automated process for every linear TV channel, without any need for human interaction. As a result of the reduced human activity, we can ensure fewer mistakes and a more optimized process in terms of time and human resources invested," explains Ludmil Kushinov, the CEO of X-Pert Multimedia Solutions.

"We want to make media operations more efficient, and that is why we invested our efforts in developing user-friendly, stable, flexible, and technically advanced software," says Norbert Deelstra, the founder of Caliope Media Software. "We are very happy to partner with X-Pert. We can now be even surer that the integration between OnAir TV and the X-Pert software will work flawlessly at all times. This will increase the reliability of the whole system stack and will reduce the time and effort the TV broadcasters need to spend on their software."
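To picture what rule- and template-driven scheduling means in practice, here is a deliberately simplified, hypothetical example (OnAir TV's actual rule engine is proprietary and far richer): each slot in a daily template carries a rule, and the scheduler fills it with the first library clip that satisfies the rule, producing a timed playlist that an operator could still adjust before export.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Clip:
    title: str
    duration: timedelta
    genre: str

def build_playlist(template, library, day_start):
    """Fill each template slot with the first library clip matching its rule."""
    playlist, t = [], day_start
    for rule in template:  # e.g. {"genre": "news", "max_minutes": 30}
        clip = next(c for c in library
                    if c.genre == rule["genre"]
                    and c.duration <= timedelta(minutes=rule["max_minutes"]))
        playlist.append((t, clip))
        t += clip.duration
    return playlist

library = [Clip("Morning News", timedelta(minutes=25), "news"),
           Clip("Cooking Show", timedelta(minutes=50), "entertainment")]
template = [{"genre": "news", "max_minutes": 30},
            {"genre": "entertainment", "max_minutes": 60}]

for start, clip in build_playlist(template, library, datetime(2022, 8, 1, 6, 0)):
    print(start.strftime("%H:%M"), clip.title)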



SPORTS



UEFA EUROPA LEAGUE FINAL




An interview with Óscar Lago, producer of the Mediapro Group

Óscar Lago is one of the most experienced and renowned audiovisual producers in our sector. Recently, he carried out the broadcast of an event as important as the UEFA Europa League final. This event, which brings together millions of international fans - both on-site and through television - is one of the most important events broadcast every year. On this occasion, the final was held at the Ramón Sánchez-Pizjuán stadium in Seville and brought together two teams from beyond our borders: Eintracht Frankfurt, from Germany, and Rangers, from the Scottish city of Glasgow. According to our interviewee, the work carried out by the broadcasters, by the organization and by those attending the match was exemplary. Cutting-edge technologies were used to offer the best possible signal to international fans. The Mediapro Group, thanks to its experience on these occasions, provided technical and human resources of a high standard that ensured that every action in the game was enjoyed with great intensity around the world. Here you will find all the details about his work.

The final of the Europa League took place this time at the Ramón Sánchez-Pizjuán stadium in Seville. What are the peculiarities of the stadium, what challenges have you experienced and what technical solutions have you come up with on this occasion?

As you can see, the Ramón Sánchez-Pizjuán is a stadium where fans are very close to the pitch and this creates a very intense atmosphere during a final. In this sense, it is an ideal stadium and, thanks to the work of the security teams deployed, there were no incidents, either inside or in the vicinity. As for the facilities for the production of a final such as this one, the space in the TV compound and the perimeter of the pitch are both limited, but thanks to the work of UEFA and the broadcasters (Telefónica / Movistar), it was possible to locate all the elements that are required for the production of the match.

What human and technological means has the Mediapro Group deployed to cover this event?

The Mediapro team consisted of about 150 professionals, two large mobile units, a mobile unit equipped with special cameras and a mobile unit for drones. The match was covered with 35 cameras, nine replay operators and five cameras for international customer services. The following special cameras were used: seven SSM cameras, two ultra-slow cameras, two ultra-slow polecams, two cameras for reactions of fans in the stands, two Steadicams and a drone. The ambient sound was produced in Dolby Atmos and the video signal in multiformat: HDR UHD, SDR UHD, HDR HD and SDR HD. This is the most demanding format that takes place in major competitions and in the European finals. Mobile unit 89 was in charge of the production of the match signal and mobile unit 53 had two functions: to produce three different feeds with customization for international customers and to serve as a mobile back-up unit for the 89.

The pandemic has generated many changes in the infrastructure of an event like this. What innovations arising from this process have you integrated into your workflows?

The pandemic has accelerated some issues such as remote or delocalized production. At Mediapro we were already doing remote productions, but due to the pandemic, the difficulties to travel and the need to maintain physical distance between the members of the production team, we had to increase these productions and create other hybrid ones, in which part of the team is delocalized.

Regarding signal contribution, on which network infrastructure have you relied?

It is a multi-supplier infrastructure, since we always try to get the best proposal, on quality/price ratio, from the suppliers having coverage within each venue. At Mediapro we have a high-capacity communications network with global reach and fiber installations ranging from Hong Kong to Buenos Aires through North America and Europe. This worldwide network includes our redundant Iberian fiber network that connects all Spanish first and second division stadiums, thanks to which we can easily offer live signal to any customer worldwide. We are very scrupulous with our level of service and comply with the 3, 2, 1 rule of signal robustness to make it virtually impossible for a match to go offline. This means that we have three stadium outputs from two different suppliers, and one of the outputs uses a different technology than the rest.

How was this match produced? Remotely, locally or hybridly?

At the moment these large productions are still local, but it will not take long to see that, even in big finals, part of the production will be relocated and work done remotely.

Regarding the integration of graphics, how have you developed this area? Has a dressed signal been transmitted or has a master been launched for customization?

UEFA has commissioned a full signal with graphics. The aim is to create a consistent product that reflects, in all countries having rights, the spirit of competition.

Finally, regarding broadcast, what has been your workflow and on what technology have you relied?

Efforts have been made to keep the workflow simple and clear. We have developed our own automated management software thanks to which, when a mobile unit arrives at the stadium and connects to the data network, all the equipment is automatically detected and configured so as to enable it to send and transmit the video signals between the stadium and the production center of 22@ (Barcelona). The technology on which we have relied from the outset has been video over IP networks.
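Mediapro's management software is proprietary, but in IP video facilities this kind of plug-and-play detection is often built on mDNS/DNS-SD announcements; AMWA NMOS IS-04 nodes, for example, advertise a "_nmos-node._tcp" service that a controller can browse for. Purely as a hypothetical illustration of that idea (not Mediapro's actual implementation), a discovery loop in Python could look like this:

from zeroconf import ServiceBrowser, ServiceListener, Zeroconf

class NodeListener(ServiceListener):
    def add_service(self, zc, type_, name):
        info = zc.get_service_info(type_, name)
        if info:
            addresses = ", ".join(info.parsed_addresses())
            # A real controller would now register and configure the device.
            print(f"detected {name} at {addresses}:{info.port}")

    def update_service(self, zc, type_, name):
        pass

    def remove_service(self, zc, type_, name):
        print(f"lost {name}")

zc = Zeroconf()
browser = ServiceBrowser(zc, "_nmos-node._tcp.local.", NodeListener())
input("Browsing for IP video nodes; press Enter to stop.\n")
zc.close()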


Óscar, what has been your personal experience and what challenges have you faced in producing this event?

This was a complex final. On the one hand, the teams that played it are not among the best-known in Europe. That is why we had to make an extra effort to know how they play and, at the same time, study each of the footballers. The fans of both sides behaved in an exemplary way and did not stop cheering at any point. The stands were all covered in white (Eintracht Frankfurt) and blue (Glasgow Rangers), thus creating an image similar to that of a Euro or a World Cup match, where fans are a very important part of the show. Mediapro had two cameras continuously focusing on the fans. The result was very satisfactory, as it allowed us to convey, after each action, the joy or sadness of the respective fans. On the other hand, thanks to the work systems that we have in the Group -weekly, with the LaLiga matches and European competitions- we have our own staff in all areas of activity: production, technical, broadcasting, camera operators and replays. This gives us an advantage over other production systems, since on the one hand we are a very committed, talented group and on the other, we know each other perfectly since we work together every single week. This, in a match at this level, gives you a huge edge and maximizes the chances of success.

What can technology do to make the coverage of a UEFA final more efficient and less complex?

It can provide higher quality for the signals broadcast, lower delays, greater flexibility or, for example, minimization of possible failure points. Technology is everything; mastering it is the key for the show to be global. The breakthrough of computing technology - whether on-prem or in the cloud - allows the possibility that, while a signal is relayed from point A to point B, it can be converted through low-latency operations. This means that you can include, for example, graphics or audio to customize it. This sport attracts millions of people just 10 minutes before the match begins. Those same people leave two minutes after the referee blows the whistle to end the match. Technology can also make syncing between the audience's interest and resource provisioning increasingly efficient.

How can cloud-based infrastructure help cover this event?

Although it can still be considered that cloud technology is not mature enough, Mediapro believes it will help in the future. The cloud offers signal processing resources that are complementary to on-prem solutions. In addition, pay-per-use will make working in the cloud -in some instances- the most efficient solution, not only technically but also financially. The cloud is a technology that has come to serve the world of sport as well. At Mediapro we have many initiatives that will surprise viewers: from projects that include artificial intelligence, rendering of 3D environments and implementation of graphics and commentators in the local language, up to the creation of new virtual environments that would not have been possible without these new developments. We reach a point where we go beyond the boundaries of imagination, and the business models rule here.

In your opinion, what will be the evolution of broadcast technology and production models linked to these events?

I am convinced that the evolution will be associated with the improvement of the quality and efficiency of broadcasts. From the point of view of production, our company is always looking for new elements that will allow us to improve the audiovisual narration of a first-class show such as a European final.


PRODUCTION

© 2022 Dada ! Animation / Squarefish / Diabolo films / Virtual Journey / Seppia All rights reserved / Les Hoofs de TFOU © 2021 TF1



DADA!

Game engines, AI and VR in 3D Animation Dada! Animation is a company dedicated to the production of bold animation and unafraid to explore unknown terrain. This French animation studio has focused on bringing the promises of new technologies such as virtual reality, video game graphics engines or AI to traditional animation workflows. Quentin Auger, Head of Innovation of Dada! Animation, tells us that the way of working in international animation has been the same for 20 years, but it is starting to change now. The process is exciting and there is still a lot to explore, experiment and invent.


Dada! Animation has recently changed its identity, what is the reason for this change? Two years ago, seven people, including myself, re-founded an existing company. It was called Hue Dada! Productions. It’s a name we loved, very French and powerful, and the identity was already formed. So we decided to keep it. Today we decided to change our identity to get closer to our goal. We have kept Dada! but we have lost Hue because it is a French word that is difficult to pronounce. Our new identity is closer to what we really are, a mix between a technology lab and an animation studio specialized in CGI that produces content for all audiences.

Quentin Auger.

Has a change in workflows matched this change in identity?

Our DNA is to be versatile in workflows and to be able to accommodate any capacity. But the truth is that the animation industry hasn't changed much over 20 years. The co-founders of this company have that experience in the industry. We started in the 1990s and, although the technology has evolved and become more powerful, the ways of working have not changed over time. However, it is only now that they are beginning to change. We are experiencing a technological acceleration that is exploding traditional ways of doing things. It's really exciting. So many industries are colliding this time -like video games, architecture, design, manufacturing and, of course, VFX and animation- that it's really changing the landscape. In fact, what's going to happen is that we're going to change workflows from now on.

Two interesting questions arise from your answer. The first is in reference to the immobility of workflows and the second to the current evolution. Let's start at the beginning. Why haven't workflows changed in such a long time?

This work is artistic, but it has a large component of craft. You have an idea that you want to express, but the truth is that you depend entirely on the tools that are available to apply it. So the craft is constrained by technology, in our particular case.

In our industry everything depends on time and money. It's clear that big budgets are used to buy time. But the interesting thing about all this is that the more time there is, the closer the big animation studios get to artisan work. It's like haute couture. The best quality in a designer piece of clothing is that which is handmade, not mass-produced. Therefore, the bigger the studio and the more powerful the project, the more it is developed in this artisanal way or, as we say, pushing the pixel by hand. In smaller production companies with smaller budgets we can't afford this way of working. We try to find ways to automate processes, prepare for the future and rationalize resources. This is the main reason why the technology hasn't changed in all this time: by tending to do it by hand we don't need to change the tools. The tools were established 20 years ago and they were based in the architectural industry. They remain today. In fact, and curiously, in those days there were more tools and different ways of doing things than there are today. For example, between "Toy Story 1" and "Toy Story 2", Pixar had to change their pipelines completely because the tools that settled and became a standard were not the ones they used to make the first movie. They came from the mathematical models used by engineers, and the use of polygons ended up being imposed. In fact, I myself understand this change very well. I come from product engineering and in this industry polygons are forbidden because they are very imprecise. However, in animation the technique works because the result looks good and the technique is easier to apply. This was the model that ended up being established and why Pixar had to change its pipeline. It became an optimized industry and everyone was doing the same thing. Schools also emerged to help sediment this knowledge and these practices.

Dada Animation team.

© 2022 Dada ! Animation / Diabolo films - All rights reserved

And now, what is happening? Why is everything changing?

The main thing, I think, is that many different industries are sharing more and more common ground every day. In particular, there's another way of doing things coming out of the video game world. There is an essential tool, the graphics engine of the video game industry, that is changing things. What we hear most about in our field is Epic's Unreal Engine, but we also work a lot with Unity, especially when using Virtual Reality. Other tools have also been developed that, as I say, come from other industries. For example, we are relying on Adobe tools that allow us to animate 2D projects in real time. Or, on the other hand, virtual reality solutions that are also giving us the possibility of changing work models thanks to technological evolution and the collision of these worlds. My job as head of innovation at Dada! Animation is to detect, test and propose new ways of doing things. Also, part of my job is to team up with research centers to work on projects that make us improve or experiment with different tools. It's part of our company's R&D work.

We will soon go into all these new tools that are being tested, but to give some context: what are the established tools in the industry?

First of all, we all use a tool like Autodesk Maya; it is used in 95% of VFX and animation companies. Now a tool like Blender is appearing, but it is still the same kind of tool as Maya, only with a different user experience and business model: it is free and open source. About fifteen years ago a tool like Houdini came out. This is specialized in VFX.
34

All our workflows have always been based on these tools. You model and rig in the same way. I myself have been rigging for a long time and I can assure you that 20 years ago a standard was developed that is still valid today. To give you an example, 17 years ago I worked on an American show. Our workflow was based on these tools that I have mentioned. For the tedious and meticulous tasks of enveloping and rigging, a Maya tool was needed. I developed my own tool to help me handle that shortcoming and I shared it with a few colleagues. Well, it turns out that a few weeks ago, during the last edition of the Annecy Festival, a professional who was developing a new rigging tool told me that some rigging supervisors from bigger companies like Mikros Image or Superprod asked him to integrate the same features into his software. Many tools like that have naturally emerged to complement the deficit that the program had, but I never thought that mine would be so widespread and sadly still needed.

How has the world of video games interacted with the world of animation?

It happens the other way around too. They use some of the tools we use in animation because they have to build characters and environments that are beautiful. But they don't use the whole workflow because their limitations are different. We want to make it beautiful and control the art direction whatever the cost per frame. In games, beauty is also important. But the main concern is speed and frame rate as well as control. That goes to the detriment of artistic control. In our world, the real-time game engines, Unity or Unreal, that I was talking about earlier have been implemented. The thing is that these graphics engines are optimized to deliver a lot of images at the cost of limiting the number of polygons and the richness and artistic control. Now, thanks to technological evolution, these engines can offer both speed and artistic richness. That's why we are implementing them into our workflows. We get the same results much faster, and you can iterate a lot more.

How do you use virtual reality in your work?

We use virtual reality techniques as a tool to create content. VR is great for manipulating data in 3D. I always give the same example. If you wanted to model a piglet's tail with traditional techniques, even with the best tablet in the world, you would have to move in 2D interfaces; you couldn't do it easily. Instead, thanks to virtual reality tools you can model it in a single gesture. And the process goes from taking hours to a few minutes. There are tools that are very accessible and that are developed in Unity or Unreal Engine that work very well in this area: Oculus Quill, Medium, Tilt Brush or Gravity Sketch. They allow you to "draw" in 3D and export a file compatible with the traditional 3D workflow. We use this technology on top of the graphics engines.


© 2022 Virtual Journey – All rights reserved

Apart from being faster and having more processing power, they provide us with a very important capability in these environments: they are good for collaborative work. When you bring animators into virtual reality for scouting, the technology has to be able to accommodate multiple people in the environment. Video game graphics engines are multiplayer natively, so they allow for collaborative work. The truth is that, as we have seen, the boundaries between these technologies are blurring and tools from different industries —such as video games, animation, VFX or industrial design, but also web development— are increasingly coming together. But we have to be careful because the terrain is slippery and we all know that detail can be the problem. What remains to be done now is to test these tools in different workflows. This way we will know if we can implement them in our tasks or if they will just help us get through different tasks.

What is your workflow like and how did you introduce this tool into it?

The traditional way of working would start with a script, then a drawn storyboard and a 2D 'animatic', its animated version with template sounds. In parallel, we would design the environment and the characters. Sometimes we use the created environment and storyboard to finish this process. Then all the 3D assets are built and the next step is to do what we call the design. This is the translation of the 2D storyboard or animatic into a 3D scene. The result is a rough, unfinished 3D animation. Subsequently, we would develop all the stages of the animation and, simultaneously, the asset would be textured and shaded and, when the animation is finished, we would put in the light, shadows and all the makeup. We would render and composite it. In a separate process, but at the same time, all the sound composition would be done. Having tools like the virtual production ones provided by the videogame engine, what we do is create the environments for the storyboards and the "cameras". These "cameras" are virtualized viewing positions that allow us to capture the artist's vision. With these capabilities, the animator explores as he would on a movie set. The storyboarder just has to use his skills to compose whatever comes to mind.

This way of working is much better because the artist can perceive and feel the 3D environment, taking into account distances and volumes in a much better way, because of the immersive nature of the process. We have tested these techniques on the same show with the same team. The first season we did without VR and the second season we created with it. It was crazy, it was incredible what we gained. In fact, we found far fewer errors in the animation than in the traditional method. When you perceive the space immersively you can check everything in real size. You can see if the assets are out of proportion. It allows us to react to these faults much earlier than we would react in a normal situation. Having this kind of technology that allows real-time reaction has made a huge impact on our industry.

What projects are you working on with these techniques?

We are developing "Captain Tone-up". We are also in the process of commercialization of a preschool TV series called "The Nebulons". Another one I would like to highlight is "Mekka Nikki", an adaptation of the sci-fi comic book for teenagers. Finally, a project that we love very much and that I would also associate with this section is "French Patisserie", inspired by Gaël Clavière, the Prime Minister's chef; well, actually by the chef who has served several prime ministers. They are all 3D, sometimes with a very 2D-style look. The projects I mentioned have been mostly rendered in Unity, although we are doing some tests with Unreal. In addition to the fast rendering capabilities and the collaboration it allows, this tool is also capable of providing different formats for platforms outside of traditional broadcast, such as YouTube or TikTok.

On the other hand, we also do services for other people. For example, right now we are developing a VR experience called Lady Liberty. It's about the construction of the Statue of Liberty in Paris. The episode we narrate tells the story of the visit that the famous writer Victor Hugo made to Bartholdi's workshop. We created the design, modeling, animation of the characters, lighting and construction of the scenery.

Finally, I would like to highlight a project we are working on that allows us to explore the VFX volumetry of moving actors in virtual reality environments. This is a documentary on the Western Europeans of the 18th and 19th centuries who emigrated to different parts of the world. The name of the project is "They Were Millions". We have used Unreal Engine to quickly create large environments such as cities and also to capture the movement of the characters. The interesting thing about this project is that the final rendering is heavily modified after real-time rendering with Unreal Engine, and it is done by AI. We used AI in this project in two ways: the first to transfer the style of a specific artist to the images, and the other to add animation over actual portraits of that time, with deepfake-like tools.

What AI tool do you rely on to perform these tasks?

For the deepfake-type shots, we collaborate with a company that specializes in it; they do the first layer of work and then we modify what we need to bring it closer to our own style. We transfer the style with tools that were developed for this task. AI or Machine Learning tools are like algorithms dealing with blurry statistics, and you don't want to confuse them too much when you interact with them. We are learning how to handle the right data.

How and how much do the tools we have been talking about have to evolve to be more adapted to the animation industry?

The short answer is: a lot. Everything has yet to be invented. So a big part of my job is to get the message out that we can experiment together on methods that already exist. Industry also has a part to play in the evolution of these techniques. We, for example, try to be part of the whole process actively. We talk to tool developers to share information and impressions. For example, something we insist a lot on is the incorporation of animation and VFX standards into videogame engines.

The process is not only industrial, it is also cultural and educational. For example, French animation schools offer five-year training programs, but in one year all the technology has changed. We are part of a French producers' union task force (CPNEF/AV) that has conducted an audit on how to deal with technological evolution and the impact it has on schools. We are trying to help create programs that are more flexible and able to withstand the changes. We are also part of an institution —we French love to create institutions, especially cultural ones— that tries to collect best practices in cinema, the CST. It was created after WWII and just a few months ago they created a department on immersive technology and real-time technologies. We are part of that department.

It is only by exploring, experimenting and inventing that we will evolve the technology to the point of opening up new, unexplored markets.



OUTSIDE BROADCAST



GRAVITY MEDIA

Gravity Media is a technical and human services provider focused on broadcast and multimedia event production that operates worldwide. As part of our research into the current state of this technology, we approached this company to see for ourselves what their capabilities are, how they are adapting to new workflows and in which direction they will evolve in the race to adapt to challenges that will change this industry; challenges such as the cloud.


What resources does Gravity Media have for Outside Broadcast?

Our fleet of Gravity Media mobile units, outside broadcast and DSNG trucks serves a worldwide market. With vehicles based in Australia, the US, the UK, France and Qatar, our outside broadcast and DSNG trucks are designed and built taking into account all broadcast environments. Sirius is the latest addition to our fleet. This new custom-built outside broadcast vehicle is based in Paris, France and supports 18 cameras, with up to three vision operators, two replay operators, and an OB supervisor position. It can accommodate fourteen staff in high-tech, climate-controlled comfort.

Specifically, what technology have you implemented in this new vehicle?

The technology that can be found inside this latest addition to the fleet spans several areas. Regarding the video and audio grid, it has a 64x64 HD/3G video solution and a 256x256 audio solution, with audio embedding and de-embedding capabilities on all inputs/outputs (64 MADI). The mixer is a Sony MVS 8000G model with 51 inputs and 24 outputs, four MediaStore channels and eight DVEs.

As we said, there is space for 18 3G/HD cameras, but also for twelve camera chains on fiber or triaxial and six special cameras such as PTZ, HF or mini cameras. The audio console is a Calrec BRIO 96 with Hydra2 Hub and a Dante & MADI card. It has 36 faders, 96 channels, 156 DSP, 36 busses and four stereo or 5.1 mixes. A Behringer X32 is installed as a reinforcement console. The audio processing is done with a Yamaha RSIO solution with Dante / 64 MADI conversion. A MADI M1K2 grid with 1024 x 1024 channels, two Andiamo stage boxes over fibre, two more Calrec Hydra2 stage boxes and a double telephone insert have also been implemented.

The Sirius replay system is based on EVS technology and has capacity for eight channels. It also has other EVS solutions such as an XFile3 and an IPDirector. It also has twelve combined recorders in BMD HyperDeck SSD / XDCAM / AJA Ki Pro solutions and four H.264 recorders via USB. The truck's intercom system is the Riedel Artist model with 128-port capacity. There are twelve integrated panels, an Intercom Camera Prod + ENG, 32 4W external tielines / audio mixer and three duplex talkback radios with twelve walkie-talkies. On the other hand, there are three camera control positions equipped with Sony PVM technology, WFM Harris 3G, Mon Pix and Touchdown. In addition, there is also a position equipped for the manager. The audio monitoring is based on two Genelec 8020 loudspeakers, a 5.1 PMC listening system, RTW vumeters, Sony PVM17 devices, etc.

What is the status of Gravity Media's Outside Broadcast equipment?

Equipped with advanced HD and 4K technology, our units are crewed by highly experienced Gravity Media supervisors and engineers who are skilled in supporting a wide range of productions. Features range from the provision of 4K monitor walls, 4K/HD cameras, and multi-format audio-visual capabilities to van-based units fully equipped to transport live HD footage from hard-to-get-to locations.

What is the latest renovation process these infrastructures have undergone?

We partner with specialized body builders to oversee the design and construction of Gravity Media's custom outside broadcast trucks. We can design, build, integrate and finish any type of vehicle or mobile unit, from rigid and expandable trucks, to remote and "in house" production vehicles and small camera and capsule units. For every custom OB truck build, we offer our broadcast systems expertise, from planning and project management to design, engineering, installation, commissioning, training and aftercare.

From the capture to the sending of signals, what technology and what brands can we find in your most outstanding OB vehicles?

We always aim to align ourselves with the leading manufacturers in the industry – working closely with them to fully understand their roadmap – and that's how we guarantee that we bring the latest innovation to our clients.

What is the current use made of this type of equipment?

Workflows are evolving all the time – there is constant change in this space, in line with the shift to remote production and to the UHD format. We design our OB vehicles with our clients' future-proofed needs in mind – to deliver the best experience for our clients, always ensuring we stay ahead of the tech curve.

What are the maximum resolution capabilities of your OB Vehicles?

All our flypacks and OB units are UHD / HDR capable and incorporate the latest IP tech.


In the age of cloud and remote production, what does the future hold for these personnel?

It is about where you allocate people and what skill sets are required – a shift in talent needs – but we have always adopted a strong policy of training and development, building the next generation of talent. We regularly hold proofs of concept – for example the recent one with AWS – where we invite people to collaborate and bring everyone together to make decisions around how to best use cloud-based tech.

Can you tell us what that proof of concept consisted of and what conclusions you drew?

At that event, together with ATP Media, the broadcast and media arm of the ATP Tour, and Amazon Web Services, we collaborated with some of the industry's leading vendors to conduct a proof of concept (PoC) of large-scale virtualized live streaming production, based on AWS, during the first four days of the Rolex Paris Masters (an ATP Masters 1000 event). The manufacturers and software developers involved were Grass Valley, Vizrt, Simplylive, Sienna ND, Deltatre, Grabyo and MAVIS. The goal of the trial was to gain first-hand experience of the capabilities and benefits of AWS-based virtualized live streaming production in a real-world application.


From the main court in Paris, twelve live camera feeds, with audio, and two Hawk-Eye ball tracking streams were sent over AWS Elemental MediaConnect fiber. These were delivered to a dedicated virtual private cloud (VPC) in the Western Europe region. Vendors deployed their solutions in instances of this VPC. In addition to running live matches, with graphics and replays, an instance of Adobe Premiere Pro was running in the VPC, controlled via a NICE DCV desktop client. Editors were able to access and edit content seamlessly and in real time to share with live production.

A twin of ATP Media's main production facility in Bercy, Paris, was set up at Gravity Media's Farnborough facility. There, a live broadcast was produced and key aspects were assessed, such as live video mixing, including the use of multiple MEs, transitions, DVEs, keyers and physical control surfaces; ingest and recording capabilities; slow motion replays; data-driven graphics; the ability to generate remote multiviewers and the idea of sending signals; audio mixing; and fast editing. All cloud-based.

In addition to working in a stand-alone configuration, testing included various combinations of products from different vendors, so that interoperability could be evaluated. Gravity's Farnborough facility provided an additional aspect of discovery, as the industrial unit was not "on-net" in terms of being connected to ATP Media's global fiber network, so the two PCRs that were built had to run over a standard enterprise-grade Internet line.

Through this opportunity we discovered the reality of using AWS cloud services in a live sports production. Today, we continue to work closely with AWS and manufacturers to further diversify our offering to incorporate more off-prem solutions. We want to make the end-to-end virtualized solution a reality.
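To make the contribution step described above a little more concrete, the following is a rough, hypothetical sketch of how one camera feed could be pushed into AWS Elemental MediaConnect using boto3, the AWS SDK for Python; the flow name, region, CIDR and port are placeholders, and this is not the configuration used in the PoC.

import boto3

mc = boto3.client("mediaconnect", region_name="eu-west-1")

flow = mc.create_flow(
    Name="paris-court-cam-01",
    Source={
        "Name": "ob-encoder-feed",
        "Protocol": "rtp-fec",               # encoder pushes RTP with forward error correction
        "WhitelistCidr": "203.0.113.10/32",  # only the venue encoder may send to this flow
        "IngestPort": 5000,
    },
)

flow_arn = flow["Flow"]["FlowArn"]
mc.start_flow(FlowArn=flow_arn)  # the flow starts listening for the contribution feed
print("Flow ARN:", flow_arn)
print("Ingest endpoint:", flow["Flow"]["Source"].get("IngestIp"))

Downstream, such a flow would typically feed outputs or entitlements toward the production VPC, where the vendors' virtualized tools consume the streams.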

What will the OB Vehicles of the future look like and how will they be used?

Gravity Media is currently building an exciting brand-new truck in late 2022 which will be completely sustainable. We look forward to bringing an industry first to the broadcast industry.



TVN LIVE PRODUCTION


TVN Live Production always wants to ensure the best quality and the highest security for its customers. This is one of the key reasons why they continue to rely on SDI technologies. The flexibility and resource sharing that IP promises take a back seat when it comes to guaranteeing the signal to the client, which must arrive in the best possible way. The industry, despite all the technological advances and evolutions, will be able to empathize with this feeling: content is king and, by all means, it must get where it is expected to go. That is why the best conditions for these vehicles are necessary, because seven years from now, they will still represent the highest guarantee for quality content.


What resources does TVN Live Production have for Outside Broadcast? TVN Live Production operates six OB vans plus six support/equipment trucks. Four of the six OB vans are UHD/HDR-capable. Our fleet also contains

ten SNG trucks and one flightpack (a complete high-performance production unit in flight cases). What is the status of TVN Live Production's Outside Broadcast equipment? Thanks to regular investments, the equipment is always up to date. We currently have over sixty Sony UHD cameras of the latest generation, each equipped with a Canon UHD lens; 36 EVS XT-VIA servers; Dolby Atmos sound control rooms in two OB vans, with the additional option of generating Dolby Atmos remotely in Hanover. One thing that makes us proud is that our TVN-UE6 unit has been the most modern and powerful OB van in Europe for more than two years. What is the latest renovation process these infrastructures have undergone? The most elaborate project was to replace our OB van TVN-UE2 with the TVN-UE6.

Our camera and server inventory underwent a major renewal. Last but not least, we upgraded our OB van TVN-UE3 to UHD/HDR. Which OB units have participated in the various UEFA competition finals held this year? Our technical infrastructure has been deployed in several European capitals to cover the recent soccer finals. Our units TVN-UE6, TVN-UE4 and TVN-UE3 covered the Champions League final in Paris on May 28. Simultaneously, on the same day, the DFB Pokal women's final was held in Cologne; that match was covered by our TVN-UE1 unit. A few days earlier, on May 21, we covered the DFB Pokal men's cup final in Berlin. Also with UEFA, our TVN-UE4 unit covered the Europa League final in Seville and, finally, on May 25, our TVN-UE2 and SNG A+B trucks produced the UEFA Conference League final. Immediately afterwards, TVN started Europe-wide production for the UEFA Nations League 2022/23.



From pickup to the sending of signals, what technology and what brands can we find in the most prominent OB vehicles that have been involved in the UEFA finals? We rely on Sony 4300 and 5500 cameras; Canon UHD lenses; Lawo mc² mixers; EVS XT-VIA servers; Leader measuring devices; Hitomi measuring devices for synchronicity; Sony XVS vision mixers; Miranda routers; Sennheiser microphones; Guntermann & Drunck KVM; Riedel Artist and Bolero; Lynx Titan converters; and VSM Studio, among other technology. What are the maximum resolution capabilities of your OB vehicles? UHD/HDR. Where are your units with respect to IP technology? A lot of IP technology is now standard (KVM Centre, etc.). For the original signal routing, however, we continue to rely on the proven SDI format – for quality and security reasons.

How have you approached this technology migration, what challenges have you encountered and what solutions have you deployed to overcome them? We focus on the needs of our customers. The major priorities are transmission reliability, fast problem solving and high quality. The technology we use to achieve this is of secondary importance to our clients. We still see great advantages in SDI, especially with regard to security and fast problem solving. Due to the increased availability of 12G-capable equipment, the advantages of IP technology in terms of cabling and flexibility have become smaller again. How have you adapted the workspaces and technology in your OB Vehicles to the demands of the pandemic? From the very beginning, we worked with our clients to develop pandemic-related solutions for safe productions.

TVN has always attached importance to comfortable working conditions for our clients and staff, which is why our OB vans have always been roomier than average. The TVN-UE5 and especially the TVN-UE6 continue to set standards in this respect. It was therefore quickly possible to implement distancing rules and other protective measures, such as partition walls, within the OB vans. Furthermore, we equipped all OB vans with additional medical air purifiers and reworked workflows with remote solutions in mind. Have the new COVID-19 related features of these vehicles streamlined workflows or have they created headaches? Are they going to stay or are things going to go back to normal? The Corona-related changes were, of course, initially additional framework conditions that had to be taken into account and therefore further complicated the


work processes. However, this also gave rise to new ideas that provided important impetus for positive change. In general, it has to be said that the changes in production methods in Germany have remained manageable thanks to hygiene concepts that worked quickly. In other European countries we see much stronger changes, some of which will certainly endure, as they have proven to make sense.


In the age of cloud and remote production, what does the future hold for these OB vehicles? Premium productions will continue to be produced with OB vans for the foreseeable future (the coming five to seven years). Flexibility and security on location are the decisive aspects here. TVN is very strong in this segment, especially in football. We are therefore planning to build another large OB van next year.

Nevertheless, remote productions (which at present everyone still defines differently) offer many advantages for small and medium-sized productions and are an enrichment for our industry, with many options. So you always have to look at the respective need in order to provide the optimum solution for the production and the customer. How will these expensive and voluminous units adapt to a future where



software located in data centers with access through high-capacity networks, such as 5G or fiber, will predominate? Connectivity is certainly the magic word here. You have to remain compatible within the media ecosystem and add new technologies and workflows where they make sense.

What will the OB Vehicles of the future look like and how will they be used? As already indicated above, in the coming years we still see a market for large OB vans, as there are special requirements that cannot necessarily be covered efficiently and safely by other solutions. It is imperative to keep an eye on technological developments and ensure that OB vans also function as part of an overall system. As I said before: connectivity and flexibility are and will remain absolute key features, along with reliability and quality.



TECHNOLOGY

Live production in UHD

A lot has been discussed here about the technical implications of using new formats such as UHD in our live productions. And it is clear that this is one of the aspects that changes the most and where we have the most to learn. However, technique is nothing if it does not tell a story. Let's see what implications 4K and 8K, UHD, WCG and NGA can have, not in terms of technique, but in terms of telling stories, which is what matters. By Yeray Alfageme, Business Development Manager at Optiva Media, an EPAM company


More definition, more reality

Let's start with definition. We already know -because we have covered it at length- that UHD is not just 4K, but much more. However, definition is the aspect of these new formats that is easiest to understand and manage, both for professionals and for consumers. For this reason, increasing definition has implications on what we see and how we see it, beyond simply seeing it better.

And more definition does not always imply better stories. It is necessary to remember that sometimes we want to show reality, as is the case in news or sports, but sometimes we want to tell a story, even in a live broadcast. That is why we must take into account the new tool we have at hand.

If we increase the definition and allow more details to be seen, we may be showing things that were not seen before and that we did not want to be seen. Perhaps we did not look at them because the technique did not allow us to show them, so the technical limitations were a help, but now we have to be careful. From more elaborate makeup to more refined sets, shots or movements are now necessary, as we approach -or perhaps exceed, in the case of 8K- the definition that is actually perceptible by the human eye.

The main implication of increased definition lies in the composition of the shot and in camera movements. It is not necessary -indeed, it should be avoided- to make camera and zoom movements as abrupt as we used to in HD. With such high definition, viewers cannot take in everything we show them, and it can even become confusing. It is necessary to "move through the frame" so that everything can be seen. We must allow viewers a prudent amount of time to walk through the image and look at what we want them to look at, not force them to do so.


The second consequence: depth of field

Very close to definition comes depth of field, the space between the first and the last sharp, in-focus object. Because with greater definition, an increase in depth of field is implicit. This goes against a trend that has been going on for some time now, which encourages us -even the smartphones in our pockets offer tricks for this- to decrease the depth of field in our shots. But what is the goal of decreasing depth of field? In addition to being something fashionable and more or less transient, it helps to focus attention on a point within the image; it provides more drama to the shot and even brings viewers closer to the action. There are two ways to decrease the depth of field: increase the focal length -zoom in-, with which we modify the frame itself, or open up the diaphragm, with which we modify the exposure and brightness of our scene. Both aspects


have implications for the composition of the shot that cannot be ignored. We will always have to strike a balance between achieving the desired depth of field and showing what the new definition allows us to show. Later we will delve into the implications of using the same shots in dual HD and UHD productions -which at first glance seem considerable, but in practice are not so many- in which depth of field has a big part to play.
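As a reference point -this approximation is standard optics, not something from the article- the usual first-order expression for depth of field makes the two levers explicit:

\[ \mathrm{DoF} \approx \frac{2\,N\,c\,s^{2}}{f^{2}} \]

where N is the f-number, c the acceptable circle of confusion, s the subject distance and f the focal length. Opening the iris (smaller N) shrinks the in-focus zone proportionally, while zooming in (larger f) shrinks it with the square of the focal length; it holds as long as the subject is much closer than the hyperfocal distance.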

HDR: seeing more with the same number of pixels

It has always been said -and I agree- that HDR is the "wow effect" of UHD. And it has been proven, even in studies carried out by the BBC and the EBU, that viewers prefer an HD-HDR image over the same 4K-SDR image. There must be a good reason for this. Because the main difference between what our eyes see when looking at the real scene and what we see through a screen is not so much definition but the dynamic range, the luminosity, that we can appreciate.

Until today, if we did a high-contrast shot -the most classic example being a football match in sun and shade in the middle of the afternoon- we would have to choose between showing the sun or the shade, in addition to conveniently hydrating our camera controls for the constant effort and continuous adjustment they had to make on the exposure of all our cameras. Never again with HDR. Whether it is HLG -more widespread in live productions due to its backwards compatibility with SDR and its static metadata, which make interchange easier- or Dolby Vision or any other PQ curve with dynamic metadata -much more capable but also more complex- every HDR format allows us to show an image much closer to the one captured by the human eye, and that shows much more than an increase in definition. Fact.

This drives fiction creatives mad, as they used to rely on shadows, and even lights, to hide what they did not want to show in a scene. It also has implications for live shows, as it makes it harder to concentrate the focus of the image on a single point. Showing more, in a similar way to what happens with definition, forces viewers to assimilate more information
and therefore more time and more effort are needed; we must allow for this. In the beginning, when we did not understand very well what we could do with HDR, its effect was exaggerated, even exceeding the eye's dynamic range; that is not the point either. At the other extreme were those who gave up part of the HDR effect because they felt more comfortable controlling a "traditional" SDR image. Fortunately, we are learning fast. Today we have learned to use HDR in a way that offers viewers a much more natural image, very similar to what is seen in the real world, helping them to focus on the desired point of the scene. This is not achieved with luminance control alone, but with a combination of definition, depth of field and dynamic range. The latter is now under control, although there is still a long way to go.
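For readers who want the numbers behind HLG's SDR-friendly behaviour, the ITU-R BT.2100 reference OETF (quoted here for context, not part of the original article) maps normalised scene light E in [0, 1] to the signal E' as:

\[ E' = \begin{cases} \sqrt{3E} & 0 \le E \le \tfrac{1}{12} \\ a\,\ln(12E - b) + c & \tfrac{1}{12} < E \le 1 \end{cases} \]

with a = 0.17883277, b = 1 - 4a and c = 0.5 - a ln(4a). The square-root segment is what keeps the lower part of the signal looking sensible on an SDR display, while the logarithmic segment carries the extra highlights.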

WCG: yes, color also counts

If definition is highly praised in UHD, color has been mostly neglected. Yes, color matters too. Because the 709 color space that we had before in SDR, defined by the number of colors that a traditional SDR monitor could display -which is not bad- has not just been left behind, it has been blown to bits. From a color space that allowed us to show only 35.9% of the color spectrum visible to the human eye, such as 709, we have moved on to one that allows us to show no less than 75.8% of all the colors of our visible world, 2020. And this goes beyond merely doubling the number of colors, as our palette now moves in a space that is three times larger, so, again, we must make choices that we were not previously making. From saturation or hue in our scenes to white balance, although the D65 white point remains in both color


spaces, these are components of our image where we now have much more room for maneuver and also less room for error. With a limited color space, viewers found it understandable that colors were not exactly the same and even tolerated certain changes from one camera to another. But now these mistakes can be much greater and also less acceptable, since we have already shown them how well we can perform. If you give me lobster, I won't like pea soup anymore.

Conclusions, if any

It is hard to summarize in a couple of paragraphs what the change of format from HD to UHD implies artistically for our live broadcasts, but if you skim over these lines again, you will spot a common element: we have many more tools at our fingertips now. And having more tools implies more responsibility, having to make more decisions and investing more effort. Although perhaps not everyone is willing to do so. Many of us used to complain that we couldn't see the shadows before or that the colors weren't "natural", whatever that means. Now that those technical limitations have been removed, things have become more complex. And no one said the future would be easier, they just said that it would be better. And UHD is better than HD for our live productions, no doubt, but there are many more choices to make, more information to show or hide, and that takes effort. For years and years, we have worked to achieve great results despite the limitations of technology. Well, now those limitations have disappeared. So it is not simply a matter of adapting: we must learn again. One thing is for sure: viewers do appreciate innovation and evolution, as long as we offer them the right implementation. The "HDR effect" is a bad example of this.



OPINION

Delivering high resolution live television
By Karl So, Caton Technology

Broadcasters in many parts of the world are now promoting 4k Ultra HD. They use it as a market advantage, a differentiator which audiences recognise and which boosts viewing figures and potentially revenue. In Japan and China, 8k Ultra HD is now on air. Receivers are readily available across the Far East, at price points equivalent to around US$5000. According to researchers Strategy Analytics, one million 8k televisions were sold globally in 2021. The forecasters predict that about 72 million households worldwide will have an 8k receiver in 2025, although to put this in context there are 1.7 billion homes with television in the world. Ultra HD – 4k and particularly 8k – is seen as a premium service, something that is reserved


for key content where it will make the most impact. Japanese broadcaster NHK produced 200 hours of 8k video during the Summer Olympics in Tokyo held in 2021, and a Chinese broadcaster went live with 8k for the Beijing Winter Olympics earlier this year. The FIFA World Cup, to be held in Qatar late in 2022, will be covered in 4k throughout, with some additional 8k shoots for key games. The BBC coverage of this year’s Glastonbury Music Festival was in 4k Ultra HD from the Pyramid Stage. All these and more are high profile, live events, where absolute security of delivery and superior quality is essential. The challenge for engineers is to get the signals from the venue to the rights-holding broadcasters, wherever they are.

With the best HEVC encoding available, contribution quality 4k demands a minimum of 50 Mb/s. Add in ancillary services and overheads and you need 60 – 70 Mb/s to be sure. A leased line with that capacity would probably cost around $100k to establish plus a hefty monthly rental. And you would probably want two geographically diverse routes for redundancy, doubling the cost. Satellite capacity, if available, is priced at around $1500 per megabit per month, so again in the $90 – 100k region. But satellite capacity is



constrained, not least because the spectrum for C-band satellites is being taken over by 5G (and future 6G) cellular services.
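Working through the figures above (the article's own numbers):

\[ 60\ \text{Mb/s} \times 1500\ \$/(\text{Mb/s}\cdot\text{month}) \approx \$90\text{k per month} \]

rising towards \$100k at the upper end of the 60–70 Mb/s range, while a pair of geographically diverse leased lines is roughly 2 × \$100k just to establish, before the monthly rentals.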

These numbers are for 4k Ultra HD. The situation for 8k is more complex because there is no finalised and agreed codec. VVC/H.266 is currently in development, but it could be five years before there are production chips to do it. At the moment there are a handful of hand-crafted coders and decoders in the world: the services at the Beijing Olympics used these rare resources at a reported 85 Mb/s.


For practical 8k contribution circuits today, the solution is to split the signal into four quadrants and treat it as four 4k streams. We need H.266 to be standardised as soon as possible!
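As a rough, offline illustration of that quadrant approach -not how a live contribution encoder is built; the file names and codec choice are placeholders and ffmpeg must be installed- the split itself is just four crops of the same frame:

    import subprocess

    # Cut an 8K master into four 4K quadrants so each tile fits an existing 4K encoder.
    SRC = "8k_master.mov"
    QUADRANTS = {"tl": ("0", "0"), "tr": ("iw/2", "0"),
                 "bl": ("0", "ih/2"), "br": ("iw/2", "ih/2")}

    for name, (x, y) in QUADRANTS.items():
        subprocess.run(
            ["ffmpeg", "-y", "-i", SRC,
             "-vf", f"crop=iw/2:ih/2:{x}:{y}",   # width, height, x offset, y offset
             "-c:v", "libx265", f"quadrant_{name}.mp4"],
            check=True,
        )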

What is clear from this analysis is that delivering Ultra HD on contribution-quality circuits is a very expensive business. The pressure is on to find a way to cut costs as live productions become a routine matter.

We see the solution as a format-agnostic transport layer built on the public internet, for at least part of the circuit. But, as we are all aware, the internet is far from deterministic, so any scheme has to build in extensive protections to deliver the security broadcasters expect. This is why we developed the Caton Transport Protocols (CTP). These implement more than 30 algorithms, including machine and deep learning approaches, along with our unique dynamic forward error correction. Security is guaranteed through AES-256 encryption. CTP is designed for high quality, low latency contribution video at scale, over any IP network, including the public internet as needed.
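Caton does not publish CTP's internals, so the following is only a toy illustration of what packet-level forward error correction means in general -a single XOR parity packet per group, nothing like the thirty-plus adaptive algorithms described above:

    # Toy FEC: one XOR parity packet per group of equal-sized media packets,
    # enough to rebuild any single packet lost from that group.

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def make_parity(packets: list) -> bytes:
        parity = bytes(len(packets[0]))
        for p in packets:
            parity = xor_bytes(parity, p)
        return parity

    def recover(received: list, parity: bytes) -> list:
        # 'received' holds the group in order, with None for a lost packet.
        missing = [i for i, p in enumerate(received) if p is None]
        if len(missing) > 1:
            raise ValueError("XOR parity can only repair one loss per group")
        if missing:
            rebuilt = parity
            for p in received:
                if p is not None:
                    rebuilt = xor_bytes(rebuilt, p)
            received[missing[0]] = rebuilt
        return received

    group = [b"pkt0....", b"pkt1....", b"pkt2...."]
    parity = make_parity(group)
    assert recover([group[0], None, group[2]], parity)[1] == group[1]

Real contribution protocols add interleaving, retransmission and adaptive parity ratios on top of this basic idea.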



To this we add the Caton Video Platform (CVP), an ultra-low latency, high-availability dedicated interconnectivity network with points of presence in more than 60 countries worldwide. Together, we package the video stream, in any format (including NDI for dynamic production), into CTP, route it over the public internet or locally available circuits to the nearest point of presence, then deliver it to one or many remote points and, if necessary, over local circuits again to the destination. Users have access to management tools to establish and monitor the connection, and routinely expect up to five nines stream availability, secured end to end.

This is provided at significant cost savings over "traditional" routing – 50% or more – and is available as required, rather than subject to expensive minimum rentals. If you only need the circuit for three hours, why pay for a month? Broadcasters around the world now see 4k Ultra HD over CTP and CVP as routine. We first demonstrated 8k more than two years ago, and we have delivered circuits, to our high standards, as our users have requested them. Broadcasters want to deliver the best quality to their audiences, and to use new offerings like Ultra HD as a means of standing out in competitive markets. Budgets are always tight, while the prospect of black screens is what keeps chief engineers awake at night. CTP and CVP provide the SLA, the simplicity of use and the cost-effective solutions which allow broadcasters and production companies to unlock the full potential of IP connectivity, anywhere in the world.



STORAGE

How to manage million-dollar content in the cloud


At TM Broadcast International we have been following the evolution of storage and content management systems in professional broadcast and VOD environments. Like everything else in this industry, content management systems are currently leaving behind classic hardware-based local infrastructures and evolving towards technological models based on the cloud and software services. To learn about this development first hand we contacted VIDA, a company that offers content distribution and storage services on a cloud infrastructure. Its parent company is Visual Data, a company with many years of experience in professional media environments. Our contact was Symon Roue, Managing Director of VIDA. Here you will find all the details about this tool, as well as the reasons why content management is evolving towards the cloud.


What is VIDA? It is a cloud-native SaaS platform delivering efficiencies for managing, migrating, distributing and monetizing library content. It works as a content operating system. The solution runs in the cloud across multiple cloud providers and uses the latest tools and serverless multi-cloud infrastructure. Serverless basically means that our application treats the cloud infrastructure as if it were one huge computer. It abstracts the requirements away from you. Normally, as your system grows, you have to start up your server, put the operating system on it, load some compute, etc. You're always building servers or shrinking servers, starting up more storage and then shrinking it. This system allows you to scale storage and compute on a totally dynamic basis as you're using it. It would be very easy for us to go from 1,000 users to 10,000

or 100,000. The system would not be subject to any restrictions on the power of the service that needs to be commissioned.

Symon Roue, Managing Director of VIDA.

VIDA streamlines media workflows and gives users control over their content in a secure, agile and central environment. The Software as a Service (SaaS) model flexes to fit business needs, offering a suite of features and capabilities including natural language-based clip searching and viewing, unlimited users, as-is library migration, direct distribution to more than 500 partners and a "shopping cart" functionality to facilitate secure content purchases.

How was VIDA born? We focus on media and entertainment because this is the industry we know inside-out. The VIDA team is entirely focused on developing this content operating system. However, we are part of a parent company called Visual Data.



This makes us more mature than a startup and gives us a more solid base to start from. For more than 30 years, Visual Data has been managing content libraries, distributing TV shows and movies, and localizing them for broadcasters, in home entertainment, in movie theaters, and most recently in streamers. About two years ago, we decided to

create the VIDA content operating system, mainly because we couldn't find any other software on the market that matched our vision of where the industry was going. Technically speaking, VIDA has been in development for about eighteen months and was officially launched at the NAB Show in April. In addition, we demonstrated it at The Media Production & Technology Show in

Olympia, London, during the month of May, and plan to showcase it at IBC this September. We already have some amazing clients onboarded, including BBC Motion Gallery. VIDA is used to house their library of digitized television programs. If you make a documentary about Churchill, for instance, you can search the BBC library, find the clips and place an order to receive those materials. That was our first launch customer. Given the pandemic, what options does your company bring to the paradigm shift in television workflows? We had our busiest time. During the pandemic, the streamers —Disney, Netflix and Amazon— needed to get content onto their platforms at an accelerated rate because more and more people were subscribing. At the same time, production itself was at a standstill. That caused a lot of customers to go back to their catalogues looking for titles they had not yet released on their platforms. We had one of the busiest times in


our company, working with catalogue, processing it, localizing it and delivering it.

Deep search across multiple libraries.

VIDA can be deployed on Amazon, right? Can it be deployed on other clouds? Computing is predominantly on Amazon, but we use storage from various cloud systems for VIDA. These are different customers. This is necessary because there are different economics with different cloud providers, and some customers already have a preferred relationship with certain cloud providers.

What are the main reasons for having a preferred cloud provider? It depends on the type of library you have. If your library is large and very slow moving, you might use technology like Amazon Deep Glacier for your content storage because it's much more economical, but the restoration costs are higher and it takes longer. Other customers have libraries that are very transactional. They never really know what they are going to sell next. Speed of execution is really important, so they need to have their content in hot storage. The content is immediately available. It's always online. When we onboard a customer, we profile their preferred cloud storage, and then VIDA connects to those cloud providers.
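A minimal sketch of that trade-off as it looks in code -generic boto3 against Amazon S3, not VIDA's implementation; the bucket, key and per-title flag are invented:

    import boto3

    s3 = boto3.client("s3")

    def archive_master(bucket: str, key: str, local_path: str, transactional: bool) -> None:
        # Hot, always-online storage for fast-moving catalogues; Glacier Deep
        # Archive for large, slow-moving libraries (cheaper, but slow restores).
        storage_class = "STANDARD" if transactional else "DEEP_ARCHIVE"
        s3.upload_file(local_path, bucket, key, ExtraArgs={"StorageClass": storage_class})

    archive_master("example-library", "masters/title_0001.mxf",
                   "/mnt/ingest/title_0001.mxf", transactional=False)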

How do you integrate with the different cloud providers? If your library is stored in Azure Storage, for example, we would connect Azure to the VIDA software with an index. We use Dolby Hybrik to parse the assets and technical metadata, perform automated QA and create the proxy files.

What services does VIDA offer? The first thing it offers is the management of your content in the cloud. It offers unlimited users and it allows for a lot of collaboration between teams. Traditionally, for a content distributor, the marketing team might have a few systems to manage marketing collateral, package shots, movie
information, banners, and artwork. The production department manages, among other things, scripts, cue sheets and metadata about the program. Then there is the materials team, the actual master files, localized dubbing versions, subtitles, etc. VIDA allows all these teams to work together on one platform where they can collaborate, view each other’s materials, share them and deliver them. The interface is built in a familiar format, similar to Dropbox, but the system is a custom built platform for media and entertainment industry files. Capable of handling anything from 4K HDR files to Word documents, all assets are easily managed and shared between people. We also have a library system that allows the content owner to manage their collaborators and suppliers. They can have one company do the dubbing, another do the subtitling, and so on. From the system, the VIDA user can invite a supplier to upload an asset to the

platform. All this also means it's easier to monitor your contributors. When assets come into the system they go through a series of quality control processes that are automated, but we also complement that with the expertise of our team, who supervise the work. Then, when they have completed their quality control process, they can store it in the system. The material delivery system is based on a "shopping cart" solution. If, for example, a user needs access to scripts, cue sheets or film images, they can select them in the interface as if they were shopping online. The user presses the "buy" button and the files are delivered to the recipients that have been added to the system. We also offer the tools to make the archive work like a robotic e-commerce store. For example, if a user wants to send an asset to a destination -let's say a German broadcaster- VIDA knows that it needs a specific video format and particular dubbing tracks

and subtitles. VIDA has that capability because it is onboarded with the broadcaster. You don't have to search for the files individually, build a package and send it to the destination you need. And let's not forget the main thing: once the content is localized, the VIDA platform allows you to transcode, transcribe, translate, deliver, share, collaborate and all those other things that the cloud gives us. What are the advantages of cloud versus on-premise installations? I would say speed of execution. You can do huge jobs or small jobs in parallel at the same time, whereas on-premise you have a certain bandwidth, and after that everything becomes sequential. I think that's a key aspect. Human resources are also more optimized with a cloud infrastructure. In an on-premise installation, you always need to have extra staff to meet any need. However, cloud


infrastructures do not need to have multiplied human resources. Not to mention downtime. In local installations, you sometimes need to stop the system to update and correct certain things. They also have a certain risk of crashing. The cloud is operational 99.99999% of the time.
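To put such availability figures into perspective (a standard back-of-the-envelope conversion, not a number quoted by VIDA):

\[ (1 - 0.99999) \times 365 \times 24 \times 60 \approx 5.3\ \text{minutes of downtime per year (five nines)} \]
\[ (1 - 0.9999999) \times 365 \times 24 \times 3600 \approx 3\ \text{seconds per year (seven nines)} \]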

This operability also means that, in a traditional set-up, if a sales manager calls late on a Friday afternoon and says that they have to send content to a customer who has just closed the deal, this task will not be completed until the following Monday. By contrast, in the cloud this task could be accomplished instantly because that customer, with a system like VIDA, would be able to access the content themselves. It allows working 24 hours a day.

How does the business model work? It is a subscription model. Using the cloud, of course, has its costs. Many years ago we stored very large video and film libraries in buildings. Then we had to pay rent and also climate control. Back then, it was associated with costs because you had to take care of it. Now it's the same thing, it also has costs associated with it. You simply shouldn't put your archive in an old garage with a leaky roof.

There is always the possibility of creating your own data center, though. You need a robust LTO system. It has to be dual-powered because you always have to be able to get the content at any time within a short period. You've got to have a team of technicians. You've got to have air conditioning. You've got to have resilience in your infrastructure, and you've got to have a very powerful Internet connection. All this has a cost associated with it.

The same goes for the cloud. Microsoft, Google, AWS, they all have their own data centers. But there's another consideration to keep in mind: they run the data centers much more efficiently and in a more environmentally friendly way than we ever could. I think there's always a cost associated with managing the value of intellectual property. It's worth billions, so the cost of storing it securely is a big part of it. You can do this in huge and very expensive on-premise facilities, or you can rely on technologies like the cloud that offshore and facilitate this task in a very secure way. For example, at VIDA, we always make three copies of every piece of content. This is an important thing to keep in mind.

Is there a possibility to integrate the VIDA system with a cloud editing program? Yes. Although we have already done proofs of concept relating to this, we have not yet implemented it for any clients. We can integrate the Adobe Panel into VIDA, for example.

Do you have plans to integrate more of these tools into the platform? At the moment, we are mainly focused on TV and film distribution. When we get more into TV production, we will be more into integrations with other systems, but for now we want to remain focused on our clients' core needs.

How do you secure your content and service? We have several layers of security. Our multi-cloud storage means content has an added layer of backup and security. We also use Office 365 and Google single sign-on, which protects access to the platform. Users are invited into the platform and then required to identify themselves through two-factor authentication. Users can be given specific types of access within the system to limit content exposure.

Cart checkout and delivery.



CASE STUDY

Clip Factory Pro for editing, clipping, and exporting content to the new media platforms for RTV

Clip Factory PRO ingests RTV channels 24x7 and provides an advanced workflow to simplify and speed up content re-purposing for the web

Background

Broadcasters and content owners need to publish their content quickly to their Web, OTT, VOD, and social media platforms like Facebook, YouTube, Twitter, etc. Some of these pieces usually include VOD content, programs' highlights, etc. The solution has to be user-friendly (and not only for professional video editors) and easily accessible, especially after COVID, when most people started working from home.

RTV inquiry

RTV Slovenija produces programs for three national TV channels, two regional TV channels, two TV channels for the Italian and Hungarian national communities, three national radio channels, two regional radio programs, and a radio program for foreign citizens. Therefore, RTV was looking for a solution to easily select News, Weather, and Sports clips, add metadata and quickly publish any relevant content online. Moreover, RTV aims to publish the content online, on their VOD platform, and to be the first to publish it.

Actus Digital Clip Factory Solution: not only compliance logging

Actus Digital developed its Clip Factory solution to simplify and speed up the process of exporting clips to the web and to allow not only professional video editors to create clips. Clip Factory can be a standalone solution or an add-on to the Actus compliance logging solution. It enables broadcasters to have one solution for compliance recording, quality monitoring, alerting, and clipping requirements.

RTV has deployed the company’s intelligent monitoring platform for tasks beyond traditional compliance and is using the Actus platform to simplify the creation, editing, and publishing of media clips (Clip Factory PRO). The Actus platform eliminates the need for professional video editors or old legacy

editing systems, reducing costs and complexity for RTV. Clip Factory PRO enables RTV to quickly re-purpose content for its VOD platform and news website. The Actus platform provides RTV with advanced editing and exporting options, allowing fast content

publishing to boost viewer engagement. In addition, the Actus platform enables RTV to include and exclude segments for clips, add metadata and graphics, pre/post-roll, and select relevant profiles to speed up the clipping process. Utilizing the Actus platform, RTV records its national and regional channels in full HD, maintaining high video quality. The system includes

a mix of both SDI and IP videos. Furthermore, the Actus platform allows RTV to select a program from the electronic program guide (EPG), identify and automatically remove all commercials, and then publish the concatenated program using the same branding and transcoding settings from the saved

profiles. Clips are created in multiple renditions based on the EPG with extra metadata.
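As a generic sketch of the automation logic described above -not Actus code; times are seconds and purely illustrative- the "clean" export boils down to keeping everything in the EPG programme window that is not flagged as an ad break:

    def clean_segments(prog_start: float, prog_end: float, ad_breaks: list) -> list:
        # Return the (start, end) segments to keep and concatenate for the clean VOD export.
        keep, cursor = [], prog_start
        for ad_start, ad_end in sorted(ad_breaks):
            if ad_start > cursor:
                keep.append((cursor, min(ad_start, prog_end)))
            cursor = max(cursor, ad_end)
        if cursor < prog_end:
            keep.append((cursor, prog_end))
        return keep

    # 30-minute bulletin with two detected commercial breaks
    print(clean_segments(0.0, 1800.0, [(600.0, 780.0), (1200.0, 1380.0)]))
    # [(0.0, 600.0), (780.0, 1200.0), (1380.0, 1800.0)]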

Robust end-to-end clipping workflow

Clip Factory PRO is HTML based and is easily accessible from any browser, such as Chrome, Microsoft Edge, or Firefox. No client installation is needed. It can be accessed anywhere, anytime, and from any workstation. There is no need for a dedicated machine to create and export the clips. The deployment can be on-premise, cloud-based, Virtual Machine, or Hybrid.

The workflow goes from ingesting the channels and keeping them in HD quality, from any input or a mix of inputs such as IP, SDI, HLS, etc. Once the content is ingested and archived, it is easy to mark in/out the relevant content or add metadata with overlays or graphics. A transcoding engine is included to transcode the clip to any format needed. Integration with third-party transcoding farms is also available. It is easy to define the number of profiles, allowing an even faster workflow. Clip Factory is fully integrated with all social media platforms. The clips can be created manually or use external metadata such as EPG or As Run log files for automation. It also uses automatic metadata generation such as speech-to-text or ads detection. The clip automation is done based on metadata or pre-defined rules. An automation example is the automatic removal of all the ads from the program and the export of the "clean" program to the VOD library.

Feedback

According to RTV, which has used Actus solutions before and continues to trust them, Actus completely solves the challenge of easily selecting News, Weather, and Sports clips, adding metadata, quickly publishing relevant content both online and on the VOD platform, and being digital-first.

"Clip Factory PRO offers an easy-to-use workflow for editing, clipping, and content repurposing. We have been using Actus Clip Factory PRO for several years and recently upgraded to the latest HTML5-based platform. Both platforms are reliable and user-friendly, allowing us to publish our content both online and on our VOD platform quickly and efficiently", concluded RTV. Actus Digital will demonstrate its intelligent monitoring platform at IBC2022, Sept. 9-12 in Amsterdam, at stand 7.B40. More information about Actus Digital's platform is available at www.actusdigital.com.



