TVB Europe 98 June / July 2023


Intelligence for the media & entertainment industry

UP CLOSE WITH

Creating the Cocaine Bear

IP and standards


Here comes the summer

It’s that time of year again; the sun is streaming in through the window as I write this so it can only mean that summer has arrived. The arrival of summer usually means a wealth of sport on our TV screens. However, this year it’s somewhat quieter than usual. We have got the FIFA Women’s World Cup to look forward to. Women’s football is the most popular it’s ever been and millions of fans will be looking forward to the tournament’s kick-off in Australia and New Zealand next month.

Things are changing in the way in which viewers watch sport. Over the past year, Apple has gotten into the game with the launch of the MLS Season Pass, Netflix is believed to be planning to broadcast its first live sports event, and in the UK we’ve said goodbye to BT Sport and hello to TNT Sports following the launch of the joint venture between BT and Warner Bros Discovery. BT Sport always pushed the boundaries in terms of technology. From the first cloud-based live broadcast to delivering 8K top-tier sport into the audience’s home, Jamie Hindhaugh and the team constantly took sports broadcasting to places it hadn’t been before. While I think our industry will miss that innovation, I look forward to seeing what TNT Sports brings to the table.

www.tvbeurope.com

FOLLOW US Twitter.com/TVBEUROPE / Facebook/TVBEUROPE1

CONTENT

Editor: Jenny Priestley jenny.priestley@futurenet.com

Graphic Designer: Marc Miller

Managing Design Director: Nicole Cobban nicole.cobban@futurenet.com

Contributors: David Davies, Kevin Emmott, Neil Maycock

Group Content Director, B2B: James McKeown james.mckeown@futurenet.com

MANAGEMENT

SVP Wealth, B2B and Events: Sarah Rees

UK CRO: Zack Sullivan

Commercial Director: Clare Dove

Managing Director, B2B Tech & Entertainment Brands: Carmel King

Head of Production US & UK: Mark Constance

Head of Design: Rodney Dive

ADVERTISING SALES

Advertising Director: Sadie Thomas sadie.thomas@futurenet.com +44 (0)7752 462 168

SUBSCRIBER CUSTOMER SERVICE

To subscribe, change your address, or check on your current account status, go to www.tvbeurope.com/subscribe

ARCHIVES

Digital editions of the magazine are available to view on ISSUU.com. Recent back issues of the printed edition may be available; please contact customerservice@futurenet.com for more information.

LICENSING/REPRINTS/PERMISSIONS

TVBE is available for licensing. Contact the Licensing team to discuss partnership opportunities. Head of Print Licensing Rachel Shaw licensing@futurenet.com

I’ve said it before, but sport is always a driving force in our industry, especially in terms of adopting new technologies. While this summer will be relatively quiet in terms of major events, I suspect a lot of broadcasters will be busy thinking about next year. Euro 2024 takes place next June and July in Germany, followed by the Paris Olympic and Paralympic Games. The summer of 2024 is going to be incredibly busy for European broadcasters, with two major events taking place on the continent. I’m excited to discover the innovation that will bring both events to our TV screens.

Talking of innovation, our focus for this issue is the development of IP and standards. It feels like IP is something we’ve been talking about for quite some time, and yet we’re still not even close to full-scale adoption. Oddly, while some broadcasters haven’t yet started their journey to SMPTE ST 2110, others are already asking, what next?

Elsewhere, we catch up with the producers of Netflix natural history series Our Planet II to find out how they used technology on the series; and Kevin Emmott finds out how you create a 500-pound bear that’s eaten almost 75 pounds of cocaine! And what’s even more surprising, it’s based on a true story.

JENNY PRIESTLEY, EDITOR @JENNYPRIESTLEY
All contents © 2023 Future Publishing Limited or published under licence. All rights reserved. No part of this magazine may be used, stored, transmitted or reproduced in any way without the prior written permission of the publisher. Future Publishing Limited (company number 2008885) is registered in England and Wales. Registered office: Quay House, The Ambury, Bath BA1 1UA. All information contained in this publication is for information only and is, as far as we are aware, correct at the time of going to press. Future cannot accept any responsibility for errors or inaccuracies in such information. You are advised to contact manufacturers and retailers directly with regard to the price of products/services referred to in this publication. Apps and websites mentioned in this publication are not under our control. We are not responsible for their contents or any other changes or updates to them. This magazine is fully independent and not affiliated in any way with the companies mentioned herein. If you submit material to us, you warrant that you own the material and/or have the necessary rights/permissions to supply the material and you automatically grant Future and its licensees a licence to publish your submission in whole or in part in any/all issues and/or editions of publications, in any format published worldwide and on associated websites, social media channels and associated products. Any material you submit is sent at your own risk and, although every care is taken, neither Future nor its employees, agents, subcontractors or licensees shall be liable for loss or damage. We assume all unsolicited material is for publication unless otherwise stated, and reserve the right to edit, amend, adapt all submissions. Future PLC is a member of the Periodical Publishers Association We are committed to only using magazine paper which is derived from responsibly managed, certified forestry and chlorine-free manufacture. The paper in this magazine was sourced and produced from sustainable managed forests, conforming to strict environmental and socioeconomic standards.

06 The standards paradox

12 Using technology to capture life on Our Planet

Netflix natural history series Our Planet II returns with a focus on migration. Series producer Huw Cordey tells Jenny Priestley how drones and night-time cameras helped the team capture images that haven’t been on screen in almost 30 years

16 NMOS sets its sights on compressed video

Following on from the first major focus by supporting organisation AMWA – an application using ST 2110 uncompressed video within a media centre – the networked media open specifications (NMOS) are now being expanded to include compressed video and more, writes David Davies

22 The [tongue-in-cheek] pyramid of incompetence

24 Previs, postvis, and a bear high on cocaine

No animals were harmed in the making of this year’s horror-comedy Cocaine Bear. Kevin Emmott talks to Halon Entertainment’s Brad Alexander about what it takes to breathe life into the character, how pre- and postvis can benefit the creative process, and how to turn a rat into a bear using groom maps

29 Next-generation audio prepares to go mainstream

Larry Schindel, senior product manager at Linear Acoustic, takes a look at what the future holds for broadcast audio

38 A clear path to production and distribution

Prime Focus Technologies co-founder and CEO Ramki Sankaranarayanan talks to Jenny Priestley about the company’s acquisition by DNEG and what it will mean for customers

42 Helping the world stay connected

SMPTE executive director David Grindle discusses how technology created by the media industry helps bring people closer together


The standards paradox

Our industry has been built on technical standards, but how important are they going forward and why does it seem to be getting harder to define them?

All through my early career as a software engineer, standards dominated the development of products: video formats, communication protocols and, for a poor graduate software engineer, the seemingly irrational fractions involved in drop-frame timecode, which felt almost sadistic. However, engineers adapted to these standards and our products all worked with each other... well, most of the time!
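
A quick, illustrative sketch of the arithmetic behind that grumble (ours, not the author’s): NTSC video runs at 30000/1001 frames per second, and drop-frame timecode compensates by skipping frame labels 00 and 01 at the start of every minute except those divisible by ten, so that displayed time keeps pace with the wall clock.

```python
from fractions import Fraction

NTSC_RATE = Fraction(30000, 1001)   # ~29.97 fps: the awkward-looking fraction

def drop_frame_timecode(frame_count: int) -> str:
    """Convert a frame count to 29.97 drop-frame timecode (illustrative only).

    Drop-frame timecode skips frame *labels* 00 and 01 at the start of every
    minute, except minutes divisible by ten, so that the displayed time tracks
    wall-clock time despite the 30000/1001 frame rate.
    """
    fps = 30                                               # nominal labels per second
    dropped_per_min = 2
    frames_per_min = fps * 60 - dropped_per_min            # 1798 real frames
    frames_per_10min = frames_per_min * 10 + dropped_per_min  # 17982 real frames

    ten_min_blocks, rem = divmod(frame_count, frames_per_10min)
    if rem < fps * 60:                                     # first minute keeps all labels
        minutes_in_block = 0
    else:
        minutes_in_block = 1 + (rem - fps * 60) // frames_per_min
    total_dropped = dropped_per_min * (ten_min_blocks * 9 + minutes_in_block)

    labels = frame_count + total_dropped
    frames = labels % fps
    seconds = (labels // fps) % 60
    minutes = (labels // (fps * 60)) % 60
    hours = labels // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d};{frames:02d}"

if __name__ == "__main__":
    one_hour_of_frames = round(NTSC_RATE * 3600)           # 107,892 frames
    print(drop_frame_timecode(one_hour_of_frames))         # -> 01:00:00;00
```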

There are of course two types of standards: official ones created by formal bodies and institutions, such as SMPTE 259M (SDI); and de facto standards, such as the Sony 9-pin protocol for serial device control, which become standards through mass adoption. Our industry relies on both, so when is it right to formalise standards and when should we take the arguably simpler approach of adopting a manufacturer’s implementation?

When we go down the formal standardisation approach, I think we face the standards paradox: the more use cases the standard encompasses, the less standard it becomes. Let me illustrate my point with a non-technical example: if I wanted to define a ‘standard’ sweater that everyone could wear, the definition of the sweater would end up being very large to accommodate different body shapes, something like size XXXL. The problem is that while most people can put the sweater on, it doesn’t fit the needs of the majority.

I’ve used a size analogy because the standardisation problem becomes worse with size: the broader the definition, the greater the scope for incompatibility. I remember that at the time MXF (Material Exchange Format) was being worked on, I heard the phrase “compliant does not equal compatible”. Unfortunately, I can’t remember who to credit this phrase to, but it wonderfully captures our standards paradox. More specifically, what it means is that two systems can both be compliant with a standard to exchange information, but this compliance doesn’t guarantee they are compatible and interoperable. If the standard encompasses many different data formats and structures, each system could have implemented different parts of the standard, so they are compliant but will never work together.
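
A toy sketch makes the point; the ‘standard’, the vendors and the codec lists below are invented purely for illustration. Each implementation is compliant because everything it supports is permitted by the standard, yet the two share no common ground and so will never interoperate.

```python
# Toy illustration of "compliant does not equal compatible".
# The standard below is hypothetical: it permits many optional essence codecs,
# and an implementation counts as "compliant" if everything it supports is allowed.

STANDARD_OPTIONAL_CODECS = {"MPEG-2", "DNxHD", "ProRes", "JPEG 2000", "AVC-Intra"}

vendor_a = {"MPEG-2", "DNxHD"}        # compliant: both codecs are in the standard
vendor_b = {"ProRes", "JPEG 2000"}    # compliant: both codecs are in the standard

def is_compliant(supported: set[str]) -> bool:
    return bool(supported) and supported <= STANDARD_OPTIONAL_CODECS

def are_compatible(a: set[str], b: set[str]) -> bool:
    # Interoperability needs at least one codec that both ends actually implement
    return bool(a & b)

print(is_compliant(vendor_a), is_compliant(vendor_b))  # True True
print(are_compatible(vendor_a, vendor_b))              # False: no common codec
```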

A real-life example of the compliant/compatible standards paradox is the existence of AIMS [Alliance for IP Media Solutions]. The excellent work that was done in the creation of the SMPTE ST 2110 standard for broadcast video over IP still left significant issues with interoperability. These were tackled by a consortium of businesses coming together (AIMS) to address interoperability within the standard. Is this a failing or shortfall in the ST 2110 specification? No, I don’t believe so; it is more a reflection that, as we move into a world where generic IT equipment is being used for a wide range of media applications, the creation of a standard is massively more complicated. We have moved from a world where products were predominantly single-function hardware devices with point-to-point connections for media and control, to a world where generic computers and networks are required to fulfil wide-ranging media operations; a far more difficult environment to introduce standards into.

The next logical question is, should we just forget attempting to define open standards? Look at the success of NDI, another form of video over IP. Often, and inaccurately, positioned as a competitor to the SMPTE 2110 standard, NDI is a proprietary licensed technology for video over networks that has seen incredible adoption. The single point of truth (or more accurately, control) also alleviates the type of interoperability work fulfilled by AIMS. Is this a better approach for standards in our industry? Clearly it has worked with NDI, but the counter-argument is that open standards aim to prevent any commercial advantage for specific businesses, and should avoid the risk of financial exploitation. For many companies, this will be key in deciding whether to adopt a standard.

So, with the complications and different approaches in the world of standards, are they still useful, and should we invest in them? In my view they are essential, both open standards and proprietary. Our industry is constantly pushing the boundaries of technology, and we have a large number of vendors whose systems must effectively interoperate for our customers’ businesses to function.

The simple answer is that we need standards that encompass ‘everything, everywhere, all at once’, and I’ll leave that to our industry’s technical leaders to grapple with.


Securing media’s IP future

IP is unleashing innovation and efficiency in the media industry at an unprecedented scale. Nevertheless, alongside the flexibility, scalability, and transformation that IP brings, there are challenges that come with this fundamentally different media paradigm. Moving from closed and controlled to open IP-based workflows means network control and security rises to the top of the priority list for media companies that need to ensure their high-value content is protected. The stakes are high as any media network vulnerability can prove detrimental. This is the time for media companies to level up their network control and security to ensure they reap the full benefits of IP while removing its risks.

Before IP brought unlimited opportunities to media companies, the industry utilised controlled interfaces like SDI to transport uncompressed and unencrypted video feeds securely. However, SDI can no longer meet the requirements of today’s complex and ever-changing media landscape, with rich and high bandwidth 4K and 8K UHD video formats becoming the norm. Moving to flexible and scalable workflows is a competitive necessity for media organisations that need to up their game to satisfy demanding, content-hungry viewers.

IP isn’t purpose-built for media. When transitioning to IP workflows, media companies need to make critical decisions about their networks to ensure the control, security, and quality of high-value content. Transmitting video, audio, and data over IP means entering different domains and network links and ports. Media organisations need to control the type of IP media traffic that can pass through the networks and the type of streams that can go in and out of each network domain.

Due to its nature, IP media is vulnerable to security risks that don’t just refer to external threats. Even ‘secure’ IP media traffic can create serious network problems. For instance, if the content isn’t configured properly, it can flood the network and cause packet loss, jitter, and delay. Additionally, there is always the possibility of ‘human error’. If a camera setting is misconfigured and the connection is mistakenly defined as high-definition 4K, the IP media flow going into a switch could risk the entire event network going down. This is a challenge that the closed interfaces used previously would not encounter, but that can be a common risk in the current IP media era. These errors can prove detrimental to the live broadcasting of high-value content, leading to financial and reputational damage for broadcasters. Media organisations need complete visibility and control of their IP media traffic to enhance the security of their most valuable content.
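
Some back-of-envelope arithmetic (ours, not the author’s) shows the scale of that risk: uncompressed 4:2:2 10-bit video carries roughly 20 bits per pixel, so a UHD flow landing on a port that was engineered for an HD flow brings roughly four times the expected traffic, before RTP/UDP/IP overhead is even counted.

```python
def uncompressed_video_gbps(width: int, height: int, fps: float,
                            bits_per_pixel: int = 20) -> float:
    """Approximate payload rate for uncompressed 4:2:2 10-bit video.

    4:2:2 10-bit video carries roughly 20 bits per pixel; real ST 2110-20
    flows add RTP/UDP/IP headers on top, so treat the result as a lower bound.
    """
    return width * height * bits_per_pixel * fps / 1e9

hd_flow = uncompressed_video_gbps(1920, 1080, 50)    # ~2.1 Gbps
uhd_flow = uncompressed_video_gbps(3840, 2160, 50)   # ~8.3 Gbps

print(f"1080p50 : {hd_flow:.1f} Gbps")
print(f"2160p50 : {uhd_flow:.1f} Gbps")
# A switch port or bandwidth reservation sized for a ~2 Gbps HD flow is
# instantly oversubscribed when a misconfigured camera sends ~8 Gbps of UHD.
```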

While control and security are mission-critical capabilities for media networks, not all security solutions are born equal or fit for purpose. Generic enterprise IT firewalls and solutions can’t meet the reliability, high-bandwidth, and low-latency requirements of any organisation moving video over IP. In addition, due to the nature of IP media networks, enterprise firewalls dramatically increase costs – as more expensive firewalls need to be deployed to secure high-bandwidth networks – and often compromise the quality of video streams, deteriorating the overall user experience.

To secure IP media networks while maintaining super-high-quality video, media organisations need security solutions that are tailored to the media industry. A key alternative network security model leverages the reliability and capabilities of a real-time transport protocol (RTP) media proxy.

These solutions terminate a media flow at the network boundary and re-establish it at the destination network without disrupting other active IP media flows. In this way, they enhance the security of the overall media network by preventing outages and security risks, including hijacking and spoofing. At the same time, they preserve the integrity of established flows. Additionally, the solutions enable the monitoring and assurance of the IP media payload as it passes between networks, with ETR 101 290 Priority 1 (P1) performance metrics, frozen-frame and audio-silence checks. This means that media companies can rest assured that their IP media network is fully protected without risking the quality of the video signals transported.
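
As a purely conceptual sketch, and not a description of any vendor’s product, the core idea of terminating a flow at the boundary and re-originating it toward the destination can be reduced to a UDP relay with a policy check. The addresses and allow-list below are placeholders; a real RTP media proxy would also validate RTP headers, police bandwidth and feed the monitoring described above.

```python
import socket

# Deliberately simplified sketch of the idea behind an RTP media proxy:
# terminate the inbound flow at the network boundary, apply policy, and
# re-originate it toward the destination network. A production proxy would
# also inspect RTP sequence numbers and SSRCs, police bandwidth, and feed
# monitoring (e.g. ETR 101 290-style checks); none of that is attempted here.

LISTEN_ADDR = ("0.0.0.0", 5004)          # boundary-facing socket
FORWARD_ADDR = ("192.0.2.10", 5004)      # destination network (example address)
ALLOWED_SOURCES = {"198.51.100.7"}       # trivial allow-list standing in for policy

def run_proxy() -> None:
    ingress = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    ingress.bind(LISTEN_ADDR)
    egress = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    while True:
        packet, (src_ip, _src_port) = ingress.recvfrom(2048)
        if src_ip not in ALLOWED_SOURCES:
            continue                      # drop anything the policy doesn't know
        # The flow is re-originated from the proxy, so the destination network
        # only ever sees the proxy's address, never the original sender's.
        egress.sendto(packet, FORWARD_ADDR)

if __name__ == "__main__":
    run_proxy()
```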

Although media technology can build upon broader IT innovation, media tech solutions need to be tailored to the specific stringent requirements of the media industry, especially the high-bandwidth and low-latency standards.

Traditional IT firewalls fail to meet the requirements of seamless high-quality IP transmission, raising infrastructure costs and impacting the broadcasting-grade quality of video feeds. Media-native solutions like RTP media proxies address these challenges, delivering a new media security paradigm while driving cost efficiencies and ensuring signal quality.

When it comes to the control and security of media networks we need to think ‘media first’. This is the right time for media companies and broadcasters to leverage media tech innovation and benefit from solutions that are tailored to their needs.


Exploring phase two of IP adoption

Media companies across the globe are coming to a crossroads in their evolution. The convergence of emerging technologies, fast-shifting consumer trends, and increased costs calls for bold strategies and a forward-thinking approach.

Leading media brands in Europe have recognised the immediate benefits of transitioning to an IP-based model that makes clear business sense. An IP-first approach to acquisition, media production and distribution paves the way to deliver more content to larger audiences more efficiently and reliably than ever before. Media organisations operating on a global IP network are leading the charge towards a future defined by customisation, scale, and seamless monetisation across all platforms and devices. As IP moves past the stages of innovation and into the mass adoption phase, it’s no longer about why media brands should make the shift; instead, it’s about how they can future-proof their business in the long run.

UNLOCKING AN INTELLIGENT TRANSPORT MODEL

Content distribution is undergoing a significant transformation. Traditional mechanisms such as satellite and fibre fall short of meeting the demands of today’s media landscape. In contrast, an IP-first model offers a cost-effective, reliable, flexible, and scalable alternative. Internet-based delivery through multicast capabilities enables media companies to direct and customise traffic from any network point to any destination without a physical end-to-end path, meeting the needs of modern multi-platform distribution while keeping costs under control. Top content owners need proven, ultra-reliable means of delivering high-value live video with technology they can trust to give them a good night’s sleep. Leveraging a managed IP transport network with innovative packet recovery protocols and routing algorithms means content providers can overcome network congestion challenges and ensure the delivery of live and real-time video content with ultra-low latency (<200ms) and 99.999 per cent reliability. In a closely interconnected media environment like Europe, media companies often share content through content exchange networks which can be a slow and complicated process. Managed IP-based network infrastructure with automation-driven workflows can help media owners efficiently monitor, identify, customise, and deliver content across regions, solving complex workflow challenges and adhering to stringent rights licensing and business rules.
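
A little arithmetic puts those headline figures in context (the round-trip time below is our assumption, added purely for illustration): 99.999 per cent availability leaves only around five minutes of outage a year, while a sub-200ms latency budget still leaves room for several packet-recovery attempts between sites.

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

def allowed_downtime_minutes(availability: float) -> float:
    """Downtime budget per year implied by an availability figure."""
    return SECONDS_PER_YEAR * (1 - availability) / 60

print(f"99.999% availability -> {allowed_downtime_minutes(0.99999):.1f} min/year")
# ~5.3 minutes of outage per year

# Rough feel for why ARQ-style packet recovery can live inside a 200 ms budget:
# with an assumed 30 ms round trip between sites (illustrative figure only),
# several retransmission attempts still fit under the end-to-end target.
LATENCY_BUDGET_MS = 200
ASSUMED_RTT_MS = 30
retransmit_rounds = LATENCY_BUDGET_MS // ASSUMED_RTT_MS
print(f"Retransmission rounds that fit in the budget: {retransmit_rounds}")
```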

TAKING CREATIVE CUSTOMISATION TO THE NEXT LEVEL

Beyond a flexible, cost-efficient alternative to traditional media distribution mechanisms, content owners must harness IP foundations to experiment with new business models and services. Underpinned by reliable, high-quality content delivery with minimal delay, an intelligent IP-first network frees media leaders to drive new customisation opportunities and serve audiences with tailored, platform-specific programming across multiple platforms, including over-the-top (OTT), digital and free ad-supported streaming TV (FAST) services.

For media companies in Europe, the importance of regionalisation and customisation cannot be overstated. Content providers need efficient tools to serve diverse, multilingual audiences as well as a wide range of cultures and communities. IP-based content versioning solutions offer a scalable means of achieving customisation, empowering content owners to modify primary feeds with custom programming, audio, captions, and/or graphics to meet diverse audience, language, and platform requirements.

IP POWERS NEXT ERA IN LIVE SPORTS

The sports rights landscape is becoming increasingly complex and fiercely contested due to platform, audience, and device fragmentation. As traditional sports viewing shifts towards hybrid broadcast/OTT models or exclusive streaming, leagues, federations, rights holders, and broadcasters need efficient, reliable methods of producing, customising, and distributing live sports events while pioneering new business models and direct-to-consumer offerings. The adoption of IP-based solutions enables rights holders to efficiently and reliably distribute language-specific, culturally relevant live sports coverage to audiences across multiple platforms and geographies. With an IP-based transport network, fragmentation in the sports rights landscape transforms into myriad opportunities for targeted content distribution and amplified global reach. Advances in automation-driven content versioning capabilities enable easy preparation and customisation of high-value live sports content with platform-specific graphics, audio, and SCTE insertion, offering fans immersive and engaging experiences while maximising monetisation.

IP FOUNDATIONS TO SECURE BUSINESS GROWTH

Beyond the immediate transition to IP lies a world of untapped potential for the media industry. Operating on an IP network isn’t merely a means to finding a tried and tested, future-ready alternative to satellite or fibre distribution. It’s a launchpad for scale and customisation across any platform or region.

In today’s fast-paced and unpredictable media market, content owners are moving quickly to secure the necessary foundations for business success. By transitioning to an IP-based model, media companies can drive mission-critical efficiencies to ensure ROI on high-value content while serving consumers with more vibrant, engaging, and relevant content experiences.


REVOLUTIONISING CAPTIONING WITH AI-POWERED TECHNOLOGY

For broadcasters and streaming services, captioning content is becoming increasingly important. A report from UK media regulator Ofcom published in 2021 found that 63.8 per cent of on-demand video platforms were offering captioned content in 2020, up from 58.1 per cent the year before. For traditional broadcasters, Ofcom sets a statutory target that 80 per cent of content should be captioned.

Most broadcasters looking to utilise captioning have three main goals:

• They need to caption ever-increasing amounts of content as they expand into over-the-top streaming

• They are looking to reduce costs by leveraging advancements in AI and ASR (automatic speech recognition)

• They want solutions that provide world-class security and uptime, to ensure captions are always delivered to screens

As the global leader in captioning technology, Ai-Media’s end-to-end solutions allow broadcasters to tick all these boxes.

INTRODUCING LEXI 3.0: THE FUTURE OF AUTOMATIC CAPTIONING

Ai-Media is continually working to enhance its captioning technology with the latest advancements in artificial intelligence. At the 2023 NAB Show, the company launched the latest breakthrough in AI-powered automatic captioning with LEXI 3.0.

Trusted by most of the world’s leading broadcasters, this new and improved version of Ai-Media’s flagship live captioning solution leverages the power of AI to deliver results rivalling human captions, at a fraction of the cost.

Independent audits have proven that LEXI 3.0 delivers an average accuracy of 98.7 per cent based on the globally recognised NER model. It also achieves up to 60 per cent fewer recognition, formatting and punctuation errors than the previous version.
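
For readers unfamiliar with the metric, the NER model scores live captions as (N − E − R) / N × 100, where N is the number of words and E and R are weighted edition and recognition errors. The sketch below uses the commonly cited severity weights and invented error counts; it illustrates the formula and is not Ai-Media’s audit data.

```python
# Simplified calculator for the NER live-captioning accuracy model:
# NER = (N - E - R) / N * 100, where N is the number of words in the captions,
# E the weighted edition errors and R the weighted recognition errors.
# Minor/standard/serious errors are commonly weighted 0.25/0.5/1; the example
# numbers below are invented for illustration, not audit data.

ERROR_WEIGHTS = {"minor": 0.25, "standard": 0.5, "serious": 1.0}

def ner_score(total_words: int, edition_errors: dict, recognition_errors: dict) -> float:
    e = sum(ERROR_WEIGHTS[k] * v for k, v in edition_errors.items())
    r = sum(ERROR_WEIGHTS[k] * v for k, v in recognition_errors.items())
    return (total_words - e - r) / total_words * 100

score = ner_score(
    total_words=1000,
    edition_errors={"minor": 8, "standard": 4},        # E = 4.0
    recognition_errors={"minor": 12, "standard": 6},   # R = 6.0
)
print(f"NER accuracy: {score:.1f}%")   # 99.0% -- 98% is the usual pass mark
```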

LEXI 3.0 also provides an enhanced viewer experience thanks to innovative new features, including automated speaker identification and AI-powered caption placement to avoid on-screen interference.

All this makes LEXI 3.0 the perfect solution for broadcasters looking to cost-effectively ensure compliance with accessibility requirements; whether to caption a linear TV broadcast or an OTT content platform with multiple channels.

LEXI 3.0 can also break down language barriers by delivering multilingual captions in over 30 languages.

HOW THE PORTLAND TRAIL BLAZERS ENHANCED BROADCAST CAPTIONING WITH LEXI 3.0

LEXI 3.0’s AI-driven capabilities have achieved particularly stellar results in the world of sports broadcasting, with world-famous franchises like basketball’s Portland Trail Blazers making the most of its enhanced accuracy.

The Trail Blazers previously used live human captioners to provide accessibility for its game coverage. However, its technical team was spending significant time managing captioning resources, and the cost of human captions soon exceeded its production budget.

By using LEXI 3.0, the Trail Blazers can now live caption their games with over 98 per cent accuracy while also decreasing per-hour captioning costs. LEXI 3.0 provided enhanced accuracy by recognising sports-specific terms, as well as athlete, commentator and venue names.

“LEXI has provided us with accuracy above expectations, while significantly reducing our costs,” says Bruce Williams, technical manager for the Trail Blazers.

HOW AI-MEDIA ENSURES THE MOST SECURE, BEST QUALITY CAPTIONS

In an increasingly volatile cyber landscape, it’s more important than ever for broadcasters to partner with a captioning provider that can ensure maximum security and uptime. By combining LEXI 3.0 with Ai-Media’s cutting-edge captioning infrastructure and iCap Cloud Network, broadcasters can ensure the highest quality, most secure captions in the world.

LEXI 3.0 seamlessly integrates with Ai-Media’s range of captioning encoders and its iCap Cloud Network, the world’s largest and most secure captioning delivery network. Over 88.9 million minutes of captioned content are carried over the iCap Cloud Network yearly, with broadcast content representing a sizable proportion. And it’s no wonder why, as it ensures unmatched network uptime and the cleanest method of distributing audio and video, providing the best quality captions.

The iCap Cloud Network provides the ability to connect with whatever caption solution the broadcaster chooses, including human captioners across the globe. It also provides a back-up plan if technical difficulties prevent captions from being delivered to screens; allowing the user to switch to LEXI 3.0 for a primary or back-up, always-on service.

Traditional methods of caption delivery such as dial-up modems are more vulnerable than ever. So having a guaranteed failsafe like that provided by the iCap Cloud Network is a must. And users can always rely on Ai-Media’s technical support team, who are located across the globe and provide 24/7 support.

EXPERIENCE THE FUTURE OF CAPTIONING WITH LEXI 3.0

Discover more about the revolutionary capabilities of LEXI 3.0 and get started with a free trial today.

ADVERTORIAL

ICYMI

TVBEurope’s website includes exclusive news, features and information about our industry. Here are a few of our recently featured articles…

BEHIND THE SCENES AT CHANNEL 4 NEWS’ REMOTE LEEDS STUDIO

ITN’s director of technology, production and innovation Jon Roberts, studio and technology manager Sam Parker, and senior programme director at Channel 4 News Martin Collett discuss the broadcaster’s new Leeds-based permanent news studio, the decision to use a physical rather than virtual set, and how Channel 4 News is taking to the cloud for disaster recovery.

EVERYTHING WE KNOW ABOUT: IPMX AND WHAT IT MEANS FOR THE BROADCAST INDUSTRY

Offering support for multiple codecs, and a cheaper option than SMPTE ST 2110, IPMX is becoming more prevalent within the media industry. We talk to PlexusAV’s Aaron Doughten and Steven Cogels to get the lowdown on the new standard.

HOW TEN POUND POMS CREATED A 1940s FEEL WHILE SHOOTING IN HDR

The £10 ticket to sunny Australia was a no-brainer for people looking to escape the post-war gloom and grey skies of Great Britain. However, the weather waiting for the cast and crew when they filmed BBC drama Ten Pound Poms in Sydney last year was worse than what they left behind.

APPLE VISION PRO: TRANSFORMING THE SPORT AND MEDIA INDUSTRY WITH AUGMENTED REALITY

Matt Stagg, sport, media and entertainment technology innovator, details his thoughts on Apple’s new AR/VR headset and its potential impact on the entertainment industry.

IN MEMORIAM: FARAH JIFRI

The UK broadcast industry has been left saddened by the death of PR and writer Farah Jifri. Farah had worked within the media industry for over 20 years. She was a well-known member of the IBC Daily team, and in recent months had begun writing for TVBEurope.

USING TECHNOLOGY TO CAPTURE LIFE ON OUR PLANET

Netflix natural history series Our Planet II returns with a focus on migration. Series producer Huw Cordey tells Jenny Priestley how drones and night-time cameras helped the team capture images that haven’t been on screen in almost 30 years

Released in 2019 to critical acclaim, Netflix natural history series Our Planet travelled to more than 50 countries over four years, with more than 600 crew members, to tell the story of the planet we and multiple other species inhabit. The series’ producer Silverback Films has teamed up with Netflix once again on Our Planet II, released by the streamer on 14th June.

Where the first season focused on habitats, season two looks at migration, following everything from an abandoned Laysan albatross chick determining the best way to leave his island of origin, to billions of minuscule red crabs attempting to cross from beaches to the forest where they’ll be safe, without getting eaten in the process.

For season two, the team at Silverback wanted to move the dial forward in terms of both the storytelling and the technology used, explains series producer Huw Cordey. “As David Attenborough says in his narration, the health of the planet depends on the free movement of animals because, at any one moment in time, there are billions of animals on the move,” he says. “So it seemed a sensible next step beyond the first series to do an Our Planet II where we are talking about how these habitats are joined up by the animals moving between them.”

For the new season, the production team visited more than 30 countries across six continents, with filming taking place over three-and-a-half years. The first shoot of Our Planet II took place in June 2020, while many countries were still under travel restrictions due to the pandemic. “It’s challenging enough to make natural history because you end up going to very remote places where there’s not always the information you need and the logistical support,” states Cordey. “But it was even more challenging because we couldn’t put people on planes.”

Instead, Silverback worked with local crews, which Cordey believes is a big positive for natural history producers. “We’re so used to sending crews abroad that we didn’t necessarily have lots of connections with people in different countries. I think one of the great things to come out of lockdown in terms of natural history is discovering this whole new wave of talent around the world that can shoot high-quality material without the need for producers to be burning carbon on sending crews.”

DRONES AND NIGHT-TIME CAMERAS HELP TELL THE STORY

Technology also helped the production team reduce its carbon footprint on Our Planet II: drone technology has developed so significantly between seasons one and two that the producers were able to use drones instead of great big fuel-guzzling helicopters. “When I started in this business we used helicopters only for scenery because they didn’t have the kind of gyro-stabilised systems where you could get close to animals,” explains Cordey. “I was on the original Planet Earth series and that’s when we first used a piece of technology called the Cineflex, which is a very long lens; the sort of lens we’d use on a tripod, in a gyro-stabilised system. Finally, we were able to film behaviour from up to a kilometre away with quite a good frame size.”

“What’s changed hugely between Our Planet season one and two is drone technology,” he continues. “It’s taken a big leap over the last ten years. When we did season one, we used them as much as we could, but it was still fairly limited. For Our Planet II, drone technology is almost fundamental to every sequence. DJI brought out these tiny drones with very high-quality lenses, and we can have them in the air for 40 minutes with a tiny battery. They make some noise but not anything like the noise of a full-scale helicopter, and we’ve been able to capture behaviour that was frankly impossible before they were developed. We’ve actually reduced our carbon footprint quite significantly while increasing the opportunities of filming wildlife. It’s been a win-win.”

The producers wanted to tell the stories of Our Planet II in as fresh a way as possible, with technology such as drones and night-time cameras playing a vital role alongside their “bread and butter” RED and ARRI cameras. “You can’t understate the massive difference drones have made to what we’re able to do,” states Cordey. “There’s a locust sequence in episode one and some of those shots just wouldn’t have been possible with a full-size helicopter.”

Night-time technology also helped the team tell a story that hasn’t been seen on screen for over 30 years. Episode one includes a sequence with the ancient murrelet, a small bird that lives on an island off the west coast of Canada and only comes out at night. “What we were able to achieve with the night-time cameras we used and the post production techniques we employed to remove noise, I think, transformed that sequence and made it a very exciting and fun sequence.”

Remote technology also helped the team capture an encounter between some grizzly bears and salmon, using camera traps that were operated remotely. “It took a long time to get a very small number of shots because we couldn’t move the cameras very easily,” explains Cordey. “We used our natural history skills to know where an animal might move or chase something and put cameras in the appropriate places at the appropriate height. There’s no science to it. It takes a long time and you have to be very patient.”

All of the footage was shot in 4K, and some in 8K, with the series available to watch in UHD for viewers with a Netflix subscription that covers the resolution.

FACE TO FACE WITH A TIGER SHARK

Like any natural history series, filming in remote locations around the world is never easy whether you live there or not. It’s often boiling hot or freezing cold, and a cool beer or warming hot chocolate can be many miles away, as can help. On top of that, the stars of the show are some of the most difficult to deal with. During the filming of Our Planet II, one crew had a particularly close encounter with a tiger shark. The team arrived at the location intending to complete a recce dive. After travelling out in their inflatable boat, known as a rib, things took a scary turn.

“Within a few minutes a 15-foot tiger shark appeared and came right up to the boat and started bumping it quite aggressively,” explains Cordey. “There were five people in this boat as well as all of the camera tech. So they decided to give this particular tiger shark a wide berth and moved on to somewhere else, and they were immediately approached by another tiger shark, equally as big. But this time instead of just bumping the boat, the tiger shark came out of the water and started trying to bite the rib and in fact punctured it and the boat started to deflate.

“Luckily it has sections, so the whole thing didn’t start going down like a cartoon, but bits of it did start to flatten and this 15-foot tiger shark was out of the water with his head on the side of the rib. It was almost like in Jaws where they’re trying to push an aqualung into the shark’s mouth. They managed to push the shark off and do an emergency landing on the beach. That would have been quite terrifying for a cameraman who thought, ‘right, I’ll just jump in the water’ and then suddenly have a tiger shark showing far too much interest in him.”

All of the animal and insect footage is real, but the series does employ visual effects and graphics to help tell its story. The series is structured around the Earth’s journey around the sun, and it is that journey, together with the Earth’s tilt, that creates the seasons; the VFX team created images to explain both. With four episodes of Our Planet II, the producers chose to focus each episode on three months of the year. “Where we have visual effects is really to explain where we are on that journey,” explains Cordey, “but also to try to show large-scale journeys that are just not possible using camera technology. For example, locust swarms moving from East Africa to India and Mongolia. There’s absolutely no way you can visualise that with a normal camera system.”

Almost all natural history series record sound separately from pictures. It’s impossible to have a microphone close to a herd of bison if you’re filming it on a long lens. All the sounds of the animals and the atmospheres are always completely authentic, but there is creative licence used. “You don’t always know what the sound of an animal pushing its way through a bush is going to be like, you can’t necessarily record that, so you have to record footsteps that are often done by a human,” explains Cordey. “Sometimes when we tell people that, they’re disappointed, but how do you get the sound of an animal running, which you cannot record at the time? But all animal calls and atmospheres are completely authentic to that animal and to the place.”

With Our Planet II having only just been released, there’s no news yet on whether Netflix will commission a third series. But Cordey says if it does, the team at Silverback already has an idea, although he wouldn’t give TVBEurope any indication of what it is. “We went from habitats to migration, and there’s another idea that takes that story one step further,” he says. “What we will take on is the need to have a global look at these natural ecosystems, so it would be something that’s relevant to everybody no matter where they live. It’ll be a big global project, looking at the natural world in a fresh way.”


NMOS SETS ITS SIGHTS ON COMPRESSED VIDEO

Following on from the first major focus by supporting organisation AMWA – an application using ST 2110 uncompressed video within a media centre – the Networked Media Open Specifications (NMOS) are now being expanded to include compressed video and more, writes David Davies

Even in an area as complex and well-populated by different technologies as media networking and control, the Networked Media Open Specifications (NMOS) initiative has made a remarkable impact in a comparatively short space of time. A little over four years since the first specification was published (IS-04 for discovery and registration of devices by broadcast controllers), NMOS continues to gain traction as a means to connect and control products and services from a range of suppliers in an open, interoperable and secure manner.

In line with the then-new SMPTE ST 2110 standard, the first targeted use case for NMOS revolved around the deployment of ST 2110 uncompressed video in a media centre. Felix Poulin, a user-chair of the NMOS steering committee of the Advanced Media Workflow Association (AMWA) – the organisation which develops and supports the NMOS specifications – and director of Media over IP Architecture and Lab at CBC/Radio-Canada, remarks that the “response to the original application of NMOS has been very enthusiastic, and [accordingly] we can see that the number of compliant products has increased considerably in the last few years.”

But now the scope of the organisation looks set to grow significantly with the news, announced in March, that work has begun to include a range of compressed video formats and ‘other transports’ within the NMOS control architecture. While broadcast centre operations remain critical, “the needs of many media companies are broader than this,” says Poulin, “for example, to move content between premises, not to mention to the cloud and back. This means that the requirement to use compressed video and other transports also has to be accommodated.”

So whilst NMOS has been primarily used as a “major control plane” in conjunction with ST 2110, the development of new guidance for compressed streams “opens the door to the use of NMOS for a wide range of other applications, such as remote, web and cloud productions. It also makes it easier to have a [combined environment] in which you have compressed and uncompressed streams, but there is no need to separate the control aspect.”

BEST CURRENT PRACTICE

The first fruit of this new phase of work is a best current practice document, BCP-006-01, which enables registration, discovery and connection management of JPEG XS endpoints using the aforementioned IS-04 and IS-05 (device connection management) NMOS specifications.
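
As a purely illustrative sketch of what that enables, a broadcast controller might discover JPEG XS flows through the IS-04 Query API and then stage one on a receiver through IS-05. The registry URL, receiver ID and API versions below are placeholders; ‘video/jxsv’ is the RTP media type registered for JPEG XS in RFC 9134; and the AMWA specifications remain the normative reference (a real controller would also supply the sender’s SDP as the receiver’s transport file).

```python
import requests

# Illustrative only: how a controller *might* find JPEG XS flows in an IS-04
# registry and stage one of them on a receiver via IS-05. The registry URL,
# receiver ID and API versions are placeholders; "video/jxsv" is the RTP
# media type registered for JPEG XS (RFC 9134), which BCP-006-01 builds on.
# Check the AMWA specifications for the normative resource models.

QUERY_API = "http://registry.example.com/x-nmos/query/v1.3"
CONNECTION_API = "http://receiver-node.example.com/x-nmos/connection/v1.1"
RECEIVER_ID = "00000000-0000-0000-0000-000000000000"   # placeholder UUID

# 1. Discover flows advertised with the JPEG XS media type (basic query filter).
flows = requests.get(f"{QUERY_API}/flows", params={"media_type": "video/jxsv"}).json()
senders = requests.get(f"{QUERY_API}/senders").json()
jxs_flow_ids = {f["id"] for f in flows}
jxs_senders = [s for s in senders if s.get("flow_id") in jxs_flow_ids]

# 2. Stage the first matching sender on a receiver and activate immediately.
#    (A real controller would also PATCH the sender's SDP as "transport_file".)
if jxs_senders:
    sender = jxs_senders[0]
    patch = {
        "sender_id": sender["id"],
        "master_enable": True,
        "activation": {"mode": "activate_immediate"},
    }
    requests.patch(
        f"{CONNECTION_API}/single/receivers/{RECEIVER_ID}/staged",
        json=patch,
    ).raise_for_status()
```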

The incorporation of JPEG XS is fully in line with industry trends, believes Poulin: “Demand for JPEG XS has been growing for some time; for example, we already see its use as a mezzanine compression codec within IPMX [Internet Protocol Media Experience], which is essentially a pro-AV implementation of ST 2110. So the main idea with this new guidance is to enable the use of JPEG XS within the same NMOS stack.”

The next steps will be to add guidance for a host of other compressed video technologies, including NDI, H.264/H.265 and MPEG Transport Stream. In each case, “the specification was architected to be independent of the transport, so it’s not a big challenge to make the specifications available for the users of those technologies, and potentially save a considerable amount of implementation time. There is a demand from vendors and this sort of guidance helps simplify things if it means they don’t need to [do a lot of work around different control protocols].”

The remarkable growth in popularity of NDI during the past two to three years would lead many observers to expect users of this technology to be among the most numerous beneficiaries of the NMOS guidance. Poulin agrees that there is “more and more NDI in use, both on-premise and in the cloud, and a growing number of instances in which NDI coexists with ST 2110 on the same infrastructure. So a benefit of this latest development is that if you have a controller to control all of your endpoints, you can use the same NMOS specification to make those connections.”

Rounding out the latest wave of developments, it has been confirmed – by AMWA and fellow industry groups the VSF (Video Services Forum) and AIMS (Alliance for IP Media Solutions) – that NMOS is the control protocol for IPMX, and this will be written into the IPMX standards. As Poulin observes: “NMOS brings a lot of ‘plug and play’ capabilities to IPMX, which continues to be targeted at the pro-AV market.”

FORTHCOMING SPECIFICATIONS

Back at AMWA, developmental work on new specifications and best current practice documents is very much ongoing. Currently in progress are: IS-11 Stream Compatibility Management, which allows the configuration of media parameters of ‘senders’ based on information about physical destination devices, associated with ‘receivers’; and MS-05 NMOS Control, which will provide a uniform mechanism for devices to expose a structured combination of private control, status and monitoring within the upcoming IS-12 API.

Regarding Stream Compatibility Management, the objective is to provide “a plug and play resource so, for example, if you have JPEG XS into a receiver that could do JPEG XS or uncompressed, it becomes very easy to manage the receiver to get the stream [that you require].” Whilst understandably reluctant to suggest specific publication dates, Poulin hopes that the latest NMOS documents might be ready in time for IBC this September.

Invited to consider the overall progress of the NMOS project since its first major public showcases in the late 2010s, Poulin is upbeat. “If you contrast a test event we did in 2020 with another that took place last summer in Wuppertal, at the offices of Riedel, it’s evident that there had been a very big jump in adoption,” he says. “In fact, I think the number of devices supporting NMOS tripled during that time period.

“If you also add in developments such as the inclusion of our specifications in the ‘Green’ or ‘Widely Available’ layer of the latest EBU Technology Pyramid, that trend becomes even more apparent. We sense that there is a lot of interest from different parties both in terms of implementing NMOS now and developing it further.”

AMWA Incubator Workshop at dB Broadcast in Ely, UK, March 2023

MANAGING THE IP TRANSITION

Miroslav Jeras, CTO at Pebble, explains why working with an expert partner is the best way to get the most out of the transition to IP

After many years of optimisation, we were at the point where SDI was almost the ultimate plug-and-play technology. You plugged equipment in and it just worked. Yes, there were ‘gotchas’, there always are, but in contrast to some of the headaches that IP networks have introduced into the broadcast space, they were comparatively easy. No broadcast engineer of the 20th century had to worry about configuration problems or network issues. Given SDI’s ease of use, and its status as a very well-known and understood quantity, it is understandable that some broadcasters are reluctant to move on from it.

However, it is acknowledged that IP brings many benefits to the table: scalability, flexibility, remote workflows, etc. It’s very much a case of when rather than if to make the switch. It would be remiss to hang back and let the early adopters work out the kinks.

IP is the future, and the key to its successful adoption is going to be managing this current transition period as IP and SDI systems coexist. The good news is that this is entirely possible and that the benefits of an IP infrastructure can be deployed gradually through an organisation. ‘Big bang’ deployments grab the headlines, but for most broadcasters, they are never going to be practical. What most need instead is a managed programme of adaptation, adopting IP technology where, and critically when, it makes both business and technical sense in the overall workflow.

The key to the success of this approach lies in the application of industry-wide standards that allow IP and SDI systems to interoperate without issues, and there has been a lot of work done in this field. There has been a concerted effort across the industry in adopting and progressing standards describing how to send media over an IP network, as well as the development of the comprehensive NMOS suite of protocols. These provide an open and easy-to-use control-plane solution to leverage interoperability in managing IP-connected devices. The result is a new generation of hybrid systems that bridge the operation of the SDI and IP universes in a way that eases the transition rather than causes further challenges for future years.

One of the key benefits that emerged as a result is the chance to leverage the current standards to make the deployment of IP systems much simpler, bringing them more in line with the plug-and-play philosophy of SDI.

There are some important criteria that any solutions operating under this framework should ideally follow. They should be self-contained; they should be scalable, to allow for the connection of many devices from many vendors in a facility of any size; and they should be extensive, covering as many aspects of the broadcast workflow as possible. Once these are achieved, and the known complexities of IP connection and device management that still persist are successfully wrangled, we end up with a solution that enables broadcasters to implement IP projects with ease, whether that is running a small IP test sandbox, commissioning an ST 2110 OB van, or implementing a large-scale IP architecture.

A smoother transition to IP will bring everyone, including broadcasters and other media organisations large and small, along with it, resulting in a pan-industry momentum that will make implementing IP workflows so much easier.

The SDI era is undoubtedly coming to an end for most of us over the course of the next decade. But the transition doesn’t have to be like jumping off a high diving board. Broadcasters can lower themselves into the water gently, making the shift as seamless as possible to minimise disruption and maximise the benefit of IP workflows in the here and now. As the saying goes: come on in, the water’s fine.


LIVE SPORTS PRODUCTION IN THE CLOUD

THE PLAYERS

Munich Media, a Germany-based media service agency, serves clients in the sports, media and advertising industries. Managing director Andreas Göttl leverages a background rich in IT and cloud transformation to provide broadcasters with cloud-based playout and management systems.

MITM is a German consulting firm founded by Thomas Mitschelen, who has 30 years of project management, system planning, integration, and training experience within the broadcast industry. MITM helps today’s media companies adapt to the rapidly changing technological landscape within the media/TV market by designing and providing innovative, practical, and realistic solutions.

Göttl and Mitschelen share a vision for the future of broadcast and a passion for providing ‘out-of-the-box’ solutions. They implement only the most advanced best-of-breed technologies that keep their customers ahead of the curve, leading to an enriched and successful business paradigm. As a team, they embody a ‘can do’ attitude; always designing solutions that cut through the complexity inherent in today’s broadcast workflows and maximise customer resources.

THE CHALLENGE

When a major German telecom company approached Munich Media to migrate its infrastructure to an all-IP environment, Göttl and Mitschelen joined forces to design and implement a system structured solely around managed services in a multi-vendor interoperable environment. A critical component of the transformation had to include a system capable of remotely monitoring multiple live streams originating from sports events. It had to be controlled by a small and movable staff, and it needed to be based on an on-demand financial model. They were also challenged with a very short timeline to get it up and running.

“Projects of this size can take up to three years for public broadcasters to complete; one year for testing, one year for implementing, and another year for operating and adjusting,” explains Göttl. “We didn’t have the luxury of a multi-year timeline. Nor did we want one. Whether a project of this size and calibre can be completed within a tight timeframe is dependent upon one’s mindset. If you come from an IT background and you’ve consulted on IT transformation projects in the past, the question is not ‘how can we?’, it’s ‘why can’t we?’”

The project was further complicated because, as a telecom provider, the client had different objectives than a traditional media company. A telecom company transports feeds to other organisations, so it is not output-driven but rather outcome-driven, which may have posed a challenge to traditional broadcast solutions providers, but not to Göttl and Mitschelen.

A VERY FAST SOLUTION

The answer to the company’s monitoring and visualisation needs was TAG Video Systems’ all-IP Realtime Media Performance (RMP) platform. The TAG system operates totally in the cloud and allows team members to work remotely, drastically easing recruitment issues. It also offers an opex licensing model that maximises flexibility by enabling all formats to be monitored together on one licence without limitations on the number of systems or where they are located; and incurs fees only on inputs, not outputs.

“I knew TAG from my previous position at a major content distribution company and had an incredibly positive experience with both the organisation and their solution,” says Mitschelen. “We worked with Shaharit Ben Latin, TAG’s director of sales in EMEA, who understood everything we needed and helped us get the ball rolling, and with Golan Simani, technical application specialist, who was always available to answer any question. The entire company was extremely responsive and delivered innovative solutions to every request, which have paid off big time already.”

An IP-based monitoring system in the cloud was a no-brainer for Göttl: “From my perspective as an IT consultant, the industry is not moving fast enough toward IP. There are always excuses such as ‘the staff is all wrong’, and ‘it’s hard to go off premises’. But I am not concerned so much about all that. The technology is there. You just have to use it. We have no on-premises infrastructure at all for this project. And because it’s a live sport production use case within a telco, we avoided all the complexity that inhibits public broadcasters and were free to concentrate on live production in the cloud.”

Originally planned as a proof-of-concept, the system was actually up and running with the first streams on air within a month. Currently, it is monitoring 3,000 to 4,000 streams per season without any issues.

UNDER THE HOOD AND UP ON THE CLOUD

SRT is the telco’s main distribution protocol; it is working with H.264 and testing H.265 to reduce the bandwidth without any loss of quality. The TAG system is monitoring multiple live sports SRT streams on-demand running in an AWS cloud environment. Because this is an IP-based environment in the cloud, the source of the signal is irrelevant. A cloud video mixer receives signals and pushes them out to customers in whatever format they desire.

Adopting an opex model allows the telco to spin up or spin down instances as needed, maximising resources and minimising costs. The combination of AWS on-demand capacity and an opex licensing model allows the company to pay only for what is used, so its business can grow as its requirements evolve and expand.
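The article does not detail the telco’s orchestration tooling, so the snippet below is only a minimal sketch of the on-demand principle, assuming AWS’s boto3 SDK and placeholder region and instance IDs: compute is started ahead of a block of matches and stopped afterwards, so it is billed only while streams actually need watching.

```python
# Minimal sketch (not the telco's actual tooling): starting and stopping
# monitoring instances on demand with the AWS boto3 SDK, so compute is
# only billed while streams actually need to be watched.
import boto3

# Placeholder region and instance IDs; a real deployment would look these
# up from an inventory or be driven by an orchestration layer.
ec2 = boto3.client("ec2", region_name="eu-central-1")
MONITORING_INSTANCES = ["i-0123456789abcdef0"]

def spin_up():
    """Start the monitoring instances ahead of a block of live matches."""
    ec2.start_instances(InstanceIds=MONITORING_INSTANCES)

def spin_down():
    """Stop them again once the last stream has finished."""
    ec2.stop_instances(InstanceIds=MONITORING_INSTANCES)
```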

TAG’s open-source API allows the MCM to be controlled from anywhere, giving team members the ability to either work remotely or monitor feeds on-site.

“Of course, we are a tech company, but we’re not selling tech,” states Göttl. “We’re solving problems for our clients. We don’t care where the signals come from, and we don’t have to worry about satellite uplinks. What matters is that they are live sports streams and we’re making a smart contribution of those streams that the TAG system can then monitor and analyse to ensure the client delivers pristine signals successfully,” he adds.


THE RESULTS

“The system is being used to monitor streams from several different sports including basketball, women’s football, men’s football, and ice hockey; basically everything within the streaming arena,” explains Mitschelen. “It is being used as an analysing tool as well as a monitoring tool, and it has often helped our client get to the root cause of problems.”

“The TAG system is intuitive and super-easy to install and operate. We had great support from the TAG team who always came through and made everything work together. It’s functioning as smoothly as possible,” he continues. “The system has enabled operators to work from Munich today and New York tomorrow. It doesn’t matter. We currently have people working worldwide and within Germany. One of our colleagues is even travelling with a caravan and working remotely around the world, using Starlink. This off-site capability is giving companies the flexibility to hire the right people even if they are in remote locations. Everything we have done has maximised flexibility with no limitations whatsoever.”

LOOKING FORWARD

“TAG’s flexibility also enhances our ability to deliver the most agile systems,” adds Mitschelen. “If the customer decides to use a signal format that its system doesn’t support today, we can add it easily. It’s no problem.

“We are still learning all the features and functionalities that the TAG platform offers. Our next step is learning to use its automation to relieve the staff of mundane and time-consuming tasks. Instead of sitting around watching signals on a screen, which is not very productive, operators will focus on solutions to issues that might arise.”

He continues: “Going forward, one of the key things I’m evaluating for more intense use is TAG’s API connectivity features. For example, now people create the mosaic layouts themselves and do all the UMDs manually; this shouldn’t have to be done because all the information is in our database. I’d like to use TAG’s API connection to create a perfect layout, fully labelled, every day for each of our operators so they don’t have to repeat the same task every time they sign on.”
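TAG’s actual API calls are not documented in this article, so the endpoint, database schema and field names in the sketch below are invented purely for illustration. It only shows the general shape of the automation Mitschelen describes: build each operator’s mosaic and UMD labels from a scheduling database and push the result to the multiviewer, rather than having operators recreate layouts by hand.

```python
# Hypothetical sketch of the automation described above: pull the day's
# streams from a database and push a pre-labelled mosaic layout to the
# multiviewer over a REST API. Endpoint paths and field names are invented
# for illustration; they are not TAG's documented API.
import sqlite3
import requests

MULTIVIEWER_API = "https://mcm.example.internal/api/layouts"  # placeholder URL

def build_layout_for(operator: str) -> dict:
    # Fetch the streams assigned to this operator from the scheduling database.
    db = sqlite3.connect("schedule.db")
    rows = db.execute(
        "SELECT stream_url, event_name FROM feeds WHERE operator = ?", (operator,)
    ).fetchall()
    # Each tile carries its UMD label straight from the database,
    # so nobody has to type labels by hand at sign-on.
    return {
        "name": f"{operator}-daily",
        "tiles": [{"source": url, "umd": name} for url, name in rows],
    }

def push_layout(layout: dict) -> None:
    requests.post(MULTIVIEWER_API, json=layout, timeout=10).raise_for_status()

if __name__ == "__main__":
    push_layout(build_layout_for("operator-munich-01"))
```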

The project has been a major focus of Munich Media and MITM, but the team looks forward to expanding and offering similar services to other customers. “Down the road, we want to explore and expand our playout-in-the-cloud services, where we could acquire signals as part of the broadcasting world and work in playout with graphics, loudness, audio processing, etc,” explains Mitschelen. “This could be used for different platforms such as YouTube, Facebook... you name it. The people we are hiring now for monitoring and analysing the streams have the potential to do these other functions as well.” n


THE [TONGUE-IN-CHEEK] PYRAMID OF INCOMPETENCE

We regularly hear about how the broadcast industry has a ‘skills shortage’ that needs to be addressed, but it’s often not clear what this actually means in the real world. Much of the industry continues as normal without viewers noticing, but this doesn’t mean there are no structural issues beneath the surface. In particular, the move to IP, cloud and software has exposed the industry skills shortage. IP workflows, and in particular fault-finding in the IP, software and cloud domain, are substantially more complex than traditional workflows such as SDI.

I believe there is a large gap between the ambitions of broadcasters to move to IP/cloud, the overall direction of travel in technology, and the skillset on the ground needed to deliver. For this reason, I created the tongue-in-cheek ‘pyramid of incompetence’, a diagram showing topics in which I feel broadcasters lean too much on manufacturers instead of trying to understand these topics themselves.

It is a parody of the EBU’s Technology Pyramid of Media Nodes, which shows vendor compliance with ST 2110 and other technical standards. Essentially it turns the pyramid on its head; instead of broadcasters scrutinising vendors, it’s a vendor scrutinising broadcasters.

Whilst this pyramid was intended as a joke, it seems to have prompted a lot of industry comment; perhaps because these concerns have been simmering under the surface while it has been considered uncouth for vendors to comment on their potential customers. But again, I would argue serious conversations like this are exceptionally important as the industry goes through this unprecedented transition.

System architecture and complexity in the IT-centric world far exceeds that of legacy hardware and appliance-based pipelines. In many broadcasters, the immediate reaction whenever there is an issue of any sort is to press a figurative ‘big red button’ (where available) to switch to a backup and then immediately call up a vendor and say ‘it’s not working’. And to be fair, in the SDI world, this was a reasonable thing to do. If there was a problem with SDI from a device, the chances are it was the fault of that device and something the vendor needs to know about.

But IP and cloud are different; there’s a patchwork of different systems and workflows out there that require careful and methodical monitoring and fault-finding. And yet among a small but vocal minority, fault-finding skills are non-existent, and so vendors get the ‘it’s not working’ call about a network they didn’t design and build and are expected to troubleshoot. Calling up a vendor about an IP/cloud issue and saying ‘it’s not working’ without making any effort to fault-find is unacceptable. This forces a product manufacturer to be both an IT department and systems integrator to try and solve a fault that’s most likely unrelated to the product. Product support teams are there to solve product issues (providing recurring value to the vendor), not to run an IT department for customers that will break even at best.

One of the existential challenges many broadcasters face is the move from big-iron hardware to IP and then to cloud. But IP and cloud require a complete change in mindset. This isn’t like the change from analogue to digital or SD to HD; it’s a complete step-change in all technological aspects of broadcasting. The competitors to linear broadcasting – Netflix, YouTube, TikTok, etc – are all cloud-native to begin with. As the pyramid shows, some broadcasters struggle with basic concepts such as TCP vs UDP or unicast vs multicast. There’s no chance of ever moving to the cloud if broadcasters struggle with the basics, and it’s not going to get easier as technology progresses. Netflix is carrying out research on TCP performance and network optimisation, yet in broadcast we are still struggling with the first lecture of a networking course.
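To make the point concrete, the ‘first lecture’ basics really are only a few lines of code. The sketch below, using nothing but Python’s standard library and placeholder addresses, joins a multicast group and receives UDP packets; unlike unicast TCP, anything lost in transit simply never arrives, which is exactly why methodical monitoring and fault-finding matter.

```python
# Minimal sketch of the "first lecture" basics: receiving a UDP multicast
# stream with the Python standard library. The group and port are
# placeholders; in a real plant they would come from the SDP or NMOS data.
import socket
import struct

GROUP, PORT = "239.1.1.1", 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the multicast group: this tells the network to send us traffic
# addressed to the group, not just to our own unicast address.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    packet, sender = sock.recvfrom(2048)
    # Unlike TCP, UDP gives no delivery guarantee: lost packets simply
    # never arrive, so the receiver has to detect and handle the gaps.
    print(f"{len(packet)} bytes from {sender}")
```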

So what’s the solution? Training and a cultural shift. Engineers on the ground should be empowered to fault-find themselves in a blame-free culture. It’s easy to just reboot a device and say ‘that fixed the issue’, but was it actually something more complicated in the architecture? For vendors, the current economic situation will likely put an end to the ‘it’s not working’ calls as CFOs find it unreasonable to run loss-leading support departments, which actually function as IT departments for their clients.

One other possible option of course is for broadcasters to use managed service providers. This is especially beneficial for smaller broadcasters where a critical mass of engineering knowledge may not exist. There is a wide range of managed service providers out there covering all or just a portion of a given technical workflow in broadcasting.

This diagram was intended to spark discussion at a conference but at the same time many a true word is spoken in jest. The skills shortage, unfortunately, isn’t a joke. n



PREVIS, POSTVIS AND A BEAR HIGH

Whether you see it as a nature documentary with a serious message, or one of the best films ever made about a bear on illegal drugs, no animals were harmed in the making of this year’s horror comedy Cocaine Bear. Kevin Emmott talks to Halon Entertainment’s Brad Alexander about what it takes to breathe life into the character, how pre- and post-vis can benefit the creative process, and how to turn a rat into a bear using groom maps

“THIS IS A TRUE STORY”

Brad Alexander has worked on some of the biggest sci-fi blockbusters of all time. Starting his career in film on the Star Wars prequels at the Skywalker Ranch, he has gone on to work on multiple action flicks. By his own admission, flying spaceships into massive space battles is probably his favourite thing, “but then again, that’s probably everyone’s favourite thing in this industry.”

So, when he was offered something more realistic and down to earth, something rooted in nature and with a bit more heart – a true story, no less – what did he do?

Well, he bit their arm off.

Alexander is senior previs supervisor at Halon Entertainment in Santa Monica. He met business partner Dan Gregoire at the Skywalker Ranch, and they started Halon Entertainment in 2003 and haven’t looked back since: “Halon has been a sweet success,” he says. “Starting with the Star Wars prequels, we’ve had the opportunity to work on Avatar (I was a CG supervisor for four years), and then Star Wars: The Force Awakens came back around. We’ve worked on lots of fun titles.

“And then, after about 25 movies, we were offered Cocaine Bear. I was helping out on another project and I got an instant message which simply said, ‘Hey, wanna work on a cocaine bear?’”


“It was totally unexpected,” he adds, “but anything with that name will attract curiosity.”

ADDING VALUE WITH VIS

Celebrating 20 years in the business, Halon Entertainment is a visual effects studio which provides a full suite of visualisation services, including previs and postvis, for a range of commercial clients. Previs, or pre-visualisation, helps a filmmaker choreograph the final look and feel before getting on set. It can reduce the number of set-ups and re-shoots, speed up the production process and help to develop the narrative. It is fundamental to maintaining the integrity of the filmmaker’s artistic vision.

“Previs gives you scope to explore camera positions and to change the set-up because at this point the set hasn’t been built,” says Alexander. “It’s a great conduit for information; once previs is locked, sets are built and shoots are based around it.

“Postvis happens after shooting. We use the filmed plates and we 3D-track their footage, and then we’ll put our visual elements back into the shots to make sure they work. That way, when filmmakers communicate their ideas to the final visual effects vendors, they can be very clear about exactly what they want.”

“Everything we provide to a client should help them simplify a process in the future,” he continues. “Previs means that when they go shoot it in real life, they know exactly what to put in the shot. We can even dial down to the exact lens on the camera because our camera accuracy is so precise that it knows what will be inside or outside the viewfinder. We can define what sets to build and to what dimensions, and what the characters need to do to interact on set; everyone knows exactly what to do as soon as they get there.”

Previs also speeds up the postvis process because everything has already been mapped out, which in turn makes postvis more forgiving; in previs, you are still figuring all of that out. With a director as hands-on as Elizabeth Banks was on Cocaine Bear, it was even more important for Alexander and the Halon team to be on the same page. Luckily, this wasn’t their first picnic.

GETTING HANDS-ON

“The first thing that we do for any client is get their vision out first, and if we have time to do a couple of extra shots that we think might be cool, we’ll throw them into editorial,” says Alexander.

“On Cocaine Bear we would have daily meetings and Elizabeth was in most of them. She would mimic the bear, she would shoot YouTube videos and say, ‘let’s do the bear moving like this’. She was ‘on it’; it was great working with her. She knew exactly what she wanted.”

“I remember her calling the special effects and prop company and being very verbal about matching the previs exactly,” he adds.

The film really is based on a true story; a 175-pound black bear did ingest a huge amount of smuggled cocaine in Georgia in 1985, although in real life the drugs killed it in less than an hour.

With some footage already shot and a tight production schedule to keep, Alexander had to move fast to get Cocaine Bear to the screen, and so he took inspiration from the actual cocaine bear which is taxidermied at the Kentucky Fun Mall in Lexington. While he admits there was huge creative licence taken with the final design, the early version was even more horrific.

BEARING UP

“It looked like a giant rat,” laughs Alexander. “With any project, we need time to create the assets that are going to be in the vis, and we used an online model to get up and running quickly. Changing it into something more natural was a constant development.”


Fur is notoriously difficult to render, but it was important to get right, even for the previs. Even though the raw components gave the production team everything they needed to start shooting, there was no emotional attachment to the content. The team needed to bear-up.

Alexander agrees: “That was Elizabeth’s first reaction too! We had to have something to animate in low fidelity to get our ideas across, but the first render that she saw where we had the fur on the bear, she was sold.

“We relied heavily on Unreal Engine for a fur plugin, we used a lot of ZBrush for asset creation and Adobe After Effects for compositing. We did numerous iterations using groom maps, which are black and white texture maps that can be placed on the geometry of a character. White represents the longest length and black represents the shortest length, and the map would be constantly altered to make the bear look more natural.”
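As a rough illustration of the groom-map idea, the sketch below maps an 8-bit greyscale map to per-texel strand lengths. The linear mapping and the length range are assumptions made for the example; the Unreal Engine fur plugin Halon used has its own parameters.

```python
# Minimal sketch of the groom-map idea described above: a greyscale texture
# where white drives the longest fur and black the shortest. The linear
# mapping and length range are assumptions for illustration, not the exact
# behaviour of the Unreal Engine fur plugin.
import numpy as np

def fur_lengths(groom_map: np.ndarray, min_len: float = 0.5, max_len: float = 12.0):
    """Map 8-bit greyscale values (0-255) to per-texel fur lengths in cm."""
    normalised = groom_map.astype(np.float32) / 255.0
    return min_len + normalised * (max_len - min_len)

# Example: a tiny 2x2 map; 0 gives the shortest strands, 255 the longest.
print(fur_lengths(np.array([[0, 128], [200, 255]], dtype=np.uint8)))
```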

BRINGING THE STORY HOME

Previs and postvis contribute to the development process by encouraging storytelling, helping with planning, communicating everything to the crew and providing many opportunities for refinement before the big bucks are spent. They save time and money and ensure the filmmakers’ artistic vision is translated to the screen.

“Everyone loves animating spaceships, robots, and explosions,” says Alexander. “But this was very different and with a character you can have a lot of fun with. To have an opportunity to animate a coked-out bear was a blast, because what would a bear do on coke? There’s no real reference point for that.” n


A PROMPT REFRESH

Following Autocue’s recent refresh of its entire range, Jenny Priestley talks to Robin Brown to find out what it means for broadcasters

Since it first launched onto the market in 1908, Hoover has become the catch-all phrase for any kind of vacuum cleaner. The same can be said for Autocue and prompters. While Autoscript is the prompter used by tier-one broadcasters, Autocue is still popular with local newsrooms and regional broadcasters.

At NAB 2023, Autocue announced it was shaking up its entire range of prompters with a focus on speed, simplicity and sustainability. The company’s reasoning behind the refresh was that the range had been left to “stagnate” over the past few years, explains Autocue and Autoscript product manager, Robin Brown. “It was just sort of left to sort of muddle along with little support and no additional updates,” he adds.

The big feature of the relaunch is that Autocue has moved to an annual subscription model. Previously, companies would buy the software outright; now, Autocue customers pay an annual charge but won’t have to pay for additional licences. “If broadcasters want to buy products which are slightly more cost-effective, and are not interested in using an ethernet workflow, then they can buy Autocue,” explains Brown. “The functionality is pretty much the same. The software looks a lot cleaner, and it’s a lot simpler.

“When a broadcaster buys the product they get a single package with everything in it, including HDMI and power cables,” he continues. “It’s a very simple out-of-the-box experience. You get a little scratch card with your software and that’ll have a number on it. Go to the Autocue website, put your scratch card number in, enter your details and then when you connect your laptop, it will automatically check that you have a licence and off you go.”

Users can also move their licences to a different machine. For example, if a rental house has a licence but needs to move it to another piece of hardware, they can de-register the original machine and then register the new one without needing to speak to Autocue.

Sustainability has also been a focus of the refresh, and it’s something that Autocue says it is helping to lead the broadcast technology industry on. The company’s base in Bury St Edmunds is completely powered by solar panels, while the factory in Costa Rica where the products are built also has solar panels on its roof. “We’re using cardboard instead of plastic for all the packaging,” adds Brown, “so when you open it, there are no bags or anything like that. It’s all cardboard.”

The new kit also consumes less power and produces less heat, meaning that its components last longer.

Autocue hopes this refresh will see the company re-entering the prompting market with a higher-quality product that is more resilient. “The problems with different sizes of cameras and different lenses have always meant that it’s kind of hit and miss whether something’s going to fit,” Brown adds. “With the new Autocue range, we have made all the components adjustable. If you want to have a great big talent monitor on the front with a clock and all the rest of it, you can have that. If you just want to have a simple piece-to-camera, so it’s a relatively small prompter unit, you can use a very simple set-up. All the adjustment and all the flexibility is built into the range.” n


NEXT GENERATION AUDIO PREPARES TO GO MAINSTREAM

Larry Schindel, senior product manager at Linear Acoustic, takes a look at what the future holds for broadcast audio

The adoption of next generation audio (NGA) is increasing now that the world has largely moved past the Covid-19 pandemic. In some cases, broadcasters are picking up projects they were in the middle of when the pandemic hit and were forced to change gears in order to stay on air. In other cases, broadcasters are starting new projects incorporating NGA.

Sporting events are the driving force for real-time, linear NGA services, and NGA is typically paired with 4K/UHD video. Just like in the early days of HD video and 5.1-channel surround sound, this is regarded as premium content, and broadcasters are able to charge extra for such services. Within the next few years as adoption increases, it will become more mainstream. New deployments often ramp up around events such as the Olympics or the World Cup, and in fact, both of these events are now being produced in immersive audio by the host broadcasters.

Both the amount of content being produced in immersive audio and the demand for it are growing in many regions around the world. Broadcasters in North America, South America, Europe, the Middle East, and Asia are creating immersive content for distribution across cable, satellite, and OTT services. In the United States and several other countries, the ATSC 3.0 system is capable of supporting NGA on terrestrial broadcasts, but broadcasters are not yet taking full advantage of the available features. This is, in part, due to regulations requiring them to simulcast content that is on their ATSC 1.0 services.

While immersive audio is the first thing most people think of when they talk about NGA, object-based audio – which provides an easy and efficient way of handling personalisation – and dialogue enhancement are also key features. Dialogue enhancement is an inherent part of the NGA codecs (MPEG-H and Dolby AC-4) and requires no additional effort on behalf of the broadcasters to make it available to viewers.

Some broadcasters in the US and Europe are experimenting with object-audio based personalisation, and while this isn’t being broadcast on a regular basis yet, successful tests have been performed from the venue all the way to the viewer. Their goal is to have this aspect of NGA deployed before next summer’s sporting season.

For those unfamiliar with object audio-based personalisation, it represents a different approach to how the elements are delivered and mixed. By transmitting the base music and effects mix (M&E) and dialogue tracks separately within the same bitstream, the NGA receiver is able to mix them together based on choices made by the viewer. This approach, coupled with the higher efficiency of the new MPEG-H and Dolby AC-4 emission codecs, saves bandwidth compared to transmitting multiple complete mixes containing different dialogues while providing a better experience for all viewers regardless of which language, team announcer, or AD service they prefer.

The broadcaster determines which dialogue options are available to the viewer and can, if they choose to, offer additional controls to the viewers for things like adjusting the level of the dialogue, or choosing where the dialogue is placed in the sound field. It is possible for the broadcaster to set limits on these controls, and choose logical default values to ensure the viewer has a good experience without having to take any action when they tune in.
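Conceptually, the receiver-side mix is straightforward, even though the real work happens inside an MPEG-H or AC-4 decoder. The sketch below is an illustration only, not decoder code: the music-and-effects bed and the dialogue objects arrive separately, the viewer’s chosen dialogue is mixed in at playback, and the dialogue gain is clamped to the limits the broadcaster carries as metadata.

```python
# Conceptual sketch only (not MPEG-H or AC-4 decoder code): the receiver
# holds the music-and-effects bed and several dialogue objects separately,
# then mixes the viewer's choice at playback, clamped to limits the
# broadcaster carries as metadata.
import numpy as np

def personalised_mix(m_and_e: np.ndarray,
                     dialogues: dict,
                     language: str,
                     dialogue_gain_db: float,
                     min_db: float = -6.0,
                     max_db: float = 9.0) -> np.ndarray:
    # Broadcaster-set limits keep viewer adjustments within a sensible range.
    gain_db = max(min_db, min(max_db, dialogue_gain_db))
    gain = 10.0 ** (gain_db / 20.0)
    return m_and_e + gain * dialogues[language]

# Example: one second of silence as the bed, English dialogue raised by 3 dB.
bed = np.zeros(48000)
mix = personalised_mix(bed, {"eng": np.zeros(48000)}, "eng", dialogue_gain_db=3.0)
```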

These settings are carried as metadata in the final bitstream as transmitted to the viewer. This metadata is typically static throughout the programme, but the industry is also migrating to a new form of linear and streaming metadata called ‘serial ADM’ (sADM), which can be incorporated into NGA.

sADM is a few years away from being commonplace, and some technical details are still being worked out, but it won’t be long until these are resolved. Some early experiments and test broadcasts using sADM have been performed in Europe, and the EBU hosted a conference last autumn to discuss sADM, with technology providers demonstrating it working in practice.

Despite pandemic-driven delays and slower-than-expected deployment in some parts of the world, Europe continues to lead NGA adoption. Meanwhile, other regions of the world are gaining momentum as more supported content becomes available, broadcasters and distributors invest in the necessary infrastructure, and viewers become aware of its many benefits. n


SERVING AN ACE

Formed in 1999, ATP Media is the global sales, broadcast production and distribution arm of the ATP Tour, the elite worldwide men’s professional tennis circuit. It holds the digital broadcast rights for 64 global tennis tournaments across 200 broadcast territories, involving up to 100 broadcasters serving a worldwide audience of more than 800 million. Through its digital archive site, ATP Tour Archive, thousands of hours of both match and non-match action are available to rights holders and third parties. This includes everything from the original live coverage to interviews, highlights and analysis. However, having amassed such a vast archive of historical tennis footage, ATP Media was presented with the unique challenge of re-invigorating content that had been sitting on shelves for decades. To serve both existing and potential audiences, ATP Media quickly realised that properly preserving its content library was key to making sure it continued to be accessible and could even open up potential future monetisation avenues.

To do this, ATP Media first wanted to focus on migrating its iconic ATP 250 and ATP 500 Series tournament footage to a file-based format. With footage from 1990 through to 2009, these matches featured top tennis legends in action, as well as the emerging talent who would themselves go on to eclipse all who had gone before them.

Because of the sheer volume of media stored in older legacy formats, ATP Media decided to host its archives in an entirely new cloud location, managed via a media asset management (MAM) platform. This is where Memnon’s UK team (previously known as LMH) stepped in as the media migration partner for the project.


DIGGING THROUGH THE CRATES

The advent of the ATP Tour in 1990 saw the reinvention of tennis as a brand, taking households across the world by storm. ATP enjoyed increased success on the back of a 1991 broadcasting deal for 19 tournaments and was aided by the uniquely compelling storylines and intense rivalries that played out on the court. The ’90s witnessed the US trio of Pete Sampras, Andre Agassi and Jim Courier all vying for supremacy. This gave way to the 2000s, which saw major title-deciding battles between Roger Federer, Rafael Nadal, Novak Djokovic and Andy Murray. These two significant eras of tennis delivered the much-needed televised action outside of the Grand Slams that fans had been wanting for years.

With such a wealth of broadcast programming and recorded content tucked away, ATP Media needed a workflow that could inspect, catalogue and preserve its vast repository of archived material. While the industry has transitioned into a digital era where file-based formats are used almost exclusively, most of ATP’s content was stored on legacy-format media. Previously, most assets – predominantly recordings of quarter-finals, semi-finals and finals of the ATP 250 and ATP 500 tournaments stretching as far back as the ‘90s – were held on SD Digital Betacam with some HDCAM tapes. This made specific metadata requests complex, often causing delays in delivery due to the manual process involved in searching for specific content. On top of that, most of the footage was stored in formats that required machines that were long since obsolete and becoming harder to come by.

Memnon was enlisted to offer its full range of content services, ranging from library migration to file-based content services such as quality control (QC) and standards conversion (SC).

ESTABLISHING THE RALLY

In partnership with ATP Media, Memnon had to devise a multifaceted workflow with a strong quality control ethos. Due to Memnon’s emphasis on QC and clean-up remedial work to achieve the best possible outcome during ingest, ATP’s large volume of content remained high-quality so it could be used for years to come.

With a live archive already commercially available, a lot of flexibility was needed to deal with requests that had to take priority, as well as to deliver to a diverse set of endpoints. The workflow itself included a full clean and physical inspection of every single tape using Memnon’s Indelt TC-Matic Betacam tape cleaner, which identifies physical defects on a tape’s surface and cleans the tape, removing as many flaws as possible prior to ingest.

Once the tapes were barcoded (asset labelling) they were then loaded into eight Sony Flexicart robotic cassette changing units. The Flexicarts were under the control of the MediaFlex Mediacart application. The migration from the physical domain to a file domain then took place.

Mediaflex used the RS422 remote control interface and the imported migration schedule to place the videotape recorders (VTRs) into play at the correct timecode point. The SD-SDI signal was output from the VTRs into a production router, which in turn fed a Telestream LiveCapture encoder that generated the DNx.mxf and IMX50.mxf files required. The ingest was carried out using SERVOLOCK mode, which enabled the file recording to be automatically started and stopped without the need for any human pre-qualification intervention.

Memnon went a step further, collating record reports into one automated XML report via a Python script that combined all the pieces of metadata into a single export file. The delivery was a PDF file that contained images of the tape case, tape cassette, paperwork, Vidchecker report, ISR XML, MD5 hash, ATP descriptive metadata and the cleaning report.
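Memnon’s actual script and report schema are not published, so the sketch below is only a generic illustration of that kind of collation using Python’s standard library; the file names and field names are assumptions.

```python
# Illustrative sketch of the kind of collation script described above,
# using only the standard library. File names and field names are
# assumptions; Memnon's actual script and report schema are not published.
import xml.etree.ElementTree as ET
import hashlib
import json

def collate_record_report(tape_id: str, isr_xml_path: str,
                          descriptive_json_path: str, media_path: str) -> str:
    root = ET.Element("recordReport", attrib={"tape": tape_id})

    # Pull the ingest status report (ISR) in wholesale as a child element.
    root.append(ET.parse(isr_xml_path).getroot())

    # Add the descriptive metadata fields supplied by the client.
    with open(descriptive_json_path) as fh:
        for key, value in json.load(fh).items():
            ET.SubElement(root, "field", attrib={"name": key}).text = str(value)

    # Record an MD5 checksum of the generated media file for fixity checking.
    with open(media_path, "rb") as fh:
        ET.SubElement(root, "md5").text = hashlib.md5(fh.read()).hexdigest()

    out_path = f"{tape_id}_report.xml"
    ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)
    return out_path
```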

SERVING UP FUTURE OPPORTUNITIES

Investment in upgrading archive libraries is crucial for both future preservation and monetisation opportunities. With Memnon, ATP Media was able to continue preserving its tennis heritage, increasing the longevity of its media assets, and reducing overall costs associated with physical storage.

With an explosion of content driven by SVoD/TVoD providers such as Netflix, Amazon Prime Video, and Disney Plus, as well as video assets having more impact on social media platforms than ever, there’s a huge market ripe for leveraging video assets. n


MODERN TECHNOLOGY MEETS ANCIENT HISTORY

Orchard Clips aims to help producers find video clips for and about the Middle East. All of the content available has been filmed using the latest technology by its network of filmmakers and is available to content creators around the world. Luke Smedley, head of Orchard Clips, explains more

CAN YOU GIVE US SOME BACKGROUND TO THE BUSINESS?

Orchard Clips is a newly-launched footage sales business focused on footage from and about the Middle East and North Africa. Our core collection comes from the archives of our sister company, OR Media, a London-based TV and feature production company that has been making documentaries about MENA for the last 30 years. We are also working with content owners across the region to make their footage available to creatives worldwide and commercialise their collections.

MENA is a fascinating region with amazing diversity. It has immense history, yet some parts are thoroughly modern. It has both grinding poverty and untold wealth. These are countries with increasing confidence and influence on the world stage. And, whether it’s through geopolitics, sports or culture, the world is taking more interest in the engrossing and unexpected stories of the region. Unfortunately, that diversity is not reflected in the footage available in the marketplace.

It’s often difficult to find good quality, authentic-looking footage of the region. The quantity and diversity of clips in the major footage agencies are often limited and skewed towards news with an international perspective. And local broadcasters in the region often don’t have the expertise and resources to commercialise their own archives. Our aim is to bridge the gap so that programme-makers and creatives can easily access diverse and interesting footage from the region, and broadcasters and other content owners can better manage and commercialise their archives.

HOW BIG IS THE TEAM AT ORCHARD CLIPS?

We’re a team of five right now: two experienced content curators, Sanja Adamovic and Martin Benedyk, with backgrounds including ITN and AP; Paul Maidment, ex-director of BBC Motion Gallery, on business development; and recently appointed head of sales Greg Aslangul, who has extensive experience at stock image agencies in the Middle East.

WHAT DOES ORCHARD CLIPS OFFER BROADCASTERS?

Our main offering is our collection of clips. They range from beautiful cityscape timelapses to intimate fly-on-the-wall shots and extensive interviews with important and influential figures within the MENA region. We currently have nearly 15,000 clips online, with hundreds more added each week. They are available to broadcasters around the world for use in their programmes and news bulletins. We offer a variety of business models, from single-clip purchases through to unlimited access subscriptions.

For broadcasters within MENA we also offer archive management, cataloguing and archive consultancy services as well as commercialising archive collections.

WHAT TECHNOLOGY ARE YOU USING?

We are entirely cloud-based. Our website and media asset management system is provided by our technology partner, Veritone. Their best-in-class tools provide a simple and user-friendly process for our customers, from initial search and discovery to full ecommerce for the final clip purchase.


HOW ARE THE CLIPS STORED, AND HOW DO PARTNERS DELIVER ASSETS TO YOU?

All of our clips are stored within Amazon S3, and our partners deliver all assets to us digitally.
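As a rough sketch of what digital delivery into S3 can look like using AWS’s boto3 SDK (the bucket name, key prefix and file names are placeholders, and Orchard Clips’ actual delivery mechanism is not described beyond the above):

```python
# A minimal sketch, assuming delivery into an S3 bucket with the AWS boto3
# SDK; the bucket name, key prefix and file names are placeholders, and the
# real delivery workflow is not described beyond "digitally".
import boto3

s3 = boto3.client("s3")

def deliver_asset(local_path: str, clip_id: str,
                  bucket: str = "orchard-clips-ingest") -> None:
    """Upload a finished clip so it can be picked up by the MAM for cataloguing."""
    s3.upload_file(local_path, bucket, f"incoming/{clip_id}.mxf")

deliver_asset("interview_riyadh_4k.mxf", "clip-000123")
```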

HOW ARE YOU DIGITISING CLIPS THAT WERE ORIGINALLY ON TAPE?

We are lucky that OR Media went through a programme to digitise all of their videotapes a few years ago. We also have a network of suppliers who can carry out digitisation services should any partners have legacy collections on videotape or film.

WHAT QUALITY ARE THE CLIPS AVAILABLE IN?

As you’d expect from a legacy archive, there’s a variety of resolutions. Our oldest material is SD, and there’s quite a lot of HD too. OR Media is currently producing a slate of documentaries with unparalleled access to Saudi Arabia’s Vision 2030 transformation programme. This is all being beautifully shot at 4K DCI resolution and will be available from Orchard Clips shortly.

IS THERE A LIMIT ON THE LENGTH OF THE CLIPS THAT YOU CAN STORE?

We’ve not really reached a limit on clip duration yet, which is quite a good thing because many of our interviews are an hour long or more.

ARE YOU CURRENTLY EMPLOYING AI FOR CREATING METADATA?

Yes, we are using AI. One of the reasons we partnered with Veritone is that they are fundamentally an AI-focused business. They have access to a suite of AI engines from all of the major players, which means they can deploy the best ones for any given requirement. We are currently using AI for object recognition, face recognition and transcription in a number of languages. We will shortly roll out AI-based translation so that customers can also work in Arabic. And we are looking at how we can integrate ChatGPT or other chatbots into our processes.

Although AI technology is impressive and helps speed up our processes, I’m firmly of the belief that we still need expert archivists to provide context and nuance. Our curators work alongside the AI and add important information that highlights what is so valuable about our clips.

DO YOU HAVE ANY PLANS TO EXPAND INTO EUROPE?

Although our content is very much focused on the Middle East, we expect our customer base to be worldwide. Many of the most talented documentary makers and creatives are based in Europe and we see them as being vital to the success of our business. There are ties that bind Europe to MENA, whether that’s from colonial pasts or current business relationships. And that means that there are many fascinating, sometimes controversial, and occasionally surprising stories that we can help European producers and broadcasters to tell. n


ALIGNING VIDEO TRANSPORT COSTS WITH CONTENT VALUE USING SRT

In today’s digital era, the continued adoption and increased bandwidth capacity of unmanaged internet networks has revolutionised the way we can transport broadcast-quality video. At the forefront of this transformation is the Secure Reliable Transport (SRT) protocol, an open standard solution that has gained immense popularity across the media industry. However, until recently, limitations in the capacity, density, and flexibility of commercially available SRT solutions hindered its widespread adoption in high-end video production and distribution. Hardware-accelerated SRT changes this by providing SRT solutions capable of delivering the power and robustness required by media and entertainment organisations.

THE EVOLUTION OF SRT

Initially developed by Haivision in 2013, SRT emerged as a protocol for the transportation of high-quality video over unpredictable networks, such as the public internet. Since its inception, it has become the de facto standard for video transport over ‘lossy’ networks, offering numerous advantages and capabilities over other protocols. With its ability to enable broadcast-quality video transmission over inexpensive unmanaged internet networks, end-to-end encryption, efficient packet loss protection, and firewall traversal capabilities, SRT has transformed the way media companies can approach video delivery and opens up exciting future use cases.

While the benefits of SRT are widely known, existing solutions were until recently primarily server-based with limited capacity (less than 500 Mbps), restricting their usage to small-scale live events and content reception of niche channels for operators. However, as the cost savings and operational simplicity of SRT became apparent, media and entertainment companies realised a need for robust, high-density, and flexible carrier-grade solutions that seamlessly integrate with their existing workflows. For this reason, hardware-accelerated SRT has emerged as a solution addressing these limitations and unlocking new opportunities for using SRT in low latency, high bit rate premium content production environments.
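As a small illustration of how lightweight SRT is to work with in practice, the sketch below pulls a feed from a remote encoder using an ffmpeg build compiled with libsrt. The host, port and passphrase are placeholders, and the option names should be checked against the documentation of the ffmpeg build in use.

```python
# A minimal sketch, assuming an ffmpeg build compiled with libsrt: pulling
# an SRT feed from a remote encoder and re-wrapping it for local use. Host,
# port and passphrase are placeholders; check option names against the
# documentation of your ffmpeg build.
import subprocess

SRT_URL = "srt://encoder.example.net:9000?mode=caller&passphrase=CHANGE_ME"

subprocess.run(
    [
        "ffmpeg",
        "-i", SRT_URL,   # SRT handles encryption and packet-loss recovery
        "-c", "copy",    # no re-encode: just re-wrap the compressed stream
        "-f", "mpegts",
        "received.ts",
    ],
    check=True,
)
```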

THE BENEFITS OF HARDWARE ACCELERATION

Hardware-accelerated SRT brings significant cost savings and operational efficiencies. In contribution scenarios, traditional SRT gateways could only support a limited number of cameras before requiring additional servers. In contrast, a single rack unit enabled with hardware-accelerated SRT can effortlessly handle up to 22 UHD camera feeds compressed with JPEG XS, resulting in substantial opex savings. In distribution, this second generation of SRT solutions provides the most cost-effective channel transmission over the public internet, empowering operators to replace expensive satellite links and dedicated fibre circuits. Furthermore, the use of high-density rack units as an SRT gateway offers massive scalability and significantly reduced space, power consumption, and overall costs.

This is where the real difference of SRT comes into play: the exponential growth in the potential use cases for transporting video over the public internet. It’s not that long ago that the only solution for transporting large groups of live channels was satellite, with all its associated costs. A decade or so ago, leased circuits would have been the only alternative to use. Now, thanks to SRT, it is possible to slash channel bouquet transport costs by using the public internet; something unimaginable even three years ago.

SHAPING THE INDUSTRY STANDARD

The increasingly undeniable cost advantages of SRT and the public internet will define the future of media delivery. Thanks to ongoing optimisation efforts and hardware acceleration, SRT will continue to evolve, enabling the public internet to be leveraged in ways that were once inconceivable. While the future of video delivery remains uncertain, one thing is clear: SRT and the public internet offer increasingly irresistible cost advantages for most video transport use cases.

As the industry faces the challenge of meeting consumer demands for more choice while managing costs, the implementation of SRT provides an attractive solution, loved equally by CFOs and CTOs. n


EXPANDING ST 2110 BROADCAST HORIZONS

The SMPTE ST 2110 standard was purpose-built to help migrate from SDI to IP and was designed to sustain performance at scale.

With ST 2110, broadcasters are no longer beholden to a large matrix switcher with a fixed number of physical ports. Instead, they can connect and cascade an unlimited number of devices on the IP network. That’s what creates scalability.

SMPTE ST 2110 is well-designed for people who need to create IP broadcast environments. For those in the pro AV market looking to move to an IP environment, it can be advantageous to borrow useful bits from ST 2110 while shedding some of the broadcast-specific requirements.

ENHANCEMENTS THAT MAKE ST 2110 MORE FLEXIBLE

Thankfully, SMPTE ST 2110 was just the beginning. Since broadcasters began implementing it, industry groups have been working to expand its utility. The standard has evolved so that people can borrow all the benefits of ST 2110 in lighter and more flexible ways.

AMWA [Advanced Media Workflow Association] created NMOS (Networked Media Open Specifications) for device control, which many ST 2110 users have adopted into their workflows. NMOS has greatly facilitated asset discovery and asset management.

Then came IPMX, an ST 2110-based media-over-IP standard targeting the pro AV industry. Using ST 2110 as a starting point, the Video Services Forum [VSF] added elements that would serve the pro AV market. (This suite of technical recommendations is called VSF TR-10.)

IPMX expands the scope of ST 2110, rolling it up with NMOS and the new VSF TR-10 protocols. Modifications to the ST 2110 protocol enable IPMX to work with relatively inexpensive network equipment typical of enterprise-class installations.

WHY IPMX FOR BROADCAST?

Broadcasters are taking notice of IPMX and how it can fit into their facilities. Those who installed ST 2110 environments have almost all organically adopted NMOS as their control protocol. To remove any question about device control, IPMX products must support NMOS. This introduces a higher likelihood that assets could be discoverable and easier to network together, because they all do it in a common way. Also, IPMX equipment must support environments with and without PTP. This means all ST 2110 transmitters can be used in IPMX networks, and all IPMX receivers can be used in ST 2110 networks, creating a cross-compatibility that may address the transport needs of some in the broadcast space.
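As a sketch of what that ‘common way’ looks like in practice, the snippet below lists receivers from an IS-04 query registry and then stages a sender onto one of them through the node’s IS-05 connection API. The registry and node addresses are placeholders, and the API versions and payload details should be verified against the published AMWA specifications and the equipment in use.

```python
# A hedged sketch of the NMOS discovery-and-connection pattern: list
# receivers from an IS-04 query registry, then ask a node's IS-05
# connection API to take a sender's stream. Addresses are placeholders;
# verify versions and payload details against the AMWA specifications.
import requests

REGISTRY = "http://registry.example.local/x-nmos/query/v1.3"             # placeholder
NODE_API = "http://receiver-node.example.local/x-nmos/connection/v1.1"   # placeholder

def list_receivers():
    return requests.get(f"{REGISTRY}/receivers", timeout=5).json()

def connect(receiver_id: str, sender_id: str, sdp_url: str):
    # Stage the new sender on the receiver and activate it immediately.
    patch = {
        "sender_id": sender_id,
        "master_enable": True,
        "activation": {"mode": "activate_immediate"},
        "transport_file": {
            "type": "application/sdp",
            "data": requests.get(sdp_url, timeout=5).text,
        },
    }
    response = requests.patch(
        f"{NODE_API}/single/receivers/{receiver_id}/staged", json=patch, timeout=5
    )
    response.raise_for_status()
    return response.json()
```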

WHERE COULD BROADCASTERS USE IPMX?

Since IPMX is compatible with ST 2110, broadcasters can pepper their ST 2110 infrastructure with IPMX devices where it makes sense, knowing IPMX gear will be compatible with both. For example, since not everything in the facility must be synchronised, broadcasters could share content from the more advanced PTP network with other parts of the business running on simpler networks; such as sending media to an executive’s office for monitoring.

IPMX would also be beneficial for smaller broadcasters with fewer engineers and tighter budgets, as it would enable them to install and support networks analogous to what may already be used in the facility. It offers a way to start the SDI-to-IP migration with fewer perceived challenges. Because IPMX is compatible with ST 2110, broadcasters can gradually build towards a full ST 2110 experience while retaining the benefits of their IPMX equipment.

IPMX will help broadcast facilities integrate non-native broadcast signals such as KVM outputs, webcams, and Zoom sessions into their broadcast environments much like scan converters did, without the need for dedicated equipment. Because IPMX supports both genlocked and non-genlocked (asynchronous) devices, broadcasters that use IPMX have the flexibility to support asynchronous sources. Device manufacturers can choose to offer an option to re-sync any IPMX signal to an ST 2110 clock by adding a small synchronisation delay, should the downstream workflow demand it.

IPMX is poised to transform the pro AV industry much like ST 2110 is transforming the broadcast landscape. Cost reduction, signal timing flexibility, and the simplification of network requirements are driving this initiative. Native ST 2110 environments will be the right choice for most broadcast operations, but given the inherent similarity with ST 2110, IPMX will no doubt be used to complement many of these installations for the very same reasons. n


A CLEAR PATH INTO PRODUCTION AND DISTRIBUTION

Prime Focus Technologies co-founder and CEO Ramki Sankaranarayanan talks to Jenny Priestley about the company’s acquisition by DNEG and what it will mean for customers

Well-known as a provider of cloud-based software and artificial intelligence, Prime Focus Technologies (PFT) is entering a new chapter following its acquisition by global visual effects company, DNEG.

While PFT was a subsidiary of DNEG’s parent company, they were siblings who didn’t speak. However, the acquisition by DNEG means that the visual effects vendor will be able to enhance its client service offering thanks to PFT’s media workflow and automation software suite, CLEAR, and purpose-built AI platform, CLEAR AI.

“Historically, the construct was that DNEG does creative services for big Hollywood films and high-end episodic and streaming content, and PFT is more on the technology and distribution side,” explains Ramki Sankaranarayanan, co-founder and CEO of Prime Focus Technologies.

“A lot has changed over the past few years,” he continues, “our customers are consolidating very heavily. What we’ve been hearing from customers really is that part of their transformation journey is looking for economies of scale. So for the first time, we feel like customers are looking at both production and distribution spending to really come together.”

While PFT’s customers are looking to consolidate, Sankaranarayanan believes technology vendors should be as well. “Strategically I think there’s really no argument against it otherwise vendors remain super small as customers are becoming bigger and bigger.

“The scale that DNEG brings to the table and the access to production is big for us. It’s already influencing our product roadmap in the production side. We have some ambitious projects to announce in the next 12 months. It’s going to be quite an interesting period for us.”

INTEGRATING TECHNOLOGY

The acquisition means DNEG will take CLEAR, PFT’s automated content supply chain solution, into the production services stack. Sankaranarayanan believes this will be particularly important for the visual effects industry as it will reduce the amount of time required to work on projects. “If you take any VFX shot, it takes roughly two weeks of activity from the time production finalises the edit and the VFX vendor is able to take on that work,” he explains. “There are many pieces of automation that are required on the production side. Why would DNEG go elsewhere to develop its own technology when a product as robust as CLEAR is available? Access to CLEAR and CLEAR AI was really a motivation for DNEG as to why they acquired PFT.”

CLEAR Clip (above) and CLEAR Reframe

With more and more productions using the cloud and technology such as virtual production, there is an increasing demand for automation, which is where CLEAR will be valuable to both DNEG and its customers. “The volume of VFX work in production is only increasing, and [it] is becoming a pre-production activity as opposed to post production,” states Sankaranarayanan.

He also believes the acquisition gives PFT a key benefit over its competitors. “There is no other vendor like us. It’s a great place for us to come together and very exciting. PFT has its technology CLEAR being used by customers in production workflows already; CBS, Lionsgate and several customers of that kind use CLEAR for production workflows.”

Sankaranarayanan is keen to stress to PFT’s customers that nothing will change. Instead of being a subsidiary of its parent company, PFT will become a division of DNEG. “What is brilliant about this, is that the teams that customers are operating with won’t change. DNEG does none of what PFT does. There are no overlaps, so operationally, nothing changes.”

However, one thing that will change is the innovation for both CLEAR and CLEAR AI. Both solutions will become more relevant to DNEG’s business (the company has been using CLEAR for the past year). “As DNEG expands their growth within the production services space, PFT will be adding CLEAR and CLEAR AI to their offering, which I think is of significance,” states Sankaranarayanan. “Also, as part of DNEG’s relationship with their customers, they can offer PFT’s distribution and technology capabilities.

“I think the innovation journey for CLEAR and CLEAR AI will continue. Being part of a larger company means CLEAR will get a higher capital allocation to make it better. That’s really what I think will come to happen out of this transaction.”

AI DEVELOPMENT

At NAB 2023, PFT announced two new products for social media: an enhanced version of CLEAR AI that includes ChatGPT; and CLEAR Clip, which automates editing and accelerates content creation.

“CLEAR Clip uses AI to curate clips of interest, and that’s typically what editors look for,” explains Sankaranarayanan. “Producers can now make their paper cut very, very quickly, even before they engage the editor because of the power of AI. So I think storytelling is going to become that much better.”

There’s a lot of chatter at the moment about artificial intelligence and its impact on the media industry, and Sankaranarayanan believes the industry is at a ‘eureka’ moment in terms of adoption. “AI now is a dinner table conversation, it affects everybody in some sense,” he adds.

“Generative AI creating imagery is coming and it’s something that we need to be very conscious of. We’ve got to stay ahead to make sure that we are not consumed by AI rather than us consuming it.” n


ELEVATING IP-BASED TELEVISION

Updates to SMPTE 2110 and NMOS specifications put the spotlight on receiver capability detection, security, and adoption of JPEG XS, writes John Mailhot, CTO and director of infrastructure product management, Imagine Communications

The Society of Motion Picture and Television Engineers (SMPTE) introduced the SMPTE 2110 suite of standards for media transmission over IP networks in 2017, which was promptly followed by the Advanced Media Workflow Association’s (AMWA) development of the Networked Media Open Specifications (NMOS) control protocols to support it. Over the years, these standards and specifications have been widely adopted, serving as the foundation for countless successful projects worldwide and becoming essential components of IP-based television facilities.

Recently, SMPTE issued updated versions of the core 2110 standards. These new publications keep the 2110 documents aligned with current industry practices, including some minor revisions to address small issues that have emerged in the field. The revisions include enhancements to the description and management of asynchronous streams, and some clarifications to support the upcoming publication of protocol implementation conformance statements (PICS) for the 2110 standards.

Like SMPTE, AMWA also maintains and updates a set of best current practice (BCP) documents for NMOS. A significant focus of the organisation’s work has been on improving NMOS’ receiver capability detection methods, as addressed in BCP-004-01. This specification allows an IS-04 receiver to express parametric constraints on the types of streams it is capable of handling, which controllers need to know before connecting it with a sender’s stream. The issue arose frequently in the field when broadcasters began deploying UHD content, as much of the existing equipment lacked the necessary processing capabilities.
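In controller terms, BCP-004-01 boils down to a constraint check before connection is attempted. The sketch below is a deliberate simplification with illustrative parameter names rather than the parameter URNs the specification actually defines.

```python
# A simplified sketch of the controller-side check BCP-004-01 enables: a
# receiver advertises constraints on the streams it can handle and the
# controller tests a sender's parameters against them before connecting.
# Parameter names are illustrative, not the URNs defined in the specification.
def satisfies(sender_params: dict, constraint_set: dict) -> bool:
    """Return True if the sender meets every constrained parameter."""
    for param, constraint in constraint_set.items():
        value = sender_params.get(param)
        if value is None:
            return False
        if "enum" in constraint and value not in constraint["enum"]:
            return False
        if "maximum" in constraint and value > constraint["maximum"]:
            return False
    return True

# Example: an HD-only receiver rejecting a UHD sender.
receiver_constraints = {"frame_width": {"maximum": 1920},
                        "media_type": {"enum": ["video/raw"]}}
uhd_sender = {"frame_width": 3840, "media_type": "video/raw"}
print(satisfies(uhd_sender, receiver_constraints))  # False
```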

Another priority has been security considerations. To enable secure control, BCP-003-01 outlines best practices for secure transport in NMOS API communications, while BCP-003-02 provides guidelines for API servers when accepting or rejecting requests based on client authorisation, and BCP-003-03 offers best practices for the automated provisioning of TLS server certificates. In addition to receiver capability detection and security considerations, AMWA is currently working on BCP-005-01, which will provide guidelines for expressing EDID information through receiver capabilities, part of supporting the IPMX ecosystem.

However, while the updates being made to SMPTE ST 2110 and supporting protocols like NMOS aren’t necessarily headline-grabbing, that doesn’t mean that the standard’s functionality isn’t evolving. The SMPTE ST 2110 suite has enhanced its capabilities to cater to adjacent market demands by incorporating support for compressed video streams, including support for the JPEG XS standard as a codec within SMPTE 2110-22.

Offering low implementation complexity, JPEG XS specifies a compression technology with an intrinsic latency of just a few lines, making it highly suitable for applications that require real-time interaction and fine-touch control for processing of video content. The standard is also optimised for visually lossless compression, adhering to the ISO/IEC 29170-2 guidelines for both natural and synthetic images. It achieves typical compression ratios of up to 10:1 for 4:4:4, 4:2:2, and 4:2:0 images, or even higher depending on specific application requirements. With support for various pixel formats, JPEG XS ensures compatibility across diverse image sources and display technologies. It allows for precise bit-rate control, and the end-to-end delay is exceptionally low, equivalent to a fraction of a frame.
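Some back-of-envelope arithmetic shows what those ratios mean for a UHD 4:2:2 10-bit 60p feed (active picture only, ignoring blanking and transport overheads; the figures are approximate):

```python
# Back-of-envelope arithmetic for the compression ratios quoted above:
# approximate active-picture rate of a UHD 4:2:2 10-bit 60p feed, ignoring
# blanking and transport overheads, and what a 10:1 JPEG XS ratio leaves.
width, height, fps = 3840, 2160, 60
bits_per_pixel = 20            # 4:2:2 at 10 bits: 10 (Y) + 10 (Cb/Cr average)

uncompressed_gbps = width * height * fps * bits_per_pixel / 1e9
compressed_gbps = uncompressed_gbps / 10   # typical 10:1 ratio quoted for JPEG XS

print(f"uncompressed ~{uncompressed_gbps:.1f} Gbps, "
      f"JPEG XS at 10:1 ~{compressed_gbps:.1f} Gbps per feed")
# uncompressed ~10.0 Gbps, JPEG XS at 10:1 ~1.0 Gbps per feed
```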

The integration of the JPEG XS standard with SMPTE 2110 caters to the growing interest within the broadcast industry in working with compressed formats. This combination allows JPEG XS to be applied in various applications that benefit from the reduced bit rate, including inter-facility video links over IP transport and ground-to-cloud video transport.

The Video Services Forum (VSF) has documented this amalgamation of JPEG XS and SMPTE 2110 in the Technical Recommendation TR-08, which has played a significant role in recent global broadcasting events and which Imagine Communications is proud to support with our Magellan Control System and Selenio Network Processor (SNP). AMWA has also documented its support of JPEG XS in NMOS through BCP-006-01, which enables the registration, discovery, and connection of JPEG XS endpoints using the NMOS IS-04 and IS-05 specifications.
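For a sense of how BCP-006-01 endpoints surface through IS-04, the sketch below asks a registry’s Query API for video flows carrying the JPEG XS media type (video/jxsv). The registry address is a hypothetical placeholder, and the snippet assumes the registry supports basic attribute queries.

```python
# Illustrative IS-04 Query API lookup for JPEG XS flows (media type video/jxsv).
# The registry address is a placeholder; basic attribute filtering is assumed.
import requests

QUERY_API = "https://registry.example.net/x-nmos/query/v1.3"  # hypothetical registry

flows = requests.get(
    f"{QUERY_API}/flows",
    params={"media_type": "video/jxsv"},  # JPEG XS RTP payload media type
    timeout=5,
).json()

for flow in flows:
    print(flow["id"], flow.get("label", ""), flow.get("frame_width"), flow.get("frame_height"))
```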

By utilising less bandwidth for signal transport, JPEG XS significantly reduces costs while delivering a production-quality signal with minimal latency. For this reason, the broadcast industry’s acceptance of JPEG XS as a payload format in SMPTE 2110-22 will continue to grow, enabling remote and distributed production workflows, including cloud-based productions, with low latency and exceptional picture quality.


SCALING AMBITIONS FOR INGEST WITH IP

In recent years, the media and broadcast industry has begun to untether itself from the physical constraints of hardware and traditional processes. Media organisations are moving towards both cloud and hybrid asset storage, as editing and content processing workflows become increasingly decentralised. On the contribution and distribution side, IP is being leveraged to reduce costs with spin-up and spin-down broadcast environments that deliver content securely using advanced transport protocols.

But the full range of applications for IP is only just starting to be realised. There are many more areas that can benefit from a cloud-based approach, and the transition will soon touch every link in the chain from camera to viewer.

THE MEDIA TRANSFORMATION

As the media industry continues to transition from hardware to the cloud, the need for remote workflows has taken centre stage. By integrating IP into remote production, broadcasters can now harness a range of advantages, including enhanced collaboration, flexible network expansion, and economical operations. By utilising dependable internet and fibre-optic connections, remote production has become a feasible alternative to conventional on-site production.

The essential elements of IP-based remote production encompass IP transport for video and audio, servers for remote production, cloud-based services and applications, and network infrastructure and security. From an ingest perspective, the benefits of moving away from SDI are clear: a significant reduction in the resources needed to capture content and ingest it into asset management systems, reducing the industry’s dependence on costly equipment and travel.

TRIMMING THE EXCESS

With content creation schedules becoming significantly more demanding, any innovation that cuts down on production delays will be given a warm welcome. At transitional stages in the production process, such as ingest, it is crucial to avoid bottlenecks. Editors are under enormous pressure and so are naturally keen to get to work on footage straight away, but this can cause challenges if stages are skipped to expedite the process.

However, if production teams can record directly to the cloud, things move into the fast lane. All that is needed are cameras and encoders; the process can be operated from anywhere, which drastically reduces the number of people who need to be on-site and the hardware that needs to be shipped. It also frees up time to ensure that workflow orchestration processes are properly followed and assets are tagged with the appropriate metadata, which causes a lot less hassle further down the line.
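As a simple sketch of what “record to the cloud and tag it on the way in” can look like, the snippet below pushes a finished clip to an object store with descriptive metadata attached. The bucket, object key and metadata fields are hypothetical examples, and boto3 is used purely as one familiar way to talk to S3-compatible storage rather than as a prescribed workflow.

```python
# Minimal sketch: upload an ingested clip to S3-compatible storage with descriptive
# metadata attached at ingest time, so downstream workflow steps can find and route it.
# Bucket name, object key and metadata fields are hypothetical examples.
import boto3

s3 = boto3.client("s3")

s3.upload_file(
    Filename="/media/ingest/match_cam02_20230701.mxf",
    Bucket="example-ingest-bucket",
    Key="raw/2023-07-01/match_cam02.mxf",
    ExtraArgs={
        "Metadata": {
            "production": "summer-series",
            "camera": "cam02",
            "timecode-start": "10:00:00:00",
        },
        "ContentType": "application/mxf",
    },
)
print("Upload complete; asset is ready for the MAM to register.")
```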

To meet ongoing consumer demand, remote production needs more flexibility to accommodate sudden schedule changes and improve cost-effectiveness. Recording directly to the cloud plays a significant role in achieving these goals, as it trims the excess all round. From an infrastructure perspective, IP reduces the size, weight and power consumption of the cabling, the number of devices required, and the overall set-up. A lighter set-up makes for a much more agile production crew.

Transport standards such as ST 2110 carry the necessary payload and synchronisation data to safeguard professional video and audio during transport. They offer much higher channel density within a significantly smaller physical footprint, both on- and off-site. An ST 2110 truck can carry more than 200 planned channels of UHD, an unthinkable density for a single vehicle using traditional SDI workflows.
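To put that density claim in rough numbers, the sketch below estimates how many uncompressed UHD ST 2110-20 flows fit across a given number of 100GbE fabric links. The per-flow bit rate, headroom factor and link counts are illustrative assumptions, not figures from any particular truck build.

```python
# Rough channel-density estimate for an ST 2110 fabric (illustrative numbers only).
# A 2160p60 10-bit 4:2:2 ST 2110-20 flow is assumed at roughly 12 Gbit/s including
# overheads, and each link is loaded to no more than 80 per cent.
PER_FLOW_GBPS = 12
LINK_GBPS = 100
HEADROOM = 0.8

flows_per_link = int(LINK_GBPS * HEADROOM // PER_FLOW_GBPS)
print(f"Uncompressed UHD flows per 100GbE link: {flows_per_link}")

for links in (8, 16, 40):
    print(f"{links} fabric links -> roughly {links * flows_per_link} planned UHD channels")

# Carrying the same channels as JPEG XS at around 10:1 would multiply these counts
# roughly tenfold, which is where headline truck densities come from.
```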

MIND THE SKILLS GAP

Live ISO streams can now be captured directly in the codec needed for editing and file delivery, and cloud ingest enables production teams to deploy on-demand set-ups within minutes. Alternatively, crews can opt to record content in a high-resolution, superior-quality format for subsequent transcoding to meet streaming-delivery requirements. Traditional methods, which necessitate saving footage to a hard drive and physically shipping it to the right location or uploading it to a cloud-based transfer service, seem archaic in comparison. So why aren’t more media organisations jumping at the IP opportunity?

As with any transition, whilst the end result might be desirable, the journey to get there can be challenging. IP-based remote production is an intricate process that requires the proper functioning of many components. Currently, we are at a crossroads in the industry, especially in the development of a new cloud-based skillset. With technology advancing at a rapid pace, the varied logistics and technical complexity involved in IP implementation are holding some production teams back.

This makes perfect sense: when transitioning from SDI to IP, small details can have big consequences. With a huge number of channels and customisation options for integration, relevant experience is crucial to ensure workflows flourish. The industry has now reached a tipping point, but things always get more complex before they get easier. Production teams must enlist the right technical knowledge from the outset, so that they are set up for long-term success. IP standards have changed the game for ingest; it’s time to enjoy the rewards, but with a solid foundation in place.


HELPING THE WORLD STAY CONNECTED

SMPTE executive director David Grindle discusses how technology created by the media industry helps bring people closer together

How did you get started in the media tech industry?

I came to media tech through live entertainment. Using projection and video in theatre I was able to learn the basics of how things work. Since coming to SMPTE in 2022, I’m excited to continue to learn more of the science and technology advancing media.

How has the market changed in your time?

My first experience was 30 years ago working with PANI large format projectors in an opera house during grad school. Cooled with compressed air and using glass slides, these were state-of-the-art at the time. Video was still miles of coax. Now we are driving images with gaming engines to LED walls over IP and cameras are feeding direct to the cloud.

What makes you passionate about working in the industry?

This industry is how the world communicates. The technology is the tool we use to connect people to one another. I’ve often told the story of my wife saying goodbye, via video call, to her father, who died during the pandemic. That was possible because of engineers in media and entertainment.

We share moments close and personal, and we also gather in times of celebration or mourning, because we can see events broadcast live or streamed to a screen in a theatre or on the side of a building. This is all possible because hundreds of people who rarely get credit love to explore how to use technology to connect people.

If you could change one thing about the media tech industry, what would it be?

I would show more appreciation to those behind the scenes. Especially deep behind the scenes. The computer scientists who make better drivers. The colourists who even things out and finish the look, and the engineers who are finding new ways to make the impossible possible.

How inclusive do you think the industry is, and how can we make it more inclusive?

There’s always room for more people at the table. We have to make the space and listen and learn from each other. Everyone has insight, and sometimes the newest person asks the most insightful question because they aren’t afraid to ask why it can’t be different. Making space for people who have different backgrounds and points of view opens the opportunity for problems to be solved from new perspectives. While the laws of physics are constant, the approaches to solving things using them are infinite.

Each of us has our own confirmation bias. When we need to solve a problem, we go to the circle of people closest to us, because that is who we know. We have to push ourselves to meet new people, invite their participation in societies like SMPTE, and learn from each other. That takes intent. It means going out and finding someone with a different perspective to speak to, and then listening.

How do we encourage young people that media technology is the career for them?

We have to show younger people the exciting things that are going on in M&E. Our greatest accomplishment is our greatest weakness: we make it look easy. Video flows ‘seamlessly’ from one device to another, and the engineers, scientists and others required to keep it flowing go unnoticed. We need to promote the inner workings of our industry more, from cybersecurity to optics.

Where do you think the industry will go next?

The virtual world will be a major driver. Not just in virtual volumes, but in the increasing use of virtual performers. There are real ethical issues we’ll have to tackle on that one. Just because we can use a dataset of an actor’s voice to re-voice characters, should we?

There are many ethical uses of deep-fake tech, but far more unethical and even nefarious ones. We are going to have to police our own technology so people have faith in the news they receive. It’s one thing when we say, “hey this is a film or scripted show for your entertainment”. But even there we need to acknowledge and promote what is real and what is not. It’s much different when folks are trying to use this technology to intimidate, coerce, and create false narratives.

What’s the biggest topic of discussion in your area of the industry?

Pipeline. How are we going to develop the next generation of professionals?

What should the industry be talking about?

I refer back to my comments on the virtual world and deep-fake technology. While folks are talking about it, I think we need to be talking about it more and addressing it publicly, outside of our industry conversations.

