TVBEurope, Issue 92, September 2022
Intelligence for the media & entertainment industry

Cover: MAKE ME HAPPY THROUGH THE YEARS: how dental scanning technology brought lost Morecambe & Wise tapes back to the screen

Welcome back

Here we are then, back in Amsterdam after three long years. For many, this will be the first time at a trade show since the start of the pandemic.

While I suspect it won’t take long to get back into the swing of things (and for me to get lost in the RAI... happens every year), it’s going to be interesting to see how the industry feels about returning to the traditional trade show model. Do companies want to continue investing in physical stands, or is a hybrid model now the way to go?

While no-one disputes the importance of face-to-face meetings, do we really want or need to travel around the world to attend them? If the media tech industry wants to promote its sustainability credentials, then should it be flying to Amsterdam and Las Vegas en masse every year?

Please don't think I'm being overly negative, I can't wait to see everyone again. The opportunity to network, discuss new developments, or just catch up with colleagues for a beer or glass of wine is most welcome. I just wonder if it's time for the industry to have a long, hard look at how it engages with itself both now and in the future.

Those are just my thoughts, you may feel very differently. And, if you see me running around the halls of the RAI, feel free to stop me and let me know what you think.

We speak to IBC CEO Michael Crimp to hear his thoughts on the return of the show and what the future holds.

Much of this issue's focus is on UHD and HDR and we hear from numerous stakeholders in the production chain about the impact of both. We also take a deep dive into the metaverse, which is coming whether we want it to or not.

Kevin Hilton finds out how dental technology helped restore lost footage from the BBC's The Morecambe & Wise Show, and Philip Stevens talks to the people who have restored a piece of broadcasting history.

I hope you'll indulge me, but this is Philip's final article as he heads off into retirement. Philip has been a stalwart of our writing team and has given me so much support over the years. Thank you, Philip, and enjoy your retirement, from everyone at TVBEurope.

JENNY PRIESTLEY, EDITOR
JENNY.PRIESTLEY@FUTURENET.COM
@JENNYPRIESTLEY

IN THIS ISSUE


12 Live and in-person

IBC CEO Michael Crimp talks to TVBEurope about what attendees and exhibitors can expect at this year’s show, and why he’s excited to be back to an in-person event

16 The video industry's role in the metaverse

By José Somolinos, solutions strategist and XR lead, Accedo

22 Restoring a piece of broadcasting history

Philip Stevens visits an OB unit that was a leader of its day

30 A century of innovation

TVBEurope looks back at GatesAir, from its formative years in making consumer radios through to its leadership role in wireless content delivery for TV and radio broadcasters

38 Turning an amazing goal into a Rembrandt

Could NFTs be broadcasters’ next big revenue stream? Vislink CEO Mickey Miller certainly thinks so. He explains why to Jenny Priestley

39 Restoring the sunshine

Some of Morecambe & Wise's BBC shows are available again following some intricate restoration and reconstruction work, writes Kevin Hilton

48 Live HDR remote production at Sky Sports

Russell Trafford-Jones investigates how the sports broadcaster seamlessly produces programmes in both SDR and HDR 700 times a year

57 We need to talk about the scalability of streaming

Harmonic Inc’s Thierry Fautier discusses why the adoption of higher resolutions is causing problems for the infrastructure of the internet


4K/UHD for all: a pipe dream or reality?

Higher resolution formats like 4K and UHD are quickly becoming standard formats for all types of media content. We can see this through examples such as the Beijing 2022 Winter Olympics, the first Winter Games broadcast in UHD, the successful 4K HDR deployment at last year's Tokyo Summer Olympics, and regular broadcasts during the 2021/22 Premier League season. However, while consumers are accustomed to these eye-catching experiences, only a small amount of content is filmed natively in 4K due to high production costs and limitations on delivery network capacity. The battle to deliver top-resolution quality to all consumers everywhere comes down to a handful of core technology innovations re-shaping the playing field.

DECADES OF INNOVATION

According to Omdia Consumer Research, the share of consumers claiming that 4K/UHD content is an important feature in their video services generally rose faster than any other factor across the market over the last two years. Productions and contribution links in 4K/UHD are quickly becoming the standard practice for broadcasters and studios. Most devices, including smart TVs and streaming devices, support 4K, UHD and HDR. Even streaming platforms including Netflix, Disney Plus, and Amazon Prime Video have launched TV series and movies in UHD, usually with their most popular titles.

So, what technologies are making these rollouts feasible? Many will remember the terms 4K and UHD breaking into the media landscape around a decade ago, promising to elevate the viewing experience for good, albeit remaining a premium offering for live and on-demand content. Since then, we have seen rapid developments in technologies such as WCG (wide colour gamut), HDR (high dynamic range), advanced video compression, and advancements in more captivating audio solutions. Perhaps the most influential enabler to the widespread rollout of 4K has been HEVC (high efficiency video coding), but now there is a new codec vying for attention: VVC (versatile video coding).

VVC ADOPTION

VVC is the latest video compression standard developed to improve compression performance and support a wide range of applications. It is the most efficient video coding standard available today and is starting to gain real-world implementation in consumer devices. VVC allows far more efficient encoding compared to HEVC, which is particularly important for higher resolutions and frame rates such as 4K/UHD and above. Implementations of video codecs (i.e., real-world video encoders) continually undergo major advances to improve their efficiency, including dual-pass and adaptive encoding, statistical multiplexing, and more.

New video codecs such as VVC are significant enablers of enhanced video experiences for broadcast and broadband, from the wider adoption of 4K/UHD to a future of 8K, by relieving the bottlenecks and costs associated with less efficient delivery schemes.
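To put the efficiency claim in rough numerical terms, the sketch below compares how many UHD channels fit into a fixed-capacity multiplex under HEVC and under an assumed VVC saving. The 15 Mbps HEVC baseline, the 40 per cent saving and the 45 Mbps mux capacity are illustrative assumptions, not figures from this article.

```python
# Rough, illustrative bandwidth comparison for a UHD channel.
# The HEVC baseline bitrate and the VVC saving are assumptions for
# illustration only; real figures depend on content and encoder.

HEVC_UHD_BITRATE_MBPS = 15.0   # assumed HEVC bitrate for a UHD sports channel
ASSUMED_VVC_SAVING = 0.40      # VVC is often quoted as roughly 30-50% more efficient

def vvc_bitrate(hevc_bitrate_mbps: float, saving: float = ASSUMED_VVC_SAVING) -> float:
    """Estimate the VVC bitrate needed for comparable quality."""
    return hevc_bitrate_mbps * (1.0 - saving)

def channels_per_mux(capacity_mbps: float, bitrate_mbps: float) -> int:
    """How many channels fit in a fixed-capacity multiplex."""
    return int(capacity_mbps // bitrate_mbps)

if __name__ == "__main__":
    capacity = 45.0  # assumed usable Mbps in one delivery multiplex
    hevc = HEVC_UHD_BITRATE_MBPS
    vvc = vvc_bitrate(hevc)
    print(f"HEVC: {hevc:.1f} Mbps -> {channels_per_mux(capacity, hevc)} UHD channels per mux")
    print(f"VVC : {vvc:.1f} Mbps -> {channels_per_mux(capacity, vvc)} UHD channels per mux")
```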

Although VVC is still in the early stages and may take several years to achieve widespread adoption, leading industry bodies and organisations are starting to get involved. Earlier this year, Digital Video Broadcasting (DVB), an industry-led consortium of the world's leading media and technology companies, announced it would add VVC to its core specification. It's the first standards body of its kind to implement VVC for video and audio coding in broadcast and broadband applications. Big technology firms such as Apple, Samsung, and LG have subsequently acquired 4K/UHD content rights, sensing a unique technology opportunity on the horizon.

BANDWIDTH DEMANDS AND LACK OF SPECTRUM

Terrestrial broadcasters have not made much progress in UHD adoption, partly because of the bandwidth requirements and the ever-increasing demands on spectrum around the terrestrial frequencies by mobile operators seeking connectivity to IoT devices. Some satellite service providers have successfully done so for their own channels, such as Sky Sports and Sky Atlantic, but generally, uptake in the terrestrial world is relatively low.

We’ve also seen that the same broadcasters have typically looked at their streaming services to deliver their UHD variants where it’s much easier to do so, even though costs at very large audience scales are still likely to be higher than traditional broadcast.

Bandwidth and spectrum have also been put in a chokehold by the demand for C-band spectrum, which offers a very desirable balance of reach and bandwidth for mobile networks. This helps explain why broadcasters have struggled with UHD adoption to date. VVC reduces the bitrate needed for the same video service quality, allowing broadcasters better spectrum efficiency where satellite remains in use, or lower streaming costs.

For now, HEVC still provides a good way to deploy UHD, but VVC opens the potential for much wider adoption of 4K and possibly above, in a world where connectivity is key for everyone. n


Storage: the invisible friend

Ultra HD – 4K resolution and high dynamic range – is now a reality, and the norm for premium-quality production in drama, entertainment and sport. Augmented reality is also being widely used and, while a lot of the production is live with virtual elements generated in real time, the data to build the models has to be managed.

Away from broadcast, virtual and augmented reality experiences are rapidly growing in popularity as the technology to support them grows. Oculus (now part of Meta) continues to be the benchmark; Samsung and Sony are the market leaders; and the rumours of an Apple headset – probably augmented reality – grow ever firmer. Research by Iberdrola suggests that demand for VR devices has grown by more than a factor of 16 between 2018 and 2022.

What these file formats have in common is that they involve very high data rates, another big leap up from HD.

A full 4K HDR DPX stream, for example, is close to 10 gigabits a second, or 4.4 terabytes for an hour of content. More practically, Ultra HD ProRes 422 HQ is around 1.7 gigabits a second, so a single hour-long file would be 800 gigabytes.
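As a back-of-the-envelope check on those figures, the short sketch below converts a sustained data rate into storage per hour; it assumes decimal units (1 TB = 10^12 bytes) and uses 9.8 Gbit/s to stand in for "close to 10 gigabits a second".

```python
# Convert a sustained video data rate (in Gbit/s) into storage per hour.
# The rates are the ones quoted above; 9.8 Gbit/s stands in for "close to
# 10 gigabits a second". Decimal units are assumed (1 TB = 10^12 bytes).

def terabytes_per_hour(gigabits_per_second: float) -> float:
    bytes_per_second = gigabits_per_second * 1e9 / 8
    return bytes_per_second * 3600 / 1e12

for name, rate_gbps in [("4K HDR DPX", 9.8), ("Ultra HD ProRes 422 HQ", 1.7)]:
    # ProRes works out at roughly 0.77 TB/hour, which the text rounds to 800 GB.
    print(f"{name}: {rate_gbps} Gbit/s ~ {terabytes_per_hour(rate_gbps):.2f} TB per hour")
```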

The target – although the hardware does not yet support it – is for 4K per eye in a virtual reality headset, which means we would be looking at double those data rates.

In post production – for ultra HD and in future for high resolution virtual reality – we need multiple streams to play in real time to allow editing, compositing and effects. In our industry, the demands on storage and data rates are far in excess of all but a few rare scientific applications.

Meeting these everyday needs requires appliances that can store these massive quantities of data, and deliver multiple streams to tight time tolerances to multiple clients simultaneously. The paradox is that we do not regard our storage systems as hero products: we want to ignore them and just assume the pictures and sound will be there when we call for them.

Despite the need to be unobtrusive, media servers clearly need to be built for the job. To deliver the sustained data rates, you have to use flash storage (solid state disks or SSD). At the very least, hard disk storage needs to be accelerated with flash, but this can lead to bottlenecks.

With today’s available technology, that means Space SSD, a 2U storage appliance from GB Labs, can store up to 720 terabytes, or more than 160 hours of the highest 4K DPX format. With in-built data acceleration, the sustained data rate of a single Space is 48 Gbps, but devices can be coupled in parallel, so a 6U cluster could deliver up to 144 Gbps if the network could support it. Up to 15 Spaces can be combined into a single 10.8 petabyte store.

Pure bandwidth is not the only consideration. To meet the demands of the post industry you also need consistent, very low latency access from Mac, Linux and Windows clients, and the ability to manage data flows across the storage network to provide sustained performance. GB Labs achieves that through dynamic bandwidth control, which means that the storage system understands the priorities of the users and allocates bandwidth automatically for maximum efficiency at all times.
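The article does not describe how GB Labs implements dynamic bandwidth control, but the underlying idea of priority-aware allocation can be sketched as a simple weighted share of the available throughput. The client names, weights and the 48 Gbps figure below are purely illustrative.

```python
# Conceptual sketch of priority-weighted bandwidth allocation across clients.
# This is NOT GB Labs' implementation; it just illustrates the idea of a
# storage system dividing its sustained throughput according to priority.

def allocate_bandwidth(total_gbps: float, clients: dict[str, int]) -> dict[str, float]:
    """Split total bandwidth in proportion to each client's priority weight."""
    total_weight = sum(clients.values())
    return {name: total_gbps * weight / total_weight for name, weight in clients.items()}

if __name__ == "__main__":
    # Hypothetical clients: edit suites carry a higher weight than a transcode farm.
    shares = allocate_bandwidth(48.0, {"edit_suite_1": 4, "edit_suite_2": 4, "transcode_farm": 2})
    for client, gbps in shares.items():
        print(f"{client}: {gbps:.1f} Gbps")
```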

The result is that ultra HD in all its forms can easily be supported with maintenance-free flash storage and automated storage management. Of course, 8K is on the horizon, and while this is a bit more of a challenge, we do have a number of servers at Super Hi-Vision pioneers NHK in Japan.

Space SSD appliances are in use with broadcasters and post facilities around the world, and with non-broadcast users from Kenneth Copeland Ministries to the Royal Shakespeare Company. They rely on secure storage, fast and reliable delivery, and on them being invisible. n


Growing demand for 4K UHD content

Over the last few years, it’s become almost impossible to buy a new television which isn’t 4K UHD capable. In its Global 4K UHD TV Market Growth report published last year, MarketandResearch.biz projected a compound annual growth rate (CAGR) of 16.9 per cent in the 4K UHD TV market from 2021 to 2026, and a colossal global market size of $90,400 million.
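As a quick piece of arithmetic, the sketch below shows what a 16.9 per cent CAGR compounds to over five years and, assuming the $90,400 million figure is the 2026 projection (an assumption, since the report is not quoted in full here), the 2021 base it would imply.

```python
# What a 16.9% CAGR over 2021-2026 implies, purely as arithmetic.
# Assumes the $90,400m figure is the 2026 projection; the derived 2021
# base is illustrative only, not a figure from the report.

CAGR = 0.169
YEARS = 5
MARKET_2026_USD_M = 90_400

growth_factor = (1 + CAGR) ** YEARS
implied_2021_base = MARKET_2026_USD_M / growth_factor
print(f"Growth factor over {YEARS} years: {growth_factor:.2f}x")
print(f"Implied 2021 market size: ${implied_2021_base:,.0f}m")
```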

There are two main drivers for this. The first one is the customer’s desire for superior picture quality, made possible with sharper resolution, faster frame rates and high dynamic range (HDR). The other is the increasing supply of 4K UHD content across both traditional over-the-air (OTA) broadcast networks as well as over-the-top (OTT) streaming platforms.

Historically, live sports transmissions have always been one of the biggest drivers for the uptake of new technologies in the broadcast and media industry.

4K UHD is no different, and it’s growing. The International Association of Broadcast Manufacturers (IABM) published a Content Chain Trends report in 2022 where it highlights that many of the big streaming players are only accepting content shot in 4K. In addition, a number of streaming companies are buying up rights to live sports as competition for viewers’ eyeballs increases, which explains the rise in popularity for 4K UHD.

Amazon recently retained the rights to broadcast English Premier League football matches for the 2024/25 season, once again shared alongside OTA broadcasters like Sky Sports, BT Sport and BBC Sport. This year, North America’s Major League Soccer also announced a $2.5 billion deal with Apple to broadcast live matches for the next decade.

Competition is fierce, and with streaming companies like Netflix charging a premium for 4K UHD and HDR content, we expect to see more broadcasters investing in high picture quality to increase viewership and monetise top-tier content.

The adoption of ATSC 3.0 – the OTA broadcasters’ response to the easy availability and flexibility of OTT content – will drive the adoption of 4K UHD even higher.

While ATSC 3.0 is still a relatively new standard for broadcasting, it is gaining ground, and it will allow TV stations to broadcast live 4K video over the air to compatible devices.

Realistically, the rate of adoption will always depend on regional market conditions, as well as the type of content being produced. Although some types of content are commonly transmitted in full HD, others (like sports) are becoming increasingly popular in 4K UHD, and as we move forward we are likely to see 4K UHD productions increasingly implemented for archival purposes.

The IABM Content Chain Trends report also indicates that broadcasters are beginning to future-proof themselves. It states that “news studios are increasingly adopting 4K and HDR to prepare for the future and to maximise ROI even without immediate plans to implement 4K”.

While 4K is more common in sports coverage, it’s especially noteworthy that news studios are also investing in 4K equipment. Moves to embrace new production formats like this always require upgrades to studio equipment, such as the need for studio cameras that can operate in both 1080p and 4K to cater for any format request.

Manufacturers should be monitoring these industry trends to support broadcasters and content creators.

For example, Dejero’s new EnGo 3x is a 5G mobile transmitter designed to transmit 4K UHD live video with very low latency, even under challenging internet network conditions. Customers don’t need to pay for a separate 4K UHD licence; it’s already included in the base unit.

With hybrid encoding technology and customised firmware on its Intel processor, the EnGo 3x delivers stunning picture quality while making it easy for remote teams to monitor productions in real time. Along with Dejero’s WayPoint 3 receiver, which reconstructs video transported over multiple IP connections into a combination of 4K UHD outputs, these new products allow content producers to stay flexible in whatever format they need to work in. n


HyperDeck Shuttle HD is a recorder and player that’s designed to be used on the desktop! That means it’s more than a master recorder as it can also be used as a clip player. You get support for ProRes, DNx and H.264 files in NTSC, PAL, 720p and 1080p video formats. Plus SD cards, UHS-II cards and USB-C external disks can be used for recording and playing media.

Elegantly Designed Professional Broadcast Deck

HyperDeck Shuttle HD is perfectly designed for the desktop. This means the front panel can be operated with a single hand! The design is the same depth and angle as ATEM Mini Extreme, so they match perfectly when used together! Scrolling media is fast as the machined metal search dial has a natural inertia, and the soft rubber surface feels nice to the touch.

Traditional Broadcast Deck Controls

HyperDeck Shuttle includes a control panel that’s very fast to use. The buttons are similar to a traditional broadcast deck. Simply press the record button and you instantly start recording the video input to a file! It will record in the file format that has been set in the record menu. There are other buttons for playback, to cue to the start of the clip and to move to the next clip.

Supports SD Cards and UHS-II Cards

HyperDeck Shuttle supports recording to common SD cards and UHS-II cards so you don’t need expensive custom media. When using H.264, the files are so incredibly small you get very long recordings even on smaller low cost cards. This means you can record up to 157 hours of H.264 in 1080p59.94 on a 1 TB card. That’s over 6 days of recording in HD using a single 1 TB card!
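As a sanity check on that claim rather than a Blackmagic specification, the sketch below derives the average bitrate implied by 157 hours of recording on a 1 TB card.

```python
# Derive the average bitrate implied by the quoted recording time.
# Assumes decimal storage units (1 TB = 10^12 bytes); figures are illustrative.

CARD_BYTES = 1e12        # 1 TB card
HOURS = 157              # quoted H.264 1080p59.94 recording time

bits = CARD_BYTES * 8
seconds = HOURS * 3600
print(f"Implied average bitrate: {bits / seconds / 1e6:.1f} Mbit/s")
# -> roughly 14 Mbit/s, a plausible H.264 rate for 1080p59.94
```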

Record to External USB-C Media Disks

If recording to other types of media is required, the USB-C expansion port lets you plug in an external flash disk for recording. USB-C flash disks have unlimited capacity because they can be physically larger than an SD card or SSD. Plus, it’s even possible to record to a disk you’ll use for editing, so you don’t need to waste time copying files before starting post production.

New HyperDeck Shuttle HD: the desktop HyperDeck that's the perfect live production clip player and master recorder! HyperDeck Shuttle HD €515 (SRP is exclusive of VAT). Learn more at www.blackmagicdesign.com/nl

#ICYMI

TVBEurope's website includes exclusive news, features and information about our industry. Here are some featured articles from the last month…

MY IBC: POLLY HICKLING
TVBEurope hears from a selection of industry stakeholders about their typical IBC routines, and what they're looking forward to at this year's show. https://bit.ly/3ptbCOo

IBC TO PRESENT BBC WITH INTERNATIONAL HONOUR FOR EXCELLENCE
The show will honour the BBC in its centenary year, while its Special Award goes to Sonya Chakarova, Daniella Weigner and Phillip Covell, who led a project to provide resources, solutions and hardware to Ukrainian TV channels and media outlets. https://bit.ly/3dxt1T3

'IT'S OPENED MY EYES TO OTHER CAREER OPTIONS'
We speak to the cofounder, volunteers and students who took part in Rise's inaugural summer school. https://bit.ly/3PCKIOl

OLYMPIC BROADCASTING SERVICES LAYS OUT INITIAL PLANS FOR PARIS 2024
OBS said it is aiming to produce an "unprecedented" amount of content and deliver it to rights-holding broadcasters in a wider range of options. https://bit.ly/3dA3qci

NDI: CONNECTION WITHOUT COMPLICATION
TVBEurope sits down for a Q&A with Tarif Sayed, president and GM of NDI Global, to discuss his background, his path to NDI, and what makes it a great fit for content creators of all sizes. https://bit.ly/3Ar0OX7

IBC 2022 CONFERENCE PREVIEW
We look ahead to the content programme at IBC 2022 and the big issues that delegates can expect to dominate this year's keynotes, discussions, and technical papers. https://bit.ly/3QTwI3Q

END-TO-END PRODUCTION FREEDOM WITH LIVEU AND THE CLOUD

Remember when the questions that resonated around the industry were: how do we capture live content? How do we justify the cost? Is there fibre available or satellite capacity, and is that in the budget? How can we get to an event quickly and be ready to go live as soon as we arrive? If you do, you're recalling the days before LiveU changed the live production world with the introduction of IP bonding all the way back in 2006, forever transforming the way we think and work. We untethered the industry and introduced a freedom for live coverage that has been embraced by news, sport, entertainment and now other verticals, too.

LIVEU AND THE CLOUD

Of course, we haven’t stopped there! Now we’re once again disrupting the industry via our use of the cloud to bring dynamic, real-time production, orchestration, ingest and distribution services to our customers. We are putting creativity first, empowering users to produce the dynamic content that attracts and holds viewers’ attention, in turn bringing new levels of engagement and content monetisation to media companies of all shapes and sizes.

POWERFUL CLOUD PARTNERSHIPS AND THE ACQUISITION OF EASYLIVE.IO

The cloud is hardly new, of course, but we’re only now experiencing how much it continues the opening up of our industry, allowing inspirational freedom across content acquisition, production services and distribution. Having established multiple partnerships with other powerful industry players – Avid, Blackbird, Grabyo, Grass Valley and Vizrt, with all these still highly active collaborations – we recently took the next step and acquired cloud-based video production provider easylive.io. This reinforces our cloud strategy, with easylive.io’s all-in-one live streaming production studio to be included in LiveU’s end-to-end cloud-based workflow for live contribution, management, orchestration, ingest and distribution.

Founded in 2012, easylive.io is a pioneer in the live streaming market. Its platform offers a suite of professional cloud-based services to produce, process and broadcast live video wherever it is needed. It ensures the reception of all format types and their conversion for optimal management and redistribution to multiple destinations simultaneously. The easylive.io all-in-one cloud-based live streaming production studio combines all the tools needed to edit, mix and broadcast live streams by operating from the client's browser and in an entirely collaborative environment.

The move will provide remote and collaborative tools for cloud-based and hybrid productions, enabling customers to operate and scale up quickly and easily from anywhere, including video switching, audio mixing, adding graphics, localising content and bringing on guests, while still reducing equipment and production costs.

VISIT US AT IBC 2022 TO SEE HOW WE CAN HELP YOU

We’re really excited to be back at IBC! For the first time, we’ll be showing our end-to-end cloud production workflow on stand 7.C30. We’ll present our new LiveU Ingest solution for automatic recording and story metadata tagging of live video and the award-winning LiveU Air Control, serving as a single collaboration solution to get remote guests on-air and live feeds into the system in broadcast-quality, together with easylive.io.

NOT FORGETTING CONTRIBUTION, OF COURSE!

We're unlocking the potential of live video with our 5G, 4K and multi-cam capabilities. LiveU will demonstrate its native 5G live contribution and distribution solutions including the multi-cam LU800 5G production-level field unit, the recently introduced, small-sized LU300S 5G video transmission solution and rich remote production tools, using the LiveU Reliable Transport (LRT) protocol.

AND DISTRIBUTION WITH MATRIX

Increasingly used for global news and sports coverage, such as Volleyball World, the LiveU Matrix IP cloud video management and distribution solution offers significant cost savings compared to satellite and fibre. LiveU Matrix will be demonstrated with its new Dynamic Share service, enabling users to share and receive an exponential number of live feeds using the Global Directory of 5,000+ customers in news, sports, entertainment, and other vertical markets.

With the accelerated adoption of remote production workflows, we’ll also be presenting our cloud-based solutions for sustainable live productions. Leveraging our IP technology to reduce travel, power and equipment costs, LiveU helps to lower each organisation’s carbon footprint – while enabling them to produce high-quality content in the cloud.

KEEP AN EYE OUT AS THERE’LL BE MORE NEWS AS IBC APPROACHES

In the run-up to the show, LiveU will unveil its new 5G live video transmission solutions as part of its complete 5G product suite for broadcast-quality coverage. n

To find out more, we’ll be hosting one-on-one meetings on our IBC stand. Visit the LiveU IBC page to book a meeting or demo. See you in Amsterdam!


LIVE AND IN-PERSON

After three years away, the media tech industry is ready to return to Amsterdam this September for IBC 2022. The show’s CEO Michael Crimp talks to TVBEurope about what attendees and exhibitors can expect as the event gets “back to business”

HOW HAVE YOU AND THE IBC TEAM DEALT WITH THE POSTPONEMENT OF THE LAST TWO SHOWS DUE TO THE PANDEMIC?

Naturally, it was a challenging couple of years for us at IBC and the whole industry in general. Unable to stage a physical event at the RAI Amsterdam, we worked hard to adapt the show to virtual formats for two years, keeping the IBC community connected and engaged through digital networking, collaboration, and thought-leadership initiatives.

It’s been an incredibly busy yet rewarding time for our team as we gear up for our first physical show since 2019. It’s been fantastic to see live, in-person trade shows back up and running around the world, and we feel a hugely positive momentum behind IBC 2022, with very strong engagement levels from exhibitors and show visitors.

WHAT HAVE YOU LEARNT FROM THAT TIME THAT YOU’RE BRINGING INTO THIS YEAR’S EVENT?

It’s felt like a long three years without an in-person IBC show, and we all know how much we’ve missed doing business and engaging with our community the way we enjoy most: face-to-face.

I think it’s also become clear that all corners of the content and technology ecosystem are eager to engage in person again; more than ever, technology providers understand the importance of human interaction with customers, partners and the wider community.

WHAT HAVE YOUR EXHIBITORS TOLD YOU ABOUT HOW THEY WOULD LIKE THE SHOW TO CHANGE?

It’s an interesting question. We haven’t heard too many comments on how the show might change going forward. However, there is a lot of interest in how the pandemic has changed behaviours. So, we have commissioned research into this area to help ensure IBC’s value proposition remains strong. Perhaps we will uncover something transformational, but anecdotal feedback suggests this is more of an evolution than a revolution.

At IBC 2022, we'll see a real hunger for in-person networking, which will be focused around our themed content programme as well as social events. The bookings we've seen so far suggest this is well in tune with exhibitors' thinking.

HOW MUCH OF A BOOST HAS IT BEEN TO HAVE MAJOR EXHIBITORS SUCH AS AVID AND SONY ANNOUNCE THEIR PLANS TO RETURN TO IBC?

It’s great to see major industry players like Sony and Avid on the show floor at IBC 2022, alongside a raft of other long-standing exhibitors, and companies making their show debut this year. It’s a testament to the value IBC offers for companies across the media value chain.

IBC aims to galvanise the entire media industry, and bringing a technology leader like Avid back to major trade events ensures the show will be as rich as possible for show visitors while underlining IBC’s core value proposition for major technology brands.

WHAT ARE YOUR TARGETS FOR THIS YEAR’S SHOW IN TERMS OF ATTENDEE NUMBERS?

Naturally, we’re expecting IBC 2022 to look slightly different from the last edition in 2019. As ever, final attendance numbers are hard to predict.

We’re seeing high-quality registrations from key players and decision makers across the media and entertainment community, and we’re confident that IBC 2022 will be a highly valuable event for both exhibitors and visitors.

THIS YEAR’S SHOW WILL BE OVER FOUR DAYS INSTEAD OF FIVE. IS THAT SOMETHING THAT ATTENDEES CAN EXPECT TO REMAIN IN PLACE GOING FORWARD?

Emerging out of the pandemic, cost control is essential to exhibitors, so the decision to reduce the show to four days became clear to us. However, many exhibitors would also argue that the extra day allows them to extract additional benefit without too much extra cost.

So, I think we’ll see how a four-day show works out and take a view on it from there. Currently, IBC 2023 is being planned as a four-day event.

The RAI awaits IBC's return

You can't always see G&D right away. The products and solutions are often hidden. But they are systemically relevant and work. Always!

You can rely on G&D. And be absolutely certain. That's quality you can feel. When working in a control room. With every click. When installing in a server rack or at workplaces. G&D ensures that you can operate your systems securely, quickly, and in high quality over long distances. People working in control rooms can rely on G&D.

G&D simply feels right. AND KVM FEELS RIGHT. VISIT US! GDSYS.COM OR LIVE AT

WHAT DO YOU HOPE ATTENDEES AND EXHIBITORS WILL TAKE AWAY FROM IBC 2022?

We want IBC 2022 to be the catalyst that brings the content and technology community back together to unlock business opportunities, fuel learning and celebrate innovation.

Our IBC Conference will provide delegates with mission-critical insights on some of the biggest business issues faced by our market today. Attendees can also enjoy the wide-ranging free-to-attend content programme across the show floor stages, designed to give every visitor access to knowledge-sharing and spark business conversations. Our Accelerator Programme will also showcase some truly groundbreaking solutions to real-world challenges and empower new use cases.

We want to play our role in driving positive change in media technology, so our new Changemaker sessions this year will enable the IBC community to learn from industry trailblazers, covering topics such as raising equality, advancing sustainability and mental health awareness.

EARLIER IN THE SUMMER, TVBEUROPE ASKED TECHNOLOGY VENDORS IF THEY FELT TRADE SHOWS WERE STILL AN EFFECTIVE PLATFORM FOR BUSINESS RETENTION/GENERATION. MOST OF THEM TOLD US THAT THEIR VIEW OF SHOWS HAS CHANGED DURING THE PANDEMIC, WITH SOME SAYING THEY WILL BE MORE SELECTIVE IN THEIR INVESTMENT. WHAT WOULD YOUR ANSWER TO THAT QUESTION BE?

The past couple of years certainly made all companies think strategically about their investments, and as I mentioned earlier, cost control is of paramount importance today. Perhaps we’re seeing some exhibitors shift their focus toward more networking and thought-leadership activities to fit their business goals, and we work closely with our customers to ensure we help maximise their ROI.

One thing is clear, however. People have missed interacting with each other on the show floor, catching up over a coffee or a beer in the evening, and having those chance encounters that would only happen at an event like IBC. That’s where many of our attendees see the real added value of trade shows that you just can’t replicate with the same effect and at the same scale.

WHAT AREAS OF TECHNOLOGY INNOVATION HAVE STOOD OUT FOR YOU SINCE THE LAST IBC?

IBC champions ground-breaking technology, and innovation is at the heart of our mission. We’ve seen fantastic advancements in remote production alongside other particularly exciting breakthroughs harnessing 5G, VR/AR and, of course, the metaverse, across media and entertainment.

Cloud stands out for me, and it’s an area where our Accelerator Programme is forging into new territory at IBC 2022. Our ‘Cloud-Based Live Events, Analytics, and Low Latency Protocols’ project shows how cloud workflows can more efficiently deliver live professional sports events employing multiple camera feeds at scale while minimising latency. ‘Cloud Localisation Blueprint’ explores best practices and critical considerations for building a cloud-based media supply chain for the localisation of entertainment content.

I’d encourage all attendees to head to the free-to-access Innovation Stage and Accelerator Zone, both in Hall 2, to discover each of this year’s eight proof-of-concept demonstrations and engage with the different project teams made up of world-leading media brands and technology pioneers.

IF YOU WERE TO SUM UP THIS YEAR'S IBC IN THREE WORDS, WHAT WOULD THEY BE?

Back to business!


THE VIDEO INDUSTRY’S ROLE IN THE METAVERSE

José Somolinos, solutions strategist and XR lead, Accedo, explores seven areas where the video industry could become the cornerstone of the metaverse

Since Facebook rebranded to Meta and presented its idea of a virtual universe, the metaverse has become a widely discussed topic. While many people are in agreement that the metaverse will be central to the future of human interaction, thoughts on what this will look like vary greatly.

As a concept, the metaverse is as varied as the internet. It is not a specific technology or place, but rather a way to communicate with each other. The metaverse will have applications for every field and industry in the world. It will be up to us, as users, to define how we want it to help us to communicate better.

The video industry has a big role to play in defining the metaverse, developing the first use cases, and attracting users to dip their toes into this world. There are a number of technologies that video providers need to understand and address in order to adopt the position of leading this wave.

1 METAVERSE PLATFORMS

We often hear people refer to these platforms simply as the metaverse. In fact, we should be calling them metaverse platforms or, at the very least, metaverses. While there is suddenly a lot of excitement around this area, the concept in itself is nothing new. Some metaverses already exist and have been around for some time, including Minecraft, Fortnite, Roblox or the pioneer, Second Life (yes, it is still live).

Niantic has revealed its vision of a "real-world metaverse" (Niantic)

As with many new technologies, this will very much begin with many different metaverses competing against each other. While there is a strong will to have a free metaverse, where publishing and monetising is as easy as on the web, the technology is simply not ready. This means that, for the time being, we will see big players creating independent metaverses to compartmentalise the users and retain them within their boundaries.

One example is Meta, which presented to the world its idea of a virtual universe where people can meet, play, be entertained and shop. Its vision, for now, covers a virtual reality world, where people can visit and interact with each other, and with brands.

The other active player is Niantic, parent company of the global success story Pokémon Go. It has a very different opinion of the metaverse. Niantic wants to create a shared interactive layer on top of the real world, facilitating connections between real humans, not just avatars. An expert in location-based entertainment, it has created an AR platform called Lightship, available for everyone.

While Meta and Niantic are the most active today, we expect Apple, Microsoft, Google, and even Valve to enter the metaverse race in the near future.

There is a lot of work ahead. The metaverse is not a place that will open on a specific date, but it's something that all industries will be building step by step, as happened with the internet. Video providers are used to creating content and services that can be delivered across multiple platforms, making adjustments according to specific requirements and formatting. We will have the same challenges with the metaverse. Video providers looking to move into this new world will need to ensure they are aware of the platforms available and their individual requirements.

2 AVATARS

Virtual identity is expected to be worth millions. Not surprising, then, that there is already a battle to become the cross-platform identity in the metaverse. The current frontrunner is Ready Player Me, which aims to allow people to define their identities when they enter the metaverse. We are already seeing that kids are not interested in traditional social networks and have already been transitioning to metaverses such as Fortnite or Roblox. There they meet friends (real and virtual) to play and interact with. They think nothing of spending money (both real and virtual) on changing their skins (or virtual appearance or clothing) on a regular basis. Some may also wear clothes to support a specific cause or a unique object showing they have watched a recent movie.

Avatars open a new way of monetisation (Ready Player Me)

Creative On-Air Camera Moves On Track

Space-limited studios or larger environments that want to add new creativity and production values now have the perfect solution in the Telemetrics ceiling or floor-mounted TG4 TeleGlide® camera trolley and track system. It delivers smooth, repeatable and dynamic camera moves ideally suited for live, automated, and virtual or augmented reality productions. Available in customized Straight/Curved/Ceiling/Floor configurations. Stay On Track With TeleGlide.

sales@telemetrics.com • telemetrics.com The Innovators in Camera Robotics

They will collect and display anything that helps them build their identity. While this may be difficult for some people not immersed in this world to understand, the fact that these companies make the bulk of their money selling skins for the user’s avatars goes to prove just how popular it is.

For video providers, who are challenged with finding new ways to engage with audiences, this represents a massive opportunity to give subscribers a way to build their own identity. Couple this with a digital hub, and you can create a really compelling environment that goes beyond just video.

3 NON-FUNGIBLE TOKENS (NFTS) AND WEB3

NFTs or non-fungible tokens could be literally anything: a picture, a video, a sound, a virtual place, a tweet and most interestingly, virtual props and products.

They use blockchain technology to identify a virtual object uniquely. Each NFT is so unique that its value is subjective. NFTs will become the way to monetise in the metaverse, with companies creating and selling unique virtual objects and experiences.

Everyone is craving decentralised networks for the metaverse. There should be no owner of the transactions; people should be able to buy and use their NFTs wherever they go, and this is, in essence, what Web3 is all about. Start-ups will appear to help us gather NFT ownership across metaverses, so the big companies owning the metaverses are going to struggle to control any monetisation beyond their payment gateways.

Video providers and sports organisations can take this opportunity to create their own coin that can be used to purchase videos, games or even real merchandising. They can also sell virtual assets such as collectibles or virtual wear. It will be a new era where literally anything can be sold. The challenge will be to find the balance of what can be monetised and what shouldn't. How to keep on building a valuable brand image without killing the golden goose?

4 CREATION TOOLS

There will be a boom in metaverse positions. CMO won't refer to marketing anymore, but to 'chief metaverse officer', and metaverse manager and metaverse interior designer will be highly sought-after profiles. Every company needs to start defining a strategy to become immersive. A clear parallel is that companies today have many types of roles working in the social media sphere. New expertise was required that wasn't part of traditional roles.

The metaverse is designed in 3D, so 3D designers and developers will be in huge demand. In the beginning, big companies will be the only ones with the money to pay them, until there are more graduates and specialists in these fields. Attracting and retaining these key professionals will be crucial for the companies that want to be first in the metaverse.

At the same time, we are already seeing traditional design companies adapting to this new 3D demand. For example, Adobe is improving its 3D offering to adapt to the unique requirements of the metaverse. Asset creation with Adobe Substance Designer allows designers to create procedural 3D: 3D models that adapt to the environment using parameters. Using this new design paradigm, the designer doesn't make a final object but the components that will make up the object, leaving the final set-up open. For example, creating a building with a variable number of floors or windows depending on where it is located. There is a parallel here: procedural 3D could be for the metaverse what responsive design was for the internet.
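As a toy illustration of the procedural idea, and not a representation of how Substance itself works, the sketch below generates a parametric 'building' whose geometry is derived from input parameters at placement time rather than authored as a fixed model.

```python
# Toy illustration of procedural/parametric asset generation: the "asset" is
# a description whose geometry is derived from parameters when it is placed,
# rather than being authored as a fixed model. Not tied to any Adobe tool
# or file format.

from dataclasses import dataclass

@dataclass
class BuildingParams:
    floors: int
    windows_per_floor: int
    floor_height_m: float = 3.0

def build_building(params: BuildingParams) -> dict:
    """Expand parameters into a concrete (very simplified) asset description."""
    return {
        "height_m": params.floors * params.floor_height_m,
        "floors": [
            {"level": i, "windows": params.windows_per_floor}
            for i in range(params.floors)
        ],
    }

# The same procedural asset, instantiated differently for two locations.
city_block = build_building(BuildingParams(floors=12, windows_per_floor=8))
suburb = build_building(BuildingParams(floors=2, windows_per_floor=4))
print(city_block["height_m"], len(suburb["floors"]))
```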

On the interactive side, Adobe is also improving its tool, Aero, which enables the addition of interactivity and animation to 3D scenes using simple triggers and attaching them to actions. This simplified approach is key to bringing more designers into metaverse construction.

Big deployments will require a platform to manage them across multiple metaverse platforms. At the same time, advertising is going to become much more interactive and again, will need to be managed across platforms, allowing for interactive elements where available. At Accedo, we are looking at the role our Accedo One cloud platform will play in helping to expand our vast experience with the video industry to the metaverse.

5 VIDEO FORMATS

The metaverse has, in short, two native types of video format: immersive video (360, 180), and volumetric or holographic content. Volumetric video is getting all the attention right now as it works for both VR and AR. Volumetric technology is no longer experimental, and many companies use it to create experiences in the metaverse and on immersive social platforms such as Snapchat or Instagram. Volumetric content also provides a very subjective view of the content: the viewer, or in this case 'visitor', is not only immersed in the action but chooses their own point of view. The natural evolution of this technology will transform video experiences into something interactive.

We can identify two types of volumetric companies. First, big studios that record high-quality holographic videos. Each frame is a 3D model that needs to be captured with many cameras, sometimes more than a hundred. The video is then processed using a semi-automatic process to clean, simplify and polish the 3D models. Secondly, small tech companies that record '3D pixels', called voxels, using point cloud or mesh streaming. This method produces lower-quality video as it typically involves fewer cameras (up to 12) and post-processes on the go. However, it is far cheaper and, more importantly, can be broadcast live.

Understanding the difference between these two types of productions and partnering with the key actors at each level is critical to propose relevant offerings to our customers.

6 IMMERSIVE DEVICES

As mobile technology continues to improve, devices are becoming better year-on-year for AR interaction. Lidar sensors make it possible to interpret the world around you, 5G brings the almost zero-latency, high-speed transmission required to play holographic videos, and foldable phones offer even more screen real estate for these immersive experiences, even on the go. Augmented reality is thriving today in mobile devices and will continue to do so for some time.

We will see various mixed reality devices come to market this year. Mixed reality devices are VR devices with cameras that show you the world around you, allowing them to also behave as AR devices. Users will be able to move between realities, from VR to AR and back. Meta will probably release the successor to the Meta Quest, project Cambria.

The new device is rumoured to have high-quality external cameras that will give the illusion of AR using a VR headset. Built with face and body recognition sensors, it will be a huge step forward for social interaction on the metaverse. Based on numerous reports, it seems that Apple is also preparing a similar device. Other companies like Google and Samsung are most likely preparing something soon.

These types of devices will be for living-room use only. However, augmented reality headsets such as the Nreal Light or Spectacles by Snap are perfect for an on-the-go use case. They will not replace the mobile phone yet; instead, they will complement it, more or less like the smartwatch. The technological challenges involved in developing these devices, such as battery life or comfort, mean that eyewear is not yet suited to all-day use but more to casual use.

7 NETWORKS: 5G & WI-FI 6 FOR THE METAVERSE

On top of selling immersive devices, telcos have another important card to play; 5G and Wi-Fi 6 are necessary for the development of the metaverses. Telcos and operators will be a pivotal cornerstone to catalyse the development of these technologies; 4G networks were introduced in the golden era of mobile video when people wanted to watch high definition videos on the go and 3G was not good enough.

Native immersive video formats can be very bandwidth-consuming, and 5G will be necessary to broadcast holographic or immersive live video to your XR device. 5G not only enables high bandwidth but also very low latency, which ultimately means that immersive experiences will happen and react in real time. This is very important on immersive platforms. It is OK if the video freezes for a couple of seconds while you are watching on your phone; on an immersive platform, those two seconds will feel like an eternity and will break the experience.

THE FUTURE OF THE METAVERSE

At Accedo, we have always placed great importance on monitoring new technology that we think will change the way video is consumed. When it comes to the metaverse, the technology is evolving fast, so our team is tracking it carefully to understand the multiple opportunities for video. n

A volumetric capture studio featuring Andy Murray (Dimension Studio)

The Nreal Air glasses let you watch any video in a virtual giant screen (Nreal)

ARE NON-BROADCAST APPLICATIONS DRIVING BROADCAST TECHNOLOGY?

One of the dramatic shifts in our industry in recent times is that we now see non-broadcast applications using the latest in high quality technology. The events industry is now regularly seen as driving creativity forward.

Nowhere is this more evident than in augmented reality or mixed reality. The idea, of course, has been around for many years, but what held it back was that the virtual elements were seen as cartoon-ish. Producers wanted photorealism before they would really embrace it, but that requires a lot of computer power.

Just at the point when the ever-rising processor capabilities reached the level when photorealism became practical, something else happened. Covid came along and crippled the live events industry. No events happened, at least at first, because people could not meet.

Inevitably, event producers looked to technology to save their businesses. Online meetings suddenly became the norm: according to the Royal Statistical Society in London, users around the world spent more than three trillion minutes – 5.5 million years – on Zoom calls in 2020.

Valuable as Zoom and other video conferencing platforms are, producers of major events wanted to do more to give their digital-first presentations the wow factor. Augmented reality became forced into prominence, and spurred remarkable new creativity, which in turn is finding its way into broadcast production.

Large-scale events production is also a driver for higher resolution video, and again the broadcast industry is seeing the benefits. Ultra HD, in terms of 4K frame sizes, high dynamic range and extended colour gamut, is becoming much more common, at least at the production stage.

Streaming services saw ultra HD as a competitive advantage, and now some broadcasters are finding routes to deliver it to the home to keep up. For the consumer, receivers are widely available and increasingly affordable, and for many that leads to a demand for content, feeding the production sector’s requirement to deliver excellent picture quality.

In the events market, ultra HD is required to provide immersive video on very large screens, for live and augmented imagery. In broadcast it is sport that is leading the push for ultra HD, so live production is vital.

Economics, and the practicalities of camera positions at sports stadiums, mean that producers are forced into a single production workflow. We already have experience of this, of course, with HD shoots being down-converted for SD channels.

The difference with ultra HD is that it is not just a matter of reducing the number of pixels. It brings with it a very different colour space, much wider than the Rec 709 everyone uses for HD; that is the benefit of the extended colour gamut. The industry has not yet converged on a single production and delivery standard for HDR, so you may be faced with two levels of colour space conversion: from the production format to the delivery format; and from HDR to the standard dynamic range suitable for HD (and SD). There is also a side issue, in that people often assume ultra HD is solely in the domain of IP connectivity, whereas in fact many facilities have a significant capital investment in SDI plant which needs to live out its life. Ultra HD on 12G-SDI is very common; quad-3G-SDI is also regularly seen.

FOR-A was a pioneer in the ultra HD market, and its product range reflects that early adoption. The company developed a number of products, starting early in the cycle of ultra HD, and has continually refined them based on growing experience.

A good example is the FA-9600 signal processor. This is a very flexible device, but one of its most important applications is to provide the single point of conversion in an ultra HD for HD common workflow. It supports 3D LUTs so the colour space can be converted sympathetically and artistically, related to the nature of the content rather than on a fixed basis.
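For readers unfamiliar with how a 3D LUT performs that kind of conversion, here is a minimal conceptual sketch: input colours index into a small lattice of pre-computed output colours. Real conversion LUTs (for example HLG to Rec 709) come from vendors and colourists; the identity-with-gain LUT and nearest-neighbour lookup below are illustrative only and are not the FA-9600's processing.

```python
import numpy as np

# Conceptual sketch of a 3D LUT: a small lattice mapping input RGB to output RGB.
# Here we build a toy gain-only LUT and use nearest-neighbour lookup; real
# colour-space conversion LUTs encode far more complex transforms.

LUT_SIZE = 17  # a common 3D LUT lattice size

def build_toy_lut(gain: float = 0.9) -> np.ndarray:
    """A placeholder LUT that just applies a gain; shape (N, N, N, 3)."""
    grid = np.linspace(0.0, 1.0, LUT_SIZE)
    r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
    return np.clip(np.stack([r, g, b], axis=-1) * gain, 0.0, 1.0)

def apply_lut_nearest(rgb: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map input pixels (..., 3) in [0, 1] through the LUT's nearest lattice points."""
    idx = np.clip(np.rint(rgb * (LUT_SIZE - 1)).astype(int), 0, LUT_SIZE - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

pixels = np.array([[1.0, 0.5, 0.25], [0.2, 0.2, 0.2]])
print(apply_lut_nearest(pixels, build_toy_lut()))
```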

It is also a clear indication of the way that broadcast and nonbroadcast applications are converging, in ultra HD, in AR and in other areas. Each is now looking for stable and reliable signal processing to converge multiple resolutions into a single, seamless workflow with the minimum of intervention required away from the creative input. And each is looking to do this at a cost-effective price point.

Inevitably, the technology will continue to drive through into other markets, and towards the consumer, in the immediate future. 4K cameras are commonplace, which means there is an expectation that social media platforms should be capable of distributing ultra HD.

Video conferencing platforms now routinely include keyers – of varying capabilities – so users can conceal their bedroom offices with more exotic backgrounds.

High quality video and augmented reality are no longer the sole privilege of broadcast. The key to success in the future is to keep listening and learning, in all directions. n


RESTORING A PIECE OF BROADCASTING HISTORY

Philip Stevens takes a look inside an OB unit that was a leader of its day

Back in July, anyone watching the BBC’s coverage of the middle Sunday at Wimbledon will have seen some historic technology that originated half a century ago. Parked within the hallowed grounds of the All-England Lawn Tennis Club (AELTC) was MCR21; a restored BBC outside broadcast (OB) unit which was delivered to the Corporation in 1963 and then entered service in 1964. But why was an early OB unit there in the first place?

“Centre Court at Wimbledon was celebrating its centenary year and some of the BBC programming involved looking back at the past events at the venue,” explains Nick Gilbey, one of the managers of the restoration team for MCR21. “Paul Davies, head of AELTC Broadcast, wanted the unit for his Centre Court Centenary programme, which he produced and directed for BBC One. Paul used to be the executive producer on Wimbledon for the BBC, but now the Corporation is just one, albeit the most important, of his customers. I think Paul thought that showing MCR21 was a good way of introducing the pre-record about the history of Centre Court.”

Of course, MCR21 was not around 100 years ago, but one of its first jobs was the coverage of Wimbledon in 1964. So, just what has been involved in this restoration project and how did it come about?

“I’ve always been interested in the technology and the production side of the history of television,” explains Gilbey. “When I was young, I enjoyed taking photographs of outside broadcast units. I met Brian Summers and he invited me to see his incredible collection of cameras. When I arrived at his home, I saw MCR21 parked in his driveway. He had bought it in 1980, and I suggested to him that we should think about restoring it.”


Summers was a vision control engineer for BBC outside broadcasts with special expertise in the planning and installation of broadcast equipment. He now runs the TV camera museum website, and restores vintage cameras to operation.

CASH NEEDED

Of course, taking on a project to restore an outside broadcast unit involves money, and quite a lot of it. Enquiries were made about receiving support from the National Lottery Heritage Fund.

“We found out that a charity needed to be set up before any application for funds could be made,” explains Gilbey. “The Heritage Fund helped us with the establishment of the Broadcast Television Technology Trust, under which the restoration of MCR21 could be carried out.”

The object of the Trust is defined as: ‘The advancement of education for the public benefit in the technologies, techniques and equipment used to produce television programmes in particular’. Its slogan reads: ‘Preserving the history of television technology for future generations’.

MCR21 prior to the renovation; MCR21 outside New Broadcasting House in July 2022

Alongside Summers and Gilbey, the other two trustees are John Trenouth and Jeremy Owen. Beyond these four, an army of volunteers has helped with the many facets of the project. With the formation of the trust, Summers sold MCR21 to the charity for £1.

Gilbey points out that Summers runs workshops on Mondays, Wednesdays and Fridays for those helping with the project. “There’s quite a contingent of guys who come and there are specialist volunteers who work on the displays, the monitors, some of the work being done at their homes. There’s a whole range of skills that we’ve been able to call upon.”

Once the restoration plans for the unit were put forward, the Heritage Fund granted the trust £100,000 for the work. The team had to put in £10,000 of its own resources. The work then started in mid-2019.

HISTORIC REMINDER

Why was MCR21 so important? “It is the oldest and most complete OB unit from that era,” says Gilbey. “MCR21 was one of ten units built by Pye TVT and delivered to the BBC, starting in July 1963 and finishing in mid1964. It was designed as a four-camera unit, but capable of being de-rigged as required. Built on a Commer chassis and with coachwork by Marshalls of

Cambridge, it represented the best of the technology of its day. Historically, it is very important, so our criterion was, and is, to get it back to the original 1960’s condition as far as that is possible. There is a technology ‘bible’ as part of the restoration agreement which states that any changes to the original specification that are necessary have to be documented and an explanation given.”

Brian Summers, who originally purchased MCR21; the production desk in the refurbished unit; the ten-channel vision mixer designed and built by BBC engineers and installed by Pye; the Pye sound desk and telephone unit

One such change involves the camera control units (CCUs). The unit has been kitted out with Pye Mark VI cameras that Summers already had in his collection. That camera type was removed from MCR21 when it was converted in 1969 into a two-camera colour unit equipped with two Philips PC 60 cameras. One more Mark VI camera was obtained from the Science Museum in London, but a further two are proving difficult to locate, so Mark V CCUs may have to be installed, with an explanation.

“As long as it’s got a couple of original cameras in there, we believe it is genuine enough to show what it was like in the 1960s. The Mark VI was only used by the BBC and was built to its specification, so other organisations would not have had them in their inventory. I don’t think we’re ever going to get four Mark VI cameras in there,” states Gilbey.

In the somewhat cramped space in the unit, the two vision control engineers controlling the CCUs sat in front of the production desk just inches from the display monitors. The designers felt that just seven monitors were enough for the production and technical staff. The displays comprised one for each camera, two for engineering and production previews and, in the centre, the 17-inch transmission monitor. Associated with each 14-inch monitor was a BBC-designed waveform monitor.

Top centre was an Optical PPM, also a BBC design; the idea was that the sound supervisor could see the PPM in the same eye line as the transmission monitor. To either side are the clock and the dual-standard off-air check receiver.

MIXING IT UP

Despite being constructed as a four-camera unit, MCR21’s vision mixer has ten channels. “The unit was designed to be used as a de-rig where necessary,” reveals Gilbey. “And when that happened, the equipment might be combined with another OB unit meaning more than four cameras would be involved. That was some very helpful forward thinking on the part of the Corporation.”

The mixer, with A and B banks and cross faders, was designed and built by the BBC’s own engineers, but installed by Pye during the final construction. “One of the unusual things associated with the vision mixer is the fact that the engineers decided to use transistors, which were unique for that time,” adds Gilbey. “Marconi, for example, was still using valves because they thought they were more reliable.”

That’s not to say valves were not used elsewhere in MCR21, and Gilbey recalls wondering at the start of the project where they could be sourced today for the restoration. He need not have worried. Summers’ extensive collection of technology included a considerable selection of valves. “Brian’s a treasure trove of bits collected over the years. It would have been far more difficult to have done the restoration without his collection and his input,” says Gilbey.

The audio console is located to the left of the vision mixer. It has 20 channels divided into three groups, with provision for PA outputs and extensive monitoring. It is a flat response mixer, meaning there is no provision for equalisation. In case of power failure, the whole sound chain of the MCR reverted to battery operation to allow the programme to continue in audio only. The engineering managers’ desk had the 15-line manual telephone exchange and talkback facilities.

THE FUTURE

So, what happens next? “This is an ongoing project, of course,” states Gilbey. “As far as the future is concerned, we were discussing that with our Heritage Fund case officer. Our current heritage project ran out at the end of July, but the fund organisers are really pleased with what has been achieved. And if we’ve got another project to take forward, then they would help fund that scheme. Meanwhile people can always donate to help with our costs. I can’t emphasise enough the support we have received from the National Lottery Heritage Fund.”

He continues, “The last couple of years have been pretty intense for me, but particularly for Brian, to get to this stage. It may be time to take a break and look at what we’ve achieved and see where we go from here.” n

More details of the Broadcast Television Technology Trust can be found at www.bttt.org.uk

Valves were still used in the Pye cameras when MCR21 was delivered

MCR21 with a Pye Mark VI camera on display

THE CAMERAMAN’S VIEW

What was it like working on the new scanner in 1964? Harry Coventry was appointed senior cameraman for the unit.

“We assembled a completely new camera crew once the van was delivered,” he recalls. “My selection was Selwyn Cox as number two, then number three was Andy Tallack, who I got transferred from Television Centre, and the fourth operator was Bob Buttimere. Peter Cook was a trainee.”

Unlike outside broadcasts today where 30 cameras are not unusual, MCR21 was allocated just four, but that sufficed for the level of coverage needed in those days.

“Those cameras were Pye image orthicon, and were used as a replacement for the Marconi cameras in use with other scanners.”

Coventry goes on to talk about some of the programmes he and the camera team worked on during those early days. “We were part of the outside broadcast that covered Sir Winston Churchill’s funeral. I was on the camera in front of the memorial at St Paul’s Cathedral. We had another camera in the Whispering Gallery, while another was placed on a rostrum in the dome. Holes were made in the dome so that scaffolding could be erected. It was obviously safe, although looking at the set-up it felt a little scary.”

Coventry recalls another incident which was even more frightening. “A vintage aeroplane failed in its attempt to ‘loop the loop’ and came crashing to the ground during a broadcast. I snatched off my headphones and rushed over to see if help was needed. The pilot was shocked, quite obviously, but apart from a few bruises survived the ordeal. There is more to being a cameraman than simply operating the equipment.”

MCR21 was the first BBC OB unit to be sent abroad to record programmes. “Britain’s first television magazine programme dedicated to motoring was called Wheelbase and was first broadcast in 1964. We were sent to Sweden and recorded our first programme on the boat going to Gothenburg. Then we travelled to the Volvo factory where another programme was made.”

Closer to home, Coventry and MCR21 were used on the very first Match of the Day football programme. “It was handled by a wonderful producer called Alan Chivers and in my opinion was much underrated. He had directed coverage of the 1948 Olympic Games in London. Alan called me to a meeting one day and said we are going to record a football match that will be transmitted later that evening. But the whole thing must be kept secret. The football authorities fear that no spectators will turn up if they think they can watch the game at home later.”

That first game was at Anfield and involved Liverpool playing Arsenal. And remember, it was covered with just four cameras. How things have changed!

Coventry continues, “We were chosen to cover all the England matches at Wembley for the 1966 World Cup, not realising at the time that we had any chance of winning. Together with another unit, we had eight cameras, plus an extremely cumbersome radio camera at pitchside. I operated the main close camera on the gantry, next to Maurice Abel, the other senior cameraman, on the wide angle. Against West Germany in the final, as Geoff Hurst scored the final goal, it became an iconic moment in football history!” n

Reliving memories: (left to right) Peter Cook, Andy Tallack and Harry Coventry

HYPERCONVERGENCE: THE BEST OF BOTH WORLDS

Ross Video will mark IBC’s return as an in-person show by focusing on its full range of new Hyperconverged Solutions. Designed to integrate multiple independent devices into a single, unified software-defined production platform, this range of solutions for broadcast centres and facilities provides users with large-scale capabilities in a small package.

Flexibility and simplification lie at the heart of Hyperconverged Solutions’ new approach to technology. Because less cabling is required to implement the concept, it is possible to create more straightforward systems that are faster and easier to design, build and deploy. The powerful internal processing of Hyperconverged Solutions also makes it possible to reduce the amount of external equipment needed for multiviewers, video/ audio processing, synchronization and other vital functions, while matrices can be much smaller in size.

Designed for live production, Hyperconverged Solutions is based on Ross Video’s established Ultrix software-defined signal management and routing platform. Introduced at IBC 2017, Ultrix provides a 12G backbone that can be configured to support SDI over coax cable or fibre connections, SMPTE ST 2110 installations or hybrid I/O applications.

In designing the Hyperconverged range, Ross Video aimed to create an easily scalable system for any size of production or broadcast centre, with technology that not only conforms to the latest standards but which is also ready for the future. This flexibility comes from Hyperconverged Solutions offering different combinations of devices drawn from the Ultrix family of products. These include: Ultriscape, the world’s first software-defined multiviewer; the Ultrix Carbonite production switcher, which has set a new standard in performance, flexibility and value; and Ross Video’s flagship production switcher, the Ultrix Acuity.

The thinking behind Ultrix is that the various components can be configured as either a single frame solution or as part of a multi-frame, distributed infrastructure. The latest version of Ultrix, which will make its IBC debut, features a powerful addition to what Ross Video calls “the Hyperconverged Advantage”. By taking one or more of the Ultrix systems that make up the Hyperconverged Solutions range, it is possible to build a full, integrated system that is able to cope with any and all aspects of live production.

“Ultrix has converged hardware and software together in a way that makes it really customisable,” explains Brandon Rhoda, applications manager for Vertical Markets and Solutions at Ross Video. “It’s an incredibly powerful hardware set that you use software features to unlock. You keep all the processing internal to the platform, without having to leave the platform. It’s a very efficient and compact solution that has tons of power.”

At a time when some broadcasters and facilities operators are considering moving technical operations to the cloud as part of a virtual production environment, Ross Video recognises that many others still want to operate within traditional-style broadcast centres. We realise that our customers’ business models are not uniform and continue to evolve, and it has become apparent that, in order to offer value and flexibility to the wide range of markets we serve, having both on-prem and cloud products is crucial. The Hyperconverged Solutions Ultrix platform is key for us in achieving the superior performance and flexibility that our on-prem solutions demand.

“For those that prefer an on-premises solution, our revolutionary Hyperconverged products provide the best of both worlds: Ross’ hardware expertise, with significant cost and space savings, plus the creative flexibility of a software-defined, next-generation workflow,” says Mark Sizemore, vice president of Hyperconverged Solutions. “Hyperconvergence is about giving our customers the very best on-premises solution to their production challenges.”

This is why Ross Video is excited about being back at IBC. We look forward to showing this groundbreaking technology platform for live production to both long-term customers and potential new users in person. Hyperconverged Solutions and the future of live broadcasting will be on Stand 11.B10 at the RAI Convention Centre. n


A CENTURY OF INNOVATION

This year GatesAir celebrates 100 years in business. TVBEurope looks back at the company from its formative years in making consumer radios through to its leadership role in wireless content delivery for TV and radio broadcasters

1926

The Gates Radio Company launched in 1922 from the Gates family home in Quincy, Illinois, where 14-year-old Parker Gates made the company’s first radios including the Echophone ($75.99) and Electra ($200). Parker soon delivered this “industry-first” transcription turntable for radio studios.

1936

Gates Radio’s primary focus shifted to radio broadcast equipment in the mid-1930s, including amplifiers, microphones, mixers, and turntables, before adding AM modulation to its line.

The company sold the industry’s first 250-watt broadcast transmitter to WJMS-AM in Michigan for $2,750.

1947

Radio consoles and mixers were a part of GatesAir’s history of innovation over many decades.

The Gates SA-40 was an early innovation for the company that was very popular with radio stations. The nine-channel mono audio console was manufactured from 1947 and into the early 1950s.

1957

Harris Intertype’s acquisition of Gates Radio was a watershed moment in the company’s history. Parker Gates (bottom row, second from left) worked closely with Harris Intertype management (including president George Dively, directly right) to maintain the family-owned spirit of the company and ensure that the existing Gates Radio staff was retained.

1928

The company’s innovations evolved to new areas of sound delivery, including motion picture sound. Pictured here is the MotioTone, a 16-inch sound disc system that aligned speech and music with film. A condenser microphone for radio studios followed one year later.

1966

Some Harris Broadcast TV automation systems of 2000-2013 live on today at Imagine Communications, but few recall that Gates introduced Nite-Watch, the world’s first “automatic programming” system, in 1955. This 1966 Harris Intertype catalogue image shows three Gates Automation Systems offered at the time.


1974

The 1970s was a decade of growth and innovation. On the growth side, the company introduced its first TV transmitters. Technical innovations were at an all-time high, including the MW-1, the world’s first AM solid-state transmitter. This began the company’s now near five-decade embrace of solid-state technology to drive audio and efficiency improvements.

1998

The company stepped up its TV game in 1988 with the Platinum Series of solid-state VHF transmitters. The DiamondCD series followed a decade later, featuring then-state-of-the-art LDMOS solid-state technology to provide a new level of UHF reliability.

1987

The company’s first digital transmitter was the Harris DX Series, which substantially improved audio quality and reliability thanks to its breakthrough Digital Amplitude Modulation. This proved to be a wildly popular series, with 1,500 transmitters shipped worldwide by the mid-2000s. Many remain on the air worldwide, and GatesAir today still makes DX systems for AM customers internationally.


2005

The late 1990s initiated a period of rapid company expansion, with Harris Corporation acquiring automation, processing and test and measurement companies for the TV business. The only acquisition that remains part of GatesAir today is Intraplex, acquired in 1999 for its radio STL expertise. The Intraplex brand has since launched numerous innovations, including Intraplex NetXpress, the radio industry’s first managed platform for audio over IP transport.

2009

Harris introduced its air-cooled Maxiva TV and Flexiva radio transmitters with PowerSmart technology. The liquid-cooled lines, Maxiva ULX and Flexiva FLX, excited higher-power broadcasters that could now justify transitioning away from liquid-cooled tubes from a cost and labour perspective. Redundant, integrated pump modules also provided greater confidence in their reliability.

2012

Long a leader in worldwide DTV standards development, Harris broke new ground with the Apex M2X, the industry’s first truly software-defined exciter that could work across most DTV standards as well as analogue, providing customers worldwide with a simple upgrade path to digital TV without requiring a hardware replacement.

2017

In preparation for the US repack and large international DTV transition projects, GatesAir rebuilt its Maxiva transmitters from the ground up to provide the industry’s highest efficiency and reliability. This resulted in PowerSmart Plus, a next-generation improvement on its world-renowned PowerSmart design infrastructure. Using the latest amplification technology, GatesAir now offered air-cooled and liquid-cooled models (UAXTE, ULXTE, VAXTE, VLXTE) from 100 W to 100 kW for UHF and VHF, and ultimately delivered over 350 repack transmitters.

2007

Harris “threw out the playbook” for UHF/VHF transmitter design with PowerSmart, the world’s first green transmitter solution for reduced energy consumption and carbon footprint. The PowerSmart architecture also introduced a modular design to simplify maintenance, and adopted new cutting-edge LDMOS technology that gave broadcasters greater confidence to shift from tube transmitters to solid-state for good.

2019

Harris’ divestiture of the broadcast division in 2013 resulted in a spinoff company that soon broke into two. The core transmission business relaunched as GatesAir out of respect for their roots, bringing only Intraplex along. Intraplex hit the gas on innovation with its IP Link family of codecs in 2013; LiveLook, the industry’s first software-based analysis toolset for IP network performance, in 2015; and Ascent, GatesAir’s first cloud transport platform for TV and radio content, in 2019.

2022

GatesAir’s strategic acquisition of ONEtastic in 2019 significantly strengthened its footprint in Europe, as well as its range of low-power TV and DAB radio solutions. Now GatesAir Europe, the clever engineers are responsible for GatesAir’s most recent product innovations, including the PMTX-1 pole-mounted transmitter and high-density Intra-Mast multi-transmitter system, pictured here. The GatesAir brand now lives on under the ownership of Thomson Broadcast.

INTELLIGENT, INTELLIGIBLE, CLEAR SPEECH TECHNOLOGY

How leading-edge technology is helping audiences understand speech more clearly

ABOUT THE PROJECT

MPEG-H Dialog+ technology has been available since 2020. As a way to evaluate the technology, Fraunhofer IIS and Westdeutscher Rundfunk (WDR) conducted field tests together. As part of those tests, they ran an online survey which revealed that many television viewers, especially older audiences, experienced difficulties in understanding speech; more than 80 per cent of the 2,000-plus survey participants indicated that they would appreciate the option to switch to an alternative ‘clear speech’ mix.

The software automatically separates the dialogue from the background of an existing film mix and outputs a new, easier-to-understand remix. The goal of the project was to develop a professional workflow and bring MPEG-H Dialog+ into use. Fraunhofer conducted field tests over DVB and the VoD platform ARD Mediathek to refine requirements and production workflows. The results were then fed into the product development of the Telos Alliance Minnetonka AudioTools Server Dialog+ module. The software has now been implemented as part of an automatic workflow – from archive to transcoding farm – in the WDR production infrastructure.

The distribution of tasks to complete the project was as follows: Fraunhofer IIS developed and trained the deep-learning solution, which is also capable of enabling speech-level personalisation for content with only the final audio tracks available. WDR provided suitable training material for the deep neural network, led the workflow design, and launched the Klare Sprache (Clear Speech) service on ARD Mediathek. Telos Alliance’s Minnetonka Audio created the Dialog+ module as part of its AudioTools Server production environment, as well as providing regular application support.

ABOUT THE TECHNOLOGY

MPEG-H Dialog+ contains a deep neural network performing dialogue separation. As training data, real-world broadcast content was used, mostly originating from WDR and other ARD broadcasters. Dialog+ combines dialogue separation with a unique automatic remixing algorithm, where a global and a time-varying background attenuation can be combined. Global background attenuation lowers the relative level of the estimated background component by the same specified amount over the entire signal. This can be beneficial for users that prefer to always lower the background signal. For others, this might not be the optimal solution, as attenuating the background while the dialogue is not active does not improve speech intelligibility while potentially damaging mood, atmosphere, and sounds of narrative importance. A solution is to lower the background level only when the dialogue signal is active and only as much as is necessary to reach the desired level.
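
That time-varying approach can be sketched in a few lines of Python. The example below assumes the dialogue and background stems have already been produced by a separation stage (the part the deep neural network handles); it then ducks the background, frame by frame, only while dialogue is active and only by the amount needed to reach a target dialogue-to-background ratio. It is a simplified illustration, not the Fraunhofer algorithm itself.

    import numpy as np

    def dynamic_remix(dialogue, background, sample_rate,
                      target_ratio_db=15.0, frame_ms=50):
        """Duck the (already separated) background only while dialogue is
        active, and only enough to reach target_ratio_db over the background."""
        frame = int(sample_rate * frame_ms / 1000)
        gains = np.ones(len(background))
        for start in range(0, len(dialogue), frame):
            d = dialogue[start:start + frame]
            b = background[start:start + frame]
            d_rms = np.sqrt(np.mean(d ** 2) + 1e-12)
            b_rms = np.sqrt(np.mean(b ** 2) + 1e-12)
            if 20 * np.log10(d_rms) < -50:      # crude activity check: no dialogue,
                continue                         # so leave the background untouched
            shortfall = target_ratio_db - 20 * np.log10(d_rms / b_rms)
            if shortfall > 0:
                gains[start:start + frame] = 10 ** (-shortfall / 20)
        # A production implementation would smooth this gain curve to avoid pumping.
        return dialogue + background * gains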

The implementation of Dialog+ is a groundbreaking development for WDR towards more accessibility and automated audio processing workflows for the ARD VoD platform.

Benefits for broadcasters include:

• Automated, cost-saving, and scalable workflow approach

• State-of-the-art quality of the dialogue separation algorithm

• Dynamic remixing algorithm which only affects the background level when dialogue is present. This prevents unwanted changes to the mix and helps to preserve the artistic intent as much as possible

• Set of presets customised for different use cases. This way, the content provider can apply processing optimised, for example, for documentaries, music films, and sports content.

• Wide range of possible combinable AudioTools Server modules to create individual workflows

WHAT DOES THIS MEAN FOR THE FUTURE?

Dialog+ is part of the MPEG-H Audio production software, providing all features of an object-based audio (OBA) system, like advanced user interactivity and personalisation. This makes the use of Dialog+ a future-proof decision for broadcasters and content producers. MPEG-H Audio is one of the most advanced Next Generation Audio systems on the market, and its features are already implemented and ready to use in the current Telos Alliance Minnetonka AudioTools Server implementation. This also includes full support of the audio format of the future, the audio definition model (ADM). Even if broadcasters do not use all OBA features today, they can invest in a future-oriented solution beyond current legacy workflows through the implementation of MPEG-H Dialog+. n


TACKLING THE CREATIVE CHALLENGES OF HDR

An increasing number of European broadcasters are investing in HDR production and distribution. They recognise that, to audiences, it makes an obvious difference; HDR is a means of attracting and retaining viewers.

Data shows that consumers see HDR as having a big impact; bigger, perhaps, than 4K. It may be that we will see the practical choice of 1080p/HDR as a dominant format; it is much more cost-effective than the jump all the way to 4K HDR, for basically the same audience impact.

HDR is already commonly used in cinema/drama production, where the concepts of colour management are well-established, and the standard edit software platforms are HDR-compatible.

The challenge comes in live production, particularly live sports. This is the genre that always drives innovation in broadcasting, because of the stakes involved (expensive rights and giant audiences) and the existing technical challenges (e.g., camera shading for sunlight and shadows in the same scene).

Television production isn’t just about relaying the exact scene to the viewer. Producers and networks may also apply their own specific ‘looks’, driving their unique brand value, while also showcasing the added dynamic range and colour gamut. Frequently, they will apply custom LUTs tuned to create the look they want.

The main cameras are routinely ‘shaded’ by trained professionals to provide the matching and to supervise the output quality. But a large sport shoot will involve many other cameras; PoVs, stump cameras in cricket, goalmouth cameras in football. In HDR, the inter-mixing of these SDR (standard dynamic range) sources into the HDR production is disruptive if not managed; these signals must be processed to match the look of the main signals as closely as possible.

On an outside broadcast, where desk space is limited, any additional colour shading must fit into the existing workspace and operator footprint. Imagine has worked on innovative solutions to address this issue, collaborating with Cyanview to enable their control surface for colour manipulation, as well as with many other operational control systems to support our SNP UHD/HDR processing engine.

Creating a consistent production from a large footprint of sources and delivering the artistic vision of the director in HDR are the primary objectives. But it is also critically important to translate that HDR look back into SDR; the way the vast majority of today’s viewers will see it.

Responsibility for these translations is complex; the camera shaders and technical directors must consider how their fantastic HDR scenes will look in SDR, and the eventual translation downstream at the network hub must be done the same way it was ‘checked’ on the truck.

Imagine has been working on HDR with NFL Network for a number of years

Video processors and converters are found at many steps along the production pipeline, and as these productions shift into HDR, all these processors must be HDR-capable to preserve the signal fidelity, sometimes including custom LUTs and production-specific adjustments. All while fitting into the rack space, power, and other operational requirements of what is already in place.

At Imagine, we have been working on HDR with broadcasters and production companies for years, delivering UHD/HDR capabilities to operations as diverse as QVC Japan and the NFL Network in the USA. That has meant many conversations about HDR and colour space conversion, even as the production industry has been trying to standardise on how these HDR workflows should work. Flexibility is key, as there are many aspects of HDR production that are not yet settled or agreed among the production professionals, so our HDR processors must be able to adapt to the needs and wants of each user.

Major broadcasters are continuing to study how HDR affects viewership, with a focus on the practicalities of delivering productions at the right cost. The next wave of broadcasters coming into HDR all benefit from this accumulated knowledge and experience.

With the right technical infrastructure enabling consistent HDR delivery even within established SDR workflows, the scene is set for increased deployment of HDR into more live event productions. n


TURNING AN AMAZING GOAL INTO A REMBRANDT

Could NFTs be broadcasters’ next big revenue stream? Vislink CEO Mickey Miller certainly thinks so. He explains why to Jenny Priestley

Non-Fungible Tokens. Everybody’s talking about them. Content creators and owners are getting involved, with everyone from Quentin Tarantino to CNN and Liverpool Football Club launching their own NFTs. But could they also be a way for broadcasters and streaming services to make extra revenue?

Vislink has developed a new video editing tool that enables users to create NFTs. After acquiring Mobile Viewpoint last year, the company put a lot of investment into its AI-assisted cameras and as part of that development, found that fan engagement was becoming increasingly important to rights holders.

“We looked at some ideas of what we could do to help fan engagement and we created this clipping tool in LinkMatrix, which is our overall operating platform, that allows users in real time to clip a highlight and then immediately mint that into an NFT,” explains Vislink CEO Mickey Miller.

“There are a lot of ideas that we’re working on with rights holders around being able to monetise that, to be able to create an engagement with the fan that hasn’t been done before. A lot of these fans are engaging with the content in the metaverse and, being fans, they are fanatics and so they would love to have an NFT of their favourite player.”

As the editor creates their highlight and employs the tools to add graphics and audio, they have the option of minting that content onto whichever blockchain they choose. “There’s an option to do it on Ethereum, but it has higher gas prices, which is the cost to do it,” explains Miller. “Most people prefer to do it on a chain like Polygon or Solana, which have much lower gas prices.

“If you think of the blockchain as kind of a conveyor belt with blocks, you would own that content on that block,” he continues. “Within that there’s a pre-defined algorithm of the rights; so say you own it, but you want to sell it, the proceeds will be dependent on how you, the content owner, wants to set it up. The seller could get 70 per cent of the price, the publisher gets five per cent, Vislink gets a share of it. That’s the beauty of a blockchain because now you can see who owns the content, it’s completely transparent, and completely decentralised.”
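
The revenue split Miller describes is, in essence, a small piece of arithmetic written into the token’s contract. The Python below is a plain illustration of that logic; the five per cent publisher share comes from the quote, the other percentages are hypothetical, and none of it represents Vislink’s actual contract or any specific blockchain API.

    def split_resale(sale_price, shares):
        """shares: mapping of party -> fraction taken from each resale;
        whatever remains goes to the seller."""
        payouts = {party: round(sale_price * cut, 2) for party, cut in shares.items()}
        payouts["seller"] = round(sale_price - sum(payouts.values()), 2)
        return payouts

    # e.g. a 100-unit resale with a 5% publisher share plus hypothetical
    # 3% platform and 2% rights-holder shares
    print(split_resale(100.0, {"publisher": 0.05, "platform": 0.03, "rights_holder": 0.02}))
    # {'publisher': 5.0, 'platform': 3.0, 'rights_holder': 2.0, 'seller': 90.0}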

As well as creating NFTs, broadcasters and streaming services can use the clipping tool to create videos that can be posted to social media accounts. “That’s nothing new,” admits Miller. “But automating it is what we’ve brought to the party.”

And it’s not just sport where Miller thinks broadcasters can find a use for the tool. He suggests an event like the Eurovision Song Contest would be a perfect example of where an NFT would be hugely popular.

“If you’re watching the game or something like Eurovision and you see an incredible highlight, you can immediately buy it and brag to your friends, I own this. Also, in the metaverse, ownership and how you represent yourself becomes who you are. So you’d be able to display the NFTs that you own and it’s kind of like, here are my Rembrandts.”

Having created the NFT-Ready Clipping Tool, is Vislink planning to develop any other technology that could be used for NFTs or the metaverse that might appeal to more traditional broadcasters or streaming services?

“We have quite a roadmap,” says Miller. “I recently attended an event with all the athletic directors for all the universities in the US. The level of engagement and enthusiasm around the concept and the tech that we’ve developed was huge. And through that we’ve come up with some other ideas.

“It’s one of those things that we iterate as we continue to evolve the technology, see the use cases, and it’s a way to, again, engage the fan in a lot of different ways. It’s about keeping those eyeballs and monetising them as much as possible.” n


RESTORING THE SUNSHINE

Perennially popular comedy double act Morecambe and Wise continue to appear on TV all these years after their 1970s heyday. Now some of their lost BBC shows are available again following some intricate restoration and reconstruction work, writes Kevin Hilton

The quest to find television programmes missing from broadcast archives is a lesson in patience and perseverance. Even when a missing show is found, it may still be some time before it can be seen again, especially if the recording requires a great deal of intricate restoration work. An extreme example of this is the very first episode of The Morecambe and Wise Show recorded for the BBC, which has been recreated from a badly decomposed film print and now features on a DVD collection with other lost shows starring the still much-loved comedy duo.

The reconstructed episode was made possible through specialised dental scanning technology and is the technical highlight of Morecambe and Wise: The Lost Tapes. This collection was overseen by producer Charles Norton, who had been trying to get BBC Studios to release a new M&W compilation for some time. “I knew there was a lot of material that hadn’t been released before,” he explains. “Initially that was when I was working at BBC Audio Books and then later I was talking to Rebecca Richmond, the commissioning editor of BBC DVD. In parallel I got involved in a joint venture project between BBC R&D, BBC Information and Archives and Queen Mary University of London to restore this highly damaged film print.”

This was the first episode of Eric Morecambe and Ernie Wise’s 1968 series for BBC Two, among the first TV comedies to be shown in colour. Recovered as a black and white film print from a vault in West Africa, it was in a highly decomposed state, the result of vinegar syndrome, which occurs when cellulose-acetate – or triacetate – base film is kept in conditions of high temperature and humidity, causing a breakdown that melts, buckles and shrinks the film.

This posed a major problem in not only seeing what was on the film but also attempting to safely transfer the material to another format because the original was continuing to degrade, producing vapour that could affect other films if they came anywhere near it. “All film scanners are based on the idea of unrolling a film and shining a light through it,” says Norton. “The problem we had was you couldn’t unroll it or touch it because it was falling to bits and decomposing. So we had to find a way of looking inside a film without touching it.”

The means to do this were found through a form of CT scan called micro-computed axial tomography. This was done over a seven-year period, on a machine usually used for scanning teeth, at Queen Mary University of London by Professor Graham Davis and Dr David Mills. The print had to be cut into 1cm cubes so it would fit in the scanner and the images could be seen using a process similar to X-rays, only in 3D rather than 2D. Not every frame could be recovered, but Norton estimates one in every six could be pulled out, although that depended on where in the reel the images were.

When as much as possible had been retrieved, Norton faced the daunting task of assembling the images in the correct order. Fortunately his job was made much easier through the use of software developed by senior BBC R&D engineer Adam Wiewiorka. Having originally feared it would take a few months to complete the process, Norton was able to do it in an afternoon.

The result is not a smoothly running moving image but something that resembles a slide show or security video. However, it is a record of a historic TV moment as Morecambe & Wise began an important phase in their career. The next necessary component was the sound, which had to be synced to the images from off-air recordings. The soundtracks of the entire 1968 series were recorded by Peter T Tatchell, a comedy enthusiast and editor of online magazine Laughterlog, when it was repeated on Australia’s Network 7 in 1973. Another audio track from a 1969 BBC repeat of the first episode had also been recorded by the late radio presenter Ed Doolan.

Audio for the different shows on The Lost Tapes was restored and re-synchronised by composer, sound engineer and musical director of the Radiophonic Workshop, Mark Ayres. He says over the 40 years he has been working on programme restorations, the goal has been to not only collect the best possible audio source but all available sources. “We’ve discovered fans around the world who recorded the soundtracks off-air and even if the film print exists, there’ll be an off-air recording that is better sound quality.

“Of course, it’s non-synchronised so I have to laboriously re-sync it. For this particular episode, there is a minute at the beginning that exists and is from a documentary, which gave me a sync reference. Then we had two off-air recordings, one made in the UK, the other in Australia, and the first thing I did was to sync-match them using the existing clip as a reference so they played at the right speed,” Ayres continues. “Then I swapped between the two recordings, using the best quality from each for every scene and sometimes individual words. Then the soundtrack was de-noised and re-EQed ready for Charles to drop the still images into the timeline.”
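
The first step of that sync-matching, finding the offset between two recordings of the same soundtrack, can be approximated with cross-correlation. The Python below is a generic illustration of the technique, not the tooling actually used on The Lost Tapes, and correcting for speed drift between recordings is a further step it does not show.

    import numpy as np

    def estimate_offset(reference, candidate):
        """Return the sample offset by which `candidate` lags `reference`.
        Both inputs are mono float arrays of the same material."""
        corr = np.correlate(candidate, reference, mode="full")
        return int(np.argmax(corr)) - (len(reference) - 1)

    # Toy check: a copy delayed by 150 samples reports an offset of 150
    rng = np.random.default_rng(0)
    ref = rng.standard_normal(2_000)
    delayed = np.concatenate([np.zeros(150), ref])
    print(estimate_offset(ref, delayed))     # 150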

The finished result is what Ayres describes as “a glorified slide show” but one that becomes “very watchable”. The other three episodes from 1968 were on good quality black and white film, from which they were transferred and restored using the Colour Recovery technology developed by former BBC R&D software engineer Richard Russell. The fifth previously lost M&W show on the DVD was Eric and Ernie’s first to be broadcast on BBC One, which was the duo’s TV home until they went to ITV in 1978. This was discovered by Eric’s son Gary Morecambe in his mother’s attic while he was looking for old scripts.

Peter Crocker of SVS Resources, who handled picture restoration and Colour Recovery for most of the episodes on The Lost Tapes, comments that the unusual aspect of this discovery was that it was a film recording negative instead of a print. “If anyone had tried to watch it on a projector it would have been inverted, like photographic negatives, so that was an oddity.” Crocker had previously restored much of the 1968 episodes and worked on the 1970 show in time for it to be shown by the BBC last Christmas.

That was fitting, given that people of a certain age associate Eric and Ernie with that season. But it’s good to now have more of them to watch at any time! n


WHEN ONE CONNECTION ISN’T ENOUGH

We’ve all been in situations where our cellular connection was slow or even non-existent. Many things can impact our connectivity. Where you are, how many people are using the network, what you’re trying to do, and even the weather can all make a difference in reception and the available bandwidth. We’ve outlined some of the main factors below:

- Obstructions (hilly terrain, dense foliage, large buildings)

- Weather conditions (humidity, heavy cloud cover, fog, precipitation, electromagnetic interference, temperature inversions)

- Number of users

- Location (city, urban, rural)

- Building materials (metal, concrete, tinted and low-e glass)

- Data-intensive applications

- Spectrum bands

- Stationary vs in motion

- Proximity to tower

What if you were a reporter going live from the field? You have to send a high-quality video feed back to the newsroom for broadcast and need a reliable wireless connection no matter where you are. Or a media production company, shooting somewhere remote. Collaborating and communicating in real-time helps avoid expensive reshoots and ensures you get the footage you need.

We can even encounter situations where news media, production companies, and public safety agencies compete for the same bandwidth, along with thousands of people at a large event or sports game.

PREPARING FOR THE UNEXPECTED

Sure, a single cellular connection from a single carrier can do the trick, but relying on just one can leave you vulnerable.

Networks can go down without warning, and upload and download speeds on mobile networks vary greatly by country, carrier, specific location, and the degree of congestion on the network. It’s not unusual to only have a 1 Mbps upload speed from a single connection, especially where crowds gather and cause network congestion. But many applications, like streaming video, require more. Approximately 5 Mbps upload bandwidth is needed to send high-definition, low-latency live video. For 4K UHD streams, you’d need around 25 Mbps upload bandwidth.

Enter cellular bonding. It combines two or more cellular connections to provide more bandwidth for uploading and downloading and more resiliency when cellular network coverage is stretched, or signal strength is diminished. So if a connection drops or packet loss occurs, packets are rerouted across the other connections in the bonded link.
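
A toy model makes the idea easier to picture. The Python below spreads traffic across whichever connections are currently alive, in proportion to their estimated upload bandwidth, and simply skips a path that drops. It illustrates bonding in general rather than Dejero’s Smart Blending implementation; real systems also weigh latency and per-path packet loss.

    import random

    class BondedLink:
        def __init__(self, connections):
            # connections: estimated upload bandwidth per path, in Mbps
            self.connections = dict(connections)

        def total_bandwidth(self):
            return sum(self.connections.values())

        def pick_path(self):
            """Weighted random choice among the paths that are still alive."""
            live = {name: bw for name, bw in self.connections.items() if bw > 0}
            if not live:
                raise RuntimeError("no usable connections")
            names, weights = zip(*live.items())
            return random.choices(names, weights=weights, k=1)[0]

    bond = BondedLink({"carrier_a": 1.0, "carrier_b": 2.5, "venue_wifi": 4.0})
    print(bond.total_bandwidth())        # 7.5 Mbps: headroom for a ~5 Mbps HD feed
    bond.connections["carrier_a"] = 0.0  # one connection drops out...
    print(bond.pick_path())              # ...packets simply route over the rest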

MORE CONNECTIVITY, FEWER PROBLEMS

Dejero Smart Blending Technology not only aggregates 3G/4G/5G cellular connections but also combines other wireless connections such as Wi-Fi and satellite and cable/DSL/fiber broadband connections in a fixed location. In fact, any Internet Protocol (IP) connection from multiple providers can be aggregated to form a virtual ‘network of networks’ with Dejero’s technology.

While each connection path has its own characteristics, Dejero dynamically and intelligently manages the fluctuating bandwidth, packet loss, and latency differences of individual connections in real-time, seamlessly redirecting packets and maintaining session persistence if connections degrade or are lost.

The result? Resilient, high-bandwidth internet connectivity when and where you need it. That means video, voice, and data can be sent and received uninterrupted. Dejero delivers the resilient internet connectivity you need, no matter where you are. n

Discover the Dejero Difference. Learn more: www.dejero.com/solutions/internet-access


BUILDING INNOVATIVE PRODUCTION GALLERY INFRASTRUCTURE

Sky Italia is part of Sky Group, Europe’s leading entertainment company with 23 million customers. Sky delivers its pay offer via satellite and internet, while NOW is its OTT service that allows customers to stream Sky’s cinema, entertainment and sports content. In Italy, Sky is also available on free-to-air digital terrestrial TV via the TV8, Cielo and Sky Tg24 channels.

As part of its production capabilities, Sky Italia needed to distribute gallery multiviewer signals for monitoring across the company headquarters in Milan. Multiviewers are traditionally located only in the video/audio gallery, but it was important for their teams to have them available almost everywhere within their HQ location.

Part of the challenge associated with this project – and Sky’s main priority – was to keep its employees safe during the pandemic outbreak in March 2020. While most colleagues could work remotely, the company also operated a ‘Gold team’ – critical workers – who were required to be present in production areas. For studios, an easy way to respect social distancing was to move workstations to other rooms as much as possible, but at the same time, these remote operators (including EVS and graphics specialists) still needed to view numerous video signals at the same time.

With a network device interface (NDI) solution in development and a gallery based on the technology in the construction phase, the first stage of the project was to analyse the problem.

The Sky team needed to mirror existing video signals and distribute them across a variable distance, with minimal infrastructure modification. NDI was identified as the right technology to meet these needs for several reasons: first, once the input streams are converted and available, they are automatically available wherever a network connection is present. This, in a corporate facility, means almost everywhere. Second, as part of the standard NDI Toolkit there is already a free application for monitoring signals, with no additional costs except COTS hardware.
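
That ‘available wherever there is a network connection’ behaviour rests largely on NDI’s use of standard mDNS discovery. As a rough illustration, and assuming the python-zeroconf package and the _ndi._tcp service type that NDI sources are generally understood to advertise, the sketch below simply lists sources as they appear on the local network; it is not the NDI SDK and not the approach Sky used.

    import time
    from zeroconf import Zeroconf, ServiceBrowser

    class NdiListener:
        # python-zeroconf calls these methods by name; no base class is required
        def add_service(self, zc, type_, name):
            print("NDI source appeared:", name)

        def remove_service(self, zc, type_, name):
            print("NDI source disappeared:", name)

        def update_service(self, zc, type_, name):
            pass

    zc = Zeroconf()
    browser = ServiceBrowser(zc, "_ndi._tcp.local.", NdiListener())
    try:
        time.sleep(10)       # watch for ten seconds, then tidy up
    finally:
        zc.close()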

For the second stage of the project, the Sky team identified Kiloview converters as the right tool for the job because they are bi-directional and can be used as either an encoder or decoder according to the real application.

As a result, Sky implemented a number of Kiloview N3 and N4 converters for both incoming HDMI (from the multiviewer output) and SDI (additional video, if needed) signals. For monitoring, Sky used N3/N4 in decoding mode, or desktop PCs with the NDI Studio Monitor application, as needed.

Inside Sky Italia’s video/audio galleries

As a third stage, the network was later expanded to add more galleries and also proved a useful solution for studio floor monitoring, allowing talent to view multiple signals instead of a single router output.

In addition, the solution is also ideal for Sky production teams who need to monitor more than one location. In their sport newsroom, for example, they need to check incoming live feeds from multiple galleries, and in this situation, Kiloview converters can store several presets, enabling users to select the signal they need with just a single click.

“This project represents a number of firsts for Sky Italia. Not only

Screenshot of the NDI monitor

RENDERING THE NEXT GENERATION OF BROADCAST GRAPHICS

People are visual creatures, which is why television has been such a successful medium. But with more channels on both linear and OTT platforms, broadcasters and streaming services need eye-catching graphics and on-screen presentations to set themselves apart.

While modern tools are now widely available to both create visually arresting graphics and incorporate them into any delivery platform, broadcasters still face a number of challenges. These include fierce competition for audiences, the logistics of implementing effective graphics for remote production, and the lack of a single workflow to control all graphics.

The most effective way of making a production stand out is through the use of data, extended reality (XR) and augmented reality (AR) objects. Today’s graphics workstations and XR/AR engines can deliver powerful, distinctive and ever more realistic visuals that combine text, images and virtual environments to produce visuals that capture the audience’s attention and make people talk.

When looking to incorporate exciting graphics in their production, broadcasters should consider a number of important factors. These include developing the best studio design for the project; the use of data specific to the programme (politics, sport, entertainment); the ability to work across both linear broadcast and OTT platforms; providing remote production capabilities; and having high visibility on social media.

At disguise, our extended reality platform allows virtual graphics to be rendered in real time in-camera. The disguise set extension feature allows small studio sets to become expansive virtual universes. Designed for both creatives and technologists, our xR workflow simplifies the process involved in an LED virtual production stage, allowing new creative possibilities for broadcast productions, with bigger ideas for smaller sets.

Supported by Unreal Engine, the most powerful and widely used creation tool for virtual graphics, disguise xR allows broadcasters to augment the viewer experience to capture eyes on screen in today’s competitive market. Earlier this year, our capabilities and expertise were further expanded when disguise acquired Polygon Labs. By incorporating its powerful cloud-native production platform, turnkey services and high-end graphics with real-time data visualisation, used by leading broadcasters including CNN, The Weather Channel and TV Globo, disguise is already looking at delivering the next generation of broadcast graphics. n

Learn more about disguise: disguise.one


LIVE HDR REMOTE PRODUCTION AT SKY SPORTS

Russell Trafford-Jones investigates how the sports broadcaster seamlessly produces programmes in both SDR and HDR 700 times a year

Sky has a long history of adopting new technologies. The move from analogue to digital back in 1998 brought more channels and improved the audio and picture quality. More recently, Sky Glass was a bold move on many fronts, all of which were aimed at a better viewing experience. Sky Glass is a UHD TV offering, which is no surprise from a company that’s delivered UHD programming for six years now, and it also supports Dolby Atmos which Sky has offered for the last four years. But for those who believe in the ‘better pixels’ philosophy of quality over quantity, high dynamic range (HDR) is the technology to look out for.

Starting in 2020, Sky customers could watch on-demand content in HDR. August last year saw HDR arriving on live English Premier League matches and it has since expanded to 700 events per year.

Bringing HDR to live broadcasts was no small task. Looking around the truck and gallery now, it may seem that little has changed, but that’s exactly the point; adding HDR has been a story of minimising changes to the workflow. Through careful research and planning, Sky has added HDR to live transmissions with no increase in headcount, and disruption has been kept to a minimum by only using HDR in key positions such as the technical supervisor and match director. Along with HDR, Sky has moved to using the BT.2020 colour space, a standard that allows many more colours than the Rec.709 colour space popularised by HD SDI. This is an important element for sports events, which often involve highly saturated colours around the stadium, on players’ kits and on match balls.

Carys Hughes, senior picture quality engineer at Sky, explains the challenges in bringing HDR to the company’s well understood UHD workflows. “Anything we do in HDR has to work for our SDR viewers,” she says. “It was important to validate conversions to and from SDR. This project touched many workflows and we needed confidence that video would look right everywhere in the chain.”


HDR is not just about the cameras. Hughes’ work extended to ensuring workflow compatibility with systems including studio graphics, channel branding, playout servers, and advertising content.

Sky is using HLG, a popular HDR format for live broadcasts. Its full name is hybrid log-gamma, which is a nod to the way the format works. HDR requires a different interpretation of the digital video data signals. SDI video carries a stream of numbers that describe the brightness of a scene. Just like the lookup tables (LUTs) used in camera work and colour grading, HDR standards define how image ‘luminance’ levels are distributed between one code value and the next. Converting between SDR and HDR requires correct selection and application of specific LUTs. Hughes explains, “Adding HDR to live broadcasts, and conversion functionality to and from SDR, was in part an extension of a lot of the signal processing understanding and validation required of our team to ensure that our earlier launch of HDR for on-demand was a success.”
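
For reference, the curve Hughes is describing is defined in ITU-R BT.2100. The Python below implements the HLG OETF, the mapping from normalised linear scene light to the non-linear signal value, using the constants published in that recommendation; any production use would of course sit inside a full colour pipeline rather than a bare function.

    import numpy as np

    # HLG OETF constants from ITU-R BT.2100
    A = 0.17883277
    B = 1.0 - 4.0 * A            # 0.28466892
    C = 0.55991073

    def hlg_oetf(linear):
        """Map normalised linear scene light E in [0, 1] to the HLG signal E'."""
        E = np.asarray(linear, dtype=float)
        shadows = np.sqrt(3.0 * E)
        highlights = A * np.log(np.maximum(12.0 * E - B, 1e-12)) + C
        return np.where(E <= 1.0 / 12.0, shadows, highlights)

    # Half of the signal range covers the darkest twelfth of the scene light
    print(hlg_oetf([1.0 / 12.0, 0.5, 1.0]))   # ~[0.5, 0.87, 1.0]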

Fluent conversion between HDR and SDR is essential to meeting Sky’s aims of minimising the impact of HDR programme delivery to staff. In fact, there is little HDR monitoring in the galleries at all, as David Adams, senior technical supervisor for remote workflows explains: “We use the same down-mapping everywhere in our gallery that our SDR viewers will see, even when shading the cameras. This allows us to keep everyone working in SDR with the assurance we have eyes on the picture the majority of our viewers are seeing.”

Sky has been producing sports programmes using remote production since 2020. Although the pandemic certainly accelerated the broadcaster’s plans to move to remote production, Sky’s commitment to environmental sustainability was already working to reduce travel. The remote trucks for Formula One, football and cricket are operated by third parties who are in step with Sky Sports’ move to remote production and helped pioneer the project in the early tests, where a single HDR centre camera mirrored the in-production SDR camera.

“Line-ups are an essential part of delivering an HDR programme,” comments Adams, “so we use test cards designed for HDR, such as the EBU Tech 3373 card and the ITU-R BT.2111 HLG colour bars.”

This isn’t without its challenges when Sky delivers an SDR programme feed internally or to third parties. When a third party is expecting SDR, receiving a picture labelled ‘HLG 2160p50’ is confusing; more importantly, on a waveform monitor the colours won’t sit in the target boxes and the luminance values will appear incorrect. Adams believes creating a simpler test card for operational use is the way forward when exchanging feeds with SDR organisations.

Monitoring HDR video is limited to one or two locations within the OB truck, and five positions within Sky’s production facilities ahead of transmission. Quality viewing is done on both SDR and HDR monitors, in conjunction with highlighting overviews on a Telestream PRISM waveform monitor. The ‘stop display’ and ‘false colour overlay’ highlight areas of the picture which are outside the normal colour gamut or outside the SDR dynamic range.
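
The sketch below gives a rough sense of what a false-colour style overview does: each pixel is classified against an assumed SDR ceiling so that out-of-SDR-range highlights stand out. The 203-nit figure (a common HDR reference-white convention from ITU-R BT.2408) and the colour bands are illustrative assumptions and do not describe the PRISM’s actual processing.

```python
# Hypothetical false-colour classifier: flags pixels whose luminance sits
# above an assumed SDR ceiling, i.e. detail that only HDR viewers will see
# at full brightness. Thresholds are illustrative, not the PRISM's settings.

SDR_CEILING_NITS = 203.0   # assumed HDR reference-white level (BT.2408 convention)

def false_colour(luminance_nits: float) -> str:
    """Return a stand-in overlay colour for one pixel, given its luminance in nits."""
    if luminance_nits > SDR_CEILING_NITS:
        return "red"     # outside SDR dynamic range: will be compressed when down-mapped
    if luminance_nits > 0.75 * SDR_CEILING_NITS:
        return "yellow"  # approaching the SDR ceiling
    return "grey"        # comfortably within SDR range

if __name__ == "__main__":
    for nits in (50, 180, 600):
        print(f"{nits:>4} nits -> {false_colour(nits)}")
```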

These displays are critical because they describe the image in a way an HDR monitor doesn’t. An HDR monitor will show the user whether a picture looks good in HDR, but it won’t give insight into how much the picture will be changed for SDR. The PRISM also allows the SDI payload ID to be seen, which is used extensively in the workflow. The payload ID contains format information about the video, including its resolution, structure and HDR metadata. With so many possible combinations of video, Adams worked with vendors to ensure monitors and other equipment reconfigure themselves to follow the input video format when it is switched.
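
The following is a deliberately simplified, hypothetical sketch of that ‘follow the payload ID’ behaviour: read the format information carried with the signal, then reconfigure the monitor to match. The field names and values are stand-ins and do not follow the real SMPTE ST 352 bit layout.

```python
from dataclasses import dataclass

# Simplified, hypothetical view of the format information a payload ID conveys.
# Real SDI payload IDs are defined by SMPTE ST 352; these fields are illustrative only.

@dataclass
class PayloadInfo:
    resolution: str       # e.g. "2160p50"
    transfer: str         # "SDR", "HLG" or "PQ"
    colorimetry: str      # "BT.709" or "BT.2020"

def configure_monitor(info: PayloadInfo) -> dict:
    """Pick monitor settings that follow the incoming video format."""
    return {
        "scale": "UHD" if info.resolution.startswith("2160") else "HD",
        "eotf": info.transfer,
        "gamut_overlay": info.colorimetry == "BT.2020",
    }

if __name__ == "__main__":
    incoming = PayloadInfo(resolution="2160p50", transfer="HLG", colorimetry="BT.2020")
    print(configure_monitor(incoming))
```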

Bringing HDR to live sports on Sky is an achievement born of careful research, built on knowledge gleaned from Sky’s track record of building new technologies into its platforms. Adams concludes, “Our goal was to, as much as possible, apply HDR as a seamless layer on our regular productions and I think we did just that.” n


‘THE WORLD’S MOVING TO HDR’

HDR can lift the image on a screen to an entirely new level. From HDR10+ to HDR Vivid and SL-HDR, there are numerous versions of the technology. Cobalt Digital’s Ciro Noronha sits down with Jenny Priestley to look at how HDR has developed and where it’s headed

First introduced in 2014, high dynamic range (HDR) is finding its way onto more and more screens. To put it in basic terms, HDR heightens an image’s dynamic range: the contrast between the darkest blacks and the brightest whites.
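
Dynamic range is often expressed in ‘stops’, where each stop is a doubling of light. A quick back-of-the-envelope comparison, using assumed display figures rather than any particular product’s specification, shows the scale of the difference:

```python
import math

def stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in stops: each stop is a doubling of luminance."""
    return math.log2(peak_nits / black_nits)

# Assumed, indicative figures only: a typical SDR panel vs a bright HDR panel.
print(f"SDR-ish display: {stops(100, 0.1):.1f} stops")     # ~10 stops
print(f"HDR-ish display: {stops(1000, 0.005):.1f} stops")  # ~17.6 stops
```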

Advanced HDR by Technicolor is a suite of HDR production, distribution and display solutions that leverages machine learning technology to maximise image quality and enhance the viewing experience. It can optimise images in real time, which means that when a viewer is watching a football match that starts during the day and ends under artificial illumination, Technicolor’s SL-HDR automatically adjusts and corrects for the changes in lighting.

Cobalt Digital partnered with Advanced HDR by Technicolor to develop solutions that enable broadcasters to manage multiple formats and quality levels and deliver both SDR and HDR in a seamless, dynamic and automated process.

Ciro Noronha, executive vice president of engineering at Cobalt Digital, likens the move to HDR to the adoption of colour. “When we went from black and white to colour TV, you had a signal that was black and white with the colour information on the side,” he explains. “The idea with SL-HDR is the same thing.”

When the viewer’s TV receives the base signal it is traditionally in SDR, which means it’s compatible with every kind of TV on the market. The HDR information is carried as metadata, and a TV equipped with the relevant technology uses that metadata to reproduce the full HDR signal.
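
To make the idea concrete, here is a deliberately simplified sketch of the ‘SDR base plus metadata’ concept: a legacy receiver shows the base picture as-is, while an HDR-aware receiver applies the accompanying metadata to approximate the original range. The single per-frame gain used here is an invented stand-in; the real SL-HDR1 reconstruction is specified in ETSI TS 103 433-1 and is considerably more sophisticated.

```python
# Conceptual illustration of "SDR base + metadata" delivery, not the actual
# SL-HDR1 algorithm. Pixel values are normalised linear light; metadata is
# a single made-up expansion gain per frame.

def encode(hdr_frame, gain):
    """Produce an SDR base frame plus metadata describing how to re-expand it."""
    sdr_base = [min(1.0, v / gain) for v in hdr_frame]   # crude scale-and-clip tone compression
    metadata = {"expansion_gain": gain}
    return sdr_base, metadata

def decode_sdr(sdr_base, _metadata):
    """Legacy path: ignore the metadata entirely and show the base picture."""
    return sdr_base

def decode_hdr(sdr_base, metadata):
    """HDR-aware path: use the metadata to approximate the original range."""
    return [v * metadata["expansion_gain"] for v in sdr_base]

if __name__ == "__main__":
    hdr = [0.05, 0.4, 1.8, 3.0]     # values above 1.0 stand in for "brighter than SDR white"
    base, meta = encode(hdr, gain=3.0)
    print("SDR receiver sees:", [round(v, 2) for v in decode_sdr(base, meta)])
    print("HDR receiver sees:", [round(v, 2) for v in decode_hdr(base, meta)])
```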

“That was written up into the international standards,” adds Noronha. “It’s in one of the ITU standards and it’s also been written into ATSC 3.0 in the United States. There is more than one HDR standard; in fact, there are many to choose from.”

As well as being employed for live broadcasts, there is the possibility of using HDR with legacy content. Again, Noronha cites the move from black and white to colour: “When you think about black and white content that’s been colourised, a human being did that using a computer. They added information that wasn’t there. It’s the same thing with SDR. The HDR information wasn’t captured when it was first made. So if you want to bring that legacy content into an HDR workflow, the best way to do that is to get a colourist, someone who knows what they’re doing,” he continues. “Technicolor brought together a group of colourists to work on a whole bunch of scenes, and they had artificial intelligence learn from that. Now they have a system, called Intelligent Tone Management (ITM), which is as good as when you used to get the colourist to do it.”

Noronha states that as HDR technology becomes cheaper, the world will move towards it. “It’s just a matter of formatting the signal and getting the pipe,” he adds. “I would say more people can see a difference with content that’s in HDR than with 4K or 8K. The nice thing about HDR is there’s a continuum of TVs, and the metadata that comes in the stream helps the TV to map it to what it can do. It’s just a matter of putting together the infrastructure. There have been false starts and stumbles but the infrastructure is coming.”

HDR and SL-HDR will continue to evolve, believes Noronha, as both will be constantly updated to give better performance. “What you’re going to see more of is adoption by the TV manufacturers and broadcasters,” he adds. “The problem with SL-HDR1 is that not that many TVs have it yet, but on the other hand, it is just firmware so it can always be sent as an over-the-air upgrade.”

Does that mean Noronha believes the move towards HDR adoption is being driven by TV manufacturers, or by streaming services that want to offer their viewers content in the highest quality available? “Well, that’s the big question!” he admits. “Gone are the days when you bought a TV and kept it for 20 years. The manufacturers want you to change it every two or three years and they need to give you a reason to do that. The broadcasters want you to watch some more because then they can make money from advertising. I wouldn’t identify one sector that is pushing it; it’s the whole thing.” n



THE NEXT STEP IN THE CONNECTION BETWEEN CREATION AND TECHNOLOGY

The metaverse is everywhere. Or, at least it feels that way, which is actually the point. What is the metaverse? Why is there so much hype? What does it mean for content owners and creators?

The promise of the metaverse is the ability for us humans to easily exist in both physical and virtual ‘digital twin’ environments. While no-one agrees on a single definition of the metaverse, most agree that it represents the natural evolution of the internet. Imagine a virtual destination where humans, in the form of their self-defined avatars, enjoy three-dimensional, spatially and contextually aware experiences. We’ve been embracing digital communities for decades in the form of video games (Fortnite), social communities (Second Life) and augmented reality (Pokemon GO).

Why is the metaverse attracting so much attention now?

Entertainment budgets will soon be in the hands of generations that embrace technology and already shift easily between digital and physical experiences. A Verizon Media study found that 54 per cent of 18 to 24-year-olds say they appreciate it when brands “connect with them in new and innovative ways”. A shift is also under way in where time is spent: Deloitte’s 2022 Digital Media Trends report found that gaming is a very close second to watching TV or movies at home, and approximately 50 per cent of all US gamers said that playing games has taken time away from other entertainment activities.

Digital natives, those born after 1980, have grown up with devices in hand and a seemingly intuitive grasp of how to manipulate new tools, apps and environments. “The metaverse will usher in even more forms of digital media and applications wherein consumers can interface with their favourite brands, characters and worlds,” says Jon Wakelin, partner at Altman Solon.

What does the increased attention on the metaverse mean for content owners and creators? It represents an opportunity to create ever more immersive experiences for fans, and increased demand for the talent to design these 3D worlds. The metaverse opens the door to new production methods, eliminating the need to mix the physical world with green screen technology. It increases demand for graphic designers with CAD, CGI, 3D modelling and spatial computing skills. And it requires an infrastructure that can respond to millions of simultaneous requests, rendering images on demand.

Media brands, large and small, recognise the opportunity to extend their brand. Global media giants Disney and Warner Bros Discovery are already exploring next-generation storytelling and the monetisation of NFTs. Disney’s metaverse strategy is led by the former head of Disney Parks and Disney Interactive, bringing a blend of knowledge in creating different types of immersive experiences that reinforce the Disney brand. Warner Bros Discovery will capitalise on the Harry Potter franchise to create interactive worlds for its global fanbase.

“I anticipate media and entertainment brands taking control of their intellectual property,” states Tom Ffiske, editor of Immersive Wire. “Both brands and individuals will be able to design and license non-fungible tokens (NFTs) and avatars with blockchain-enabled contracts, opening the door for both trusted relationships and systematic allocation of revenue.”

An example of the creativity, immersion and opportunity on offer is the development of new digital communities. MotoWorld, set to launch in January 2023, is an immersive community where automotive enthusiasts can explore and participate in auto-centric activities. Visitors become creators, designing their homes, building and personalising cars by ‘buying’ digital parts from known automotive brands, creating their own race environments, or choosing existing race tracks to race against other members. CEO Darcy Lorincz explains: “I believe there has long been a connection between creativity and technology and the next step is adding in the social element. The metaverse provides multiple monetisation opportunities, both for those creating platforms like MotoWorld and those who use these platforms to create and monetise their personalised digital products.”



The evolving metaverse is not without challenges. The key issues relate to networks, safety and interoperability, and media brands expanding into the metaverse must take each of these into account as they develop their virtual worlds.

Today, the internet, particularly in the United States, is not optimised to handle the high volume of real-time interaction, rendering and dynamic resource allocation that the metaverse requires. Media brands must understand and address this concern to ensure a meaningful user experience.

Kavya Pearlman, founder and CEO of the XR Safety Initiative, is concerned about what can be learned from the volumes of data generated by our actions within the metaverse. “While much of that data would be used to improve our experience, that same data also reveals more about individuals than they may realise, leading to unanticipated consequences,” she says.

Interoperability is the biggest challenge when it comes to sharing digital goods and assets across digital communities. Due to the lack of standards, it is not clear how identities established, or goods purchased, in one world can be used in another. If the metaverse is truly meant to function as a digital twin of the physical world, it must be as easy to move between different environments digitally as it is physically.

Boo Wong, director of live entertainment at Unity, advises: “The critical issue is in managing the IP created by brands and individuals rather than tech standards. The recently announced Metaverse Standards Forum, of which Unity is a founding member, already reflects the interest and commitment from a wide variety of market participants to defining common standards.”

Entertainment and media brands expanding into the metaverse clearly see the opportunity to increase engagement with their fans. They also have a responsibility to ensure the best possible experience for users.

Wong says it best: “Media brands and IP owners will create engaging stories for their audience, wherever that audience is, on whatever platform they prefer, in a lean-back or lean-forward fashion.” n


THE COMING OF AGE OF 12G

With recent advances in product design, it’s increasingly apparent that many in the broadcast industry are going to bypass quad-link solutions in favour of 12G-SDI for 4K/UHD, writes PHABRIX SVP sales and operations, Martin Mulligan

Ever since 4K/UHD production started to become a realistic proposition, the need to have a supporting infrastructure capable of dealing efficiently with its high-bandwidth requirements has become increasingly acute. Factor in the rise of higher frame rates, wide colour gamut and higher dynamic range, and it’s clear that you have the conditions for the most significant step-change in broadcast technology for many years.

Consequently, broadcasters around the world have lately been engaged in reworking their infrastructures in order to cope with the demands of these new technologies. As ever, the overarching objective has been to ensure that there is full support for the latest SMPTE standards, ideally over a single cable to reduce the complexity of implementation. And with data rates continuing to soar, systems designers and integrators need to have a keen awareness of how requirements are likely to develop over the next few years in order to future-proof broadcasters’ investments.

In 2015, SMPTE introduced 12G-SDI, with standards catering for single-link (1 x 12G), dual-link (2 x 6G) and quad-link (4 x 3G) solutions to help support 4K 4:4:4 30fps and 4:2:2 60fps formats. Until recently, it was common for 4K/UHD content to be delivered via quad-link (4 x 3G) connections. There are several reasons for this approach, including familiarity with 3G-capable systems, which are well-proven and reliable, and the initial complexity and challenges associated with making 12G chipsets and cabling.

But it was only a matter of time before single-link 12G-SDI solutions became more workable, and there are growing signs that that moment has now arrived. There are numerous benefits to adopting a single-cable solution, including reduced complexity, weight and cost; all issues of particular concern to providers of outside broadcast trucks and flyaways.

With many manufacturers having now overcome the challenges of designing 12G products, the impetus to bypass the intermediate step of using 4 x 3G-SDI in favour of 12G-SDI is increasingly apparent.

Meanwhile, in the IP world, SMPTE developed its now widely adopted standards suite, ST 2110, to be somewhat agnostic to image size, with ST 2110-20 able to address image sizes up to 32K. And with the advent of 25GE networks, 4K/UHD over IP (a 12G signal) can be easily accommodated. Simultaneously, the use of UHD for virtual studio applications (including VR and AR) and large video walls in entertainment venues is becoming increasingly commonplace.

These applications often require huge computer-generated images as their backdrops, scenes or graphics, heightening the demand for extended UHD (eUHD) formats. These generally involve data rates of 18-21 Gbps in order to accommodate 4K/UHD output up to 60p RGB 4:4:4 12-bit.
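
A rough payload calculation shows why these link speeds line up the way they do. The figures below count active-picture bits only and ignore blanking, ancillary data and audio, so real interface rates (for example 12G-SDI’s nominal 11.88 Gbit/s) sit higher; treat the numbers as indicative rather than a cabling specification.

```python
def payload_gbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Approximate active-picture payload in Gbit/s (blanking and ancillary data ignored)."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# UHD 2160p50, 4:2:2, 10-bit: two samples per pixel (Y plus alternating Cb/Cr)
uhd_422 = payload_gbps(3840, 2160, 50, 10, 2)

# "eUHD"-style UHD 2160p60, RGB 4:4:4, 12-bit: three samples per pixel
euhd_444 = payload_gbps(3840, 2160, 60, 12, 3)

print(f"2160p50 4:2:2 10-bit ~ {uhd_422:.1f} Gbit/s  -> beyond 3G/6G, hence 12G (or 4 x 3G)")
print(f"2160p60 4:4:4 12-bit ~ {euhd_444:.1f} Gbit/s -> into the 18-21 Gbit/s eUHD territory")
```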

PHABRIX Qx and QxL

Looking ahead, it is likely that there will be demand for 8K content, so in preparation for this some manufacturers are now working on enabling quad-link 12G (4 x 12G-SDI). With the industry’s overall ‘direction of travel’ now being clear to see, PHABRIX has developed its Qx range of test and measurement products to cater for 12G-SDI and ST 2110 IP workflows.

Designed to support compliance testing and interoperability between vendors, the range includes price-performant 12G-SDI solutions as well as industry-leading SDI-STRESS testing tools. Meanwhile, the QxL family is targeted at being price-performant in the 25GE ST 2110 IP market, with the option to support the new eUHD formats.

So whilst 12G-SDI undoubtedly represents a major shift for the industry, the tools to make it a workable reality are now becoming accessible to all. n


WE NEED TO TALK ABOUT THE SCALABILITY OF STREAMING

Thierry Fautier, vice president of video strategy at Harmonic, is one of the leading industry voices when it comes to ultra high definition. Here, he discusses why the adoption of higher resolutions is causing problems for the infrastructure of the internet

How did you get started in the media tech industry?

I started as a research engineer at Philips Research Labs in 1988, building the technology foundation for the MPEG-1 video compression standard with CIF resolution for CD-ROM applications at 1.5 Mbps. This led to the 1995 development of the MPEG-2 standard that has revolutionised the video world.

How has it changed since you started your career?

When I started my career, video compression was a new concept. We did not have live systems or silicon. Today, we can encode 8K video, meaning it’s possible to process 256 times more pixels using software in the cloud. The evolution of silicon technology has enabled more powerful chips to be built, more complex systems to be developed, and significant computing power to be delivered.

What makes you passionate about working in the industry?

My passion comes from the ever-evolving technology changes in our industry. While at Philips I embraced change, moving from algorithm research to SoC decoder implementation. I then transitioned from R&D to marketing, which was a big leap. In 2014, I left the semiconductor business for Harmonic, which introduced me to the exhilarating world of software running on COTS servers and virtualised environments in the cloud. But perhaps the most exciting transition for me has been leading the charge for IPTV, streaming and UHD solutions at Harmonic, as well as serving as president of the Ultra HD Forum for five years. Today, I am tasked with spearheading new initiatives related to streaming scalability.

The great thing about working at Harmonic is that over the last 20 years we have shown that we’re capable of reinventing ourselves and evolving as a company to keep pace with industry changes. I’m proud of our history in pioneering technological innovation. It keeps me inspired and moving in a forward direction.

If you could change one thing about the media tech industry, what would it be?

I’d love for there to be more collaboration in our industry. We have multiple standards groups (e.g., MPEG, DVB, CTA, SMPTE) and forum groups (e.g., DASH-IF, Ultra HD Forum). Yet we still have work to do to jointly market new technology, such as codecs. AVC was a huge success, and HEVC was a failure. Ultimately, this situation gave birth to AOM and its AV1 codec. We now have more market fragmentation in the codec space and have lost several years of progress, with HD still massively rooted in AVC, 19 years after the standard was published!

How inclusive do you think the industry is, and how can we make it more inclusive?

Our industry has traditionally been male dominated. For instance, only ten per cent of the students in my engineering school classes were women. Now we are seeing more women taking leadership roles in corporations and in standards developing organisations. We also need to have more minorities represented. This is going to take time to achieve, but the awareness and commitment to change is promising.

How do we encourage young people to embark on a career in media technology?

Young engineers entering the market are faced with new technologies, changing consumer behaviours and always evolving business models. The tools we have to build new solutions and products are quite advanced. It’s important for young people to not only focus on the technical side, but also understand the business and product management sides; that’s a must if you want to progress in your career.

Where do you think the industry will go next?

Our video industry is broad, encompassing everything from production to primary distribution for broadcast and streaming applications. Consider sports, for example: we now see remote production using 5G in the field. That style of production is being adapted to distribution. When you’re watching a football match, you can now see the yellow cards, the red cards and the replay of goals during the game thanks to streaming technology. Streaming is a game changer and will bring younger audiences back to consuming sports.

The second evolution I see happening is related to social networks. It’s intriguing that a company like Netflix, with more than 200 million subscribers, has not yet found a way to connect all of those subscribers. Netflix subscribers are very passionate about sharing the movies and TV series they enjoy. The same passion can be found in sports fans. If you can find a way to connect people, then they will be even more loyal to your service.

Another trend to keep an eye on is immersive experiences, also known as the metaverse. The take-up of VR and limited AR has been pretty small for industrial applications, with big players like Facebook or Apple putting a lot of resources behind those efforts. This is a space to watch, but the industry needs to be ready to attract consumers, and it will take time.

What’s the biggest topic of discussion in your area of the industry?

Cloudification is well underway, but still has a small adoption rate on the video side. I am confident we are moving in the right direction, driven by streaming.

The scalability of streaming is a huge topic. With only 12 per cent of broadcasts available through streaming services in the United States for the Super Bowl and 40 per cent in the UK for the UEFA EURO 2020, internet infrastructure is being challenged. And we’re only talking about HD; the UHD streaming wave is coming!

Another big focus is sustainability. We have to reduce our carbon footprint in production, delivery and on devices. But we also need a common tool to measure sustainability. The industry forum Greening of Streaming is helping to steer those discussions.

The metaverse is also a hot topic that Apple is drumming up excitement about. For the metaverse to succeed, we will need to see compelling use cases, and powerful yet affordable devices.

What should the industry be talking about that it isn’t at the moment?

The industry needs a better understanding of consumer behaviour and must adapt products to people’s needs. These efforts will help the pay-TV market get stronger. For example, we still serve ads that are unrelated to consumers’ tastes. The industry needs to embrace an opt-in system where users can declare their interests as opposed to advertisers using smart algorithms to guess viewer preferences.

Sports is another frustration for viewers, even on streaming services. Viewers tend to have a similar experience with streaming live sports as they do with broadcast. Innovators such as DAZN and Eleven Sports, which come from the digital space, are working on enhancing the sports streaming experience, but in general, we are only about halfway there.

I laud all initiatives to bring internet to the masses in developing countries. However, developing countries have limited internet access, so I question whether internet delivery is the best option for them. Those countries might benefit from a download-and-go delivery mechanism, newer codecs and more affordable phones.

Video delivery to mobile devices is a trending topic. Unicast is not well tailored for delivering live events, such as the World Cup and Olympics, at scale. There are some alternatives on the business and technology side, such as 5G broadcast (FeMBMS) and multicast. We will also need those new tools to limit the strain on 5G network capacity. n
