Top of the Pops turns 60
Live production and 5G
For anyone working in any part of the live TV industry, 2024 is going to be a year like no other. Yes, we’ve had the Euro football championships and the Olympics and Paralympics in the same summer before, but we haven’t also been preparing for elections in both the UK and United States at the same time.
It all means that the technology used to beam these events to TV viewers around the world is going to be more important than ever before. It’s very often in live production that we see new technology introduced. I don’t expect either the host broadcasters for the Olympics or the Euros to push the technological boundaries too much; they are tier-one events, after all, and broadcasters will want the guarantee that the host broadcaster can deliver pictures. I do expect the cloud and 5G to be a part of individual broadcaster workflows. We’ll definitely be looking at plans for the Olympics in more depth in the coming months.
Live production is our main focus in this issue. We hear how 5G is becoming more popular as a means of delivery, and break down the difference between 5G and 5G Broadcast. Plus, Kevin Emmott looks at how remote production is key to German sports streamer Dyn Media’s coverage.
“I don’t expect either the host broadcasters for the Olympics or the Euros to push the technological boundaries too much. I do expect the cloud and 5G to be a part of individual broadcaster workflows”
Elsewhere, Kevin Hilton charts the history of iconic music show Top of the Pops as it turns 60. We meet the cinematic sound design team tasked with creating a full Dolby Atmos mix for Apple TV’s Masters of the Air and discover how one post production house’s rendering storage requirements are helping to heat a public swimming pool.
As you’d expect, this issue takes a look at what attendees can expect to see at NAB Show, from the exhibition to the numerous zones around the LVCC. It sounds like it’s going to be another great show.
As always, I am looking forward to seeing lots of new innovations on show in Las Vegas. In 2023, the industry was just starting to talk about artificial intelligence and Gen AI. Over the past few months that discussion has grown ever louder, and at NAB I expect it’ll be a full-on shout! See you in Vegas.
JENNY PRIESTLEY, EDITOR
@JENNYPRIESTLEY
Twitter.com/TVBEUROPE / Facebook/TVBEUROPE1
Editor: Jenny Priestley
jenny.priestley@futurenet.com
Content Writer: Matthew Corrigan matthew.corrigan@futurenet.com
Graphic Designers: Marc Miller Sam Richwood
Production Manager: Chris Blake
Contributors: David Davies, Kevin Emmott, Kevin Hilton, Neil Maycock, Robert Shepherd
ADVERTISING
Advertising Director: Sadie Thomas sadie.thomas@futurenet.com +44 (0)7752 462 168
SUBSCRIBER CUSTOMER SERVICE
To subscribe, change your address, or check on your current account status, go to www.tvbeurope.com/subscribe
Digital editions of the magazine are available to view on ISSUU.com
Recent back issues of the printed edition may be available; please contact customerservice@futurenet.com for more information.
TVBE is available for licensing. Contact the Licensing team to discuss partnership opportunities. Head of Print Licensing Rachel Shaw licensing@futurenet.com
SVP, MD, B2B Amanda Darman-Allen
VP, Global Head of Content, B2B Carmel King
MD, Content, Broadcast Tech Paul McLane
VP, Global Head of Sales, B2B John Sellazzo
Managing VP of Sales, B2B Tech Adam Goldstein
VP, Global Head of Strategy & Ops, B2B Allison Markert
VP, Product & Marketing, B2B Scott Lowe
Head of Production US & UK Mark Constance
Head of Design, B2B Nicole Cobban
To mark its 60th anniversary, Kevin Hilton looks at Top of the Pops’ long history and technological innovations, with reminiscences from those who worked on it
When a cinematic sound design team is tasked with creating a full Dolby Atmos mix for Apple TV’s Masters of the Air, they tell Kevin Emmott the sky is the limit
With anticipation building for NAB Show 2024, TVBEurope speaks to Chris Brown, outgoing managing director and EVP, global connections and events at NAB, about what visitors can look forward to seeing and hearing about in Las Vegas
Developed and jointly operated by EBU and Nagravision, Eurovision Sport aims to enhance the amount of free sports content available to viewers across Europe and worldwide. The company’s Jean-Luc Jezouin tells Jenny Priestley about the technology under the platform’s hood
Remote broadcasting might be an established way of working, but can it be done better? Germany’s Dyn Media says it can. The streaming startup tells Kevin Emmott how it has redefined sports coverage to promote flexible growth for its sports, its coverage, and its people
Amidst fears of AI displacing jobs lies a lesser-known narrative—where AI collaborates with humans and fosters job creation in TV and film production, writes Robert Shepherd
It is expected that a newly enhanced version of the AES70 audio control standard will lead to increased adoption in various professional applications, writes David Davies

Most of us know 5G as the latest mobile network technology for cellular service. It is a standard capability in modern mobile devices, and we know that it provides faster data transfer.
The industry collaborative organisation that defines cellular standards is the Third Generation Partnership Project (3GPP).
Founded in 1998 to bring together all interested parties to work on 3G cellular, the organisation has subsequently stayed together to develop the LTE/4G and 5G standards.
Structurally, 3GPP has seven national or regional telecommunications standards organisations as primary members, with a variety of other organisations representing the market as associate members. It is recognised as the authority on cellular standards.
5G is the current state of the art in cellular, bi-directional wireless connectivity. It theoretically offers transfer rates of up to 20 Gb/s, although in the real world we expect to see rates of around 50-100 Mb/s.
This is more than enough bandwidth to carry broadcast-quality video signals, and we have seen it become adopted as part of remote production architectures. Back in 2019, BT Sport provided coverage of a football match at Wembley Stadium: all the cameras were fitted with 5G transmitters sending directly to a production suite 20km away.
This and many other products and services are broadcast applications of 5G cellular. But they are not 5G Broadcast: that is something very different, and potentially very powerful.
Included in 2017’s release 14 of the 3GPP standard was Further Enhanced Multimedia Broadcast Multicast Service, or FeMBMS. In its practical implementation, the name has been simplified to 5G Broadcast.
The name FeMBMS defines accurately what it does, and in particular it includes the word “multicast”. Cellular communication – whether it is carrying a voice call, checking Instagram or a professional video camera output – is a unicast stream. It goes from one source to one destination.
5G Broadcast operates like a multicast stream, sending data from one source to numerous receivers. It’s similar to how traditional terrestrial TV works, with a single transmitter reaching potentially millions of viewers within its coverage area.
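The distinction is easy to see at the level of ordinary IP networking, which uses the same unicast/multicast vocabulary. The sketch below, in Python with plain UDP on a local network, is purely illustrative (it is not the FeMBMS/5G Broadcast stack itself, and the group address and port are arbitrary assumptions): one sender transmits a single datagram to a group address, and any number of receivers can join that group without the sender knowing who they are.

```python
import socket
import struct

GROUP, PORT = "239.1.1.1", 5004  # arbitrary multicast group and port

def send_once(payload: bytes) -> None:
    """One sender, many receivers: a single datagram goes to the group
    address; the sender never needs to know who is listening."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
    sock.sendto(payload, (GROUP, PORT))
    sock.close()

def receive_once() -> bytes:
    """Each receiver joins the group independently, much as a TV set
    tunes to a transmitter's frequency."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    data, _ = sock.recvfrom(2048)
    sock.close()
    return data
```

A unicast stream, by contrast, would need one `sendto` per viewer; that per-viewer scaling is exactly what 5G Broadcast sidesteps.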
It is designed for reception on devices like mobile phones and tablets, but it does not take up any of the cellular bandwidth, nor does it need a SIM card. That is central to what makes 5G Broadcast important.
It requires adaptation to the broadcast frequency band in the receiving device, but that is easily accommodated. Technology company Qualcomm has developed prototype devices specifically for 5G Broadcast demonstrations. Although commercial devices are available, they need firmware configuration to receive 5G Broadcast signals.
The most obvious application is live streaming content to large audiences in very high quality, without utilising cellular bandwidth.
What if many people want to watch the same event, such as a major sports match, all at once? Providing individual streams for each viewer can greatly strain bandwidth. During these peak times, even those with a paid subscription might not be able to access the stream. This heavy usage can also interfere with other internet needs, including making emergency calls.
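The arithmetic behind that strain is easy to sketch. The audience and bitrate figures below are assumptions picked for illustration, not measurements:

```python
# Back-of-the-envelope comparison; both figures are assumptions.
viewers = 100_000        # concurrent viewers in one coverage area
stream_mbps = 8          # a plausible HD stream bitrate

unicast_demand = viewers * stream_mbps  # every viewer gets a private copy
multicast_demand = stream_mbps          # one shared copy, any audience size

print(f"Unicast aggregate: {unicast_demand:,} Mb/s")  # 800,000 Mb/s
print(f"Multicast total:   {multicast_demand} Mb/s")
```

Aggregate unicast demand grows linearly with the audience; a broadcast/multicast carrier stays flat no matter how many people tune in.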
Because of its strength in serving large numbers of simultaneous users, 5G Broadcast is ideal for pop-up services, for instance around music festivals or major sporting events. The stadium experience could be enhanced with replays and statistics; the gaps between bands at music festivals could be filled with interviews and backstage insights.
The 5G Broadcast transmission architecture could also be used for multicast of other data. It can feed information to Internet of Things devices, disseminate public safety information, or send data to automotive applications.
5G Broadcast can be delivered from a traditional high-power, high-tower network typical of broadcast television. It can be readily added to an existing transmission infrastructure, which remains independent of the cellular network.
The availability of modular digital transmitters means that localised pop-up services can be quickly set up.
The system has been proven in multiple demonstrations around the world. The 2022 Eurovision Song Contest saw a live, four-hour, mobile-focused 5G Broadcast show from the venue in Torino, Italy to four countries in Europe.
5G Broadcast is a distinct and powerful multicast delivery system. It is part of the 5G standard and ready for roll-out but is fundamentally different from unicast cellular communications.
5G: such a short abbreviation for networks that have started to transform content contribution and ground-to-cloud workflows. As an industry, we’re really still at the beginning when it comes to optimising 5G for primary contribution, major sporting events being the key case in point. Why is this and how is it now changing?
5G has tremendous potential to overcome the hurdle sports broadcasters/rights holders have faced when it comes to network congestion at major sports stadia/events and therefore to significantly increase their content contribution options.
There are four types of networks we’re talking about. The first are non-standalone 5G networks where a considerable amount of the back-end network technology remains 4G. Then there are standalone 5G networks where the infrastructure is fully 5G. Not only does this increase bandwidth, it also opens up possibilities for option three: network slicing.
Network slicing is the ability telcos have to create a dedicated bandwidth “slice”, guaranteeing a certain amount of up and downstream bandwidth for a given period of time – the length of a major sporting event, for example. We see telcos investing a lot of money in 5G, especially around major venues; it’s not just about what we want to do, it’s also about improving the basic service for consumers at events. Network slicing for major events is the next step and is definitely now on telcos’ radars following multiple trials in which we’ve been involved; they do see revenue in this and business cases.
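From the production side, a slice booking can be thought of as a small, well-defined request. The sketch below is hypothetical (operator-facing slicing APIs vary and are still maturing, so every field name here is invented for illustration), but it captures the parameters that matter: guaranteed uplink and downlink, and a fixed time window.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class SliceRequest:
    """Hypothetical shape of a network-slice booking; real operator
    APIs differ, so treat every field here as illustrative."""
    venue: str
    uplink_mbps: int    # guaranteed contribution (upstream) bandwidth
    downlink_mbps: int  # guaranteed downstream bandwidth
    starts: datetime
    duration: timedelta

# A slice covering a match plus build-up and wrap-up
match_slice = SliceRequest(
    venue="stadium-north-cells",
    uplink_mbps=200,    # room for several bonded camera feeds
    downlink_mbps=50,
    starts=datetime(2024, 7, 14, 18, 0),
    duration=timedelta(hours=4),
)
```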
Bandwidth allocation is clearly a commercial decision and we’re certainly seeing more discussions around this for broadcast media contribution, and we’ll see an increase in deployments over the next couple of years.
Then there’s private 5G, defined as a wireless network technology that delivers fifth-generation (5G) cellular connectivity for private network use cases. Private 5G works in the same way as a public 5G network but enables the owner to provide restricted access. Private businesses, third-party providers and municipalities use private 5G networks.
Alongside the multiple major public network and network slicing trials in which we’ve been involved, including at an EU projects level and the Coronation of King Charles III, LiveU has also taken part in private 5G tests. Last year, LiveU and Sky Deutschland further developed their relationship to create a multi-cam, private 5G and cloud video production proof-of-concept (PoC) at the Special Olympics World Games. Sky Deutschland worked with both the Local Organising Committee and a 5G service provider to create a private 5G network, with the table tennis tournament used for this successful PoC.
5G is also very important for cloud use. When you’re talking about full cloud production and getting signals into the cloud and the whole ethos of lightweight productions, there’s no longer a need to have a big truck on-site.
With a camera plus backpack – or, in the case of the multi-cam LU800, up to four cameras per unit – live feeds can be sent wirelessly to a remote production centre. This is not only for flexibility and to provide as dynamic a workflow as possible, but also for cost-effectiveness. Processing content and sending it back over 5G to consumers, even to those in the crowd watching live, then completes the loop. Using 5G removes considerable technical and logistical complexities.
There are environmental benefits as well, with 5G able to play a central role in increasing production sustainability. It’s not only the truck aspect but also all the associated reduction in on-site staff, combined with the increased ability to service multiple events in one day, resulting in a significant reduction in carbon footprint.
5G is changing the connectivity landscape and, combined with bonding capabilities, opens up new content creation possibilities. Network slicing and private 5G networks hold tremendous possibilities, as the trials in which we’ve been involved demonstrate. We have major customers who want to switch to full wireless cloud production for big events with 5G playing a central role.
Broadcasters and rights holders have already embraced this technology, whether they are using it over 4G or 5G. Although, of course, we don’t have control over the networks or the available bandwidth, LiveU is in discussion across multiple potential projects where telcos are looking at all the possibilities 5G affords, including major sporting events, and we believe that 2024 will be a milestone year for innovative 5G use cases.
Increasingly, broadcasters and production crews expect the infrastructure they purchase to work out of the box. Once they have decided on what is best-of-breed for a given task, the product doing the job is considered little more than a piece of a much larger puzzle. This has been facilitated by broadcast control systems (which, incidentally, are not only used in broadcast) that abstract the proprietary user interfaces of the various devices, replacing them with a customisable UI for all tasks in control rooms, studios and beyond. The growing number of wide-area networks and distributed production scenarios obviously requires a control system that is able to orchestrate all devices anywhere in the world.
Such a system furthermore needs to be able to adapt to radical workflow overhauls and new technological approaches, which brings us to a second consideration: although some vendors are rumoured to be working on it, there is as yet no one-size-fits-all blueprint that can just be implemented to make a new customer happy. Tweaks to a roughly standardised solution will remain necessary for some time to come. The question some developer teams are grappling with is to what extent the control system’s building blocks can be preconfigured to work equally well under vastly different circumstances.
Mindset shift
A control system as such is unable to provide a definitive answer to all the functional requirements content producers may have. An IP environment is built on hardware and software that needs to be visible to all devices on the network. Before the release of management software such as HOME, this used to involve assigning hundreds or even thousands of IP addresses manually, eating up lots of time. A clever discovery and registration routine has simplified this process: a device announces itself as it is connected to the network and, with the press of an ‘admit’ button, is assigned an IP address and becomes able to interact with other devices.
Additionally, the management system needs to allow operators to set parameters on stageboxes and other devices that may be thousands of miles away, and to route streams among them efficiently, for instance by allowing the user to select a bunch of inputs in one go and connect them to the equivalent number of outputs.
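As a sketch of how that workflow hangs together, the toy model below mimics the announce/admit/route cycle just described. It is not HOME’s actual API, and the class and method names are invented; it simply illustrates why one ‘admit’ press and one bulk-route call replace hours of manual bookkeeping.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    ip: str | None = None  # unset until an operator admits the device

class ManagementSystem:
    """Toy model of discovery, admission and bulk routing; names and
    behaviour are illustrative, not any vendor's real interface."""

    def __init__(self, subnet: str = "10.0.0."):
        self.subnet, self.next_host = subnet, 10
        self.pending: list[Device] = []
        self.routes: dict[str, str] = {}

    def announce(self, device: Device) -> None:
        # A device announces itself as it is connected to the network.
        self.pending.append(device)

    def admit(self, device: Device) -> None:
        # One 'admit' press assigns the address that used to be
        # configured by hand.
        device.ip = f"{self.subnet}{self.next_host}"
        self.next_host += 1
        self.pending.remove(device)

    def route_many(self, inputs: list[str], outputs: list[str]) -> None:
        # Select a bunch of inputs in one go and connect them to the
        # equivalent number of outputs.
        if len(inputs) != len(outputs):
            raise ValueError("input and output counts must match")
        self.routes.update(zip(inputs, outputs))

ms = ManagementSystem()
stagebox = Device("stagebox-berlin-01")
ms.announce(stagebox)
ms.admit(stagebox)
ms.route_many([f"cam{i}" for i in range(1, 5)],
              [f"mon{i}" for i in range(1, 5)])
```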
Of course, the ability to speak to a vendor’s R&D team directly is even better for customers, although it requires a lot of manpower on the
vendor’s part and can be tricky to provide across the board. Luckily, a lot of requests in this area not only elevate a particular solution but usually find their way into the products by means of subsequent software updates. In this way, a vendor with an open mindset automatically works in close collaboration with all its customers, and the community at large stands to benefit from the tweaks performed following a single issue.
While allowing customers to use the infrastructure almost out of the box is an important service that sets a vendor apart, an even bolder idea is to provide them with a platform that can be used in the most agile way imaginable. It started with the intention of abandoning bespoke hardware, replacing it with COTS servers able to run any app-based processing functionality, whether video or audio, at the required time and with the matching specs. Apps that are not needed can be stopped to free up the CPU or GPU processing power for other tasks. This is embraced by a rapidly growing number of operators.
An even cleverer concept hinges on finding a way of combining hardware devices with apps in a way that allows operators to solicit the required processing functionality either from an app or from its hardware equivalent, depending on the remaining processing capacity. With no noticeable difference between the two from the user’s point of view, such an overarching platform causes the hardware and software worlds to converge, providing a comprehensive platform for anything operators need to create compelling content and respond quickly to unexpected production jobs. Again, the physical location of the proprietary hardware or standard server is of no consequence.
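A minimal sketch of that convergence, assuming a deliberately simple capacity model (everything below is invented for illustration), might pick whichever resource, software app or hardware equivalent, currently has headroom, leaving the operator none the wiser:

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    kind: str        # "app" (COTS server) or "hardware"
    free_units: int  # remaining processing capacity, abstract units

def place(function: str, cost: int, pool: list[Resource]) -> Resource:
    """Assign a processing function to the resource with the most
    headroom, regardless of whether it is software or hardware."""
    for res in sorted(pool, key=lambda r: r.free_units, reverse=True):
        if res.free_units >= cost:
            res.free_units -= cost
            return res
    raise RuntimeError(f"no capacity left for {function}")

pool = [
    Resource("COTS server, GPU apps", "app", free_units=3),
    Resource("processing-frame slot", "hardware", free_units=1),
]
chosen = place("UHD up/down-converter", cost=2, pool=pool)
print(f"placed on {chosen.kind}: {chosen.name}")
```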
Ideally, the platform also allows users to unlock add-on processing functionality on hardware devices, under a unified monetisation system that leaves ample room for creative workflow tweaks without breaking the bank.
The service customers are waiting for hinges on the availability of experts for consultation and handling everything that cannot yet be provided by the platform. In an ideal situation, customers explain what they need and the vendor provides them with a solution that ticks all boxes. Present-day and future infrastructure should be similar to cars: everybody can drive and enjoy them, without needing to know how to tune or fix them. In our example, however, the car’s design and specs could change several times a day to be just perfect for every occasion.
“The truth is out there” is the famous tagline from The X-Files, a series renowned for investigating mysteries and conspiracy theories. Today, with the vast number of information sources available to us, finding the truth can feel like there are so many “truths” that you can pick the one you want!
Of course, social media is a dominant source of news information, and it is notorious for being an echo chamber. This is the phenomenon whereby people follow social media feeds that align with their own beliefs and values, so their sources of information simply reinforce their view of the world and even their prejudices. Posts and opinions are rarely challenged with balanced debate; more often, challenges take the form of aggressive personal attacks from trolls. Has the onus ever fallen so heavily on the individual to do their own investigation to find a balanced view of the truth?
This wasn’t always the case. Go far enough back in time and people had a single point of truth, perhaps the town crier. Even relatively recently, news was only available from a limited number of sources that were considered trusted, such as national broadcasters and newspapers.
The trust came from the understanding that there was a level of regulation, and while this varied between countries and platforms, in most cases people trusted what they read. There’s no doubt the news could be partisan, for example on the print side, with tabloids famously right or left wing, but even then businesses operated within legal boundaries.
Is social media to blame for the changes we see in the platforms we historically trusted? For some time television has been responding as the internet changed consumer behaviours. Twenty years ago, breaking news would be aired on the broadcast channel first, and the technical challenge was to have fast-turnaround workflows that could get video news content onto the broadcaster’s website as soon as possible. Famously, in 2006, Twitter changed that, with breaking news reaching consumers faster on social media than by television. Now, broadcasters are just as likely to break news online first or at the same time as broadcast, with consumers often following simultaneously on multiple devices from the same news outlet.
Speed and availability of content are positive changes, but the issue of trust has changed in parallel and, in fact, deteriorated. In the UK it is not uncommon to see people accuse the BBC of political bias. I for one cannot remember this even a few years ago. The BBC brand had an international reputation for the highest journalistic standards. I believe a big part of the trust issue comes from the rise of populist politics, public figures telling the people what they want to hear whether they themselves believe it or can deliver it. There seems to be a symbiotic relationship between social media and this form of politics, and there are high-profile examples of political figures leveraging social media platforms to get messages to their followers that wouldn’t have been possible traditionally. Also, there are very credible claims of foreign governments using social media platforms to influence election results in other countries.
All of this generates more and more highly polarised news content, conflicting truths that erode public confidence in even news sources that historically were above question. There are quite infamous examples of news and magazine shows that air unashamedly political partisan content.
What is the solution? I don’t think we should hold our breath for better self-regulation from the social media giants; they seem unable or unwilling to effectively address even the most “no-brainer” standards such as child safety. It seems safe to assume that effectively regulating political content is a long way off.
As I’ve written this, I am conscious that I have the benefit of personal experience over many years to not always take things at face value, to understand that there are varied points of view to every issue, and that the best way to get close to the truth is to listen to many perspectives and form a balanced view. However, it must be hard for the generation that has grown up in this social media world of varied extreme views and influencer opinions versus facts. There are multiple surveys that show Gen Z get most of their news from TikTok, and even if the recent US legislation causes the demise of TikTok this generation of consumers will just switch to the next unregulated platform. One must hope that society is able to find the right balance between freedom of speech and regulation that protects it. The truth is still out there, but for now, it remains more challenging than ever to find it!
WHAT WILL THE BROADCAST TECH ECOSYSTEM LOOK LIKE IN THE NEXT 5-10 YEARS?
TVBEurope editor Jenny Priestley meets Grass Valley president and COO Jon Wilson to discuss everything from the power of the cloud, to AI adoption and this year’s summer of sport.
DON’T WORRY ABOUT A THING — BUILDING THE EDITING WORKFLOW FOR BOB MARLEY: ONE LOVE
TVBEurope’s website is your online resource for exclusive news, features and information about our industry. Here are some featured articles from the last month…
A LOOK BEHIND THE SCENES OF THE JURY: MURDER TRIAL
The production team behind Channel 4’s “social experiment” explain how they enabled viewers to look over the shoulders of two separate juries in a real murder trial.
Róisín McKeniry, head of technology, Gravity Media, explains why the media industry needs to change its mindset in the way it works, and utilise technology to improve work-life balance if it hopes to encourage a truly diverse workforce.
The producers of thriller No Way Up tell Robert Shepherd how they created visually stunning scenes set in the Pacific Ocean, from Basildon and Maidenhead.
Top of the Pops was BBC TV's weekly look at the UK singles chart, reflecting the musical and fashion styles of the time. To mark its 60th anniversary, Kevin Hilton looks at the show's long history and technological innovations, with reminiscences from those who worked on it
The BBC's answer to commercial rival ITV's Ready, Steady, Go! (1963-66), Top of the Pops kicked off the weekend and featured artists miming to their records surrounded by young people dancing in the studio. While also featuring acts performing to a youthful crowd, TOTP (as it is usually abbreviated) was based on the Top 20 (later extended to 30 and then 40) singles, featuring current hits and a run-down of the chart building up to the Number One.
The format of the show was laid down by Johnnie Stewart, the first producer, and Stanley Dorfman, who succeeded him. From those early days, TOTP's presentational style went through many changes, with technology playing a major role. The show also moved around different studios; while it is now mostly associated with BBC Television Centre (TVC) in west London, TOTP was also produced at nearby Lime Grove Studios, Riverside Studios and Elstree. But its beginnings were in a converted Methodist church in Manchester.
The first edition of TOTP was broadcast at 6.35pm on New Year’s Day 1964 from Studio A at Dickenson Road Studios. In 1966, at the insistence of the Musicians’ Union, the show moved away from artists miming to their records to either performing live or playing along to a pre-recorded ‘new’ version of songs. As part of this, an orchestra featuring leading session players of the time, under the musical direction of Johnny Pearson, was brought in to provide backing to those artists.
The production team worked in London for most of the week and would travel to Manchester for rehearsals and the recording. Working alongside Johnnie Stewart and Stanley Dorfman were producer's assistants, among them Kate Greer.
"I joined the BBC as a trainee production secretary in 1965. Mid-way
through the year I was assigned to Johnnie Stewart and, to begin with, we would take the overnight sleeper to Manchester but sometimes we would fly up,” she explains. “On one occasion London was fog-bound, so I was sitting at the airport with Dusty Springfield and some of the other artists waiting for the fog to clear. The show itself was prerecorded as live, so we were flying by the seat of our pants.”
Also in 1966, recording sessions transferred from Dickenson Road to TC2 at TVC in London but later switched to Studio G in Lime Grove to accommodate live performances and the orchestra. Among the technical innovations of this time was a special accessory for cameras. This was designed by John Henshall, who recalls this development on the Tech-Ops History website (http://tech-ops.co.uk/next/top-of-the-pops).
"I had designed a fisheye attachment, which had been rejected as a technical suggestion but Johnnie Stewart loved it. Its bayonet fitted in place of the lens hood on 2-inch lenses but only worked on the EMI 203 cameras in Studio G due to their extended focus rack and not the Marconi Mk IVs in TC2 (until I later fitted an additional element)."
"For me the development of lightweight cameras was significant. We developed new techniques back then, some of which are still seen today”
TIM HORE
Another member of the camera and technical team working on the show after its move from Manchester was Alec Bray. "I worked on TOTP, firstly in TC2 and then Studio G, where we had a Mole [Richardson] crane. During a Jimi Hendrix session, I was on a camera with a 12-inch lens on it right in front of the drum kit. I had to crane up on the pedestal to take the shot, then quickly crane down to be out of shot of the other cameras – and I am sure I must have hit the cymbals on the way up and down."
By the time the ‘60s gave way to the ‘70s, TOTP was a well-established fixture in the BBC's schedules, with its regular slot early on Thursday evenings. The show had also moved back to TVC with the switch to colour transmission.
While the focus for viewers was how the show looked, sound was of course extremely important. Typically the sound crew comprised a sound supervisor at the mixing console in the control room, assisted by 'grams' operators playing in different tracks, while on the studio floor
assistants would rig the equipment. Among those who worked his way up from the floor to sound supervisor was Keith Mayes.
"I first worked on TOTP in 1978 at TVC as a floor assistant, rigging tracking loudspeakers and fake mics, because most of it was mimed then,” he says. “A few years later I became a tape and gram op. The songs for each show would be recorded on to quarter-inch tape and put on a tape machine — usually a Studer A80 — with three others; one for applause into the song, one with applause out and another with the DJ backing music.
“I became sound supervisor in the 1980s and they had a lot more live vocal then. I also did the first completely live band on TOTP when New Order insisted on playing their 1983 hit Blue Monday live. But as most of it was electronic it wasn't too bad."
Music videos came into their own during the 1980s, which changed the look of TOTP, although the policy was still to get bands in the studio. The show also faced more competition; Channel 4's The Tube (1982-87) in particular had a more modern, anarchic style and featured fully live acts.
To reflect the visual influence of music videos, TOTP exploited the latest vision effects and camera hardware to update its presentation style. Among the people involved with the update was Steve Burgess, who joined the BBC as a graduate engineer in 1979.
"In the studios the EMI 2001 early-colour and newer Link 110 cameras were joined by truly hand-held Ikegami HL79Ds. On the effects side, TOTP often used digital framestore synchronisers for freezes, quarter-framing and moves,” explains Burgess.
“There was also the Quantel DPE5001, a digital production effects unit that used up to five framestores and a combiner unit. TIPSE (Technical Investigations Picture Shuffling Engine) was a more basic video effects device that was used frequently on TOTP in the early ‘80s, featuring a single framestore with a four-way input switch so four cameras could appear in quarter frame."
Among the regular camera operators during the 1980s was Tim Hore. "For me the development of lightweight cameras was significant.
We developed new techniques back then, some of which are still seen today. The production teams were great, with the vision mixer being a very important person to us. Their skills on the buttons greatly assisted the overall programme.”
The Nineties/Noughties
The changing TV and musical landscapes of the 1990s made this a difficult decade for TOTP. The singles chart was also less important than in the past. Nigel Saunders, who started work as a trainee camera assistant at TVC in 1979, observes that during the last days of the regular weekly episodes of TOTP, people were discovering new artists through YouTube and other social media outlets.
"TOTP was the first show I worked on when I arrived at TVC. I remember Billy Idol [with Generation X] being on the show and I was swinging one of the three camera cranes.
“By 1993 I was camera supervisor and remained on the show until the start of the Andi Peters era (executive producer 2003-2005). In 1992 Technocranes arrived in TV. Unlike the ride-on cranes TOTP had
used up to then, the camera operator sat at a desk with a joystick and a technician controlled the telescopic extension of the 21-foot arm and two people 'swung' the counterbalanced arm with the lightweight camera on the end. Technocranes are still very much in use on big shows today and TOTP — and I — pioneered that,” states Saunders.
As the ‘90s wore on, the BBC decided TOTP needed a revamp and brought in Chris Cowey, whose credits included The Tube and the short-lived but influential The White Room (1995-96) as executive producer (1997-2003).
“The important thing for me is to make something feel live and alive. Quite often I would leave in slightly wobbly camera shots because it had an energy and liveness about it that had been absent [from the show]”
CHRIS COWEY
"I did realise that at the time I took it over TOTP had its issues. I was aware albums were outselling singles by about six to one. Quite often my challenge was to make a good, engaging TOTP despite the chart rather than because of it,” he says.
“What I wanted to do was get a bit more unpredictability, with more people performing live or at least singing live. I also got rid of videos, unless the artist was in America or dead. I wanted people to come into the studio; the trick is to make bands on TV look good and sound good at an artistic level."
Cowey continues: “I used six cameras in the studio along with the Technocrane, which can afford a wide range of shots. But the important thing for me is to make something feel live and alive. Quite often I would leave in slightly wobbly camera shots because it had an energy and liveness about it that had been absent from TOTP."
Cowey, now head of TV entertainment at 1185 Films, is currently looking at ways to present a music show along the lines of TOTP for the much more diverse and fragmented music and media world of the modern era.
While there is no longer a weekly show, the TOTP name and brand still exists through annual specials, documentaries bringing together classic performances and interviews with artists and members of the production team, plus repeats of selected old shows.
Tudor Davies worked on TOTP from the early 1990s on and is still its sound supervisor. "I started at the BBC in 1991 and during my first week worked on TOTP,” he says.
"Being a sound assistant on the studio floor was the best job ever because you were interacting with the artists. We would have one person on the monitor desk and someone else on the stage with a microphone connected directly to that desk. You would go up on stage, introduce yourself to whoever it was and then do the sound check, with the track playing from either quarter-inch tape or MiniDisc. Between you and the artist, you would set the vocal level and things like reverb.
“When we moved into the Chris Cowey era there was more of a shift towards live vocals and sometimes live bands. But from 1991 to the present day there have always been live vocals. And people don't realise how complex that is."
As Cowey observes, TOTP is a programme of record, chronicling not just the music but also the styles and attitudes of the past 60 years. While no longer the weekly fixture it was in its heyday it is still a fondly remembered and — mostly — respected televisual institution.
The technology provider for the live media and entertainment industry is marking its 65th anniversary with a cluster of breakthrough camera and switcher enhancements
In an industry that moves ever forward, there is rarely much opportunity to look back. But as Grass Valley marks the 65th anniversary of its founding in April 1959, ahead of what promises to be another highly dynamic instalment of the NAB Show, it’s hard to resist asking some of today’s GV executives what they regard as the key milestones of the past six-and-a-half decades.
On the camera side of the business, director of product marketing Klaus Weber highlights several innovations including the first market launch of CMOS sensors with global shutter. “Only this technology made it possible to realise HD cameras and later also UHD cameras with the necessary light sensitivity, and only these sensors made it possible to realise both super slow-motion cameras and HDR operation with the best quality,” he says.
Examining GV’s history of video production switcher development, VP for switchers Greg Huttie pinpoints “from the tech point of view an additive trio that was introduced directly into switcher processing frames: 2110 IP I/O capability, full raster UHD 4K 2160p at a large scale, and HDR processing. Through this integration, along with user interfaces built from the technical director’s point of view, multi-format productions could be realised with greater flexibility and cost-effectiveness, and without creative compromise.”
But whilst it’s nice to look back, the slew of announcements that Grass Valley has planned for NAB Show 2024 confirms that the company’s commitment to media technology innovation is very much ongoing.
Headlining GV’s broadcast camera presence at NAB will be several significant enhancements to the LDX 100 Series platform that
underscore the company’s commitment to addressing the complex demands of live production environments.
The latest updates include:
• Advanced HDR/SDR Operations: Integrated LUT processing in all LDX 100 series cameras ensures optimal HDR/SDR signal quality directly from the camera head and base station, reducing the need for external converters. Notes Weber: “In the area of HDR/SDR productions, there are more and more applications that require an efficient and cost-effective workflow, and the LUT processing integrated into the cameras meets exactly these requirements.”
• Improved Design for Enhanced Usability: Modifications to the shoulder pad, handgrip and housing materials have made the camera lighter and more ergonomic, while airflow optimisations have resolved previous heat generation issues.
• Innovative High-Speed Shooting: New capabilities allow for 6x 1080p shooting with native UHD resolution of the live signal output, supported in both NativeIP and XCU base station operations. Weber says: “In the area of native UHD productions, there is an increasing desire to integrate super slow-motion cameras with the best possible image quality into the workflow. The native UHD signal output for the live signal in 6x 3G super slow-motion mode addresses this requirement.”
• Wireless Camera Version: A new camera model optimised for wireless operation paves the way for seamless integration with wireless transmission technologies and cloud-based production solutions. Weber explains: “The launch of a wireless camera in the LDX 100 series fulfils users' desire for a flexible camera that can be used for a wide range of applications, including wireless cameras in sports or entertainment productions, as well as remote production applications with wireless integration into IP infrastructures.”
The K-Frame XP video production switcher series is already the first choice for high-profile live productions thanks to its exceptional capabilities and versatility. But at the NAB Show, Grass Valley will introduce multiple new enhancements to the series, including:
• Advanced IP I/O boards: New generation IP I/O boards with JPEG XS encoding/decoding capability without loss of I/O count, 100GbE QSFP and 25GbE SFP ports, plus extended buffers in 2160p.
• Colour Mapping HDR<>SDR / 3D LUT Translation: Internal mapping and translation within the video processing frame to support the most demanding event coverage, assisting productions in handling the ever-increasing variety of programme outputs required.
• ClipStore II: A new generation of ClipStore offering up to four channels of playback and record in all formats, including UHD 4K 2160p, with both SDI and IP connections.
Reflecting on the new additions, Huttie remarks: “The expanded capabilities of the IP connection and the additional possibility of supporting compressed signals in the video production switchers meet the requirements of users in the most demanding applications. The same applies to the integration of LUT processing, which makes HDR/SDR productions with mixed signal sources a simple affair.”
These enhancements are also likely to be pressed into service during this year’s very busy schedule of sporting events. “For major sports productions scheduled for summer 2024, these extended functionalities will be used to make the workflows as efficient as possible,” he adds.
More generally, Grass Valley is continuing to witness increased adoption of IP infrastructures, SaaS applications and cloud-based media environments, as supported by innovations including the AMPP (Agile Media Processing Platform) and GV Media Universe, which is Grass Valley’s groundbreaking vision for transitioning to the software- and cloud-based future of content creation and distribution.
“We are seeing increasing acceptance and use of IP infrastructures and SaaS applications in all areas of media and entertainment applications, with almost all new investments using these technologies in some form,” confirms Huttie. “There are also initial examples that could either be achieved much more efficiently and cost-effectively through the use of IP infrastructures and SaaS applications or were only possible thanks to these solutions. Through our AMPP platform, we offer the best possible integration of SaaS applications in all areas of the production environment. One to take a good look at during NAB is the new software-based Maverik X production switcher.”
Finally, the NAB 2024 showcase will certainly not represent the full extent of GV’s innovations for this year, with Huttie revealing that the company is at present working on “many other exciting new developments and advancements. These include both hardware and software solutions and will be announced later this year as they become available.”
NAB Show 2024 attendees can visit the Grass Valley stand (C2308) to experience these innovations first-hand and explore its complete portfolio of cameras, production switchers, multiviewers, routers, processing, asset management and live playout systems. www.grassvalley.com
Streaming is changing not only the way consumers watch episodic television but how they listen to it too. So when a cinematic sound design team is tasked with creating a full Dolby Atmos mix for Apple TV’s Masters of the Air, they tell Kevin Emmott the sky is the limit
The sky is so hugely and unutterably vast that before radar was developed, Big Sky Theory was what many pilots relied on to avoid mid-air collisions.
Streaming on Apple TV, Masters of the Air is not about Big Sky Theory. If anything, it’s the opposite.
Telling the story of the US B-17 bomber crews navigating the skies during World War II, the show is less about open spaces than about the enclosed conditions of those planes and the bravery and fortitude of the people who crewed them.
Produced in cooperation with Amblin Entertainment and Playtone, and executive produced by Gary Goetzman, Tom Hanks and Steven
Spielberg, Masters of the Air is the first television series produced by Apple Studios and follows the multi-award-winning Band of Brothers and The Pacific as the third part in an epic World War II trilogy.
It is event TV on a cinematic scale, and from a sound perspective it has some big combat boots to fill. But for a show which has such an expansive soundstage, the sound design is all about delivering real intimacy; when you are with the aircrew in the plane, you really are in there with them, and the plane is a cramped, noisy and monstrous presence.
Streaming from the cloud(s)
It is something which wouldn’t have been possible – or necessary – on
the small screen a decade ago, but a combination of consumer technology, accessibility and viewer expectation required a shift in approach.
“I think the modern filmmaking approach for television has elevated over the past decade to levels that we had never seen before, and largely due to streaming and support from studios it has given us these opportunities,” says Australian-born re-recording mixer Duncan McRae, who, along with fellow re-recording mixer Michael Minkler and supervising sound editor Jack Whittaker, was part of the sound team on the production. In fact, Whittaker says none of the team approached this like a television series. “We are not traditionally television people, and I think that's something that the studio was looking for,” he adds.
“This is one of the largest scale things that's out there in this format and it is not something that was achievable 10 years ago. The technology that Apple is bringing to the game, the adoption of Dolby Vision, of Dolby Atmos, and the ability to deliver over streaming; these are new technologies that we really tried to take advantage of.”
“We recorded in one of the few remaining B-17 bombers and we mic’d up everything we could; 60 mics on the inside and six sound recordists on the outside to capture every angle”
JACK WHITTAKER
The same but different
In his sixth decade as a Hollywood sound designer and with three Best Sound Oscars and three BAFTAs under his belt, Minkler has been part of this shift: “I don't think any of the filmmakers ever considered this to be television, and that's not because television is a dirty word or even a small word. But to them, this has been a gigantic feature from its inception.
“For the last 20 years, all of the films that I've mixed had a home theatre version which decreased the size of the mix for the dynamic range. We did not take that approach for this because we are at a point now where we can be immersive in the home theatre system. This is a motion picture that happens to be delivered in a streaming process on Apple TV.”
Similarly, the creation of source material was just as epic. Having previously worked on The Pacific, Minkler says the team went into Masters of the Air with the same high expectations that the sound effects would drive the story just as much as the visual effects.
How best to achieve that? Well, sometimes the best way to recreate the brutal and explosive power of a four-engine heavy bomber aircraft is just to get up in one.
"A key aspect of the sound design is that the planes are a key character and we needed it to be as visceral as possible with engines as big, and as giant, and as beastly as they could be. Reproducing that on a track is no small task,” says Whittaker.
“We recorded in one of the few remaining B-17 bombers and we mic’d up everything we could; 60 mics on the inside and six sound recordists on the outside to capture every angle,” he continues. “To achieve the visceral nature of the battles we designed some amazing gut-punching guns and placed them in various positions on the plane, as well as screaming German planes going by at 500 miles an hour.
“Those were our starting points. We needed the viewer to be there as one of the crew members, and what the viewer hears is hyper-real, but it's also totally authentic.”
Being able to locate sounds in Atmos not only gives streaming
audiences more detail, but the Atmos format also helps with separation, and translating everything simultaneously meant the team could use it to create a clearer narrative.
“In the battle sequences the audience is in the B-17 with the crew, and we used Atmos to track where they are in the plane as well as where the threats are coming from,” says McRae. “Atmos allowed us to track all that while maintaining music and dialogue levels. Atmos is definitely the best way to watch Masters of the Air.”
All that said, working in Atmos for home viewers also gives the sound team more headaches. Streaming in a fully immersive format means that the programme maker has no idea what viewers will be listening on, or in which format. In a cinematic presentation, the mixer understands what the delivery format is going to be. On a streaming service it could be a full Dolby Atmos system, but it could also be decoded on a soundbar, on a TV’s built-in stereo speakers, or even as a full binaural mix over Apple AirPods. It’s never going to be the same for everyone.
And that means a lot of late nights. And a lot of listening. “We started off mixing in full theatrical Atmos and then we drilled down to ensure we could still convey the full intent in smaller listening environments,” says Minkler. “We used multiple systems from Sonos, Apple, headphones, soundbars, 5.1 home theatre packages, built-in television speakers; it's vital to keep listening because the mix has to be representative in every one of those listening formats.”
McRae adds: “It may not be a shift so much in approach, but knowing that there is a range of ways that people are going to be listening is so important to hear how our narrative would play emotionally in different formats.”
This all means that however you watch, on whatever equipment, the result is the same. It’s massive, and claustrophobic, all at the same time.
“Our approach was to make it as big as possible and then work out how to get this thing to play as well as it could do,” says Whittaker.
Or, as Minkler puts it: “I read a review which said that ‘regarding the sound, you are fully immersed. It's like a retro era, Top Gun Maverick of sorts, but a lot more harrowing.’ I think that kind of sums it up.”
“We needed the viewer to be there as one of the crew members, and what the viewer hears is hyper-real, but it's also totally authentic”
JACK WHITTAKER
With anticipation building for NAB Show 2024, TVBEurope speaks to Chris Brown, outgoing managing director and EVP, global connections and events at NAB, about what visitors can look forward to seeing and hearing about in Las Vegas
What should attendees expect from NAB Show 2024?
In its 101st year, NAB Show is looking to future innovation as we set the tone for our industry’s next 100 years. It’s an exciting time! We’re seeing truly extraordinary technological advances in how content is produced, distributed and consumed. In a business where the only constant is change, we continue to adapt and grow stronger, and have an optimistic outlook given improving macroeconomic conditions.
One of the amazing characteristics of NAB Show is the diversity of the audience it brings together, and we are always looking to welcome new and emerging sides of the media and entertainment universe. With that in mind, we have built the 2024 NAB Show to attract new audiences, as well as provide inspiration and upskilling for our traditional audience segments. We’ve created new and expanded exhibit floor initiatives, including Creator Lab, a homebase for the Creator Economy, and PropelME, a new startup showcase and community for innovative tech entrepreneurs.
The popular CineCentral – a destination that spotlights Hollywood’s cinematic trends and techniques – is expanding. It will include an outdoor “Open Air Lab” for drone training and demonstrations by Lightcraft, as well as motion and movement workshops with cranes and dollies from Chapman Leonard.
We’ll also introduce the AWS Partner Pavilion, which will offer a unified look at the possibilities available through the AWS ecosystem.
On the education side, we’ve recast our foundational programme – what was called the NAB Show Conference is now the Core Education Collection. It covers the content universe from A to Z, and includes three series, or tracks, that map to our content pillars: Create, Connect and Capitalise.
A key part of that Core Collection will be our highly respected Broadcast Engineering and IT Conference, which will cover more ground than ever, with nearly double the sessions of last year’s edition.
Another new aspect of the 2024 NAB Show will be a different campus layout, as renovations continue at the Las Vegas Convention Centre. The South Hall is back open and the North Hall is now closed. This will require a bit of re-orientation for our regular attendees, but our team has worked hard to minimise the impact and make it as easy as possible to navigate.
The last point I would make here is that we have continued to look for ways to help our audience glean as much as possible from their show experience. You know that we introduced a new organisation to the various aspects of the show in 2022, tied to the content pillars already mentioned: Create, Connect and Capitalise. Within that construct, we also introduced new hubs, new experiential zones, set up to help attendees get a firm orientation, within each pillar, to the products, trends and connections that are most relevant to them. This year we have consolidated into three larger zones, now called Community Zones, each providing interactive discovery, learning and networking. We have added some new elements to facilitate better interaction and networking, and have tweaked the on-floor learning activities to make them more impactful. There will be some great education offered in these zones, and they are open to all at no charge. I would encourage everyone to carve out some time to stop in these zones. There is one in each hall being used this year: the Connect and Capitalise Zones are in the West Hall, and the Create Zone is in the South Upper Hall.
What are you most looking forward to seeing at this year’s show?
To be honest with you, I always get the biggest adrenaline rush at the show when the doors are about to open on day one. Hearing the buzz and seeing the crowds accumulating is the payoff to 14-16 months of hard work for our entire team. It only gets better when you have the opportunity to follow that crowd onto the show floor and watch them begin to filter into the exhibits and interact with our exhibitors. This reflects the heart of what the show is all about.
Beyond that, I am excited to see the South Building open again. When we occupied that building a few years back it always had a
unique energy and a great lineup of tech and exhibitors. I would expect to see that same sort of energy this year, as well.
I am also excited to see our Main Stage lineup over the course of the show. This year we have added an emcee and we’re very excited to have TV personality and video blogger Shira Lazar playing that role. She’ll add great perspective and will help tie together the major themes across all of the Main Stage programmes.
What piece of technology innovation or new trend has stood out for you over the past year?
There are a few trends and innovations that stand out, and you
can be assured we’ll have programming and discussion focused on them at the 2024 NAB Show. Three of particular note:
• Virtualisation and the shift to cloud-based and remote production continue to have significant impact. This has all happened within an overall shift from hardwarebased to software-based infrastructure. These trends rose to the top during the pandemic and have only accelerated. Attendees will see a unique demo of a cloud-based workflow in the Connect Zone in the West Hall, produced in partnership with Diversified. We’ll also feature an AWS News Studio in the West Hall Lobby using a cloud-based workflow. There’s also a track dedicated to remote production that’s built into the Post|Production World Conference.
No matter the attendee’s focus, organisers are confident they’ll find something that will spark a new idea or answer a critical question
• Generative AI is a game-changer, but questions still remain – from application to overall impact. NAB Show will thread a focus on AI and its impact through both the on-floor education and in virtually every conference programme and on the Main Stage. We’re offering dozens of sessions to cover all the angles, from unlocking generative AI in media creation to content delivery, captioning and accessibility to creating efficiencies and increasing revenue, and an in-depth look at current and future trends.
• XR and spatial computing are poised to revolutionise content creation and consumption. Fuelled by the recent introduction of Apple’s Vision Pro device, it’s a new frontier that is of great interest to our attendees who should look for opportunities to dive into this in our Core Collection Create Series, as well as in other programmes throughout the show.
What do you hope attendees and exhibitors will take away from the 2024 NAB Show?
We hope they leave with energy and inspiration for their work in all things media and entertainment! From the innovations they discover on the exhibit floor to the education in the sessions and workshops to making valuable connections - no matter the attendee’s focus, we are confident they’ll find something that will spark a new idea or answer a critical question that will lead to practical value for their work. On the exhibitor side, the answer is simple: we hope they walk away with business; a lot of business. Year-in, year-out NAB Show has always been a catalyst for generating business, both on the spot and in the months following the show. On average that has meant about $2 billion of transactions. The show really represents a tremendous economic engine for the industry.
Looking at the European media technology landscape from the United States, is there anything you feel Europe is ahead of the US on in terms of development and adoption?
As the world continues to get smaller, the US and European media industries can continue to learn from each other. For instance,
Europe has been an early leader in interactive television experiences via the HbbTV initiative which has been widely implemented and adopted by European consumers. As US broadcasters implement ATSC 3.0, which offers advanced interactive capabilities, studying the success of the HbbTV use cases helps jumpstart interactive services. At the same time, the advantages of the IP-centric nature of ATSC 3.0 has led European standards organisations to adapt their standards to be IP-interoperable.
Looking further at the European media technology landscape from the US perspective, several areas stand out where Europe seems to be ahead in terms of development and adoption.
Regulations: Europe leads in areas like data privacy with GDPR and net neutrality. For instance, the EU's antitrust actions against Google and Apple contrast with the current US approach. Additionally, the "fair share" debate on telecoms, where telcos seek compensation from tech giants for delivering their content, is gaining traction in Europe and might influence global discussions.
Content: While the US boasts a robust sports-rights market and powerful tech players, Europe has a different approach. Content regulations like origination rules and quotas for homegrown content (e.g., France) aim to protect cultural diversity, though their effectiveness is debated. Moreover, Europe has a strong network of public broadcasters like BBC and RTL, funded through government and licence fees, which offer distinct content and play a crucial role in the media ecosystem.
Is there a way for people who can’t make it to Vegas to get involved with the show?
We do understand some people are unable to attend in person and have created some ways to participate remotely. Watch NAB Show LIVE on our website. Viewers can explore the trends and tech on the show floor through entertaining hosted presentations. We’ll post content on our YouTube channel. Catch thought leadership sessions and interviews throughout the show.
With a slew of major technology trials and a growing range of supporting solutions to its credit, Sony is in the vanguard of harnessing 5G for live media production
The excitement around 5G has been growing for several years now. As soon as technology trials were announced in the late 2010s, it became clear that the low-latency, high-bandwidth capabilities of 5G had the potential to be transformational for live news and sports production, among other high-stakes media applications.
All of that is in no small part thanks to the efforts of Sony, which has been at the forefront of companies pursuing the use of 5G for broadcasting applications since real-world use cases started to be discussed, including the pioneering work of Sony subsidiary Nevion with the EU-funded VIRTUOSA project. From a series of trials with organisations including the BBC and Deutsche Telekom, to a growing product range that has just been joined by the 5G-capable PDT-FP1 portable data transmitter, Sony has created a roadmap that will allow broadcasters and service providers to make the most of 5G networks.
Fresh from the latest edition of Mobile World Congress (MWC) that saw Sony showcase multiple 5G innovations, including the new portable data transmitter, Sony Europe strategic technology development manager Peter Sykes is in no doubt about the burgeoning interest in 5G throughout the media community. “I would say there is a consensus view that the prospect of using 5G in the live environment is very exciting,” he says. “So we have been really focusing on developing solutions that will allow media organisations to use the network effectively in order to achieve the ultra-low-latency and robust quality that is required.”
Arguably the most high-profile event at which Sony participated in a 5G trial was the Coronation of King Charles III in London, in May 2023. Highlighting the company’s advances in optimising network access for live broadcasting, the trial involved the use of Nevion’s VideoIPath media orchestration platform to achieve dynamic prioritisation of live media signals over 5G connectivity. The test was carried out by the R&D arm of UK public broadcaster the BBC and private 5G standalone (SA) networks developer Neutral Wireless, with the participation of Sony and Nevion.
The benefits of 5G are compelling, but intense usage during major events such as the Coronation can create challenges to effective broadcaster use of public networks. “People inevitably want to get smartphone connectivity to send photographs or videos to friends and family, and maybe download some content as well, which means there’s a lot of pressure on the network [in terms of broadcast access] – that’s why we were especially interested in testing dynamic prioritisation of camera feeds in such a demanding environment. It was a very successful trial,” says Sykes.
A few months later, a proof of concept was completed with TV2 Denmark in a live TV studio environment. It was an excellent illustration of the potential of using 5G workflows in studios as opposed to outside broadcast. The benefit here was that it would mean cameras could be moved between studios and even locations, without having to worry about the cabling.
Even more recently, in February 2024, Sony and telecommunications partner Deutsche Telekom announced the successful testing of Nevion’s VideoIPath platform with 5G network Application
At Mobile World Congress, Sony and Deutsche Telekom announced the successful testing of Nevion’s VideoIPath
Programming Interfaces (APIs) that follow the standards established by CAMARA, the open-source network API project within the Linux Foundation, of which Deutsche Telekom is an active member.
The tests were carried out in one of the telco’s labs in Krakow, Poland, using its 5G SA network. Designed to demonstrate the ability of standardised, programmable APIs to simplify complex media productions, the trial also made use of two of Sony’s most recent product developments: the CBK-RPU7 HEVC 4K/HD remote production unit for video compression, and the aforementioned PDT-FP1 portable data transmitter – both of which were attached to a camera.
Making its show debut at MWC, the PDT-FP1 resembles a mobile phone at first glance, but in fact its compact form-factor hosts a highly versatile data transmitter with 5G (public and private networks) mmWave and sub6 capabilities. Integrating optimal antenna design and versatile input interfaces, the solution is simple to use and
offers the high-speed and stable sustained communication that are obligatory for a diverse roster of professional applications.
It’s also very robust, with Sykes revealing that the device was in constant operation throughout the show hours of MWC and consistently ‘kept cool’, thanks to a highly effective heat dissipation structure including inbuilt fan. “We were really pleased by the public reaction to the PDT-FP1 at the show and can’t wait for our customers to start using it,” says Sykes of the product, which is set to be made available in selected European countries this Spring.
With a busy calendar of live events and trade shows in prospect, including at NAB Show (Booth C8201), Sony is sure to have plenty of further opportunities in 2024 to showcase its impressive advances in 5G live production technology.
To learn more about Sony’s 5G technologies, please visit pro.sony.eu and watch videos on Revolutionising Live Media Production with 5G, Networked Live, and PDT-FP1
Developed and jointly operated by EBU and Nagravision, Eurovision Sport aims to enhance the amount of free sports content available to viewers across Europe and worldwide. The company’s Jean-Luc Jezouin tells Jenny Priestley about the technology under the platform’s hood
Sport on TV is dominated by a small number of disciplines, notably football, rugby, Formula 1, horse racing, tennis, golf, athletics, snooker and darts.
But sport accounts for a huge proportion of the TV viewing landscape. If you look at the most-watched TV programmes of all time in the UK, the UEFA Euro 2020 final between England and Italy sits at number five with 29.85 million viewers. In an era when viewers have more choices of what to watch than ever before, that's pretty good going.
It also helps prove that there is a huge appetite for sport among viewers, one that events not classed in that so-called ‘top tier’ can tap into. To help sports leagues and federations find an audience, the European Broadcasting Union has launched Eurovision Sport, its directto-consumer service. The platform aims to become home to thousands
of hours of content that will complement existing coverage provided by public service media – and showcase every second of a wide variety of events. It has already streamed content from the World Athletics Championships and The Boat Race to viewers across Europe.
The EBU chose Nagravision to power and operate the platform. The company has been providing technical services to TV operators around the world for over 30 years and has recently brought together all of its products required to create a sports streaming platform, including streaming technology, security, content ingestion, storage, distribution, advertising and CRM. All of these capabilities led the EBU to partner with Nagravision.
“We share the same strategy on how to serve sports and how to serve sport fans,” states Jean-Luc Jezouin, SVP and GM of Nagra Sport. “Their motto is connecting people with sport and sport with people; and for a long time, our motto has been, connecting people with content and content with people. Both of us are very focused on offering sports that are not very well served on TV, such as the tier two, tier three sports, or even the Olympic sports during the three-and-a-half-year period between the Olympic Games.”
Jezouin adds that there are numerous strategic matches between the two partners, not least that they are both based in Switzerland which makes working together even easier.
Discussions between the two initially began a year ago, with testing of the platform’s capabilities carried out last summer. “We ran some pilots, not proofs of concept because we knew that it was going to work, during summer 2023 on the Athletics World Championships and Summer Biathlon World Championships,” explains Jezouin. Those tests continued under the radar through the autumn and winter, before a soft launch of the platform in December followed by the official launch on February 6th.
The workflow
Jezouin describes the Eurovision Sport workflow as similar to that of a traditional broadcaster or streaming service. A host broadcaster produces the event coverage with the video streams sent to Nagravision which ingests them into its video cloud. “The EBU and their Federations are in charge of producing the content, and we manage it and distribute it."
“We ran some pilots, not proofs of concept because we knew that it was going to work, during summer 2023 on the Athletics World Championships and Summer Biathlon World Championships”
Nagravision has different ingest points around the world to minimise latency. IP SRT streams are sent from the venue via the cloud.
“We are handling literally thousands of live video streams every month and this is going to rise because our ambition is really to aggregate as much content as possible without limitation,” continues Jezouin.
"The sports market is extremely polarised. There are around 800 different sports that people participate in around the world, but only about nine of them are regularly on TV. And even then, there are often lower
leagues that are not shown on TV. We're pretty selective on which sport and leagues to onboard, but we're very open to creating that democratisation of access in one place of all this content. We have created an app which will have multiple live feeds every day. We want to make it easy for the users to find the content they want.”
There will be a time, says Jezouin, when Nagravision will become more involved in creating content. He expects this to begin with making all of the different feeds available to viewers in their own language. “The Public Service Broadcaster mandate is that all of this content should be available for free to everybody,” he explains. “That also means that the content has to be in a format that people can easily watch, and obviously commentaries are very important. Watching sport without commentary is not a great experience. Watching sport with a foreign language commentary is not a good experience either. Watching subtitles is better than nothing but still far from ideal. We're working on some very deep technology innovation around that together with the EBU, and we expect to announce more details very soon.”
Currently, translations are being done by humans but Jezouin expects artificial intelligence to become more involved as the platform expands from three languages currently up to 20 or 30 for major events. “We're already running some pilots on that, and it's amazing,” he adds. “The quality of the machine-based translation completely respects the emotion of the original human commentary.”
The Eurovision Sport platform is made up of existing Nagravision products that have been customised. Under the hood, there are products that have been developed and used by the company and its customers for 20 years, such as content management, the orchestration of the video pipeline, anti-piracy etc. “We've put all of these together as a solution that has very a specific focus. The user experience is customised to the specific needs of a multi-sport, worldwide platform,” he states.
Nagravision is working in full cooperation with EBU members. For example, with the recent Indoor World Athletics Championships, the content was available on the platform, but if a viewer logged into Eurovision Sport to watch the events and an EBU member such as the BBC was broadcasting, they would be directed to the relevant streaming service or linear broadcast.
“But the second the BBC stopped broadcasting or if they were not yet broadcasting live, the viewer could watch live on our platform, and iPlayer would refer them to Eurovision Sport. The cross-promotion is super important because people will come to our platform not only from all the broadcasters but also from the sport federations and leagues. This is a lot of organic promotion that we're going to benefit from because we have this cooperative attitude within the ecosystem,” explains Jezouin.
“There are around 800 different sports that people participate in the world, but only about nine of them are regularly on TV. And even then, there are often lower leagues that are not shown on TV”
“For the EBU members it is becoming difficult to broadcast every sporting event because there is more and more content and frankly, budgets are not following this trend. It takes time, cost and manpower for them. They're very happy to see this platform,” he adds.
Security around live sport is an ongoing battle against illegal streaming sites. Eurovision Sport has the full suite of Nagravision anti-piracy technology built in, however, some viewers may find themselves geoblocked. “We hold the worldwide rights to some of the content, but still we don't want it to be leaked. So we have the full-blown Nagravision security technology that has been serving the biggest sports with the biggest operators for 30-plus years.”
As the platform continues to grow, Nagravision will deliver technology updates on a bi-weekly basis. “There is an endless list of things we can do to make it better for the fans, it's going to become a massively scalable platform,” says Jezouin.
“The EBU currently holds about 43,000 hours of rights, so they have a sea of content and that's only a fraction of the content we can ingest. Very soon, we will have a potential of 100,000 hours of content on the platform, and when you multiply by the reach of EBU members, in Europe alone that's 800 million people. Even if only a fraction of that comes to the platform we will have tens of millions of users.
“It's a challenge that we're very comfortable with, and it's something that we are addressing every day,” he concludes.
The launch of a brand-new, national channel is a very rare event. The small Mediterranean republic of Monaco used to be served as a regional variant of a French network but when that left Monaco, His Serene Highness Prince Albert II determined that the country needed a broadcaster.
As well as serving Monaco’s 38,000 residents, Prince Albert’s vision was that the new broadcaster – to be called TVMonaco – should give a voice to the country on the international stage, reflecting Monégasque culture. The intention is that it will become a member station of TV5Monde, allowing its content to be broadcast in 200 countries.
At home, the broadcaster delivers content on linear and digital platforms, with live news and sports the cornerstone of its output, along with recorded programming to complete the 24-hour output. Prince Albert was particularly keen that the station should focus on environmental issues.
While the first thoughts of TVMonaco began to emerge in 2021, the project started in earnest at the beginning of 2023, with a target on-air date of 1st September that year. Everything had to be achieved during this tight timescale, from finding premises and building a technical infrastructure to recruiting staff and preparing productions.
With a heavy requirement for live programming and little or no broadcast industry in Moncao, a new broadcast centre was required.
This needed to provide comprehensive production facilities, capable of fast turnaround work on news and sports. It also needed to support the creation of magazine programmes and other long-form content.
Sylvain Bottari is TVMonaco’s CTO and led the development of the station. “I knew what we wanted to achieve,” he says. “We needed good, story-centric tools for journalists, backed by a good asset management platform.
“Running sports shows every weekend meant we needed a lot of ingest capacity,” he adds. “And high-quality graphics were vital.”
TVMonaco appointed Qvest to provide overall system integration services, with CVS Engineering taking responsibility for installing the broadcast infrastructure. Given the requirement for simple yet powerful production facilities, good asset management facilities, excellent integrated graphics, and simple ingest and archiving, Qvest recommended nxtedition to provide all the central functionality. nxtedition has developed a unique approach to microservices-based production environments. Its technical origins lay in the development of the CasparCG broadcast graphics platform. Today nxtedition provides all the tools required to produce, manage and deliver premium television with a fresh, coherent, integrated approach. The result is a completely secure and solid technology platform combined with a simple and intuitive user interface that is expressly designed to be very fast to learn.
Qvest arranged a demonstration of the nxtedition platform, that impressed Bottari and the TVMonaco team. The platform incorporated
all the key functionality required, and its microservices architecture meant that it was fully scalable.
This ensured that it could be precisely tailored to meet the specific needs of TVMonaco without introducing custom developments which are harder to support. Using web services for access meant that clear and intuitive user interfaces could be created for each department.
Being entirely software-defined, it also meant that it could be rolled out within the tight timescales. This was important because, in parallel with the construction of the studio centre and its technology, TVMonaco was also recruiting staff for the project, including a team of bright young journalists, many of whom had not worked in television before. The emphasis had to be on creating a set of values for the way the station covered news, which meant intuitive operation of the technology to realise their own way of telling stories was vital.
“The challenge we set ourselves was to achieve a very high-quality output with a relatively small crew,” Bottari explains. “At the same time, we have goals for the future, so we needed a platform with scope for advances in the future.
“We found that nxtedition suits the way we want to work, and it is completely futureproof,” he continues. “And because high-quality graphics were vital to our output, the heritage of CasparCG was very valuable.
“Looking to the future, we can scale as we need. When we are ready to add new tools like AI, it is simple to plug them in.” Despite the challenges of building a new infrastructure and recruiting, training and rehearsing a completely new team, TVMonaco hit its target of launching on 1st September 2023.
It’s well known that the media and entertainment industry is facing a skills crisis. But how should the media technology sector address the issue?
Carrie Wootten and Chris Redmond have jointly founded the Global Media & Entertainment Manifesto which aims to build a global agreement and approach to address the issue. TVBEurope spoke to them to find out more
Why the need for this initiative?
You only have to attend a media event or walk into an office of a media company to see how disproportionately the employee count is and how heavily it is dominated by an older age bracket. While there is the place for experience in any sector, there also has to be a next generation of talent working their way up. When you look at the numbers in the media and entertainment industry the talent deficit when it comes to younger team members is shocking, and worryingly so. The facts about the average age of employees in the broadcast space alone point strongly towards 50 per cent or more of most broadcast professionals being within 10 to 15 years of retirement age. The action that the industry needs to take to address this needs to happen now. That's why we have mobilised the Global Media & Entertainment Talent Manifesto to try and unite the recognition of this across the globe and create an industry-wide initiative to create programmes and initiatives to address the deficit.
What do you hope it will achieve?
Ultimately, the main goal is to see the skills shortage dissipate and disappear completely. As well as seeing a thriving pool of diverse talent entering our sector.
It’s a big ambition and we need to make working behind the scenes of M&E attractive again. Part of our thesis is that so many people in the younger generations are so used to being in front of the camera that they don’t really even think about the entire industry that exists to create professional-format productions. We need to educate and inspire them to think about a career in the media technology sector and highlight how their skillsets and abilities can be utilised across roles within the industry,.
How will the Manifesto stand out from other organisations aiming to fill the skills gap within the media tech industry?
The Manifesto is a movement, it's not a company or an organisation. We exist to create an alliance across the industry that facilitates what they know they need to achieve with the economy of scale associated to multiple influential companies being involved. We stand out because what we represent is a global alliance working to achieve a mutual objective.
The Manifesto can be used by CEOs, companies and individuals. How do you plan to tailor it for all three?
The triangulation recognises the nuanced agendas that each one of these stakeholders has, the influence they can impart and what they stand to gain. Therefore it's important to embrace the position each stakeholder group has. Our plans change so that we can address the CEO community with more intimacy and provide a commercial context to appreciating their perspective. For companies (which generally means management led by HR) it is about integrating the equity associated to their corporate
identity into how we promote and create advantage from what they have to offer. This is also the level at which we can engage more frequently and operationally by getting these people to interact with the audiences where we intend to attract attention. Finally, for individuals there are two camps; people currently in the media and entertainment trade who want to help, and the audience we intend to address. The scale of these stakeholder audiences is large and so the way we address them has to create intimacy with the individual, but also offer us the opportunity to be operationally effective. Generating and maintaining traction and momentum is key and we aim to do that by recognising their agenda and motivation to be involved and then catering to that.
You’re planning to focus on five key projects to help raise awareness of the industry. What are they, and how will they do that?
Schools Assembly Campaign - This is a simple initiative to reach as many young people as possible. We’re asking all industry representatives to deliver one school assembly presentation on their job role, their company and their work in the industry. Each assembly approximately has 300 young people in it and it takes 15 minutes to deliver — you can also be back at your desk by 10am. If we have 100 people sign up for this, we suddenly reach 30,000 young people. And this is what we need, economies of scale to ensure we inspire as many young people as possible.
Neurodiversity - It has been noted for a long time that engineers and people in technical roles are neurodiverse. But there hasn’t been any evidence or research on this anecdote. In late December 2023, we ran a survey to see if we could identify any patterns between those who are neurodiverse and those who are in engineering and technical roles. A pattern was identified, but what rang out loud and clear was how much support people needed who were neurodiverse in the sector. As a result, we are running a small mentoring programme, which launched in early February. Interestingly, 60 per cent of the mentees have asked for their names not to be shared publicly, as their managers/colleagues are unaware that they are neurodiverse and they felt very vulnerable sharing this information. We delivered a leadership training session in early March and we will be rolling out further initiatives over the coming months too.
World Cafe and Skills of the Future - One of our key ambitions is to create a live document, which will be the foundation of the Manifesto which companies and individuals can sign up to. But in order to do this, we need to first understand what the skills shortages currently are in detail and what skillsets we need to prepare for in the future — particularly as we look to embedding new technologies into our ecosystem such as AI and cloud technology. We will therefore be hosting a World Cafe at IBC, where we will work with senior leaders across the sector to delve deep into this conversation so that we can start to collectively understand what the industry needs for the future and what commitments every company can realistically commit to to support the future generations entering our sector.
Bootcamps - We know that there is an immediate skills shortage, as well as a long-term need too. We are looking to deliver some hands-on, practical bootcamps to convert people with basic-level skills to enter the workforce over the coming year. We are waiting on some news on a funding application to support this work, so will share this as soon as we have news!
What’s been the reaction of the industry so far?
It has been really amazing. We have a great network of companies across the industry who are looking to address this issue and meet regularly to share knowledge and information on initiatives and programmes of work that have delivered concrete results for their individual companies. The intention is that we can all learn from each other, whether a programme has been delivered in Germany or Australia, and this work can then be replicated or delivered at scale in news geographical areas. We can then also collaborate on other areas of work where new initiatives are needed, such as our neurodiversity programmes of work. We’re also delighted to have the support of Vizrt and Deluxe who are our founding members and lead sponsors.
How can other organisations get involved to support the Manifesto?
There are multiple ways organisations can get involved — through becoming a founding member, joining our Manifesto network, or delivering a School Assembly presentation or perhaps joining the World Cafe at IBC. We are more than happy to chat to anyone that would like to get involved.
How are you tailoring it to be relevant for a global industry?
Members of the Manifesto are from all corners of the world, to ensure that we retain that global perspective. It is critical that we keep this at the core of our work, to ensure that we have the widest range of viewpoints and learn from initiatives that are being delivered internationally and we will always monitor this through our Manifesto network and founding members.
More details about the Global Media & Entertainment Talent Manifesto are available at mediatalentmanifesto.com
Live sports is a uniquely pressurised environment, often involving complex locations over distance, so there is a need for test and measurement devices to be robust, versatile and very mobile, writes PHABRIX CEO Martin Mulligan
In many ways, sports can be seen as a crucible of innovation for broadcasting technology. Over the years, so many important new developments – HD, 4K, HDR, advanced camera systems and immersive audio among them – have been put to the test first in live sports. And as you might imagine, this tendency has meant the sector’s expectations of test and measurement (T&M) equipment have been similarly exalted.
If anything, broadcast engineers’ needs are becoming even more extensive in 2024 as hybrid production infrastructures – involving SDI, IP or a blend of both – become more commonplace. And whilst HD remains dominant in many territories, plenty of sports service providers are also having to contend with the call to deliver content in 4K, HDR, and so on.
As we head into what promises to be a hugely exciting year of sporting action, many broadcasters will be reviewing their T&M inventories and considering if they are really up to the job. It’s not difficult to work out why when you consider the scale of set-up for a stadium football match nowadays. For example, it’s by no means uncommon for a standard shoot to involve 30 or more cameras, including units at pitch-level, on helicopters, Spidercams etc. As high-quality cameras become even more ingeniously compact and versatile, the number of sources is bound to increase further.
Then there is the sheer time-pressure of a match day, where you are absolutely locked to the specific start time of kick-off. It therefore befalls to the broadcast engineer to verify all of the sources in a compressed timeframe, and if there are any issues – such as no-signal or electrical noise leading to jitter – resolve them as quickly as possible, wherever they may have originated. All of which means that portability is another top priority.
Of course, live production is always evolving and giving rise to other challenges. It’s often in sports where you observe these first as it’s such an important testing ground for new technologies. Indeed, it’s in sports where one of the capabilities we have finessed in more recent times – which helps the technical engineer strike a visual balance in HDR productions between the graphics
(generated in an RGB world) and video (coming from a YCbCr environment) – has proved most beneficial.
The adoption of HDR is by no means the only recent big change to have occurred in sports production that has given T&M solutions developers cause for pause.
Unexpectedly provided with an extensive ‘trial phase’ during the pandemic, when amassing large teams on-site became impossible, remote production – whereby smaller teams can be sent on location with most of the production work taking place back at the broadcast centre – has now matured into a familiar production technique. Not only is it good on an economic level, it can also assist broadcasters to reduce their environmental impact and improve the work-life balance of their crews.
With smaller teams overall going out on location, it’s inevitable that the number of technicians is often reduced; in fact, there are likely to be quite a few scenarios in lower-tier sports nowadays where a single engineer is given responsibility for maintaining the broadcast tech for a service provider at an event site. In which case, being equipped with a mobile, easy to use, and multi-faceted T&M product is bound to be a boon.
A few years ago, the fervour around IP migration was so intense you could have been forgiven for thinking that SDI had a relatively limited future. But in fact, the durability, cost-efficiency and – yes – familiarity of SDI has meant that it has continued to be used very extensively across the industry. Whilst the adoption of IP is continuing to grow, it’s likely that many broadcasters will be using SDI, too, for a long time yet. Broadcast engineers therefore will continue to need the same versatility from their T&M tools across both infrastructures.
With so many new or emerging technologies predicted to be deployed during coverage of this year’s big sporting events, it’s going to be fascinating to observe either as a participant or viewer. But it also means that the stakes are very high for achieving an end result that is robust and consistent, so if you haven’t already checked your T&M gear is still ‘fit for purpose’, now would be a very good time to make a start.
Remote broadcasting might be an established way of working, but can it be done better? Germany’s Dyn Media says it can. The streaming startup tells Kevin Emmott how it has redefined sports coverage to promote flexible growth for its sports, its coverage, and its people
Before we get started, let’s get this out of the way; bigger isn’t better.
Bigger is just bigger. It’s ponderous. It’s lumbering, clunky and slow. Being bigger makes things more difficult to change and it goes against everything that modern broadcasters are striving for.
Over the last few years, broadcasters have become increasingly flexible in their approach to both the production and the distribution of content. Remote production has become
a byword for efficiency and broadcasters are constantly looking for different ways to engage their audiences over streaming and traditional broadcast channels.
Germany’s Dyn Media is one of these broadcasters and has leaned into flexible broadcast models so hard that it’s practically bent them out of shape. Working in partnership with NEP Germany, it has not only found more space in its production ecosystem but also in the content it is producing.
And it has done that by keeping away from the traditional headline sports.
"We started with the concept of questioning everything that we were told wasn’t possible”
ANDREAS HEYDEN
"We started with the concept of questioning everything that we were told wasn’t possible,” says Dyn Media CEO Andreas Heyden. “Football has a proven business model and millions of people who have a guaranteed interest in the sport. We are not in the football business; we operate in an arena where sports rights increase in value, and we needed to build an ecosystem which is flexible enough to adapt to it.”
Heyden knows all about how huge the world of football is. He joined the sports industry in 2015 as CEO of DFL Digital Sports GmbH, a subsidiary of Germany’s Bundesliga, and teamed up with former Bundesliga CEO Christian Seifert at Dyn Media, a streaming platform providing sports coverage which goes way beyond football.
Launched in August 2023, Dyn Media provides live sports coverage of hockey, table tennis, basketball, handball and volleyball across 19 different leagues. Working in partnership with NEP Germany, it built a remote production ecosystem which not only allows it to cover over 2,000 sporting events per season but to grow and develop that coverage as the sports grow and develop.
“There have been many remote production proofs of concept, but not every problem is solved by throwing money at it,” says Heyden. “The classical media model of sport is not spearheading the development of remote production because it does
not have to think about costs or find new ways to broadcast. We are empowering these sports by driving the awareness and recognition of them to a higher level; we believe this is our sweet spot.
“Remote production is the only way we can do it. Not because we want to save the planet, but because we believe that it is no longer necessary to do it the old way.”
While Covid has inspired many live sports presentations to take advantage of remote workflows, the ecosystem that Dyn and NEP built is unique. Designed from the ground up, Dyn and NEP sat down in August 2022 to build something entirely different and do it in a more efficient, more sustainable and more human way.
“Everybody talks about remote production, but Dyn took risks to move all the production to the front,” says Zlatan Gavran, managing director of NEP Germany. “Dyn wanted to create an ecosystem unlike anything else in Germany and pushed the technology to its limits. Without clients like Dyn taking those risks, there is no way for any of us to move forward.”
Dyn’s ecosystem consists of two large remote production control rooms (PCRs) located at NEP Germany’s headquarters in Munich, with one large and three small remote PCRs stationed at Dyn Media’s headquarters in Cologne. All shared hardware is housed at NEP in Munich, and all seven control rooms are connected over high-speed fibre. It means that Dyn’s production teams can work in either city or in an outside broadcast truck on site, and be part of a similar production environment.
Both facilities are managed and powered by NEP’s Total Facility Control (TFC) which manages, configures and controls the entire broadcast and network infrastructure. Meanwhile, six venue kits manage the capture of audio and video at the venue. Each kit consists of a core rack with three stage boxes, all connected over fibre to provide support for four LDX 150 SMPTE ST 2110 cameras plus minicams, microphones and other audio sources.
The entire network is IP glass-to-glass with fibre connectivity to more than 40 venues provided by Riedel.
Ringing the changes
“We have two 10 Gb rings connecting Wuppertal, Munich, Frankfurt and Cologne, which enables everyone to work together seamlessly,” says Heyden. “Having a commentator sitting in Cologne and the
video guys in Munich is just as simple as having a commentator in the stadium, production in Cologne and sound control in Munich if we want to.”
Gavran adds: “It's proper decentralised production and it enables us to move and shift production resources. All the processing and equipment is located in Munich as is the Network Operation Centre (NOC) where we monitor all the signals.”
But the efficiencies don’t stop there, and Dyn bucked the trend again by taking an unconventional route to get to air with as small a footprint as possible.
“The problem everybody faces when they go remote, especially in countries like Germany, is that connectivity is very expensive,” states Gavran.
“Our first concept was built on 10 Gb for uncompressed video, and we ended up talking about using 1 Gb. This meant we needed to compress the picture down. What we ended up doing was 40 Mbit VBR, H264, 10-bit 4:2:2, and the results are indistinguishable from uncompressed.
“People say that it is important to work uncompressed. It's not. The viewer doesn’t see a difference,” he continues. “When we did the first match and our technology partners looked at the picture in Munich, they commented that this is the death of uncompressed.”
Applying this level of compression not only lowers the bandwidth but also minimises delay from the venue to Munich to around 400 milliseconds and below. Above all, it gives Dyn the ability to do whatever it wants at both ends of the production
“Dyn wanted to create an ecosystem unlike anything else in Germany and pushed the technology to its limits”
ZLATAN GAVRAN
chain, which enables it to quickly pivot in line with consumer needs.
“It's flexible in both the input and the output,” notes Heyden. “Do we take one camera signal? Is it done by an OB van at the stadium? Is it done by a software vision mixer, or do we take camera feeds from the stadium to our remote production facilities? Do we want to work with AI cameras? We want the flexibility to have a commentator in the stadium, in our remote production facility or completely remote, and we want flexible distribution to be via satellite, IP or file transfer.
“This degree of flexibility enables us to choose the most cost-effective level of coverage for a specific sport and upgrade it as it develops; whether that is upgrading the signal, the audio, or the distribution.”
Management of the entire network is aided by using NEP’s TFC application, which aims to simplify IP networks by bringing different elements of a live event network, like control data, intercom, video and audio signals, onto the same platform.
“While we have specialists, we are not IP engineers,”
says Garvan. “But if you want to ensure a normal EIC can operate an IP network, you need to have something in between. And it's TFC which makes sure that whatever you press is happening in the background.
“The whole ecosystem is IP glass-to-glass, and TFC helped us to quickly stabilise and run it without having to be an expert in IP networking. We designed and built the network, but TFC adds an overlay of control, management, SDN and monitoring. I couldn't imagine doing this without it.”
There is more infrastructure work to come. Building a foundation for future development isn’t just about the technical backbone, and it’s not just about
environmental sustainability. It is also about people, and Garvan is very open about the lessons learned from Covid and how that impacts the workforce.
“NEP produces hundreds of matches a year across Germany and we realised it is not sustainable,” he says. “Everybody wants to go out and do a job on-site, but there is a big difference between going out on one job and going out on five successive jobs in five different cities. Eventually, it's the travel that gets to people, not the job.
“We wanted to create an environment where people can go home at the end of a shift. We wanted to enable them to focus more on home than on travel.”
And it’s not just today’s staff who are looking at working in these new ways, but future employees, and according to Heyden they are shaking things up even more. “There is a whole generation of Gen Zs who grew up as content creators doing Instagram stories, TikTok, Twitch and YouTube, and they are way more sophisticated than people who started their careers as a slow-mo operator 20 years ago,” he states. “These people want to work differently. They are used to customisation. We need to cater for that.”
Dyn’s model works with Reidel's Simplylive Production Suite, a live production platform which adapts itself to
the tasks that need to be performed. At the beginning of their shift, directors plug in a USB stick that contains front-end elements customised to their needs, enabling the software to adapt to the way they work.
“We have introduced touch screens for people who prefer to work that way and have moved from one- to two-handed usage,” says Heyden. “We want to attract the best people to tell the best stories, and the technology should disappear and become a natural connection for them. You don't necessarily get this with a mixer with buttons which all just do one thing.
“We are not saving the planet; we are doing it because we want to have the most motivated and most creative people on our team who want to do a great job," he adds.
On average Dyn’s customers visit its website 10 times a month and each unique user watches up to 25 hours. It’s creating unique content every day, and it’s illustrating how remote production can maintain quality while maximising output.
“When you're sitting at home or wherever you
“We want to attract the best people to tell the best stories, and the technology should disappear and become a natural connection for them. You don't necessarily get this with a mixer with buttons which all just do one thing”
ANDREAS HEYDEN
watch the game, from a creative point of view there is no difference in the quality of the output,” says Gavran. “When you consider it replaces a whole truck's worth of equipment it is impressive.”
Amidst fears of AI displacing jobs lies a lesser-known narrative—where AI collaborates with humans and fosters job creation in TV and film production, writes
Robert ShepherdIn the era of rapid technological advancement, the spectre of artificial intelligence (AI) replacing human labour looms over every sector. AI-driven robotics have revolutionised assembly lines in manufacturing, enhancing efficiency and reducing costs.
Similarly, in customer service, AI-powered chatbots operate seamlessly, addressing inquiries 24/7. Despite the benefits these advancements bring, they also pose a significant challenge for displaced workers and further highlight AI's profound impact on industries, reshaping the workforce and business operations. Nevertheless, some sectors, like media and entertainment, are experiencing AI's reverse effect, bucking the trend of job displacement.
Benjamin Field, founder of Deep Fusion Films, highlights how AI contributes to job creation. "Firstly, it opens up new positions for AI specialists who can programme, manage, and maintain AI systems,” he explains. “For example, AI technicians are needed to create, build, develop, and maintain the AI software that automates tasks such as preliminary editing and visual effects rendering."
VFX and animation is one area of production where AI automates procedural tasks like rotoscoping and motion tracking. This frees VFX artists to focus on enhancing CGI realism and crafting imaginative visual sequences. As a result, demand rises for artists blending technical expertise with creative flair to push storytelling boundaries.
Gaurav Gupta, co-founder at post production studio FutureWorks (Westworld, Fallout), which specialises in VFX, colour, and sound, believes that if AI can help the industry increase the size of the market, that can only be a positive for companies and their workforces.
“Another example is visualisation and concept art,” he adds. “This is made a lot easier through AI. For example, a cinematographer will be able to support concept artists with a clearer brief by using images made with generative AI that reflect their vision.
“Like any other tool, AI, as it takes over some of the functions performed by humans, will also create new opportunities and jobs as these functions are transformed. Just like how smartphones, the internet, and email have become universal, AI will permeate everywhere and to everyone. AI is an inevitable evolution in content production, distribution and consumption. It’s everywhere.”
Meanwhile, Field says AI is also paving the way for interactive media, where content can adapt to viewer choices. “This innovation requires writers and developers who can create complex narrative trees and ensure that each pathway is as engaging as the next,” he continues. "The demand for professionals skilled in writing for interactive formats is growing as audiences seek more personalised and engaging content.”
Paul Robinson, president of Kartoon Channel, the kids’ broadcaster and on-demand service, says AI can improve the efficiency, accuracy and timeliness of creative decision-making activities by automating routine and repetitive tasks that are not particularly motivating for people to undertake. “For example, the use of Gen AI to create a draft script or a background for a television animated show significantly increases the time it takes to deliver the first draft,” he says.
“The skills of creative humans to iterate and improve can then be deployed more quickly in the process with two potential outcomes dependent on the goals, i.e. increased quality or reduced time to deliver a completed product or both.”
Field concurs and believes AI in script analysis can evaluate scripts for various elements, including plot structure, character development, and potential audience engagement. “This analysis can help writers and producers refine their content to better meet audience expectations,” he continues. “The result is a growing need for script analysts who can interpret AI-generated feedback and collaborate with writers to enhance scripts. Additionally, this technology creates opportunities for developers and data scientists focused on improving AI algorithms tailored to scriptwriting and storytelling.”
Moreover, he believes the increased efficiency means companies that can be flexible with budgets and therefore work more creatively will create an environment where commissioners are able to take more risks, as the financial risk on any one show will be less. “Having more cost-effective shows means that network budgets can be freed up in order to commission more shows,” Field adds. “More shows mean more creatives in work.”
Robinson points to the fact that “AI produces vast amounts of data very quickly, but data has little value unless it is used to make better decisions”.
He says “those better decisions are contingent on the analysis of the data and interpretation” in a way that is directly relevant to the production or creative that is being produced. “Whilst it is wrong to generalise, it's probably a pretty good bet that much creative talent is not also data literate, so AI systems engineers, ML experts and data scientists will be needed to do the analysis and give context". “The demand for these skilled individuals in the TV and film sector is already starting to show signs of growth, which will continue as deployment of AI becomes more widespread.”
“AI technicians are needed to create, build, develop, and maintain the AI software that automates tasks such as preliminary editing and visual effects rendering”
BENJAMIN FIELD
For Field, “adaptation leads to innovation”, and in the TV and film industry it means creating new forms of content that captivate audiences in ways previously unimagined. He says that “as professionals acquire new skills to harness AI's potential”, they're crafting the next chapter in the evolution of the industry, one that promises to be as exciting as it is transformative.
Looking to the future, Field is convinced new creative leads will emerge. “These will be the early adopters of the technology and they could well be producers or creatives that haven't been in the industry before, as their skillset — being able to blend AI and TV — wasn't necessary prior to the latest tech revolution. Those who adapt early will have job security, those that do not run the risk of being left behind.”
Box to Box Films is an award-winning documentary production company, based in London, Paris and Los Angeles. Led by much-garlanded producers James Gay-Rees and Paul Martin, the company sets out to push storytelling to new heights, elevating their subjects into dynamic and dramatic narratives.
Early successes for James Gay-Rees were on cultural topics, like the 2010 Exit Through the Gift Shop, directed by the street artist Banksy, as well documentaries Senna and Diego Maradona, the compelling narrative detailing the career of the footballing icon. Paul Martin, with a strong interest in sports, started his partnership with Gay-Rees on a documentary about Cristiano Ronaldo.
Currently, Box to Box is enjoying huge success with Formula 1: Drive to Survive for Netflix, with series six launching in February 2024.
Many of these popular and prestigious programmes are produced in conjunction with and delivered by major international streaming companies. During the production process, content needs to be signed off by contributors, producers and the creative team. At this stage the material is still secret, so the delivery of these approval reels must be highly secure, while still being readily accessible by executives, lawyers, and some of the networks.
Box to Box technical editor Rafael Bettega first used MediaSilo from EditShare in 2013 on Amy, and now uses it as standard on all productions. The software streamlines video collaboration, by managing and distributing work-in-progress, securely and simply.
For the production manager, the software allows groups of users to be set up, each with their own set of permissions. Content is delivered only to those who need to see it; and those with the appropriate permissions can comment directly on each clip, even adding comments time-coded to sections of the content.
At the hub, setting these users, groups and permissions is simple
and intuitive. Users can decide if recipients can download the material, or just view it. The content is distributed securely, with users receiving notification of new material. It supports multi-factor authentication to add an extra layer of protection.
End users do not need any special software or applications to view the shared content. Once they have proved their right to view it, the material is a simple QuickTime file which can be played on any device.
The software tracks usage and reports when each recipient viewed the material and how often. For the production manager, all the analytics and feedback are collated into a single point of information, showing at a glance the status of each piece of content.
Box to Box has settled into a production and delivery workflow in which MediaSilo plays a central part. “It varies from show to show, but often we are sending dailies to the production and post team, so they can start to build the programme as soon as possible, as well as checking the quality of what has been shot,” says Bettega.
“The advantage of handling dailies this way is that we don’t have to create log-ins to the asset management platform. As well as avoiding complications, it also means that we are not using seat licences, so it potentially saves significant costs, too.”
As the programme takes shape, cuts are distributed to
different constituencies, depending on progress. By the time the cut is nearing finalisation, it will be seen by key figures at the commissioning company as well as the production team and those responsible for finishing.
“On a typical project, we would send a cut to maybe two executives, three or four producers, showrunners and DoPs, post supervisors and more,” Bettega explains. “We send the picture lock cut to the composer, graphics company, dialogue editors and contributor creative agencies.
“The networks we partner with, for example, would expect the highest degree of security while producing the show, using only approved platforms for internal or external review,” he says. “They are also very meticulous about platforms used for the production and post, and MediaSilo is one of the few delivery methods they have approved. We are all very focused on content security and secondfactor authentication for added security.”
MediaSilo was developed for applications just like this. At every stage of production – from dailies and storyboards to VFX and final cuts – there is a need to manage the way material is shared across the collaborative creative pipeline. MediaSilo presents content in the best quality and a simple custom-branded user interface, so clients are free to focus on creativity.
It is expected that a newly enhanced version of the AES70 audio control standard will lead to increased adoption in various professional applications, writes
David DaviesOriginally published in 2016, AES70 built upon an extended programme of work by the OCA (Open Control Architecture) Alliance that addressed the long-standing absence of a control and monitoring standard capable of administering today’s professional, IP-based networked audio. With the OCA Alliance continuing to promote the standard and its adoption, AES70 began to establish a market profile through its first iteration and a 2018 revision.
Now the standard has been the subject of a further “thoughtful” update that, says the OCA Alliance, “both enhances the standard’s usability and flexibility while expanding its capabilities to meet the dynamic needs of professional applications.” Showcased at ISE earlier this year, the update – known as AES70-2023 – includes additions to the three core standards (AES70-1, AES70-2, AES70-3) as well as several forthcoming adaptation standards for networks utilising AES67, ST 2110-30 and MILAN (AVB).
Jeff Berryman, senior scientist at Bosch Communications and technical committee chair of the OCA Alliance, explains that the most important task for AES70-2023 was “to revise the connection management mechanism, which is the set of functions that AES70 provides for making, breaking and routing stream connections.
“It’s not a streaming protocol – that’s what you use things like AES67 for – but AES70 can be employed to control the making and breaking of those connections, where the streams go, and how their clock rates are matched up,” he adds. “We realised that it was a little too inefficient to implement in smaller devices especially, while there were also a number of other functions and generalisations that we wanted to make. So what we have now is a fourth and hopefully final iteration of the connection management function.”
Morten Lave is a highly experienced industry consultant, formerly known for his long association with TC Electronic and now consulting for companies including Adamson Systems Engineering, Focusrite and PreSonus. “I’ve been involved with the development of AES70 since its early days, and essentially what we wanted to do with the connection management mechanism was to create an abstraction for media stream patching which was independent of the underlying technology [used for transportation]," he explains.
Whilst there is a consensus expectation that the latest version will satisfy all likely applications, three “adaptation standards” are also in the works and approaching completion.
AES70-21 is likely to emerge first and will provide full control of AES67 and ST 2110 connection routing, as well as support for all related streaming options and integration with other connection management methods, such as SIP and SDP.
AES70-22 will address MILAN (AVB) connections with extensive routing control, streaming option support and device configuration support, whilst AES70-23 will perform a comparable role for Dante connection management.
“Now that we have this very solid abstraction and the new adaptations on the way, it would be fair to say that AES70 can complement any media transport protocol [effectively],” says Lave.
Using
What Berryman deems to be the second most important aspect of the new update is the introduction of large dataset storage and retrieval, covering datasets including media files, logs, command sets and programmes, stored parameters and custom datasets.
“Devices are getting smarter and smarter, so the relationship between the device and its controller can be quite complex and involve the movement of large amounts of data back and forth,” he says. “So with this new feature, we have a general capability for the movement and management of large datasets between controller and device.”
In addition to a number of new programming features – including a streamlined event notification mechanism and better support for concurrency locking and waiting – the supporting documentation has also been significantly overhauled.
Along with new informative annexes that provide examples and advice for developers, the new documentation benefits from “greatly clarified language throughout, with more consistency and precision. We have worked hard on that to ensure that the text is clearer and more accessible,” notes Berryman.
Matt Hardy is business development partner, UX and strategy at DeusO, a company specialising in web-based control user interfaces for pro-audio devices. As experienced computer programmers who also have strong backgrounds in audio, DeusO’s involvement with AES70 since its inception and through work on the latest iteration has included “plenty of discussion and lots of little improvements,
including the addition of new APIs, that mean the standard is easier to use from a developmental point of view,” says Hardy.
Whilst he acknowledges that “there are a number of incumbent control systems on the market that are clearly wellentrenched”, he thinks that AES70 now holds particular potential for existing companies and/or start-ups to be able to “utilise a reference design for control and not need to allocate so many resources to that aspect. By providing a constant approach to control, AES70 represents a great opportunity for manufacturers to innovate with networked pro-AV gear.”
Several applications in which AES70 might experience more rapid traction are earmarked by Lave, including in conjunction with MILAN (AVB)-based networks in broadcast studios, theatres and live events. Deterministic behaviour and optimum synchronisation are among the features of MILAN that have meant “it has been chosen to be the standard of choice for many sound reinforcement applications, and I think there could be quite a lot of use cases where it’s pragmatic to employ AES70 for control.”
Berryman also seems enthused by the broader potential of AES70 in its latest iteration, noting the suitability of the standard to “give people a faster ramp-up to the position where they can have network-controllable devices without a lot of expense. The community [of newer companies] tends to be full of smart young people who read the specifications very diligently and produce great
comments. In fact, the bulk of the interesting feedback in the last year has come from the keen energetic crowd of start-up people.” It’s an observation which suggests there is reason to feel not just optimistic about the prospects of the refreshed AES70, but also the potential of the next generation of pro-audio R&D teams.
“Devices are getting smarter and smarter, so the relationship between the device and its controller can be quite complex and involve the movement of large amounts of data back and forth”
JEFF BERRYMAN
Post production by its very nature is an energy-intensive industry. Editing suites, sound mixing desks, colour grading tools, VFX rendering all use electricity. But as the TV and film industry aims to become more sustainable, how can a post production house reduce its carbon footprint? London-based Dirty Looks has found a unique way of repurposing the heat generated by its computing and rendering storage requirements. The company has joined forces with data centre company Deep Green, which is using the heat to help power a swimming pool. TVBEurope spoke to both companies to find out more
Why is sustainability important to Dirty Looks?
The world is facing a climate crisis, and I am a human being and a father with children who are very anxious about the future. As the owner of Dirty Looks, I have the means to use my business as a force for good and help to change the way we do things in the post production industry, and align with the UN's Sustainable Development Goals.
What made Deep Green's data centre stand out?
Reducing our carbon footprint is a top priority. Our biggest challenge comes from the energy demands of our compute and data storage. Last year we moved a lot of equipment off-site to run a more efficient machine room in a dark fibre-connected centre in Shoreditch (instead of locally in Soho), but with expanding processing and storage needs traditional data centres didn't feel like the answer. Then, we discovered immersive servers and re-captured heat technology. Deep Green's approach resonated deeply. The idea of transforming waste heat into a free resource for a local community is both ingenious and inspiring. We're excited to be part of this sustainable solution.
How many of Dirty Looks' projects have used the data centre so far, and how big are those projects in terms of terabytes etc?
We are delighted to be in production on VFX-pull systems that are helping to heat Exmouth's public swimming pool in Devon, along with some machine learning processes which are training our AI models for post production improved efficiency in the future. Currently, the distance to the Exmouth site, and reliance on public internet with VPN, creates challenges to move large amounts of data, so our use has been handfuls of terabytes (compare to the petabyte+ of data that we have to keep live in London). We look forward to re-locating to a closer Deep Green site this summer and connecting our workstations and Soho studio via 100 GbE dark fibre to empower our compute workloads' data to travel without bottlenecks.
Which projects have employed the data centre?
So far we have used the Exmouth site for rendering some shots on two projects for Netflix, including The Kitchen (directed by Daniel Kaluuya and Kibwe Tavares) and The Abyss/Avgruden (director Richard Holm). In addition to these films, we have been using the site's powerful NVIDIA cards for R&D, writing and developing our own tools to streamline this process even further in the future.
You've signed a deal for 18 months, do you expect to continue using this kind of technology further?
Absolutely, it's like being asked if you would buy a petrol car again. I think once you make the step in this direction why would you go back? Over 18 months we intend to migrate all of our hot GPU compute and storage into Deep Green site(s). There are a lot of challenges associated with re-locating powerful real-time cinema-quality graphics workstations away from under artists' desks (or in a machine room down the corridor), but we're confident that we can solve the last few pieces of the puzzle in the next six months.
What would you say to other post houses that might be considering a similar project?
Do it! Capture your heat, don't waste even more energy trying to throw it into the outside air. Give heat away for good, to a community that needs it.
Image processing demands are rapidly accelerating, and storage capacity requirements are always climbing. If you are at the point of refreshing hardware anyway it's a total no-brainer. In terms of security, you aren't using public cloud, think of this approach as a private cloud of which you control engineering and costs. We can do it — so you can,
too. Dirty Looks is happy to share our experience with peers and help if we can, just get in touch.
Mark Bjornsgaard, founder and CEO, Deep GreenDo you work with any other companies within the M&E industry?
Dirty Looks was the first in the sector to commit to working with us and we're now having several conversations with other post production houses since we announced our partnership. Dirty Looks is leading the way for the media and entertainment industry to create huge social value for local communities. The data centres underpinning the sector produce a huge amount of heat. This should no longer be wasted, vented into the atmosphere, but instead re-deployed for social good. By sharing the heat for free with, for example, swimming pools, we reduce their reliance on fossil-fuel boilers, cutting energy bills and reducing carbon emissions.
How is the heat from the servers transferred to the pool?
Deep Green's data centres are made up of servers specially adapted to be fully immersed in mineral oil. The oil captures the heat generated and is pumped through pipes to a heat exchanger where the heat is transferred to the cold pool water, raising its temperature. This reduces the pool's reliance on its fossil-fuel boiler.
How many computers in total are needed to heat the pool, and how many are Dirty Looks running?
Each Deep Green unit varies in size. The technology is scalable and a greater number of computers can be added to meet whatever requirements are needed. Typically each data centre is around 200 kW in size with the vast majority of that energy being available to us as usable heat, which in turn can save a pool up to £80,000 a year on their energy bill.
Are you considering launching in swimming pools closer to London, where the majority of the UK's post production sector is based?
There are over 1,500 public pools across the UK. This means there are sites all over the country that Deep Green could deploy in. It also means that any post production company is never too far away from a local swimming pool that could use the heat from the film rendering process.
IPMX – IP Media Experience – is a proposed set of open standards and specifications to enable the carriage of compressed audio and video, with associated data, over IP networks. It was created by industry body AIMS, builds on work from SMPTE, AWMA and VSF, and is incorporated as VSF TR-10. Although the standard allows the carriage of uncompressed SMPTE ST 2110 streams, in practice it is unlikely that anyone would do that, so typical implementations are going to use compression, which might be a low-latency distribution format like JPEG-XS, or it might be a highcompression format like H.264. IPMX is completely signal format agnostic, so any connection could freely mix ST 2110, JPEG-XS, H.264 and more in a single feed.
It is naturally allied to RIST, the Reliable Internet Stream Transport, another open standard and widely interoperable. This interoperability is really important if IPMX is to deliver widespread, ready connectivity.
I should declare an interest here: I took a major part in creating the specifications for RIST, I was also closely involved in the creation of LibRIST, the library of routines used to build RIST into practical applications, again open and freely available from VideoLAN.
If you have been around the broadcast industry for a reasonable length of time, you will know that production and delivery used to be concentrated in large buildings, and a facility was provided to all the offices and working areas in the building which allowed workers to switch between multiple feeds to see what was happening: what was being transmitted, what was happening in the studios, and so on.
Today the nature of the industry is very different. The big players are international businesses, and the work of production and delivery is decentralised. The pandemic initiated a major shift towards home working, and many now value this as an important boost to work/life balance.
This includes key people in production and planning, who certainly need to be in touch with all the content at all times. So how do you engineer a modern equivalent of the old in-house distribution system, for a very large number of streams, to very many destinations globally? We have been working with a major American broadcaster to tackle this issue, using IPMX and RIST. At present, this uses four hubs in Washington DC, Los Angeles, New York and London, with the capability of bi-directional flow between them. Each can take in large numbers of ST 2110 feeds and compress them for viable transport. Uncompressed distribution is allowed, but it is unlikely to be used. JPEG-XS provides very low latency, but in practice, H.264 achieves less than a second latency glass to glass.
There is a really important point here: when I say “ST 2110 feeds”, I mean everything in the feed. Along with the audio and the video, that includes all the ancillary data and PTP timing information to ensure signals are accurately synchronised end-to-end. Every ST 2110 to RIST conversion is convertible back to ST 2110, potentially allowing the system to be used for contribution and distribution feeds, and for remote post.
In the system we implemented, the RIST client is hosted inside the media server. That ensures it is continually monitoring for content availability and requests. We call this RIST on Demand.
High-quality H.264 software compression needs a lot of processing power. We use SuperMicro or Asus COTS hardware, with Mellanox or Intel network adapters. SipRadius developed custom code to bypass the network stack, a notorious Linux bottleneck.
The result is a very powerful engine, capable of processing as many as 21 concurrent streams on a single piece of hardware. Load balancing routines spread the processing load as well as optimising network bandwidth. From the hubs, the streams are distributed to multiple users.
For those in the headquarters building, we developed a set-top box, using a Raspberry Pi processor and a VLC client, an intuitive graphical interface and comprehensive media security. RIST has AES encryption and 256-bit secure remote password protocol built in. The STBs access the RIST stream directly, which means they have all the associated metadata also available.
For remote users, the content is distributed using WebRTC, with a custom DRM. WebRTC means the service is available in any browser on any device, using packet recovery for stable performance and an overall latency of less than a second. Like the STB, the browser interface looks like a corporate IPTV service and is completely intuitive in operation.
In the implementation we have delivered, the service is capable of delivering hundreds of concurrent streams without impact. But the advantage of RIST on Demand is that only programmes currently being viewed are transmitted, while putting the RIST servers inside the media farm means instant response to remote requests. This manages and minimises the bandwidth required between hubs. Additional viewer requests are merely additional accesses to the distributed signals.
IPMX and RIST, together, have enabled a practical, scalable and secure solution to a practical issue. The use of open standards and protocols have further enabled it to be built on COTS hardware and virtual machines, and delivered to the devices we all have in our homes, hands and offices.
Intuition is a force to be reckoned with, and if you are a broadcaster covering breaking news or a content creator working from a remote location, your instincts might be screaming that despite the noise around 5G, you need something more robust in order to guarantee connectivity.
We’ve never had better or more abundant connectivity, and wireless networks like 5G provide even more ways to access reliable backhaul from remote locations.
It’s convenient, it promises low-latency, fast speeds and it’s easily accessible. But not everything in life is as reliable as your instincts. And all too often, neither is 5G.
While early adopters of 5G like the UAE and South Korea are achieving median download speeds in excess of 500 Mbps, it is a different story elsewhere. Globally, 5G is still in its infancy and in fragmented regions like Europe, performance is especially inconsistent.
According to the GSM Association, there are more than 80 5G operators in Europe, and few are capable of delivering speeds in excess of 200 Mbps. Thanks largely to their dedicated 5G spectrum and a supportive regulatory environment, the Nordic countries of Sweden, Finland, Denmark and Norway, as well as Cyprus, are all capable of download speeds which qualify them as “High Performers” by connectivity specialist Ookla. However, the five biggest European economies, Germany, the UK, France, Italy and Spain, are all struggling with spotty coverage and median download speeds as low as 98 Mbps. And it’s not getting any better. As more and more smartphones, wearables and IoT devices piggyback on Europe’s 5G infrastructure, the resulting network congestion means that median 5G speeds are actually decreasing every year.
A report by the European Court of Auditors estimates that 5G deployment across the EU will cost around €400 billion, and with service providers already struggling to see a return on their existing 5G investment, future development into standalone 5G also looks dubious. For anyone who needs reliable, robust and high-bandwidth connectivity, 5G isn’t enough.
In preparation for this summer’s sporting events in Paris, Dejero has already conducted 5G tests in the French capital, using a variety of SIM configurations across seven of the venues, including the South Paris Arena, the Emile Anthonie Stadium and the Stade de France.
Setting the transmitter to 20 Mbps and 2 seconds of latency in fixed latency mode, we performed 5 minutes of streaming for each SIM combination. We found that 5G coverage is performing fairly well in the city core, but LTE is still the better option, especially the further out you go. Factor in the other variable of crowd congestion and you could be in trouble if relying on a single cellular network carrier.
It’s safe to say that for live broadcasting and streaming at the Summer Games, one connection won’t be enough, which is why many live sports professionals will be relying on Dejero Smart Blending Technology at whichever venue they find themselves.
With uncertainty over the development of 5G even in advanced economies like Europe, it’s good to have more options when we need them. And we need them more than ever.
We’ve gone from sending video from point A to point B, to having everything connected over data. Bringing all this into one environment, combining video transmission with data connectivity, enables remote broadcast crews to operate in the same way they would in an internal facility.
We’ve seen those limits consistently pushed. Our equipment has been flown in helicopters and planes; it’s been on the ground in the most inhospitable of locations; it’s been in rivers, floods and fires, and in areas affected by earthquakes. It’s worked with broadcasters in Ukraine and in the Maui wildfires, at times when reliable connectivity has been absolutely critical.
The landscape is constantly evolving, but in a world where there’s still a huge disparity in 5G connectivity, it’s good to know we don’t have to rely on just that.
Who can you trust to transform your production workflows?
Sony and Nevion (a Sony group company) have built on their vast experience to bring to you Networked Live, an ecosystem of solutions, products, ser vices, and partners, helping you take full advantage of your resources in high-quality mission critical live production through connected hybrid on-premises and Cloud capabilities
Talk to the exper ts
pro.sony/networked-live