EDITORIAL
We open the month of July with one of the most anticipated sporting events, the Wimbledon tournament, and from TM Broadcast we wanted to bring our readers a detailed analysis of the technical deployment that ACS and EMG Connectivity, in collaboration with WBS, have undertaken for this year’s broadcast: new points of view and bird’s-eye coverage that ensure we do not miss a detail of any match.
Many of the images we will see during the famous tournament will reach our homes through state-of-the-art cameras; for this reason, we include in this issue a special analysis of 4K and 8K broadcast cameras, technologies that are already part of the industry’s day-to-day life.
Sport is one of the protagonists of the summer season, and we did not want to miss the opportunity to highlight the importance of remote production in the broadcasting of this type of event; Singular.live tells us in this issue how its graphics enhance and energize the remote production of all types of content.
In relation to content production, we spoke with one of the leading post-production companies, the Irish firm Windmill Lane, responsible for content post-production for all the major international platforms and a collaborator with some of the best film and television directors of recent years.
To close, TM Broadcast will transport the reader to Japan House in London to attend the presentation of the latest line of audio consoles from Yamaha, and we will end the trip with a relaxed chat with Jeff Rosica, CEO of Avid, whom we were able to ask about the future of the industry at the last stop of the European tour “The Future of Post”.
Wimbledon with ACS and EMG Connectivity
Over several years, EMG Connectivity has supplied RF services to various broadcasters, but it was when WBS took over the host broadcasting of The Championships from the BBC in 2018 that the group became the exclusive supplier of RF cameras. Further cementing its expertise in the area, Aerial Camera Systems (ACS), also part of the EMG Group, has been supplying specialist cameras for The Championships for over 20 years.
Ateme integrates protocol BISS-CA for content protection in all its premium solutions
A free license is included to simplify BISS-CA delivery; with this move, Ateme aims to help fight piracy. The company also hopes to continue enhancing and innovating BISS-CA.
Ateme has made the BISS-CA content protection standard accessible within all of its premium products and solutions. The firm is a co-developer of the BISS-CA standard, and offering the solution for free is intended to facilitate industry adoption of the protocol.
A scrambling protocol codeveloped with the European Broadcasting Union and Eurovision Services, Basic Interoperable Scrambling System Conditional Access (BISS-CA) offers open, conditional access and gives users a transparent, traceable, and vendor-agnostic contribution and primary distribution scrambling system. BISS-CA is secure thanks to its encryption with rolling keys. As a standard, it is interoperable across encoders, transcoders, multiplexers, and decoders. It is also scalable and simple to operate, using public keys to manage assets. Ateme incorporates BISS-CA into its KYRION, TITAN, and PILOT premium solutions.
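To make the rolling-key idea concrete, below is a minimal Python sketch (assuming the third-party cryptography library) of how this style of conditional access can work: content is scrambled with a session key that rotates regularly, and each new key is wrapped with the RSA public key of every entitled receiver, so a compromised receiver can simply be left out of the next rotation. It illustrates the general principle only, not the actual BISS-CA message format or key sizes.

# Conceptual sketch of rolling-key conditional access (illustration only,
# not the BISS-CA wire format): content is scrambled with a session key
# that rotates, and each new key is wrapped for every entitled receiver.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def new_session_key() -> bytes:
    """Generate a fresh 128-bit scrambling key (the 'rolling' key)."""
    return os.urandom(16)

def scramble(payload: bytes, key: bytes) -> bytes:
    """Scramble a chunk of the stream with AES-CTR; prepend the nonce."""
    nonce = os.urandom(16)
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return nonce + enc.update(payload) + enc.finalize()

def entitle(session_key: bytes, receiver_public_keys: list) -> list:
    """Wrap the current session key for every entitled receiver.
    Revoking a receiver (e.g. after a detected leak) just means leaving
    its public key out of the next key rotation."""
    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    return [pk.encrypt(session_key, oaep) for pk in receiver_public_keys]

# Example: two entitled decoders and one key rotation.
receivers = [rsa.generate_private_key(public_exponent=65537, key_size=2048)
             for _ in range(2)]
session_key = new_session_key()
entitlements = entitle(session_key, [r.public_key() for r in receivers])
protected_chunk = scramble(b"transport-stream chunk", session_key)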
Ateme is driven to help content
owners fight back against the piracy of live sports streams. Since the introduction of BISS-CA in 2018, Ateme has successfully employed the standard and enhanced its benefits. For example, by adding watermarking solutions from different data security providers, with direct control through BISS-CA, Ateme has transformed channel protection into a full anti-piracy solution, with asset removal when a leak is detected. Adding BISS-CA audio mode also increases the granularity of asset management, enabling separate encryption of each data channel. This makes it possible to isolate audio or specific data and transfer it securely to dedicated receivers.
“Piracy is increasing. According to international research firm Parks Associates, it could result in a $113 billion loss for streaming services by 2027,” said Julien Mandel, Solution Marketing Senior Director at Ateme. “So it’s important to continue the fight against piracy. That’s why at Ateme, we want to make BISS-CA more widely accessible, including it in our premium products and solutions. Launched at NAB Show 2018, BISS-CA has evolved to meet new modes of operation. Today it is not just for satellite distribution: combining BISS-CA with any Automatic Repeat Request (ARQ) transmission protocol, such as Secure Reliable Transport (SRT), enables secure transmission over the public internet as well as secure ingest/egress to/from the cloud. As we continue to innovate and improve our solutions, we are also excited to provide all new premium delivery solutions with free access to BISS-CA support in our KYRION encoders, our TITAN suite of transcoders, and our PILOT management system.”
HIVE introduces new SDM-compatible media engine
The English company launches BeeBlade, an SDM-compatible media engine which aims to level the playing field thanks to its flexibility and scalability
HIVE, a manufacturing company in the media engine market, announces the launch of the BeeBlade, a new product offering that aims to redefine the way media players are used across various applications. This media engine signifies a new approach to media playback, making it one of the most advanced and adaptable solutions in today’s fast-moving media engine market.
“I’m very excited about our new BeeBox and BeeBlade products, featuring our game-changing BeeSync technology. Developed in partnership with Intel, this cutting-edge technology enables seamless frame-perfect synchronisation across devices without the need for separate reference sync cabling and expensive HD-SDI hardware,” explains Trey Harrison, Co-Founder at HIVE. “The BeeBlade sits inside a projector or display and requires only an Ethernet connection. The simplicity this
brings to synchronised blended multi-projector installations will set a new standard in our industry!”
BeeBlade sets itself apart with its ability to be deployed in three different formats. Firstly, it can be inserted directly into SDM-compatible projectors or displays, leveraging the advantages of the SDM format whilst simplifying installation and eliminating the need for expensive video distribution systems. Alternatively, BeeBlade can be deployed in a standalone box, a BeeBox, well suited to situations where display technology lacks an SDM slot, yet where users still want to harness the power of the distributed architecture. Lastly, for those seeking a high-density rack solution, multiple BeeBlade modules can be inserted into an accompanying product, the BeeHive, enabling a 16 x 4K output from a single 5U rack and delivering a high-density media playback solution.
The new BeeBlade range incorporates HIVE’s proprietary BeeSync technology, achieving great synchronisation across multiple displays, with precise frame-accurate synchronisation of HDMI video outputs across all Ethernet-connected video players, wherever they are located. Comparable to more expensive genlocked systems, BeeSync boosts synchronisation capabilities, eliminating the need for extra hardware and cabling. HIVE’s flagship BeeBlade modules are fully compatible with all SDM-equipped Panasonic projectors and displays as part of an ongoing relationship between the brands.
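As a rough illustration of how frame-accurate playback can be derived from a shared network clock rather than a genlock cable, the sketch below (not HIVE’s actual BeeSync implementation) has every player compute the frame to present from an agreed epoch and frame rate; as long as the players’ clocks are disciplined over Ethernet, for example via PTP or NTP, they stay in step.

# Conceptual sketch (not HIVE's BeeSync): players sharing a common network
# clock derive the frame to present from the same epoch and frame rate,
# giving frame-accurate sync with no separate reference-sync cabling.
import time

FPS = 60.0                     # playback frame rate shared by all players
SHOW_EPOCH = 1_700_000_000.0   # agreed start time in seconds (shared clock)

def current_frame(now: float) -> int:
    """Frame index every player should be presenting at time 'now'."""
    return max(0, int((now - SHOW_EPOCH) * FPS))

def wait_for_frame(frame_index: int) -> None:
    """Sleep until the shared clock reaches the start of 'frame_index'."""
    target = SHOW_EPOCH + frame_index / FPS
    delay = target - time.time()
    if delay > 0:
        time.sleep(delay)

# Each player, wherever it sits on the network, renders the same frame:
frame = current_frame(time.time())
wait_for_frame(frame + 1)      # align presentation to the next frame boundary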
“The flexibility this product offers the customer is disruptive – an industry first. Users can buy an SDM card from us which they can
put directly into a projector or display, negating the requirement for expensive, unreliable, and awkward-to-install distribution systems, or they can use the standalone box option when an SDM slot is not available,” confirms Dave Green, Co-Founder at HIVE. “It’s a one-size-fits-all solution.”
Furthermore, BeeBlade’s modular design allows customers to purchase the exact number of outputs they need, offering a scalable solution that optimises costs. Unlike other manufacturers that may require
the purchase of multiple systems to accommodate specific output requirements, BeeBlade and BeeHive empower customers to scale their systems precisely to their project needs. Green explains this concept further: “HIVE customers only need to buy the number of outputs that they need. If a project requires 11 4K outputs, they buy a BeeHive and populate it with 11 BeeBlades. The cost involved scales according to the requirement, offering a future-proofed solution. This is unlike anything else on the market.”
“BeeBlade offers the highest density of outputs, silky smooth playback and amazing scalability, all within the same great HIVE software ecosystem,” says Mark Calvert, Co-Founder at HIVE. “We believe the BeeBlade module and accompanying BeeBox and BeeHive products are disruptors in this market, and we look forward to seeing how our users deploy this technology over the upcoming months.”
BeeBlade and BeeBox are available now and are already being sold worldwide. HIVE expects the BeeHive product to ship in Q4 of 2023.
Chyron releases CAMIO 5.3 to facilitate production and boost graphics workflow efficiency
Chyron, a company headquartered in New York, has announced the latest update of its solution, CAMIO 5.3, for enhanced graphics workflow efficiency and production flexibility; the update also offers new features for adapting graphics across multiple news markets, as well as integrations with newsroom computer systems and the latest non-linear editing suites
A new timecode OUT for graphics allows journalists and producers working with CAMIO’s NRCS plugin, LUCI, to specify a precise time at which the graphic playout ends, overriding any duration set in the template’s animations. The automation system pulling from the newsroom rundown can use this timecode information to manage graphics automatically, keeping the rundown on schedule.
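The behaviour can be pictured with a small sketch; the field names below are hypothetical rather than Chyron’s MOS schema, but they capture the rule that an explicit timecode OUT from the rundown wins over the duration baked into the template.

# Illustrative sketch (hypothetical field names, not Chyron's MOS schema):
# an explicit timecode OUT set by the journalist overrides the duration
# defined in the template's animations when the automation clears a graphic.
from dataclasses import dataclass
from typing import Optional

@dataclass
class GraphicItem:
    template_duration: float        # seconds, from the template animation
    take_in_offset: float           # seconds from the top of the story
    timecode_out: Optional[float]   # seconds, set in the rundown (or None)

def take_out_time(item: GraphicItem) -> float:
    """When the automation should clear the graphic, relative to story start."""
    if item.timecode_out is not None:
        return item.timecode_out    # explicit OUT from the rundown wins
    return item.take_in_offset + item.template_duration

lower_third = GraphicItem(template_duration=8.0, take_in_offset=2.0,
                          timecode_out=15.0)
assert take_out_time(lower_third) == 15.0   # journalist's OUT overrides 2.0 + 8.0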
Chyron recently announced the release of CAMIO 5.3, the latest edition of the company’s Media Object Server (MOS)-driven newsroom graphics management system. Targeting the evolving requirements of news broadcasters, updates in CAMIO 5.3 empower producers and journalists through tighter integration with the latest non-linear edit (NLE) and newsroom computer systems (NRCS), more extensive use of MOS newsroom information to drive efficient playout, and a new workflow for dynamic management of graphic projects across multiple teams within distributed workflows.
The most notable update in CAMIO 5.3 is its new User Access Control Panel, which allows administrators to create
dynamic user permission groups across distributed teams. As a result, local stations enjoy edit access over localized graphics projects without overwriting master packages that should only be edited by the designated art department. Whether the station or station group uses a centralized hub-and-spoke model or a more distributed stem-and-leaf model, the right people have the right permissions to access and edit content.
Another key area of improvement in CAMIO 5.3 is providing more graphics metadata to drive automated productions from the newsroom computer system. A new API connecting PRIME and CAMIO enables use of XMP metadata to auto-fill replaceable text fields for PRIME graphics within CAMIO, saving time and eliminating typos.
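Conceptually, the auto-fill works like the short sketch below. The XMP properties and field names are hypothetical and this is not the PRIME/CAMIO API; it simply shows how metadata already attached to an asset can populate a template’s replaceable text fields without retyping.

# Illustrative sketch of XMP-driven auto-fill (hypothetical property and
# field names, not the PRIME/CAMIO API): metadata carried with the asset
# populates the template's replaceable text fields automatically.
import xml.etree.ElementTree as ET

XMP_SAMPLE = """<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
           xmlns:dc="http://purl.org/dc/elements/1.1/">
    <rdf:Description dc:title="Mayor announces new transit plan"
                     dc:creator="J. Reporter"/>
  </rdf:RDF>
</x:xmpmeta>"""

DC = "{http://purl.org/dc/elements/1.1/}"
RDF = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"

# Map XMP properties onto the template's replaceable fields.
FIELD_MAP = {"headline": DC + "title", "byline": DC + "creator"}

def autofill(xmp_packet: str) -> dict:
    """Return template field values extracted from an XMP packet."""
    desc = ET.fromstring(xmp_packet).find(".//" + RDF + "Description")
    return {field: desc.get(attr, "") for field, attr in FIELD_MAP.items()}

print(autofill(XMP_SAMPLE))
# {'headline': 'Mayor announces new transit plan', 'byline': 'J. Reporter'}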
With support for the latest versions of Grass Valley Edius (versions 9 and 10), CAMIO 5.3 provides an easy pathway to bring PRIME graphics from a CAMIO template library into an Edius editing project. The CAMIO NLE Plugin has long been a feature of CAMIO, and ongoing updates to accommodate leading NLE applications help users ensure a consistent look across live and post-produced content, often through efficient, centralized creation of all graphics for live and post.
CAMIO 5.3 now integrates with SNews Arion, an NRCS that is gaining traction in the news industry with its easy-to-use interface and tools. This furthers Chyron’s commitment to ensure that CAMIO works in all newsroom environments, regardless of the NRCS vendor.
Vizrt helps ELF achieve cloud production in less than three weeks
The European League of Football (ELF) achieves cloud production in record time thanks to Vizrt; Vizrt, novel.media, and Amazon Web Services (AWS) collaborated to support the ELF in achieving full end-to-end cloud live production in less than three weeks for more than 100 American football games
Vizrt, specialized in real-time graphics and live production solutions for content creators, helped the European League of Football (ELF) achieve its vision of moving to cloud live production for its 2023 season, in a mere three weeks. Representing 17 teams across nine countries –with more joining each season – the ELF viewership is on track to reach more than half a billion households worldwide through the official ELF Game Pass and popular European TV networks. To continue this growth, the ELF and their production agency, novel.media, were looking to improve viewer experiences with new features, advanced graphics, and increased video quality. Additionally, they wanted to increase value for sponsors with improved advertising options, while reducing production costs and carbon footprint. Moving their live productions to the cloud would make all this possible.
End-to-end cloud live production within weeks
Realizing that the flexibility of a “best in breed” approach to their required live production tools, combined with the scalability of cloud infrastructure, was essential to their continued success, the ELF and novel.media sought the expertise of Vizrt and AWS to support them in their transition
to “go cloud”. With just weeks to go before games began, Vizrt and AWS proposed an end-to-end cloud setup powered by the Vizrt Live Production Solution and selected third-party tools, all connected by NDI®. By choosing cloud, the ELF and novel.media operators have access to high-end live production tools usually out of reach of niche sports broadcasters. High-quality image-based augmented reality graphics and virtual advertising are powered by Viz Arena, with software-based video/audio switching made possible with Viz Vectar Plus. Graphics are rendered in real time using Viz Engine, and 3Play by Viz Now ensures no moment is missed with unmatched replay and slow-motion. Essential to achieving the tight deadline, Viz Now saved weeks of deployment and setup time by automating the deployments into AWS and provides a simple portal to spin up and delete the live production environments as needed.
Due to Vizrt’s interoperability, it was also possible to integrate additional third-party tools into Viz Now’s automated deployment workflows, including Comprimato Live Transcoder for encoding NDI® to SRT for playout, and Tractus Multiviewer for NDI® for viewing up to 16 NDI® sources at once.
On the ground, two LiveU LU800 field units convert a total of 8 SDI signals to SRT in 1080p and transmit them over 5G to AWS, where they’re decoded to NDI® by virtualized LiveU LU4000 units.
With up to four games being produced concurrently, novel.media operators work in four new control rooms on-site
at novel.media premises and access and manage up to six of these cloud live production environments at the click of a button through Viz Now.
Elevating the fan experience and increasing value for sponsors
Viz Arena was key to achieving the goal of improving the viewer experience and increasing sponsor value and visibility. Controlled by OnAir Graphics operators, Viz Arena uses data generated during the game to add augmented reality (AR) graphics to the main cameras, captivating fans with informative and exciting visuals, adding virtual 1st-and-10 lines, and displaying virtual “cam carpet” ads.
To increase content relevancy for fans, the solution was designed to support multiple signal outputs from Viz Vectar Plus. Depending on the match being played, a graphics-only signal is sent to local broadcasters who insert their own presenters and L-frames, and another signal broadcasts through the ELF OTT channel and uses Spalk Virtual Commentary Studio for commentary in up to seven languages. A standard world feed with graphics, presenters, ambient sound, and multilingual commentary is published to ran.de, while an equivalent signal featuring ELF, Czech, or Polish presenters and virtual sponsor advertising goes out through the ELF Platform and a Czech or Polish broadcaster, depending on which team is playing.
“What the European League of Football has achieved in just three weeks with AWS and Vizrt is incredible, and a testament to the power of the cloud,” said Marc Aldrich, General Manager, Global Media & Entertainment at AWS. “The fan experience is central to any sports production, and by moving its production pipeline into the cloud, the European League of Football will be able to create more memorable match broadcasts that let fans dive deeper into the game. We can’t wait to see how this architecture continues to evolve as well as what it enables.”
Scoring a touchdown for sustainability
Speaking of the ELF’s decision to produce the entire season in the cloud, Zeljko Karajica, CEO of European League of Football commented that it’s “helping us reduce our carbon footprint, as we can significantly pare back the volume of travel and outside broadcast (OB) truck equipment that productions of this caliber typically require. This means that directors, graphics designers, operators, sound engineers, and other technicians no longer need to travel to the game venue. With this, we estimate that we can achieve a reduction of over 300 tons of CO2 emissions this season. We are delighted to be pioneers in Europe by implementing this innovative production technology.”
SpinVFX generates colorful crowds for ‘Big George Foreman’ with DaVinci Resolve Studio
Led by VFX Supervisor Andrew McPhillips, the SpinVFX team helped create period-appropriate VFX for each of Foreman’s notable boxing matches in the film, and said that DaVinci Resolve Studio’s ease of use allowed the team to dive in quickly and focus on the creative rather than getting bogged down in the technical aspects
Toronto-based visual effects (VFX) house SpinVFX helped create the ringside VFX for Foreman’s many pivotal boxing matches, starting in the 1960s and continuing through the 1990s, when Foreman was crowned the oldest heavyweight champion in the world.
McPhillips explained, “Since the film is based on a true story, we were able to reference the actual crowds and rings from each of these matches. Everything began in preproduction with Director George Tillman Jr.’s detailed color script and ran through post with us using different looks and color palettes to tell the distinct story for each of the fights.”
The SpinVFX team worked on 10 fights for the film, with each having a distinct look inside and outside of the ring. For example, when creating cheering crowds for Foreman’s first fights in the 1960s, SpinVFX used more of a harmonious palette of browns and blues. Then, for The Rumble in the Jungle fight vs. Muhammad
Ali in the mid 1970s, things became a bit more garish and colorful. Along with these looks reflecting the actual crowds and fashions of the time, they also mirrored Foreman’s emotions and life outside the ring to aid in the film’s storytelling.
“The costume department spent a lot of time getting things historically accurate and following the color script. We took photogrammetry scans of the crowd during production and used them as the base for our background shots, which we then keyed and brought into DaVinci Resolve Studio for the look development phase,” noted McPhillips.
“DaVinci Resolve Studio really let us try out different looks and brainstorm the approach we wanted to take for each fight. Before we began designing the VFX lighting and backgrounds, we played around with different color stories that enhanced the work already done by the costume department and set
design. We used Resolve to build themes for each of the crowd scenes,” McPhillips said. “With each fight having its own defined look for the VFX, we were able to play with the nuances of color within the historical palettes. Since the film is based on a true story, it was really cool to be able to look back at photos of the actual fights and build the story from there, whether it was emphasizing bright, happy colors for the Moorer fight in the 1990s or more of a monochrome look with occasional hits of bright yellow, blue and pastels for the gold medal Olympics fight vs. Čepulis in 1968.”
McPhillips noted that DaVinci Resolve Studio’s ease of use allowed them to quickly dive into things so they could focus on the creative and not get bogged down with the technical aspects. “DaVinci Resolve Studio was crucial during the concept phase. We used the color wheels to easily try out different palettes, which allowed us to flex our creative muscles,” he added.
ETC fixtures to enhance Greek TV show ‘The Voice’, among others
Greek rental company Papadimas AVL S.M.P.C., which is known for popular TV shows ‘The Voice’ and ‘Just the 2 of Us’, has recently chosen ETC’s automated luminaires and entertainment fixtures to light up its productions; Papadimas is also known for its recent production of the hit musical ‘Phantom of the Opera’ in Athens and Thessaloniki
Approximately 200 High End Systems moving lights feature across the productions, including Lonestar, SolaPix, SolaFrame and SolaSpot automated luminaires
ETC’s economical Lonestar fixtures made their Greek TV debut on talent show ‘Just the 2 of Us’, bringing their professional feature set and quality output all wrapped up in a small package. They joined over a hundred other High End Systems moving lights as well as 50 Source Four LED Series 3 Daylight HDR profiles, all controlled by a mix of Hog 4 and Eos consoles. Series 3 continues to make its way on-camera around the world, with its patented X8 array giving great control over how skin tones and objects render on screen.
SolaPix 19 luminaires returned to world-famous reality competition show ‘The Voice’ this year, working alongside High End Systems SolaSpots and almost one hundred Source Four LED fixtures. The unique face-look of SolaPix, with its HaloGraphic Pixel Definition and innovative
FleX Effects engine, is a big hit on-camera, but the fixture is also capable of incredibly smooth washes, adding even more versatility to the rig.
Papadimas AVL S.M.P.C established its presence in the broadcast market in Greece through working on such shows and supplying ETC products. “We met the demand for high quality equipment and services on the ‘The Voice’, ‘Just the 2 of Us’ and the ‘Phantom of the Opera’ with ETC gear, which has also set the bar high for our competitors. The certified quality of ETC products combined with the large quantities we own have helped us create a fleet that provides large results, with consistency and competitive prices,” comments General Manager at Papadimas AVL S.M.P.C, Panagiotis Papadimas.
The live productions of ‘Phantom of the Opera’ used a wide range of High End Systems spots and wash-lights alongside industry standard Source Four LED Series 2 Lustr profiles to bring the story
to life. Released in 2014, Series 2 continues to earn its keep in hire stocks around the world, with venues and rental companies continuing to invest in its reliability and great color control, all backed up by ETC’s warranty and support.
ETC Regional Sales Manager, Konstantinos Vonofakidis commented: “I would like to thank Papadimas AVL for their continued trust in our products and services even during the challenging times of the pandemic. This has been a great collaboration that has been forged through the unparalleled support of our local dealer – Audio & Vision. The cooperation between Papadimas AVL and Audio & Vision has had a significantly positive impact on the TV and live events industry in Greece. The overall quality of lighting has improved with many productions being upgraded with ETC equipment, and DOPs and LDs continue to create unforgettable lighting designs with our products.”
NEP together with WBS delivers expanded UHD coverage for The Championships 2023
NEP and Wimbledon Broadcast Services (WBS) are delivering The Championships 2023 with expanded Ultra-High-Definition coverage; to produce this ongoing event, NEP Group’s global production ecosystem provides specialty cameras and connectivity, MAM and live display solutions supporting the world feed
NEP UK and Ireland, a provider of broadcast solutions and part of the NEP Group worldwide network, announced recently that Wimbledon 2023 will feature more Ultra-High-Definition and High Dynamic Range (UHD-HDR) broadcast delivery options than ever before, using NEP’s technology. By utilising NEP’s industry-leading IP technology, broadcast rights holders around the world can now showcase every match from Centre Court and No.1 Court in UHD-HDR with immersive 5.1 sound. The 16 other courts will continue to be available in either 1080p HDR or 1080p SDR.
Initial planning for the expanded offering began more than a year ago with NEP UK’s Technical Projects & Engineering teams working closely with Wimbledon’s Broadcast Technical Manager, James Muir, to design a custom-built workflow, which underwent real-world testing in March. The new workflow includes additional UHD-HDR feeds while continuing to deliver the widely used HD SDR feeds to rights holders.
“We’re incredibly proud of our successful partnership with Wimbledon Broadcast Services and we’re excited to continue to deliver the very best broadcast
technology, solutions and resources to their teams as they bring The Championships 2023 to life,” said Sam Broadfoot, Technical Project Manager, NEP UK. “Together, we’re moving the technology forward, now covering all of the action from the two most visible courts in Ultra-High-Definition and High Dynamic Range, giving rights holders around the world more opportunities to bring additional high quality viewing experiences to their global audiences.”
Broadfoot adds: “With the need to continue to offer rights holders HD SDR feeds as we have in previous years, we’re now using 224 channels of conversion, and we’ve found that the quality of the converted SDR feeds have improved as well, since it is now being captured with a higher dynamic range. We’re very pleased with the solution our teams have been working on to support WBS and The Championships.”
Specialty Cameras, Connectivity, Media Asset Management and Live Display
The increase in UHD-HDR feeds is only part of NEP Group’s full global production ecosystem in play at The Championships.
NEP’s full suite of solutions includes broadcast facilities and OB trucks, specialty camera systems, connectivity, media asset management, live display and other broadcast services supporting the world feed.
Fletcher, an NEP specialty capture division, is again providing its AI-driven automatic camera
system, called Tr-Ace, capturing action on seven of the 18 courts in 1080p SDR. Fletcher’s Tr-Ace cameras use image recognition and LIDAR technologies to automatically track players on the court, meaning just one operator can control and manage the full camera system.
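As a rough sketch of why tracking data makes single-operator supervision possible (this is not Fletcher’s Tr-Ace algorithm), once detection or LIDAR reports a player’s position on court, pointing a robotic head at that position is simple geometry, and the operator only needs to supervise rather than drive every camera by hand.

# Minimal sketch (not Fletcher's Tr-Ace algorithm): given a tracked player
# position on the court, compute the pan/tilt a robotic head needs to keep
# the player framed; the operator supervises instead of driving each camera.
import math

CAMERA_POS = (0.0, -15.0, 6.0)   # camera x, y and height in metres (court origin)

def pan_tilt_to(target_xyz):
    """Pan and tilt angles (degrees) that point the camera at a tracked position."""
    dx = target_xyz[0] - CAMERA_POS[0]
    dy = target_xyz[1] - CAMERA_POS[1]
    dz = target_xyz[2] - CAMERA_POS[2]
    pan = math.degrees(math.atan2(dx, dy))                    # left/right
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # up/down
    return pan, tilt

# Player tracked near the baseline, framed on a 1.7 m reference height:
print(pan_tilt_to((4.0, 11.0, 1.7)))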
Mediabank, NEP’s award-winning media asset management solution, will be utilised for remote access to match highlights and other content to be ingested, managed and distributed for rights holders.
NEP Connect, the leading provider of media connectivity in the UK, is providing a 10G link to Oslo from IMG, with further
support from NEP Netherlands, which is supplying 1PB of onsite storage. Additional broadcast services from NEP include 36 EVS VIA machines, 58 host Sony cameras, 28 specialty cameras, 150 talkback panels and over 90 km of cable installed each year. More than 300 broadcast engineers, technicians and crew members are onsite supporting the host broadcast and other rights holders.
Creative Technology, an NEP Group Live Events company and trusted service provider to Wimbledon Broadcast Services, is delivering radio talkback and microphone facilities to ensure seamless capture by hosts, rights holders and other key personnel.
CT’s support of All England Club / Wimbledon Broadcast Services extends beyond broadcast solutions, having their visual display service recently renewed for a four-year period, which includes LED scoreboards, match information display screens and the Aorangi Terrace (Henman Hill) screen and content production. In addition, CT is managing the technical aspects of the All England Club IPTV network. This entails overseeing the transmission and management of 119 channels of television content across more than 2,000 televisions around the Club, ensuring seamless delivery of camera feeds to the court commentary positions.
Media Mania integrates Lawo IP solutions into its latest UHD OB trucks
This technological optimization aims to energize Media Mania’s development in the Gulf and Middle East regions, meeting a diverse range of production needs.
The UAE-based production company specializing in OB van construction and broadcast services, Media Mania, has recently announced its decision to use the Lawo mc²56 MkIII audio production console, accompanied by Lawo stageboxes, in its latest ultra-high-definition (UHD) outside broadcast (OB) truck.

Under the leadership of Roland Daou, Founder and CEO of Media Mania, the company prides itself on being one of the few entities in the MENA and Gulf states region that possesses the capabilities to design, coach-build, install, and operate their own OB vans. “We are committed to serving the broadcast and streaming sectors in the Gulf and MENA region, be it TV stations or private clients,” stated Daou.

Recognizing the increasing prominence of the region in hosting high-end sports events and attracting high-profile clients, Media Mania proactively embraces technological advancements. In anticipation of the imminent transition to 4K high dynamic range (HDR) standards in the production world, they have developed and deployed 4K HDR OB vans and 4K portable production units (PPU). Emphasizing the essential attributes of stability, redundancy, and reliability in OB van equipment, Daou stressed the criticality of selecting reputable equipment manufacturers. For the audio component, Lawo was, for him, an obvious and ideal choice.
Waddah Thabit, the Chief Technology Officer at Media Mania, expressed excitement about the advanced workflows offered by the Lawo mc²56 audio production console. Thabit noted, “The mc²56 boasts an extensive array of features meticulously designed to enhance workflow efficiency, increase productivity, and provide ergonomic and intuitive operation. Users can confidently operate the console even in high-pressure situations.” The Lawo mc²56 MkIII console provides sound engineers with the option for two-person operation, featuring decentralized control of all parameters, including bank and layer selection, EQ operation, and dynamics control. Moreover, the console’s touch operation optimizes operating sequences for VCA and bus assignment, meter pickup/mode selection, and N-1 configuration. Advanced functionality is further exemplified by the “Button Glow” feature, facilitating channel strip color coding, and backlit rotary encoders, ensuring enhanced visibility and user guidance even in low-light environments.
The mc²56 console’s monitoring and metering capabilities have garnered praise from users, with the console providing permanent metering of the central 16 faders. Integrated local I/Os provide abundant connectivity. To further enhance workflow efficiency, users can seamlessly connect required third-party devices directly to the console whenever monitoring,
metering, headphones, or command inputs are necessary. Additionally, the newly integrated Reveal Panel within the console’s overbridge streamlines tasks such as identifying surround channels assigned to a fader in a 5.1 surround mix.
For optimized performance within IP video production environments, there is full support for native ST2110, AES67, RAVENNA and DANTE, while Lawo’s revolutionary LiveView™ feature enables thumbnail previews of video streams directly in the fader labeling displays.
Great performance in networking applications has been taken to the next level with the addition of capabilities such as IP Easy™ simplified network setup and DSCA™ Dynamic Surface to Core Allocation. All of this, and more, reinforces this console’s place as the number one choice within complex IP-based production infrastructures.
Daou emphasizes the profound impact of this state-of-the-art technology in their OB vans, stating, “With our advanced technology, we not only cater to the discerning local market that demands top-notch quality but also provide flagship results for clients worldwide in their production endeavors.” Media Mania’s enduring loyalty to Lawo over the years – based on their experience with mc² audio systems, AV devices and VSM broadcast control – and their commitment to continuing this partnership in the future is a testament to the shared pursuit of excellence.
Hiltron completes satellite TV uplink for German public broadcaster
Hiltron Communications recently announced the completion of a satellite television uplink for one of Germany’s largest public service broadcasters. The company was selected to carry out the project for its long experience as a globally active satcom system integrator, manufacturer and distributor: it has decades of experience in the development and production of tracking systems for fixed and mobile antennas, with hundreds of installed systems around the world tracking LEO, GEO, MEO and inclined-orbit satellites.
The project encompassed the design, planning, installation and commissioning of a dual redundant HPA system feeding a 3.7 metre TV uplink antenna. Each system (one for horizontal and one for vertical polarisation) is based on two 750 watt traveling-wave tube amplifiers in a 1:1 redundant configuration. An integral phase combiner allows the active amplifiers to transmit in tandem if higher output power is required. Supervision and operation are performed via a Hiltron HCS-4 satellite communications controller.
A very compact, fully air-conditioned shelter was also provided by Hiltron to ensure a high level of weather protection. The shelter is equipped with air intake and exhaust ducts which can be used to perform forced-air cooling of the amplifiers. The cooling system is itself configured with main-plus-secondary protection. Motorised vents allow warm exhaust air to be directed into the shelter to heat it in winter. Temperature control within the shelter is supervised and monitored via the HCS-4.
The Hiltron HCS-4 forms the central control element for a wide range of satcom applications. These include easy switchover between devices such as downconverters, high power amplifiers, waveguides, MPEG digital video broadcast encoders and integrated receiver/decoders. The HCS-4 can also be used to control and monitor optical-fibre transceivers and antenna heating panels. Other features include an N+1 redundancy switch and a BUC/HPA controller, LNB supply, LNB redundancy switch control, a fully redundant low-noise 10 MHz reference generator, GPS
synchronisation and automatic switchover.
The HCS-4 occupies a 2U-high 19-inch rack mount unit with main and backup power supplies plus 13 slots for active modules. Also available are two chassis-mountable frames accommodating up to six or up to 14 active modules respectively. All three versions can be powered from 24 volts DC and are operated via an HTML-based graphic interface. Modules are hot-pluggable to allow on-air exchange. Any new or replaced module is automatically sensed and its address registered. All units in the HCS-4 series have SNMP remote control, hot-swappable dual redundant power supply and an internal data bus. When configured as an antenna controller, the HCS-4 accommodates a freely selectable combination of axis controller cards for SSI encoders, resolvers or potentiometer-angle readers as well as an optional integrated antenna de-icing controller card.
ARES Fighting Championship broadcasts to be enhanced by BOOST Graphics
Boost Graphics, a subsidiary of EMG Group, the international broadcast solutions and services provider, has been chosen to deliver graphics for the broadcast of the ARES Fighting Championship. The company was appointed to provide a range of graphics capabilities and integrations for this and the following four seasons, including a dynamic design refresh for the ambitious Mixed Martial Arts (MMA) league.

Since its launch in December 2019, the ARES Fighting Championship has brought together some of the best French and international fighters in a series of events in North Africa and in France. It was set up by Fernand Lopez, head of MMA Factory and Sports Director of ARES Fighting Championship, who has long held the ambition to widen both the talent pool of athletes and the sport’s viewing audience. The first French MMA league is the only one to be currently broadcast on UFC Pass, and has also closed a long-term rights deal with the French pay-television broadcaster Canal+.

Integral to the presentation of the ARES Fighting Championship matches is Boost Graphics’ suite of technology and technical and creative expertise. This includes a refresh of the motion graphics design for the broadcasts and integration with social media for inclusion in the live coverage.

Matthieu Skrzypniak, CTO of Boost Graphics, explains, “We are excited to work with ARES Fighting Championship to showcase all the power, action and athleticism of mixed martial arts to fans and new audiences alike. We will provide a dedicated interface to the editorial crew which will enable them to ingest tweets related to ARES Fighting Championship and to select the best ones to be displayed as graphics on screen. Boost will also deliver a dedicated interface for their statistics experts to allow them to quickly and easily select and prepare relevant data to be displayed as graphics during live broadcast.”
The Boost Graphics platform is also connected directly with the event’s official timing system to extract precise clock information and refereeing decisions. In
addition, the company has developed a bespoke tool for ARES to manage all graphical elements including clock, fighter data and statistics, VIP names, fight results, and match ID. The next event, ARES FC 16, will take place on June 23 in the Dome De Paris in Paris, France.
Dalet joins AWS ISV Accelerate Program and completes AWS Foundational Technical Review for Dalet Flex
In order to boost global delivery of media workflow solutions in the cloud, the technology company strengthens its relationship with AWS
Dalet recently announced it has completed the AWS Foundational Technical Review (FTR) for Dalet Flex, its cloud-native media logistics platform, and has joined the Amazon Web Services (AWS) Independent Software Vendor (ISV) Accelerate Program. These actions reinforce Dalet’s global relationship with AWS and accelerate servicing customers who seek to run cloud-native media operations with high performance and security standards.
In order to successfully complete the AWS FTR, AWS Partners must adopt specific best practices around security, reliability and operational excellence as defined by the AWS Well-Architected Framework. Dalet’s focus on security and long-standing ISO/IEC 27001 certification, combined with a mature global cloud operations team, can keep customers’ data secure and operations online when migrating to the cloud.
As a member of the AWS ISV Accelerate Program, a co-sell program for AWS Partners that provide software solutions running on, or integrating with, AWS, Dalet meets AWS best practices and standards that are designed to produce better customer outcomes.
“The attractiveness of a cloud-native platform like Dalet Flex is the scalability and elasticity achieved running on AWS. As your workload shifts and evolves, you can rightsize your operations on the fly,” comments Lincoln Spiteri, CTO, Dalet. “With our completion of the AWS Foundational Technical Review and being part of the AWS ISV Accelerate Program, our customers can be confident that Dalet adheres to the AWS Well-Architected Framework, providing a seamless customer cloud experience, wherever they are in their journey.”
Dalet Flex powers collaborative workflows for a broad range of media-centric organizations, including customers Fox Sports Australia, STARZ, Sony Pictures International, and Peloton. With
Dalet Flex, customers benefit from a journey to the cloud with flexible SaaS deployment options, which include hybrid and fully cloud-hosted and operated.
Dalet Flex offers customers elasticity and scalability to cover a wide range of workflows including:
Modern Archive Management and Monetization – Cost-effectively and quickly migrate aging archives into a modern content library for collaborative use and monetization opportunities.
Powerful Media Supply Chain and Distribution – Orchestrate, automate, scale and analyze content packaging and global distribution.
Streamlined Production Asset Management – Integrate and orchestrate production and media asset management to eliminate production silos and processing friction that bogs down collaboration.
“Let’s Make it Real”, Ross Video’s new brand platform and a first step toward a planned IPO
Ross Video announced the launch of its new brand platform, “Let’s Make it Real”. The visually dynamic platform represents a significant milestone in the company’s evolution; it’s an especially important step towards a planned IPO in the next few years to communicate that customers remain the central focus for the company.
The brand platform serves as an invitation to customers, partners, and industry professionals to challenge conventional thinking, imagine the extraordinary, and collaborate with Ross to bring their creative visions to life. This launch signifies an exciting chapter for Ross Video and sets the stage for a future of innovation, partnership, and ground-breaking achievements in the world of live video experiences.
“Offering customers a superior experience has been at the core of Ross since the founding of the company. Turning customer ideas into reality is literally what we do and always have done,”
said David Ross, CEO of Ross Video. “That’s why we’re so excited about this new brand platform and what it holds for our future; it’s authentic to who we are and the partnership dynamic we hold so dear. With the planned IPO coming up, we feel it’s important to reemphasize our commitment to customer success.”
As a company with longstanding presence in the broadcast technology market, Ross has maintained a particular perspective; this allows customers to blue-sky based on their unique circumstances and trust that Ross’ production universe will allow their abstract ideas to take form.
“I think of it as more than a brand platform,” said David Ross. “It’s an invitation and a rallying cry to customers, partners, future employees, and everyone in the industry to challenge ourselves. Let’s imagine what could be. Let’s figure it out. Let’s make it real.”
“Our technology is in more settings and hands than ever,
and our customers are using it to design stunning productions on a daily basis,” said Jeff Moore, EVP and CMO of Ross Video. “This new brand platform welcomes our customers’ visionary thinking and puts them in the driver’s seat so that they can dream up creative solutions alongside us. The vibrance and energy of the new visual system is a significant evolution for Ross and a massive step in conveying who we are to this broader audience. It’s also fitting that this great work was a team effort: external partners brought fresh eyes, complemented by Ross staff who know and own the brand.”
Early pieces of “Let’s Make it Real” were teased out and received with interest at NAB in Las Vegas this past spring. With the official launch, the platform will anchor all marketing and communication efforts, including a revamped website. It is designed to communicate how Ross has become an important actor in the industry while remaining laser-focused on customer success.
Ninth Heaven Technology to distribute NewTek in China
The two companies sign a distribution alliance as NewTek aims to satisfy the growing demand for quality live streaming and live production technology across the Asian region
By appointing Ninth Heaven Technology, NewTek ensures a wider distribution network and increased accessibility of its products in China. This move will facilitate the growth of reseller partnerships and provide customers with more options to leverage NewTek’s industry-leading solutions for their video production workflows.
NewTek, a company specialized in IP-based video technology and part of Vizrt Group, has recently announced that Ninth Heaven Technology will be in charge of distributing its products in China. Expanding NewTek’s presence in the growing Chinese proAV markets, including Enterprise, Education, Government, Esports, and more, is the main motive for this significant step of establishing a strategic partnership with Ninth Heaven Technology. The company identified the potential and importance of these markets and is dedicated
to meeting the increasing demand for innovative video production solutions.
Paul Dobbs, Head of Channel Sales, APAC, Vizrt Group, expressed his enthusiasm about the collaboration, stating, “Ninth Heaven Technology has extensive experience and expertise in the proAV markets, and coupled with their strong network and dedication to customer satisfaction, they are an ideal partner for us. We are confident that this collaboration will expand our reach to new heights in the Chinese market.”
“We are thrilled to embark on this partnership with NewTek,” said Song Wei, the general manager for Ninth Heaven Technology Co LTD. “As a leading distributor in China, we are constantly seeking state-of-the-art solutions to meet the increasing demand for high-quality live streaming and live production. Collaborating with NewTek expands our product portfolio and offers our customers innovative solutions that empower them to achieve their creative visions. We are excited about the opportunities this partnership brings and look forward to delivering exceptional value to our customers in the Chinese market.”
Gravity Media’s production centre in London has new General Manager: Beth Lowney
Gravity Media has recently announced that Beth Lowney is the new General Manager of its brand-new Production Centre in London (White City), opened just nine months ago, in September 2022
Beth Lowney joined Gravity Media in 2022 after working at the International Tennis Federation. She arrived with proven success in event servicing, broadcast production, sales and distribution, and the ability to deliver excellent client service. With over eight years of experience working across major global sporting events, Lowney joined Gravity Media as Business Development Manager – Outside Broadcasts & Projects.
As Broadcast and Media Manager at the ITF, she managed the team to deliver broadcast production at all major events including the Davis Cup & Billie Jean King Cup Finals, alongside handling worldwide distribution. As Event Manager, she worked closely with National Federations and Sponsors to achieve a professional cohesive package for all stakeholders, also servicing Davis Cup and Fed Cup ties at an operational level to ensure compliance with ITF standards and official regulations.
In her new role as General Manager, Beth will oversee operations at the next-generation Production Centre, act as the main point of contact for partners including ATP Media and Formula E and set the tone for the client experience at
WestWorks. Beth will continue to be a key member of Gravity Media’s Business Development team and focus on delivering a strong pipeline of new opportunities at the facility.
Gravity Media detailed its latest expansion at IBC 2022, where the 50,000-square-foot facility features best-in-class technology to support both on-premise and distributed remote production workflows. The state-of-the-art facility is based around a fully fledged 2110 IP media fabric with dedicated master control rooms, six dedicated production control rooms with dedicated audio control rooms, seven flexi control rooms, multiple off-tube commentary booths, two studios, lighting and vision control facilities, fast turnaround and craft edit, flexi desk production spaces, media management and
client desking.
Working closely with the Business Development, Engineering, Production and Front of House teams, Beth will help drive Gravity Media’s strategy for growth across the EMEA region, ensuring the facility is intrinsically linked to end-to-end projects and remote production offerings.
Ed Tischler, Gravity Media’s Managing Director says: “We are delighted to announce Beth as the General Manager of our second Production Centre in London. Her extensive experience in the industry and valuable knowledge of Remote Production make her an invaluable asset to our business. We believe that her unique perspectives will further strengthen our ability to deliver exceptional results and maintain our position as a leader in the industry.”
Beth Lowney commented on her new role: “I am thrilled to continue my career at Gravity Media, as the launch of Gravity Media’s Production Centre in White City has marked a pivotal moment. I look forward to bringing my expertise and contributing, alongside the talented team, to the continued success and growth of the company.”
How was Singular.live born?
Singular was built from a vision to create a new graphics platform that could revolutionise live graphic overlays. Traditional systems are very powerful but their underlying technology pre-dates the internet and smartphones. Content creators need different tools that take advantage of digital technologies to help them offer more flexible, scalable and cost-effective solutions for enhancing their content and viewer engagement.
What is Singular.live’s approach to remote production and how does it contribute to streamlining the broadcasting workflow?
Singular is a browser-based platform, so there is no dedicated graphics hardware and there are no downloads required. Since everything in Singular can be done via a web browser, including the creation, preparation and operation of graphics, the platform can be used by anyone from anywhere with a basic computer or tablet and an internet connection. As a result, we are able to support remote production natively. We also have fully documented APIs and SDKs so that clients can easily integrate Singular into
their own control interfaces or systems giving maximum flexibility. Additionally, we are already integrated with the leading production platforms giving clients the widest choice of workflows without compromise.
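As a purely hypothetical sketch of that kind of integration (the URL and payload fields below are placeholders, not Singular’s documented API), a client’s own scoring or control system only needs to make an authenticated HTTP request to update an overlay that every output is already rendering in a browser.

# Hypothetical sketch of driving a browser-rendered overlay from a custom
# control system over HTTP. The endpoint and payload fields are placeholders,
# not Singular's documented API; the point is that a web-native graphics
# platform can be controlled from anywhere with a simple authenticated request.
import json
import urllib.request

CONTROL_URL = "https://example.invalid/control/apps/MY_APP/overlays/scorebug"
API_TOKEN = "REPLACE_ME"   # credential issued by the graphics platform

def update_overlay(payload: dict) -> int:
    """Push new field values to the overlay; returns the HTTP status code."""
    req = urllib.request.Request(
        CONTROL_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + API_TOKEN},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example call (requires a real endpoint and token):
# update_overlay({"homeScore": 2, "awayScore": 1, "clock": "67:12"})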
How does Singular.live’s platform facilitate real-time graphics and overlays in remote production scenarios?
Singular’s web-based nature allows full operation from anywhere in the world, in real time. Not only does this enable graphic designers, operators and producers to work remotely from anywhere in the world, it also offers exceptional redundancy. If operators face issues either with their connectivity or computer, anyone else on the team can login and pick up operations from wherever they may be. It also makes it simple to share work and collaborate in the building phase and preparation of graphics. For example, a show producer can easily prepare their graphics from their home, hotel or even while travelling to the venue making it an incredibly flexible solution.
Can you share some examples of successful remote production projects where Singular.live’s technology played a key role?
One of the most ambitious remote production projects that we have helped deliver was with our partners Reality Check Solutions for their client Red Bull Media. This was for a Red Bull surfing event happening off the extremely remote southern coast of Tasmania. In order to minimise the environmental impact of their production, Red Bull Media wanted to send only a skeleton crew. As a result, the video signals were sent from Tasmania back to their production hub in Santa
Monica where the Singular graphics were added before distribution.
More recently, Singular has been used in remote productions for the SPFL with partners QTV who do the matchday productions including Singular graphics from their remote hub in Glasgow.
We have also just launched a new project with partners Photron in Japan that delivers live Singular graphics, with
data integration, for a major domestic sport league from a central production hub in Tokyo.
What are the key features of Singular.live’s platform that make it a preferred choice for remote production teams?
One of the main features is accessibility; the fact that Singular is all browser-based makes it quick and simple to access all features of the platform. Secondly, the SaaS
and cloud-native nature means that clients can scale up and down as they require ensuring they are not left with redundant hardware after they have had to upscale for a specific event. For example, football leagues will often have a week or two (typically at the end of the season) when all matches are concurrent. With Singular, you can simply scale up for those match days and then down again for the regular schedule.
Finally, Singular is the only live graphics platform that has Albert certification for sustainability. In very simple terms, using Singular can help any live production reduce its environmental impact through reducing both the required hardware and the shipping and transportation
for the project. Unlike virtualised cloud graphic solutions, Singular is cloud native meaning we use elastic compute resources rather than dedicated hardware, which is a far more environmentally friendly approach.
How does Singular.live ensure a seamless integration with existing production setups and equipment?
Our partner network has over 50 technology partners and continues to grow as we integrate with more partners. In addition, since the output from Singular is a URL, any production equipment that can take a browser as a source is automatically compatible with us. We also
provide solutions for Singular to fit seamlessly into SDI and NDI workflows. With our decades of experience working in live production and specifically graphics we know the importance of frictionless compatibility which is why we continue to focus on our growing partner network. We have also released a new updated version of our API with full documentation and a dedicated developer portal to help anyone looking to integrate with Singular.
What are the benefits of using cloud-based solutions for remote production, and how does Singular.live leverage this technology?
Cloud-based solutions enable remote production that in
turn helps reduce costs and the environmental impact of live productions. Cloud-native solutions go a significant step further by removing the need (and sometimes logistical challenge) of having to provision dedicated hardware.
As a cloud native platform, Singular offers the best option for remote production whilst also delivering the best solution for scalability, accessibility and sustainability. It does this without any
compromise and while offering next-gen features like localisation, personalisation and enhanced engagement through interactivity.
How does Singular.live address the challenges of latency and bandwidth limitations in remote production environments?
Singular graphics can be time stamped to ensure that they trigger at the desired time. This is essential in live
production and sport in particular since nobody wants to see a score graphic update while their video is buffering and they are yet to see the goal. Embedding a timestamp for the graphics into the video makes sure that, if a viewer’s video is delayed or buffering, the graphic will not display until the correct point in the video.
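The principle can be sketched in a few lines; the field names are illustrative rather than Singular’s actual trigger format, but they show why a buffering viewer never sees a graphic arrive early.

# Sketch of the timestamping idea (illustrative field names): each graphics
# trigger carries the video timestamp at which it becomes valid, and the
# player only reveals it once playback reaches that point, so a delayed or
# buffering viewer never sees the score change before the goal.
from dataclasses import dataclass

@dataclass
class GraphicTrigger:
    payload: dict        # e.g. {"homeScore": 1, "awayScore": 0}
    video_time: float    # media timestamp (seconds) the trigger is tied to

pending = [GraphicTrigger({"homeScore": 1, "awayScore": 0}, video_time=1834.2)]

def due_triggers(playback_position: float):
    """Return triggers whose timestamp this viewer's playback has reached."""
    ready = [t for t in pending if t.video_time <= playback_position]
    for t in ready:
        pending.remove(t)
    return ready

# A viewer watching live at 1835.0 s sees the score update now; a viewer
# buffering at 1820.5 s sees nothing until playback catches up.
print(due_triggers(1835.0))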
In addition, our output is in HTML which has a very small footprint and so requires
a stable but very small operational bandwidth. This helps ensure that we do not add any additional delays into the workflow. We have also recently successfully
completed a new integration with our partners Videon on their cloud encode stack, which means clients can now add Singular graphics at the point of encode, further
Producing a Million Dollar Event: The 64 Matches of The Soccer Tournament (TST)
Tupelo Honey, a full-service production company with over 25 years of experience in sports, music and entertainment, recently faced an exciting challenge when tasked with creating broadcast graphics for The Soccer Tournament—a completely new ground-breaking 64-match event over five days, spread out across five fields.
With a tight six-week turnaround, Tupelo Honey sought to incorporate live game graphics, upcoming match details, live scores from multiple fields, and previous game results.
The Soccer Tournament presented Tupelo Honey with a unique opportunity to create broadcast graphics for this first of its kind event: the objective was to seamlessly integrate information from multiple fields, allowing operators to display up-to-the-minute scores from different locations.
reducing any potential delays in their production workflow.
What role does automation play in remote production, and how does Singular. live’s platform support automated workflows?
Many remote productions harness some form of automation either through data or pre-prepared playlists. Singular has robust and varied data integration solutions, and our fully documented APIs can
Tupelo Honey sought a solution that could condense a vast amount of information into a visually captivating and easily digestible format for their audience. Singular’s flexibility allowed them to leverage an existing package and tailor it to their specific requirements within the tight turnaround time. With numerous stakeholders involved, including Tupelo Honey, The Soccer Tournament, and NBC/Peacock, Singular delivered a final product that exceeded expectations and satisfied all parties involved.
also integrate with automation solutions. We have several technology partners who provide automation services that are integrated with Singular making it easy to automate workflows.
TV2 in Norway has a great hybrid example where they created a bespoke workflow for their ice hockey production. The Singular graphics are all integrated with data, and the operator can choose which graphics they want to automatically play out on air (goal and penalty graphics, for example) and which graphics they want the system to prompt them with, so that the operator can then decide if they wish to take the graphic on air. It makes for a really efficient graphics operation workflow for a very fast-moving sport with a lot of possible graphics.
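A minimal sketch of what such a hybrid rule set could look like is shown below; the event names and rules are chosen purely for illustration and are not TV2's or Singular's actual configuration.

# Hedged sketch of a hybrid automation rule set: some data-driven events go
# to air automatically, others only raise a prompt for the operator.
AUTO_PLAY = {"goal", "penalty"}                 # taken to air without confirmation
PROMPT_ONLY = {"shot_on_goal", "faceoff_win"}   # operator decides

def handle_event(event_type: str, take_to_air, prompt_operator) -> None:
    """Route an incoming data event to automatic playout or an operator prompt."""
    if event_type in AUTO_PLAY:
        take_to_air(event_type)
    elif event_type in PROMPT_ONLY:
        prompt_operator(event_type)
    # anything else is ignored by the automation layer

handle_event("goal", take_to_air=print, prompt_operator=print)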
Can you discuss the scalability and flexibility of Singular.live’s platform when it comes to managing multiple remote productions simultaneously?
In 2022, we had 10 million hours of output with Singular graphics. We have clients and partners producing literally thousands of hours of live productions and outputs every month. This is growing
as we work with more FAST channel providers. Scalability is one of the key benefits of Singular being a cloud-native platform rather than hardware-based or virtualised. We use elastic compute resources, so there is never an issue finding resources, unlike systems that need to provision dedicated hardware in the cloud.
As a SaaS platform, clients can increase their outputs on demand and drop down when not needed. With Singular, when a production is finished, the operator simply shuts their browser and walks away. There’s no de-rigging or closing down server instances.
How does Singular.live ensure data security and protect against potential cyber threats in remote production workflows?
In addition to offering industry best practice solutions like SSO, we also conduct regular external cyber threat reviews. The most recent of these found no vulnerabilities. However, our team continually works to manage and improve our security. We fully understand the importance of security for our clients, many of whom have made significant investments in the rights to the content that they are producing.
In terms of user experience, what kind of training and support does Singular.live provide to production teams adopting their platform for remote production?
For anyone who signs up to Singular, we have a comprehensive set of video tutorials and previous webinars all available on our YouTube channel. These cover basic guides on how to get started all the way through to more technical and expert videos. We also offer monthly live webinars on specific topics that are available to anyone to join, with each session including a Q&A section. We also have a dedicated support portal that is monitored 24/7 and accessible either from within Singular or directly from the support website. This provides a highly effective way for people to get answers to any questions they may have or help with any challenges they encounter.
For our Enterprise customers, we also offer dedicated support that includes free training workshops for both designers and developers and a dedicated Slack channel for any specific questions. We also have a customer Slack channel where people post questions and our community
and our support team answer. Our certified partners can get direct access to our development team to help them with their integrations. This has proven very helpful, especially in sharing some of the knowledge gained through our own experiences of integrating video players and updating CEF versions, for example.
What do you see as the future trends and advancements in remote production, and how is Singular.live prepared to adapt and innovate in this evolving landscape?
Singular was purpose-built for remote production by virtue of not requiring any dedicated hardware. As a platform we
are built using standard web protocols which helps both us, as web technologies continue to evolve, but also helps our customers. Finding developers and staff with experience in web technologies is much easier than trying to find and recruit specialist, experienced graphic systems developers and designers.
As technologies like 5G and NDI continue to mature and evolve, and more broadcast technology moves to cloud native solutions, remote production will become even easier and more prevalent. This will also allow more customers to take advantage of some of the more advanced features of Singular such as adaptive overlays and interactivity through the use
of our Intelligent Overlays. Production technology has evolved very quickly, and a lot of what we can do fairly easily now would not have been possible even five years ago. As demand increases for personalisation and wider use of tools like AI and automation, so the opportunity for more remote production will grow. The technology to deliver robust, scalable and professional live productions remotely already exists. There are already tier 1 productions being produced remotely, so there is no technological obstacle to doing it. The delay in wider roll-out is down to operational decision making and people's readiness to adopt it rather than any issue with the technology itself.
The Wimbledon Championships are one of the largest annual sports Outside Broadcasts undertaken in the UK and a complicated project for all the companies involved. Last June we learnt that EMG Group companies (ACS/EMG-C) had signed a four-year extension with Wimbledon Broadcast Services (WBS) to be the specialist cameras and RF equipment supplier to The Championships, Wimbledon, staged annually by the All England Club.
Over several years, EMG Connectivity have supplied various broadcasters’ RF services but it was when WBS took over the host broadcasting of The Championships from the BBC in 2018 that the group became the exclusive supplier of RF cameras; on the other hand, further cementing their expertise in the area, Aerial Camera Systems (ACS), also part of the EMG Group, has been supplying specialist cameras for The Championship for over 20 years.
As the major tennis event takes place, TM Broadcast wanted to know more about the deployment and coverage by EMG Connectivity and ACS, both from EMG Group.
With:
As one of the leading providers of sports production services, how does EMG approach the television production of Wimbledon?
Broadcasting an event of the calibre of Wimbledon requires a great deal of preparation and innovation each year. When we first started working with the BBC at Wimbledon we only supplied three or four camera systems. That provision has steadily grown and when WBS took over as host broadcaster they placed additional emphasis on creating unique shots of The Championships by placing more remotes around the Grounds to capture, for example, the crowd atmosphere and the players’ practice areas.
As a result of these years of experience, and thanks to our historic relationship with the event, EMG Connectivity are uniquely suited to meet these challenges. For example, the All England Club's historic venue in SW19 has a high standard of requirements to be fulfilled, and it often prefers broadcast standard compact robotic cameras, as their unobtrusive nature minimises space requirements and line of sight issues. ACS' specialist cameras, including ACS SMARThead™ systems, are strategically positioned so they achieve their goal whilst remaining imperceptible.
Can you describe the scale and complexity of the production setup for broadcasting Wimbledon matches?
Due to the scale and calibre of the event, the advance planning for The Championships was a vast undertaking involving a complex setup.
As an example, as well as covering several angles of on-court play, numerous beauty cameras provided contextual coverage of the event and included remote crowd cams, coverage of the player arrivals area, the Aorangi practice courts,
and Media Theatre, plus topographic venue shots from a hoist-mounted GSS stabilised camera gimbal sitting high above the venue for the iconic overhead shots of the local area and London skyline.
This year was our largest Championships delivery to date and we strengthened all equipment to fulfil all requirements; as a result, the vast majority of camera systems were either UHD or 1080p HDR for the first time, as WBS expands the high-quality format support from Centre Court out to all cameras at The Championships.
What are the key technological innovations and solutions that EMG employs to deliver high-quality coverage of Wimbledon?
ACS and EMG Connectivity employed a range of solutions throughout the Wimbledon site to ensure that the best footage could be obtained in a range of situations. Throughout the event, broadcast standard compact robotic cameras were favoured as their unobtrusive nature minimises space requirements and line of sight issues. As we mentioned before, this year we had to use camera systems that were either UHD or 1080p HDR for the first time, as WBS amplified the requirements.
EMG Connectivity even designed a special deployment plan for Wimbledon: Centre Court itself features some notable specialist units including an
ACS SMARThead™ mounted on a 10m railcam positioned along the baseline and housed within a purpose-built hide, whilst another four units are mounted on bespoke camera brackets designed specifically for Wimbledon, including two on the umpire’s
chair dedicated to player coverage.
Other than ACS’ specialist cameras, including ACS SMARThead™ systems, were strategically positioned to capture the action, which includes baseline angles,
Other than ACS’ specialist cameras, including ACS SMARThead™ systems, were strategically positioned to capture the action, which includes baseline angles, player coverage from umpires’ chairs, remote crowd cams, and topographic venue shots from hoist-mounted cameras.
player coverage from umpires’ chairs, remote crowd cams, and topographic venue shots from hoist-mounted cameras.
How did EMG ensure seamless integration between different production elements, such
as cameras, graphics, and commentary, during the live broadcasts?
ACS and EMG Connectivity worked for the host broadcaster, Wimbledon Broadcast Services, who oversaw all production elements.
But I can say that one of the challenges for the EMG Connectivity team was to ensure smooth steadicam RF coverage of the Walk of Champions from the Dressing Rooms to the entrance of Centre Court pre-match.
They also covered the champion from Centre Court up the stairs of the Clubhouse, through the corridor to the Dressing Rooms and onto the Members' balcony to be greeted by the cheering crowd. This required its own dedicated receive installation with antennas secreted within the walkways and corridors, providing seamless coverage within the building, in addition to the 40 antennas EMG Connectivity have around the Grounds.
EMG Connectivity meanwhile was providing a wide range of kit and expertise to both the host and multiple unilateral broadcasters, supported by a crew of six for the fortnight, and more for the rig and de-rig of The Championships.
What challenges did you face in terms of signal transmission and distribution during the Wimbledon production, and how did you overcome them?
EMG Connectivity provided a wide range of kit and expertise to both the host and multiple unilateral broadcasters, supported by a crew of six for the fortnight, and more for the rig and de-rig of The Championships. There were 40 antennas site-wide which fed back to a central RF cabin in the Broadcast Compound, where the signals were switched and fed into their appropriate receiver units. This is a cost-effective way of covering multiple areas for events such as Wimbledon and golf events such as The Open and the Ryder Cup. A wide-area return video system for roving RF monitors was also provided to a number of broadcasters, allowing them to analyse footage from anywhere within the Grounds.
Looking ahead, what do you foresee as the future trends and advancements in television production for major sporting events
like Wimbledon, and how is EMG prepared to embrace those changes?
One of the major trends that is growing year on year is the recognition of, and commitment to, ensuring that the broadcasting of events such as The Championships adopts sustainable methods. Our clients are requesting that we adopt sustainable practices. As part of the EMG Group, ACS and EMG Connectivity continue our efforts to make live broadcasting and remote production sustainable, with specific innovations including the Group's new remote production vehicles.
The history and events surrounding video cameras used in the professional audiovisual sector have always had one constant: innovation. A long path full of discoveries and technologies as important as the appearance of the first electronic cameras, the fitting of capture sensors, the capture and recording of video on the same equipment, the possibility of working with interchangeable optics, coaxial, optical fibre and/or IP connectivity, the development of the time code, digital cameras, new recording media and 4K & 8K quality, among others.
A wide array of technologies and advances that have been fitting in almost perfectly over time under a single denomination: Broadcast equipment, that is, everything that boasts a professional quality meeting specific standards/norms for and by the audiovisual sector. In this sense, we can highlight as key players two international institutions: the SMPTE and the ITU.
SMPTE (Society of Motion Picture and Television Engineers; https://www.smpte.org/) was founded in 1916 in Washington as SMPE. It currently has around 8,000 members worldwide and has published more than 800 standards and protocols, playing a decisive role in the development and dissemination of television and telecommunications in the United States and also for the rest of the world.
The SMPTE ST 2048-1 standard defines the main characteristics of an image with a 4K resolution (4096×2160 pixels), while the SMPTE ST 2110, SMPTE ST 2082-12 and SMPTE ST 2036-1 standards cover 8K UHD (7680×4320 pixels).
ITU, founded in Paris in 1865, is the original acronym for International Telegraphic Union. In 1932 it adopted its current name, and in 1947 it became a specialized agency of the United Nations. Its first subject of expertise was the telegraph, but nowadays ITU covers the entire ICT sector, from digital broadcasting to the Internet, and from mobile technologies to 3D TV. ITU currently comprises 193 member countries and some 700 private sector entities.
Headquartered in Geneva, Switzerland, it has 12 regional and area offices worldwide.
More than 5,000 specialists from telecommunications and ICT organizations and agencies from around the world participate in the Radiocommunication Study Groups to prepare the technical basis for Radiocommunication Conferences, develop ITU-R (Radiocommunication Standards) Recommendations and Reports and compile radiocommunication manuals.
Recommendation ITU-R BT.2100 proposes three levels of detail or resolution: high-definition television (1,920×1,080), 4K UHDTV (3,840×2,160) and 8K (7,680×4,320), all using the progressive image system with wide colour gamut and the frame-rate range included in the ITU-R BT.2020 UHDTV recommendation.
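As a quick orientation on what these resolution levels mean in raw data terms, the short sketch below computes pixel counts and approximate uncompressed data rates. The 50 fps, 10-bit, 4:2:2 figures are illustrative assumptions for the calculation, not requirements of the recommendation.

# Worked example: pixel counts and approximate uncompressed data rates for the
# BT.2100 resolution levels, assuming an illustrative 50 fps, 10-bit, 4:2:2 signal.
FORMATS = {
    "HD 1080p": (1920, 1080),
    "4K UHD":   (3840, 2160),
    "8K UHD":   (7680, 4320),
}
FPS, BIT_DEPTH, SAMPLES_PER_PIXEL = 50, 10, 2.0  # 4:2:2 averages 2 samples per pixel

for name, (w, h) in FORMATS.items():
    pixels = w * h
    gbps = pixels * SAMPLES_PER_PIXEL * BIT_DEPTH * FPS / 1e9
    print(f"{name}: {pixels / 1e6:.1f} Mpx per frame, ~{gbps:.1f} Gbit/s uncompressed")

Each resolution step roughly quadruples both the pixel count and the raw data rate, which is why moving from HD to 4K and then to 8K has such an impact on storage, transport and compression requirements.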
In this article, we are going to delve into and update the situation around professional 4K & 8K broadcast video cameras, which are used both in the field of television and in the production of audiovisual content.
A first nuance we have to understand is the difference between camscopes and camcorders. The first name is given when the video camera only performs capture and processing, resulting in a video/audio signal output. Conversely, when the video camera, in addition to capturing and processing the video/audio signal, can also record/store that signal, it is called a camcorder, i.e. the camera body includes a recorder, jack and/or slot for recording and playing back the video/audio signal.
The next aspect to consider when choosing a 4K&8K video camera is the mounting of the optics with respect to the camera body. We can find solutions with fixed optics, that is, the camera manufacturer offers us a model where the body and optics cannot be separated. And, on the other hand, camera models with interchangeable optics, which allows you to remove and place optical lenses of different focal lengths -always
compatible with the mount on the camera body- to adapt to the shot we want to get.
Thirdly, professional video cameras are conceived and designed according to the production environment for 4K&8K content where they are intended for use: single-camera or multi-camera. This issue is very important because choosing one model or another will facilitate or complicate the work to be done. A single-camera production is one that uses just one camera, so this model has to have the necessary features for capture, shooting, operation and recording of both image and sound to perform the task autonomously and independently.
Instead, a multi-camera production/operation is defined as one that offers audiovisual content and uses a wide variety of sources for video inputs, including more than one video camera. Therefore, the 4K&8K camera model has to be ready to integrate perfectly into the workflow behind a multi-camera setup, and the most suitable ones for this are the camera chains/systems.
But why camera chains/systems? The answer lies in the very way of producing, monitoring and offering visual content featuring the highest technical and artistic quality possible. A camera chain/system comprises the following items: a camera body with the most suitable optical system based on the placement of the camera in regard to what must be captured and the kind of take to offer; a camera cable through which the various signals and communication orders, such as tally and intercom, come and go (over optical fibre or triax cable nowadays); and a base station for controlling the camera, from which the outgoing video signal that we will use as an input source will exit. These elements must be used based on the number of cameras independently deployed.
In addition, each camera chain will normally have an OCP (operational control panel) to adjust the camera's technical settings.
At this point, taking into account the first three issues mentioned above, we can establish a typology of 4K&8K broadcast cameras within the audiovisual sector:
Studio camera chains/systems (TV set). This is a type of camera that stands out for the possibility of using interchangeable optics of a long focal length, that is, large telephoto lenses. Their main feature is that they have no section for making the recording within the camera body. In addition, they usually offer high levels of quality in the video/audio signal. They are used in multi-camera environments.
EFP (Electronic Field Production) or PCS (Portable Camera System) camera chains/systems, these being more compact and portable pieces of equipment. In addition, these are equipment items
having a recording section on the camera body and interchangeable optics.
ENG (Electronic News Gathering) shoulder cameras. These are equipment items whose main feature is a great ergonomic design to fit comfortably on the shoulder of the camera operator and allow great ease in handling the buttons, the optics and the settings. They are lightweight and have a recording section in the camera body. They typically have interchangeable optics with a high focal length zoom.
Modular cameras. This type of camera is characterized by the fact that the camera body is robust, simple and
contains the minimum elements for operation. Therefore, for better handling and operation, accessories, complements, optics and other resources must be fitted into the camera body.
Compact/ultra-compact cameras (All-purpose). The really good thing about this type of camera is that in a small size we have all the global and operational features to give a great service for capture, recording and connectivity. They usually have fixed optics with zoom.
Handheld cameras. As the name suggests, size is what matters here: they take up the minimum space a camera held in the hand can; they are also known as Handycams.
PTZ remote cameras. Currently, and after the COVID-19 healthcare crisis, this type of camera has really swarmed TV studios and is used in numerous events. A PTZ (Pan-Tilt-Zoom) camera is a remotely controlled video camera, compact in size, light in weight and offering great possibilities for framing shots in a fluid and silent way. It normally comes with fixed optics, but there are already models with interchangeable optics, such as the Sony FR7.
Multipurpose remote cameras. Those that have a small camera body together with interchangeable optics featuring great performance and high connectivity for the audiovisual sector.
What all 4K&8K Broadcast cameras have in common is that they feature sensors capable of generating images with a resolution of 4096 x 2160 pixels (4K) and/or 7680 x 4320 pixels (8K), and therefore higher than FHD (1920 x 1080 pixels) and UHD (3,840 x 2,160 pixels).
The landscape of 4K&8K cameras is expanding in each of the above camera types, and this is only expected to gather strength as 8K televisions have been on the market since 2018, when Samsung launched the first QLED 8K model featuring 8K resolution in quantum dot technology.
Large manufacturers of Broadcast cameras already have models that can be purchased or rented to work in 4K&8K production environments:
SONY introduced the 8K EFP camera chain, which consists of a combination of the UHC-8300 camera head and the UHCU-8300 camera control unit, the new HDC series of FHD/UHD/4K EFP camera systems, and the versatile HXC series, which are ideal for applications such as live streaming and event production.
PANASONIC continues to develop its Broadcast & Pro-AV range of cameras, such as the compact AG-UX180; the shoulder-mounted AJ-CX4000GJ camera; the AK-UC4000 and AK-PLV100GSJ EFP camera chains; the AW range of PTZ cameras; and the advanced 8K ROI multi-purpose camera system.
CANON with its LEGRIA HF G70 model, the Canon XF605 and its PTZ CR-N500.
JVC continues to raise the stakes with its light ENG model GY-HC500E 4K, its ultra-compact cameras GY-HM170E, GY-HM180E, GY-HM250E and GY-HM250ESB, and the PTZ KY-PZ510NW/NB and KY-PZ510W/B models.
BLACKMAGIC with its URSA Broadcast G2 model.
GRASS VALLEY with its LDX 150 model.
Finally, we cannot close this article without a brief reflection on other types of cameras, such as digital cinematography cameras or DSLR/EVIL cameras for social media, which, even though they feature different production modes and standards compared to Broadcast cameras, all come very close in regard to 4K&8K: more resolution, more content.
How was Windmill Lane born?
In 1978 James Morris, Russ Russell, Brian Masterson and Miert Avis established Windmill Lane. At the time it was Ireland's first world-class recording studio and TV commercial post-production facility. Back then, Ireland was a country where people with ambition left (the late 1970s saw Ireland fall into sharp economic decline) and everyone thought they were crazy, but from a long-established culture of storytelling, creativity and determination, and a little bit of luck, came opportunity. The leading opportunity, which formed the basis of what we are today, was TV commercials. 2023 marks 45 years since we opened our doors as a music recording studio in the Dublin Docklands – famous for producing U2's first five albums. Now based on Herbert Street, Dublin 2, we provide post-production services for film, TV and commercials covering audio, colour grading, animation, VFX and editing.
Jason Gaffney - Marketing Manager for Windmill Lane

As a leading post-production and visual effects company, can you tell us about Windmill Lane's approach to delivering high-quality audiovisual content?
We believe that delivering high-quality audiovisual content is contingent on investing internally with regard to talent and technology. For example, one of our fastest growing areas is VFX for global streamers (some of our current clients include; Paramount, Netflix & Disney+) and therefore we need to ensure that our clients can rest assured their work is being processed to the highest spec and by people with experience. In February 2023 we appointed Stephen Pepper as VFX Supervisor – a multi Emmy nominated professional who has worked on District 9 & Ironman. Around the same time we announced major upgrades to our world class multi-room audio facility, including the installation of Dolby Atmos – a revolutionary spatial audio technology providing the most immersive sound experience. By May 2023 we had installed a new FLUX Store 360 and upgraded our existing pair of Baselight TWO systems,
allowing us to keep up to date with new technologies and workflows while improving the colour team’s connectivity, speed and performance. These are just some of the ways we stay on the edge of an ever evolving global media landscape and continue to produce the highest quality content.
Dave Quinn - CEO of Windmill Lane: "We believe that delivering high-quality audiovisual content is contingent on investing internally with regard to talent and technology."
Can you share some notable projects where Windmill Lane’s expertise in post-production and visual effects made a significant impact?
In 2022 we partnered with Oscar nominated Henry Selick (The Nightmare Before Christmas, Coraline) and Oscar winning Jordan Peele (Nope, Get Out) on Netflix’s Wendell
& Wild. This particular project was a mix of stop-motion, CG and VFX which required a particular set of skills. This responsibility threw up some technical challenges that pushed Fred Burdy – Head of CG at Windmill Lane – and his team;
“Working on stop-motion shots revealed new challenges that were great fun to tackle. We
discovered that the amount of plates to work with was huge, because even simple shots had multiple exposure with different lighting setups – and more complex ones had multiple passes, multiple exposures, and quarter-scale passes for the background environments. All that was shot on motion control which helped the consistency. The production also provided us with textured 3D scans of the sets we needed to put together in our set extension, mostly done through a 2.5D matte painting approach.”
This level of expertise was required more so than ever as Wendell & Wild was shot during Covid. Fred reflects:
“The team were amazing, despite the COVID implications that made us use a hybrid approach: some artists were on site, but most worked from home, and a good few were abroad. We were careful to have frequent catch-ups and made sure that people were as involved as possible even if they were not in the office – and to be fair it worked beautifully! It was a great experience overall to work with Mark Fattibene and Heather Abels, the super talented matte painting/digital/VFX supervisor we had collaborated with before. We’re very proud of having worked on this film and that we could make such an impact on the final piece”.
What are the key considerations when it comes to collaborating with clients and understanding their creative vision during the post-production process?
Understanding that today’s content demand is aggressive and fast-paced, our internal strategy has allowed us to meet external expectations.
Deborah Doherty - Head of Production says “As a business, and for us as a client facing partnership, we have become more solution driven. Our domestic sector is hugely important to us but doesn’t negate our international focus and the two exist together perfectly well. Across the leadership team, and the company as a whole, we now work together in a more strategic ‘business oversight capacity’. This has not been easy – it requires new thinking. This is the first time we are bringing in projects together. Recent examples include HBO, Netflix, Paramount, AMC and SKY along with strengthening relationships in the growing local film, tv and advertising sectors”.
With the effort on growing international business, Deborah and John Kennedy, Head of VFX, knew that this would also require an evolution of Windmill Lane's service offering. John explains: “We realised we would have to apply a more holistic approach to how we engage with current and prospective clients. We are now more involved with clients from start to finish. In addition to discussing creative, production and technical aspects of the project, we aim to add value by being more consultative. This can cover everything from our creative offering to helping clients avail of tax incentives”.
How does Windmill Lane approach the integration of visual effects seamlessly into live-action footage, ensuring a cohesive and realistic end result?
“We try to be involved in the creative process as early as possible, and work with the director and the DOP on set to help them shoot in the best way possible, so we can integrate our VFX down the line. We also make sure we capture all the needed information on set, such as camera and lighting references, panoramas of the set, 3D scans of the set and sometimes of the actors if needed. This is really important if CG is involved, as we need to match the real set in CG. With all that, we have everything we need to properly start our work on a shot. Then we usually do rough passes of the VFX to show the intent and make sure everybody is on board. After that, the creative process is a lot of work on integration, refining the work as we go to get it really seamless. And eventually we have a finished shot!” Fred Burdy - VFX Supervisor with Windmill Lane.
In terms of data security and protection, what measures does Windmill Lane have in place to safeguard clients’ confidential and sensitive materials?
We take the security of any content that crosses our threshold extremely seriously. Over the last year at Windmill Lane we have completely overhauled both our physical and digital security infrastructure, culminating in our first TPN audit in January. Our TPN journey is ongoing and we’re preparing for our TPN Gold Shield audit later this year. Everybody at Windmill is keenly aware of how important security is across the entire media
& entertainment space and we’re committed to adhering to best practice and giving peace of mind to all of our clients. Ed Smith - Head of Operations at Windmill Lane.
How does Windmill Lane leverage technology and innovation to enhance the post-production process and create stunning work?
To support this focus the company has recently unveiled major upgrades to its world class multi-room audio facility, including the installation of Dolby Atmos, a revolutionary spatial audio technology providing the most immersive sound experience. The new studios, featuring
Avid and Genelec equipment, ensure Windmill Lane will continue to operate at the highest standard of postproduction.
On the colour front Windmill Lane has installed a new FLUX Store 360 and upgraded its existing pair of Baselight TWO systems, allowing it to keep up to date with new technologies and workflows while improving the colour team’s connectivity, speed and performance. The Baselight TWO upgrade and new FLUX Store have provided immediate workflow benefits as well as the option to further expand in the future. Both investments serve to exemplify the importance of domestic projects to ensure
Irish content is produced to the highest standard. Having recently provided full post production on RTÉ’s massively successful crime drama KIN, Windmill Lane prides itself on making great stories about Irish life and showcasing them to a global audience.
On 5 July 2023 Disney+ will release Kizazi Moto: Generation Fire. This animated anthology, by Irish-based animation studio Triggerfish, brings together a new wave of animation stars to take you on a wildly entertaining ride into Africa's future and is a perfect example of how Windmill Lane is leveraging technology and innovation to evolve the company's output.
Inspired by the continent’s diverse histories and cultures, these action-packed sci-fi and fantasy stories present bold visions of advanced technology, aliens, spirits and monsters imagined from uniquely African perspectives. Windmill Lane provided colour, sound editing & mix services on this ambitious project.
Jeff Rosica is the Chief Executive Officer (CEO) & President, and a member of the board of directors, of Avid Technology. Prior to being appointed Avid’s CEO & President in February 2018, he had served in a number of senior executive roles with the company.
Mr. Rosica is a 35-year industry veteran in the broadcast and media technology segment, with extensive experience spanning production, post-production and distribution technology solutions.
Prior to joining Avid in 2013, Mr. Rosica served in various senior leadership capacities with leading industry brands including Grass Valley, Thomson/Technicolor, and Philips Electronics.
Last June 15th, TM Broadcast was able to attend the last stop, in Madrid, of The Future of Post, Avid's European tour in which companies, distributors and suppliers learned from some of the senior leaders on the Avid team how the future of post is shaping up.
The TM Broadcast team met Jeff Rosica, Avid Technology's CEO and President, and exchanged some impressions about the future of the industry and the cardinal points we need to be aware of in order to navigate these exciting times.
The first question, mandatory, is about how you foresee the future of the industry; you have a privileged point of view on new developments in the content creation landscape… but we arrived here and the first thing we found was a totem that says “The Future of Post”, so I’d like to change the subject: what is the future of post?
It’s a lot. I think we are at an interesting time in the industry. I’ve been in the industry for 35 years and I’ve seen a lot of change (the introduction of digitalized archives… a great amount of change), but I’ve never seen anything like this. There is more change going on in our industry, just because our world is changing, and I think we are living in a time of rapid change: viewers’ habits are changing, technology shifts are happening five times as fast as they were five years ago; the pandemic accelerated the whole idea of people working in a more distributed way. Even though our industry was already doing that, because working with remote locations or freelance workers has always been a part of our industry, it depended on travelling a lot. The pandemic allowed us to try and do new things: there was no choice, you really had to do it, because everybody was at home.
I think the future of post will be different than we first thought, but it will still be quite good. The good thing is that consumption of content is rising because there are a lot of new devices that allow people to watch content anywhere, any time. There’s a strong appetite for good quality content on a global basis. In the old days, content production used to be filmed in London or at studios in Hollywood or Los Angeles: all production was local… Today… well, we just need to look around us; here in Madrid the shows are getting big hits globally… and not only Madrid, all the Spanish territory.
I think the world is changing, appetites are changing; people want to see content that’s more global. Usually you want to see storytelling related to your culture and in your language, but now there’s a strong appetite for more global content: storytelling from around the world. And this is changing our industry. It’s an exciting time and the future appears bright, but different: people are going to work very distributed, the content will be in HD, and AI is almost, I believe, an industrial revolution. To summarize: it’s going to be a significant change.
Now, you have almost stepped on my next question, AI; what role is AI going to have in content production and development?
I know there’s a lot of debate about this. I believe AI will be deployed widely across the whole industry; even generative AI will be deployed; but I don’t think it is going to replace creativity.
AI is nothing more than a very complex computing model that is copying human invention; it’s just repeating human works and serving that back, so humans still have to create. On the other hand, I believe we have to proceed carefully with AI, socially, legally, culturally…
But it can help?
It can help creativity and it can help efficiency; it’s really a help to create a lot more efficient workflows, and the industry needs it, as we are impelled to create more and more content and we need to find a more efficient way of creating
content. Besides, AI is going to take routine tasks away, which is good for creativity.
I think that AI will help creativity too, making 3D animation and visual effects more accessible, since the costs of both have come down. AI is going to revolutionize the industry, and it can make lower costs available to any creator.
If you are creating a program and you decide “oh, I think I should insert an opening scene on a beach with golfing” but you’ve never shot that, you can go and shoot the scene and spend a lot of money… or you can just tell the AI to create the scenery you’re going to use… just like a sibyl.
Though, I think there are elements of this that are important, as we’re going to have to protect copyright, privacy, people’s IP… And we have to do it properly. But I really do believe that AI is going to be similar to an industrial revolution.
Talking about efficiency and reducing costs, we recently read about TelevisaUnivision and Avid developing production workflows on Google Cloud… What can you tell us about this new alliance and how is it going?
It’s just started. TelevisaUnivision, because it’s a combination of Univision and Televisa blended together, have some really strong visions about the future, not just about how they want to integrate the companies but about how they want to work in the future. They had a vision about how to virtualize all environments and how to create workflows to distribute people and work, so people can share content and share ideas. And they looked at the cloud and saw an opportunity there; they also see the cloud as a good way of generating efficiencies in their organization…
Avid is a major supplier of technology for them and they wanted us to look at how they could standardize on a way to operate in the cloud. So that work has started, and Google is involved, and it is going to be operating on Google Cloud. Google is helping with some of the research and design work that we have to do together. So, right now we are in the development stage, because there is actually research that has to be done to get to TelevisaUnivision’s real vision. I cannot tell you the exact schedule, but we expect to see the first results fairly soon, so we’re starting transitions next year.
It’s exciting because TelevisaUnivision are huge producers of content and obviously the main broadcasters in Mexico and the USA; they are leaders in sports, in news… To have somebody that large and sophisticated, who has really thought through how to run all those operations and all that content flow, is pretty exciting, and I think it is going to help lead the industry in that direction.
Collaboration with Amazon and AWS
And another interesting one is the work we are doing with Amazon; Amazon Studios, which is Prime Video, is developing a studio-in-the-cloud initiative, building a cloud-deployed studio; and we are building it right now with AWS and with Amazon. That is going to be a very interesting initiative; if a production, once it is started up, needs to move, they can literally deploy the technology wherever they are and open direct connections. That’s coming and it will change a lot.
Talking about solutions for production workflows… How virtual production can help?
Virtual production mainly reduces costs. The fact that you don’t have to take the crew - twenty people, 100 people - to a location brings significant cost savings, but we also have to understand the possibilities of virtual production: being able to try new and brave things and to actually see the outcome live or near-live… This is an important impact of virtual production on the industry. I think it is changing workflows in a really good way.
During these times in the audiovisual industry, and specifically in virtual production, there are claims about the lack of talent to recruit… Is this a temporary issue?
I think this is a real issue. I spoke at the Hollywood Professional Association meeting last February, and I gave a speech about seven principles that are going to shake our industry. One of them is ‘We Are Running Out Of People’, because we are literally running out of people. If you look at the growth of content creation happening across the world, and you look at the amount of people that are in the industry today, the amount who are coming into the industry and the amount who are retiring out of the industry, you realize that we are quickly going to run out of people to keep up with the demand for content creation.
And I’m talking about creative roles, technical roles… almost every role that exists around broadcast and media production. We are coming dangerously close to a point where we are not going to have people for these roles. And you can already see it in certain areas of the market: it’s happening.
We’re going to have to think about this and about how we are going to get more people into the industry faster. We want young talent to want to be in our industry, not in gaming or other industries…
We will need to look for ways to be more efficient. I think AI is going to be helpful just at the right time, because the demand for content is growing much faster than the talent we have available in the industry to produce it.
Otherwise, I think that people will work in a more distributed way, and that will help: it could be that somebody here in Madrid doesn’t find a project to work on and they could be hired in Germany or Los Angeles and work remotely. AI will help to boost our efficiency and this will allow content creators to focus on creativity.
So the short answer is yes, there’s quickly going to be a lack of talent and a lack of skills to produce and to do what we need to do in the broadcast and media content industry.
And finally, last but not least, what can we expect to discover from Avid at the next IBC 2023? Is Avid planning on presenting new tools or technical developments?
We have disconnected trade shows from product launches. We don’t do that anymore. The only time we do it, it’s coincidental: if a product launch happens to fall in the same month as a trade show, we can adjust the timing so the launch is one week before, for example. But we don’t tie ourselves to the trade show calendar. Whenever a product is available, it gets launched. COVID helped with this, because people got used to learning about new things through virtual content, watching content on screens and displays, without having to go somewhere.
“I really do believe that AI is going to be similar to an industrial revolution.”
Yamaha DM7 Series
Higher power at a lower price
On June 6, Yamaha showcased the new DM7 series at Japan House, Kensington, London; the Japanese multinational brought together a small number of journalists, including the TM Broadcast International team, to present this new launch in their first face-to-face event after the pandemic.
With an exquisite narrative - and coinciding with the thirty-fifth anniversary of the launch of its first digital mixing console, in 1988 - Yamaha chose the Japan House in London as the stage for the presentation of its new series
of audio mixing consoles, thus underlining its millennial heritage and linking the Japanese philosophy to the development of its products.
Ever since the creation of their first product, the Reed Organ,
in 1887, Yamaha - whose name refers to the samurai universe thanks to the combination of the words ‘yama’ (mountain) and ‘ha’ (sword) - has not let up on its way to developing new products, whether musical instruments or audio
management and control tools, under the principles of the strictest Japanese tradition: precision, flexibility and adaptation, in this case, to the market. In addition, in the development of these series, Yamaha has paid great attention to reducing costs, thus driving the positioning of the product with one of the most competitive prices on the market.
The values of the Japanese tradition are present in each of Yamaha’s actions; thus, the staging the presentation of its new digital audio mixing solutions within the framework of the Japan House in London, sharing space with an exhibition on the use of silk in ancient Japan, should not be considered as something secondary. Using the comparison with the silk thread, resistant but delicate, flexible but powerful material, from which various products are made -from braided belts to sophisticated samurai armors- the origin and purpose of the new DM7 line appears crystal-clear to the viewer: it is a new series that seeks to simplify workflows by using the most advanced technology and responding to the needs that have been identified over the last few years.
After a brief tour of the exhibition in the upper area, in which guests and the press were able to learn first-hand about the ancestral traditions in craftsmanship and silk looms of Japanese culture, the presentation began in the event room of the Japan House. It was structured in a simple way, through the interventions of Karl Christmas, Marketing Manager - Digital Mixers & Production, Tobias Weich, recently appointed Director of Yamaha’s Professional Audio Division, and Andy Cooper, director of the Yamaha Professional Audio engineering area in the United Kingdom.
As an introduction, Christmas made a brief tour around the difficulties posed by the 2020 pandemic and the subsequent supply crisis on the industry in general -and particularly on Yamaha- and celebrated being able to meet in person with both journalists and guests to present the latest and expected launch of Yamaha Music; Tobias Weich, Director of the Professional Audio Division for Yamaha in Europe, spoke about the challenges faced by Yamaha in Europe and briefly exposed the directives that have led the prestigious firm to develop the DM7 consoles.
Lastly, Andy Cooper, a long-time engineering specialist at the house, conveyed to the public the advances built into the new DM7 Series, not without raising some surprised exclamations among those present.
And rightly so: the new DM7 and DM7 Compact consoles come with a multi-touch screen that offers new functionalities and a full display of the channel selection, and allows parameters to be modified quickly; they use artificial intelligence to detect the most suitable pre-set for the piece being played (Assist mode)… and can even be split in two. Let’s go step by step.
The DM7 and DM7 Compact mixing consoles
Yamaha’s new range of digital mixing units, the DM7 series, consists of two consoles, a mid-size model and a compact model, plus a new control panel that works on both models, and two upgrade packages: one for the Broadcast sector and one, the Theatre firmware, focused on live stage events. In this way, the DM7 line becomes the most versatile from Yamaha so far, capable of adapting its functionalities to any type of event, from a concert to a theatrical show, as well as any type of streaming or live production.
The new DM7 and DM7 Compact mixing consoles have 120 and 72 channels, respectively, in addition to 48 mixing buses, 12 matrix buses and new functions, all of this aimed at obtaining high-quality audio while facilitating the work of the technicians at the controls.
One of the elements that caught the attention of some of the technicians invited to the presentation was the channel view display: a 12.1-inch multi-touch screen that offers the user all the information available on each of the channels. With their fingers, the user can move the equalizer (EQ) or adjust the size at which channels and settings are displayed, thanks to the new DM7 control software, which allows editing all the elements one can think of: keys, actions, controllers, joysticks, transport flows… In addition, it comes with two
new faders and can be used to expand the controls in previous models of Yamaha consoles.
Special mention deserves the so-called Split mode, which divides the console in two: one side would take care of the FOH and the other could manage a live broadcast, for example. This Split mode allows you to split the input channels, scenes and mixing buses, thus making the console, in either of its two
presentations, work as two separate mixing consoles.
This capability would not only allow two technicians to operate at the same time with different actions, but also, by splitting the actual ‘engine’ of the console into two independent engines and granting each of them access to specific input channels or output channels, the number of channels is halved. The user will be able to determine how many mixes are available on the ‘console’ A and how many mixes are available on the ‘console’ B. As indicated above, in this way it is very easy, for example, to use one of the consoles to produce
a program and the other to broadcast live, at the same time.
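As a rough illustration of the resource split described above, the sketch below divides a pool of inputs and mix buses between two virtual consoles. The numbers follow the article's description of the DM7 (120 inputs, 48 mix buses, a 50/50 input split and a user-defined mix allocation), but the logic itself is purely illustrative, not Yamaha's firmware.

# Hedged sketch of the split-mode idea: one physical console becomes two
# independent virtual consoles sharing the same resource pool.
def split_console(total_inputs=120, total_mix_buses=48, buses_for_a=24):
    """Return the resource allocation for virtual consoles A and B."""
    assert 0 <= buses_for_a <= total_mix_buses
    console_a = {"inputs": total_inputs // 2, "mix_buses": buses_for_a}
    console_b = {"inputs": total_inputs // 2,
                 "mix_buses": total_mix_buses - buses_for_a}
    return console_a, console_b

foh, broadcast = split_console(buses_for_a=32)  # e.g. FOH gets 32 mixes, broadcast 16
print(foh, broadcast)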
Both consoles use the Dante protocol as a basis and are fully compatible with RIO units and all common external devices, such as microphone receivers or speakers from other manufacturers, and so on, as usual in Yamaha products.
New features
The new DM7 system, comprising a medium-sized mixing console, a compact one, and a control system - plus firmware adapted to different uses - is designed with the aim that users who
are already familiar with previous Yamaha tools, such as RIVAGE PM, for example, feel at home. Of course, at a home with many improvements for even more ease of use. For example, as a preview of what we explain below in more detail:
By joining the DM7 with the compact unit of the same series plus the control panel, a one-meter rack is obtained from which it is possible to handle the sound of any production with maximum precision and a flexibility unknown up to now. In this one-meter mixing system, a technician has control over 120 inputs, 48 mixing buses, 24 DCAs and 12 matrix buses, along with 30 motorized faders, 36 encoders and 3 multi-touch screens.
The DM7 series offers a wide variety of encoders
for a more precise and true adjustment to production requirements; it is possible to visualize the signal’s flow and even manipulate it in a tactile way, through the screen.
In addition to the inputs, the DM7 has phantom power, analog and digital gain, dynamics and EQ inserts.
The DM7 series is based on the same natural sound concept introduced in its day by the RIO D2 microphone pre-amplifiers, and for this purpose it features a whole series of effects and optimizations that will facilitate the work of a multitude of technicians: it offers up to 64 channels of plug-ins, various sound flow dynamics, internal virtual circuit modeling, a multiband compressor and a dynamic noise suppressor.
Equalization (EQ) offers four styles or modes of application: Precise, Aggressive, Smooth and Legacy, the latter already known to Yamaha users. In this way, the tonal range is expanded.
The visual interface has been improved and now offers a history of the actions that have been carried out. The Dynamic 2 software includes some new processes, such as the FET Limiter and the Diode Bridge Comp, so there is now a choice in the way signals can be compressed. The Mix Balance Control also adds easy parallel compression.
Inserts
Through the multi-touch screen, users can choose the insertion point and then select up to 4 plug-ins, from the Premium rack, the Effects rack or the EQ rack, or in a customized way.
Premium Rack: it has room for up to 64 units and includes public favourites from Rupert Neve Designs such as the Portico EQ, Comp and Primary Source Enhancer. Technicians also have at their disposal OpenDeck Tape Saturation, Dynamic EQ, multiband compression and, above all, the much-appreciated DaNSe dynamic noise suppressor.
Effects Rack: houses a wide range of high-performance HD and R3 reverberation controls, retro and modern delays, modulation and saturation effects.
EQ Rack: with capacity for 32 units or 64 channels of graphic or parametric equalization. Regardless of
the internal process being inserted, it is possible to fully offset the delay, as well as the different routes of the output bus.
Other insert-related features include the Dugan Automixer, available in the channel bar, always in a post-fader position and ready to be used on up to 64 input channels. This tool is especially recommended for obtaining clear speech at events such as conferences, panel discussions and the like.
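For readers unfamiliar with automixers of this kind, the following sketch shows the gain-sharing principle popularized by Dan Dugan: each microphone's gain is proportional to its share of the total level, so open but silent microphones are attenuated automatically. It is a textbook illustration, not the DM7's actual code.

```python
import numpy as np

def dugan_gains(levels_db):
    """Gain-sharing automix: each channel's gain is proportional to its share
    of the total program level, so the summed gain stays roughly constant and
    unused microphones are pulled down automatically.
    levels_db: per-channel short-term input levels in dBFS.
    Returns per-channel gains in dB."""
    power = 10.0 ** (np.asarray(levels_db, dtype=float) / 10.0)  # dB -> power
    share = power / power.sum()       # each channel's share of the total power
    return 10.0 * np.log10(share)     # that share expressed as a gain in dB

# Example: one talker at -20 dBFS against three idle microphones at -50 dBFS
print(np.round(dugan_gains([-20, -50, -50, -50]), 1))
# -> roughly [-0.0, -30.0, -30.0, -30.0]: the active mic passes almost
#    unchanged while the idle mics are attenuated.
```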
DANTE Protocol
Dante Networking has been a key element in Yamaha’s mixing consoles for more than a decade now, and the new DM7 series could not be any different.
Whatever its size, every console within the DM7 line
offers 144 channels, both for input and output, at 96 kHz or 48 kHz. It is compatible with the AES67 and SMPTE ST 2110 formats, and up to 24 external devices can be mounted and controlled using just one of the DM7 mixing consoles, either medium or compact in size.
These external devices include Yamaha's R-series and Tio racks, DZR-D powered speakers, XMV and PC-D amplifiers, as well as the Nexo NXAMP, in addition to the usual range of third-party wireless microphone systems and remote microphone preamplifiers. Control of the Nuendo Live multitrack recording software is integrated into the network and, in addition, Virtual Sound Check, a patch dedicated to recording and virtual soundchecks, is also included.
There is also a new sequential patch shortcut that is really useful for managing large numbers of channels comfortably, simply and quickly, with control through the multi-touch screen.
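The idea behind a sequential patch can be pictured with the small sketch below, which maps a run of consecutive network channels onto consecutive console inputs in a single action. The function name and channel labels are invented for illustration; the DM7's patching interface is not exposed in this way.

```python
def sequential_patch(first_input, first_dante_channel, count):
    """Hypothetical model of a sequential patch: starting from a chosen
    console input and a chosen Dante channel, assign 'count' consecutive
    network channels to consecutive console inputs in one operation."""
    return {f"IN {first_input + i}": f"DANTE {first_dante_channel + i}"
            for i in range(count)}

# Patch Dante channels 33-64 onto console inputs 1-32 in a single step
patch = sequential_patch(first_input=1, first_dante_channel=33, count=32)
print(patch["IN 1"], patch["IN 32"])  # DANTE 33 ... DANTE 64
```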
Assist function
This is a new tool that uses integrated artificial intelligence to suggest names for the input channels and, here is the novelty, to configure the HA (head amp) gains to suit the sounds being ingested. This function is especially useful for technicians with little experience, or for technicians with extensive experience but very limited time.
In addition, the Assist mode can even set fader levels to establish a starting point for the mix. This is not an attempt to eliminate the human presence at the controls, warns Yamaha engineering specialist Andy Cooper; rather, the function aims to reduce the time spent on technical or automatable tasks so that technicians can focus more on creativity and sound editing.
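As an indicative example of what an automatic gain starting point might look like, the sketch below nudges a head-amp gain so that a measured peak lands near a target level with comfortable headroom. The target value, gain range and function name are assumptions made for illustration, not details of Yamaha's Assist algorithm.

```python
def suggest_ha_gain(measured_peak_dbfs, current_gain_db,
                    target_peak_dbfs=-18.0, gain_range=(-6.0, 66.0)):
    """Hypothetical gain-staging helper: adjust the head-amp gain so that the
    observed peak lands near a target level, leaving headroom for the mix.
    This only illustrates the idea of an automatic starting point."""
    adjustment = target_peak_dbfs - measured_peak_dbfs
    new_gain = current_gain_db + adjustment
    return max(gain_range[0], min(gain_range[1], new_gain))  # clamp to HA range

# A vocal mic peaking at -30 dBFS with 30 dB of gain would be raised to 42 dB
print(suggest_ha_gain(measured_peak_dbfs=-30.0, current_gain_db=30.0))  # 42.0
```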
“Having it both ways”: the ‘split’ mode
This is one of the most surprising features of the DM7, since it allows the physical console to be split into two virtual consoles, as we have already mentioned. This splitting translates into the console's ability to sustain two different productions at the same time; that is, during the same session two technicians can control, through the same console, the production of a live event and the streaming of that same event.
In addition, it is possible to manage channels, workflows and presets by touch through the display: such agility and fluidity have rarely been seen in a mixing console. Levels can be set, dragged and copied to the other (virtual) mixing console where another production is being managed, and buttons and levels can even be hidden quickly and easily.
Split mode is a different concept. Anyone who has tried to use a mixing console for two different purposes, such as mixing a live stream at the same time as front of house (FoH), will appreciate the relevance of this new Yamaha feature. With this function, most of the resources are split fifty-fifty, while the mix buses can be allocated in any way.
Of course, users are warned to choose either split or default mode before starting programming, since 'Scenes' and 'Console' files cannot be transferred from one mode to the other.
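A minimal sketch of this resource split, using the input and bus counts quoted earlier in this article and an arbitrary bus allocation, might look like the following. The class and function names are hypothetical and only illustrate the idea of two independent engines sharing one surface.

```python
from dataclasses import dataclass

@dataclass
class VirtualConsole:
    """Hypothetical model of one half of a split console: fixed resources are
    halved, while mix buses are allocated however the operator decides."""
    inputs: int
    mix_buses: int

def split_console(total_inputs=120, total_mix_buses=48, mixes_for_a=32):
    # Fixed resources (inputs) are divided fifty-fifty between the two engines;
    # mix buses are shared out according to the operator's choice.
    console_a = VirtualConsole(inputs=total_inputs // 2, mix_buses=mixes_for_a)
    console_b = VirtualConsole(inputs=total_inputs // 2,
                               mix_buses=total_mix_buses - mixes_for_a)
    return console_a, console_b

a, b = split_console()
print(a)  # VirtualConsole(inputs=60, mix_buses=32) -> e.g. the FoH mix
print(b)  # VirtualConsole(inputs=60, mix_buses=16) -> e.g. the stream mix
```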
Specific firmware packages
Always mindful of the feedback received from its customers, Yamaha has also included new specific firmware packages, with special attention to the two sectors that most demand Yamaha audio solutions: broadcast and theatre. Each offers specific features for the chosen application. They will be available separately and can be installed in the mixer independently or combined, since they coexist without problems in the same console; both firmware packages come free of charge when the control unit is purchased.
The result: greater operability
The DM7 series is a step forward in terms of operability, considering its processing speed, the agility of its new functions and the flexibility it offers.
Availability and recommended price
Yamaha’s new DM7 series consoles will be available from September, at a recommended retail price of €26,450 for the mid-size DM7 and €13,950 for the DM7 Compact. The DM7 controller will go on the market in December 2023, with a recommended price of €4,500 and the software packages, which will be released on the same date, will start at €2,000.