And here we go, at last. The event of the year, the annual meeting that broadcasters and AV professionals have been waiting for, is here. For our industry, September brings not only the start of classes at school and the return to work from holidays (if you are lucky enough to be on holiday during the summer season) but also all the excitement of getting ready to travel to Amsterdam and join the buzz of the industry's engines roaring because the trip is about to start: IBC 2023. Oh boy, what a thrilling time!
And these times are indeed thrilling if we stop for a moment and look around us at how the broadcast industry has been blooming and sparkling, fueled by technological developments on all fronts: from managing live video delivery and virtual production workflows, to new video compression methods and greater cloud storage capacity, to AI that can track players on the field and even generate characters. The hunger for improvement has penetrated every layer of the broadcast industry.
Because times are changing and our sector's borders are expanding, TM Broadcast is offering a brief preview (just a glimpse, if we bear in mind IBC's numbers) of the International Broadcasting Convention, in the hope it will serve as a navigation chart for the meeting. Besides, our staff will be on site, ready for whatever may come, to find the most remarkable highlights and bring them to our next issue. We have always believed in face-to-face interaction, and IBC's more than 37,000 attendees make it the ideal environment for sharing knowledge, strengthening participants' networks and forging alliances that will surely drive new developments.
It couldn't be any different: the most anticipated broadcast convention in Europe boosts and energizes the entire industry with the IBC Accelerator Hub and thirteen innovation halls, in addition to the show floor, exhibitors from 170 different countries and the renowned IBC Awards. But let's say no more and please, come and see; the show is about to start…
TM Broadcast talks with A.J. Wedding, CEO of Orbital Studios, about how to achieve success in so little time, the advantages of virtual production, the state of the video industry, and the main challenges in audiovisual production today.
Zero Density boosts virtual production with its high-precision tracking platform
Zero Density has recently released a solution that makes high-precision tracking in virtual production easier than ever: its Traxis Tracking Platform. The Traxis Platform allows creatives to produce immersive stories with photoreal virtual graphics, while eliminating traditional barriers like long calibration times, distracting markers and expensive latency errors. Also, it features solutions for camera tracking, talent tracking and tracking control.
The platform offers accuracy of up to 0.2 mm and ultra-low latency of just six milliseconds, so the Traxis Camera Tracking system delivers exceptional precision. In a virtual production, this ensures continuous calibration and seamless real-time results, so that every camera move that happens on set is reflected in the virtual world without delay. Virtual graphics can then be broadcast live without breaking the audience's immersion.
For talent tracking, the Traxis platform includes an AI-powered system that can identify multiple actors within a 3D environment at once. The Talent Tracking uses AI to extract the actor's 3D location from the image in real time, then sends the data to Reality Engine to create accurate reflections, refractions and virtual shadows of the actor inside the 3D space. The technology
is completely markerless, which streamlines setup and maintenance, and means actors no longer need to be distracted by wearables or beacons that make it harder to perform.
At the heart of the Traxis platform is the Traxis Hub, a comprehensive tool that manages both tracking and lens calibration data. Compatible with all industry-standard tracking protocols, the hub brings together all available tracking and lens data into a centralized interface, significantly reducing setup time. The hub has a library of the most commonly used broadcast lenses, enabling quick and easy calibration.
Latest updates to Varnish Enterprise 6 allow broadcasters to control content delivery
Broadcasters, telcos and CSPs face major challenges in delivering content efficiently and effectively as the video industry grows. Varnish Software, a company that specializes in caching, streaming and content delivery software solutions, answers this demand with the latest enhancements to Varnish Enterprise 6, aimed at helping organizations gain greater control over their content delivery.
The shift from traditional broadcasting to streaming, coupled with the need to leverage existing infrastructure, is driving demand for more flexible and scalable Content Delivery Networks (CDNs). These challenges are further exacerbated by growing demand for live events, such as sports, where low-latency streaming and efficient management of unpredictable traffic surges are crucial.
However, the adoption and management of CDNs bring their own set of challenges, ranging from limited in-house expertise to orchestrating changes without disruption, integrating with existing systems and more.
Recent updates to Varnish Enterprise 6 include the new Massive Storage Engine (MSE 4) and updated Traffic Router, which
aim to address these emerging challenges. Major telcos and broadcasters worldwide have swiftly adopted these enhancements to optimize and augment their content delivery, from origin to the edge.
“Skyrocketing demand for more content and digital experiences has placed an immense burden on broadcasters and telcos, who require efficient distribution without extensive new hardware investments,” said Adrian Herrera, CMO, Varnish Software. “Optimizing through software is the answer. Varnish Enterprise 6 is constantly evolving to enable more efficient distribution, reduced complexity and futureproof operations with greater control over content.”
Many leading broadcasters and telcos rely on Varnish Software
to support their migration from linear to streaming distribution. RTÉ, for example, uses Varnish Enterprise 6 to power flexible live and on-demand video streaming, while future-proofing its HD and potential UHD delivery capabilities.
Since deploying Varnish, RTÉ has been able to minimize backend storage reads and maximize cache efficiency, allowing it to support greater traffic volumes at higher bitrates without additional hardware. In addition, RTÉ has found many unique creative use cases for Varnish Enterprise 6, which have helped it rapidly implement new features and capabilities. For example, RTÉ is using Varnish Configuration Language (VCL) to integrate dynamic ad insertion, customizing responses based on client requests.
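The article does not detail RTÉ's actual configuration, but the general pattern of customizing responses per client in VCL can be sketched as follows. This is a purely illustrative, minimal example; the `X-Ad-Profile` header name and the device classification are hypothetical, not taken from any real deployment:

```vcl
vcl 4.1;

sub vcl_recv {
    # Hypothetical sketch: classify the client so an ad decisioning
    # backend can select a suitable ad variant for this request.
    if (req.http.User-Agent ~ "(?i)mobile") {
        set req.http.X-Ad-Profile = "mobile";
    } else {
        set req.http.X-Ad-Profile = "desktop";
    }
}

sub vcl_hash {
    # Cache the mobile and desktop renditions as separate objects,
    # so customized responses can still be served from cache.
    hash_data(req.http.X-Ad-Profile);
}
```

The key design point is the `vcl_hash` step: by adding the profile to the cache key, the edge can vary responses per client class without falling back to uncached origin fetches.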
Realty company PropNex unveils new 4K NDI studio delivered by Ideal Systems
Ideal Systems, one of Asia's major systems integration companies for broadcast, cloud and professional audiovisual, recently announced that it has designed, built and delivered brand-new 4K, NDI®-based corporate production studios with live streaming features for PropNex Limited, a leading real estate agency in Singapore.
At over 1,700 sq ft, the extensive studios, located at PropNex's head office in Singapore's HDB Hub, contain a fully featured production control room and a large chroma key green screen set for virtual productions. All the studio cameras are 4K native NDI, and all the networking and production systems from BirdDog, Kiloview and Vizrt run on the latest NDI 5 technology standard.
The new PropNex studio is built in what was formerly a large office space. It will give PropNex unprecedented communications capability to produce and live-stream high-quality 4K professional video content and to provide a content library to its customers and over 12,000 sales professionals.
PropNex has worked to ensure their clients have access to information and research content that can help them in their property investment journey. Tapping into the potential of the fast-growing digital media space, they see the high-quality video content created in their new PropNex Studio as a natural extension of their customer-driven business as they continually seek to better serve, educate and engage their customers.
Due to the wide audience reach of online video platforms, many corporations are choosing to build their own professional TV-grade studios to create content and communicate directly with their customers via social media and streaming to apps. For the PropNex studio, Ideal Systems based the whole video production architecture on next-generation NDI® infrastructure, with zero legacy SDI equipment or cabling used in the entire facility. This next-generation TV production system supports end-to-end 4K over IP, from camera through production to live streaming, delivered securely at up to 4K to the viewer.
By utilizing NDI® with the live streaming capability of the TriCaster® Mini 4K, the Ideal Systems design reduces the complexity of the solution architecture for technically complex productions featuring multiple video callers from platforms like Zoom® and Microsoft® Teams in live interviews, whether on the video wall or in virtual space on the chroma set.
CP Cases teams up with Queen’s University and Matrix Polymer for a recyclable bio polymer project
Nowadays, the manufacturing industry faces a huge challenge in reducing its environmental impact, in a world that increasingly strives for sustainability. The UK company CP Cases, renowned for designing and manufacturing protective equipment cases and containers, has entered a partnership with Matrix Polymer, experts in rotomoulding materials, and Queen's University, a renowned research institution, to launch a study project funded by Innovate UK aimed at introducing a field-tested recyclable bio polymer.
This project focuses on shaking up rotational moulding by integrating Matrix Polymer's state-of-the-art materials, with sustainability as its guiding principle.
A Shared Vision
The collaboration between CP Cases, Matrix Polymer, and Queen’s University is a testament to their shared commitment to sustainable innovation. These industry leaders have come together under the umbrella of Innovate UK’s initiative, aiming to introduce the power of recyclable bio polymers into the rotational moulding process, thus paving the way for a more eco-friendly manufacturing approach.
Matrix Polymer’s Pioneering Materials
Matrix Polymer brings to the table its expertise in raw materials, as a leading supplier to the rotational moulding industry. With a focus on reducing carbon emissions and plastic waste,
these materials align seamlessly with the project's core objective of sustainability. Matrix Polymer's mission of bringing fresh ideas and exciting new materials to the market, backed by strong technical roots, is a cornerstone of this collaboration.
Queen’s University’s Research
The research capability of Queen's University plays an integral role in this collaborative endeavour. With a deep understanding of material science and manufacturing processes, the university is providing critical insights and technical expertise to validate the feasibility of creating recyclable bio polymers for introduction into rotational moulding. Their involvement ensures that the project's outcomes are grounded in scientific rigour.
CP Cases’ Legacy of Innovation
CP Cases specialises in crafting high-quality protective solutions for a wide array of industries. Leveraging decades of experience, they design and
manufacture custom cases, containers, and enclosures that safeguard valuable equipment and delicate instruments from even the harshest conditions. Their commitment to innovation and precision ensures that their solutions meet and exceed the stringent demands of sectors including defence, aerospace and technology. That dedication to sustainability, and to pushing the boundaries of rotational moulding, makes them an invaluable contributor to the project.
The project’s success hinges on rigorous testing and validation, a process guided by the combined expertise of CP Cases, Matrix Polymer, and Queen’s University. Through meticulous examination of the materials’ performance within rotational moulding processes, the collaboration aims to ensure a seamless integration that meets industry standards and requirements.
Gravity Media Australia allies with JAM TV to deliver coverage of the 2023 AFLW season
Gravity Media and JAM TV recently confirmed the renewal of their long-standing partnership to broadcast the 2023 AFLW season. This will be the seventh AFLW season delivered by Gravity Media's broadcast facilities in collaboration with JAM TV, which produces the AFLW television coverage for Fox Footy and Kayo.
Gravity Media's latest outside broadcast units, satellite and uplink facilities and crews will be working closely with JAM TV's production team as they criss-cross the country over the coming three months. The Australian company will be involved in the coverage of 47 games this AFLW season. In addition to providing broadcast technologies for JAM TV, Gravity Media will also produce the big-screen coverage at the stadiums.
Cos Cardone – CEO, JAM TV:
“Gravity Media has been our AFLW technical provider since the very first game in the very first year of AFLW. It’s been an outstanding collaboration which has delivered a seamless and successful coverage to our broadcast partner Fox Sports every year. We look forward to the launch of another AFLW season in September which together with JAM TV’s finals coverage of the VFL, SANFL and WAFL competitions makes for a big and exciting month ahead for our talented football production team at JAM TV.”
Marcus Doherty, Account Executive – Media Services and Facilities, Gravity Media:
“We are thrilled to again be working with JAM TV on AFLW. We have a long partnership with JAM
TV working on major events and look forward to again being a part of such an exciting competition and delivering JAM TV’s Fox Sports and Kayo coverage of the AFLW. We are also pleased to continue our partnership with JAM TV on its production of VFL coverage.”
Kahleah Webb, Head of Operations, Gravity Media:
“Gravity Media will use nine outside broadcast trucks, including an SNG fleet, and a total team of 800 crew across the country to deliver a total of 130 hours of AFLW coverage from 17 venues around Australia over the coming 10 weeks. It is a significant technological and logistical exercise that builds on our successful AFLW broadcast facilities provision to JAM TV over the past seven years.”
Cellcom and Novelsat to run a pilot delivering real-time multi-camera immersive experiences to in-stadium fans' smartphones
The alliance between Cellcom, an Israeli cellular provider, and Novelsat, a company specialized in content connectivity, aims to leverage 5G technology to deliver dazzling multi-camera perspectives to fans at live events. The collaboration will introduce an immersive real-time multi-camera viewing experience for in-stadium fans through the integration of Novelsat's edge-based media delivery solution within a stadium equipped with Cellcom's 5G network.
This way, video feeds from various cameras positioned throughout the stadium will be seamlessly processed by Novelsat's solution and promptly distributed to fans' smartphones over Cellcom's public 5G network. Utilizing state-of-the-art edge-computing capabilities, Novelsat's
solution will stream video content with high quality and minimal latency to a dedicated application installed on users' devices. The pilot will demonstrate the advantages of edge-based video delivery, offering enhanced video traffic efficiency and elevated end-user quality of experience (QoE), while showcasing the potential of 5G to deliver customized, engaging experiences that draw fans even closer to the heart of the action.
“5G is a game-changer for sports and stadium business, stimulating fan experiences. The smartphone in the hand of each fan is generating the opportunity for an interactive and unforgettable fan experience,” said Gary Drutin, CEO of Novelsat. “Novelsat’s edge-based media distribution solution
leverages next-generation connectivity to enable new values for connected stadiums. Our pioneering solution ideally addresses the on-site demand for high bandwidth with low latency, offering market-leading video delivery economics for large public venues. Creating a platform that supports the future of interactive stadiums, we empower the vision of the connected fan experience with future-proofed solutions that stay ahead of the game.”
“The 5G technology enables the advancement of highly innovative products and services across various domains. At Cellcom, we are diligently working to enhance Israel’s most powerful and highest quality 5G network, alongside fostering and developing greater and better experiences for our customers,” said Avi Grinman, Cellcom’s CTO and VP of Engineering.
“Stimulating innovation in sports fan experiences aligns perfectly with our mission, and by integrating Novelsat’s expertise in content delivery with our advanced 5G infrastructure, we aim to explore a new dimension of in-stadium entertainment.”
Sencore expands its broadcast solutions portfolio with the acquisition of Adtec Digital's Afiniti platform
Sencore has announced the acquisition of Adtec Digital's Afiniti platform. Afiniti has made significant strides in contribution, news gathering, and REMI applications, and its integration into Sencore's product portfolio marks a strategic move towards expanding the company's offerings to its customers.
Existing Adtec Digital customers can rest assured, as Sencore remains committed to further developing and supporting the Afiniti platform, accompanied by its world-renowned ProCare support services.
Among the most important features of the Afiniti platform are the highest-quality, lowest-latency encode and decode functionalities available on the market today. It supports up to 4K UHD resolutions at 4:2:2 10-bit, with full compatibility for the HEVC, H.264 and MPEG-2 video codecs. The modular design of Afiniti enables seamless customization, offering options for contribution-level encoding and decoding; ASI, IP and SRT I/O; and multiplexing capabilities, all in one platform.
In addition to Afiniti's processing abilities, Sencore has ambitious plans to elevate the platform's functionality by introducing Centra Gateway. This software platform will give users full control over the Afiniti platform, simplifying the management and control of contribution encode and decode workflows like never before.
“We are thrilled about this acquisition and firmly believe that it will deliver immense value to our customers, empowering them with cutting-edge technology and enhanced support to achieve their broadcasting goals,” says Seth VerMulm, Director of Product Management at Sencore.
Zixi achieves AWS Service Delivery designation for AWS Graviton
Zixi, a company specialized in enabling cost-efficient and highly scalable live broadcast-quality video over any IP network or protocol, has recently announced that it has achieved the Amazon Web Services (AWS) Service Delivery designation for AWS Graviton. The designation recognizes that Zixi provides deep technical knowledge, experience, and proven success delivering the SDVP on AWS Graviton-based Amazon Elastic Compute Cloud (Amazon EC2) instances, with deployed support for AWS Graviton2 and AWS Graviton3.
This achievement also validates that Zixi's capabilities in cloud architecture, engineering and cloud-native application development on AWS are helping customers accelerate and scale their adoption of AWS Graviton to realize its price-performance benefits sooner and across more workloads.
AWS Graviton processors are custom ARM-based processors built by AWS to deliver strong performance for cloud workloads running in Amazon EC2. Graviton instances are available in different sizes and configurations to cater to various workload requirements, and the AWS Graviton2 and AWS Graviton3 processors offer significant performance improvements, with higher core counts, faster clock speeds and improved memory subsystems. A key advantage of AWS Graviton instances is cost savings: they typically provide a lower price per compute unit than x86-based instances, allowing users to optimize costs for their workloads.
Special preview IBC
How to navigate in an ocean of progress
This last year has been thrilling: a year full of progress, advances and new solutions that make workflows easier and more efficient, bring higher quality in video and streaming delivery, and provide automation that saves lots of time and money. With all the recent achievements in mind, this year's IBC promises to be exciting as well as a boost for the AV industry.
The fifteen exhibition halls, themed around creation, management and delivery, are complemented by a host of feature areas specially designed to enhance visitors' and professionals' experience. The IBC Conference, along with the guest speakers' sessions, allows a deeper understanding of the industry while facilitating networking at events including the industry-established IBC Awards.
At the heart of IBC’s allure are the industry giants and trailblazers who converge to unveil their latest creations. This preview will shine a spotlight on these prominent companies, from longstanding powerhouses to up-and-coming disruptors. Each of them brings its unique vision and innovations to the forefront,
vying for attention and recognition. This preview will help every professional navigate the labyrinthine exhibition halls and explore their offerings in depth.
Because IBC 2023 is not merely a gathering of industry players; it is a crucible of technical progress. With this special report, we delve into the myriad technical advances that will be unveiled at the event. Whether it's the latest advancements in 4K and 8K broadcasting, immersive audio experiences, cloud-based production workflows, or AI-driven content analysis, our keen eye for detail will dissect and scrutinize each innovation. As a visitor or professional, you'll have the opportunity to speak with the engineers, designers, and thought leaders behind these breakthroughs, gaining an informed and insightful perspective on the implications and potential disruptions these technical advances may bring to the broadcast and audiovisual production landscape. This guide is designed to be your navigation map to accomplish all your goals. Let's begin.
Argosy will be launching its new ULTRA range of 12G solutions at IBC. The ULTRA suite makes UHD/4K accessible to those working with traditional SDI workflows.
This new Argosy suite provides an end-to-end solution comprising a range of 12G products, including the Image 360 ULTRA, Image 1000 ULTRA, and Image 1500 ULTRA coax cables. Primarily used for in-rack wiring and OB trucks, these UHD SDI digital video cables are designed to deliver high performance for UHD/4K SDI video applications.
Additionally, the suite features the coax KORUS ULTRA HD 12G BNCs, tested to meet the requirements of SMPTE ST 2081-1 and ST 2082-1 for 6G and 12G systems, for use with the Argosy Image ULTRA coax cable range.
This new Argosy-branded ULTRA range enables straightforward installation for end-to-end UHD/4K and eliminates the need for expensive infrastructure fibre upgrades.
These UHD SDI coax cables are exclusive to Argosy.
Ateme will be tracking the latest industry trends and helping visitors and customers find the most agile solutions, whether on-prem, cloud-based, or a combination of the two.
The company will be showing exciting demos of its latest state-of-the-art solutions enabling:
- Transformation: transform operations with efficient solutions including Ateme's true cloud-native Cloud DVR and the latest developments of its SaaS offering, Ateme+.
- Monetization: increase revenue through dynamic ad insertion, FAST solutions, and shoppable TV capabilities.
- Viewer Engagement: engage current and attract new viewers with high OTT QoE, fan-retention solutions and enhanced stadium experiences.
- Going Green: curb costs and your business's carbon footprint while improving the viewing experience with solutions including end-to-end Audience-aware Streaming.
During the IBC tradeshow, BCE will showcase its remote production and gamification modules, featuring two
cutting-edge solutions, Holovox and Fan engagement.
Holovox is a remote voice-over solution that enables users to add single- or multi-language voice-over streams to live shows from anywhere with an internet connection. Its cloud-based platform allows users to manage teams, control live audio and video streams, and even incorporate live translations and sign language insertions seamlessly.
Fan engagement enables real-time gamification and monetization of audiences. Integrated with Holovox, it allows sportscasters to entertain viewers by creating interactive games within the live broadcast, leading to a unique fan experience. It also provides opportunities for direct audience monetization through transactions and sponsorships, generates innovative ad inventory, encourages competition among fans for prizes, and fosters closer engagement with the action.
By leveraging these two solutions, BCE streamlines voice-over production, enabling sportscasters to seamlessly transition between different events while ensuring viewer engagement through live gamification, enhancing the overall viewing experience.
Visit BCE at stand 8.B57 during the IBC tradeshow to witness live demonstrations of the Holovox and Fan engagement solutions among others in content management, distribution, and exchange.
Black Box will feature IP-based KVM and AV solutions that empower users with intuitive user experiences and high system connectivity for production control rooms and for both remote and hybrid productions, including live events.
Black Box will show how its latest solutions ensure flexible, scalable, and secure workflows with industry-leading low IP bandwidth usage. Visitors to the Black Box stand will discover new additions to the Emerald® KVM-over-IP system that unite KVM and AV video wall processing under a single management system. They will also experience the company's fresh approach: integrating automation into Emerald, enabling greater system reliability through predictive maintenance, and simplifying workflows through synchronization of user workstations.
Black Box will also demonstrate the Emerald DESKVUE solution and Emerald AV WALL.
The company will highlight IP-based and remote production intercom solutions and announce new features for the Arcadia Central Station.
Clear-Com will showcase the latest advances in IP-based intercom for a wide variety of applications. The company will highlight innovative features of its flagship Eclipse® HX Digital Matrix Intercom System, including Dynam-EC™ real-time production software, industry-leading role-based workflows, and the new 2X10 Touch™ desktop touchscreen panel. The award-winning IP-based Arcadia® Central Station will be available for demonstration, including the HXII-DPL Powerline device, an IP interface that delivers
power and digital audio from Arcadia to HelixNet® Digital Network systems, as well as highly anticipated new features to be revealed at the show.
Dynam-EC is an intuitive software tool that allows operators to quickly switch all audio inputs and outputs, audio mapping, IFBs and partylines at the touch of a button.
New features introduced in EHX 13.1, including role-based workflows and support for NMOS IS-04 and IS-05, address many of the needs of broadcast applications for large-scale events such as the World Cup and other global sporting events.
A popular choice for flypacks and OB vans, Arcadia Central Station brings together HelixNet®, FreeSpeak®, Clear-Com Encore®, other 2W/4W endpoints, and third-party Dante devices in a single, integrated system. Arcadia offers license-based scalability that allows it to meet numerous production needs, with support for over 100 beltpacks and up to 128 IP ports, and additional upgrades available in the future. The new HXII-DPL Powerline device connects to any existing Arcadia system via its own network port to provide Powerline connectivity via 3-pin XLR anywhere on the network, opening a world of digital audio connectivity to HelixNet users.
CueScript will be showcasing a range of tools that help broadcasters produce content in the field more easily. These include its recently released cloud-based service, CueTALK Cloud; SayiT, the first voice-activated solution for its CueiT prompting software; and OnTime, a WiFi-enabled clock device that can be easily configured to the local time zone.
CueScript will also demonstrate its various PTZ teleprompter solutions, along with a new custom solution designed specifically for Sony Electronics' FR7 PTZ camera.
CueTALK Cloud is designed to allow CueScript's CueiT prompting software and CueTALK-enabled devices, which feature the latest in IP connectivity, to communicate over WiFi. It makes controllers and prompt devices accessible via the cloud using a standard public internet connection.
CueScript will also be highlighting SayiT, the first voice-activated solution for its CueiT prompting software. The SayiT software application receives a talent microphone input and allows users of the CueiT teleprompting software to have the script scroll automatically in accordance with what they are saying, matching what is displayed on the output of the teleprompter and eliminating the need to scroll the prompter script manually.
OnTime, a WiFi-enabled clock device that can be easily configured to the local time zone, will also be a highlight at IBC. While it is a small accessory product, it is a great tool for broadcast teams that require an accurate timing device on the road, especially those traveling to multiple locations.
At IBC 2023, Dalet will showcase its new Dalet InStream hyper-scalable multichannel IP ingest solution, a significant expansion of Dalet Brio's capture capabilities, and the recently launched Dalet Cut web-based media editor. These powerful cloud-native SaaS solutions help customers achieve the operational efficiencies and cost savings they need. For the first time, end-to-end cloud workflows can run seamlessly across news operations powered by Dalet Pyramid and the wide range of Dalet Flex media workflows.
IBC 2023 attendees can book a demo to see live the benefits and cost savings that Dalet's full range of solutions brings to newsrooms and media workflows. This includes its new Dalet InStream and Dalet Cut innovations, the next-generation Dalet Flex platform and Dalet Pyramid modules, Dalet AmberFin high-quality media transcoding and processing, as well as the Dalet Brio high-density ingest and playout application.
Introducing Dalet Cut
Dalet Cut enables live web-based editing from anywhere with native access to all assets, including clips, sequences, projects and graphics, even with limited bandwidth.
Dalet will also showcase Dalet Cut’s news production capabilities within Dalet Pyramid:
- Scoop news and create multimedia stories from anywhere using cloud-native proxy editing along with intuitive story creation tools.
- Assemble and distribute compelling news for any platform quickly and easily in a single browser tab.
- Quickly generate multiple versions of each news story for social networks at any point in the workﬂow, from ingest to delivery.
Dejero will introduce its latest EnGo mobile video transmitter at IBC 2023. Visitors to the stand will learn that the new EnGo 3s not only carries the same powerful features as the EnGo 3, including native 5G modems and built-in GateWay Mode for wireless broadband internet connectivity, but also offers 12G-SDI and HDMI connectors.
The Dejero staff on the stand will explain to attendees that 12G-SDI delivers eight times the bandwidth of HD-SDI, enabling users to handle high-frame-rate and live 4K/UHD signals over a single cable.
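That "eight times" figure follows directly from the nominal SDI line rates defined in the SMPTE interface standards, as this small Python sketch shows:

```python
# Nominal SDI interface data rates in Gb/s (SMPTE ST 292 / ST 424 / ST 2081 / ST 2082).
SDI_RATES_GBPS = {
    "HD-SDI": 1.485,
    "3G-SDI": 2.970,
    "6G-SDI": 5.940,
    "12G-SDI": 11.880,
}

def bandwidth_ratio(interface: str, baseline: str = "HD-SDI") -> float:
    """Return how many times more bandwidth `interface` carries than `baseline`."""
    return SDI_RATES_GBPS[interface] / SDI_RATES_GBPS[baseline]

print(bandwidth_ratio("12G-SDI"))  # 8.0
```

The extra capacity is what allows a single 12G link to carry a 2160p signal that would otherwise need four 3G-SDI cables in quad-link mode.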
Similar to the existing EnGo 3 and 3x mobile transmitters, the EnGo 3s features GateWay Mode, which provides wireless broadband internet connectivity in the field so mobile teams can reliably, securely and quickly transfer large files, access MAM and newsroom systems, and publish content to social media. GateWay Mode also provides general internet access for field research, access to cloud-based services, and a high-bandwidth access point for devices.
As IBC 2023 visitors will be able to see and confirm, the EnGo 3s also streamlines communication and workflow between the field and the station or post-production facility. This is essential for mobile news teams working to tight deadlines, as well as film crews on remote sets.
Dejero’s Smart Blending Technology powers the EnGo 3 mobile transmitter range by simultaneously blending together multiple wired (broadband/ﬁber) and wireless (3G/4G/5G, Wi-Fi, satellite) connectivity from multiple providers to provide a ‘network of networks’.
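As a rough, purely illustrative sketch (not Dejero's proprietary Smart Blending algorithm), "blending" connections can be pictured as splitting a payload across links in proportion to each link's estimated throughput; the link names and rates below are invented for the example:

```python
# Toy illustration of blending several network links: allocate a payload across
# links in proportion to each link's estimated throughput. This is NOT Dejero's
# actual Smart Blending Technology, only a sketch of the general idea.
def blend(payload_mbit: float, link_throughput_mbps: dict) -> dict:
    """Allocate payload megabits across links proportionally to capacity."""
    total = sum(link_throughput_mbps.values())
    return {name: payload_mbit * cap / total
            for name, cap in link_throughput_mbps.items()}

links = {"5G-A": 40.0, "5G-B": 25.0, "Wi-Fi": 20.0, "satellite": 15.0}
shares = blend(100.0, links)
print(shares)  # {'5G-A': 40.0, '5G-B': 25.0, 'Wi-Fi': 20.0, 'satellite': 15.0}
```

A production implementation would additionally re-measure each link continuously and re-order packets at the receiver, which is where the real engineering lies.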
Densitron, a company specializing in displays, touch-based interfaces and ODM+ outsourced product design and development for manufacturers, will showcase its latest HMI and control system innovations for broadcast manufacturers and systems integrators at IBC 2023. In addition, the company will introduce a new technology, Jogdial, a vital new component in its ongoing efforts to enrich the tactile experience of future sets.
Densitron will also present to IBC Show attendees its complete suite of ready-to-use control panels and the IDS Control System, powered by ARM or X86 computer processors.
The company will shine a particular spotlight on its tactile and haptics innovations, including the new 2RU Haptics product, which delivers a ‘building block’ HMI that can be incorporated into vendors’ own designs, together with the Tactile HMI Evaluation Kit.
At IBC 2023, Densitron will introduce the new Jogdial encoder for use with its 2RU rack-mounted range, using the IBC Show as an official launch platform.
Densitron will also highlight its IDS Control System, which includes a growing product ecosystem (with 100 new pieces of broadcast equipment recently becoming compliant) and cloud deployment option.
At this year’s IBC, Edgio will demonstrate how media companies can create a complete streaming solution built on Uplynk’s components that deliver great viewing quality and reliability while optimizing monetization through advanced advertising.
Visitors will be able to learn about Edgio's tried-and-tested solution, which provides a quick time-to-market and improves ROI.
Edgio will also share details of its latest partnerships and integrations, which see Uplynk's offering expand to solve all the complexities of video streaming today, from ingestion to delivery and analytics.
At this year’s event, EVS is set to showcase its comprehensive and integrated suite of solutions, oﬀering a creative and cohesive working
environment from production to monetization.
EVS will highlight the beneﬁts of its VIA platform, which provides common and ﬂexible technology catering to the speciﬁc needs of various stakeholders including broadcasters, production teams, content owners, and digital marketers.
For production purposes, the LiveCeption solution leverages the power of the platform to enable creators to deliver captivating replays whether
they are on-site or working remotely. The MediaCeption solution utilizes the platform to provide web-based browsing and editing tools that enhance collaboration among production teams. On the distribution side, EVS will be showing its MediaHub solution, which allows content owners to monetize their assets through cloud-based operations and facilitates social media engagement.
Furthermore, EVS will unveil the latest version of its Xeebra multi-camera review system, designed to facilitate remote workﬂows even in low-bandwidth environments. This enhanced version caters to the demands of centralized video operation rooms (VOR), ensuring swift and reliable operations that reinforce collaboration regardless of geographical constraints.
Last but not least, the full line of EVS's Neuron products will be presented, including the latest addition, Neuron View, a low-latency live production multiviewer with high-quality scaling. Specifically designed to support the needs of live production teams, one View card can support two UHD outputs or up to eight full-HD outputs in a fully customizable layout.
At IBC 2023, farmerswife —a software provider for resource scheduling, project management, and team collaboration in the media industry— will be showcasing version 7.0 of the farmerswife tool for the first time, along with the new features of the Cirkus project management solution.
To make workflow management and collaboration easier, version 7.0 of the farmerswife scheduling, management and reporting tool incorporates several new features, including:
Dark mode: For those working late at night, in a low-light grading suite, or simply wanting to reduce eye strain, farmerswife v7.0 offers its highly anticipated new dark theme.
QR Codes: The new “QR Code” element adds an extra layer of information to reports, making it easy for clients or colleagues to quickly access additional details or resources, such as a project overview, a budget breakdown, or a promotional video.
Sync Mode: The new “Sync Mode” option allows users to control the direction in which data is synced —from farmerswife to Cirkus, from Cirkus to farmerswife, or both ways. Visitors will be able to see how the tool can be configured independently for Projects, Bookings, Time Reports and/or Personnel Bookings.
Advanced Project Search: The latest release makes it easier for users to ﬁnd what they need. The dropdown “Filter From” allows users to ﬁlter by Container, Project, Binders and Projects & Binders. Users can also “Edit View” or activate the “Load View” option to load a saved view from another user, which is useful for collaboration with remote team members.
As IBC 2023 attendees will be able to see, farmerswife and Cirkus can work together or as standalone products to address the specific needs of media organisations, with farmerswife providing in-house scheduling, resource management and financial reporting, and Cirkus sharing information with distributed teams internally and externally.
GatesAir, a subsidiary of Thomson Broadcast dedicated to wireless content delivery, will showcase new low-power FM and DAB radio solutions from its two flagship transmission product brands at IBC 2023, aligned with global radio broadcast trends. Highlights include an expansion of the Flexiva GX transmitter series to serve more low-to-medium FM power levels, and the European debut of an ultra-compact, second-generation Maxiva family of DAB/DAB+ radio transmission solutions that supports a higher density of digital radio services in a single chassis.
At its stand, GatesAir will show options for the Flexiva GX that include GPS receivers for SFN support, as well as GatesAir's Intraplex IP Link 100e module. The latter integrates within Flexiva GX transmitters, enabling direct receipt of contributed FM content without requiring an external codec, further reducing rack-space requirements inside RF facilities with limited open real estate.
Grass Valley will showcase at its stand an ecosystem that unifies the traditional hardware broadcasters have long relied upon with next-gen software-driven solutions, such as its cloud-native AMPP (Agile Media Processing Platform) production and distribution ecosystem.
As a fully cloud-native solution, AMPP is a highly secure, reliable and open ecosystem architected with standard REST APIs throughout, integrating applications from 80 third-party partner companies as well as over 100 cloud-native applications of its own. Standard REST API integration points for every part of the solution allow tight integration with both customer and third-party control and deployment solutions, while comprehensive signal format support means AMPP can be inserted into the customer's signal path anywhere it is required, from simple graphics overlays through to full workflow chains.
Grass Valley will be demonstrating its video production switchers which are widely installed in all sorts of Tier-1 production facilities, including broadcast networks, mobile OB units and remote sites, as well as studios run by corporations, universities, and houses of worship.
Another Grass Valley highlight at IBC 2023 will be the LDX 100 series broadcast camera platform, which exemplifies the flexibility and upgradeability of key products in the company's portfolio. This camera system lets customers choose the camera/XCU configuration (and IP and SDI functionality) of their choice.
The Mexican company Igson, specializing in automation and broadcast IP and TV solutions, will exhibit its innovations at IBC 2023 with the aim of bringing its products and solutions to a European and global audience.
Igson products that will be on display:
• Igson MULTIPLAY with SCTE 35/104 insertion.
• Igson REPLAY MASTER: production of sporting events.
• Igson SIGNAL LOGGER: the company's exclusive development for the detection of SCTE 35/104 or DTMF impulses, closed captions, etc.
• Igson PROOFREC with baseband and IP signal recording.
• The new version of Igson GRAPHICS with unique and outstanding functionalities for live production.
In addition to the new products, the company’s extensive range of solutions will also be exhibited: hybrid playout systems (SDI/IP), Master Control automation, baseband and IP signal recorders, DPI event reception and insertion systems (SCTE-35/104) for content blocking in cable and satellite distribution systems, or in digital platforms and FAST Channels, signal detection and recording systems, plotters and titlers, among other solutions.
INFiLED, a manufacturer of LED displays, will launch a brand-new LED solution at IBC: its Infinite Colors technology.
While traditional LED video displays use only three emitters (red, green and blue), INFiLED is introducing a fourth emitter in a custom LED package that widens the color spectrum seen by professional cameras, demonstrated with an LED video ceiling. IBC visitors will learn how Infinite Colors improves a variety of LED applications by allowing full variations in tone, saturation and color appearance in white light and custom colors, all featured in one display.
During IBC 2023, INFiLED will present the Studio AR Series, where all attendees can observe the first application of Infinite Colors technology and experience how creators can seamlessly integrate attractive, dynamic lighting effects that enhance the overall visual quality of virtual productions.
INFiLED will also be showcasing its flagship WP 1.2 as well as one of its latest innovations, M2 technology, which has been seamlessly integrated into the STUDIO DB and STUDIO Xmk2 series. One of the standout features of M2 is its ability to enhance background performance through high color accuracy.
At IBC 2023, Kiloview will demonstrate its Kiloview ecosystem for broadcasting and media, showcasing its latest solutions for video contribution, production, transmission, and distribution, all based on IP.
The flexibility and interoperability of the IP-based ecosystem will be emphasized by showcasing SRT solutions, such as the newly added SRT coverage of the N50/N60, the rack-mountable encoder RE-3, the 5G bonding encoder P3 and the dual-channel encoder E3, as well as NDI solutions, including the CUBE series and LinkDeck.
Enhanced IP Coverage of Kiloview Ecosystem
With constant upgrading of its IP-based hardware and software solutions, the Kiloview ecosystem aims to let professionals be more versatile in various workflows with less need for hardware and staffing. Attendees will be able to check out Kiloview's latest update: the IP coverage of the N50/N60 will expand to include SRT, RTMP, RTSP, UDP and more protocols, in addition to the existing NDI High Bandwidth and NDI|HX2/3 support. Another notable update for visitors to review is the NDI High Bandwidth support of the MG300V2, as well as the following:
Kiloview P3 – 5G Bonding Video Encoder
The P3 is a 5G bonding HEVC video encoder that integrates up to four 5G connections, Wi-Fi and Ethernet, leveraging Kiloview's Kilolink bonding server to optimise signals across different internet connections.
4K HDMI & 3G-SDI Video Encoder
This versatile video encoder, designed to provide high-quality video streaming for a wide range of applications, will be available for interested professionals at Kiloview's IBC booth. It supports various input sources, including HDMI and SDI, ensuring compatibility with different cameras and devices. Broadcasters may be interested in its powerful encoding capabilities, which let users work with the video sources from both inputs in multiple ways, either encoding them simultaneously or mixing them with PiP or PbP. The encoder efficiently compresses video streams with H.265/HEVC technology, enabling low-latency transmission without compromising video quality, and is equipped with multiple streaming protocols such as RTMP, RTSP, SRT and NDI|HX3.
CUBE R1 - A solution for reliable NDI recording
Kiloview CUBE R1 is an NDI recorder system designed to capture and store high-quality video content from multiple NDI video sources.
CUBE X1 - A solution for NDI multiplexed distribution
Another solution present at IBC 2023 will be the Kiloview CUBE X1, designed for scheduling, routing, switching, distribution and management of NDI signals. It accommodates up to 16 NDI inputs and 32 NDI outputs.
Lawo is set to unveil its latest innovations at IBC 2023, and visitors to its booth will be the first witnesses. Among Lawo's range of products, IBC visitors will find this year several anticipated developments, such as:
HOME Apps – Server-Based Processing
Lawo's HOME Apps will take center stage at IBC 2023. These applications, including HOME Multiviewer, HOME UDX Converter, HOME Stream Transcoder, and HOME Graphic Inserter, will demonstrate the power of a flexible microservice architecture. They deliver high processing capabilities with reduced compute power and energy consumption, allowing users to adapt swiftly to changing requirements and budget conditions. Lawo's HOME Apps support SMPTE ST 2110, SRT, JPEG XS and NDI for increasingly mixed technology environments, and can easily adapt to new format requirements as these become relevant.
Enhanced .edge HyperDensity SDI/IP conversion and routing platform
Attendees at Lawo's booth at IBC 2023 will learn about the .edge HyperDensity SDI/IP Conversion and Routing Platform. Through licensable options like proxy generation and JPEG XS compression, the solution addresses bandwidth constraints, streamlining IP pipelines and optimizing workflows.
mc²/A__UHD Core/Power Core Platform
To empower live productions, Lawo will showcase the V10.8 software for the mc²/A__UHD Core/Power Core platform. With features like flexible bus routing, an expanded AUX count (up to 256 busses), QSC Q-Sys proxy integration in HOME, Remote Show Control via OSC, and more, visitors at IBC will witness how Lawo's platform sets a new benchmark for live performance capabilities. Furthermore, NMOS support for the mc² Gateserver bolsters device compatibility, offering seamless integration into Lawo's ecosystem.
With its integration as a proxy in HOME 1.8, support for MADI front ports, and dynamic recognition of Dante and/or MADI SRC cards, the Power Core ensures a seamless audio experience, and Lawo is committed to showing it to customers and visitors at IBC 2023.
At IBC 2023, LiveU will showcase its latest live video workflows and collaborations in contribution, production and distribution across sports, news and other live productions, bringing greater efficiencies across workflows while shortening the time to air. These include its flexible remote and on-site production solutions, which use IP bonding and cloud workflows to replace traditional production hurdles with easy-to-use, high-quality and low-cost solutions.
A core element of the EcoSystem – its openness and interoperability – will
be highlighted on the stand. LiveU’s latest integrations will be demonstrated with other leading technologies encompassing 5G, cloud and AI. Real-time use cases will be presented across the entire video production chain.
Show highlights include:
LiveU Studio: ﬁrst Cloud IP Live video production service to natively support LRT™
The fully cloud-native IP live video production service, LiveU Studio, is the ﬁrst to natively support LRT™.
LiveU Ingest: ingest Cloud solution for automatic recording and story metadata tagging of Live video
It’s an automatic recording and story metadata tagging solution for cloud and hybrid production workﬂows: LiveU’s automated workﬂow solution accelerates the time-to-air and conversion of assets into digital media, increasing production eﬃciency.
LiveU Matrix: next-gen IP cloud video distribution service
LiveU will show how plugging into the LiveU Matrix, its large global private video distribution system, can boost content monetization and give instant access to a world of creative possibilities.
At this year's IBC, Magewell will be showing its new space- and power-efficient Eco Capture HDMI 4K Plus M.2 and Eco Capture 12G SDI 4K Plus M.2 cards, which capture 4K video at 60 frames per second. But the main course for Magewell will be the showcase of its Control Hub as part of IP workflow and streaming demonstrations at its IBC 2023 stand; at
Magewell’s booth, visitors will verify that Control Hub provides centralized conﬁguration and control of multiple Magewell streaming and IP conversion solutions.
Administrators, IT staﬀ and systems integrators can easily manage encoders and decoders across multiple locations through an intuitive, browser-based interface. An HTTP-based API is also available for third-party integration.
As Magewell’s team will explain to visitors and interested professionals, the Control Hub software can be deployed on-premises or in the cloud and supports Magewell hardware products including Ultra Stream and Ultra Encode live media encoders; Pro Convert NDI encoders and decoders; the Pro Convert Audio DX IP audio converter; and the USB Fusion capture and mixing device.
Attendees will be able to learn first-hand how users can remotely configure device parameters, monitor device status, trigger operational functions – such as starting or stopping encoding – and perform batch firmware upgrades across multiple units of the same model.
Marshall Electronics zooms in on its POV offerings at IBC 2023, where visitors will be able to learn all about them and discover four new NDI|HX3 models as well as recent updates to the line.
Marshall's team will be on hand to answer attendees' questions, and the new Marshall POV cameras will be on display at the stand: the CV570/CV574 Miniature Cameras and CV370/CV374 Compact Cameras all feature low-latency NDI|HX3 streaming as well as standard IP (HEVC) encoding with SRT. The cameras feature H.264/H.265 and other common streaming codecs along with a simultaneous HDMI output for traditional workflows. All will be available for AV professionals and broadcasters to see.
Interested visitors will learn that the CV570 miniature and CV370 compact NDI|HX3 cameras both contain a new-technology Sony sensor with larger pixels and a square pixel array that Marshall is implementing across all its next-generation POV cameras. These will offer resolutions of up to 1920x1080p (progressive), 1920x1080i (interlaced) and 1280x720p. All four cameras provide NDI|HX3 and NDI|HX2 as well as standard IP with SRT and streaming codecs such as H.264/H.265. A simultaneous HDMI output is available for local monitoring or low-latency transmission.
The CV570 (HD) and CV574 (UHD) use a miniature M12 lens mount and are built into an ultra-durable, lightweight body measuring roughly 2 x 2 x 3.5 inches. The CV370 (HD) and CV374 (UHD) use a slightly larger aluminum-alloy body with a CS/C lens mount, allowing a wider range of lens options.
Last but not least, Marshall will present at IBC 2023 its newest NDI|HX3 camera models as well as its recently upgraded POVs, all optimized to feature the latest in sensor technology and high-end processors. In addition, Marshall will bring its new CV420Ne, an NDI|HX3 version of its ultra-wide 100-degree angle-of-view streaming POV camera, and will also feature its lineup of professional broadcast monitors throughout the show, including its now-shipping V-702W-12G professional broadcast monitor.
At IBC 2023, Matrox will feature the critical technology powering broadcast workﬂows for live, tier-1 programming, broadcast television and specialty content production.
Broadcasters, integrators, and developers that visit Matrox’s stand will discover a range of production solutions:
Introducing Matrox ORIGIN, a framework that redefines live production workflows for broadcast and media developers. Visitors will learn how Matrox ORIGIN offers a native, IT-based approach to television production, providing scalable, low-latency and frame-accurate broadcast operations for tier-1 live television production.
Video Conversion — Monitor, Route, and Convert Media with ConvertIP
Visitors interested in signal management — including two-way, high-density ST 2110-to-HDMI/SDI monitoring and conversion — are invited to the ConvertIP pod. There they will see how they can reduce the cost of ownership, ease switching requirements,
and gain baseband monitoring ﬂexibility and redundancy with Matrox ConvertIP.
As visitors will be able to learn, central to Phabrix's Qx Series demonstrations at this year's IBC is the new QxP portable waveform monitor. Inheriting the flexible architecture of the QxL rasterizer, the QxP, distinguished by its capacity for 12G-SDI and 25GbE UHD IP workflows, offers an integral 3U multi-touch 1920 x 1200 7” LCD screen with V-Mount or Gold-mount battery plates for portability.
Phabrix will also introduce new features for the Qx Series at IBC 2023. It now supports Full Range generation and analysis, allowing for comprehensive testing and evaluation. The Qx Series also offers enhanced waveform analysis capabilities, which Phabrix's team will demonstrate to attendees at the booth: this waveform instrumentation provides the precision necessary for tasks like camera shading and image grading, while retaining real-time operation flexibility and supporting a wide range of SDR/HDR SDI and IP video formats.
Visitors to the stand will also see the latest update to the Rx Series of 2K/3G/HD/SD rasterizers, including a new 6-bar gamut toolset that highlights out-of-gamut areas in the picture window. The instrument offers gamut monitoring meter bars for the YCbCr and RGB color spaces, enabling pixel-level monitoring of both simultaneously.
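To picture what such a gamut toolset flags, here is a minimal, illustrative sketch (not Phabrix's implementation) of a BT.709 YCbCr-to-RGB check: a pixel can be perfectly legal in YCbCr yet convert to an RGB value outside the displayable range, which is exactly what a gamut meter highlights.

```python
# Minimal YCbCr -> RGB gamut check using the BT.709 conversion on normalized
# values (y in [0, 1], cb/cr centered on 0 in [-0.5, 0.5]). Illustrative only.
def ycbcr_to_rgb(y: float, cb: float, cr: float):
    """BT.709 YCbCr to RGB conversion on normalized values."""
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return r, g, b

def out_of_gamut(y: float, cb: float, cr: float, tol: float = 0.0) -> bool:
    """True if the converted RGB pixel falls outside the legal [0, 1] range."""
    return any(c < -tol or c > 1.0 + tol for c in ycbcr_to_rgb(y, cb, cr))

print(out_of_gamut(0.5, 0.0, 0.0))  # False: mid grey converts cleanly
print(out_of_gamut(0.9, 0.0, 0.4))  # True: R = 0.9 + 1.5748 * 0.4 > 1.0
```

A real rasterizer performs this per pixel in hardware and paints the offending regions in the picture window rather than returning a boolean.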
Phabrix’s Sx Series of handheld instruments will also be on show, including the Hybrid IP/SDI, Sx TAG, and SxE for advanced SDI physical layer analysis, key products designed for all broadcast engineers and especially for live event set-up.
IBC attendees will be able to see Pixotope's full range of solutions, with a focus on its software platform for end-to-end real-time virtual production; whether used by teams or individual users, Pixotope's platform offers an intuitive UI/UX that requires no technical expertise to operate.
Other Pixotope solutions that will be on display at IBC 2023 are:
Pixotope Tracking - Fly Edition, a first-of-its-kind markerless through-the-lens (TTL) camera tracking solution that eliminates the complex setup and creative constraints imposed by tracking markers, enabling productions and live events to engage audiences with dynamic real-time aerial graphics.
Pixotope Graphics - XR Edition reduces the technical complexities and associated resource costs of extended-reality workflows and environments that use LED volumes, with a range of purpose-built tools that simplify setup and operation, accelerating time to market.
Pixotope Pocket, a new mobile application that enables students to use their smart devices to produce immersive content with virtual production, will be available for IBC2023 attendees to try for the ﬁrst time.
Qvest will present a comprehensively expanded service portfolio with new practices and products for future-proof customer solutions in the media and entertainment sector. Additionally, the company will provide updates and insights
into the international project business of the global Qvest group.
Under this year's IBC motto, “Expect more”, the focus will be on the expanded range of practices, as well as new digital products with which Qvest addresses the requirements of international customers in the field of media and broadcasting and beyond.
From September 15 to 18, members of the global Qvest team will be on site in Hall 10 at booth C31 to inform visitors about all the latest news. In dialog with Qvest product experts at demo pods, visitors can experience, among other things, the digital products makalu Cloud Playout and
ClipBox Studio Automation, which provide greater simplicity, productivity and scalability for broadcasters and publishers.
As part of its presence at the show this year, Qvest will offer a dedicated Product Experience Zone, where visitors can get insights into Qvest products during compact live presentations offered several times a day. qibb, the integration platform for media workflows and a strategic spin-off of Qvest, will also be presented, focusing on automated workflows with generative artificial intelligence for media applications and integrating popular AI tools such as ChatGPT.
In addition, visitors can get comprehensive information about the Qvest fields of expertise, which are now clustered into practices. The Qvest practice specialists will demonstrate how the company's knowledge in areas such as Foresight & Innovation, Digital Media Supply Chain, OTT, Data & Analytics, Digital Product Development or Systems Integration enables the implementation of future-proof technology projects. By using state-of-the-art technologies and individual consulting processes, the global Qvest team can offer specially tailored solutions for every need.
Trade show visitors and media representatives will also have the opportunity to arrange exclusive meetings with Qvest. Members of the management of the worldwide Qvest company group, as well as practice and product experts, will be on site on all days of the IBC.
With the integration of Simplylive, Riedel will be showcasing its live video production tools, including the All-In-One Production Suite, offering ViBox for multi-camera productions with Slomo and RefBox, as well as RiMotion replay, the RiCapture UHD recorder, the Venue Gateway and the Web Multiviewer, enabling simple, scalable production and replay for anyone, anywhere. Plus, new additions to the Simplylive portfolio will be unveiled, the company promises.
Riedel will be demonstrating, for all attendees, its MediorNet IP solutions for video and audio processing and distribution with innovative MuoN SFP technology, which will exhibit an abundance of processing: encoding/decoding, ST 2110 gateways, multiviewers, and newly added HDR conversion between any SDR/HDR formats, including S-Log3, PQ and HLG. Visitors will also discover the MicroN UHD media distribution and processing platform, which adds bandwidth, I/O, higher resolutions and processing power to the MediorNet TDM platform. Plus, Riedel will unveil a hot new addition to its software-defined hardware portfolio on September 15th.
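Conversions between SDR/HDR formats such as PQ, HLG and S-Log3 ultimately come down to chaining transfer functions. As a worked example, the PQ (SMPTE ST 2084) EOTF maps a normalized code value to absolute luminance; the constants below are those defined in the standard.

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value E in [0, 1] to
# absolute luminance in cd/m^2 (nits). An SDR/HDR converter chains transfer
# functions like this one; only the PQ leg is shown here.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(e: float) -> float:
    """Luminance in nits for a normalized PQ signal value e."""
    ep = e ** (1 / M2)
    return 10000.0 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)

print(round(pq_eotf(1.0)))  # 10000: full-scale PQ is 10,000 nits
print(pq_eotf(0.0))         # 0.0
```

The non-linear curve concentrates code values where the eye is most sensitive, which is why PQ can span 0 to 10,000 nits in a 10- or 12-bit signal.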
On the comms side, Riedel will present the Bolero wireless intercom in both the DECT 1.9 GHz and the new 2.4 GHz versions, including capabilities in combined networks. The Artist-1024 will also be on display for all attendees to see, as will the SmartPanel 1200 and 2300 Series multifunctional interfaces.
ROE Visual will showcase its latest products and technologies, highlighting its signiﬁcant presence in the broadcast and ﬁlm market.
With its dedication to pushing boundaries and delivering high-quality LED displays, ROE Visual's presence at IBC Show 2023 aims to make an impact on attendees. From broadcast professionals to filmmakers, the showcase will give all visitors a glimpse into the future of visual storytelling and media production.
As visitors at ROE’s stand will see, the Ruby LED series will take center stage, featuring the Ruby RB1.2, RB1.5, and RB1.9BV2-C panels alongside the Black Marble LED ﬂoor, the BM2.
Making its debut at IBC, the Ruby RB1.2 is a fine-pitch, broadcast-grade HD-LED panel delivering high-performance visuals. The Ruby RB1.9BV2-C, a curved LED panel fully compatible with the regular RB1.9BV2, promises seamless integration for broadcast and virtual production applications.
Designed with cutting-edge LED technology and high-speed components, the RB1.2 and RB1.9BV2-C panels offer true-to-content color representation and unrivaled in-camera performance with high frame rates, high refresh rates and minimal scan lines, and IBC 2023 attendees will be able to check this out for themselves.
In addition, ROE Visual will bring a new prototype to the exhibition: Coral, a fine-pitch COB LED panel creating unparalleled visuals.
With its high visual performance, Coral's Chip-on-Board technology delivers high contrast, a wide color gamut and great color accuracy, providing a high-definition viewing experience. Its energy-efficient common-cathode design saves costs and extends the panel's service life, making it a sustainable choice. Visitors will be able to ask for exclusive sneak peeks at the stand to see it for themselves.
Ross will have a new, solutions-centric booth (Hall 9, Stands A04 and A05) demonstrating end-to-end workflows for a wide variety of video production environments.
Attendees at Ross's stand will be able to experience full solution demonstrations showing the latest advances in:
- Extended Reality (XR)
- News Workﬂows
- LED Production
- Production Graphics
- Camera Motion Systems
- Automated Production
RTS Bosch will be showcasing and demonstrating its products and solutions at IBC 2023, and these are some highlights of what visitors will be able to see at its stand:
Open to all attendees, there will be a demo of RNP implementing NMOS protocols. As RTS Bosch's team will explain, RNP acts as a proxy between RTS OMNEO-based devices and non-RTS NMOS third-party products, such as NMOS explorers, NMOS nodes and NMOS controllers.
At RTS Bosch's stand, visitors can check out RNP's functions:
- IS-04 Discovery and Registration -- Broadcast controllers identify and manage new devices through automated workﬂows.
- IS-05 Device Connection Management -- Broadcast controllers integrate IS-05 devices through a common method without requiring any drivers for stream connection management.
- IS-08 Audio Channel Mapping -- This capability allows users to manage audio channels through a control system in a uniform manner without the need to develop custom drivers for every diﬀerent device.
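The IS-05 connection workflow in the list above can be sketched in a few lines. This is an editorial illustration, not RTS code: the host name, sender ID and transport parameters below are hypothetical placeholders, while the endpoint path and payload shape follow the AMWA IS-05 connection management specification.

```python
import json

# Editorial sketch of an IS-05 connection request that a broadcast
# controller might issue. The path follows the AMWA IS-05 spec; the
# host, sender ID and transport parameters are made-up placeholders.
def staged_connection_request(host, sender_id, dest_ip, dest_port):
    """Build the URL and JSON body of a 'staged' PATCH that connects a
    sender to a destination and activates the change immediately."""
    url = (f"http://{host}/x-nmos/connection/v1.1"
           f"/single/senders/{sender_id}/staged")
    body = {
        "master_enable": True,
        "activation": {"mode": "activate_immediate"},
        "transport_params": [
            {"destination_ip": dest_ip, "destination_port": dest_port}
        ],
    }
    return url, json.dumps(body)

url, body = staged_connection_request(
    "rnp.example.local",
    "a1b2c3d4-0000-1111-2222-333344445555",
    "239.100.1.1", 5004)
print(url)
```

Because every IS-05 device exposes this same staged/activation model, a controller needs no per-vendor driver for stream connection management, which is exactly the point of the demo.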
In addition, RTS intercoms will have a demo showing ODIN's redundancy and glitch-free failover capabilities: in case of a device failure, the system switches over to a backup device or a second network.
The latest ﬁrmware release for the ODIN matrix and the OMNEO Main Station (OMS) features call light support. RTS Bosch will oﬀer a demo of this functionality for diﬀerent verticals (theater, broadcast, live entertainment & industry).
Also, RTS intercoms will be showcasing the new PH+ headset which combines improved audio quality for enhanced speech intelligibility with comfort for extended wear and robustness for tough environments.
RTS will also have on display the new DSPK4 (Digital Speaker Station), an IP-based speaker station with several key enhancements, such as high-quality digital audio and the use of standard Ethernet, making it a flexible fit for any work environment.
New immersive innovations, content creation and broadcast audio solutions, plus an invite-only preview of the future of wireless audio: Sennheiser's stand will display the group's latest advancements.
Audio at its best: at IBC, visitors will see how Sennheiser, Neumann, Dear Reality and Merging Technologies demonstrate state-of-the-art immersive production workflows as well as exciting solutions for audio capture, monitoring and processing. Creators and audio professionals of all levels will be welcome.
Four spaces invite IBC visitors to explore Sennheiser group’s state-of-the-art solutions for today’s content production: (1) the immersive presentation zone, (2) active and passive product islands, (3) a Neumann area and (4) a separate Wireless Multichannel Audio Systems (WMAS) zone.
The immersive zone
In the immersive presentation zone, which is powered by a 5.1.4 Neumann monitor setup with nine KH 150 monitor
loudspeakers and two KH 750 DSP subwoofers, guests will be able to get hands-on with the Dolby Atmos/immersive broadcast workflows facilitated by the products of Merging Technologies, the latest member to join the Sennheiser Group. The company will show its Anubis, Hapi, Pyramix and Ovation solutions integrated into typical broadcast workflows. In addition to its new Dolby Atmos certified monitoring package, Merging Technologies will show a brand-new interpretation solution.
In the same space, visitors will be invited to experience Dear Reality's dearVR SPATIAL CONNECT for Wwise. The software, previewed at IBC for the first time, enables game designers to develop audio directly in the VR or AR gaming environment instead of having to switch between systems. The in-game mixing workflow provides new in-headset control of Wwise audio middleware sessions and aims to improve the workflow for next-generation XR productions.
Other presentations in this area will include AMBEO 2-Channel Spatial Audio for live production, using an Anubis to create an enhanced immersive stereo feed from an immersive audio production source, and information sessions on WMAS, Sennheiser’s
broadband wireless technology.
At IBC 2023, Sony will showcase solutions for the media industry under the theme Creativity Connected. The theme rests on Sony's long-term commitment to customers, built around core technologies that allow them to be ever more creative and efficient.
Visitors will discover Sony's complete ecosystem of solutions, products and services, which reflects four strategic areas of focus that are shaping the media industry:
• Networked Live, a platform enabling the optimization of people, locations, and processing for each and every live production environment, whether on the ground or in the cloud.
• Creators’ Cloud, Sony’s cloud platform dedicated to eﬃcient media production, sharing and distribution.
• Imaging, with color science, ﬂexibility and ease of operation at the heart of Sony’s cameras and lenses, in particular its Cinema Line range, including CineAlta cameras and reference monitors, and the added beneﬁt of SDK-based remote control.
• Virtual Production, for ﬁlm production that combines the Virtual Production Toolset software with Crystal LED displays for high brightness and contrast 3D set background images and rich color reproduction, as well as the VENICE camera series, for soft and delicate depictions, smooth skin tones, and beautiful color reproduction.
Attendees at Telestream's stand will learn how media organizations are enhancing workflows using scalable and cost-effective solutions for Media Production, Media Supply Chain Management, and Media Distribution. In addition, Telestream's team will be demonstrating solutions and platform-as-a-service offerings that bridge on-prem ecosystems with the benefits of working in the cloud.
MEDIA PRODUCTION SOLUTIONS
• Monitor and test video/audio outputs in SDI, IP, and hybrid production environments
• Ensure premium quality content in remote productions
• Easily synchronize resources and generate reference clocks
Live Capture, Ingest and Delivery
• Capture live feeds via SDI, NDI, and IP
• Support SMPTE 2110, SRT, RTMP and Transport Streams
• Create ﬁnal deliverables for PAM systems, social media and more
MEDIA SUPPLY CHAIN MANAGEMENT SOLUTIONS
• Easy color space conversion for live HDR production
• Rapid broadcast preparation for on-demand with DAI
• Support for format conversion from proxy through SD, HD, and UHD
• Efficient captioning from video files as well as auto-subtitling in live workflows
Cloud Media Processing and Workﬂow Orchestration
• Cloud-native execution of Vantage workﬂow orchestration
• Support for hybrid cloud/on-prem configuration of Vantage workflows
• Support for customers' chosen Virtual Private Cloud (AWS, GCP, Azure)
MEDIA DISTRIBUTION SOLUTIONS
• Fully cloud native ﬁle-based media processing for broadcasters and streaming services
• File-based QC and automated captioning
Visitors to IBC 2023 will be able to enjoy, at Telos' stand, a tour of the latest hardware and software solutions for radio and TV that enable creators and companies to broadcast without limits.
At Telos' booth, attendees will find an expert staff of broadcast specialists on hand: whether customers or visitors have product questions, a specific workflow they're seeking to implement, or just want to see what's new, Telos' team will be there to help.
Telos Alliance is bringing a full slate of radio and TV products to Amsterdam for IBC 2023, all designed to make life easier and more productive for
broadcast content creators. Visitors will ﬁnd a wide range of solutions for managing workﬂows and media delivery.
Telos Alliance products on display at IBC 2023 include:
• Axia Quasar AoIP Mixing Consoles
• Telos Infinity VIP: The Infinity Intercom family of products now includes the new Telos Infinity VIP App
• AudioTools Server WorkﬂowCreator
• Jünger Audio AIXpressor
• Omnia Forza: The new Omnia Forza, a brand-new software-based approach to multiband audio processing, is purpose-built for HD, DAB, DRM, and streaming audio applications.
• Axia Altus. This new software-based audio mixing console delivered as a Docker container brings the power and features of a traditional hardware console to desktop and laptop computers, tablets, and smartphones running any modern web browser.
At this year’s IBC edition Vislink will be unveiling its latest innovations, oﬀering attendees a sneak peek into the future of live content production and distribution. The Vislink team will be live on stand 1.C51 with a full complement of product
displays, presentations and video demonstrations.
One of the highlights of Vislink's showcase will be its hybrid IP/COFDM applications. This technology, embodied in key products like the Cliq ultra-compact mobile transmitter, blends the strengths of Internet Protocol (IP) and Coded Orthogonal Frequency Division Multiplexing (COFDM) to create optimal transmission capabilities. As visitors will be able to learn, it is designed to provide broadcasters with a more robust and flexible transmission solution, especially in challenging environments where signal stability is crucial.
Vislink will also be demonstrating its private 5G solutions for event productions, leveraging the high-speed and low-latency beneﬁts of 5G technology.
Another offering from Vislink available to all attendees will be its solutions for remote production in the cloud. Led by the Terra Link series of portable, rackmount and compact encoders, these flexible, cost-effective, and scalable cloud-based production solutions are tailor-made for modern broadcasting needs, allowing organizations to streamline their production workflows and maximize their resources. Vislink's cloud-based solutions offer production teams the ability to collaborate remotely and deliver high-quality content to audiences worldwide.
Vislink’s showcase will also feature AI-automated live sports and in-studio systems, harnessing the power of artiﬁcial intelligence for enhanced production automation and viewer engagement.
In addition to the two big shows at the Vizrt booth, The Vizrt Experience live show and The TriCaster Anywhere live show, attendees won't want to miss a few of its other highlights. Every visitor to Vizrt's booth will experience live demos of its cutting-edge solutions, and the team will be eager to showcase what it has to offer and answer any questions professionals may have.
Solutions that will be on display at Vizrt’s stand are:
Viz Engine 5
Vizrt will be showing its powerful render and compositing engine together with its tight integration with Unreal Engine 5 for photorealistic AR, VR, and virtual sets.
Adaptive Graphics for every screen
This single-workflow, multi-platform content delivery, which automatically adjusts resolution and format to suit specific display devices, will be on show for visitors and the interested public. It's an exclusive technology that enables broadcasters, creators and graphic artists to create once and publish many times for fluid, editable, and consistent visual storytelling.
HTML 5 Graphics for any size production
This is a cloud-native solution, so professionals and artists can create and control live graphics from a single interface via any browser. It also offers a wide range of audience engagement and monetization features.
Media asset management and editing in the cloud
Vizrt Sports Content Factory provides a full sports content workﬂow in the cloud from live ingest, archive, and metadata annotation, through branded content distribution to multiple platforms. It’s been designed to address the high-performance demands of sports teams, leagues, and rights holders.
The Viz Flowics interface, accessible via any browser, gives creators and professionals simple control over the creation, integration, and playout of HTML5 graphics well suited to fast-paced productions and digital or multi-screen extensions.
VoiceInteraction returns to IBC to showcase its leading AI speech recognition solutions, with demos of Live Closed Captioning, Transcription Automation, and a Broadcast Compliance tool that provides additional features such as automatic news clipping and analytics.
VoiceInteraction's flagship automatic subtitles solution is available in 40 languages. With reliable speech recognition and market-specific adaptability, it delivers high accuracy and reusable captions for VOD, enhancing accessibility cost-effectively with support for any SDI or IP workflow. On-prem translation widens audiences, while the intuitive web dashboard provides customized control, event scheduling, and seamless caption integration.
Audimus.Server is VoiceInteraction's transcription automation platform for offline speech-to-text or post-recording subtitling workflows. With an in-app editor and automatic timecoding, it creates .srt files or burns the subtitles directly into videos, among other export formats. Its metadata extraction is designed for indexing and enriching large MAMs, as Audimus.Server empowers broadcasters to repurpose content for on-demand programming and other platforms.
Media Monitoring System
A tool for broadcasters to reduce costs, create additional monetization opportunities, and enhance viewer loyalty.
The Traxis Platform will be on display at Zero Density's IBC stand for all visitors to see, and ZD's team will be on hand to answer questions and explain its features.
With an accuracy of up to 0.2 mm and ultra-low latency of just six milliseconds, the Traxis Camera Tracking system oﬀers
high precision. In a virtual production, this ensures continuous calibration and seamless real-time results, so that every camera move that happens on set is reflected in the virtual world without delay. Virtual graphics can then be broadcast live without breaking the audience's immersion.
For talent tracking, the Traxis platform includes an AI-powered system that can identify multiple actors within a 3D environment at once. The Talent Tracking system uses AI to extract the actor’s 3D location from the image in real time, then sends the data to Reality Engine to create accurate reﬂections, refractions and virtual shadows of the actor in 3D space. The technology
is completely markerless, which streamlines setup and maintenance, and means actors no longer need to be distracted by wearables or beacons that make it harder to perform.
The Traxis Hub
At the heart of the Traxis platform is the Traxis Hub, a comprehensive tool that manages both tracking and lens calibration data. Compatible with all industry-standard tracking protocols, the hub brings together all available tracking and lens data into a centralized interface, significantly reducing setup time. The hub has a library of the most commonly used broadcast lenses, enabling quick and easy calibration.
Traditional production methodology may be reaching the limits of its efficiency, as the toolset and possibilities of virtual production allow not only a better outcome in terms of creativity and quality, but also a faster way of testing new ways of working and, above all, significant cost savings.
Content demand is growing all around the globe, and every year new players arrive and try their best within an environment where, every day, new technological advances widen the landscape a little more.
To this stage comes Orbital Studios, created in 2021, after Covid-19 had everybody confined at home, by a group of professionals from the AV production industry who met in A.J. Wedding's garage. Just two years later, Orbital Studios offers a wide range of production services and has even recently launched a new division (Orbital VSI) dedicated to virtual production stage integration services worldwide. It has delivered successful productions such as Snowfall (FX), Goliath (Prime Video) and Justified: City Primeval (FX), collaborates with platforms like Hulu and Disney, and has partnered with Narwhal Studios (VAD creators for series such as The Mandalorian, The Book of Boba Fett and Obi-Wan Kenobi) to create a more streamlined, collaborative, and efficient pipeline.
TM Broadcast talks with A.J. Wedding, Orbital Studios' CEO, about achieving success in so little time, the advantages of virtual production, the state of the video industry, and the main challenges in audiovisual production today.
How did Orbital get started in the virtual production industry, and how has it developed during this time?
Orbital Studios started in my garage during the pandemic. I think myself and a few of my friends had seen this technology coming to fruition with The Mandalorian and really wanted to know more about it because it's a game-changer. We started to find out that there were a lot of problems with it. As it's brand new, there are growing pains. The technology wasn't built for it specifically. We thought maybe we could help some brands fix those problems and get to a place where people in production have a better set of tools to work with.
We all came from production. Our background was not a tech background. I’ve been involved in production for over 20 years and also visual eﬀects. It was the perfect marriage for me. I did some consulting on some early virtual productions and started to form some opinions
on how the technology should be used and how we can get to a point where it’s capable of more things.
So helping and sharing: this is how you found your spot in the market.

Yes, absolutely.
Is traditional production already done? Do you think that in the near future it may be used only residually, by independent creators and artists? What do you think about the near future of traditional film production?
I think that as people learn the toolset, as the price of these things comes down, it starts to become more realistic. We actually did a History Channel series called History's Greatest Heists, starring Pierce Brosnan. They didn't have a huge budget. Prior to that, we did a series called Snowfall for FX. Also not a big-budget show; I think their budget was about half or less of The Mandalorian's. Of course, we did Justified: City Primeval, which just started airing this month.
None of those shows had huge budgets. We have also done some smaller artistic projects up here at the studio, where we've partnered with the artists because we thought what they were doing was really interesting and helped us continue to push boundaries, which is what Orbital is all about.
You mentioned toolsets and costs, and nowadays you can find a lot of different software and solutions to do the same thing. The issue is that it's impossible to know every tool or program in depth. How do you train your crew? Do you have a plan for developing knowledge and learning, or do you seek out talent and bring it in?
Our team is pretty well-versed in virtual production and Unreal Engine, the tracking software that we use. Anytime somebody shows interest in wanting to learn, we try to bring them in to learn from us. We also provide learning platforms for our clients overseas. One of the things that we do aside from our own work with TV shows and movies is that we also help build stages for people. Part of that help is us training people in those countries, in those cities.
Then, on top of that, we've built a data center here at Orbital from which we can provide a level of support to facilities all the way across the world, actually operating them from our studio here. The nice thing about that is not that we're trying to take over someone else's job; it's more that we're there to be a backup until everybody's up to speed with the technology.
But cost saving is a very important part of any job. We already knew virtual production could reduce costs by eliminating travel for teams and equipment. Now, are you telling me
that virtual production is going to reduce its costs even more? In which ways could expenses be dropped?
There are a few ways. One is that the longer a company like Orbital has its LED panels and tracking systems, which of course are the most expensive part of this, the sooner we get to a point where we start paying them off, so we can charge a smaller rate. Another thing, and this is probably the biggest one, is the pipeline for creating the virtual art department. Most of the larger companies doing this work have spread it out as you would a visual effects pipeline, and it's taking a lot longer than it should, in my opinion.
One of the things we've really pushed forward with our partners at ROTU is the belief that this stuff can be done a lot faster, and that takes a lot of the cost out of it. For instance, sometimes we'll have meetings with a client where we know 48 hours ahead of time what their general ask is going to be, and we'll create an asset.
In just 48 hours?
Yes. We’ll show it to them in
the meeting, and it's mind-blowing, because people assume this stuff takes a long time, and it really shouldn't. The great thing about it is that if you can get at least an early version done while you're still in early pre-production, it becomes a physical document that all the production department heads can use to meet and make decisions about the production. You save a ton of money because of the organization that you build with that asset.
What part of virtual production could be improved, aside from costs? I think the process of learning new solutions and new software is slow. Aside from these two things, what can be improved in virtual production?
I think one of the great things is the more things that get made, the more assets that then exist in the world in the virtual art department. You’re not starting from scratch every single time you create something. That’s something that we leverage ourselves so that we can move quicker. I think that Epic Games has done a really good job of raising the ﬁdelity of what we can put on the wall. It used to be you would walk into a stage and look at the LED.
It looked like a video game, but you could trick the camera and it still looked okay on the camera. Now, using Unreal 5.2 and 5.3 on our stages, we’re looking at them and they look absolutely real. It changes what you’re capable of with virtual production. Now, you can have everything in focus and have it all look believable.
So there are a lot of possibilities. Now, let's talk about saving costs and saving time. How do you see this reduction in the production time required to finish a virtual production affecting the cinema industry?
I think there’s still quite a learning curve for the producers and the art department, but once they get through that and they start to understand, they’ll be better at scheduling the days. One of the things that is a premise of virtual production is that I can move the world. I don’t have to move all of my lights and all of my team. I think the real value is going to come in when you realize you can shoot four diﬀerent locations in one day. For Justiﬁed, we did a lot of car process work, and there was one day where we shot four diﬀerent car scenes with four diﬀerent cars in diﬀerent
locations and diﬀerent times of day and diﬀerent actors.
If you can imagine what that would’ve cost with a process trailer, and closing the streets, and switching to nights. They saved a ton of money on that side of it. It really comes down to what you’re comparing it to. If I’m trying to compare onset virtual production with green screen, and I’m not taking into consideration what it’s going to cost to do all that work in post, then there’s no way we can compare that. If you compare it to a location shoot, oh my God, there’s huge savings.
You said virtual production is helping all creators to see beyond what they can see; you are stimulating directors and creators to reach beyond what they can imagine. Can you tell us about a case where this actually happened? A collaboration between your team and a director that created a better outcome than the director imagined in the first place?
We did a commercial recently with an NFL football team, where the idea was we would have the football players on the field. In the stands, they
were just going to use colors that are moving so that it didn’t bog down the system, but it didn’t look like people. You were going to have to have it way out of focus. We had said, “Hey, look, actually, we can put people in there, and not only that, we’ll put 65,000 people in there. We can set it so that we can press a button and make them cheer.
We can press a button so we make them boo. We can actually make it interactive for you.” It was like a light bulb went oﬀ. It was like, “Really? That’s possible now?”
The technology has come such a long way in such a short period of time. I think that sometimes there are guardrails that get put on a director or a DP because of past versions of this not working out. The technology moves so fast. We did a production where we were trying to render water, which of course is really hard to render live.
It looked pretty good on our monitor, but then once we put it up on the wall, it didn't look very good at all. The director was upset about it. The morning he called us in for an emergency meeting, a new plugin for a fluid dynamics system became available, and we put it into the system and onto the wall within two hours, without having tested it previously. We looked at it, and it looked great. It's literally changing by the day.
Now, if you could make a wish with a magic wand, what would you ask for the industry manufacturers? What solution would you ask for?
Wow, that's a good question. I think we've worked pretty closely with a lot of the manufacturers, especially of the LED panels, to try to help drive that innovation forward. Color is the biggest problem with them, and it's kind of a magic trick to figure out how to get the color from the LED and the color at the sensor to match as closely as you can.
It just comes down to the simple, hard fact that an LED panel is made up of RGB lights. Those three colors are trying to make every color in the universe, and sometimes to your eye it looks okay, but a camera sensor doesn't always pick up the right color.
Having better color representation would be a huge ask. I don't have a magic wand, but I'll tell you what: I've asked about 1,000 people for this.
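The LED-to-sensor mismatch Wedding describes is commonly tackled by fitting a color-correction matrix between the colors sent to the wall and the colors the camera reports. The sketch below is an editorial illustration with made-up measurements, not Orbital's calibration pipeline: a 3x3 matrix is fitted by least squares, and its inverse pre-corrects content so the sensor sees something closer to the intended color.

```python
import numpy as np

# Colors sent to the wall (rows: red, green, blue, white), normalized 0..1.
intended = np.array([[1, 0, 0],
                     [0, 1, 0],
                     [0, 0, 1],
                     [1, 1, 1]], dtype=float)

# What a hypothetical camera sensor reported for those same patches.
measured = np.array([[0.92, 0.05, 0.02],
                     [0.04, 0.88, 0.07],
                     [0.01, 0.06, 0.90],
                     [0.97, 0.99, 0.99]], dtype=float)

# Least-squares fit of a 3x3 matrix M such that measured ≈ intended @ M.
M, *_ = np.linalg.lstsq(intended, measured, rcond=None)

# Pre-correct content with the inverse so the sensor sees the intent.
correction = np.linalg.inv(M)
corrected_white = np.clip(np.array([1.0, 1.0, 1.0]) @ correction, 0, 1)
print(corrected_white)
```

As Wedding notes, what the eye sees is irrelevant here: only the matrix fitted against the sensor's own measurements matters, which is why real calibration uses a scope or the camera itself rather than visual judgment.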
Well, here we are talking to display manufacturers, so it could be interesting for them to know about this. They are probably already working on it, since you've asked about it so much, I'm sure.
The camera manufacturers are working on some great tools on the other side of it for the software. ARRI has a new virtual camera tool they’re building to help. One of the biggest problems we have with virtual production is you’ve got tech people sitting behind computers and production people sitting behind cameras. Very often, they don’t speak the same language. When the director
of photography says, "Hey, knock down the wall one stop," that means nothing to the guy behind a computer.
He's trying to figure out what one stop means. By giving him a set of tools that match what the director or the DP is asking for, it helps that process go faster. That difference in education, that difference in understanding of their tools, has been a big crashing point for virtual production.
I was going to ask how Orbital manages to stand out in a market as crowded as the audiovisual one is right now. Now I understand it's curiosity, sharing knowledge, and all your work with manufacturers and creators that keeps you in the game, am I right?
Yes. I think that the thing that sets us apart is that we're constantly trying to make the tools better for filmmakers. That means the technology has to be better, and the software has to be better. We've been fortunate that a lot of the people we've spoken to were willing to hear the feedback and make changes. Because of that, the tools that we have at Orbital, and that we put into the stages we integrate, are top of the line.
Which solution or software do you always trust in every production?
Definitely Unreal Engine. We're pretty close partners with Epic Games, and they've been amazing at helping us as they've developed; we've also been helping them crash-test some of the new stuff that they have going on. In the past, we've tried other things. We've worked with Unity as well. I think that Unity is coming along, and for certain things it is great. I think that right now, while there are several others developing better real-time rendering systems, the only one that's really set up for
virtual production at the level that we use it for cinema and television would be Unreal Engine.
You could use Unity or Notch for broadcast, I think for certain applications, but Unreal is what we’ve really been big fans of. I think also Motive software, which we use with OptiTrack cameras for that sensor tracking. We’re also trying to help push them to develop some tools we deem necessary for what we do. Those are the two big ones.
Okay. What are the main projects that Orbital has produced during the last year, and what are the main ones you are going to produce, if you can tell us?
Well, some of those we can't talk about, but we can certainly talk about the past. About the future, I can only say that there are some big things on the horizon. In the past, we've done two seasons of Snowfall for FX. We did the series Justified: City Primeval, also for FX, and History's Greatest Heists for the History Channel. We've done a ton of commercials and small little things for different TV shows, but those have been our big ones.
Now I'm going to ask you to dare a forecast: how is virtual production going to evolve over the next year or, let's say, the next couple of years?
I think the thing that I always say when people ask why our technology is so advanced is that we had the luxury of not having a client when we started. While other companies were having to put out TV shows every couple of months, we were developing better products. I think that there's going to be a period where people catch up and upgrade their systems to a point where they're more capable of some of the things this new technology can do. I think that the learning curve is getting better.
I think that the more that producers and DoPs understand how to budget for this and how to shoot with this, it’s just going to become more prevalent. I’m already seeing here in the States, more and more TV shows shooting with this technology. It’s just going up and up because number one, you’ve got the ease of use of it, and two, you’ve got the ﬁdelity. It used to be you could have a window in your set, and then outside the window, you have
a photograph of a city.
Now with these 6K and 8K cameras, you can't really get away with that anymore. If we can put a video wall out there and give you super high fidelity, or give you 30 different skies, it gives so much more flexibility to the filmmakers. Even just that, on a very simple level, is something we're starting to see grow a lot. I do think that virtual production is here to stay. I think that it's just going to get used more and more, and I'm betting on that with my life.
One of the changes could be related to workﬂows and remote production. How do you feel about cloud solutions boosting virtual production?
I think the cloud is great for the virtual art department and for the things we talked about earlier, where we can use it to control and operate stages anywhere in the world from Los Angeles. When it comes down to the actual physical computers and operation, the assets still have to be local; there's too much latency, that is basically the thing. On top of that, and maybe this is more of a personal thing, we look at our computers like hot rods.
We’re constantly opening them up and changing them. Whenever there’s a problem on set, we’ve got to get inside and see what’s going on. If you don’t have the capability to do that, it makes it a lot harder.
We do have a lot of applications for camera to cloud, so that’s one of the things that we offer here; we have dedicated 100 Gb fiber lines going in and out. You can take the new LED camera, which actually has a fiber output, and we can send that directly into the cloud and on to wherever in the world you want it to go. That’s the raw files going directly to editorial or wherever you want. Cloud is definitely a big piece of this, it’s just not the only piece.
Sometimes it’s better to have things at hand. One of the best things about that is for VAD, because there are Virtual Art Department staff all over the world. It’s easy for me to say, “Hey, I’m going to put all of our assets in our data center, and anybody in the world who wants to work on them can work on them through the cloud.” They don’t need a huge computer like we have here. They can just have a keyboard, a mouse and a monitor, and what’s great is that there’s no need for heavy processing on their side of things. For our side of things, it’s more of a security issue: they’re never actually getting the assets. It’s good both ways.
What do you think about LED video walls? How do you use LED video walls in productions?
I would say that I’m surprised every day by some new thing that’s coming out that’s a part of this whole ecosystem of virtual production. Obviously, we did a lot of work to try to drive these LED panels to be better. Even the way we wire them here is diﬀerent from the way they are other places. But these panels of video walls don’t deliver the perfect color. They don’t, but there are tools like the HS Scope that allow you to calibrate better. If you think about it, it really doesn’t matter what your eye sees, it only matters what that sensor sees. Basically what you’re doing when you point a camera at an LED wall is
you’re pointing the camera sensor at a light. You’re trying to trick the camera into thinking that it’s not a light. There’s a lot of work that goes into that, and that’s where the art really comes in: figuring out how to make this frame look like a real frame.
Very often, what ends up happening is, if you walked on set and looked at it, you would be shocked. You would say, “How does that look good? It looks awful.” It doesn’t matter, again, what your eye sees: it matters what the camera sees. Everything is geared toward making it look good on camera.
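Calibration of the kind described above boils down to matching what the sensor records to what the frame should look like. The sketch below is a hypothetical illustration, not Orbital’s actual pipeline nor any specific scope’s method: it fits a simple 3x3 color-correction matrix from a handful of measured patches using least squares. All RGB values are invented for the demonstration.

```python
import numpy as np

# Hypothetical measurements: RGB values the camera sensor recorded when
# shooting calibration patches displayed on the LED wall (rows = patches).
camera_rgb = np.array([
    [0.92, 0.10, 0.08],   # "red" patch as seen by the sensor
    [0.12, 0.85, 0.15],   # "green" patch
    [0.05, 0.12, 0.88],   # "blue" patch
    [0.80, 0.78, 0.20],   # "yellow" patch
    [0.48, 0.46, 0.44],   # "grey" patch
])

# The colors those patches were supposed to be on camera.
target_rgb = np.array([
    [1.00, 0.00, 0.00],
    [0.00, 1.00, 0.00],
    [0.00, 0.00, 1.00],
    [0.85, 0.85, 0.00],
    [0.50, 0.50, 0.50],
])

# Least-squares fit of a 3x3 matrix M such that camera_rgb @ M ~ target_rgb.
M, *_ = np.linalg.lstsq(camera_rgb, target_rgb, rcond=None)

# Applying M to content before it is sent to the wall pre-compensates for
# how the sensor "sees" the LED light, clipped back to the displayable range.
corrected = np.clip(camera_rgb @ M, 0.0, 1.0)
print(np.round(corrected, 2))
```

A production system would fit against many more patches (and per camera, since, as noted below, every camera is different), but the principle is the same.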
The interpretation of the camera is what we ultimately see.
Yes, and every camera is different, so it’s a different art for each one.
This would be all; thank you for sharing this time with us.
Thank you for the opportunity to speak with you and spread what Orbital is doing. Hopefully, we can come into your neighborhood and build a stage sometime.
Quality, sustainable and environmentally friendly production
LIGHTING ON TV SETS
By Carlos Medina, Audiovisual Technology Expert and Advisor
The audiovisual sector is looking for new business models and new audiences in this digital age we are experiencing, beyond what we know as traditional television.
In this sense, audiovisual content is very present in our daily lives, through new channels, media and consumption slots, which has led to a change in the ways of producing said content.
This new present reality has caused the technology used to also adapt to changes, seeking more diverse, specialized and economical equipment to meet the needs of creators and/or communication companies.
But despite these new quests for stories and ideas and the implementation of new production strategies, what remains unchanged is the use of dedicated spaces prepared for the creation of programs and content; specifically, we are referring to TV studios.
A TV studio is a fully prepared ﬁxed installation comprising two areas: on the one hand, the production control, where there are multiple technical equipment units for shooting, recording and broadcast of the contents thus created; on the other, the TV set, a perfectly isolated space where there are three basic items for production: camera equipment, direct-microphone sound equipment and lighting equipment, along with scenography (conventional and virtual) elements and the possibility of having (or not) an audience.
Nowadays, there are many types of TV studios that adapt perfectly to different budgets, the type of program and/or the size of the company, organization and/or entrepreneur producing audiovisual content. They range from the large TV sets typical of communication companies operating in the traditional TV environment, to mini-sets for debates, discussions and/or content closer to the field of social media, through custom-designed sets better suited to the needs of universities, vocational training centers or private sector companies in other professional environments that need to establish online and/or hybrid communication.
These TV studios, whatever their type, have three features in common: first, they are an isolated interior location; second, multi-camera production is the prevailing approach (from studio cameras and EFP to PTZ cameras); and third, they allow multi-content, that is, they are ready to offer different types of programs.
These three features are key to understanding the lighting of a TV set in a generic way, a matter that will concern us throughout this article.
This purpose of getting to know AV inside out leads us to delve into the lighting system. There is no doubt that without light the audiovisual industry would not be possible. Whether its source is natural or artificial, light shows us, presents us, allows us to see what surrounds us; and, with more nuances, it allows us to create different stories and atmospheres. This is what we call ‘lighting’.
Audiovisual lighting is the ability to make sense of such light for aesthetic, artistic, narrative and communicative purposes. It is a technique and a language applied to theatre, cinema, television and the audiovisual in general.
We all know that the ﬁrst works in this sector were carried out with natural light, mainly emanating from the sun. A capability for invention, scientiﬁc curiosity and good work in this trade, by applying new means of production, gave rise to studios (ﬁlm studios at ﬁrst and then TV studios).
A TV studio with artificial light allows greater use of working time, maximum control of lighting, and a better balance in the financial cost of a day’s work in the audiovisual sector.
From experimentation to industry, from the beginning to the present times, we can see that implementation of constant technological
innovations has been facilitating work in the audiovisual sector, thus improving its aesthetic and narrative possibilities. And, of course, also in the ﬁeld of lighting.
As a ﬁrst observation to focus on the subject, we must specify the two fundamental work environments for creation and production of television/audiovisual content: a TV studio (which, in turn, is subdivided into technical control and TV set) and, on the other hand, outdoors (what is known as “doing feature reports or reporting”) and natural indoor locations.
In both instances, the way of working, the novelties and developments in the ﬁeld of
lighting, and the criteria for program production have all been gradually modifying the type and design of the various light sources, types of projectors and accessories, thus adapting to each particular need.
As it is the case with the diﬀerent audiovisual technological solutions in terms of lighting, we continue to ﬁnd a myriad of types, models, conﬁgurations, brands and manufacturers that provide a solution to the way of implementing lighting for works both in TV studios and outdoors.
Therefore, it is important to determine what environment we are referring to in this article so as not to mislead readers or generate false expectations about what we are going to explain next.
We will only deal with what is necessary for lighting a TV set:
We are referring to the place and the building itself, that is, the set. This space, of a given height and width, should comply with the following recommendations: be an enclosed space of great height, with thermal,
light and sound insulation, protection from possible risks and prepared with acoustic insulation coating. It should also feature large entrances and exits for technical staff, technical machinery, building elements for sets/scenery and other goods; it is typically built at street level, with no uneven surfaces, and fitted with emergency exit doors.
It is the basic set-up for us to work with artificial light in an electricity-based lighting system. In this sense, most TV studios/sets depend on an electrical network supplied by a utility company and on electrical installations supervised by a company specialising in Low Voltage (LV). Therefore, a series of technical requirements must be met in terms of quality, construction and safety, so as not to give rise to major problems when operating the lighting system.
Inevitably, we can only work in suitable installations equipped with lighting systems that have been designed for such use and, therefore, having in place all the necessary electrical protection equipment and measures in regard to the
electrical power for which it has been designed; for example, a TV set with 16,000 watts of power dedicated to lighting.
The power supply or connection is where the electricity we work with comes from, the one that allows to switch everything on, from a complex installation to a simple lamp.
We establish two types of power sources: stationary dependent sources, such as distribution networks and LV link facilities, and portable autonomous sources, that is, fuel-based generators and electrical generators.
Based on the electrical knowledge when it comes to lighting, here we are going to ﬁnd some variables that we must be acquainted with: type of single-phase or three-phase current, alternating or direct current, electrical circuits in series or in parallel and the type of wiring and connectors pertaining to the type of power supply and electrical installation concerned.
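To make these variables concrete, here is a back-of-envelope load calculation for the 16,000-watt example set mentioned earlier. This is a simplified sketch that assumes a typical European supply (230 V single-phase, 400 V line-to-line three-phase) and a purely resistive load (power factor 1.0); a real installation must be sized by a qualified LV specialist.

```python
# Rough load check for the article's example: a TV set with 16,000 W
# dedicated to lighting. Supply voltages and unity power factor are
# illustrative assumptions.
import math

power_w = 16_000.0

# Single-phase: I = P / V
single_phase_amps = power_w / 230.0

# Balanced three-phase: I = P / (sqrt(3) * V_line)
three_phase_amps = power_w / (math.sqrt(3) * 400.0)

print(f"Single-phase current: {single_phase_amps:.1f} A")     # ~69.6 A
print(f"Three-phase line current: {three_phase_amps:.1f} A")  # ~23.1 A
```

The comparison shows why such sets are usually fed from a three-phase supply with loads distributed across the phases: the current per line drops to roughly a third.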
It is highly recommended, whatever the lighting needs of the TV set are, that the entire electrical installation is independent of other installations, uses, buildings
and needs. When large amounts of power are involved, with stationary dependent sources, it is convenient to have one’s own link installation from the HV/MV-to-LV transformation center; with lower amounts of power, an adequate electrical circuit of one’s own, set up from the general control and protection panel (DGMP) and separate from other uses.
The purpose of any electrical installation, with correct wiring compliant with the regulations in force, is to achieve a connection between the equipment and the projector/lighting device through what we call the power outlet: that is, the element in an electrical installation that has slots for the insertion of plugs and supplies electric current.
The most common of these is the plug socket (professionally referred to as the base/connector). It can be surface-mounted, recessed, indoor, or outdoor with IP protection and watertight, with or without grounding, and varies in the number of pins or poles and the voltage and current supported.
The solutions achieving the most professional results and the highest performance are Schuko inputs/outputs, quick-connect terminals, Harting modular connectors, powerCon connectors and/or electrified bars.
Light emission technologies or systems
These are the various ways of generating light, always based on some kind of principle or phenomenon. The three main
ones are: thermoradiation, luminescence and electrical radiation. Therefore, in a practical way, we are referring to the type of artiﬁcial light production under what we all identify as lamp types (careful, lamps, not bulbs):
- Incandescent lamps. Based on thermo-radiation. The passage of an electric current through a resistor (filament) in a vacuum or in a medium filled with inert gas reaches incandescence and, as a result, heat and light are generated. The filament used is tungsten (wolfram), which is why they are also called tungsten lamps in the audiovisual sector. There are regular/conventional incandescent lamps, halogen lamps (in which a halogen component – iodine, chlorine, bromine, fluorine – added to the filling gas increases the useful life of the filament) and IRC lamps (Infra Red Coating, which have a special infrared-reflective layer inside the lamp).
- Fluorescent lamps. They work under the phenomenon of electric radiation and luminescence. They involve excitation, by means of short-wave ultraviolet, of luminescent substances, thus obtaining
the visible spectrum of light. They are low-pressure mercury vapor discharge lamps.
This type of emission system is fully implemented in the audiovisual sector, after a period of constant innovations that improved the characteristics of the typical ‘oﬃce’ ﬂuorescent tube; thus, they provide greater light intensity, a reduction in greenish color, no ﬂickering, high color reproduction level, wide ranges in terms of color temperature, ability to regulate intensity, low consumption (which leads to savings in the cost of the electricity bill).
Another quality is that they hardly generate heat, thus achieving improved luminous performance and greater comfort when illuminating, both for the technicians operating and for the actor/presenter, who does not receive as much heat.
In professional slang, we talk about ﬂuorescent tubes (swords) or screens (several tubes) of diﬀerent lengths and diameters as compared to the compact, energy-eﬃcient lamps used in our daily lives (substitutes for incandescent bulbs in our homes).
- High-Pressure Discharge Lamps (HID – High Intensity Discharge). Based on the principle of electrical radiation, they emit light through a discharge between two electrodes within a gas. We can find high-pressure sodium, high-pressure mercury vapor, mixed-light lamps, halogenated mercury or metal halide lamps (MH, HMI, MSR), ceramic burner lamps (CDM-HCI) and xenon short-arc lamps.
We should clarify that the renowned HMI lamp (Hydrargyrum Medium-Arc Iodide) is a brand of OSRAM, just as MSR (Medium Source Rare-earth) is Philips’ own brand. They have achieved so much recognition among professionals in the audiovisual sector because they are capable of emitting a very intense light with a color temperature identical to the sun’s (5,600 K/6,000 K), an excellent color rendering index and a very complete spectral distribution curve, even when we vary the intensity.
On the downside, they require a high-voltage power source for ignition and demanding ventilation of the lamp.
- LED (Light-Emitting Diode). This light production system is an electronic device made of semiconductor materials that emits light when crossed by a current. 1962 saw the first commercially usable LED (GaAsP, red LED); the 1990s brought the birth and expansion of the blue LED (InGaN). This technology is the latest development and is reaching ever-increasing market shares, both in homes and professionally in the field of lighting.
Its improvements and innovations have made it possible to move from a conventional/generic LED (depending on the construction of the lamp: SMD, COB, MicroLED) to high-power, Tunable Color or RPT (Remote Phosphor) LEDs, where performance and quality are much higher in terms of color reproduction (including white), luminous performance, color temperature control, long useful life and energy efficiency.
In addition, it shows versatility in the design and manufacture of diﬀerent lamps, so it presents very diverse shapes, sizes and ﬁnishes for
immediate application (panels, tubes, curtains, drapes, bars, downlights, rigid, malleable, ﬂexible, cuttable...) and adapts to the most professional needs and demands with the possibility of intensity regulation, color changes and absence of heat emission.
- LASER (Light Ampliﬁcation by Stimulated Emission of Radiation). Based on luminescence. Its use is very exclusive and restricted to very speciﬁc applications and environments, especially in the world of entertainment.
Type of appliances (projectors, luminaires, lighting ﬁxtures, panels and/ or screens as the case may be)
Let us take the opportunity in this article to advocate among professionals the use of these terms when naming them, rather than the now-outdated term ‘floodlight’.
There is a wide range of solutions, innovations and designs in manufacturing, as well as types of light ﬁxtures that include a range of very diverse optical, mechanical and electrical solutions.
Each type of light ﬁxture/ projector is a concrete response to the operation, control and distribution of the light emitted by the lamp, taking into account the internal and external resistance of a work environment, aesthetics, weight, size and economic cost of manufacture and maintenance.
It is, in short, a set of elements around the lamp: the box/housing/chassis/body; the lamp holder (sometimes with a ballast or a transformer); the optical system, composed of the reflector and, depending on the case, mirrors, louvers or diffusers to control the light beam; and the electrical system, including the type of connector.
We can classify them into those featuring a ﬁxed or variable beam, with an open or focused design through a lens (PC, Fresnel
and/or a combination of lenses); those that incorporate external reﬂectors (spherical, circular, parabolic, ellipsoidal, smooth, dichroic or metallic); those that are symmetrical or asymmetrical, or generating concentrated or omnidirectional light; those that have internal and/or external regulation, and conventional or mobile lighting.
Even the design, shape and treatment of the lamp bulb (drop, fungus, linear, tubular, mirrored reflective surface, matte finish, opalinization, coloration...) are decisive when evaluating a projector or a luminaire. A characteristic example of a lamp type leading us to refer (not accurately) to a type of projector is the PAR lamp (PAR 38, PAR 56... the number simply indicates the diameter of the pressed-glass front lens in eighths of an inch).
It is true that many of these devices are sometimes named after the model or brand of the relevant manufacturer such as, for example, the Kino Flos, the Cotelux or the Bambinos; other times, after the original color of the casing/chassis/box – for example, “the butanes”, for their orange color – or after their shape: “the cans”; sometimes after the lamp they carry: the famous “quartz” suitcase or the aforementioned PAR, Fresnel or HMI; or after their intended purpose: the trimmings, the cycloramas or the blinding machines. Thus, we can find a variety of names depending on the country, the tradition among technicians, and the presence of lighting equipment manufacturers.
Installation of placement stand, suspension and fastening
We are referring to the physical location where the diﬀerent devices will be located. This entails diﬀerent possible solutions, structures and supports depending on the budget, the complexity of the work to be illuminated and the space chosen (width, depth and, above all, height) of the TV set. This is known as the lighting grid.
As a general rule, in a professional TV set we will hang the devices, so we can ﬁnd a lighting grid, which can be from the most complex to the simplest: comb system, weft grids, tensioned interlaced steel surface, hanging walkways, truss structures, electriﬁed rails, ﬁxed and sliding rail kits (tracks), and ﬁxed iron or aluminum bars/tubes. The design of the grid (shape, placement, load resistance...) is always limited by the safety of the building, roof and ﬂoor, in terms of resistance to weight and production needs and speciﬁc content.
Now, we can hang the lighting fixtures by means of several solutions: hoists (self-climbing barrels), manual pantographs, spring pantographs, motorized pantographs, telescopic pantographs, extension bars or simply hooks, claws and/or clamps, always with their corresponding slings, ratchets and anchoring accessories.
Handling and operation solutions
Since we normally work with lighting that is suspended several meters high, it is necessary to have equipment, solutions and accessories that allow us to hold, handle, direct and/or position the lighting devices. Thus, telescopic poles, ladders, elevators, platforms and scaffolding equipment are used and, at present, some TV sets use robotization and remote control systems, especially with mobile lighting.
Control and regulation system
Once we have placed, directed and correctly positioned the light fixtures, both conventional and mobile, and always under the guidelines of the illuminator, we must be able to give orders to each fixture individually, or work with groupings and/or temporary programming. This ranges, therefore, from a simple regulation of intensity (from 0% to 100%) through dimmers (integrated in the light fixture or in external modules) to working with communication protocols such as DMX512 (Digital Multiplex), either with control consoles (lighting tables) or with direct DMX distributors.
The DMX512 protocol was developed by the USITT engineering committee in 1986 as an open standard, thus achieving standardization between the lighting fixture and the control table. At present, the audiovisual scene and, above all, live shows demand more; thus, ESTA (the Entertainment Services & Technology Association) has developed a set of applications known as ACN (Architecture for Control Networks) for this purpose. Other protocols in use are Art-Net, Pathport and others, as well as proprietary protocols from various manufacturers (for example, GrandMA’s MA-Net). What is already a reality is the use of IP standards, under Ethernet communication and exchange systems.
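As a concrete illustration of how such protocols carry channel levels, the sketch below builds a minimal Art-Net "ArtDmx" datagram, the UDP packet that transports one DMX512 universe (up to 512 channel levels) over an IP network. The byte layout follows the published Art-Net packet structure; the universe number and channel values are arbitrary examples.

```python
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build a minimal ArtDmx (OpCode 0x5000) packet carrying one DMX universe.

    Sketch based on the public Art-Net packet layout; real consoles and nodes
    add discovery (ArtPoll/ArtPollReply) and refresh timing on top of this.
    """
    if not 2 <= len(channels) <= 512 or len(channels) % 2:
        raise ValueError("DMX payload must be an even length, 2-512 bytes")
    header = b"Art-Net\x00"                      # fixed 8-byte protocol ID
    header += struct.pack("<H", 0x5000)          # OpCode ArtDmx, little-endian
    header += struct.pack(">H", 14)              # protocol version 14, big-endian
    header += bytes([sequence, 0])               # sequence, physical input port
    header += struct.pack("<H", universe)        # 15-bit port-address, little-endian
    header += struct.pack(">H", len(channels))   # payload length, big-endian
    return header + channels

# Set channel 1 (e.g. a dimmer) to 100% and channel 2 to 50% on universe 0.
levels = bytes([255, 128] + [0] * 510)
packet = artdmx_packet(universe=0, channels=levels)

# Sending is one UDP datagram, typically to port 6454:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (node_ip, 6454))
print(len(packet))  # 18-byte header + 512 channel bytes = 530
```

The dimmer range of 0% to 100% mentioned above maps onto the 0-255 value of each DMX channel, which is why a full level is sent as 255 and a half level as 128.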
Audiovisual content is very diverse, but it is highly recommended to adjust the size of the TV set to the relevant features and, in the case of lighting, it is also essential in terms of height,
number of devices based on what must be illuminated; and whether a suspension or hanging system allows better operation and greater use of space when placing sets and artistic and technical equipment. This television production mode inevitably calls for a series of accessories, implements and devices that facilitate working with light: cutting blades or visors, grids, filters and gels, flags, gobos, windows or light boxes, reflective surfaces, glass holders...
Each of the aspects discussed above can be grouped into three main areas of work: engineering and stage mechanics; stage lighting; and professional control and TV/cinema lighting. There are numerous companies specialising in the design, construction and commissioning of a TV set.
Likewise, the lighting system on a TV set entails a clear separation of skills and professional proﬁles such as the illuminator, the technical coordinator or head of lighting, the lighting and luminotechnics department, the lighting technology operator, the operator or table/console control technician, and the lighting assistant.
The illuminator, after getting acquainted with the content, is responsible for generating the environment and the visual atmosphere through light, which is an essential narrative element, in addition to showing each element of the scenography. This professional determines the lighting equipment to be used and its placement on the grid, draws up the scheme and design of lights, and makes all decisions on issues related to light exposure and measurement.
The technical coordinator, or head of lighting of the Lighting and Luminotechnics department, is the professional who puts into operation the technical configuration of the TV set, the distribution of loads, and the dimmer and DMX512 communication protocols. Therefore, this professional also determines the resources to be used and the maintenance planning, and acts as a liaison between the illuminator and the TV set installation itself.
The lighting operator (not an electrician) has an operator profile for the placement, handling, steering, mobility and maintenance of each light fixture on the TV grid, making changes according to the illuminator’s approach and decisions.
The table/console control technician is a very speciﬁc professional proﬁle around the management and operation of the lighting tables; therefore, this professional is a specialist in the individual operation of each device, in the grouping modes, in the inclusion of eﬀects and timings according to the decisions made and with the content of the TV program.
Finally, the lighting assistant is an operator with less responsibility, attending only to maintenance, transport of lighting equipment and management of power lines and their distribution.
Given the risk of working with electricity, especially with high-power installations, it is essential to stress training in the prevention of electrical risks among the human team, which must always have individual protection equipment and approved materials and measuring instruments.
The TV lighting system has been modiﬁed over time in relation to the changes that have also occurred mainly in image capture devices and playback displays; to innovations such as virtual sets and augmented reality, but also according to the content proposed by professionals and the preferences of viewers. Always without limits but within a framework of social development in favour of quality, sustainable and more environmentally friendly production models.
Powering broadcast and streaming industries with MPEG-5 LCEVC
What are the main advantages of MPEG-5 LCEVC compared to other existing video compression methods?
MPEG-5 LCEVC (LCEVC) is a game-changer in video compression, oﬀering a boost to all existing codecs instead of trying to replace them. It reduces the cost and energy use of the transcoding process by up to 70% while improving compression eﬃciency by up to 40%. What sets it apart is its ability to be deployed rapidly and at scale, using the existing silicon, without waiting for hardware upgrades and long device replacement cycles. Plus, it works hand in hand with other eﬃciency-improving interventions, and its beneﬁts are cumulative, rather than alternative, with those of other methods to improve video compression such as Content Adaptive Encoding (CAE) or simulcasting with a new codec.
How does the LCEVC encoding algorithm work and what makes it unique in the market?
LCEVC is a versatile tool for enhancing video codecs. It works by creating an additional layer of data that compresses either details or corrections of impairments, producing a higher quality stream from a lower one compressed with a traditional video codec.
LCEVC can be leveraged in various innovative ways like upgrading a service to UHD or HDR by just adding a thin layer of data, or to improve the quality, latency and cost eﬃciency of existing video workﬂows.
In a common application, the LCEVC encoder employs a base codec at a quarter resolution (for example in 2160p the base layer will be 1080p), creating a backwards-compatible stream that will play on any device as usual. LCEVC then uses a smart upscale and applies the missing UHD details. By using LCEVC in this way, a service provider can avoid transcoding and simulcast independent UHD proﬁles or channels and upgrade a service to UHD by just adding a small layer of data to the existing 1080p service, saving up to 80% of transcoding processing, storage, and CDN caching and up to 40% of egress bandwidth.
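The arithmetic behind those figures can be sketched as follows. The pixel-count ratio comes straight from the resolutions mentioned above; the bitrates, however, are invented, purely illustrative assumptions, since actual savings depend on codec, content and encoder settings.

```python
# Back-of-envelope look at the quarter-resolution base layer described above.
uhd = 3840 * 2160          # 2160p pixel count
hd = 1920 * 1080           # 1080p base layer

print(uhd // hd)           # 4: the base codec processes 1/4 of the pixels

# Hypothetical delivery: a native UHD profile at 16 Mbit/s versus a 1080p
# base at 8 Mbit/s plus an LCEVC enhancement layer of ~2 Mbit/s.
native_uhd_mbps = 16.0
base_mbps = 8.0
lcevc_layer_mbps = 2.0

saving = 1 - (base_mbps + lcevc_layer_mbps) / native_uhd_mbps
print(f"Bandwidth saving vs. simulcast UHD: {saving:.0%}")  # 38%
```

With these assumed numbers the egress saving lands near the "up to 40%" figure quoted earlier, while the transcoding, storage and CDN-caching savings stem from only ever producing and caching the 1080p base plus a thin enhancement layer.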
In a specific application, showcased by Globo airing the first LCEVC-enhanced live TV channel for the FIFA World Cup in 2022, LCEVC is used to produce a video stream that simultaneously includes a backwards-compatible 8-bit SDR 709 video and an enhanced 10-bit HDR 2020 video. The base layer is either encoded at a lower resolution or even at the same resolution as the LCEVC-enhanced video, with LCEVC data used to carry HDR information, upscale from 8 bit to 10 bit and correct impairments introduced by tone mapping. By using LCEVC in this way, a service provider can cost-effectively upgrade their service to HDR by just adding a small layer of data to the existing SDR video stream, rather than duplicating their video workflow to produce an independent HDR 10-bit workflow.
These are just a few examples of the many unique ways to leverage LCEVC that were tested and veriﬁed extensively over the past two years.
What are the most common applications for LCEVC technology?
LCEVC is a universally beneﬁcial tool. Its most popular use cases are in:
• Broadcasting and Streaming,
to improve cost eﬃciency, latency, and service quality, all critically important to improve proﬁtability and acquire/retain viewers in today’s highly competitive market.
• Social Media (both in content upload and delivery to end-users), for enhanced video quality, end-to-end encryption, faster publishing times, reduced infrastructure costs and scalable messaging.
• Cloud Gaming and XR/VR Pixel Streaming, where the low-complexity nature of LCEVC and its unique multi-layer structure are ideal to simultaneously reduce bandwidth and latency, increasing the quality of cloud gaming experiences and enabling immersive stereoscopic 4K virtual reality experiences within real-world wireless constraints.
When implementing LCEVC, how did you address the challenge of compatibility with existing devices and established video standards?
LCEVC’s compatibility with existing devices and video standards stems from its innovative, low-complexity design. By design, base layer processing leverages existing silicon implementations and the enhancement efficiently uses existing hardware acceleration, so that most devices reliably decode LCEVC-enhanced video with power efficiency broadly on par with typical hardware decoding.
Uniquely, LCEVC can be efficiently rolled out via software, but it is also important to highlight that even when implemented in silicon, it exhibits great power efficiency and density gains (e.g., up to 3x transcoding density), as showcased by AMD on FPGAs and by silicon IP specialists Allegro with the first available silicon implementations.
SoC manufacturers are adding LCEVC to their current drivers for immediate ﬁrmware support on current SoCs, and some are also working on dedicated silicon support
for their upcoming SoC generations, for additional density and power eﬃciency.
What are the key criteria that you need to have in mind when deciding whether LCEVC is the right solution for your video service or product?
LCEVC oﬀers immediate return on investment, a wide array of applications, and an ever-growing ecosystem. Start simple, delivering immediate value, and identify one or two high-value use cases to gradually deploy over time. Reach out to us at V-Nova or to any of the companies with integrated solutions for personalized business case discussions.
How is LCEVC adoption in the industry?
Adoption is progressing at a rapid rate. Within 18 months of its ratification, key industry players have integrated or are integrating LCEVC into their solutions. Large-scale trials, such as the ones performed by Globo TV in 2022 for terrestrial TV and in 2023 for plug-in-free web delivery, are cementing its place in the industry. The standard has also been adopted by the next-generation broadcast standard SBTVD 3.0, which shapes the future of media delivery in Latin America, with inclusion also under way in other relevant broadcast specifications.
Exciting use cases, such as the ones in social media or cloud gaming and VR/AR, have also been gaining traction, as illustrated by a recent blog post from NVIDIA highlighting the LCEVC benefits for ultra-low-latency pixel streaming.
The MPEG-5 LCEVC showcase at NAB and IBC has seen rapidly growing participation from supporting companies, expected to reach around 40 at this year’s IBC, including encoders, platforms, media players, open-source projects, chip manufacturers and the most popular video SoCs used in STBs and TVs.
What are the licensing terms for using MPEG-5 LCEVC?
LCEVC’s licensing terms for entertainment services, published in May 2021, are as innovative and simple as the technology itself. Integration is free for solution providers. Licensing fees, which are low-cost and capped, are payable by the video services that deliver LCEVC-enhanced content to viewers, based on viewership and the business model of the service, producing relevant business cases with almost immediate payback.
What does the future hold for LCEVC?
With ongoing projects involving top global media companies, we foresee LCEVC becoming integral to future developments in social media, pixel streaming, video streaming, and broadcast. Stay tuned for cutting-edge VR/XR spatial entertainment applications – redefining the frontier of visual entertainment with exciting content from tier 1 Studios – and other breakthroughs. Join the LCEVC community or check out lcevc.org for the latest updates.