TM Broadcast International #125, January 2024


EDITORIAL

As we step into the dawn of 2024, the broadcast industry finds itself at the crossroads of innovation and evolution. With technological advancements continuing to shape the landscape, 2024 promises a thrilling journey for broadcasters worldwide. With 2024's first issue of TM Broadcast International, we explore the new expectations and breakthroughs that could define the year ahead.

One of the most anticipated developments in 2024 is the surge in generative AI applications within the broadcast domain. The power of artificial intelligence is set to revolutionize not only content creation, with generative AI algorithms producing visually stunning and dynamic visuals, but also distribution and monetization for broadcasters and content creators. This promises a seismic shift in how broadcasters produce and present content.

Within the present issue, readers will be able to find interviews and interesting articles. Continuing to discover new broadcasting tracks, we explore the experiences and perspectives of the Canadian Broadcasting Corporation (CBC), shedding light on their strategies to stay ahead in the dynamic broadcast landscape. This broadcast landscape is receiving the benefits of new advancements such as Augmented Reality, and this is why our in-depth article on AR explores how this transformative technology is reshaping storytelling. Discover how AR is breaking down the barriers between the virtual and real worlds, creating captivating and interactive broadcast experiences for audiences.

Closing the circle, TM Broadcast unveils how the market is waking up to the fact that the future of broadcasting lies in innovative content distribution strategies. Because of this, we explore the emerging trends that are reshaping how audiences consume content, from personalized streaming experiences to interactive live broadcasts.

And, last but not least, as the Golden Globes and other cinema industry awards are in the spotlight this season, we bring our readers a topical interview: we sit down with David Acereto, a Director of Photography internationally recognized for his participation in projects such as Bayona's 'Society of the Snow' and as DoP for 'Money Heist''s spin-off 'Berlín'. With this talk, we aim to gain insights into his creative process and the intersection of technology and storytelling.

Let's open the window, contemplate the horizon towards which we are headed and place our bets: we are starting not only a new year but a new cycle in broadcasting and content creation.

Editor in chief
Javier de Martín
editor@tmbroadcast.com

Creative Direction
Mercedes González
mercedes.gonzalez@tmbroadcast.com

Key account manager
Patricia Pérez
ppt@tmbroadcast.com

Administration
Laura de Diego
administration@tmbroadcast.com

Editorial staff
press@tmbroadcast.com

TM Broadcast International #125
January 2024
Published in Spain
ISSN: 2659-5966

TM Broadcast International is a magazine published by Daró Media Group SL
Centro Empresarial Tartessos
Calle Pollensa 2, oficina 14
28290 Las Rozas (Madrid), Spain
Phone +34 91 640 46 43



SUMMARY

6   News

22  Navigating technological frontiers - An in-depth interview with CBC on evolution and innovation
In an exclusive interview, CBC's technology leaders provide insights into the broadcaster's transformative journey over the past years. From pioneering IP-based MoIP systems to harmonizing OTT platforms and venturing into the realm of cloud production, CBC has been at the forefront of technological innovation. Discover the challenges faced, advancements made, and a glimpse into the future of broadcasting as CBC embraces automation, AI, and cutting-edge advances.

30  Augmented Reality in TV

34  Content distribution: the final word has not been said yet

40  David Acereto, DoP of "Berlin", "The Head" or "Sky Rojo"

56  CAT wiring
The audiovisual sector has experienced many innovations, changes and technologies that have marked the course of its history, but perhaps one of the greatest transformations to have taken place is the implementation of communications and connections through IP (Internet Protocol) network architectures.

64  Test Zone: Sony HXC-FZ90


NEWS | PRODUCTS

Ikegami elevates broadcasting with UHK-X600 4K solution

IKEGAMI UHK-X600 CAMERA

The recently announced UHK-X600 multi-format portable camera addresses the demand for durable and future-proof solutions, offering an upgrade from HD to UHD through a license-key based option. This camera supports 2x, 3x, and 4x high-frame-rate capture in HD, with an optional 4K format (3840×2160/50p) providing 2000 TVL resolution. Additional options include a SMPTE ST 2110 compatible media-over-IP interface board and 12G outputs.

The UHK-X600, an extension of Ikegami's Unicam XE product line, is designed for studio, OB applications, and battery-powered over-the-shoulder location production. Featuring three 2/3-inch CMOS global shutter imagers, it delivers 1000 TVL resolution, minimal aliasing, and high sensitivity. Measuring 34 x 24.5 x 15.5 cm and weighing 5 kg, the camera accepts all B4 ENG/EFP lenses, supporting chromatic aberration correction, vignetting correction, ramping compensation, and remote control of backfocus when supported by the lens.

Gisbert Hochguertel, Ikegami Europe Product Specialist, highlights the added advantage of high-frame-rate video capture for applications like slow-motion sports and wildlife documentaries. With the growing popularity of HDR in streaming services such as Apple TV Plus, Netflix, and Prime Video, the UHK-X600 supports Hybrid Log Gamma and offers flexibility in selecting BT.2020 and BT.709 color spaces. Global shutter imagers minimize artifacts when televising LED screen walls or flash/strobe illuminated stage environments.

Alan Keil, Vice President and Director of Engineering at Ikegami Electronics (USA), Inc., emphasizes the importance of the license-key option, allowing customers to invest in solutions that align with their current business model without risking obsolescence. The shift to 4K-UHD is particularly relevant in the USA, where ATSC 3.0 offers potential for terrestrial and satellite-based 2160p direct-to-home transmission.

The UHK-X700, an extension of the UHK-X600, includes UHD frame-rate operation as a standard feature and can operate in 4K/UHD at 2x speed or in Full HD format at up to 8x speed. Sales highlights include purchases by CERTV in the Dominican Republic and RE:LIVE Productions in Singapore.

"RE:LIVE Productions has operated Ikegami HDK-55 camera systems successfully for the past five years," comments Yasunori Kanno, President of Ikegami APAC.


LucidLink and Magicbox partner to boost virtual production within LED volumes

LucidLink has forged a strategic partnership with Magicbox, the creators of the cutting-edge mobile virtual production studio, to pioneer innovative virtual production workflows for motion picture productions and immersive experiences.

The modern virtual production paradigm within LED volumes has brought about significant transformations in the media and entertainment industry. This approach empowers content producers with greater control over the production process.

Magicbox recently unveiled the Magicbox Mobile Superstudio™, a fully functional LED volume and computer control center housed within a transforming semi-trailer. Visitors to presentations immersed themselves in a live virtual production set, becoming the stars of their own jungle adventure movie, all made possible by LucidLink. The interactive experience, aptly named 'Jungle Hunt,' took place inside a 600-square-foot semi-trailer.

With this new tool, the companies say, editors could remotely edit personalized souvenir movies from the cloud and clients can now swiftly connect to footage captured on set. With Magicbox, the set can be virtually anywhere. Whether a producer or an editor is located in the same building or a different time zone, the virtual production workflow remains seamless, providing shared access to media assets stored in the cloud as they are produced.

The Magicbox Jungle Hunt activation showcased the integration of several technologies prevalent in the media and entertainment industry. Footage captured on Red Komodo cameras was uploaded to Frame.io through the Camera2Cloud feature, then downloaded to LucidLink. Video editors efficiently inserted the clips into a templated sequence stored on LucidLink. The final souvenir movie files were swiftly uploaded to Frame.io.

Peter Thompson, CEO and Co-Founder of LucidLink, highlighted, "LucidLink is allowing creatives to concentrate and create at the speed of thought rather than the speed of technology."



ARRI announces Trinity Live, a broadcast-optimized upgrade of the first-generation Trinity stabilizer

ARRI TRINITY LIVE CAMERA STABILIZER

ARRI launches Trinity Live, an upgrade option for Trinity Gen.1, the first generation of ARRI's flagship, body-mounted camera stabilizer system. Comprising new hardware, wiring, and connectivity, the upgrade kit optimizes the Trinity Live system for live productions and improves its functionality within the ARRI Multicam System and third-party broadcast camera applications.

At the NAB Show in Las Vegas, ARRI announced its Camera Control Panel CCP Live and Tally System Gen.2, which expand the broadcast capabilities of existing ARRI cameras and camera stabilizer systems. The new Trinity Live complements both of these products and provides the ideal way to seamlessly integrate stabilized ARRI cameras into live, multi-camera broadcast environments.

Trinity Live allows the wireless video link that is required for broadcast work to be mounted at the bottom of the system instead of being attached to the camera, as is normally the case. Positioning a wireless video link such as the Vislink system at the bottom of the center post allows it to be used as a counterweight when balancing the rig, reducing its weight by up to 4 kg / 8.8 lb. The new position also results in a more stable wireless signal because the antenna rotates within a smaller radius and fluctuates in height less than it does when attached to the back of the camera.

Enhanced connectivity is at the heart of the Trinity Live upgrade, with four new 4G video lines and a 10 Gbit shielded Ethernet line facilitating the new wireless video link position. The connectors for these new video lines necessitate a completely new center post, post connection, top stage junction box, and wiring loom, all of which are included in the upgrade kit. In total, Trinity Live offers five 4G video lines. The four new lines feed the wireless video link with up to four quad link signals (4 x 3G), or with two video lines when Dual SDI (2 x 6G) is needed. The fifth video line serves as a playback line, allowing the operator to see the program image on a second monitor, transmitted wirelessly from the OB van to the Trinity Live. Meanwhile, the additional internal 10 Gbit Ethernet line provides the OB van with full camera control and communication.

Two further Trinity upgrade options are also available for specific Trinity Live use cases. The ARRI Master Grip Trinity upgrade gives operators simultaneous control of a zoom lens and the stabilizer's tilt axis via a common LBUS controller. The 24 V upgrade for Trinity Live allows the system to provide reliable 24 V camera power when operators need to fly an ALEXA Mini LF, ALEXA 35, or Sony Venice. Trinity Gen.1 rigs previously upgraded to 24 V must still be fitted with this new 24 V upgrade as part of a Trinity Live conversion.



Increased efficiency and new graphical user interface: VideoIPath update available

Nevion has announced the availability of the 2023 LTS release of its media orchestration platform, VideoIPath. This release focuses on providing enhanced functionality for orchestrating, operating, and monitoring broadcast infrastructure and media networks. The platform, a key component of Sony's Networked Live offering, is utilized globally by broadcasters and telecom service providers for various applications, including contribution, remote production, facilities, OB trucks, and GCCG.

The 2023 LTS release features a new graphical user interface (GUI) with improved search and filtering capabilities, faster interface refresh, and increased customization options. The GUI facilitates linking between different applications, streamlining the user experience and reducing the number of clicks required for increased operator efficiency.

Addressing the management of broadcast operations, the release emphasizes enabling workflows that leverage the flexibility of IP, especially in hybrid SDI/IP environments. New functionalities include support for camera control workflows and fast switching in IP, such as shading, along with added support for GPIO connection management and logic. The release introduces new north-bound interfaces, including legacy protocols, to support familiar systems and control surfaces.



NEWS | STORIES OF SUCCESS

NEP UK employs Sony HDC cameras to elevate live production across Europe

NEP UK, part of the global NEP Group, is gearing up for a state-of-the-art year in live productions. The company has invested in the latest Sony HDC-3500V cameras, marking the first deployment of these devices in Europe. This move follows NEP UK's strategic decision last year to incorporate Sony HDC-F5500 cameras, emphasizing a commitment to delivering top-notch visual experiences.

Throughout 2024, the HDC-3500V cameras will play a pivotal role in NEP UK's endeavors, spanning not only entertainment productions but also live sporting events, including those in vibrant locations like Paris.

The Sony HDC-3500V cameras, unveiled at NAB 2023, bring a revolutionary feature to the table – an optical variable ND filter. This innovation provides operators with precise control over the depth of field during live broadcasts, unlocking a realm of creative possibilities. The integration of these cameras into NEP UK's portfolio aligns with the company's dedication to pushing technological boundaries for its clients.

The company points out a standout feature of the HDC-3500V: the newly developed Virtual Iris function, coupled with the optical variable ND filter. According to the company's statement, this empowers vision engineers to pinpoint an iris 'sweet spot' on the attached lens, fine-tuning the camera's depth of field under varying light conditions. Engineers thus have the ability to adjust brightness using the optical variable ND filter instead of the lens iris.

In addition to the HDC-3500V cameras, NEP UK had previously embraced the Sony HDC-F5500 cameras, known for delivering a shallow depth of field with S35mm Large Format and seamlessly fitting into live production workflows. This setup, complemented by the ILME-FR7 and HDR-capable monitors such as the PVM-X2400 and LMD-A240 models, showcases NEP UK's commitment to offering comprehensive solutions to its clientele.

Donald Begg, Senior Director of Engineering and Technology at NEP UK, emphasized the company's commitment to merging exceptional talent with cutting-edge technologies. He stated, "The variable ND filter in the HDC-3500V permits, if desired, the creation of images with a distinctive artistic tone, and we have been witnessing a growing appetite for that look and feel from our clients."



Japanese NHK's international channel expands reach in Germany through waipu.tv launch

NHK World-Japan, the international service of Japan's public media organization NHK, is expanding its presence in Germany through its launch on waipu.tv. Starting from January 15, 2024, the 24-hour service will be available on the waipu.tv platform, accessible to all waipu.tv subscribers. This marks the first Japanese service on the platform, diversifying the channel offerings provided by waipu.tv.

Jun Takao, CEO of Japan International Broadcasting (JIB), the NHK subsidiary responsible for its international service's global distribution, expressed, "Today marks a very important day for us. With waipu.tv, we found a partner that will help us make NHK World-Japan even more accessible across Germany. We are excited about the launch and look forward to a successful partnership."

Susanne Thierer, Head of Content at waipu.tv, added, "Our aim is to offer relevant channels and content for every customer. The inclusion of NHK is an exciting addition to this variety and gives our customers an insight into Japanese culture and its views on the world."

The NHK international broadcast channel offers a 24/7 service, featuring hourly live international news from Tokyo and news bureaus around the world, as well as programs covering Japanese society, politics, science, culture, history, food, and lifestyle.


Clear-Com and Matsuda Trading upgrade Tulip TV's intercom system in new Toyama facility

IRIS PANEL

In a pivotal role, Clear-Com® has spearheaded the upgrade of Tulip Television Co. Ltd.'s intercom system for its state-of-the-art office building in Toyama City, Japan. As an affiliate of the Japan News Network (JNN), Tulip Television Co. Ltd. undertook a comprehensive modernization of its intercom system to meet the evolving demands of the broadcasting industry.

The entire upgrade process, spanning from the design and demonstration phase in 2020-2021 to equipment shipping in 2022 and installation amid Covid restrictions in early 2023, showcased a resilient and innovative approach to tackle challenges posed by the pandemic, both companies stated.

Facilitated by Matsuda Trading (MTC), Clear-Com's distributor in Japan, the deployed system incorporates a Clear-Com Eclipse® HX digital matrix system with FreeSpeak Edge® digital wireless. Leveraging Clear-Com's highly flexible mobile smartphone app, Agent-IC®, the solution delivers a cutting-edge communication platform, simplifying and streamlining communications for Tulip TV.

The core of the system is the Eclipse HX-Delta, functioning as a central matrix, equipped with an E-IPA card for IP connectivity. This enables seamless communication between V-Series Iris™ Panels, FreeSpeak Edge transceivers, and Agent-IC across different rooms and locations, with Iris panels strategically placed at remote master/news desks. All wireless transceivers operate through the IP network, with a 5 GHz FreeSpeak Edge system providing wireless coverage for beltpacks in the ground floor studio. Tulip TV's technical team emphasized the unity of all endpoints within a single communication system, featuring numerous Clear-Com Iris key panels in each sub-table location. Interoperability extends to cameras, radios, announcer talkbacks, and phone lines. The system designed for the new space offers Tulip TV an advanced communication environment, utilizing the Clear-Com system's functionality.

Tulip TV expressed satisfaction with the user-friendly EHX management software and highlighted its ability to create a diverse communication environment.

Tulip TV and its longstanding partner, Matsuda Trading (MTC), expressed their commendation for the technology and mutual dedication displayed during the challenging circumstances. MTC emphasized their gratitude towards Tulip TV for their perseverance in installing cutting-edge communication equipment during the Coronavirus pandemic and a difficult procurement period. The commitment from Clear-Com and MTC to continue providing a stable intercom system in the future was reiterated.


TCL joins Sony, Samsung, and Hisense in offering integrated NextGen TV receivers

Over 100 NextGen TV products are set to be available for U.S. consumers in 2024 as the deployment of ATSC 3.0 expands, reaching 75% of households. TCL has joined Sony, Samsung, and Hisense in offering integrated NextGen TV receivers, along with accessory upgrade receivers from ADTH, Stavix, Zapperbox, and Zinwell. ATSC 3.0, the broadcast standard powering NextGen TV, enhances the video and audio experience while facilitating the seamless delivery of over-the-top streaming content. The Consumer Technology Association reported that the cumulative U.S. installed base of NextGen TV receivers has surpassed 10.3 million, with consumer sales expected to increase by 45% in 2024.

TCL has announced plans to introduce new TV models with integrated NextGen TV capability. The ATSC exhibit at CES 2024 showcases the latest developments, with a focus on strengthening the number of available consumer receivers. ATSC President Madeleine Noland anticipates NextGen TV reaching a 75% household reach milestone in February. Apart from major TV manufacturers, accessory receiver models are expected to double in 2024, providing affordable options for upgrading existing TV sets. ATSC is pleased to welcome TCL to the list of TV makers offering NextGen TV. Additionally, ADTH, Stavix, Zapperbox, and Zinwell plan to offer NextGen TV certified and security-verified receivers, including some with digital video recording capability.

ATSC 3.0 has a global impact, with South Korea and Jamaica already on-air, Trinidad & Tobago transitioning in 2025, and Brazil and India evaluating ATSC 3.0 for handling various technological challenges. The ATSC exhibit at CES also showcases upgrade accessory receivers from ADTH, Stavix, Zapperbox, and Zinwell, offering diverse options to consumers. Dolby Laboratories is highlighting immersive entertainment through NextGen TV, featuring Dolby Atmos for fully immersive audio and High Dynamic Range (HDR) video like Dolby Vision. Local stations are expanding over-the-air (OTA) sports coverage, and broadcasters are expected to offer more natively-produced HDR sports programming in 2024.

ATSC 3.0's capabilities extend beyond television viewing, with demonstrations at CES showing how broadcast IP can offer more choices to viewers, advanced emergency information, unique broadcast "start over" functions, and additional navigation services. A suite of new music video channels is set to be announced at CES, bringing interactive music features to broadcast TV for the first time. Regulators in Brazil are evaluating the physical transmission layer of ATSC 3.0, and India is exploring ATSC 3.0 for transmitting television to mobile devices. Jamaica's consumer-facing advertising campaign for NextGen TV will be on display, part of its ongoing launch of ATSC 3.0 services.


dock10 and University of York ally on AI solutions for real-time lighting in virtual studios

This collaboration is designed to address challenges in multi-camera setups, targeting real-time solutions for enhanced broadcast quality.

University of York and dock10 collaborate on a research project using AI to improve realistic lighting in virtual studio productions. dock10 has embarked on a research initiative in collaboration with the University of York. This joint venture aims to tackle a significant challenge in virtual studio productions: the realistic simulation of light interactions using artificial intelligence (AI).

The project, funded by an embedded R&D grant from XR Network+ and led by XR Stories at the University of York, seeks to establish a strong connection between academia and the broadcast industry. The primary focus is on driving innovation in virtual production technologies, particularly in complex scenarios such as live sports broadcasts, game shows, and music performances that rely on multi-camera green screen studios.

In the realm of multi-camera setups, a persistent challenge is the absence of natural light interactions, necessitating post-production efforts to introduce shadows or reflections for actors placed within computer-generated environments. While LED volume techniques have been successful in achieving these effects in single-camera setups, the complexity of multi-camera environments demands simultaneous multi-angle captures, rendering current solutions less practical.

The collaboration between dock10's virtual studio experts and the AI research scholars at the University of York aims to pioneer AI-driven tools specifically designed for real-time applications in virtual studio settings. The ultimate objective is to develop AI-powered compositing solutions capable of seamlessly integrating realistic lighting interactions, thereby transforming live or 'as live' broadcasts.



DaVinci Resolve Studio used in "Next Goal Wins" film's post-production

Blackmagic Design has recently announced that the feature film "Next Goal Wins" was post-produced by FotoKem's David Cole using DaVinci Resolve Studio, the platform for editing, color correction, visual effects (VFX), and audio post-production.

Directed by Taika Waititi, "Next Goal Wins" tells the story of the notoriously terrible American Samoa soccer team, infamous for their brutal 31-0 loss in a 2001 FIFA match. With the approaching 2014 FIFA World Cup qualifiers, the team hires down-on-his-luck, maverick coach Thomas Rongen (played by Michael Fassbender), hoping he will turn the world's worst soccer team around in this heartwarming and humorous underdog story.

Cinematographer Lachlan Milne, who previously collaborated with Waititi on "Hunt for the Wilderpeople," also worked with Cole on two consecutive films, "Love and Monsters" and "Minari." Collaborating with Milne and Cole, Waititi focused on creating a unique feel for the film's two distinct sequences. "One of Taika's main directives was that American Samoa needed to lean into the world of 'hyperreal' or 'postcard idealized,'" said Cole. "In comparison, at soccer HQ, we were attempting to have a feeling of the more 'normal/formal' world of organized sports, and in particular, to enhance the 'interrogation' of Fassbender's character Rongen."

Cole utilized DaVinci Resolve Studio's toolset throughout the project. "We used the full arsenal of Resolve tools, including keys, animated Power Windows, and curves to have a consistent feel over the course of the movie," he added. "Custom DCTLs were used to handle gamut mapping of police car lights so that they didn't tear and look too digital. A custom film emulation was also created within Resolve and applied to the entire movie. The film's LUTs were also developed within the grading package."

With a wide variety of beautiful imagery, Cole particularly enjoyed grading the team mountain run: "I really loved the scene and following montage of the team running up a mountain where a delirious Rongen delivers a speech. Lush vegetation, beautiful skies, low sun, and enticing beaches all added up to a beautiful couple of scenes," he noted.

While challenging schedule-wise, Cole enjoyed working with such a visionary filmmaker as Waititi. "Taika is a filmmaker who is always busy and has many plates spinning at once," said Cole. "We began grading the film, then stopped for a year or more while he went off and shot 'Thor: Love and Thunder.' We then came back to an editorially refined version of the film and completed the movie. Judicious use of Taika's time during grading sessions and meetings allowed us to get inside his head and realize his vision on the screen."


LiveU's IP-video EcoSystem facilitates live broadcasts from Pacific Games

LIVEU IN ACTION AT THE PACIFIC GAMES

The 2023 Pacific Games held in the Solomon Islands achieved successful live broadcasts to viewers across participating countries, thanks to LiveU's IP-video EcoSystem. Chosen for its cost-efficiency and adaptability, LiveU's technology, backed by LRT™ (LiveU Reliable Transport), ensured resilient live broadcasts to various broadcasters and audiences spanning the Pacific Islands, Australia, and New Zealand. LiveU's team was onsite during the games, offering technical support.

The Pacific Games, established in 1963 and held every four years, foster camaraderie among Pacific Islands and neighboring nations through sports. The event features 24 sports, including archery, athletics, basketball, football, judo, and rugby.

Traditionally, the world feed was satellite-transmitted and centrally produced with selective content. However, organizers turned to LiveU for cost-effective and dependable IP distribution over the open internet using LiveU Matrix's fully managed service, accommodating multiple feeds from different venues.

LiveU's compact 5G LU300S encoders were utilized for point-to-point transmission, while LiveU Ingest facilitated cloud recording on-site. The rackmount LiveU Transceivers handled uplinks and point-to-point transmission. Chris Dredge, Country and Sales Manager – LiveU Pacific, remarked, "It was an honor to be part of these historic Pacific Games, a resounding success that showcases the potential of our EcoSystem. This significant production illustrates how LiveU can replace conventional broadcast methods for major sports events, encompassing live ground contributions, cloud ingest, and diverse distribution approaches."



Gravity Media Australia to deliver broadcast coverage of 2024 Santos Tour Down Under

Events South Australia and Gravity Media recently unveiled the broadcast technology and television production collaboration that will see Gravity Media Australia create and deliver coverage of the 2024 Santos Tour Down Under across Australia and multiple international territories. A total of 9 stages and 2 criteriums bring together female and male UCI WorldTour professional cycling teams to race on the streets of Adelaide and across regional South Australia every January. South Australia and the Santos Tour Down Under are the first destination for the world's best male and female cycling teams and riders competing in 2024. In partnership with the Union Cycliste Internationale (UCI), this event marks the opening of the UCI WorldTour, which also includes prestigious races like the Tour de France, Giro d'Italia, and Vuelta a España.

The Santos Tour Down Under is owned and managed by Events South Australia, a division of the South Australian Tourism Commission, on behalf of the South Australian Government. Gravity Media Australia has been appointed by Events South Australia as the "host" broadcast company for the Santos Tour Down Under. This marks Gravity Media Australia's fourth year as the broadcast and production partner for Events South Australia for this event.

In a major technological, logistical, and production effort, Gravity Media Australia is deploying state-of-the-art high-definition outside broadcast trucks and satellite units, over 15 cameras, including on-board "chase" motorcycles, aerial coverage using a helicopter camera, and a crew of up to 100 to cover the Santos Tour Down Under as it travels more than 1,000 kilometers across South Australia. Gravity Media Australia is also utilizing specialized broadcast technology and communication services through its Globecam business, which is internationally recognized for its innovation in developing world-leading specialty miniature cameras and RF communications solutions.

In addition to providing all broadcast technologies and acting as the "host" broadcast company, Gravity Media Australia has also been appointed as the television production company for the event, responsible for creating, producing, and delivering the complete television production of the Santos Tour Down Under for a global audience. Beyond its role as the "host" broadcast company, Gravity Media Australia has also been tasked with creating and producing Seven Network's coverage of the Santos Tour Down Under, using its SNG fleet and introducing highly mobile cellular and cloud-based technologies that connect to the Gravity Media Production Centre in Sydney, streamlining and enhancing the efficient delivery of Seven's complete broadcast production. Gravity Media's overall production includes 40 hours of live coverage and the production and delivery of a daily highlights program for both Australian and international audiences.

NEWS | BUSINESS & PEOPLE

TVN Live Production takes control of OB company TopVision

With this move, TVN aims to expand its expertise in OB van services and reinforce its broadcast capabilities

SEALED THE BUSINESS MERGER: MARKUS OSTHAUS (CEO) AND CHRISTIAN PANHORST (CFO) OF TVN LIVE PRODUCTION, ACHIM JENDGES (CEO) AND SHAREHOLDER MANFRED MÜLLER OF TOPVISION (FROM LEFT).

TVN Live Production GmbH has officially taken control of TopVision Telekommunikation GmbH & Co. KG, effective January 1, 2024. The acquisition encompasses the brand, the Berlin-based company headquarters, and the existing management structure. This strategic move sees TVN Live Production acquiring the entire shareholding of TopVision, a well-established OB van service provider headquartered in Berlin since its inception in 1993. Specializing in sports and soccer productions, TopVision operates with three OB vans and a workforce of approximately 40 employees, under the leadership of managing owner Achim Jendges.

TVN Live Production GmbH, a joint venture between TVN Group Holding (65%) and Sportcast GmbH (35%), entities with connections to Mediengruppe Madsack and DFL GmbH respectively, is positioning itself for enhanced capabilities with this acquisition. The merger aligns with the shared commitment to high-quality productions and technical expertise, characteristic of both TVN and TopVision over the years.

For Managing Director Markus Osthaus, the consolidation represents a logical step forward that aims to amplify performance and flexibility, ultimately benefiting the clientele: "Continuity and partnership are also our top priorities when it comes to customer relationships. We will continue to maintain these with our existing contacts, and we will be able to serve them even better – thanks to the greater possibilities."



Intel establishes independent AI software firm Articul8 with DigitalBridge backing

Intel has announced the creation of a new independent company, Articul8 AI Inc., focused on its artificial intelligence (AI) software initiatives. The venture is backed by digital-centric asset manager DigitalBridge Group and various other investors.

While the financial details of the deal remain undisclosed, Intel clarified that Articul8 AI would operate as an independent entity with its own board of directors, emphasizing that Intel would retain a stake in the new venture.

Intel's AI software, previously developed in-house, is now being spun off into Articul8 AI, a company jointly launched with investment firm DigitalBridge Group. Articul8 is positioned as an independent entity offering an enterprise-focused, fully integrated, and secure generative artificial intelligence (GenAI) software platform.

This strategic move follows Intel's substantial investments in the AI sector as it competes with industry leader Nvidia in the rapidly growing artificial intelligence market.

Key features of Articul8 AI:

1. Full-stack GenAI platform: Articul8 provides an AI platform with capabilities designed to keep customer data, training, and inference within the enterprise's security perimeter. It offers deployment options in the cloud, on-premises, or through hybrid configurations.

2. Collaboration with Intel: Articul8 leverages intellectual property and technology developed at Intel, with both companies committed to strategic alignment on go-to-market opportunities and collaborative efforts to drive GenAI adoption in the enterprise.

3. Leadership and funding: Arun Subramaniyan, formerly vice president and general manager in Intel's Data Centre and AI Group, takes the helm as CEO of Articul8. DigitalBridge Ventures serves as the lead investor, with additional support from a syndicate of established venture investors. Intel and its consortium of venture investors, including Fin Capital, Mindset Ventures, Communitas Capital, GiantLeap Capital, GS Futures, and Zain Group, have collectively taken an equity stake in Articul8.

BROADCASTERS | CBC


CONTROL ROOM.


In an exclusive interview, CBC's technology leaders provide insights into the broadcaster's transformative journey over the past years. From pioneering IP-based MoIP systems to harmonizing OTT platforms and venturing into the realm of cloud production, CBC has been at the forefront of technological innovation. Discover the challenges faced, advancements made, and a glimpse into the future of broadcasting as CBC embraces automation, AI, and cutting-edge advances. Through this interview, TM Broadcast International's readers will find some key insights into journalism and its relationship with broadcast and production technologies, as well as valuable hints about the future of the audiovisual market.

PHOTOS BY NANCY MCLAUGHLIN.

Can you tell us about CBC's technological evolution during these last years? What have been the main challenges CBC had to overcome since 2021?

In recent years, we have developed the expertise needed to design, implement, and maintain the IP-based MoIP systems at our new French Services headquarters in Montréal, the Maison de Radio-Canada. Our primary challenge since 2021 has been the transition to software-based audio/video processing equipment. The Montréal presentation has been successfully delivered and is on-air, while development is still underway for production systems in Toronto.


Nowadays, CBC has integrated different platforms for streaming services (news). How has the process been to create these platforms, and what solutions do you use for managing media and storage?

CBC/Radio-Canada distributes news, entertainment, and sports content across various digital platforms such as CBC Gem, CBC.ca, CBC Listen, CBC Kids, ICI TOU.TV, Radio-Canada.ca, Radio-Canada OHdio, and MAJ. While some tools have been developed in-house from the ground up, others leverage industry-standard frameworks.

Recently, our focus has been on harmonizing our OTT platforms, specifically CBC Gem for English Services and ICI TOU.TV for French Services. Although these platforms were initially built on separate systems, CBC/Radio-Canada decided to harmonize them for optimization and cost reduction. This initiative involved more than just visual alignment; it required the harmonization of various technological systems, including the CMS, subscription and billing systems, and advertising components. Moreover, we successfully delivered components to support over 14 partner platforms and applications, such as iOS, Android, TVOS, Android TV, Fire TV, Web, Chromecast, Xbox, Hélix, Telus Optik, and more.

Has CBC implemented robotic or automated solutions for production? Can you describe what kind of automation or robotic tools you use, and what brands or companies?

CBC/Radio-Canada is a leader in automated production. We heavily rely on automated production for all sorts of productions, ranging from local news production to flagship network morning shows and anything in between. Although we currently use industry standard software (Ross OverDrive), we have developed a full set of complementary solutions to make automated production software usable for lightly/loosely scripted production and breaking news scenarios. We are keeping an eye on the ecosystem and the user community that is building itself around Sofie, an open-source production automation software from our friends at the NRK.

During our last interview, you told TM Broadcast that CBC used virtual production just occasionally. Has that situation changed? Is CBC using more virtual production?

This is still the case. We don't do any pure virtual production on a daily basis, but we still use XR for select elections and sporting events, where it brings added value to our storytelling.

Does CBC work with cloud production? Last time we spoke, you told us that CBC used to work with a hybrid model that you tested, that works well, and that you were looking at deploying this concept for a large-scale project. Have you done it? With which storage providers do you work at the moment? And what challenges has CBC met during the last two years regarding cloud production?

A few POC tests were completed, and some more are in progress to test all sorts of use cases. We faced multiple roadblocks in this journey. Hybrid models are difficult to justify in most circumstances because they still require a significant amount of hardware on-premises, which means we don't really substantially save on our building/power/HVAC infrastructure. Also, many broadcast software vendors' pay-as-you-go models don't make sense to us as end-users, as the hourly cost for extra capacity is so high that most of the time it still is better to buy monthly, yearly, or even perpetual licences. The cost to move data in/out of the cloud is also significant. All this to say that the cloud seems to be better suited to some applications than others, and for us, we haven't found a good match for production workflows that were recently improved, changed, or transformed. That being said, it's another story for our playout systems, and we are in the process of moving all our FAST channels' playout into the public cloud.
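CBC's point about pay-as-you-go pricing is essentially a break-even calculation. As a purely illustrative, hedged sketch (the rates and hours below are invented placeholders, not CBC figures or any vendor's real pricing), the arithmetic looks like this:

```python
# Illustrative break-even between pay-as-you-go cloud capacity and a flat licence.
# All figures are hypothetical placeholders, not real vendor pricing.

HOURLY_RATE = 95.0        # assumed cost per hour of on-demand processing capacity
MONTHLY_LICENCE = 9500.0  # assumed flat monthly (or amortized perpetual) licence cost
EGRESS_PER_HOUR = 12.0    # assumed data-transfer cost per hour of produced content

def monthly_cost_on_demand(hours_used: float) -> float:
    """Total monthly cost if capacity is rented by the hour."""
    return hours_used * (HOURLY_RATE + EGRESS_PER_HOUR)

def break_even_hours() -> float:
    """Hours of use per month above which the flat licence becomes cheaper."""
    return MONTHLY_LICENCE / (HOURLY_RATE + EGRESS_PER_HOUR)

if __name__ == "__main__":
    for hours in (20, 60, 120):
        print(f"{hours:4d} h on demand: {monthly_cost_on_demand(hours):9.2f} "
              f"vs licence: {MONTHLY_LICENCE:9.2f}")
    print(f"Break-even at roughly {break_even_hours():.0f} hours per month")
```

With these invented numbers the licence wins from roughly 89 hours of use per month, which is the shape of the argument made above: for systems that run most of the day, hourly pricing rarely pays off.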


What can you tell us about VFX technology and how it has impacted the broadcast industry? And what about XR? What possibilities has this kind of tech opened up for broadcast production?

I would argue that the gaming industry had a more profound impact on the (especially live) broadcast industry than any other industry. Unreal Engine-based solutions significantly lowered the entry point for quality XR in broadcast applications.

What role does live graphics technology play in broadcast at this very moment? What kind of solutions for creating and managing live graphics does CBC work with?

We currently use Vizrt across most of our studios and production centres. Although that unified solution has brought us tremendous value over the years, we see an explosion of live graphics needs, and we believe that 80% of them could be better solved using HTML5 workflows. We are experimenting with our own SPX-based, in-house-built applications, and we are evaluating our options for the future.

Does CBC work with artificial intelligence? If the answer is yes, how does CBC apply this technology? What kind of AI solutions does CBC work with?

One example I can share is from our English-language news service. You can read more in this blog post from the Editor-in-Chief of CBC News, Brodie Fenlon: How CBC News will manage the challenge of AI.

«To be clear, many forms of AI are already baked into much of our daily work and tools. Suggested text, autocomplete, recommendation engines, translation tools, voice assistants — all of these tools fall near or under the broad definition of "artificial intelligence." What has made headlines and raised many questions in our newsrooms lately has been "generative AI," a version of the technology that uses machine learning on vast amounts of data to produce high-quality original text, graphics, images and videos. Consumer-friendly versions of generative AI tools like ChatGPT and DALL-E have increased the public's awareness of the incredible power and risks of this technology.

As with the emergence of any significant new technology, there are both opportunities and dire warnings about what's to come. While the future is uncertain, it's clear AI will disrupt society in ways that are still difficult to imagine, including in our work in public service journalism.

CBC/Radio-Canada is already an industry leader on standards and best practices. We are a founding member of Project Origin, which aims to set up provenance standards and a process for the authentication of original media (i.e. to ensure people know what they're seeing or hearing was actually produced by its purported source).

And in February, CBC/Radio-Canada signed on to a first-of-its-kind framework for the ethical and responsible use of synthetic media, which is any media that has been fully or partly generated by AI.

(…) At the heart of the CBC/Radio-Canada approach will be the principles of trust, transparency, accuracy and authenticity that are already core to our journalistic standards and practices (JSP). The bottom line: you will never have to question whether a CBC News story, photo, audio or video is real or AI-generated».

https://www.cbc.ca/news/editorsblog/cbc-twitternews-1.6873270

What big projects are on the horizon for CBC? What developments and new technologies is CBC thinking of deploying in the near future?

Nothing I can announce at this time, but stay tuned!

TECHNOLOGY | AUGMENTED REALITY

By Carlos Medina, Audiovisual Technology Expert and Advisor

My first experience in television came in my fourth year of Image and Sound university studies, at a public channel and on a great program. It was a contest show that took place on a large set full of props with great detail and spectacular scenery. You could really see the material side; everything was tangible. Surely, if augmented reality had existed at that time, it would have been an essential technology in the dynamics of this contest.

Nowadays, television sets are almost empty of scenographic elements, only equipped with LED screens and infinite backgrounds where the computer-generated scenery is embedded or inserted directly into the broadcast master video signal. Augmented Reality (AR) is making TV more TV-like than ever before. The word "television" is a hybrid of the Greek τῆλε (tēle, "far away") and the Latin visiōnem (accusative of visiō, "vision"). In other words, augmented reality is a great commitment towards the visual element.

Writer L. Frank Baum (Chittenango, New York, United States, 1856 - Los Angeles, California, United States, 1919) commented in one of his novels on the idea of "electronic lenses/screens that superimpose information on living beings". He called them "Character Markers." In 1990, a former Boeing engineer, Thomas P. Caudell, coined the term "Augmented Reality" (AR) in its proper sense. This reflection on AR is present in the very origin of the invention of television, when in 1926 the engineer John Logie Baird managed to transmit the first images on the screen of a TV set. With this, we do not want to disparage sound, but it was not until 1930 that the first simultaneous transmission of audio and image in black and white was achieved.

Visual content, almost from the beginning, has been adding techniques, technology and processes that have resulted in an increased value of image content, both at an informative level and in the ability to attract the viewer's gaze: from the use of transparencies, models and filters in front of the camera, to more sophisticated solutions such as working with layers, inserting logos/signs (DSK/USK) and visual post-production, up to the ability to embed some images within others (such as chroma key or luma key).

So, what does Augmented Reality (AR) mean in the TV broadcast environment today? The answer is a concrete one: the integration of visual elements in real time, mixing the real world captured by the cameras with computer-generated information/images (2D/3D), thus creating a perfectly synchronized hyper/ultra-realistic whole.


WTVISION REAL-TIME GRAPHICS AND AUGMENTED REALITY IN STOCK CAR

In this sense, AR is the best example of the good work that professionals in the broadcast environment and those in the IT/video games environment have done, both throughout the entire workflow and in the resulting image.

Three elements are necessary to make Augmented Reality technology possible:

• A capture device, such as a broadcast camera or even a mobile phone, which is what offers us the real part of this process, also called reality.

• Hardware and software. This will offer the virtual part (graphic content, backgrounds, computer-generated images...) that is added to the real part. These are the so-called graphics engines.

• A viewing device: from augmented reality glasses, to adapted contact lenses, and without a doubt the televisions we have in our homes or the screens of smartphones/tablets. With these devices, we are offered the resulting image (real part + virtual part).

Therefore, one of the most important aspects for Augmented Reality to become



a success among viewers, and an increasingly common and profitable production process among television networks, is the issue of graphics engines. These graphics engines result from the solutions that have been implemented in the field of video games, such as Unity or Unreal. For example, Unreal Engine's success in the TV industry is due to the fact that it enables real-time processing, video input, output, keying, blending, compositing, and rendering. And we insist on the fact that it is an essential condition that the whole process is done live and direct, between what happens on the TV set and what the viewer sees.

What is augmented reality (AR) for Unreal Engine? According to their own website: "Simply put, it is a technology that superimposes computer-generated images on the world around us. Let's think of it as a tool that literally augments or overlays reality with digital graphics."

The company Brainstorm has been gathering experience in real-time AR for broadcasting since 1995. Its current range blends in perfectly to create amazing Augmented Reality, combining the features of

virtual studios and high-end 3D graphics. Its InfinitySet solution includes unique features designed to enhance content creation and delivery, and can effortlessly integrate Aston graphics as Augmented Reality objects within the scene, with photorealistic rendering quality through the use of Unreal Engine.

Augmented reality is a mix between virtual and real that must work under several conditions that are inherent to the broadcast world:

• Working live, generating a broadcast master signal with AR.

• Hyper-realistic results for shapes, textures, volumes and finishes.

• Realistic lighting, reflections and shadows, matching real and virtual.

• Generation of 360º spaces and images.

• Facilitating hybrid TV production.

• Automation and integration into the data flow without time delays or lack of synchronicity.

• Tracking technology, both in regard to positions between cameras, physical space and virtual content, and relating to people and presenters in their movements and travel between the real and the virtual. What are also known as artificial vision techniques currently feature active sensors based on structured light, SLAM tracking or 3D tracking.

• Direct interaction between what happens in the real world and what is generated virtually.

Alongside traditional television media, new resources and technologies have been added, such as virtual scenarios, virtual reality, augmented reality, mixed reality, immersive reality, expanded reality, hybrid reality and/or the implementation of artificial intelligence (AI) in content production.

In the case at hand, Augmented Reality (AR) is in the implementation and experimentation stage in many sectors such as leisure, education, entertainment, tourism, and also among TV professionals. But the first results that we are getting in TV programs allow us to announce that this technology is going to turn the television medium into a programming offering featuring more educational, visual and spectacular content.
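As a purely illustrative aside, the real-plus-virtual mix described in this article boils down, at each frame, to compositing a keyed computer-generated layer over the camera image in sync. A minimal sketch of that per-frame composite, in Python with NumPy, with an invented frame size and a hypothetical render_virtual_layer function standing in for the graphics engine, might look like this:

```python
import numpy as np

HEIGHT, WIDTH = 1080, 1920  # assumed HD frame size

def render_virtual_layer(frame_index: int) -> tuple[np.ndarray, np.ndarray]:
    """Stand-in for the graphics engine: returns an RGB layer and its alpha matte.

    In a real pipeline this is where Unreal Engine, InfinitySet or a similar
    engine would render the synchronized CG content for this exact frame.
    """
    rgb = np.zeros((HEIGHT, WIDTH, 3), dtype=np.float32)
    alpha = np.zeros((HEIGHT, WIDTH, 1), dtype=np.float32)
    rgb[400:600, 800:1200] = (0.9, 0.6, 0.1)   # a flat rectangle as dummy CG content
    alpha[400:600, 800:1200] = 1.0              # fully opaque where the CG exists
    return rgb, alpha

def composite(camera_frame: np.ndarray, frame_index: int) -> np.ndarray:
    """Alpha-over composite: virtual part over the real part, frame by frame."""
    cg_rgb, cg_alpha = render_virtual_layer(frame_index)
    return cg_alpha * cg_rgb + (1.0 - cg_alpha) * camera_frame

if __name__ == "__main__":
    camera_frame = np.random.rand(HEIGHT, WIDTH, 3).astype(np.float32)  # dummy camera image
    out = composite(camera_frame, frame_index=0)
    print(out.shape, out.dtype)  # (1080, 1920, 3) float32, ready for program output
```

The real engineering lies in what the sketch hides: tracking the camera so the CG layer is rendered from the right viewpoint, and keeping the whole loop fast enough to run live.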



TECHNOLOGY | CONTENT DISTRIBUTION


Content distribution: the final word has not been said yet

It seems that, since the mass emergence of streaming services -whether OTT or IPTV- and the gradual (although far from complete) replacement of unilateral distribution systems -either satellite or terrestrial- there have been no major innovations to face them. However, quite new methods and improvements are beginning to appear in the way we receive content that will change, and a lot, the way we consume it. Let's dig a little deeper into that.

By Yeray Alfageme, Business Development Manager at Optiva Media, an EPAM company


As I mentioned in the introduction, the big impact of increased bandwidth and its democratization cannot be ignored. Since the mass spreading of fiber networks, which gave us sufficient bandwidth in our homes for enjoying digital video not to be an issue, as well as the development of high-capacity mobile networks, mostly 4G, consumption of digital content is no longer a technical problem.

Video was the last of the services -or perhaps among the last- to "go digital", since its bandwidth requirements made it necessary to have more advanced infrastructures than those used to handle emails or simply audio. This seems obvious but was not so evident a few years ago, especially for consumption of these services on mobile devices, which have boomed since 4G.

And after these technologies matured -and especially their use- the time has come for everything to take steps forward. And not only because of the arrival of 5G, but also because of new ways of providing these services by using it not just technically but in a more innovative way, so that it really is something different and not just replacing an antenna with a mobile phone.

Ultra-low latency

One of the things that "worsened" with the adoption of online solutions for distribution is the time "the signal" takes to make it from the production center to our devices. Going back to the times of analog terrestrial television, "latency" was practically negligible. The implementation of DTT saw this time increase to a few seconds, but in OTT services we are typically dealing with a range of 20 to 40 seconds, which is not negligible at all.

This latency is mainly caused by Content Distribution Networks, or CDNs. We already discussed them in a previous TM Broadcast article, so we are not going to go into great detail, but basically, due to the architecture of CDNs -on which the content is replicated over and over between servers located throughout the world- this time delay is inevitable. Or it used to be.

In addition to the latency accumulated on the CDN, there are two other points where it can be improved timewise: encoding and packaging. These two processes take place -let me allow myself the luxury of greatly simplifying the concept- at the source of the CDN, the single point from which the content is replicated for the entire network, and that is why any improvement here will have a marked impact. Encoding and packaging at the source are no longer dealt with in isolation from content distribution and caching throughout the network; they are designed together instead.



That is why, from the typical -say for instance, 20 seconds, although this can be reduced to 10-15 seconds in many cases- the delay can now be broght down to less than a second, which it is at par with terrestrial distribution services. And this is not only a technical advantage but it also allows to provide new services around digital content that until now were not viable. For example, interactivity is no longer an issue. I am not referring onbly to two-way video -that too- but to massive interactivity that

can provide real-time access, for millions of simultaneous viewers, to a live event, enabling them for example, to participate in it in the same way as it was done in the past through a phone call. Another application -although a more controversial one- is betting, especially at sporting events or online casinos. In certain markets this sounds somewhat foreign, but it undoubtedly represents a very large percentage of content and, above all, of business. This environment also needs all viewers to receive -not

only as soon as possible, but simultaneously- the content. And this is already possible by means of low-latency, synchronized streaming techniques.
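To put rough numbers on that improvement, the sketch below compares a classic segment-based OTT chain with chunk-based low-latency packaging. It is only an illustration in Python: the figures are assumptions chosen to show the orders of magnitude, not measurements of any particular CDN or service.

# Illustrative latency budget: classic segmented OTT vs. chunked low-latency delivery.
# All figures are assumed, round numbers for the sake of the example.

def segmented_latency(segment_s, buffered_segments, encode_s, cdn_s):
    # Classic HTTP streaming: the player usually buffers several whole segments.
    return encode_s + cdn_s + segment_s * buffered_segments

def chunked_latency(chunk_s, buffered_chunks, encode_s, cdn_s):
    # Chunked packaging: fragments are pushed through the CDN as soon as they are encoded.
    return encode_s + cdn_s + chunk_s * buffered_chunks

classic = segmented_latency(segment_s=6.0, buffered_segments=3, encode_s=2.0, cdn_s=1.0)
low_lat = chunked_latency(chunk_s=0.2, buffered_chunks=3, encode_s=0.5, cdn_s=0.3)

print(f"Segmented OTT: ~{classic:.0f} s behind live")   # ~21 s
print(f"Chunked:       ~{low_lat:.1f} s behind live")   # ~1.4 s

The point of the toy model is simply that segment duration multiplied by the player buffer dominates the total, which is why encoding and packaging shorter fragments at the origin, designed hand in hand with the CDN, is where the seconds are actually won.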

Hypercustomization

Another concept that has had a very strong impact on content distribution in recent years is FAST channels, which are linear channels created from on-demand content catalogues, offered for free and monetized through advertising. We are not discussing this business model here again, since we also covered it at length in TM Broadcast, but we will delve into the concept of hypercustomization that it can bring us. Offering a thematic channel with content to the taste of viewers together with relevant ads is nothing new; these are niche channels, and there are countless regular TV channels of this kind. However, each viewer receiving their own 100% customized FAST channel, with relevant ads that interest them, assembled dynamically and updated according to their tastes, preferences and content consumption, is really going the extra mile.


One of the main advantages of digital distribution is that we have, in real time, individualized consumption data of what, when, where and how each viewer is watching. And we can make use of this huge amount of data, no doubt, but how? Well, one of the uses that can be given to the data is to automate the creation of these linear FAST channels so that each user receives what is of interest to them, including the ads that may attract their interest, this offering being different from that of others, that is, hypercustomization. Due to the huge amount of metrics on content consumption that are available, the large amounts of metadata that the VOD catalogs have and the variability of the existing advertisers, the use of AI mechanisms that allow managing all this in a viable way is almost essential. Machine learning and deep learning are currently being applied to automate and


customize the individualized offer of content to viewers to an incredible extent. And far from seeing this as an intrusion into decisions about what we want to see, when and how, we perceive it as an aid with which to discover and consume new content and receive customized offers on products and services that may be of interest to us and that we may want to purchase. It's almost perfect.
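As a hedged illustration of the idea -not the algorithm of any real platform- the following Python sketch scores a small catalogue against an individual viewer's genre affinities and assembles a personal linear rundown from it. Asset names, genres and weights are invented for the example.

from dataclasses import dataclass

@dataclass
class Asset:
    title: str
    genres: set
    duration_min: int

def score(asset, genre_weights):
    # Affinity of one viewer for one asset: sum of the weights of its genres.
    return sum(genre_weights.get(g, 0.0) for g in asset.genres)

def build_personal_channel(catalogue, genre_weights, hours=3):
    # Fill a fixed time budget with the best-scoring assets for this viewer.
    budget = hours * 60
    rundown, used = [], 0
    for asset in sorted(catalogue, key=lambda a: score(a, genre_weights), reverse=True):
        if used + asset.duration_min <= budget:
            rundown.append(asset.title)
            used += asset.duration_min
    return rundown

catalogue = [
    Asset("Motorsport documentary", {"sport", "documentary"}, 50),
    Asset("Cooking show", {"lifestyle"}, 25),
    Asset("Crime drama, episode 1", {"drama", "crime"}, 45),
]
viewer_profile = {"sport": 0.9, "documentary": 0.6, "crime": 0.2}
print(build_personal_channel(catalogue, viewer_profile))

In a real deployment the weights would come from consumption metrics and machine-learning models rather than a hand-written dictionary, and ad decisioning would be folded into the same loop; the sketch only shows where that data plugs in.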

5G Broadcast

If 4G democratized video consumption on mobile devices -WiFi aside, although in the end a WiFi network is linked to a land line, not a mobile network- 5G comes to take the next step. More bandwidth will no doubt help improve both the quality and quantity of the services we can consume; lower latency means more opportunities for better interactivity, shorter response times and even other businesses such as betting; but the real change is going to be 5G Broadcast.

What is that 5G Broadcast thing? Isn't it digital distribution anyway? Actually, not really. Currently, thanks to CDNs, each user has their own copy of the content they want to view. There is no single point from which everyone consumes content without replicating it, hence the need for CDNs, but thanks to 5G networks a new way of distributing digital content emerges. Beyond one-to-one distribution, what is currently an OTT service -even though everyone consumes the same content- opens the door to what DTT or satellite networks offer: one-to-many distribution. With 5G Broadcast it is possible to make a stream available on the network at a specific URL, viewed by all the devices connected to that network without the stream being replicated throughout the network. Imagine the bandwidth savings this means and the amount of simultaneous services we can offer in this way.
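A back-of-the-envelope comparison makes that saving concrete before we go on; the audience size and bitrate below are assumptions picked only to illustrate the arithmetic.

# Unicast OTT: every viewer pulls their own copy through the CDN.
# 5G Broadcast: one shared stream on the network, whatever the audience size.

viewers = 1_000_000      # simultaneous viewers on one network (assumed)
bitrate_mbps = 8         # a single video profile per viewer (assumed)

unicast_total_mbps = viewers * bitrate_mbps
broadcast_total_mbps = bitrate_mbps

print(f"Unicast:   {unicast_total_mbps / 1_000_000:.1f} Tbps aggregate")
print(f"Broadcast: {broadcast_total_mbps} Mbps, independent of audience size")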



By comparison, it would be like offering an IPTV service -for which the network is controlled and private, and in which there is a single stream that all users consume in a controlled environment- but on a public network, the Internet over 5G. If you did not know this concept and your head has not burst yet, read it again 😉. It's squaring the circle. It is as if we had the advantages of low latency, low bandwidth consumption, high availability and quality offered by terrestrial or satellite content distribution networks, but with all the benefits of digital distribution in regard to metrics, hyper-customization possibilities and added services. It sounds really good. Through 5G Broadcast we will be able to efficiently distribute a lot of content to all users with less latency and in a more reliable way than a conventional OTT network over CDNs. It will no longer matter where or on which device the content is consumed: everyone will view the same thing, at the same time and with the same

quality. In addition, thanks to the bandwidth savings achieved by the use of this technique, we will be able to offer other types of services or increase the range of offerings available. One of the additional services will be immersive or augmented reality experiences, in which -without the need for specific hardware such as 3D goggles or similar- viewers will have the opportunity to experience the content with their mobile device in their environment and not only on their screen. Not only the content but also the ads and promotions can make use of this immersiveness to have more impact and be more personal. There's a new world out there.

Conclusions

Thanks to the maturity of digital content distribution networks -OTT and IPTV- new forms of content distribution are beginning to be explored and exploited. These distribution methods are already strong enough to allow a new twist.

Ultra-low latency mechanisms that allow content to be distributed simultaneously to all devices, "almost" in real time, as we were used to with traditional networks, and hyper-customization services through FAST channels or similar, are just some of the examples we have explored. However, 5G Broadcast will enable going a step further. It represents a technology by which all devices on a network can consume the same content at the same time without the increase in bandwidth needs that this currently entails in OTT services. This will allow us to combine the best of both worlds: the low latency, synchronism and high availability of terrestrial and satellite content distribution networks with the real-time metrics, bidirectionality and broad offering available in digital networks over the Internet. And this will lead to the next step in content consumption: hyper-customization.



DOP| DAVID ACERETO

Exploring the paths of light

David Acereto’s love story with the audiovisual world began more than two decades ago and ranges from internationally renowned movies to long-awaited series and films such as “Berlin” or “Society of the Snow”. During this conversation, Acereto explained his experience in large format productions, the logistical and aesthetic challenges he had to face and how each choice made was a necessary step to translate the director’s vision at a technical level. Exclusively for TM Broadcast, in this interview Acereto revealed how he combines the simplest tricks with cutting-edge techniques, filters and specific solutions to achieve unique visual effects. Acereto addressed how innovations should serve the core essence of cinema: telling stories that thrill.


SHOOTING “THE HEAD” (SEASON TWO) COPYRIGHT NIETE


What were your first steps as Director of Photography? How did you get started in the world of content production?

My first steps as Director of Photography date back to my days in Barbate, where I studied image and sound at a high school. Joaquín Cor, my first video teacher, was crucial in recognizing my skills for the AV world, thus flinging the doors open for my career. Spurred by his advice, I decided to move to Madrid and complete my training in photography directing at the Séptima Ars Academy.

Upon arriving in Madrid, I initially worked as a camera assistant and attendant, mainly with Ángel Iguácel, an experienced cinematographer. Subsequently, I took on roles as a lighting technician at Enfoco company, where I had the opportunity to work in technical lighting and participate in the making of short films, my true passion. I was involved in about 30 short films, including "The Raven" by Tinieblas González, as well as "Fade" by Eugenio Mira, the latter reaching a great visual impact.



My debut as Second Operator, that is, Camera Operator, was in 2001 with “The City of No Limits” by Antonio Hernández. Since then, I have worked on more than 45 films, firmly establishing myself as a Camera Operator. In my latest projects, I have also worked as Director of Photography in many productions. I can highlight the good fortune I had to take part in renowned international projects, such as “Manolete,” “Red Lights” with Robert De Niro, “The Gunman” with Sean Penn, “Biutiful” by Iñárritu, “The Promise” with

Christian Bale, and “Exodus” with Ridley Scott. In 2018, during the series “El Embarcadero” (The Pier), having Migue Amoedo and Álvaro Gutiérrez in charge of photography and where I worked as a Camera Operator, I was commissioned to lead a unit, which marked my transition to Director of Photography. After that debut in the international series directed by Jorge Dorado, I have continued to work on numerous fiction series and the feature film “Lost & Found”.

What has been your experience directing/ working on studio productions? Do you prefer shooting in a studio or outdoors? I think the answer to that question depends on the project being carried out. I don’t usually make personal decisions about it, but the nature of the project dictates the ideal conditions. For example, if I’m working on a film that requires a very realistic approach, natural light becomes the preferred choice. However, in situations where technical conditions demand the use of LED screens, chroma or other forms of more controlled lighting, the logical option would be to shoot indoors. Fortunately, I think each project benefits from both modalities. As a director of photography, it is essential to remain receptive to seeking technical solutions that expressively impact and meet the expectations of the director or producer. In this sense, each film or project should be evaluated sequence by sequence and decisions taken based on the specific needs of the narrative. In the audiovisual process, the guidelines are strongly


influenced by the streamlining of resources for each particular moment. For example, if you need to shoot an impressive sunset outdoors, it is essential to adjust the work plan to make the most of the right schedules. However, if the production cannot afford multiple sunsets and numerous shots must be captured, it may be necessary to resort to a set to reproduce the light in a controlled manner. Ultimately, the decision is based on a comprehensive assessment of factors such as costs, production requirements and the ability to adapt to various conditions.

The combination of natural light and studio lighting is what really provides the visual truth in the end. Some films, such as "The Mandalorian," use LED screens to recreate sunsets, thus allowing precise control over the making and use of light. Although on the set we endeavour to emulate natural light, it is crucial to consider that natural light has limitations in terms of schedules and team travel.



What optics and what technical equipment always work, never fail, whatever the subject of the production?

The choice of optics and technical equipment that never disappoint, regardless of the subject of production, is not something that can be reduced to a one-size-fits-all set. It is essential to have a team that adapts to each project's specific needs. The decision about the type of lens, camera and lighting is closely linked to the director's and producer's vision, and as a cinematographer, my goal is to translate that vision to a technical level. Every project requires careful consideration, from the glass in the optics to the texture and sensor of the camera.

SHOOTING “THE HEAD” (SEASON TWO) COPYRIGHT NIETE

“The combination of natural light and studio lighting is what really provides the visual truth in the end”


“Nowadays, camera sensitivity has become crucial, especially in night-time situations or scenarios that require little lighting”

Choosing a sturdier or less sturdy camera also plays an important role, especially when multiple cameras are used and a balance is sought between the quality of the glass and the camera's durability. Personally, I have a preference for anamorphic optics, and I have been fortunate to work with the Panavision G-series. However, these choices are also conditioned by budget constraints and the specific aesthetic requirements for each film. While I usually shoot with ARRI Alexa or Sony Venice, this decision is made in coordination with production and always in keeping with financial limitations.

Nowadays, camera sensitivity has become crucial, especially in night-time situations or scenarios that require little lighting. The dual sensitivity of cameras such as Sony Venice and Arri 35, with ISO up to 2500, provides useful technical responses in these situations.

In short, the choice of technical equipment is based on a detailed assessment of the script requirements, conversations with the director and production constraints, always seeking to optimize performance with the available resources.

SHOOTING "THE HEAD" (SEASON TWO) COPYRIGHT NIETE

When the shooting takes place outdoors, do you take part in the choice of locations? What process do you follow to detect potential problems and determine shooting and lighting requirements when filming?

In outdoor filming, as a cinematographer, I play a crucial role in the choice of locations. Oftentimes, there are challenges in aligning with the director's vision and optimizing the work schedule. Although sometimes it is necessary to give in because

of the production value of a location, I have an important role in the planning of the filming hours.

To organize a shooting in a unique location throughout the day, I use tools such as Sands Seeker and solar positioning programs. This involves marking the fields for shooting at specific times, coordinating with the director's assistants to define when to film each shot and reverse shot. In aesthetic terms, collaboration with the art director is essential to maintain continuity and stylistic unity outdoors.

The technical choice is crucial when studying a location, considering not only the light but also aspects such as the need to cover it, thus keeping a stylistic continuity in different environments. Preparation includes anticipation of atmospheric elements, such as clouds, that affect lighting. It is critical to be



ready to make quick decisions and adapt to unexpected changes during outdoor filming. In this sense, the use of the sun and the need to cover it sometimes require considerable equipment, such as palios or large fabrics to maintain a uniform register. My ability to modify lighting is essential, and I make sure I'm prepared with tools like cherry pickers, palios, and gauze to counteract the effects of intense sunlight. In short, shooting outdoors means being equipped and ready to face the challenges that nature can pose during filming.

In large format productions like "The Head", we assume that coordinating with an international team is one of the biggest challenges. What has been the main difficulty in this series in particular? What challenges does shooting in snow or in environments saturated with white/light pose?

“In the first season of “The Head”, we were faced with a complex challenge when setting the entire plot in a station in Antarctica”


In the first season of “The Head”, we were faced with a complex challenge when setting the entire plot in a station in Antarctica. The geographical and climatic isolation had to evoke a threatening and isolating dramatic tone, which was essential for the thriller to work. Collaborating with Jorge, we sought to integrate the cold in a palpable way in the film, which involved considerable technical issues.



Originally, we planned to shoot on a glacier in Iceland, representing the outdoors, for about two weeks. In addition, we used a chroma set with snow in Tenerife to simulate the outdoor landscapes. This approach required careful coordination in order to maintain aesthetic continuity. The chroma technique was crucial, although we needed to shoot photogrammetry and obtain backgrounds and plates during our stay in Iceland. The goal was to have viewers being unable

to distinguish between the real base, the glacier and the chroma set, thus creating a technical puzzle.

In season two, we transferred the action to a boat, thus maintaining the atmosphere of isolation. Shooting on a real ship and on a set presented significant logistical challenges. For example, orienting the ship offshore, which was vital for obtaining 360 degrees with no land in sight, required considerable logistical infrastructure. The production, with the invaluable contribution of Ana Vilella, made an outstanding effort to provide the necessary resources.

The choice of specific textures, such as rust on the wet boat, was strategically used to strengthen the thriller-like atmosphere. The differentiated color palette in various parts of the ship, along with specific color treatments, contributed to the visual narrative. These subliminal elements, such as changes in colour palette and textures, became expressive tools to enrich the viewers' visual experience.

SHOOTING "THE HEAD" COPYRIGHT JACKES MEZGUER

In "Sky Rojo", a more pop-like, modern approach was sought, which was reflected in the use of vibrant colors and contemporary textures

"The Head" and "Sky Rojo" share location scenarios such as Madrid and Tenerife, as the burnt hues of volcanic earth fit perfectly with "Sky Rojo", for example. However, "The Head" stands out for a cooler setting, as we mentioned before. What are the differences in the equipment used in both productions? How did you


manage to adapt the light flux to suit the style of the series? In the first season of “The Head,” we used an oil rig that Adán wanted because of its characteristic ‘Ugly Nice’ look. This discovery proved crucial, as it offered a unique authenticity with its patina, rust and wear, as these elements are difficult to recreate in a set. This rig became a valuable resource that we used again in the series, even in intermediate productions such as “Sky Rojo”.

SHOOTING “SKY ROJO” COPYRIGHT TAMARA HERRANZ

Each project has its own specific look that is influenced by specific iconography and visual narrative. In “Sky Rojo”, a more pop-like, modern approach was sought, which was reflected in the use of vibrant colors and contemporary textures. RED Helium cameras and Leica optics were suited to this style, whereas for “The Head”, we chose the Sony Venice with Cooke lenses. The choice of equipment is not limited to technique, but it is adjusted to the visual style and narrative


DOP| DAVID ACERETO

of each particular series. It is crucial to be honest and look for the best for the sake of the story and be careful to not fall into the trap of repeating successful formulas. Photography must be at the service of the plot, and we must adapt our optical approach according to the needs of each project. In “Sky Rojo”, I used seasons two and three to set my own aesthetics, although I had inherited Migue Amoedo’s vision in season one. Although timewise the series do not share continuity, we strive to maintain an aesthetic coherence so that viewers can visually recognize each universe. In Spain, where many projects share sets, the challenge lies in giving each location its own aesthetic universe. Today’s photography used in fiction series focuses a lot on immediate iconography, seeking to make even a few frames recognizable and characteristic of the series. This approach contributes to an aesthetic recognition by viewers, thus creating a quick and effective visual connection with the narrative of the series.

In the case of "Berlin", although the series is part of the universe of "The Money Heist", the visual style is completely different and bears a resemblance to "Amélie" in its hues, moving from a fast-paced action scenario to a romantic comedy of white-collar thieves. What solutions or filters have you used to achieve that colour range and definition, so close to a painter's stroke, in the series?

In the case of "Berlin", which is considered a prequel to "The Money Heist," a radically different visual style can be seen. Hues and definition, which may resemble those seen in "Amélie" because of the surroundings of Paris, with its cobblestones, orange light and romantic scenes, are achieved through the use of filters and specific solutions. The choice of on-camera filters is made with the aim of obtaining a technical and aesthetic resolution, thus avoiding pores and blurring imperfections in the skin. However, the result in terms of color gamut and definition is attributed to a greater extent to the handling of the LUTs used. In this case, the series has three directors of

photography: Migue Amoedo, Sergio Bartroli and myself, each of us contributing to the agreed aesthetic unity. The aesthetics in “Berlin” features a break in both the dramatic and the aesthetic elements as compared to “The Money Heist”. The colour palette, defined from wardrobe and art, sets rules for interiors, exteriors and night scenes, thus creating a visual coherence. Although previous agreements are reached, there is the freedom to adapt the dramatic tone in each sequence, always in observance of the rules laid down. The process is not limited to the application of filters but involves the use of specific colour elements for each sequence or environment. In the chase at the airfield, for example, a particular universe was explored with tuned-up cars and neons, thus maintaining the overall aesthetic unity. The series seeks almost hyperbolic and deeply marked aesthetics, aligned with the aim of being a ‘global show’ for an international audience. In “Berlin”, we are proud of the fact that the viewers will be unable to tell which DoP has worked in each sequence,


as we all contribute to the aesthetic coherence of this production of global scope. Virtual production has played a big role in the filming of “Berlin” and “Sky Rojo”. What are the differences between how you approach shooting in virtual scenarios as compared to the outdoors, as for example, the locations in Cabo de Gata for “Sky Rojo” or on the freighter in “The Head”? As for virtual production, its role has been significant in the most recent projects, such as “Berlin” and “Sky Rojo”. The introduction of LED screens has been a breakthrough, as we are pioneers in their use at an industrial level. Unlike the chroma key, LED screens provide complete aesthetic control by not relying on outdoor environments. This technique offers assurance and control, making it easier to direct scenes. As compared to outdoor filming, where factors such as road cuts and weather conditions have an impact on time and safety, a virtual set provides a more controlled environment. In addition, the flexibility of LED displays allows you to adjust digital backgrounds and


light effects according to the relevant needs, thus offering technical and visual solutions. The virtual set becomes an effective alternative, especially in challenging situations proposed by the director, such as a night chase at an airfield with actors on a vehicle's top. The complexity of these scenes is resolved more efficiently and safely in a controlled environment. Although the implementation of this approach must be carefully evaluated in terms of profitability, especially given the effort and infrastructure required, it is envisaged as a forward-looking solution that stands out for its ability to address logistical and creative challenges.

Have you ever used some rudimentary photography trick to get a special effect?

In photography, I have resorted to various techniques to achieve special effects. Simplicity is often effective, as the simplest solutions tend to rely on common sense. The imperfections of an optic, for example, can generate



different flares, and special glass units placed in front of the lenses can be used to create unique refractions. In "The Head," I used these pieces of glass when the characters were experiencing a T3, a state of brain freezing, to add a visual layer to associate with that condition.

“The virtual set becomes an effective alternative, especially in challenging situations proposed by the director, such as a night chase at an airfield with actors on a vehicle’s top“

I have also used diopters for macros, providing an expressive and vaporous resource, especially in situations where you want to convey a sense of subjectivity. To achieve an effect of strangeness, we have shot scenes backwards or at 36 frames instead of 24, thus generating an altered perception that reflects a peculiar state in the narrative. Resorting to more traditional techniques, such as shooting a scene backwards or hitting the camera during a fight, has also been effective in conveying certain visual or emotional elements. In classic cinema, I would make use of camera flashes or non-bleaching coating to achieve specific shades. These techniques, although ancient, are like a toolbox of expressive resources that must be kept at hand. From the use of mirrors

to fiberglass, mockups and macro optics, each element becomes a technical solution to address specific challenges during filming. Cinematography has evolved, but these classic tools are still effective and used more often than one might think. They are versatile and creative solutions that are applied when no other means are available.

And what technological development that does not yet exist would you ask from an almighty manufacturer?

In the technological field, artificial intelligence has provided various solutions catering to our desires. However, I consider it crucial to properly use these advances. The technological revolution has been phenomenal, from the transition to the digital format to the arrival of 8K resolution, HDR and full frame. These innovations present wonderful opportunities that, if misused, can lead us away from the essential purpose: to tell stories in impactful ways. An excess of technical tools can lead to visual uniformity and strip films of their unique identity. Sometimes the desire


to make the most of technical capabilities can result in a loss of the visual diversity needed for a complete narrative. It is vital to keep in mind that a story requires dark moments and moments of ugly light to truly stand out. My wish, in this sense, would be to return to the essence of telling stories that move and thrill. Often, in the pursuit of dazzling visuals, the emotional connection that a well-told story can bring is lost. I long to return to the ability to move and thrill through narratives that use magic in meaningful ways, rather than simply by means of a visual show. Technology is valuable, but it must be at the service of the story, creating images that are ideal for conveying specific emotions. In short, my wish is that technology does not take us away from the essence of emotion through well-told stories. Which production has been the most rewarding and which has been the most challenging or complicated production you’ve done so far? How did you solve these challenges? As a camera operator, the most outstanding experience was Antonio Hernández’s


WITH UMA THURMAN IN BACKWOODS

“The City of No Limits”, being my first film, at 26, and sharing the set with Fernando Fernán Gómez and Geraldine Chaplin’s daughter. I also had significant moments as a second operator in international productions such as “Red Lights” with Robert De Niro, Sigourney Weaver and Cillian Murphy. As a DoP, my debut in “The Head” with Jorge Dorado was crucial, as he saw in me the ability to tell his stories. Other highlights include “Lost & Found” and this year, participating as a camera operator in “Society of the Snow.” In this latest film, being part of a major project and collaborating with DoP Pedro Lu was an exciting experience.

As for the more complicated production, "The Head 2" stands out, especially during the scenes on the ship. Shooting on the high seas with actors, at the mercy of the waves and the sun, was a significant challenge. I also faced similar complications during a "Sky Rojo" episode that involved several weeks at sea. Shooting at sea is always tricky and has been one of the most challenging experiences in my career.

What future projects do you have underway?

Currently, I consider myself fortunate to have several projects underway. This year I was offered a film by Annapurna, a Hollywood production company, directed by Eugenio Mira and entitled


"El Autoestopista" (The Hitchhiker). Although it has been delayed due to the actors' strike, I hope to carry out this project with Eugenio Mira. Soon, I will shoot the film "Tierra de Nadie" (No One's Land) with Albert Pintó, director of "Nowhere" (2023), which will address the issue of drug dealing in Cádiz. I have also recently worked on a teaser for the action film by Manu Carballo and Teo García.

WITH ROBERT DE NIRO AND RODRIGO CORTÉS

"In short, my wish is that technology does not take us away from the essence of emotion through well-told stories"

In addition, I am excited about a future project that will be offered next year, which marks the filmmaking debut of Lucía Jiménez, a Spanish actress with whom I shot a short film at the beginning as a director. She has written a wonderful script with a feminine approach to motherhood, entitled "Mi hijo en la tierra" (My Son on Earth). I can't wait to be a part of this film.

As for other opportunities, I have received a call to participate in "El Bunker" (The Bunker), a series that Vancouver Media is developing as its novel project after "Berlin". Vancouver Media has always been a home to me, and they call me to participate in their projects.

In short, although I have yet to decide on my choices for next year, I am grateful and excited to have so many options and opportunities in my career.



TECHNOLOGY| CAT WIRING

Full-fledged connectivity

CAT wiring By Carlos Medina Audiovisual Technology Expert and Advisor

The audiovisual sector has experienced many innovations, changes and technologies that have marked the course of its history, but perhaps, one of the greatest transformations having taken place is the implementation of communications and connections through IP (Internet Protocol) network architectures. IP production is a technological and conceptual paradigm shift of enormous impact on professional audiovisual environments, events and live broadcasts, which affects workflows in such specific fields as video, production, lighting, sound and visuals, among others; and which has spread throughout the various stages of the audiovisual industry, from planning to broadcast/ marketing, undoubtedly through contribution/ production of audiovisual content itself.


One of the aspects that shows that IP audiovisual production is here to stay is the standardization for implementation being used by the various players in these sectors: manufacturers, TV networks, event producers... This issue involves the making of new highly important decisions that affect budgets, training of technicians, approval of standards and protocols, changes in workflows and a greater knowledge of types of network architectures, cabling and connectors.

"DPG Media embarks on IP production with Sony technology", "LiveU presents its new LiveU Studio IP video production service", "FOR-A announces showcasing of solutions for IP – SDI hybrid environments", "Grass Valley IP/SDI/Hybrid routers orchestrate the production of the NBC Sports Winter Games": these are some examples of news and headlines that have been and are at the forefront of the changes now taking place in the audiovisual sector, national and international events and live shows. Is IP production a guarantee of success? That question has too many sides to allow a single, indisputable answer, but what we can say is that it is a reality today in a completely digital world. An IP (Internet Protocol) production involves a digital ecosystem for interoperability



of audiovisual equipment of diverse nature and use that, under the assignment of an address -comprising a series of numbers assigned to each device- are interconnected with each other under the coverage of a computer network involving different architectures, systems and protocols suitable for achieving quality results (broadcast quality, in the case of audiovisual).

Héctor Sierra, Key Account Manager at Sony, told us that as far back as 2012 Sony made the first presentation of an IP product; but I still remember the first promoters of IP technology in the broadcast world, such as when the company Eurocom successfully completed the implementation of a new Telecinco IP contribution network in 2009.

Let's take a closer look at one of the aspects that has to do with IP production: cabling. First, we need to decide what type of network cable we want to use for our IP production: twisted pair cable, coaxial cable, and/or fiber optic cable. These are the solutions existing nowadays.

In this sense, a twisted pair cable, as its name suggests, consists of two conductive wires, usually made of copper, which when twisted together effectively reduce the degree of signal interference. As the associated costs are relatively low, this type of cable is widely used in short and medium-length data networks. There is a wide variety of twisted pair cables depending on shield construction, maximum bandwidth (measured as a frequency in Hz), transmission speed, the signal-to-noise ratio and the use being made of them. Given the variety of twisted pair cables, reference is made to the category (abbreviation: CAT) in order to facilitate cable identification and certification and to make a better choice under the rules of the Electronic Industries Association (EIA):



Currently, all Ethernet cabling is 100 ohms. But it is always convenient to look to the past to see the changes that have taken place in such a short time. Thus, in 1973 Ethernet originated thanks to Bob Metcalfe at the Xerox Palo Alto Research Center under the use of thick copper coaxial cables. The first version, 10BASE5, featured an extremely stiff cable nearly half an inch in diameter, and later 10BASE2 was incorporated, which used a cable about half as thick and much more flexible. In the late 1980s the development of the Ethernet hub, and later on the switch, allowed twisted pair copper cables to become the primary means for supporting Ethernet. In 1989, Anixter, a distributor of cabling products, introduced its “Levels” program, the first written specification on performance for data cabling systems. It became the basis for the first category cable based on official standards, ratified in 1991 by the Telecommunications Industry Association (TIA) as category (CAT) 3. It’s time now to learn more about each type of cable according to its CAT (category):

• Pre-CAT5: The first cable categories are already totally obsolete; some of them are not even recognized by the TIA/EIA standards. For example, CAT1 cabling was used for telephone networks and doorbells, while CAT2, CAT3 (1991) and CAT4 (1992) cabling was used in networks with transmission speeds of 4Mb/s, 10Mb/s (16MHz frequency) and 16Mb/s (20MHz frequency) respectively, making it clear that their usefulness in today's world is practically nil.

• CAT5: (1995) (upper frequency 100MHz; transmission speed 100Mb/s at a maximum distance of 100 meters). This category is not currently recognized by TIA/EIA. Designed for applications under Ethernet 100BASE-TX and 1000BASE-T. Its most common cable construction is UTP.

• CAT5e: (2001) (upper frequency 100MHz; 1,000Mb/s transmission speed at a maximum distance of 100 meters, or up to 2.5 or 5 Gb/s in some cases). It is defined in TIA/EIA-568-B, but under the new IEEE 802.3bz-2016 or 2.5G/5GBASE-T standard it will allow speeds of 2.5 Gbps in this cable category,

and therefore 2.5 times faster than the current maximum of 1 Gbps. Designed for applications under Ethernet 100BASE-TX and 1000BASE-T. Its most common cable construction is UTP.

• CAT6: (2002) (upper frequency 250MHz; 1,000Mb/s transmission speed at a maximum distance of 100 meters, or up to 2.5 or 10 Gb/s, in some cases reaching a maximum length of 56 meters in favorable environments and with a limit of 37 meters when the environment is hostile). Also defined in TIA/EIA-568-B, it has become the new standard and is currently the most used. Average transfer time of 1 Terabyte: 3 hours. Designed for applications under Ethernet 10GBASE-T. Its most common cable construction is STP.

• CAT6a: (2009) (upper frequency 500MHz; transmission speed 10Gb/s at a maximum distance of 100 meters). Average transfer time of 1 Terabyte: 20 minutes. Designed for applications under Ethernet 10GBASE-T. Its most common cable construction is STP.

• CAT7: (2010) (upper frequency 600MHz;


transmission speed 10Gb/s). Under the international standard ISO-11801. Average transfer time of 1 Terabyte: 20 minutes. Designed for applications under Ethernet 10GBASE-T or POTS/CATV/1000BASE-T over a single cable. Its most common cable construction is STP, that is, insulation is mandatory.

• CAT7a: (2013) (upper frequency 1GHz; transmission speed 10Gb/s). It has even stricter shielding and cable protection specifications, which also come with a new connector called IEC 60603-7-7, one that is much more robust than the classic RJ45. Designed for applications under Ethernet 10GBASE-T or POTS/CATV/1000BASE-T over a single cable. Its most common cable construction is STP, that is, insulation is mandatory.

• CAT8: (2016) (upper frequency 2GHz; transmission speed 40Gb/s at distances of up to 100 meters, and we can even reach speeds of up to 100Gbps for very short distances). These are the most modern cables to date and handle the highest Ethernet bandwidths. By comparison, plain Gigabit Ethernet's 1 Gbps was already 10 times faster than Fast Ethernet, 100 times faster than traditional Ethernet, and 1,000 times faster than telephone modems. Average transfer time of 1 Terabyte: 5 minutes. Designed for applications under Ethernet 40GBASE-T or POTS/CATV/1000BASE-T over a single cable. Its most common cable construction is S/FTP, that is, insulation is mandatory.

As we can see, all categories (CAT) of twisted pair cable are designed for Ethernet applications under a designation that includes a number and a name, for example Ethernet 40GBASE-T (CAT8). It is important to understand that the number indicates the speed at which data is transmitted through the copper cable. So, a 40GBASE-T Ethernet network means reaching a data transfer rate of 40 Gb/s. And the name BASE-T refers to the type of cable that is used for said data transmission: a twisted copper cable. The "T" in BASE-T means "twisted".

So, what is Ethernet? It is a technology for connecting devices in a wired local area network (LAN) or wide area network (WAN), allowing them to communicate with each other through a protocol (a set of rules or common network language). Ethernet describes how network devices can format and transmit data so that other devices on the same local area network or campus segment will be able to recognize, receive, and process the information.
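For quick reference, the nominal figures listed above can be condensed into a small lookup table; the sketch below does exactly that in Python and picks the lowest category that meets a target speed. It is a summary of the article's own numbers, not a substitute for checking the standards for a specific installation.

# Nominal ratings per category, as summarised in the text above:
# (year, upper frequency in MHz, nominal transmission speed in Gb/s).
CATEGORIES = {
    "CAT5":  (1995,  100,  0.1),
    "CAT5e": (2001,  100,  1.0),
    "CAT6":  (2002,  250,  1.0),
    "CAT6a": (2009,  500, 10.0),
    "CAT7":  (2010,  600, 10.0),
    "CAT7a": (2013, 1000, 10.0),
    "CAT8":  (2016, 2000, 40.0),
}

def minimum_category(required_gbps):
    # Return the lowest category whose nominal speed meets the requirement.
    for name, (_, _, speed) in CATEGORIES.items():
        if speed >= required_gbps:
            return name
    raise ValueError("No category in this table reaches that speed")

print(minimum_category(10))   # -> CAT6a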


The first Ethernet standard was approved by the IEEE 802.3 working group in 1983. Since then, standards have emerged that set the requirements for data communication systems that use Ethernet technology, evolving and adopting new media, higher transmission speeds and changes in network structure:

• IEEE 802.3ac was introduced to accommodate VLAN and priority tagging.

• IEEE 802.3af defines Power over Ethernet (PoE), which is crucial for most Wi-Fi and Internet Protocol (IP) telephony deployments.

• IEEE 802.3bt, or PoE++, uses four pairs to carry electrical power and data in a single cable with greater throughput, which means that devices requiring more energy can be powered.

• IEEE 802.11a, b, g, n, ac, and ax define the Ethernet equivalent for WLANs.

• IEEE 802.3u introduced 100BASE-T - also known as Fast Ethernet - with data rates up to 100 Mbps.

• IEEE 802.3ab and 802.3z are updates that define Gigabit Ethernet technology, known as GigE or GE.

• IEEE 802.3ae (2002), also known as 10 Gigabit Ethernet (XGbE or 10GbE), was at the time of its approval the fastest of the Ethernet standards: ten times faster than Gigabit Ethernet.

IEEE is the Institute of Electrical and Electronics Engineers, the world's largest technical professional organization, bringing together more than 420,000 engineers, scientists, technologists, and practitioners in more than 160 countries, who are devoted to advancing technological innovation and excellence for the benefit of humankind. This institution was born on January 1, 1963 through the merger of two previously existing organizations: the American Institute of Electrical Engineers (AIEE), founded in 1884 by Thomas A. Edison and Alexander G. Bell, among others, and the Institute of Radio Engineers (IRE), founded in 1912.


An Ethernet network cable has a total of four pairs of wires twisted two by two, meaning that we will have a total of 8 wires in each network cable. The telecommunications standards by TIA (Telecommunications Industry Association) and EIA (Electronic Industries Alliance) on twisted pair network cabling are T568A and T568B. Both set out the arrangement of connector pins in UTP or STP network cables. In reference to connectors, from CAT5e to CAT6a cable, RJ-45 connectors (8P8C connectors) are used; and CAT7 to CAT8 use GG45 (GigaGate45, created by Nexans and standardized by the International Electrotechnical Commission as IEC 61076-3-110; it can operate in the frequency spectrum between 600 MHz and 5 GHz, with twisted pair cabling). There is also the TERA connector. It is a type of shielded twisted pair connector for use with Category 7 twisted pair data cables, developed by The Siemon Company and standardized in 2003 by the International Electrotechnical


Commission (IEC) under reference IEC 61076-3-104. The 2006 revision of the standard extended performance up to 1,000 MHz. The connector is a different size than the RJ-45 connector.

Let's go over the differences in the construction of the twisted pair cable concerning shielding:

• UTP cable: stands for Unshielded Twisted Pair. This cable has no shielding; it is the cheapest and most flexible of all and offers high performance.

• FTP cable: stands for Foiled Twisted Pair, also known as global-shielding twisted pair. That is, the pairs are not individually shielded, but have a global protection that includes all of them, inside the plastic cover, in order to improve the level of protection against external interference.

• STP cable: stands for Shielded Twisted Pair. Each of the twisted pairs in the cable is covered by a protective coating made of a thin sheet of aluminum foil. This shielding helps protect

the pairs against potential interference and also against electrical noise. The drawback: it is a very rigid cable.

• S/FTP cable: a combination of FTP and STP. It features dual shielding: each pair has its protective layer and, further, an additional layer covers the entire assembly.

Another possibility for IP production is to do it over coaxial cable. This is a type of network cable with an internal conductor surrounded by a tubular insulating layer and in turn by a tubular conductive shield, in addition to an insulating outer coating. Coaxial cable is used as a radio frequency (RF) signal transmission line, in feed lines that connect radio transmitters and receivers with their antennas, computer network connections, digital audio, and cable television signal distribution.



Several types can be found depending on the central copper core:

• RG-58/U: solid copper core.

• RG-58 A/U: twisted-wire core.

• RG-59: the basic coaxial cable. It is thinner and has less shielding. It is best suited for television broadcasts and short sections.

• RG-6: it has a thicker caliber and better insulation and shielding. It is used for digital video signals and satellite TV.

• RG-62: ARCnet networks.

Finally, we can go for fiber optic cable. This type of cable is an excellent transmission medium due to its large data transmission capacity, with transmission speeds from 10 Mbps to 100 Gbps or higher and support for long distances.

A fiber optic cable has a fiber/glass core (single-mode -SMF- or multimode -MMF-) inside an outer rubber coating, and light beams, rather than electrical signals, are used to transmit data. The transmission capacity of fiber optic cable is 26,000 times greater than that of twisted pair cable. MMF fiber is usually short range (up to 550 meters over a 10G network) and the cable grades are OM1, OM2, OM3, OM4 and OM5. SMF is long-range and the most common cable is the OS2 cable. Fiber optics, together with the IEEE 802.3cd standard, known as 50GBASE-T, will allow data connections with speeds of up to 50 Gbps through a single Ethernet cable. This is essential to update networks in a practical, economical way while achieving high performance.

In IP production we cannot forget one of the most important issues: communication protocols. On this occasion, I recommend the contribution of Yeray Alfageme, a colleague in the field of publications on TM Broadcast, under

the title Contribution and Distribution Protocols in Broadcast Environments (No. 166) to approach the most outstanding protocols, such as ST2110, SRT, DASH, HLS, RTMP, NDI, RIST, ZIXI... This same author points out that IP is highly flexible and asynchronous, allows multiple formats through the same network, is bidirectional, and lets data of all kinds coexist with audio and video signals, as long as all devices can understand each other. These last few words were not necessary in the SDI world, but they are the main headache in IP environments: interoperability once more.

As we have seen in this article, CAT twisted pair cabling is accompanied by other types of cables that compete for better performance in speed, maximum bandwidth and distance. However, CAT cabling has set its own course, as each version is more robust, better insulated, able to carry more data and faster than the previous one. In addition, it comes at prices perfectly adjusted to each need, which allows us to conclude that it is full-fledged connectivity.



TEST ZONE| SONY HXC-FZ90


Sony HXC-FZ90: Investment and growth to measure

When decision-making is influenced by external factors that are beyond our control, having adaptable tools allowing us to dose investment and growth as our needs change is a sure asset.

By Luis Pavía

These are difficult times to make certain decisions. The paths are visible, but not fully defined. With significant investments involved for increasingly tight revenues, decision-making in audiovisual equipment becomes an especially delicate issue, not only in regard to the equipment as such, but also the choice of the right moment to take each step. Let’s recall some examples, such as 3D viewing technology, for which in recent years significant development

and promotion efforts were made that soon resulted in nearly marginal use, or use for very specific applications, simply because these efforts did not reach end users in the expected way. With this example in mind, it is easy to understand how complex decision-making towards new fields is, and why, when the time to adjust budgets comes, decisions tend towards safer values, as is the case at hand. We usually focus our labs on the technical side of the products being analyzed, and it will be no different this time, as there are interesting developments here. But in this case we will have to include a commercial factor that we believe is also a great innovation, and we are convinced that it will have a significant weight in decision-making when considering this choice of equipment. So much so, that it is the first thing we will be dealing with.


What is it about? Very simple: what we bring to our pages today is a portable studio camera, genuinely 4K, but which will initially work in HD... at the price of an HD camera! Why so? Because this is the section of the road that is clearly marked: at present, a huge amount of content is still produced and broadcast in HD (high definition, 1920 x 1080) quality and SDR-709 (standard dynamic range) space. Although, undoubtedly, content of the highest quality and budget allocation has long since been created in 4K (actually UHD 3840 x 2160 broadcasts) and HDR-BT.2020 (high dynamic range) space.

So what is the sense in this? The camera is designed to deal successfully with a very frequent situation. Because investing now in a range much higher than necessary can mean an amount beyond budget or compromise the profitability of production. But investing in a piece of equipment that will become outdated in a yet unknown period -at best in the short or medium term- will also end up posing a budgetary problem. Alternative: purchase lower-cost equipment, with shorter renewal and amortization periods, but which is likely to have certain quality or functionality limitations.

What solution does it bring? It enables the possibility of having a camera that is compatible with all current equipment and facilities, for which you will pay only for what is really needed and used, but a piece of equipment capable of unleashing its full potential when required without the need to change absolutely anything. And on top of this, by gradual steps.

Let's explain this. Because this is where this new form of marketing comes into play. It's not about "we invest now in this camera and in two years we will change the sensor". It's much simpler, much more efficient, much more flexible, and much more cost-effective than that. The notion is: "I invest in purchasing a license, either temporary or permanent, that releases all the potential only when I need it and I am going to make it worth the while".

Sony HXC-FZ90 is a portable studio camera, genuinely 4K, but which will initially work in HD... at the price of an HD camera!



The camera features from the outset all the elements needed to offer the highest performance, but we have the possibility of purchasing, and only paying for, the HD functionality alone. There is no need to change anything when, in the future, we may want to unleash the full potential of 4K. And we have multiple advantages from the start, even when working in 'simple' HD, such as a genuine 4K sensor that will provide significantly higher quality even in 'simple' HD.

It can be seen in different ways. As the equivalent of having a good piece of equipment for the day-to-day, but having the higher-end equipment when a specific production requires it. Or as the upgrading of my equipment to that higher range that I already need permanently, without the need for any renovation whatsoever.

A shoulder camera, defined as portable and upgradable, with a 2/3inch 4K CMOS sensor, B4 bayonet, built-in fiber optic output, studio remote control unit, etc.

So what modes of operation are available? Very simple, without any intermediate steps or settings, the camera only offers two modes: HD and 4K, with all the features at the highest level. We will soon go into detail about that, but let’s first finish with this feature. Because we are interested in knowing how it is done, and it is also very simple. Just by uploading a license to the camera’s software. In addition, to offer even greater versatility there are weekly and monthly licenses, in addition to the permanent one. In this way, if we face the need only in specific projects that are spaced out in time, we will be able to adjust the investment and expenditure


as much as possible. And if we need to make the definitive switch, the camera is released for life by means of a single action without the need for new interventions or for recurring or conditional expenses in the future. This seems a very flexible model to us, as it provides the benefits of a moderate initial investment, which will remain up to date over time, and one which allows us to jump to the upper level temporarily to solve specific situations; or make the definitive switch without the need for a large investment when finally deciding to leave HD behind.
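A trivial way to reason about that choice is to compare the cumulative cost of temporary licenses against the permanent one over the period in which we expect to need 4K. The sketch below does so with invented placeholder prices -the actual license fees are not quoted in this article- so only the decision logic matters, not the figures.

# Hypothetical break-even check for temporary vs. permanent 4K licenses.
# All prices are invented placeholders; replace them with real quotes.

def cheapest_option(weeks_of_4k_per_year, price_week, price_month, price_permanent, horizon_years=3):
    weekly_total = weeks_of_4k_per_year * price_week * horizon_years
    monthly_total = (weeks_of_4k_per_year / 4) * price_month * horizon_years  # ~4 weeks per month
    options = {"weekly": weekly_total, "monthly": monthly_total, "permanent": price_permanent}
    return min(options, key=options.get), options

best, detail = cheapest_option(weeks_of_4k_per_year=6, price_week=300, price_month=900, price_permanent=5000)
print(best, detail)   # with these placeholder figures, the monthly license wins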


Going now into the purely technical part, the equipment is globally designed to be integrated into the usual studios and environments for content creation or live broadcasting. A shoulder camera, defined as portable and upgradable, with a 2/3inch 4K CMOS sensor, B4 bayonet, built-in fiber optic output, studio remote control unit, etc. All its features are aimed at facilitating an integration into existing environments, or even creating new ones according to widely implemented and well-established, reliable methods and workflows.

Handling is very flexible and suitable for all types of live broadcasts: TV studios, events of all kinds, sports, corporate videos, training, etc. The magnesium alloy body is light and robust. And the 4K sensor provides a resolution of over 2,000 TV lines. It must be highlighted that while in HD format, the entire sensor is always used, scaling the image and never a crop, which provides a significant advantage in final quality. Regardless of whether we have the 4K license enabled or not, the HDR workflow is always available, as well as HLG (Hybrid Log Gamma)



modes, S-Log3 logarithmic curves, and BT.2020 colour space. In this way, even without the license extension, we will always be able to provide the best results at a chromatic level.

Likewise, native capture and output at frame rates up to 60.00p makes this camera suitable for the most demanding streaming platforms.

In terms of light responsiveness, beyond the high sensitivity of the sensor, we have all the innovations offered by Sony with its best equipment. One of them is TLCS (Total Level Control System). This system features the ability to simultaneously manage, in an integrated way, all the possible automatisms for controlling the amount of light that reaches the sensor, this enabling, in situations where both levels and contrast change over time, the capture of an image that reflects these changes, but without falling out of range at any time.

Another feature that allows to significantly improve results is ARIA (Automatic

Restoration of Illumination Attenuation). This functionality automatically compensates for vignetting, the typical loss of light around the corners caused by some optics with a large zoom range. Thanks to this capability, the camera automatically corrects this effect, even while zooming in or out. Correcting this effect without this system is possible, but it would take a lot of work and care.

As for the light coming in, the ND filter is a conventional turret type, with 4 independent filter positions for 0 (clear), 1/4, 1/16 and 1/64 dimming, equivalent respectively to 2, 4 and 6 diaphragm stops.

When dealing with image sharpness, a key aspect that has a direct impact on the quality perceived by the viewer, we also have interesting functions. And here comes another batch of interesting new features that make a significant difference from what we had known so far. Those related to focus assistants begin with the well-known one of highlighting the contours of the in-focus elements with a customizable coloured line. Not only in the


When dealing with image sharpness, a key aspect with a direct impact on the quality perceived by the viewer, we also have interesting functions, and here comes another batch of new features that make a significant difference from what we had known so far. Those related to focus assistance begin with the well-known option of highlighting the contours of in-focus elements with a customizable coloured line, not only in the integrated viewfinder but also in the optional 7” external viewfinder.

But we found the combination of the focus position meter and the focus indicator assistant particularly interesting. These draw a line at the top of the viewfinder along which a moving visual mark gives us a graphical reference of the distance between the focal plane and the camera. Best of all, markers can be fixed on this line, allowing us to return to preset positions easily, efficiently and accurately. It is a similar concept to the markers in focus remote control systems, but with visual markings superimposed on the screen. This feature becomes all the more interesting the more complex the focus tracking required by the situation.

And, as if that weren't enough, we also have a dynamic contrast function, called UMBRA, which makes it possible to solve one of the most complex situations: focusing in dark conditions. In a similar way to the traditional focus assistant, this system increases the contrast of the outline of focused objects, only in the viewfinder, even if they are in very dark areas of the shot. Thus, it is possible to maintain and follow focus in situations in which it was traditionally only possible to rely on our own experience and intuition.

The last item relating to sharpness aids is a feature that, ironically, is favored by the use of the HD format: the digital image extender. It allows magnifications of 2x while maintaining full resolution, and even up to 4x without noticeable loss of quality.
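The reason HD benefits is simple arithmetic: assuming the sensor delivers a UHD-sized 3840 x 2160 image (an assumption on our part, as the exact photosite count is not quoted here), a 2x magnification is just a centre crop that is still exactly 1920 x 1080, so no upscaling is needed for an HD output. The quick sketch below, our own illustration, shows the numbers.

    # Assumed UHD (3840x2160) sensor readout; a 2x "extender" is a centre crop of half the
    # width and height, which still matches full HD, while 4x needs a further 2x upscale.
    SENSOR_W, SENSOR_H = 3840, 2160   # assumed 4K/UHD readout
    HD_W, HD_H = 1920, 1080

    for magnification in (2, 4):
        crop_w, crop_h = SENSOR_W // magnification, SENSOR_H // magnification
        upscale = HD_W / crop_w       # enlargement needed to fill the HD frame
        print(f"{magnification}x extender: crop {crop_w}x{crop_h}, upscale factor {upscale:g}")
    # 2x -> 1920x1080 crop, upscale 1 (native HD); 4x -> 960x540, upscale 2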

Going now a little further -literally- we get to the other end of the fiber connector, where we find the HXCU-FZ90 CCU (camera control unit). Although this item of equipment is not essential for operating the camera -since it has its own connectivity, such as SDI-12G, and various power modes, among others, which we will mention later on- it does add interesting features. By means of a Neutrik fiber link, the 300 m mark is reached with power supplied from the CCU itself. And with single-mode fiber, distances of up to 10 km can be reached for data transmission. Yes, ten kilometers, it's not a mistake.

However, to cover such large distances with power, the HXCE-FZ90 power extension source must be used. In this way, between the CCU and the extension source we can cover that distance of up to 10 km with single-mode fiber, with a further 300 meters of Neutrik fiber between the external power supply and the camera.
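As a quick way to visualise those limits, the small helper below -our own illustration, using only the distances quoted above- decides how a given camera position could be cabled and powered.

    # Illustrative check of the cabling options described above (our own sketch):
    # - up to 300 m: Neutrik fiber with power straight from the CCU
    # - beyond that, up to 10 km of single-mode fiber from the CCU, with the HXCE-FZ90
    #   power extension placed within a further 300 m of the camera
    def cabling_plan(distance_m: float) -> str:
        if distance_m <= 300:
            return "Neutrik fiber, powered directly from the CCU"
        if distance_m <= 10_000 + 300:
            return "Single-mode fiber from the CCU plus HXCE-FZ90 extension near the camera"
        return "Out of the supported range"

    print(cabling_plan(250))     # powered from the CCU
    print(cabling_plan(5_000))   # needs the power extension source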


As expected, the CCU has the usual elements to deal with all the parameters related to image processing, such as brightness and colorimetry, thus making it easier for the operator to concentrate on framing and focusing. These are tasks that can logically be supplemented by an RCP, such as the RCP-3500, through a simple Ethernet connection to the CCU.

Highlighting now the new features linked to operation with the CCU, it is possible to connect a prompter control computer directly to the CCU, whose data are transferred to the camera through the fiber itself. So, if we want to use this accessory -something that is already very common in most news sets, for example- it will be hooked up directly to the camera, without the need for additional elements.


Likewise, by means of the Network Trunk function, the camera can be used as an intermediate bridge to connect a second, PTZ-type camera through it, such as the equally new ILME-FR7. In this way, simply connecting the PTZ camera output to the SDI input of the FZ90 makes that signal available in HD quality from the CCU itself. We can also connect an RM-IP500 type controller to the CCU in order to have full remote control of the PTZ camera through the same fiber as the FZ90. Taking into account the distances that can be bridged between both pieces of equipment, having a camera that relays a second camera and its own PTZ positioning control provides increased versatility, which will be especially appreciated in setups involving mobile units, thanks to the simplified installation of these additional cameras, which can also be operated completely remotely.

Unlike the ENG environment, in which we often work individually, in equipment designed for studios it is also important to have elements that facilitate operation within a team. In these environments with distributed responsibilities, direct intercommunication is key, and for this purpose a built-in intercom is available. To make this task even easier, we not only have the traditional 5-pin XLR connector, but it is also possible to use the same headphones we use with our mobile phones, via a 3.5 mm TRRS mini-jack connector.

Also in the same vein of facilitating integration in various environments and for different uses, we have three possible viewfinders when it comes to determining our configuration. For situations in which the camera is on a tripod or column -also typical when large and heavy optics are used- we have the 7.4” HDVF-EL75 OLED viewfinder, which has a support featuring multi-axis adjustment and the controls usually found in this type of viewfinder. In addition, the classic HDVF-EL20 eye viewfinder is available, another OLED-type element with 0.7” HD colour resolution. But the one we had the opportunity to try -and found extremely interesting because of its versatility- is the HDVF-EL30 viewfinder, which is very similar to the previous one but adds a 3.5” folding, adjustable screen to the eye viewfinder. This model will surely be the most suitable when the camera is operated on a lightweight tripod or Steadicam mount, thus offering maximum versatility and lightness for viewing. In addition to controls for all the usual image parameters such as peaking, brightness, contrast, zebra, tally, etc., this viewfinder has dedicated buttons to operate the menu, as well as a couple of customizable buttons.

The camera has direct 4K output up to 60.00p through SDI-12G, facilitating independent operation and even recording by means of third-party equipment. We have complete connectivity in the camera body itself: genlock, synchronisms, audio, in addition to what has already been mentioned, and even a power outlet available for other accessories. Camera power supply can come through the fiber connector or the standard 4-pin XLR jack.


Those who are used to handling this type of shoulder camera with its fibre umbilical -a situation that facilitates particularly dynamic shots- will find a piece of equipment that will sound extremely familiar in terms of the location of controls, but more lightweight and with improved image quality compared to other similar models.

At the other end of the umbilical, through the CCU, connectivity possibilities can be expanded, in addition to having the necessary controls to manage all the functions of the in-camera menu and image capture, such as shutter, iris, white balance, pedestal, level settings, chromatic response and tint. Of course, the intercom also has its dedicated section in the CCU. This way of working is especially appropriate whenever production is more elaborate and we have a human team large enough to split the tasks.

To end with the CCU, simply mention that, in terms of frame rates, we have 2160/50p, 59.94p and 60.00p through SDI-12G, 1080/50p, 59.94p and 60.00p through SDI-3G, and 1080/23.98 PsF, 24 PsF and 29.97 PsF are supported as well.

From a commercial point of view, it is possible to acquire the camera body as a stand-alone element, as well as a kit that includes optics and viewfinder, in addition to the options for weekly, monthly or permanent licenses -a distinctive point that we are sure will soon become widespread among other manufacturers. As we can see, the innovation that so often comes from the technical side still has room for application in other areas and, in this case, it is specially designed to facilitate decision-making in such a changing and demanding environment as the audiovisual one. Since its two possible configurations -HD and 4K- are stable situations with sufficient development ahead of them, this piece of equipment will make it easier to find the ideal time to take that step forward, even without having to do it permanently, making our investment even more flexible, optimizing profitability and turning this into a safe investment. Because trust and security are always values that need to be taken care of.

In short, it can be said without a doubt that this equipment is flexible, versatile and capable enough to adapt to productions and budgets of very different levels. It offers flexibility in configuration and handling in a wide range of live applications, as well as in operations and situations in which, regardless of resources and the human team, it still offers possibilities to obtain the optimal result in each case.
