News
News from the market .......... 6

Case studies
WASP3D Implements AR Workflow for Spring News .......... 22
Another Monumental Project in Giza .......... 24
Mining the archives for First Man .......... 26

Technology
IP Live Production .......... 28
The Challenges of IP-Based Production, by Open Broadcast Systems .......... 36
IP Means Interconnected Production, by NewTek .......... 40
Forging ahead with All-IP Production, by Harmonic .......... 44
The evolution of IP-based broadcast workflows, by Primestream .......... 48
Phabrix and IP-based productions .......... 52
IP-based Production special report, by Nevion .......... 54
Mobile Journalism .......... 58

Test Zone
Canon XF705 .......... 66
Editor in chief Javier de Martín
Creative Direction Mercedes González
Key account manager Beatriz Calvo email@example.com
Translation Fernando Alvárez
Editorial staff Daniel Esparza firstname.lastname@example.org
Administration Laura de Diego email@example.com
TM Broadcast International #65 January 2019
TM Broadcast International is a magazine published by Daró Media Group SL Centro Empresarial Tartessos Calle Pollensa 2, oficina 14 28290 Las Rozas (Madrid), Spain Phone +34 91 640 46 43 Published in Spain
EDITORIAL

We kick off 2019 by focusing attention on one of the issues that raised the most interest in the market during 2018: IP live production. Throughout last year we explored this in many of the TV stations and audiovisual groups that went for it, and they told us their experiences. We recommend that our readers go over previous issues to get a first-hand view of contrasting case studies as well as analyses from a wide range of senior technical officers.

We believe that this first month of the year is the time to carry out a general review of the technological status of this production method. Our contributor Yeray Alfageme, Business Technology Manager at Olympic Channel, undertakes the task of unveiling all the keys, also providing, for our readers' benefit, his insights on this technology. Furthermore, a number of manufacturers asked to be included in this discussion. That is why we included in these pages the opinions of several senior officers from leading companies in IP solutions, with the aim of providing a comprehensive view of this topic.

Another field that will grow in importance during 2019 is so-called Mobile Journalism, or MOJO. Communications media are witnessing a convergence scenario in which journalists must deliver content to various platforms. This context requires solutions able to boost multi-platform creativity and digital innovation, and smartphones are certainly one of these tools. Their benefits and limitations are explored in this issue.

Lastly, the whole TM BROADCAST team would not like to let this occasion pass without wishing you a thriving 2019. We will keep on doing our best to be your medium of reference in the industry.
4 JANUARY '19
NEWS - PRODUCTS
AJA KUMO 1616-12G router is now shipping
KUMO 1616-12G is currently available through AJA’s worldwide reseller network for $2495 US MSRP.
AJA Video Systems has announced that KUMO 1616-12G is now shipping. Featuring 16x 12G-SDI inputs and 16x 12G-SDI outputs, the compact SDI router supports high-frame-rate and deep-color workflows over SDI at up to 4K and 8K resolutions. Designed for broadcast, production and ProAV environments where size and capacity are crucial, KUMO 1616-12G mirrors the compact physical form of AJA's production-proven KUMO 1616 router and offers network-based and/or physical control. Multi-port gang-routing supports up to 8K resolutions, and a USB port makes it easy to configure the router IP address via AJA's free eMini-Setup software. Key feature highlights include:

- 16x 12G-SDI inputs and 16x outputs for up to 4K/UltraHD support at 60p over a single cable
- Small, compact 1RU form factor
- Multi-port gang control supporting up to 8K resolutions
- Redundant power supply option
- Eight salvos per router
- Embedded web server for remote control from any standard web browser
- USB port to easily configure the router IP address and simplify initial network configuration
- Auto re-clocking SDI rates: 270 Mbps / 1.483 / 1.485 / 2.967 / 2.970 / 5.934 / 5.940 / 11.868 / 11.880 Gbps
- Support for AJA KUMO Control Panels (hardware, direct-connect or networked)
- 5-year warranty and support
Magewell ships 4K HDMI to NDI encoder and unveils new HD model

Magewell has begun shipping a pair of feature-rich models in its new Pro Convert family of standalone IP encoders: the recently introduced Pro Convert HDMI 4K Plus, and a previously unannounced 1080p HD counterpart. Enabling users to bring traditional HDMI video signals into live, IP-based production and AV infrastructures using NewTek's NDI® technology, the two converters will be showcased alongside other Magewell innovations at stand 8G430 at the upcoming ISE 2019 exhibition in Amsterdam.

The latest expansion of the Pro Convert family offers users and systems integrators a flexible choice of input connectivity and encoding resolution. The Pro Convert HDMI 4K Plus transforms sources up to 4K Ultra HD at full 60 frames per second via an HDMI 2.0 input interface, while the newly unveiled Pro Convert HDMI Plus encodes HDMI source signals into full-bandwidth NDI streams up to 1080p60 HD. The Pro Convert HDMI Plus can also accept a 4Kp60 HDMI input signal, downconverting it automatically to HD for encoding. A third model, the Pro Convert SDI 4K Plus, converts 6G-SDI signals up to 4K at 30 fps into NDI streams.

All Pro Convert models feature extremely low latency and are truly plug-and-play, with automatic input format detection and DHCP-based network configuration eliminating the need for manual setup. For users wanting greater control of the conversion process and the powerful features of the Pro Convert devices, an intuitive browser-based interface provides access to status monitoring, advanced settings and FPGA-based video processing, including up/down/cross-conversion, de-interlacing and image adjustments.
NEWS - SUCCESS STORIES
Alpha TV migrates to HD with Imagine Communications

Imagine Communications has implemented a new multichannel playout centre for Alpha TV in Athens. The new installation gives Alpha TV the ability to meet its viewers' demand for HD on all channels and builds a platform for future growth plans. It also ensures business continuity by providing full redundancy, eliminating the risk of any channel going off air. The new system, designed and installed in association with local systems integrator AmyDV, is space-saving and energy-efficient, making a real difference to the broadcaster today. It also provides room for expansion beyond the current range of channels broadcast to Greece and to the international market. The platform is designed to take advantage of next-generation technology as that becomes commercially advantageous to Alpha TV.

"As one of the most popular channels in Greece, and also available to our expat community around the world, we have to respond to our audiences' demands for the highest quality and for new services," said Antonis Stantzos, CTO of Alpha TV. "We also wanted to improve our operations, for instance through advanced on-air graphics. And of course, we cannot have any black on air, so playout has to be completely reliable."
“Our first aim was to achieve the move to HD with a totally reliable architecture,” Stantzos continued. “We also felt that the adoption of software-defined elements would prepare us well for an elegant and costeffective transition to new technology in the future. AmyDV proposed an integrated solution from Imagine Communications, and I can say with certainty that the choice we made has left us completely satisfied.”
Beeline has selected Broadpeak to power its TV service
Broadpeak, a provider of content delivery network (CDN) and video streaming solutions for content providers and pay-TV operators worldwide, has announced that Beeline, one of the largest telecom operators in Russia, has selected Broadpeak to power its Beeline TV service. Using high-capacity CDN solutions from Broadpeak, Beeline can deliver high-quality live, VOD, catch-up and start-over TV content to a wide range of subscribers through their set-top boxes, smart TVs, mobile devices and the internet. Broadpeak's video delivery solutions are completely software-based and highly scalable, making it easy for Beeline to accommodate traffic growth and deliver a high quality of experience (QoE) to subscribers. "For huge networks like ours in Russia, delivering TV services to customers at scale is a serious challenge," said Oleg
Karadzhi, CTO of Media Business at Beeline Russia. “We need a scalable and elastic video delivery solution with a low total cost of ownership, and that’s why we have chosen Broadpeak as our solutions provider. Our decision was based on Broadpeak’s extensive video streaming deployment experience and software-based approach to video distribution, all of which help to significantly reduce delivery costs
while ensuring a superior QoE for Beeline TV subscribers.” Beeline is relying on Broadpeak’s BkM100 video delivery mediator and BkS400 cache servers, deployed in various points of presence (PoPs) to ensure high-quality video streams for all subscribers across Russia on national and local channels. With Broadpeak’s BkA100 video delivery analytics, Beeline
can gather data about video delivery, monitor quality, generate statistics about content popularity, and provide feedback on what is going on at the servers, network, and player levels. “Beeline is delivering a world-class QoE for its TV service, and we are proud to work with them on this successful deployment,” said Jacques Le Mancq, CEO, Broadpeak. “Thanks
to the high flexibility and scalability of Broadpeak's software solutions, combined with strong collaboration between technology provider and operator, Beeline is poised to support unlimited traffic growth and deliver exceptional television experiences without making a significant infrastructure investment."
BNJ FM selects StudioTalk to power its visual radio platform
Broadcasting three radio programmes for the counties of Neuchâtel, the Jura and the Bernese Jura in Switzerland, BNJ FM focuses on proximity news and on the concerns and interests of the population of the Swiss Jura Arc. The general-interest, pop-rock music radio station can be received via FM, DAB+, the web and mobile applications.

While BNJ already has a portal where its viewers can access video content (BNJ.tv), it is part of BNJ FM's digital transition strategy to give its audience the opportunity to watch the programmes live and even consult the videos afterwards.

StudioTalk is BCE's all-in-one solution to produce shows, manage content, add special effects, control sets and broadcast programmes. Thanks to an intuitive graphical user interface, a touchscreen and a dedicated remote, the customer can control the cameras, the feeds and the studio sets. It also embeds a content management system and on-the-fly transcoding for immediate delivery to viewers.

StudioTalk will enable the radio station to get a complete video production infrastructure within its existing studios, ensure the live production of the daily shows, and distribute the streams to the radio's website as well as its social networks.

"Watching radio is the next move for the media market. StudioTalk provides BNJ FM with solutions which match their radio know-how while delivering broadcast-quality programmes," explains Olivier Waty, Technology & Project Director at BCE.
Quicklink TX empowers Ravensbourne University London and the Royal Shakespeare Company

In 2018, as part of the Schools' Broadcast Series, Ravensbourne University London used the Quicklink TX to facilitate live studio question-and-answer sessions between school pupils and actors of the Royal Shakespeare Company (RSC). The RSC is one of the highest-profile theatre companies in the UK. The Quicklink Skype TX, designed in partnership with the Microsoft Skype team, is a video call management system: a transceiver that enables professional reception and transmission of multiple Skype video calls through SDI inputs/outputs and an HDMI interface. As the Quicklink Skype TX can receive from and send to any Skype user, it provides unrivalled global access to millions of Skype users to broadcast as HD outputs. As part of the performances, students took part in interactive activities as well as a Macbeth production which was broadcast to an audience of over 220,000 students.
E-sports Arena Las Vegas sounds off with Lawo
E-sports – the world of competitive video gaming – has become an international phenomenon, with millions of fans watching both in person and on Internet livestreams, rooting for players that might take home millions of dollars in winnings. Where better to build a premium Esports venue than in the gaming capital of the world – Las Vegas, Nevada? So that's what Allied Esports International set out to do, with a new
venue in the Luxor Hotel on the Las Vegas Strip. “Our mission was to build a turnkey infrastructure to do many different live events, as well as produce content in a style and quality that meets or exceeds current broadcast standards,” says Drew Ohlmeyer, Head of Content, Allied Esports International and Esports Arena Las Vegas. To repurpose a former nightclub space, Allied hired CBT Systems to design and build an A/V system tailored for high-
availability streaming video broadcasts. Darrel Wenhardt, President & CEO of CBT Systems, knew from the start that the space’s previous audio system was insufficient for the demands of Esports. CBT chose a Lawo mc²36 console for front-of-house duties, and a second mc²36 to power the streaming broadcast production control room. “We’ve worked extensively with Lawo, who have developed
consoles specifically for these types of challenges," explains Wenhardt. "[They] are used around the world to manage major global events including sports, live performance and networked production infrastructures. Lawo was an easy and obvious choice and provided exactly what was needed, while also being able to offer custom workflows for the unique aspects of Esports events." Working with CBT Systems, Allied Esports International began drafting plans in mid-2017 for a combined live performance / broadcast venue with audience seating, multiple cameras, concert-sized video screens, and a premium FOH system. During events, the action would be packaged into broadcast-quality programming streamed in real time to subscribers worldwide.

To attain these goals, copper network services were replaced with fiber for maximum bandwidth and connectivity, and an entirely new audio system was specified and installed: twin 24-fader Lawo mc²36 consoles, one for FOH sound, the other to mix finished audio for livestreams. The consoles are networked via RAVENNA AoIP using a Lawo Nova 37 router; I/O includes four compact Lawo stageboxes, each with 32 Mic/Line I/O, 8 AES I/O, 8 GPIO and 1 MADI port. Lawo A__mic8 units provide AES67 AoIP connections for multiple deployed mics. Facility design and rebuild were accomplished in only five months, with Lawo personnel on site to support CBT Systems and Esports for commissioning and training.

"This was an exhilarating project," says Michael Mueller, VP of Sales for Lawo in the USA. "We were very excited to work with CBT Systems and Esports on this project. Lawo has been involved in traditional live sports broadcasts and entertainment production since our inception, and we've also provided networked audio production infrastructures for gaming and live streaming customers. So the prospect of working in the growing field of esports is a great opportunity to marry live sound with broadcast/streaming technologies, using a unifying network platform."
News agency Ruptly relies on Riedel to equip new OB and DSNG vehicles
Riedel's MediorNet real-time media network and Artist digital matrix intercom are providing the decentralized and redundant signal routing and communications backbone on board two all-new, state-of-the-art OB vehicles for Ruptly, a Berlin-based international news agency. Qvest Media, a world-leading system architect and integrator
for the broadcast and media industries, designed and built the new broadcast van and DSNG vehicle. Ruptly’s new production vehicles feature an innovative and lightweight body design, developed by Qvest in close cooperation with partner Carrosserie Akkermans. Despite their compact dimensions, the
two vans offer state-of-the-art equipment optimized for high-quality 4K and UHD productions, including live coverage of news, events, and sports, as well as cinematic-style documentaries. "The decentralized routing approach of Riedel's MediorNet makes it ideal for the rigors of
live broadcasting, and it delivers great cost savings for 4K and UHD productions. MediorNet not only reduces single points of failure, but also creates powerful operational efficiencies by allowing us to place physical I/Os closer to where they're needed. Then, integrated processing capabilities including embedding/de-embedding and up/down conversion reduce the need for single-purpose peripheral devices. These features result in significant weight savings that enable Qvest Media to exceed our expectations without exceeding the weight limit of 3.5 tons," said Ahmet Cakan, Chief Technology Officer, Ruptly. "Our two new vans had their baptism by fire during a high-profile international football tournament in Russia, and the performance of MediorNet was more than satisfactory. We know we can count on MediorNet to deliver the perfect blend of high-quality output and reliability for
even our most demanding live productions.” Ruptly’s MediorNet network consists of five interconnected MicroN media distribution devices with four of them handling signal distribution and processing and one providing virtual multiviewer capabilities. With decentralized routing provided by MediorNet, all audio and video signals are distributed in real time between connected nodes in the OB truck, the DSNG van, and MediorNet Compact Pro stageboxes that can be placed wherever they are needed. An Artist 32 digital matrix intercom mainframe enables robust and reliable crew communications for each vehicle. The Artist intercom supports four RSP-2318 SmartPanels and three Bolero wireless beltpacks, with intercom signals distributed by MediorNet. Operators, administrators, and crew now profit from enhanced workflows due to the seamless integration and perfect interplay of all
panels and beltpacks. "Versatility is a critical factor in the design of OB and DSNG vehicles. With Riedel's MediorNet installation, we have once again shown how to achieve a high level of technical quality and versatility, with minimal space and with a competitive budget," said Norman Tettenborn, Principal at Qvest Media. "This system required a particularly compact and efficient media and communications backbone. We knew these requirements could be met with Riedel Communications, as MediorNet offers redundancy, scalability, and a decentralized topography, making it the ideal solution for modular system design for live broadcasting."
RAI chooses ABonAir's wireless systems for integration in its KA-SAT mini-vans

ABonAir has announced that RAI, Italy's biggest television broadcaster, has chosen ABonAir's wireless systems to be integrated inside its KA-SAT mini-vans for news gathering and event coverage operations all over Italy. RAI looked for a solution that would allow it to arrive on scene quickly and go on air with automatic set-up of the wireless system at the highest video link quality. Another requirement was that the system fit into a small news car carrying only a cameraman and a reporter, which means a very reliable system that can easily be operated by a single person. RAI published a tender for the supply of wireless broadcast systems and their integration in the mini-vans together with other video equipment. To carry out the project of setting up eight KA-SAT mini-vans, RAI chose Video Projetti s.r.l., a prime system integrator in Italy.

As its video equipment for live broadcasts, RAI chose ABonAir's AB407™ wireless microwave transmission solution. The AB407 is a lightweight system comprising a portable transmitter and receiver. It can be easily deployed and fits into any small OB van. With a coverage range of 750 meters at a user-controlled variable bit rate, the AB407 enables broadcasters to "cut the cord" and add agility and mobility to their live broadcasts. The AB407 uses an H.264 codec, and every pixel is acknowledged thanks to a bi-directional radio channel between transmitter and receiver.

Here is what Andrea Cafasso, mobile systems engineer at RAI satellite production systems, says about the ABonAir system: "We now work with the ABonAir system, which fitted our needs. It allowed us to get to the scene and deploy quickly. Set-up was very easy and quite intuitive: just plug and play. Within minutes we were on air with good picture quality and the news was at our control room."
PPTV HD Thailand invests in Octopus newsroom automation

Octopus has announced the completion of a newsroom computer system at the Bangkok headquarters of Thai television channel PPTV HD. Founded by entrepreneur Prasert Prasarttong-Osoth, PPTV HD went live on June 1, 2013, with a broad mix of news, entertainment, documentaries and sports, targeting the 18-to-35 age group. "We began as a linear channel transmitting via satellite C-band and cable, extending in 1080i high definition to the Thai terrestrial digital multiplex and later to online networking," says PPTV HD's IT Director Pitsanu Porameesanaporn. "News currently amounts to just over 30 per cent of our output, which is quite a large proportion for any broadcast channel. We tackle it under the five categories of home news, global news, hot issues, lifestyle and entertainment news. Sport occupies another 20 per cent or so of our programme schedule and overlaps quite frequently into the news coverage."

"We selected the Octopus newsroom computer system as the most versatile and operator-friendly way to improve production efficiency. It has proved an ideal choice for creating linear and on-demand news as well as sports coverage, and is a great improvement on what was previously a largely manual workflow."
"Our target viewers, and indeed those of every age group, are increasingly watching via IP-connected digital devices, both at home and when traveling. Mobile viewers are usually in a hurry, so they appreciate the freedom to select the news that most interests them rather than watching traditionally constructed linear programmes. By providing that option, we are also able to compete strongly with other online news sources."
NEWS - BUSINESS & PEOPLE
Pikolo Systems and Mediaproxy partner to deliver advanced incident management

Mediaproxy has announced a new partnership with Pikolo Systems, the advanced workflow automation solutions provider based in Addison, Texas. Pikolo's ITracker allows broadcasters to manage their complex operations, and the collaboration enhances the monitoring confidence levels of enterprise-scale content management. Mediaproxy develops compliance recording systems for broadcasters worldwide. LogServer provides real-time compliance monitoring, reporting, and incident management of multiple channels by uniting access to all broadcast sources via IP. Once logged and stored in a proxy format, content can be retrieved for reconciliation using the intuitive HTML5-based Mediaproxy LogPlayer. The integration of LogServer with Pikolo Systems' flagship ITracker
will provide broadcasters with the enhanced ability to maintain accountability and perform real-time assessment of their operations. The integration connects Incident Tracker events and alerts directly into LogServer. Users of Incident Tracker will now have direct access to recorded archives on LogServer systems. “Our software solutions are designed to help broadcast operations work more efficiently in a homogenous and collaborative environment,” said Vernon Omegah, President at Pikolo Systems. “Incident Tracker provides
broadcasters the means to communicate and engage effectively despite the complexity of their operations. From local television to network and internet operations, to satellite and origination operations, our solutions together with Mediaproxy help broadcasters acquire, manage, distribute and analyze their assets. As software-defined workflows continue to replace older, limited technologies, we seek out technology partnerships with market leaders like Mediaproxy to empower our clients with highly functional, seamless solutions”.
SeaChange and TotalVideo enter into an agreement to develop an OTT platform

Video solutions pioneer SeaChange International (NASDAQ: SEAC) and Russian-based video operator TotalVideo have entered into an agreement to develop a multi-operator, multi-tenant hybrid OTT/DVB/IPTV platform that will enable video and cable providers in Russia to offer multiscreen services to their subscribers. TotalVideo has agreed to develop its new service using SeaChange's PanoramiC platform, and plans to offer the new video platform to service providers across Russia and Eastern Europe. The service platform will provide advanced viewing capabilities for live, linear, and on-demand content across a variety of devices, both fixed and mobile. "With SeaChange, TotalVideo has the scalable foundation to serve our customers as comprehensively and effectively as possible," said Mikhail Silin, General Director, TotalVideo. "The combination of software solutions and professional services provided by SeaChange will allow us to deliver a platform that video providers and operators can easily adopt, allowing them to offer their viewers an enhanced user experience. With our new platform, they will benefit from increased viewership and customer loyalty." The PanoramiC platform is a cloud-based video management and delivery
platform, combining the components of the SeaChange cFlow™ video management and monetization portfolio with complementary streaming solution elements from leading vendors ATEME, ATES Networks, Broadpeak, and castLabs. This highly scalable platform solution allows providers to quickly and cost-effectively deploy personalized video services across multiple screens and devices, while offering a consistent user experience across all of them.
WASP3D Implements AR Workflow for Spring News

Spring News, the Thai television news channel owned by Spring News Television Ltd., part of News Network Corporation PCL, approached WASP3D (Real-time 3D Broadcast Graphics) with a vision to improve on-air graphics quality and enhance subscriptions to new resources. WASP3D implemented its Augmented Reality Broadcast Graphics workflow along with its video wall display solution to uplift the look and feel of Spring News (MCOT). A camera on a crane, together with the MO-SYS StarTracker within the studio space, was converted into a full-scale virtual studio setup. The technology gave the freedom to transform the complete studio environment to match the content. The quality of the graphics delivered was visually stunning: graphics and data interacted with real characters, resulting in a very attractive presentation that gave viewers a realistic feel of all visual elements.

WASP3D on-air graphics can be integrated with cloud data sources to enrich production value and make for lively augmented reality presentations. The system is best suited for channels with a studio space of any size that want to save the cost of building expensive sets while at the same time transforming their presentations into something visually more interesting and engaging. We look forward to catering to Spring News soon.

"WASP3D provided AR and virtual video wall solutions for our news show at Spring News, Thailand. It surely is one of the greatest real-time 3D graphics solutions. The powerful render engine and workflow flexibility helped us boost our broadcast production value." Mr. Chakrit Khuangarin, Vice President, Creative, Spring News.

WASP3D Augmented Reality Broadcast Graphics

In contrast to composite productions that use chroma keying to place an anchor inside a 3D environment, broadcast productions may be set up to make use of physical spaces augmented with 3D virtual graphics. Augmented reality acts as a substitute for building hard sets, similar to virtual set studios, and gives the freedom to transform the complete studio environment to match the content without the use of a blue or green screen. A camera on a crane with the MO-SYS StarTracker and a simple glass-topped table within a studio space can be converted into a full-scale studio setup where graphics and data interact with real characters, displaying data and information in a very attractive manner and giving viewers a realistic feel of all visual elements.

In the case of augmented reality presentations, the studio consists of a camera mounted on a pedestal or crane and fitted with a camera-sensing device from MO-SYS. The WASP3D tracked virtual set system can ingest tracking data from various camera-sensing devices into 3D virtual set scenes. The WASP3D Drone designer has an inbuilt virtual camera which accepts pan, tilt, zoom and focus data coming from the physical camera, matching the perspective of the 3D virtual environment as the physical camera moves in real time. The WASP3D virtual set system can be configured to use tracking data from a camera to map the physical studio onto the computer-generated 3D content (augmented graphics). In the case of a pedestal- or crane-based camera implementation, the X, Y, Z position data and other details can be ingested by the system for even more elaborate camera angles, such as standard views (top, bottom, left, right and full-circle/360-degree), bird's-eye view and others, for richer graphical representations. WASP3D on-air graphics can be integrated with data from cloud data sources to enrich production value; for example, in sports presentations, augmented graphics can be integrated with Opta Sports data.
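As a rough illustration of how tracked virtual set systems of this kind work in general (a hypothetical sketch, not WASP3D's actual API or the MO-SYS data protocol; all names and conventions here are our own), a pan/tilt/zoom sample from a camera sensor can be mapped to a virtual camera's look direction and field of view so the rendered graphics match the physical lens:

```python
import math

def virtual_camera(pan_deg, tilt_deg, zoom):
    """Map a tracked camera's pan/tilt/zoom sample to a virtual camera.

    Hypothetical convention: pan 0 looks down -Z and positive pan turns
    right; tilt 0 is level and positive tilt looks up; zoom scales a
    60-degree reference field of view (zoom 2x halves the FOV tangent).
    """
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    # Forward (look) vector of the virtual camera in x/y/z.
    forward = (
        math.sin(pan) * math.cos(tilt),   # x: right
        math.sin(tilt),                   # y: up
        -math.cos(pan) * math.cos(tilt),  # z: negative is forward
    )
    # Vertical field of view derived from the zoom factor.
    ref_fov = math.radians(60.0)
    fov = 2.0 * math.atan(math.tan(ref_fov / 2.0) / zoom)
    return forward, math.degrees(fov)

# A level, un-zoomed camera looks straight ahead with the full reference FOV.
forward, fov = virtual_camera(pan_deg=0.0, tilt_deg=0.0, zoom=1.0)
```

Real tracking systems additionally supply lens distortion and nodal-offset data so the virtual camera stays aligned with the physical one during fast moves.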
Another Monumental Project in Giza
Broadcasting excellence reaches new peaks.
When D Media, a leading media entity in Egypt and the owner of DMC TV Network, were planning to build a completely new broadcast centre in Egyptian Media Production City, they selected a consortium of two well-established Egyptian system integrators: Systems
Design and Technology KAR to manage the whole project. The scope of the project was to launch several new TV channels broadcasting general entertainment programmes, news, sports, and drama. The consortium of systems integrators in turn allocated different aspects
of the project to selected manufacturers and software houses, including: playout and studio automation to Aveco; the newsroom computer system to Octopus; long-term planning, scheduling and content management to Provys; production and playout servers to EVS and
graphics to Vizrt. Perhaps Systems Design and Technology KAR took some ideas from previous articles about the ABEX Society, which highlighted the wealth of broadcasting expertise to be found among the society's members, within which the first three vendors play a prominent role. In contrast to the habits of many other local broadcasters, DMC specifically wanted to achieve maximum automation of all operational workflows. Scheduling is in pole position when it comes to workflows because all related activities are driven by the schedule. DMC is unique in that its scheduling contains long-term planning which subsequently triggers all downstream activities. A further special feature at DMC is the accurate scheduling of Salah times (Islamic prayer times), which occur five times per day at continually changing times and must be accurately programmed for years ahead. Such scheduling must also include
appropriate and varied content for each prayer time. Out of interest, Salah times are calculated using a sophisticated mathematical formula based on factors such as sunrise, sunset and shadow lengths, in keeping with the long Arabic tradition of expertise in mathematics. These prayer slots are precisely defined within the system so that subsequent operations cannot interfere with these prioritised intervals, thus avoiding any chance of human error.
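To give a flavour of the kind of formula involved, here is a simplified solar-geometry sketch only; production Salah calculators add shadow-ratio rules for Asr and twilight angles for Fajr and Isha, none of which are shown here. Sunrise and sunset can be derived from latitude and the solar declination:

```python
import math

def declination(day_of_year):
    """Approximate solar declination in degrees for a given day of the year."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

def sunset_hour_angle(lat_deg, decl_deg, altitude_deg=-0.833):
    """Hour angle (degrees from solar noon) at which the sun reaches the given
    altitude; -0.833 degrees approximates sunrise/sunset with refraction."""
    lat, decl, alt = (math.radians(x) for x in (lat_deg, decl_deg, altitude_deg))
    cos_h = (math.sin(alt) - math.sin(lat) * math.sin(decl)) / (
        math.cos(lat) * math.cos(decl))
    return math.degrees(math.acos(cos_h))

def day_length_hours(lat_deg, day_of_year):
    """Hours of daylight: the Earth turns 15 degrees of hour angle per hour."""
    return 2 * sunset_hour_angle(lat_deg, declination(day_of_year)) / 15.0

# Around the March equinox in Cairo (latitude ~30 N), daylight is close to 12 hours.
cairo_day = day_length_hours(30.0, 80)
```

Because the declination changes every day, so do the prayer times, which is why they must be computed rather than stored as a fixed daily grid.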
"The whole scheduling and content management operation rests securely on an Oracle relational database platform, thereby facilitating coterminous collaboration across the whole of the team. This is a vast improvement on previous standard office software systems, which were not entirely dissimilar to the local archaeological monoliths," says Adam Krbusek, Senior Provys Consultant, Egypt. One area where spreadsheets are still used is in the field of ad sales, where commercial break time slots are defined and outsourced to ad agencies to fill appropriately. The completed spreadsheets are then imported back into Provys for subsequent playout. A further sophisticated element is the use of Arabic, reading from right to left, for all internal text communications, whilst the menus are still presented in English in line with international standards.
On an interesting final point, Egyptian Media Production City is a concentrated hub of Egyptian media companies all working within a closed campus which is designed to protect the massive capital investments made by the various participants.
Mining the archives for First Man
The movie First Man dramatises the story of Neil Armstrong's famous giant leap for mankind. To give it a strong sense of authenticity, director Damien Chazelle approached NASA to see if there was any archive footage from the Apollo era. The good news was that NASA offered complete access to their archive material and there was plenty of footage from which to choose. For the Apollo 11 launch they had multiple cameras capturing different engineering details of the Saturn V from different angles. The bad news was that it was in a unique format: an obsolete 10-perf 70mm military stock developed by NASA using Kodak Ektachrome reversal film – the sprocket pitch in particular was incompatible with modern film scanning techniques. David Keighley, president at IMAX, suggested a prototype scanner at FilmLight might offer a way to transfer the footage. Visual Effects Producer Kevin Elam also found there existed a gate made for a Rank Cintel telecine at the White Sands Missile Range, so whilst this could provide a telecine reference, a modern scan quality was still desired.
Archive footage from NASA used in First Man movie
Kodak developed a special Ektachrome reversal stock for NASA, with an extremely high dynamic range. That meant that all the detail of night-time launches could be captured, with the highlights of the rocket exhaust not burning out the detail of the activity around the launchpad.
Kevin Elam contacted FilmLight to see what could be done. FilmLight is widely recognised, not just for its colour science, but for its understanding of the medium of film. Established in 2002, FilmLight develops unique colour grading systems, image processing applications and workflow tools for the film and video post production industry. FilmLight's first product was the Northlight film scanner, a motion picture film scanner capable of scanning 16, 35 or 65 mm film formats. Fortuitously, FilmLight has a continuing research project into advanced film scanning techniques, recognising that archive preservation is going to be a critical issue in the years to come. Archives come in very many formats, the majority long abandoned. But the deteriorating nature of much of the material, though, means that frequently conservationists get just one shot at scanning the
film. So one of the guiding principles of FilmLight's research project – codename Arclight – is that it should be capable of scanning any piece of film, regardless of gauge or number and position of perforations. This seemed like an ideal test for the development. Chris Hall leads the project for FilmLight, and he takes up the story. "I was sent a test film, and all I had to do was load it up and define an appropriate format in the software – the dimensions of the film frame and the sprockets," he said. "The image area then came out as a square frame of just over 7000 lines: actually 7250 x 7250 pixels. That gave a very high-resolution scan of the images." "While scanning this film, it started out almost completely opaque, but I was relieved to see the image of a rocket, on a launch stage, at night, clearly visible on the monitor."
"Then suddenly the film became almost transparent as the rockets fire," he said. "The Ektachrome was slightly over-exposed at this point – which is completely understandable considering what it is pointed at. This range of densities is always a challenge for a film scanner, but we are pleased with the image quality capturing the full dynamic range. There was a lot of footage to scan, mostly of the actual launch of Apollo 11 with some shots of the other Apollo launches mixed in." In all, FilmLight scanned around 20 minutes of footage for the producers of First Man. The movie's goal of true authenticity was greatly boosted by the inclusion of this material, and the exercise enhanced FilmLight's understanding of the challenges around digitising archive footage at the highest possible quality.
Is IP really a better way to produce video? This seems to be a simple question, but it has a difficult answer. Let's try to discover whether IP is better than a traditional SDI infrastructure when talking about live production.
Author: Yeray Alfageme, Business Technology Manager Olympic Channel
IP LIVE PRODUCTION
IP is better than SDI for video infrastructure for multiple reasons: it is more scalable, more flexible and format agnostic. There is also one popular reason that comes to mind when talking about IP in terms of equipment: we can use standard IT infrastructure for video. That is not 100% true. When building a video-over-IP environment we do use "standard" IP infrastructure, such as LAN switches and routers, but these systems need to be properly configured. In practice this results in separate networks, one for video and another for data.
Everybody implementing IP infrastructure at broadcast centres does it this way. IP is widely implemented on contribution and distribution networks because it is more flexible: you can adapt it event by event and, when a service provider delivers the service, pay only for what you consume. For the past couple of years, however, OB van owners have been building, or at least considering, IP infrastructure for their new live production units. This makes even more sense when combined with contribution and distribution, but there are more reasons to do it. Most manufacturers now focus their new developments on IP only, which means that in the future the support and spare parts available for SDI equipment will be limited. An OB van has a lifecycle of about 10 years, and we need to consider these factors when making such a large investment in a new one. Even though more training is needed at the beginning to reach the same level of knowledge as with traditional SDI systems, in the long run an IP system is cheaper to expand and more flexible in the face of future changes, such as 4K or HDR. Going from HD to 4K in SDI means quadrupling the number of cables, but in IP it just means provisioning more bandwidth. Moving from SDR to HDR in SDI means acquiring 3G equipment, but in IP it again just means more bandwidth. So it is easier to adapt and expand an IP video system than an SDI one.
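A quick back-of-envelope calculation shows why. For an uncompressed 4:2:2 10-bit signal (20 bits per pixel on average), the active-video bitrate scales linearly with pixel count, so UHD is simply four times 1080p: no new cable type required, just network capacity. (This counts active video only; real SMPTE ST 2110-20 streams add packet overhead, and SDI interface rates also carry blanking, hence the familiar 1.5G/3G/12G figures.)

```python
def uncompressed_gbps(width, height, fps, bits_per_pixel=20):
    """Active-video bitrate in Gb/s for 4:2:2 10-bit video
    (10 bits of luma plus 10 bits of shared chroma per pixel)."""
    return width * height * bits_per_pixel * fps / 1e9

hd_50p  = uncompressed_gbps(1920, 1080, 50)   # about 2.07 Gb/s
uhd_50p = uncompressed_gbps(3840, 2160, 50)   # about 8.29 Gb/s, exactly 4x HD
```

On a 25G or 100G network link, that factor of four is a capacity-planning exercise rather than a re-cabling project.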
When operating it, after all the configuration is done and the doors of the OB van open two hours before the match, the production team does not see much difference in how operations are conducted. The change is in support. It is the engineering team that needs to learn and adapt to this new technology, but it is easier for a video engineer to move towards IT than the other way around, simply because IT skills are far more widespread than specialist video knowledge.
The only thing that may concern "traditional" broadcast engineers is stability and reliability. IT systems can manage packet losses and recover communications from them, but that is not an option when talking about video: a packet loss is a glitch, and that is not acceptable at all. We must consider, however, that video-over-IP systems are now also designed for this, so stability and reliability are guaranteed. We say "now" because at the beginning of video-over-IP this was not the case, and that has been a handicap for IP as well: if a new technology is not reliable from the start, it takes time to trust it. But today we can say that current IP systems handle video without issues when properly configured. The installation phase is much simpler in IP too. In SDI, one video signal means one cable, but in IP a single cable can carry more than one signal, decreasing the total amount of installation material and wiring needed. So where is the difficulty when building IP systems? In the design of the network. The whole system relies on the network, so we need to design it properly. The time we no longer spend on wiring, let's spend on designing the network. There is no video router or glue, but the network is more than key, and we need to dedicate all our effort there to guarantee the system will work in the future and for the future. A major change with IP is the bidirectional communication it establishes, and it makes a difference when talking about permanent installations like studios. If you have a studio on the 5th floor of a building and your MCR is in the basement, it makes no sense today to run video cables; instead you establish data connections that can carry a signal in both directions, and every destination is also a source of video. Protocols like NDI facilitate this by discovering systems over the network. When you connect a new NDI system to the network, the rest of the systems discover it and it appears as a source or destination, or both, on your vision mixer, your recorders and more. It is much simpler than adding video equipment to an SDI installation. More than that, in permanent installations we can reallocate resources between galleries with IP. You just need to reassign the equipment to one production or the other and,
because everything is connected over the network, there is no need for physical reallocation or movement of systems. This means that with less equipment you can do more. And what about the "near" future? What about 4K? As noted above, all IP systems are format agnostic, and it is true: you can mix definitions, standards and frame rates over the same network without issues. Of course, you will need to adapt them if you want to use them together, but you will be able to transport them without transforming them. So what is going to happen with 4K? The same. The network will handle 4K signals in the same way it handles multiple formats today; we only need to take care of the bandwidth. Instead of quadrupling our cabling we just need to multiply our network capacity, something much easier and cheaper. It is possible today to shoot live broadcast-quality programming using IP-connected equipment. Costs today are competitive with providing a similar function using an SDI architecture, with the added benefit that IP systems are both bidirectional and more flexible than SDI cabling.
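The NDI-style discovery described above (in practice NDI announces sources via mDNS, service type "_ndi._tcp.local.") boils down to bookkeeping that every receiver performs: track announcements, drop withdrawals, and offer what remains as inputs. A minimal illustrative sketch, not NDI's actual SDK:

```python
class SourceDirectory:
    """Tracks the live list of sources a receiver can offer as inputs."""

    def __init__(self):
        self._sources = {}  # source name -> (host, port)

    def on_announce(self, name, host, port):
        # A new source appeared on the network (or re-announced itself).
        self._sources[name] = (host, port)

    def on_withdraw(self, name):
        # A source left the network; drop it from the menus.
        self._sources.pop(name, None)

    def available(self):
        return sorted(self._sources)

# A vision mixer's source list updates as devices come and go.
directory = SourceDirectory()
directory.on_announce("STUDIO (CAM 1)", "10.0.0.11", 5961)
directory.on_announce("STUDIO (CAM 2)", "10.0.0.12", 5961)
directory.on_withdraw("STUDIO (CAM 2)")
```

The host addresses and source names here are invented for illustration; the point is that no router configuration or patching is involved when a device joins or leaves.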
THE CHALLENGES OF IP-BASED PRODUCTION By Kieran Kunhya, Managing Director, Open Broadcast Systems
IP-based production has the potential to revolutionise broadcast workflows, making them more efficient than ever before. In 2018, we saw some broadcasters undertake major projects, creating brand new IP production facilities. However, these were not without their challenges and many others are watching from the sidelines to see how successful they will be before making the leap themselves.
The Benefits of IP
Using IP for production has a number of clear benefits. One of the most obvious is flexibility. Broadcasters often need to scale up operations very quickly, for example during coverage of a major sporting event. With traditional
broadcast equipment, this means adding more hardware, often with significant amounts of legacy cabling, which means you need the space to put it, and lead times are often long. That is fine if you have the space and know well in advance when demand will increase. IP means you can instantly scale up or down according to needs and resources.
IP is also extremely cost-efficient - it can be delivered using off-the-shelf IT hardware. Crucially, this also means that the same hardware can be used for any number of different functions, and those can be adapted at any time without the need for new hardware or massive reconfiguration work. The other major benefit of IP is the fact that it is future proof. Formats and workflows are always evolving as the broadcast landscape changes. With an IP workflow, broadcasters can more easily adapt to format changes without the need for massive infrastructure alterations or replacements. A good example of this would be supporting UHD without new cabling.
The Technical Hurdles
IP has obvious advantages and is already proven as a reliable way of contributing and delivering media content. With that in mind, why isn't everyone already doing it? The answer is simply that there remain a few hurdles to get over, and the challenges can seem somewhat overwhelming.
These are the main barriers for IP-based production:
1. Network Design
When it comes to IP production, there are many different formats right through the workflow and very little in the way of proven standards. For example, in terms of the core network architecture, designs are generally based around leaf-spine or central systems. All of our customers use central systems, because they are generally the early adopters and leaf-spine simply wasn't viable when they set up their networks. Leaf-spine architectures are now getting there, but they are not yet fully mature, though some broadcasters are deploying them. However, leaf-spine architectures may require vendor-specific Software Defined Networking (SDN) protocols to be used, and significant care must be taken in their design.
2. Interoperability
With regard to control
systems, vendors are adopting different approaches and this is causing problems for users; some of these control protocols can be very old. NMOS could be the solution to this, but vendor support is limited owing to the complexity of integrating it into products. In a SMPTE ST 2110 environment, which most (but not all) large-scale facilities are choosing, audio is probably the biggest practical issue right now, and again a lot of that comes down to differing formats and protocols causing problems with interoperability. With audio you also introduce differences in the number of audio channels, and there is a risk of adding latency, especially if audio is being fed through different systems. Early ST 2110 senders are hardware-based and will be compliant with the narrow sender profile of ST 2110-21. But future senders will be software-based and comply only with the wide sender profile, which is incompatible with narrow receivers. Care must be taken in equipment selection and testing. Also
there has been limited vendor testing against the SMPTE ST 2110-21 timing models, and it is likely this will cause issues in the future, especially given the lack of test products.
3. IP Maturity
As broadcasters started to look at moving to live IP workflows, vendors rushed to deliver the right tools first. The problem is that many of the tools cited as IP capable are relying on the long timespan of many IP production projects, meaning that solutions will only be ready once those projects are finalised in two to three years' time. However, where projects have tight turnaround times, the current solutions are not mature enough and require a great deal of work and effort to make the system run as it should. There is a gap opening up between what should be possible and what is currently available, and no-one seems to be ready to fill it.
4. Migrating Systems
Building new systems from scratch is one challenge, but migrating existing facilities to IP is another level of complexity. This is especially difficult if you have live services that need to be completely switched over to the new system in one go, which comes with an obvious risk factor. An easier approach would be to move systems over gradually, but that is not always possible.
5. The Expertise
All of the challenges above can be solved with technology and a lot of hard work; however, having the right expertise remains a challenging barrier to overcome. An IP-based production network is totally different to existing systems, and staff are often not equipped to operate a totally new type of facility. Not only that, but IP-native devices are hard to troubleshoot and difficult for operators to understand. A move to IP means training on a massive scale. It also means working closely with solutions vendors, often requiring vendors to be on site testing in labs to ensure everything is working as it should be, which may take much longer than anticipated. It should also involve remote monitoring by the vendor. Many of the services we have worked with would not be on air if it weren't for remote access allowing us to make suggestions and troubleshoot remotely. Challenges aside, IP still has the potential to revolutionise production workflows, but it needs to be implemented well. Taking the time to do it right the first time will mean broadcasters can truly maximise the benefits it affords.
IP MEANS INTERCONNECTED PRODUCTION By Scott Carroll, Newtek
IP – an acronym that means a lot of different things to a lot of people in the video industry. Buried within its intended meaning, "Internet Protocol", is the key suffix "net", as in network. It is within that suffix, in the computer software-driven advantages of networking, that we find new meaning for the future of our industry: transforming the highly limited linear production model into a fantastic new world of possibility and opportunity where every source becomes a destination. In its broader context, call it Interconnected Production.
As manufacturers, developers and the broadcasters themselves push the limits of today’s emerging IP technology, predecessors in our respective fields might barely recognize the landscape not only as workflows adapt and change but as new business models emerge as well. The bigger video production universe
surrounding us is rapidly changing face with SMPTE ST 2110, NDI®, and other standards gaining widespread traction. It is now simply a matter of time before the market matures and the world of broadcast, sport, and more replace their baseband investments and never look back. One might even formulate the mistaken impression that the IP revolution is nearing completion. However, the proliferation of IT-based infrastructure and operations is simply the first milestone of a journey into a new and relatively unexplored frontier. For as finished as interconnectivity within a control room, OB vehicle, or even an entire facility or campus using networking as the backbone might feel, simply replicating point-to-point digital workflows, tracing over the
sprawling map of SDI production framework, would miss out on the unique advantages of IP. Fortunately, the IP-focused vendors and producers at the forefront of the revolution are not settling for just reconnecting the dots but reshaping the industry as we know it.
Complex Workflows
The classic vision of an entire production crew with all of their equipment crammed in a dark room
illuminated only by the tools of their trade will not die easily. But, just as video calling and screen sharing have become commonplace in day-to-day business, soon too will sending, receiving, and processing actionable real-time content in multiple locations become a mundane activity in the context of a live production workflow. Solutions providers simply need to adjust their mindset to imagine and trust what IP can do.
Imagine a server rack as a condensed version of a control room, itself a microcosm of a facility full of technology. Deploying IP for interconnectivity and interoperability is the new end game, transforming a production area from a few feet to a few miles to who knows how far away – removing the limitations of traditional box equipment in dramatic and breathtaking fashion.
Software-Driven Production
Speaking of those boxes, freeing the essence of live production from its proprietary cage is the distinct advantage of software-driven production. Beyond scalability and flexibility through code, the benefit of highly specialized computer programming is that it isn't tied to meticulously engineered hardware. This means it can be moved about the ether – called upon as needed, where needed, and in whatever data center has vacancy – easily combining with other platforms to create either a massive or minuscule virtual ecosystem, scaling up and down as desired. Virtualization has long been the dream of the highly capitalized broadcast studio. While a known and proven concept in the world of IT, virtualization currently requires a unique constellation of leading-edge technology for live production. But with IP, that door is now open to broadcasters and producers – and not merely
for ancillary operations supporting the core process of program creation. This is only one area where new possibilities and new business models emerge, with ever-expanding visions of new power and capability within an all-IP future.
Cloud Implementations
Not that long ago, cloud-based production – the consensus next big thing – and the elements comprising it were subject to smirks and scoffs from the field. Regardless, legitimate cloud-based production is on the horizon, with software-driven, IP-native video solutions residing in the external data center, free of their black boxes, readily and remotely accessible anywhere in the world, with sources and destinations similarly distributed. The cloud business model that has taken the world by storm, allowing other media and technologies to flourish, is inevitably where this IP movement is headed – perhaps, by design. And the groundwork is being laid right now.

Computers, Software, and Networks
Computer-based production was once considered a novelty, software a mere accessory, and IP networks the domain of another department within the organization. As the modern world evolved and the amazing things these three technical components working together could accomplish were seen and experienced in other industries, it was only a matter of time before the broadcast space found a way to follow suit. In the last decade, production with computers, software, and networks has advanced by leaps and bounds. Much more than a force to be reckoned with, it is now the engine driving the transformation of an entire industry.

While the established practices, principles, and long-standing ways of working that the industry has relied upon for decades remain valid and stand to benefit immensely from IP interoperability, continuing down the road being paved by software-driven production, powered by computers and networks, is where the real opportunities lie. Interconnected Production is the new reality.
FORGING AHEAD WITH ALL-IP PRODUCTION By Andrew Warman, Director of Playout Solutions at Harmonic
All-IP-based production is on the rise. There are a growing number of examples of all-IP production for live events, studio, news and other workflows that prove it is not only possible but practical to operate in an entirely Ethernet-based environment. Does this mean that all of the issues have been worked out and the industry is completely ready to replace SDI? Not yet. But we are learning and adapting quickly.
This article will explore some of the issues with all-IP production workflows and practical ways to prepare for the transition. It will also look at the benefits of IP, explaining how it improves upon what’s possible with SDI today, as well as how innovative production solutions that bridge the IP and SDI worlds are helping this transition fall seamlessly into place.
Current State of IP Specifications and Challenges
IP standardization is underway, and while a wide range of capabilities are available that enable multi-vendor solutions to be built, a significant degree of customization is still needed to create functional systems. We have solved the technical challenge of transporting video, audio and data over a network (SMPTE ST 2110) and of keeping everything in sync (SMPTE ST 2059). However, ubiquity is the goal, and achieving a plug-and-play approach is crucial to making media over IP easier to use. AMWA specifications are now available for handling media over IP, with the aim of resolving connection issues and creating a plug-and-play type environment for devices as they connect to and disconnect from a network — specifically discovery and registration and connection
management. Once these are widely adopted, it will take the guesswork out of knowing what types of sources and destinations are available and simplify connecting compliant sources and destinations to one another. The specifications that will help to optimize IP production include:
• IS-04 for discovery and registration. This protocol specifies APIs that allow network-connected devices to register their resources on a shared registry, and client applications to query the registry and subscribe to updates.
• IS-05 for connection management. This specification uses APIs to connect one flow of video, audio or data to another, verifying that the sender and receiver are compatible. It relies on IS-04 for the necessary information.
• IS-06 for network management. This specification outlines how to discover network and endpoint topology; create, modify and delete network flows; reserve network resources; establish bandwidth limits and flow priorities; and more.
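For instance, an IS-04 Query API exposes registered resources as JSON over HTTP (e.g. GET /x-nmos/query/v1.3/senders), and a control application filters what it finds before making IS-05 connections. The sender records below are invented for illustration, though the "transport" URN values follow the NMOS scheme:

```python
# Abbreviated sender records, shaped like IS-04 Query API results.
senders = [
    {"id": "1f3a", "label": "CAM 1 video", "transport": "urn:x-nmos:transport:rtp.mcast"},
    {"id": "2b7c", "label": "CAM 1 audio", "transport": "urn:x-nmos:transport:rtp.mcast"},
    {"id": "9d04", "label": "Legacy feed", "transport": "urn:x-nmos:transport:dash"},
]

def rtp_sender_labels(senders):
    """Pick out the senders an RTP-capable receiver could connect to via IS-05."""
    return [s["label"] for s in senders
            if s["transport"].startswith("urn:x-nmos:transport:rtp")]
```

This is the sense in which the specifications "take the guesswork out": compatibility can be checked from registry data before any connection is attempted.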
Benefits of IP
IP fundamentally changes the game of production by offering any-to-any connectivity, access to a vast pool of bandwidth, multiple protocols and streams per connection, and a path to COTS, virtualization and cloud-based solutions. Using the massive amount of bandwidth provided by IP networking, handling high volumes of HD and UHD video flows along with high-data-rate file-based workflows and video monitoring becomes a practical proposition. By supporting multiple
All-IP Workflow Diagram
protocols, IP enables production systems to use a single physical network connection to carry numerous types of video, audio and data flows. This approach is very different from SDI, where only one flow is enabled per physical wire. In an IP environment, production can go beyond delivering just real-time video over a single connection. It can also support routing and signal distribution, device control, monitoring, configuration and management over the same infrastructure. IP allows many different flows to be mixed together, so a system may have dozens or even hundreds of streams going backward and forward over one connection. This vastly reduces wiring, space requirements and physical infrastructure. Another key benefit of all-IP production is that it paves the way for more software-based systems in workflows, as well as virtualization and cloud. In the IP universe, everything relies on networking. In the future, broadcasters can expect to see more microservice
architectures and a growing reliance on software solutions. An on-premises system can take advantage of COTS hardware, whether that involves software on bare metal or virtualization.
Practical Steps to Take
When preparing for the transition to all-IP production, there are a few practical steps to take. If an all-IP system is not possible due to workflow or budget constraints, production equipment that supports SDI and IP can be used. Products with hybrid functionality eliminate the need for conversion equipment and enable users to transition in stages to reach the end goal of all-IP, without requiring a complete infrastructure upgrade. Another capability to explore is the cloud: a growing number of production processes are now accessible or practical in cloud environments. Cloud-based workflows tend to rely on compressed media, while uncompressed workflows are primarily supported on-premises.
Harmonic's Spectrum™ X advanced media server system and Electra® media processor family are helping drive the industry toward all-IP and cloud-based production environments, providing hybrid support for both SDI and IP, as well as all video, audio and data standards, including SMPTE ST 2022-6 and ST 2110 and AMWA IS-04 and IS-05. Using Spectrum and Electra systems, broadcasters can realize the full potential of IP, and have a path to cloud and software. As an active participant in global standards bodies and interoperability testing, and a leader in real-world deployments, Harmonic is moving the industry forward toward an all-IP production and delivery future, bringing increased agility, efficiency, scalability and cost savings to broadcasters.

Conclusion
All-IP production is achievable today, but future developments will provide even more now that AMWA's IS-04 and IS-05 specifications are being adopted. Once support for plug and play is in place, more benefits of operating a production workflow in an all-IP environment will become available.
THE EVOLUTION OF IP BASED BROADCAST WORKFLOWS By David Schleifer, COO, Primestream
The core of video production moved over to IP years ago; it is everything around the edges that is now catching up. As soon as we moved from film and tape to files, we started importing them to computers with hard drives, moving them around, storing them centrally and working in an IP-based network of tools. It took some time for the workflow to be real-time or faster and to scale up to sizes that facilities really needed, but we have reached the point where production facilities are built around an IP infrastructure, and in many cases off-the-shelf hardware. But there are still parts of the end-to-end workflow that have lagged behind.
So far, video coming in from live remotes, live production, or cameras, and video going out to viewers has remained SDI. The answer has been to put servers at each end of the pipeline and convert video into computer friendly formats so that it can get in
and out of the production system to be edited, transcoded, managed, archived and delivered. The reason for this is that SDI guarantees bandwidth and delivery in real-time. You know that your signal will fit down the wire and get to the other end intact. IP-based solutions are great for moving files around and in some cases can do so in faster than real-time, but at its core, generic networking technology was not designed the same way as broadcast video technology. This has meant that there have continued to be places where SDI is the better solution for the task at hand. With the existing investment that most facilities have in SDI infrastructure, the wall between SDI and IP parts of
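The bandwidth guarantee described above is easy to quantify. As a rough back-of-the-envelope sketch (nominal figures, not from the article), an uncompressed 1080p60 10-bit 4:2:2 signal occupies roughly 2.5 Gbit/s of active picture alone, which is why it rides a 2.97 Gbit/s 3G-SDI link (the extra capacity covers blanking and overheads) and why a best-effort IP network historically struggled to match SDI's determinism:

```python
# Back-of-the-envelope bitrate for uncompressed video (nominal values).
# 4:2:2 sampling carries one luma (Y) sample per pixel plus one chroma
# sample (alternating Cb/Cr), i.e. 2 samples per pixel.
def active_video_bitrate(width, height, fps, bit_depth, samples_per_pixel=2):
    """Bits per second of active picture, ignoring blanking and overheads."""
    return width * height * samples_per_pixel * bit_depth * fps

hd = active_video_bitrate(1920, 1080, 60, 10)
print(f"1080p60 10-bit 4:2:2: {hd / 1e9:.2f} Gbit/s")  # ~2.49 Gbit/s
uhd = active_video_bitrate(3840, 2160, 60, 10)
print(f"2160p60 10-bit 4:2:2: {uhd / 1e9:.2f} Gbit/s")  # ~9.95 Gbit/s
```

A flow of this size must arrive every frame, on time, with no drops; a file transfer can simply retry, which is the asymmetry the author is pointing at.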
the solution has been maintained in a balance for the past few years. Vendors have continued to look at how to deliver complete IP-based solutions because customers are looking to reduce overall cost, increase flexibility and improve workflow. Codecs keep getting better, the bandwidth and reliability of networks keep improving, and the number of devices that offer some form of IP output or input keeps increasing. The result is that
vendors have more and more to work with as they try to solve more problems while still delivering the quality of service that broadcasters and content producers require. The challenge remains that, as an industry, we need to balance SDI and IP, and existing infrastructure with new technology, all while providing different things to different types of customers. Some are looking for futureproofing and a path forward, while others are building
from the ground up and want to avoid any legacy solutions they feel will be left behind. The path that Primestream has taken has been to define problems that can be solved in new ways. With the goal of helping content creators optimize their media creativity, we have taken an approach which is less concerned with maintaining legacy infrastructure, components, or revenue streams, and is simply focused on finding better
ways to get the job done. In some cases, others have led the way, such as review and approval over the internet; we have added ease of use and tight integration with asset management. We continue to make improvements where we can, but in many areas we are already pushing the limits of what can be delivered with current processes, products, technology and architecture.

Primestream's IP Ingest Media Framework is an example of how vendors can take steps to bridge the SDI and IP islands, while providing new, flexible workflow and efficiency. In essence, it facilitates the capture, editing and transcoding of web, camera and broadcast sources such as HLS, MPEG-DASH, RTSP, RTMP, SMPTE 2022 and NDI streams, along with traditional SDI feeds, into single or multiple self-contained file formats to integrate that content into a production, management and delivery workflow. This media framework gives users the ability to capture all of the sources required using a single system, and the same workflow across all source types. It is also built to manage SDI sources, so with the right hardware it can manage any signal needed in or out of the facility. This is a major step in being able to support the same workflow across all media inputs without having certain paths, hurdles or compromises for some signals that make users continue to fall back on traditional workflow solutions. By making all signal sources equal we can speed up adoption and integration into the workflow.

The reality of IP is that it is a moving target, even though the base layer of technology is off-the-shelf and well understood. What we want to do with it, which codecs we are willing to leverage, what devices have what combination of codecs and wrappers, etc., is changing all the time. In fact, this is an area where I believe we will see a lot of change and progress over time. The same reason why everyone wants to leverage IP-based solutions is why it will evolve rapidly. It is an open system that gives a broad group of innovators the ability to leverage it in their own way. Whether it is finding a better way to connect from the field or more efficient ways to compress signals on the wire, a lot of brainpower and energy is going into solving and improving these issues across the board. This is why we felt that building a plug-in architecture that lets us mix and match sources, codecs and wrappers was the best possible path we could take to help our customers optimize their people.
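The plug-in architecture described here can be pictured as a registry that maps source types to interchangeable handlers, so new protocols, codecs or wrappers slot in without touching the core pipeline. This is a hypothetical sketch of the pattern, not Primestream's actual API:

```python
# Hypothetical plug-in registry: the core ingest path stays fixed while
# handlers for protocols/codecs/wrappers are registered independently.
class IngestRegistry:
    def __init__(self):
        self._handlers = {}

    def register(self, scheme, handler):
        """Associate a URL scheme (e.g. 'rtmp', 'ndi', 'sdi') with a handler."""
        self._handlers[scheme] = handler

    def ingest(self, url):
        scheme = url.split("://", 1)[0]
        if scheme not in self._handlers:
            raise ValueError(f"no plug-in registered for '{scheme}'")
        return self._handlers[scheme](url)

registry = IngestRegistry()
registry.register("rtmp", lambda url: f"capturing stream from {url}")
registry.register("sdi", lambda url: f"capturing baseband from {url}")

print(registry.ingest("rtmp://venue/cam1"))  # capturing stream from rtmp://venue/cam1
```

The design choice is the one the article argues for: by making every source a peer behind the same interface, all signals enter the same downstream workflow.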
PHABRIX AND IP-BASED PRODUCTIONS By Prin Boon, Product Manager, PHABRIX
As a world leader in broadcast test and measurement, PHABRIX offers a full range of portable and rackmount systems for rapid fault diagnosis, compliance monitoring and product development. With the adoption of IP-based production becoming more commonplace, the industry requires IP test and
measurement to ensure the smooth delivery of content. PHABRIX has addressed this with the introduction of SMPTE ST 2110 (-10, -20, -30, -40) with PTP and ST 2022-6 support to its flagship Qx IP Rasterizer and handheld Sx TAG, offering unbeatable solutions for hybrid IP/SDI environments. When considering a move to IP, the economics need to
be discussed. End-users can't just make an 'apples for apples' comparison with SDI because the concepts are so different. The greater capabilities of IP are one justification for making the move. For example, the weight and space saving when switching from SDI copper to fibre IP in an OB truck is up to 7 tons for a large truck, plus enough rack space that a (second) vision mixer dedicated to HDR operation can be fitted. This gives much more flexibility in the layout and configuration of the vehicle, with the added benefit of being HDR-enabled. For a studio complex, many of the new buildings being commissioned by broadcasters, such as the BBC Cardiff project, CBC in Canada and TPC in Switzerland, are able to
shrink the amount of rack space required. All file-based workflows in these plants already use IP, and for this size of operation IP makes sense for live operation too, as large-scale IP routing is, in general, less expensive than the SDI equivalent. Remote production is also now a realistic proposition thanks to IP, and it is already taking place routinely with fly-packs and highly compressed links back to the studio. For non-critical camera shading, there are host broadcasters who already use remote shading.
It is becoming increasingly common for smaller productions to be run remotely with an IP-linked 'fly-pack' placed in the venue. High-capacity, low-latency IP circuits have been used successfully for complex remote production experiments, such as that conducted by UEFA and BT Sport in summer 2017. From a technical perspective, there have already been several experiments involving the backhaul of tens of cameras over uncompressed links back to the studio, with the
entire live production team, including the audio and vision mixers and camera shaders, operating out of the studio over IP links. The achievable round-trip latency has already been shown to be low enough for this to be practical. What is needed now is the widespread availability of the necessary high-capacity, high-reliability infrastructure to all of the required venues to accommodate this type of workflow.
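The ST 2110 systems that test equipment like this measures depend on PTP (IEEE 1588) for timing, and the offset between a device clock and the grandmaster is derived from four timestamps exchanged each sync cycle. A minimal sketch of the standard calculation, with illustrative numbers only:

```python
# IEEE 1588 (PTP) offset/delay from the four message timestamps:
# t1 = Sync sent by master, t2 = Sync received by slave,
# t3 = Delay_Req sent by slave, t4 = Delay_Req received by master.
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Returns (clock offset, mean path delay), assuming a symmetric path."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Illustrative nanosecond timestamps: slave clock runs 500 ns ahead of the
# master and the one-way path delay is 2000 ns.
offset, delay = ptp_offset_and_delay(t1=0, t2=2500, t3=10000, t4=11500)
print(offset, delay)  # 500.0 2000.0
```

The symmetric-path assumption is exactly what PTP-aware switches and measurement tools exist to verify, since asymmetry shows up directly as offset error.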
IP-BASED PRODUCTION SPECIAL REPORT By Olivier Suard, Vice President of Marketing, Nevion
In recent years, we have seen more broadcasters moving to IP-based production. This was initially driven by the anticipated cost savings deriving from using enterprise equipment, but IP solutions are now increasingly being used to transform workflows, enabling more efficient sharing of resources like studios, control rooms, equipment and indeed production staff, thereby increasing production nimbleness and reducing operations costs.
While there are clear operational benefits to using IP, these can only be achieved through capital expenditure, which broadcasters obviously need to justify. The justification comes when they are faced with situations where they would need to invest anyhow. At that point, the question for them becomes whether to invest in traditional SDI technology and infrastructure or move to IP. Currently, there are three main use cases where such investment decisions are leading broadcasters to move to IP.
Remote production The first, and most common, is remote production. Covering sporting, entertainment, news or other events away from broadcast facilities is a complex and costly exercise for broadcasters, as OB units need to be deployed for on-site production. Broadcasters, and the providers of services to them, are periodically faced with the need to decide how to cover major events, for example when the FIFA World Cup or the Summer Games take place. A cost-effective alternative to using OB units is IP remote production, whereby all or most of the feeds are transported from where the event is happening to the central facilities, where the production takes place. IP
remote production not only reduces the amount of equipment and people that need to be on-site, but also enables broadcasters to make better use of the resources in the facilities, not least the production team, which can produce multiple shows without travelling. A further consideration is that, if the broadcaster wants to move to 4K/UHD, IP provides a much easier and more cost-effective solution than upgrading OB trucks to handle the technology. A recent example of migration to IP remote
production is provided by Streamteam Nordic Oy. The technical broadcast production service provider deployed Nevion's software-defined media node platform, Virtuoso, to remotely produce the Finnish elite ice hockey league, SM-liiga, for the broadcast rights owner Telia. On every match day, up to seven games take place simultaneously across the country, and up to 15 camera feeds per game are transported to the central facility in Helsinki, where the production takes place. The decision to pick IP remote production was driven by the desire to reduce costs and
optimize the use of production resources. It is worth noting that, while IP remote production is great for high-profile events, it can equally be used to make the coverage of events with small audience appeal, such as Danish horse racing, profitable.
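The economics of a remote production like the one described come down to contribution-link capacity. As a rough sketch with assumed bitrates (illustrative figures, not reported by Streamteam Nordic or Nevion), 15 camera feeds fit comfortably in a 10 Gbit/s circuit once mezzanine-compressed, whereas uncompressed HD would not:

```python
# Rough contribution-bandwidth budget for a remote production.
# Bitrates are assumptions for illustration, not figures from the article.
FEEDS = 15
UNCOMPRESSED_HD = 2.49e9   # ~1080p60 10-bit 4:2:2, active video only
MEZZANINE = 200e6          # e.g. a light mezzanine codec at 200 Mbit/s

CIRCUIT = 10e9             # assumed 10 Gbit/s contribution circuit
for label, per_feed in [("uncompressed", UNCOMPRESSED_HD),
                        ("mezzanine", MEZZANINE)]:
    total = FEEDS * per_feed
    verdict = "fits" if total <= CIRCUIT else "exceeds"
    print(f"{label}: {total / 1e9:.1f} Gbit/s total, {verdict} the circuit")
```

This is why "highly compressed links back to the studio" recur throughout these articles: the compression choice, not the camera count, usually decides whether remote production is affordable.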
New facilities The second main driver for broadcasters to move to IP production (and probably the most high-profile reason currently), is the need to move to new premises. This is a point where broadcasters need to choose
Nevion VideoIPath is a convergent orchestration and SDN (software defined networking) control software system
whether to build a traditional SDI broadcast infrastructure into the facilities or opt for IP. SDI is the safe option, and remains competitive cost-wise compared to IP, but clearly the latter is the future of broadcasting. The move to IP in the facilities is an order of magnitude more complex technically than remote production, because of the need to accommodate a much higher number of signals, the frequent switching, and the synchronization amongst other things.
Some broadcasters have opted for solutions which are essentially “SDI in IP clothing”. While having a familiar feel, these solutions will not deliver the benefits over time that real IP solutions can. Other broadcasters who have chosen IP have ended up with solutions that have issues as a result of the wrong network architecture and control decisions being made from the onset. Some very high-profile IP projects remain in trouble. Successful IP-in-the-facilities projects, such as TV 2 in Norway or PLAZAMEDIA in Germany, have taken a
ground-up approach that is essentially broadcast-centric and distributed in nature. Even if a project is initially focused on a single facility, it can safely be assumed that multiple sites will eventually come on stream, and these will need to be handled in a homogeneous way from a network point of view, leveraging the opportunity offered by IP LAN/WAN convergence. For that reason, decisions about architecture and control are absolutely crucial.
Capacity upgrade A third potential case for a move to IP in production,
arises when broadcasters are running out of capacity on their baseband network, e.g. as a result of adding new studios or moving to 4K-SDI. The temptation is to upgrade the SDI infrastructure. However, an alternative approach is to keep the existing infrastructure and add capacity using IP. This presents a great opportunity to begin a migration to IP incrementally. These SDI/IP hybrid networks essentially require two main components to be successful: media nodes that can act as the gateways between SDI and IP streams, and an orchestration and control system that can route flows seamlessly between sources and destinations across both networks. While this use case is still relatively uncommon, we are likely to see a much greater uptake in the near future.
Development of Standards Standards for the essence-based transport of synchronized media over IP have made great strides in the last three years, with the
development of SMPTE ST 2110. Despite this, there is still some skepticism amongst some as to whether these standards are mature enough for deployment in production, and, as a result, whether any move to IP should be delayed if possible. This view, however, is not accurate, as there are now many live projects based on SMPTE ST 2110 workflows. It is worth noting, however, that the video part of SMPTE ST 2110 (-20) is not yet universally implemented by vendors, so using SMPTE ST 2022-6 to carry the video streams in a SMPTE ST 2110 environment is also a perfectly acceptable approach initially. SMPTE ST 2022-6 can be swapped for SMPTE ST 2110-20 when appropriate.
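The essence-based model of ST 2110 carries video (-20), audio (-30) and ancillary data (-40) as separate flows that can be routed independently. A rough comparison of nominal per-essence bitrates (illustrative figures, not from the article) shows why this matters: the audio flow is orders of magnitude lighter than the video it accompanies, so an audio-only destination no longer has to receive the full embedded-SDI signal:

```python
# Nominal per-essence bitrates in a SMPTE ST 2110 system (illustrative).
video = 1920 * 1080 * 2 * 10 * 60   # ST 2110-20: 1080p60 10-bit 4:2:2 active picture
audio = 16 * 48000 * 24             # ST 2110-30: 16 channels, 48 kHz, 24-bit PCM
print(f"video: {video / 1e9:.2f} Gbit/s")   # ~2.49 Gbit/s
print(f"audio: {audio / 1e6:.2f} Mbit/s")   # ~18.43 Mbit/s
print(f"ratio: video is ~{video // audio}x the audio flow")
```

These are raw payload rates, ignoring RTP and IP overheads, but the asymmetry is the point: separate essences let the network, and the orchestration layer above it, treat each flow according to its own needs.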
Network Architecture & Control It’s been mentioned before in this article, but it’s worth mentioning again: getting the network architecture and control right is absolutely fundamental to the success of IP production networks. It’s worth remembering that broadcast production
networks are fundamentally different from the enterprise and telco networks that IP has traditionally been used with. Uniquely, broadcast networks have to carry a very high volume of data, with very low latency and zero failure, and with source and destination switching happening at a very high frequency. Few other industries (if any) have such a combination of demanding requirements. Get the architecture and the control wrong, and you can easily end up with a gridlocked network! Ultimately, nothing replaces know-how and experience in building IP networks, with showcase projects acting as beacons for others to follow. As broadcast production technologies become more advanced within the industry, it is critical to engage with experts from the start of any IP-based transition.
A step forward toward platform convergence
Author: Daniel Esparza

We are witnessing a scenario of media convergence. TV, radio, the Internet and social media are no longer viewed as separate, isolated areas within a company dedicated to information. Here lies one of the most relevant trends in news production: the creation of unified editorial offices with multimedia journalists capable of publishing and adapting content for various platforms. As regards technology, this new paradigm requires the existence of
integrated multimedia systems that are responsive to a novel workflow. But, at a global level, this also means a change in mindset amongst editors themselves, something that is not always easy to achieve, particularly in companies with a long tradition whose staff are used to other ways of working. At any rate, there seems to be general agreement that broadcasters will have no choice but to adapt to these new rules of the
game if they want to survive. It is clear that users wish to view content in a different manner, so information must also be produced and conveyed in a different way. In this context, the use of new tools is gaining ground, one of them being the smartphone. This is where the notion of Mobile Journalism (MOJO) stems from. Technological developments have provided journalists with the ability to record, edit and share multimedia content with the editorial
office or with the audience in a completely autonomous fashion by means of their mobile phones. Therefore, MOJO puts a kind of miniature production studio at journalists' fingertips. Additionally, this tool promotes digital innovation and drives multi-platform creativity. Previously, the quality that could be obtained with a mobile device fell far short of what could be achieved with professional equipment. This gap has narrowed considerably, to the point where, in some instances, it is impossible to tell the difference. At any rate, this should not mislead us: this is not about replacing one tool with another, but about widening the range of production possibilities. This is just what new technologies achieve and, in particular, Mobile Journalism. Thanks to MOJO, editors are able to reach the place where news is happening with an
Users are willing to view contents in a different manner, so information must be produced and conveyed also in a different way.
immediacy that would otherwise be impossible, even though coverage
may be subsequently backed by means of professional production
equipment. In this era of social media, this immediacy factor becomes a must. Furthermore, MOJO allows reaching otherwise inaccessible places and providing takes that could never be achieved through traditional professional equipment. And let us not forget cost savings that
result for a medium by carrying out part of its production through a mobile phone.
Challenges and limitations in the use of mobile phones However, this tool has, of course, a number of limitations. One technical challenge is image
stability. Manual recording with a mobile phone is shaky and offers a rather unprofessional outcome. It is true that many smartphones feature an image stabilizer, but the use of a tripod is recommended as well. Another challenge lies on the audio side. A smartphone's microphone can record high-quality audio, but keep in mind the distance between the mobile and the speaker in, for instance, an interview (which should not exceed one metre), or the presence of adverse wind or noise conditions. An external microphone is the solution to both issues. Lighting, on the other hand, is not so easy to sort out. One of the biggest challenges when recording with smartphones is the amount of noise seen in dimly lit environments. Lens quality is another issue worth considering. It is important to check that the mobile phone has optics capable of capturing high-resolution
MOJO makes a miniature production studio available to journalists.
images. Frame rate is yet another point to take into account, especially if recordings are intended for TV broadcast. And the issue of format has to be addressed as well. Transcoding is a real headache for TV stations, which are forced to take materials from agencies and other sources in a wide array of formats. In this regard, smartphones only add further complexity to a scenario that is so typical of current times.
Additional equipment Once the above has been duly taken into account, we may wonder: what additional equipment should an editor carry when recording with a mobile phone? There are some keys to answering this question. To ensure image stability, a useful gadget is, as already remarked, a tripod. A mobile phone is not as heavy as a TV camera or a DSLR, so a light, affordable tripod will suffice. A feasible alternative is the use of a monopod or selfie stick. For fitting the smartphone onto the tripod, an adequate mount is, of course, required. We also said that the use of an external microphone is advisable. Here the range of options is wide, depending on the situation intended for recording. For
interviews in noisy environments, a clip-on or lapel microphone attached to the subject's clothing is an optimal choice. For recording two people simultaneously, a double lapel microphone may be an option. Yet another option is using two lapel microphones connected to a dual adaptor. To control the voice level of each microphone, an audio mixer for smartphones can be used. It is highly recommended, at any rate, to carry an extension cord as well. If the editor is alone and has no external microphone at hand, the microphone integrated in the headphones plugged into the mobile device may also be used. If the distance from the subject is significant, or the subject is moving, a wireless recording system will be required, although alternatives do exist. For example, in order to avoid extension cords and wireless systems, the journalist may resort to using a second phone, to which a lapel microphone attached to the interviewee's clothing would be connected. This way, the
The basic kit for a ‘mojo’ editor comprises a smartphone, external microphone, extension cord, tripod and mount.
editor may record the scene from the desired distance and then put the two recordings together during the editing stage. For other instances, there is, of course, the option of using a hand-held microphone. For dimly lit environments, the journalist may resort, as in any other production, to artificial lighting. There are systems specifically designed for smartphones, such as the iblazr 2, a wireless LED flash light (https://concepter.co/iblazr2/).
Mobile World Congress and NAB Show will be two key events for getting to know the incoming novelties for this technology.
In regard to lenses, the editor may also use external optics fitted into the phone. Brands such as Olloclip (https://www.olloclip.com) or ExoLens (https://exolens.com) provide specific solutions for this purpose. Many of these
accessories are, indeed, optional. Although each particular professional may want to customize the equipment, we could say that the basic kit for a ‘mojo’ editor comprises a smartphone, external microphone, extension cord, tripod and mount.
Technology for the future

MOJO is a technology for the future, as shown by the fact that it is being applied in communications media globally, both large and small. Instances of all kinds can be found. Canada's public TV and radio corporation CBC, Indian news channel NDTV, the French Léman Bleu and regional Dutch TV Omrop Fryslân have encouraged all their journalists, or at least a part of them, to use smartphones as a production tool.

Irish public television RTE launched its own initiative focusing on this kind of journalism. With the aim of increasing its presence on social media, RTE produced, solely with mobile phones, a series of feature articles targeting the audiences of these platforms, on issues not covered by the agenda of this general TV company. The channel's drive for MOJO was not limited to this, as it also produced a 4K documentary using an iPhone. The BBC, yet another example, has also used this tool on many occasions, such as in the coverage following Donald Trump's victory in the US presidential elections. One of the broadcasts, intended for Facebook Live, featured two journalists. One of them used his own smartphone for recording, while the other conducted live interviews with a hand microphone and used a second phone for monitoring and responding to comments posted by Facebook Live users.

The next Mobile World Congress in Barcelona (25-28 February) and the Las Vegas NAB Show (from 9 April) will be two key events for getting to know the incoming novelties in the field.

The case of GLOBO

We asked Brazilian television GLOBO, one of the largest media companies in the world, about its experience with this technology, after recently interviewing its CTO, Raymundo Barros (https://issuu.com/daromedia/docs/tmbroadcastmagazine62/34). This is what they told us:

How are you using the MOJO technology?
We are using it in hard news and documentaries, as the profile of these programs, as well as of the professionals involved, is more suited to taking on such technology. In these cases, mobile journalism is a differential.

What main advantages do you think it poses for broadcasters?
Mobile journalism enables greater capillarity, increasing the presence and the agility of journalistic coverage. It also allows for a less intrusive, more informal and often more modern way of telling stories.

What are the main disadvantages you think it has?
It is very suitable for hard news coverage and documentaries, in which the quality of the audio, video and lighting are not the most relevant aspects. In more complex situations we still need to rely on more sophisticated ENG crews.
Today we are testing the Canon XF705, a one-inch-sensor camera oriented to professional broadcast capture. While it is true that, in general terms and for some years now, content has prevailed over technical means, there are countless situations in the profession in which attaining certain content requires the appropriate tools. This is the case with the Canon XF705, a professional camera featuring a zoom lens and manual controls, with 4K recording at up to 50/60 fps, and 100/120 fps in HD. By Álvaro Bernal
It is a professional camera thought out for very diverse jobs. Having a 1-inch sensor enables two things that are very much in demand. One is some margin for using depth of field in a narrative manner, taking the focus where the story requires; the other is performance under dim lighting. It is obvious that nothing is perfect, but such a sensor size and current technology enable some degree of selective blurring, especially in mid- and long-distance zooming, as well as improved performance in low light compared to cameras equipped with smaller
sensors. For the time being, these sensors have neither the shallow depth of field of 35mm and full-frame sensors, nor their luminosity, but anyone acquainted with the subject knows that many contents are impossible to record with those cameras, as the reality in front of them will not allow such usage. This is where 'broadcast' cameras (it is no longer easy to find a definition to everyone's taste) come into play: documentaries, features, news, etc. Being realistic and based on current expectations, many contents require several types of cameras. We see in many TV
programs large-sensor cameras being used for the fictional or controlled part; one-inch sensors, as in this case, for follow-ups, far-focus planes, action takes, etc.; stabilized cameras; minicameras; and so on. What defines the XF705 is its zoom lens, with a 25.2-382.5mm range in 35mm-equivalent terms and an f/2.8 aperture that drops to f/4.5 at the maximum telephoto position. Naturally, zoom, iris and focus can all be controlled manually. All these parameters can also be managed automatically, with excellent performance. Each
situation requires one working choice or the other. Therefore, we have 15X magnification in 4K, which becomes 30X in HD, with excellent digital conversion. The times in which a digital zoom would ruin the image are gone, and in HD it is already 100% functional. Image stabilization is optical,
featuring 5 axes with various operating modes. Although it is true that camera stabilizers are present in nearly all production work, setting a camera the size of the XF705 on a gimbal would be mostly of little use. Other cameras would perform this task better. Well, what the stabilizer
on the lens aims to achieve is that, in the normal use of this camera, we will have a valid image with plenty of zoom when walking behind someone, inside a car, on a ship, etc. Lens quality can be classified as Canon L-series, and yes, there is a noticeable difference compared to, for instance, its little sibling, the 400/405, in regard to zoom performance. Here the resulting image is bold, with good definition in the longer zoom ranges.

The camera's body features buttons for 1/4, 1/16 and 1/64 ND electronic filters. These work swiftly and, like all other buttons on the camera, are big, solid and well placed. As we have already mentioned, the camera records up to 60p in 4K, using for this purpose a particularly efficient file format, the recent H.265. Times are changing, and in the coming years we will see files decrease in size while overall quality increases. Canon has really banged its fist on the table and is now capable of recording 4K at 60p in 4:2:2 10-bit on economical SD cards. That is how things stand. Not everything is perfect, as at the outset only very few editing software packages supported this codec, and therefore conversion is required. As we well know, this situation will get back to normal in a few months.

As can be expected, we can set LUT curves in the camera that enable us to see the reality of what we are recording when we use the Log recording option. The issue is certainly well resolved here.

Needless to say, this camera offers many other formats, but I deem it advisable to focus on what it does when set at top quality. As was to be expected, we have HDR and Log curves. Essentially, the idea is that the XF705 will blend perfectly into a workflow with the C200 and C300, for example. It is worth noting that we will be able to extract the camera's full potential into external recorders through its HDMI and 12G-SDI connectors. We should be really thankful for a "comfortable HDR curve" (HLG) that enables getting much of what HDR is good for without further complications in the editing process. Be reminded that a Log workflow is not always possible, as it requires good control of what lies in front of the camera, and many situations in which the XF705 gets the job done are like that, so this curve will provide us with the sensor's best. We have all experimented with Log recording and have then been let down, as we thought everything would be recoverable in the editing stage. As can be expected, we can set LUT curves in the camera that enable us to see the reality of what we are recording when we use the Log recording option. The issue is certainly well resolved here. The image we get is really worth the price and size of the camera, as we are very much subject to
performance of smartphones and small cameras, which are always shown under the best luminosity and light-direction conditions. There is life beyond fantastic dawns with the sun at our backs, though. Real work is usually performed under very low-contrast or very high-contrast conditions, and it is in such instances that cameras such as the XF705 allow for results meeting
professional expectations. Something that is highly noticeable, even if we are not professionals, is dynamic range. The sensor, while not reaching the same level as larger ones, manages such situations decently. As we have said, recording is done on SD cards. The camera has two slots, and we have the choice of duplicating the recording or storing
materials in the second card once the first one is full. This is very useful in long recordings such as concerts or sports events. Performance in dim light is quite satisfactory: even though it is not as overwhelming as with large sensors, there are no artefacts or grainy images with distorted colours. I especially liked the touch screen: 4 inches, set well forward, nearly at lens level, and able to rotate to both sides. Because it sits so far forward, we can record while looking at the screen very comfortably and focus using the fantastic Dual Pixel autofocus system. The truth is that I love that way of focusing: you can touch on 80% of the screen and change the focal plane very quickly. Only in low-contrast conditions did the system prove lazy or erratic. There are several autofocus modes; the speed at which focus switches planes can be modified, face tracking is available, etc. Autofocus is a feature that is already functional
professionally speaking, even for the most sceptical users. It is a system that progresses steadily and, one way or another, is now present across all manufacturers. The camera's viewfinder is also very functional: large, very sharp and quite reliable. The body is robust and features a 1/4-inch accessory thread on the grip, plus everything that is to be expected, such as plenty of shortcut buttons for the main features and other buttons for assigning the most frequently used tasks that have no default button. Audio is well implemented: an XLR microphone connector on the grip and another on the back, together with the video output, headphone connector, etc., all of which is
well thought-out, and all connectors are safely protected. The camera weighs about 2.5 kilos. Ergonomics are just alright, as all manufacturers share the same issues: it is perfect on a tripod but exhausting when handheld. The XF705 has a small support on the rear which I expected to be adjustable, since moving it just a bit would let the camera rest on the shoulder while framing with a screen that sits so far from the eye; a real shame. Undoubtedly, the accessories market will sort this out. Lastly, it is worth highlighting that, in keeping with its professional nature, the optional RC-V100 remote control can be used, and files can be transferred in real time via FTP over cable or Wi-Fi.
In this issue: a special report on IP-based production, Mobile Journalism, the Canon XF705 in Test Zone and much more!