THE POWER OF THE CLOUD
HOW THE CLOUD PLAYED A MAJOR ROLE IN AMAZON’S THE RINGS OF POWER
Doctor Who’s visual effects
From Big Top Circus to Broadcast Circuits
This issue marks the sixth anniversary of my appointment as editor of TVBEurope. The time has flown by! I came in not really knowing much about the technology side of the TV industry; like most people, I was aware of certain kinds of cameras or sound equipment, but I needed to learn about things like IP or cloud. It was a very steep learning curve for the first six months.

The great thing has been how the industry helped me to understand. From the PR people I work with every day to everyone I have interviewed or met over the last six years: thank you so much for bearing with me and sharing your knowledge. It’s one of the reasons why I love doing this job, the fact that so many people are willing to give up their time to explain complicated technology to me so that I can write about it for our readers.

It’s funny to think that back in 2017 new technologies such as virtual production weren’t even being talked about, and now it’s something that everyone is becoming more familiar with. I sometimes wonder: if The Mandalorian hadn’t been released in the UK and Europe right at the start of the pandemic, would everyone have been so interested in the way it was filmed? Having that captive audience when we couldn’t leave our homes probably helped pique people’s interest. And now it’s becoming part of the mainstream; just look at our last issue and how Coronation Street is using the technology.

Another game-changer that has become ever more important to the industry is the cloud. It’s not just about storage or moving files anymore. The cloud is being used for editing, sound mixing and post production. I was lucky enough to talk to The Rings of Power producer Ron Ames about how the show employed the cloud from pre- to post production. Is that the way forward for the whole industry? I guess we’ll need another six years before we can fully answer that question.

I wonder what we’ll all be talking about in another six years? Will traditional broadcasters still be operating over-the-air, or will everyone have moved to the internet? It was interesting reading Tim Davie’s recent comments about how the UK media industry needs to come together as it transitions to an “internet-only” world. That’s all well and good, but there are still plenty of viewers who don’t watch TV over the internet. What does that mean for them? Will we even need the internet? Will distribution have moved to something like 5G? If you have any thoughts on that, do drop me a note; I’m always keen to hear what you think.

Anyway, thanks for the last six years, it’s been so much fun. And here’s to the next! n

JENNY PRIESTLEY, EDITOR
JENNY.PRIESTLEY@FUTURENET.COM
@JENNYPRIESTLEY

www.tvbeurope.com
FOLLOW US Twitter.com/TVBEUROPE / Facebook/TVBEUROPE1

CONTENT
Editor: Jenny Priestley jenny.priestley@futurenet.com
Graphic Designer: Marc Miller
Managing Design Director: Nicole Cobban nicole.cobban@futurenet.com
Contributors: Kevin Emmott, Peggy Dau
Group Content Director, B2B: James McKeown james.mckeown@futurenet.com

MANAGEMENT
Chief of Staff: Sarah Rees
UK CRO: Zack Sullivan
Commercial Director: Clare Dove
Head of Production US & UK: Mark Constance
Head of Design: Rodney Dive

ADVERTISING SALES
Advertising Director: Richard Hemmings richard.hemmings@futurenet.com

SUBSCRIBER CUSTOMER SERVICE
To subscribe, change your address, or check on your current account status, go to www.tvbeurope.com/subscribe

ARCHIVES
Digital editions of the magazine are available to view on ISSUU.com. Recent back issues of the printed edition may be available; please contact customerservice@futurenet.com for more information.

LICENSING/REPRINTS/PERMISSIONS
TVBEurope is available for licensing. Contact the Licensing team to discuss partnership opportunities. Head of Print Licensing Rachel Shaw licensing@futurenet.com
Producer Ron Ames tells Jenny Priestley how The Rings of Power pushed technology boundaries to take viewers back to Middle-earth
Netflix’s Jeff Shapiro and Peter Cioni explain how the cloud is helping the streamer create compelling content
Director Dwight Steven-Boniecki details how he used 50-year-old footage to create a film about America’s forgotten space station
Visual effects supervisor Dave Bannister talks to Jenny Priestley about DNEG’s work on The Power of the Doctor
AE Live’s Scott Marlow reveals how the company created a virtual studio seen by millions of viewers (but not those in the UK)
Kevin Emmott reports on how BBC Studioworks is investing in the future of the Scottish broadcast industry with one of Glasgow’s most historic buildings
Michael Geissler, CEO of Mo-Sys Engineering, explains why there are currently endless possibilities for the broadcast and film industries thanks to the emergence of virtual production
Virtual production is undoubtedly one of the most fascinating technology developments shaping the future of post production. It brings a huge number of efficiencies to VFX workflows, and not just via The Mandalorian-esque LED screens driven by game engines.
Today, VFX supervisors and filmmaking creatives can turn to a host of complex toolsets to visualise landscapes and characters – or any other onscreen element – both before and during principal photography.
But it’s not just for post; virtual production offers clear benefits right across the production workflow. Set designers can easily see what locations might look like with a whole host of different colour tiles or brick-types without having to build or draw options. Stunt coordinators can use prevized scenes to help block the action. And on top of all of that, producers can see how these efficiencies impact their bottom line.
But what about editors? How do we see virtual production impacting the people responsible for turning every crew member’s contribution into a finished film?
One of the key benefits virtual production offers editors – whether it’s for film, TV or advertising production – is the ability to be more involved onset or on-location.
On-set editing is something many editors have long taken advantage of, since seeing action being shot and starting rough cuts as soon as rushes are captured enables them to ensure there’s the right level of coverage. VFX supervisors and creative directors say the true benefit of this is that it allows artists and filmmakers to make creative decisions earlier. This immediacy can have a similar impact on editors, taking on-set editing much further. With the right virtual production workflows in place, editors can start rough cuts straight away on huge VFX-driven shows. They’re able to immediately see what the final images will look like, even if they feature fully-CG characters or entirely fabricated worlds; assets that they historically might not see for another few months.
Virtual production will also have a huge impact on asset management, opening the window for creativity wider than ever. Asset management will no longer revolve around handling rendered or rasterised audio and video during the final editing stages, as editors won’t need to wait to receive final rendered VFX shots. Instead, virtual production enables a much more collaborative process between editors and VFX studios or artists.
With virtual production, editors can now manage in-progress visual assets, 3D models or VFX textures in context of the video they are associated with. Having access to all these elements, and the ability to render them throughout the entire post production process allows editors or filmmakers to make creative decisions later. This might involve changing a location via a virtual background or DMP, altering lighting, the time of day of a scene, or even a camera angle.
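To make that idea concrete, here is a minimal sketch of a shot record that carries its in-progress assets alongside the editorial clip, so a background can be swapped as a late creative decision. All names and structures here are hypothetical illustrations, not drawn from any specific editing or asset-management product:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Asset:
    """An in-progress visual element attached to a shot: a 3D model, texture, or DMP."""
    name: str
    kind: str
    version: int = 1


@dataclass
class Shot:
    """An editorial shot that references its live source assets, not just rendered pixels."""
    shot_id: str
    assets: List[Asset] = field(default_factory=list)

    def swap_asset(self, old_name: str, replacement: Asset) -> None:
        """Replace one in-progress asset, e.g. changing a virtual background late on."""
        self.assets = [replacement if a.name == old_name else a for a in self.assets]


# A location change becomes an asset swap and a re-render, not a reshoot:
shot = Shot("ep1_sc12_sh040", [Asset("forest_bg", "virtual-background")])
shot.swap_asset("forest_bg", Asset("coastal_bg", "virtual-background"))
print([a.name for a in shot.assets])  # ['coastal_bg']
```

Because the shot still points at live assets rather than baked-in renders, the editor retains access to every element right up until final render.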
This widens the creative impact an editor can have in the editorial process. It’s a game-changer, enabling editors to have a serious creative impact on a project without having to request reshoots or additional photography. That added creative flexibility also shortens the time needed to create the biggest blockbusting films.
If virtual production allows VFX supervisors and creative directors to start making decisions much earlier on in previz and on-set, it also does for editors and post-production supervisors. Add to that the ability to make changes to visuals later into the process, closer to a final render, and we’re seeing a real shift in the timeline being used in the creative process.
And to top this all off, these changes can be achieved without the time, personnel and power resources needed to constantly render shots to see how they look in a cut, and then send them back for amends, only to be rendered again. It’s not just creative efficiencies we see from the deployment of virtual production, but practical and economic ones as well.
When it comes to virtual production, then, what was the talk of only VFX studios and filmmakers like James Cameron just a few years ago is now a much larger and more complex beast.
Virtual production workflows, techniques and technologies are now impacting more of the filmmaking process; from the earliest stages of preproduction to the latest stages of post production. And we’re excited to see the flexibility and creative freedom it will continue to offer editors. n
The Rise Mentoring Programme enters its sixth year in 2023; it has supported in excess of 250 women globally across the UK, EMEA, APAC and North America.
The programme has developed significantly during this time, won awards and, importantly, garnered the support of incredible mentors from across our industry. But I think the power of mentoring can still be underestimated; as one exec recently said to one of our mentees, ‘It is just for your personal development, rather than a benefit to the company’. That is a very short-sighted and narrow view of what can be a professionally life-changing experience. It also assumes that only the mentee benefits from the process, rather than the mentor and the wider network of people involved in the programme.
The idea for the Rise Mentoring Programme started from a passion of not wanting women to leave the industry; we wanted to ensure that we retained, promoted, put a spotlight on and championed women across the sector. As it turned out, mentoring can, and is, so much more than this.
From building confidence to salary negotiations, promotions and working through redundancies, the mentoring programmes have supported women through these scenarios and much more. Many of them wouldn’t have had the outcomes they did without their mentor’s guidance.
One of the critical elements has been around salary negotiations and promotions. It is easily assumed that equity has been achieved on this and all genders receive equal parity when it comes to pay. But I have seen this still isn’t the case. At least one woman on the programme has reported every year to date that after a promotion they haven’t received the pay recognition to go with a new job title, with many more reporting that they know they are not on a par financially with male colleagues. One particular mentee delivered a presentation to her managers about why she should receive equal pay with her new responsibilities and role; they said no to her request three times. Had she not had her mentor supporting her, she is clear that she would not have fought for herself and the pay she deserved. They eventually did say yes; she has since flown in her career.
There is a reason why this year’s forthcoming International Women’s Day theme is #embraceequity: there is still a lot more work to be done.
One of the key elements of the Rise Mentoring Programme is that it has four strands:
• The 1-2-1 mentoring with a designated mentor
• Access to industry events (i.e. DPP Leaders Briefing)
• A training programme (personal branding/stakeholder management/LinkedIn)
• Regular meet-ups with fellow mentees and mentors
The regular meet-ups with all of the mentees can be a linchpin for many taking part in the programme. Having a group of 50 people can be enormously helpful when working through difficult situations. Over the years a few women have been made redundant, but by utilising the extensive network of mentees and mentors, they have all had multiple job opportunities to choose from within a very short space of time.
But perhaps the life-changing nature of mentoring can be better described by the mentees themselves:
“The programme has been a very transformational experience. I feel as if a million doors have opened for me”

“It is as if I have become a new person, with a much clearer sense of what I want to accomplish”

“The programme has given me wings, and a knowledge and network that will stay with me wherever I go from here”
It can seem an exaggeration to state mentoring can be life changing, but without a doubt, we see this happen every year.
Our Rise Mentoring Programme will be open again for applications on 8 March, so for those thinking about applying this year, remember:
1. Mentoring helps you build a network of like-minded peers across the industry that you wouldn’t have had exposure to before
2. There are training opportunities underpinning the programme to support you with: public speaking, personal branding, social media, stakeholder management, and more
3. You will feel encouraged to apply for new roles or gain promotions
4. You will understand the wider sector and where future career paths might lie
5. You will make new connections within the industry, not only through your fellow mentees, but also through the mentors and their extended network
6. You build a camaraderie with your fellow mentees where all issues and problems can be shared in a trusted environment. They can advise and support you through difficult situations
7. You have one dedicated person within the industry who is there to support and advise you solidly for six months; when else does this happen?
8. You gain access to industry events and activities that perhaps you wouldn’t have done previously
For those of you who are interested in our work around mentoring or other aspects of Rise, please do get in touch: carrie@risewib.com n
TVBEurope’s website includes exclusive news, features and information about our industry. Here are some featured articles from the last few weeks…
WHAT’S IN STORE FOR THE MEDIA TECH INDUSTRY IN 2023: LIVE, VIRTUAL, AND REMOTE PRODUCTION PREDICTIONS
TVBEurope kicked off 2023 by inviting a number of key industry executives to share their thoughts on what the future holds across a series of topics, including production, streaming, the metaverse, cloud and audio. n
https://bit.ly/3XPFljw
OPERATORS NEED LOW LATENCY, BUT ULTRA LOW? MEH!
Ben Schwarz explains why the majority of use cases don’t need ultra-low latency, except maybe during the World Cup’s four-week duration, once every four years. n
https://bit.ly/3XzKtbU
Rohde & Schwarz’s Mohammed Aziz Taga believes linear broadcasting still has a strong future, but it makes sense for broadcasters to seek revenues from new opportunities which can piggy-back onto that infrastructure. n
https://bit.ly/3ZSSPwV
WHY CAMERA SHADING HAS NEVER BEEN MORE IMPORTANT
Telestream’s Bruce Lane argues that in the not-too-distant past, productions could occasionally get away without shading cameras to perfection for live events. In the new HDR/SDR world we’re living in, it has become a necessity. n
https://bit.ly/3HrAX4Z
MEDIA’S NEW CEO DISCUSSES THE COMPANY’S PLANS
James Arnold talks to TVBEurope about his new role, plans for the company, and why broadcasters need to evaluate their needs as they move from SDI to IP.
https://bit.ly/3JcwQeq
The latest Quarterly Global FAST Report from Amagi reveals that viewing of free ad-supported television (FAST) is continuing to grow across Europe, with over 50 per cent more hours viewed during Q3 2022 compared to the same quarter the previous year. n
https://bit.ly/3Xy7HPB
‘WE WANT TO BE THE LEADING MANAGED SERVICES COMPANY IN EUROPE’ RED BEE
The first season of Amazon’s The Lord of the Rings: The Rings of Power employed brand new ways of working, both on-set and off. Producer Ron Ames tells Jenny Priestley how the show pushed the boundaries to take viewers back to Middle-earth
Towards the end of 2022, Prime Video finally released its new Lord of the Rings TV series around the world. Set thousands of years before the events of Peter Jackson’s epic movies, The Rings of Power tells the story of the Second Age of Middle-earth, focusing on the rise of Sauron, alliances between elves, dwarves and men, and the forging of the titular rings.
Amazon first announced it had signed a deal with JRR Tolkien’s estate to develop the series in late 2017, reportedly paying $250 million for the rights. The company brought in JD Payne and Patrick McKay to act as executive producers and showrunners. Among season one’s producers was Ron Ames, whose previous credits as visual effects producer include Star Trek Into Darkness, Avengers: Age of Ultron and Shutter Island.
Ames has worked with Lindsey Weber on a number of projects, and when she joined The Rings of Power she asked him to come on board as well. “We had worked on a number of technology-forward shows, myself as a producer, and Lindsey as an executive producer,” explains Ames. “She
called me and asked if I would come with her and be part of the process on the show, and I said ‘I would love to’. The show was going to be so big and how we were imagining it and how the showrunners JD and Patrick were imagining, we knew there were going to be some technological challenges.”
Ames joined the project as a producer of “everything that had a technological aspect to it”, from pre-visualisation to camera capture to post production. “I was an interconnect between all of the technical departments so that we were working and all rowing in the same direction,” he adds.
AN UNEXPECTED JOURNEY
Tolkien’s Middle-earth was created over decades, and anyone who has read The Lord of the Rings knows that maps are key to his storytelling. In fact, the author created a map of Middle-earth before he even began writing, and so the production team on The Rings of Power decided to follow in his footsteps.
“When you start a journey, you have no idea where you’re headed,” says Ames. “You imagine where you’re going, and you draw the rough outline of that map, and as you proceed you begin to get a better sense of the direction you’re heading in and it actually narrows as you go.”
For The Rings of Power team, that meant starting with every possibility and when things didn’t go the way they wanted, trying something else. “It is truly about being flexible and dynamic in your choices as you go but having an end goal,” continues Ames. “Our end goal was to use the latest technology to support art-making.”
The biggest challenge the team faced was that a lot of the existing technology wouldn’t enable them to create what they wanted within the timeframe that they had. That led to Ames working alongside the show’s visual effects producer Jesse Kobayashi to create an analytic that helped them work out what they could and couldn’t deliver.
“We could tell you when we would fail,” he explains. “We knew precisely when that was going to be, and the only way we could actually not let that happen was to spread the work out internationally, and find a way of getting it done in the time that we actually had. So that kind of forced the map upon us. We started with a picture and then it got clearer and clearer along the way.”
The key piece of technology that helped The Rings of Power team achieve their targets was the cloud. The series employed a cloud-based production workflow (see below),
using AWS as its key partner. From cameras on-set to visual effects to colour grading, all aspects of the cloud came into play during the making and distribution of the show.
The Rings of Power is set during the Second Age of Middle-earth

Three different directors worked on season one of the show, which meant all three had to get to grips with this new way of working. While it was important that each one was able to understand the workflow, Ames believes it was paramount that the producers believed in the technology as well. “Patrick and JD, our showrunners and creators, had full faith in the technology. They were there from the very beginning, and they said, ‘we’re going to commit to working in this way’,” he says.
“As the directors came in, they were all fascinated and interested in the technology. When I warned them that we wouldn’t have enough time to edit at the end of production and that we would need to edit as we go, they all said ‘OK’, and actually really enjoyed it.”
Ames credits each of the directors as being open-minded and interested in the technology. “All of them are very, very fine filmmakers, so they brought their best game,” he adds. “Some of them leaned into the technology more than others. Some actually asked if we could make a system for them to take home so they could play with pre-visualisation, and they would actually work on it at home. JA Bayona [who directed two episodes on season one] would shoot for an hour and shoot hundreds of shots and edit them with his team. He loved it. He embraced the technology.”
Talking of virtual production, there have been rumours that the show would follow in the footsteps of other high-end TV series and employ LED screens. Not so, says Ames. “We used Ncam, which is another form of virtual production.”
That means that as the actors walked on set in New Zealand, they could look at a monitor and see a game engine representation of Middle-earth. “For Númenor, ILM created virtual worlds for us that we put into a game engine and played on the screens. So while it wasn’t in-camera visual effects, it was visual effects or pre-visualisations of what the world was going to be. That meant the director, the actors, the camerapeople knew where to point the lens, it wasn’t just guessing that the sun’s going to rise over there, for example. We knew precisely where to light from.
“That informs your creative decisions,” Ames continues. “You are intentionally creating something so the actors know what they’re supposed to be imagining. One of my favourite scenes in The Aviator is Leonardo DiCaprio flying the plane into the house. What sold that visual effect was Leo’s performance. So our actors really believed where they were. They could look at the magnificence of Khazad-dûm on playback by going to a monitor and seeing what they were supposed to be looking at. It just makes us all speak the same language.”
The series was edited using Avid and conformed in Blackmagic DaVinci Resolve. Again, this was an area where Ames and his team ventured into the unknown, using Resolve in the cloud. “When we started, we reached
out to Company 3 and Blackmagic Design and said, ‘in nine months we’re going to need to do colour correction in the cloud, would you help us build that?’ They both said yes, and literally nine months to the day, we tested it and it worked.”
That allowed the show’s colourist Skip Kimball to work from his home in Idaho while production was on-going in New Zealand. “Because all of the files were in the cloud, with no latency, we could look at one colour grade on our monitor, Skip had the exact same monitor at home, and he could do the colour in real time,” says Ames.
The cloud also enabled the show’s composer to work remotely. Bear McCreary was based in Los Angeles, while the orchestras were in London at Abbey Road and Air Studios, and the vocals were in Austria. “All of the recording was first done by Bear in his studio with simulations, and then we had the orchestrations done in London,” explains Ames.
“Usually, you do the mix and then all the creatives talk about it, and then you bring the composer back in. But Bear would do a pre-mix, we would get his mix and then we’d all come together and listen to it and make our notes and then show it to the creatives. That meant they were getting the best of everybody’s intelligence working together. It was a fantastic way to work. And all of that was all shared to S3 buckets.”
The first season of The Rings of Power premiered in September 2022, with work on season two starting a month later. The show has moved its production base from New Zealand to Bray Film Studios, just outside London. Ames isn’t involved in season two; he says his family wanted him back home in Los Angeles, so he’s serving as “producer emeritus”.
Everyone involved in season two is using the tools developed for the first eight episodes. “I know for a fact that on day one, when the art department and the visual effects team on season two started, they had the benefit of everything we created in an S3 bucket that they could pull down using common tools and have a head start on everything that we had already created,” states Ames. “They have everything we’ve created, and now they’re creating fantastic new stuff.” n
At the end of 2022, Netflix announced a partnership with AWS to enable visual effects companies to use the cloud for render workflows. TVBEurope talks to Jeff Shapiro, director of production strategy, and Peter Cioni, director of production business development at Netflix, to find out how the cloud can help the streamer create compelling content
Jeff Shapiro (JS): We built workstations, compute and storage on the AWS backbone to provision to vendors and talent through a platform called NetFX. To put that in context, think of Uber: AWS and the environment we built on top of it is our car, and we’re looking for drivers. The drivers are artists; they’re the ones taking the passenger, which is a production, to the finish line. We show up with the cars, we’ve got a network of drivers, and we bring the shows in on time. We serviced more than 100 projects in 2022, across a broad network of individuals and companies. We’re hosting out of the Mumbai zone, and also US East 1 and US West 1, so we’re out of LA, Montreal, and Mumbai. We’ve been successful in being able to provide this giant safety net for our productions to be delivered in
time. Not every region is as mature as the US or the UK, to be able to just service the volume of work that we have. We’ve got a lot of work going on in Korea, which is a big market for us, and there are very few suppliers there. In cases where we have a large demand of work, we then have the spillover effect where we create what I call a ‘transformer vendor’; we get 26 vendors to plug-in to the same ecosystem to deliver thousands of visual effects shots in a very short period of time. It’s pretty novel. We’ve been using AWS since the pandemic started. It was a coincidence; we had wanted to build this before [but] the pandemic just shot it into the future. We have close to 2,000 individual users on the platform, doing a variety of services. We provision infrastructure through AWS and we also provide a host of software that gets mounted on these workstations for the artists.
Peter Cioni (PC): One of the most constrained resources right now in the vendor supply chain is visual effects; with increased demand, many more shows being made, timelines being extended, there’s just a lot of demand on visual effects vendors. We’ve seen challenges with the delivery of shows as a result of these crunches for resources. One of the biggest constrained moments in time is the rendering process; all the work is done, all the creative decisions are made, it’s just hitting the render button. All of the facilities have on-prem infrastructure, but in a crunch moment they may not have enough infrastructure to actually render all those shots and make the delivery. That’s where a cloud service provider like AWS is so powerful, because they have effectively unlimited compute capability. We recognise that by allowing and enabling more VFX vendors to leverage AWS infrastructure, the render process is not a cause for delay in delivery.
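The capacity argument above can be illustrated with a back-of-the-envelope sketch. The function and all the numbers here are hypothetical, purely to show why a fixed on-prem farm hits a wall in a crunch while elastic cloud capacity absorbs the overflow; nothing reflects Netflix's or AWS's actual systems:

```python
def split_render_load(total_frames: int, node_hours_per_frame: float,
                      onprem_nodes: int, deadline_hours: float) -> tuple:
    """Split a render job between a fixed on-prem farm and burst cloud capacity.

    The on-prem farm takes as many frames as it can finish before the
    deadline; everything else spills to the (effectively unlimited) cloud.
    """
    onprem_capacity = int(onprem_nodes * deadline_hours / node_hours_per_frame)
    onprem_frames = min(total_frames, onprem_capacity)
    return onprem_frames, total_frames - onprem_frames


# A crunch: 10,000 frames at 2 node-hours each, 40 on-prem nodes, 100-hour window.
# The farm can only cover 2,000 frames; 8,000 burst to the cloud.
print(split_render_load(10_000, 2, 40, 100))  # (2000, 8000)

# A quiet period: the same farm handles 100 frames with no cloud spend at all.
print(split_render_load(100, 2, 40, 100))  # (100, 0)
```

At comfortable volumes the cloud share drops to zero; in a crunch it absorbs the difference, which is why the render button stops being the delivery bottleneck.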
Many VFX vendors are already leveraging AWS for the cloud. But when we work with certain vendors, some of them have not yet been able to connect to AWS, for myriad reasons; it could be budget, it could be technical know-how, it could be just the staffing resources that they have. So we recognised that this was a bit of a barrier to some of our VFX vendors being able to deliver shows for us.
We partnered with Conductor, which is a software-as-a-service provider that leverages AWS infrastructure, and by working together, we’re able to introduce more VFX vendors to AWS’s capabilities. Conductor helps streamline the on-ramp to AWS. For vendors who don’t have the in-house teams or IT departments to develop special software, Conductor’s capability makes that process easier.
The benefit to Netflix is now we can put more work in the hands of visual effects vendors, and the rendering process of visual effects is no longer a barrier to those vendors getting the work done, which means we can put more shows through that same facility.
HOW DID THE PANDEMIC CHANGE NETFLIX’S VISUAL EFFECTS WORKFLOWS?
PC: In terms of long-lasting takeaways, or kind of permanent changes in the industry, what comes to mind first is distributed workforce; the comfort that people
now have with having a workforce everywhere. The idea of people working in different places all over the world and being OK with it, that’s a very long-lasting outcome. We have visual effects vendors who have talent all over the world. We’ve heard of examples where a VFX company that’s based in Montreal may have an artist or two based in Brazil. The empowerment and the enablement of technology makes that work. I think it’s pretty empowering that physical location is no longer a barrier to how big your workforce is.
JS: It’s a double-edged sword though, because now everyone that doesn’t have the right resources goes to Brazil, Mexico, or Korea. Netflix is a global company, we need those resources to work on local content. You can access anyone, anywhere now, but it also disrupts the local equilibrium.
JS: The pandemic thrust a lot of virtual production workflows into the limelight. We’re still learning a lot about using LED as a technology to help create digital backgrounds and reflections. I think virtual production is a tool, and you have to make sure that the content is written to use the tool. It’s not something that you want to force on a production.
Volumetric capture is, first and foremost, a big catalyst for us because it helps us with transmedia; scanning and capturing assets and moving them through linear content, games, whole worlds. I wouldn’t call it ‘metaverse’. That’s a loaded term. I would say 3D asset reuse and more interplay between the characters and environments we create on our IP and being able to share that in lots of different ways. I think we’re most excited about that. n
Future’s annual Best in Market Awards programme returned in 2022 to celebrate some of the most innovative products in the broadcast market. Here, we profile the winning entries
Accedo One enables global companies to deliver impactful video experiences across multiple devices. It provides the tools needed to easily design and manage a uniquely branded video business across all leading consumer platforms, empowering video service providers to get their offering to market fast by reducing the complexities of execution.
It enables video businesses to expand and monetise their services by integrating new features based on what its customers want and need.
Widely deployed for post, news and by other teams needing to collaborate remotely, Avid | Edit On Demand now includes SRT (Secure Reliable Transport) protocol, extending workflows with over-the-shoulder access for client review and approval to accelerate content completion. True to the requirement for high-quality video all along the workflow, the Edit On Demand experience is indistinguishable from on-premises editing set-ups. With SRT, over-the-shoulder workflows can be based on software, or a mix of hardware and software.
Blackbird is the world’s fastest, most powerful professional cloud video editing and publishing platform. Enabling remote editing, Blackbird provides rapid access to video content for the easy creation of clips, highlights and longer form content to multiple devices and platforms.
A fully-featured editor accessed through any browser, easy to learn and needing only limited bandwidth to use, Blackbird powers significant productivity and efficiency benefits for any enterprise organisation working with video. An ultra-green technology, Blackbird supports the carbon reduction goals of the media production industry.
Argo is a brand-new, fully modular, IP-native audio mixing control platform with a flexible control philosophy breaking the traditional geographic barriers between processing and control. It comes with interchangeable, configurable hardware panels designed to interact with a targeted implementation of Calrec’s time-served Assist UI. This means that whether you are working on physical hardware panels or on a remote GUI, the user interface is familiar and easy to drive.
Its modular panel system encourages broadcasters to adapt surface hardware to meet their unique requirements, with two mid-level rows of interchangeable panels on the Argo Q model, and one mid-level row on the Argo S model. Calrec has also introduced a comprehensive system of default templates to instantly change the user interface to meet broadcasters’ changing requirements and user preferences.
Taking control of high-value content is a trend among media-rich organisations, whether they are broadcasters, OTT providers, studios, brands, or sports teams and leagues. They want more autonomy and dexterity to monetise and extract the highest value from their content. With full control of content and historical archives, these companies can customise and deliver media experiences that deepen engagement and loyalty with existing viewers/fans while reaching new audiences. Successful production/distribution workflows rely on the ability to manage content libraries and leverage archives, which are often disconnected from the workflow. The Dalet Flex solution for Intelligent Archive enables customers to enhance their video strategy and media-savvy organisations to optimise their high-value content.
Harmonic’s software-based XOS Edge Advanced Media Processing solution combines the latest software processing and edge delivery technologies in an all-in-one appliance to streamline media workflows. The XOS solution significantly improves the quality of video streaming and broadcast channels, allowing service providers and broadcasters to deliver content in any format up to 4K HDR.
What is especially innovative about the XOS solution is a recently announced Single Illumination System (SIS) offering that ensures greater interoperability and cost savings for broadcast operators with digital terrestrial television and direct-to-home services.
The iconik Agent is a desktop companion app for the iconik web interface that helps users upload or download files for projects outside of the browser application, preventing interruption from browser crashes or WiFi issues. Not only does the agent streamline workflows for uploading, downloading, and file management in iconik, it solves a common pain point for creators and saves them from having to redo all the hard work they put into their project.
Today, media professionals are dealing with increasing daily pressures to be as productive as possible and produce high-quality content that is profitable. The application is intended to improve efficiencies by minimising the failure rate of larger file uploads/downloads, eliminating the need to open iconik in a web browser.
LiveU Ingest is an automatic recording and story metadata tagging solution for cloud and hybrid production workflows. It enables users to process video content faster by accelerating story metadata association, while cutting production costs.
With the increasing amount of live content, manual processes create bottlenecks in production workflows. Automation tools speed up this process, enabling the reallocation of human resources. LiveU Ingest automatically records every piece of live content while applying the respective metadata generated by the Newsroom Computer System. Storage costs are reduced because content is filtered and trimmed before being transferred to a MAM system, ensuring that only relevant content is moved.
Nimbra Edge is an intuitive open platform for ingesting, delivering, and distributing live streams to multiple destinations across different IP networks. The platform combines advanced stream protection and encryption mechanisms to ensure content integrity right from the source to all geographically distributed destinations.
The open infrastructure of Nimbra Edge is focused on reliability and performance, providing minimal operational overhead in cloud-agnostic environments. Built with a robust interface, it provides expandable functionality and integration with third-party solutions.
Designed for 3D, surround and immersive audio workflows, Halo Vision is a customisable, real-time visual analysis suite operating in up to 7.1.2 channels for AAX, VST3 and AU formats.
It features a variety of modules that provide audio professionals with a clearer understanding of every aspect of their sound, including Timecode View, Correlation Matrix, Correlation Web, Frequency Haze and Location Haze functions, Spectrum, and a True Peak meter for each channel.
These modules support engineers in their decision-making and troubleshooting process, allowing them to pinpoint problem areas that might otherwise be missed.
Pebble Control is a self-contained, scalable, and easy-to-configure IP connection management system built specifically to enable broadcasters to make the transition to an all-IP facility without the need to deploy a bespoke enterprise solution. It leverages full support for the NMOS (Networked Media Open Specifications) suite of protocols – produced by the Advanced Media Workflow Association to facilitate networked media for professional applications – and operates on web-based UIs. Designed to deliver immediate benefits to even the smallest IP facility, it interfaces with NMOS-enabled devices from multiple vendors on the network and is easily reconfigurable when interconnections change or when devices are added or removed, essentially providing plug-and-play capability for IP networks.
Signiant’s Media Engine is a modern media management service built into the Signiant Platform. Media Engine makes it easy for users to search and preview assets across all their Signiant-connected storage anywhere in the world, on-premises or in the cloud. Media Engine makes assets searchable by automatically indexing any organisation’s storage and extracting relevant metadata. When indexing is complete, users can perform a simple Google-style search across their storage. Once the asset is found, they are able to download it or move it anywhere in the world using Signiant’s best-in-class fast file transfer service.
What makes Media Engine unique is that, unlike a conventional MAM, there’s no need to move or ingest content.
Witbe’s Witbox+ is a next-generation breakthrough for testing and monitoring multiple streaming devices – including OTT boxes, smart TVs, mobile platforms, and more – addressing a critical need for video streaming companies.
Set apart by its scalability and compact form, making it the smallest test automation device on the market, the Witbox+ is also easy to set up. Users simply plug any physical device into the unit to automatically test and monitor any available service running on it.
A single unit is capable of testing and monitoring four different 4K devices simultaneously, while offering support for 5.1 surround sound, Bluetooth, and RF4CE control.
TRAXIS talentS is an industry-first AI-powered markerless stereoscopic talent tracking system that identifies the people inside the 3D virtual environment without any wearables.
Deploying talentS makes obsolete the beacons and wearables that talent previously had to wear to track their movement, giving performers greater flexibility and freedom of movement in virtual studios.
talentS also introduces a new level of visual fidelity to virtual production, enabling photorealistic shadows, reflections and refractions to appear in the correct location and position.
The Live Events Manager includes powerful new tools in ZEN Master to help organisations build and visualise event schedules. Video operations teams such as Amazon Prime Video are leveraging these capabilities to manage complex live event schedules at scale.
While ZEN Master has always provided a centralised control plane for managing, monitoring and orchestrating live event and linear channels, the new Live Events Manager adds sophisticated event scheduling dashboards and programmatic automation that have enabled Amazon Prime Video to significantly increase the availability of its live event programming.
WHAT WILL DRIVE INNOVATION IN 2023?
WILL STREAMING CONTINUE TO DISRUPT TRADITIONAL MEDIA?
THE NEXT STEP FOR THE STREAMING SERVICES
WHERE DOES IP ADOPTION STAND?
The past 12 months have been another fantastic period of innovation in the media industry, bringing huge leaps in the evolution of live production, virtual production and much more. We have also seen the media ecosystem evolve, with new streamers launching, existing streamers consolidating and established streamers taking the plunge and deciding to add advertising to their customer offering. This level of innovation and ecosystem evolution brings with it both commercial challenges and opportunities. We have chosen to apply a commercial lens to 2022 and reflect on three key themes that have been uppermost in our media clients' minds:
1. Shifting content spend: Will there be a slowdown in investment in content production? If so, how will the effects of this play out through the industry value chain? What might the impact to my business be?
2. Cost optimisation: In light of the challenging macroeconomic environment, how do I best optimise my cost base to keep my business positioned for growth?
3. Exploring Web 3.0: How much should my business be thinking about Web 3.0? Should my business have a strategy for the metaverse or for NFTs, and what might this look like?
Let’s consider each one of these in turn.
As commissioners’ key revenue streams (advertising, subscription revenue, and even the [BBC] licence fee) have come under pressure, attention has inevitably turned to the impact this might have upon top-line content spend in the near future.
While commissioners of all sizes are re-evaluating their current and planned content mix, our conversations with them have been encouragingly nuanced. Our discussions have been less about whether spend is going 'up', 'down', or 'flattening', and more about focusing content budgets on programming that not only delivers return on investment but also helps the content provider deliver its strategic objectives. Some providers are concentrating on smarter use of existing
content assets (e.g. greater content sharing); others are considering shifting spend on new commissions from higher cost-per-hour genres to lower ones.
Throughout these discussions, the importance of content and continued content investment remains paramount. The battle for eyeballs and attention is intense, with content still considered to be the primary competitive tool. So the outlook for 2023 is one of growth, albeit cautious, intelligent growth, which should provide some optimism for all those connected to content production.
We live in complex and challenging economic times. At EY-Parthenon, we view the current inflationary challenges through the framework of three uniquely interacting, and mutually reinforcing, ‘trilemmas’: rising household and business costs (money); pressures on access to affordable energy (energy); and increases in input costs (supply). These inflationary concerns have prompted many conversations about cost management, which no doubt will continue well into 2023.
There is no ‘one size fits all’ approach here. The levers available to a company will, of course, vary according to the company’s place in the value chain, their competitive positioning, the nature of their customer base and their key financial imperatives.
It has been encouraging to see the level of strategic thinking demonstrated by our clients when approaching this topic. Our conversations with broadcasters, for example, have centred on creating sustainable efficiencies through portfolio optimisation and workflow improvement, rather than on the blunt tools of cutting marketing and content budgets that were the go-to in previous economic crunches.
Technology will play a very important part in this story, and the rapid evolution cycle that began during the Covid-19 pandemic continues to deliver options such as the increasing viability of virtual production, remote production and collaboration, cloud workflows and IP delivery. As these cost conversations spill into 2023, we will continue to encourage businesses to think strategically and sustainably and to keep the customer, or viewer, at the core of their thinking.
Although the excitement around NFTs and visions of a metaverse-driven future has cooled a little, with macroeconomic factors taking centre stage, media companies continue to prepare for the next age of interactivity. Businesses are undertaking strategic planning, research and development, consumer research and technology investment so that they can be ready for the full emergence of the metaverse.
In 2022, we had numerous conversations with IP owners, especially sports teams and leagues, to help them understand the value of their brands and assets in this emerging new world. Our discussions have focused on how to build Web 3.0 into long-range business plans and, in particular, on how to tackle the practicalities of defining and executing a Web 3.0 strategy. A key concern for our clients is whether they should ‘go it alone’, form strategic partnerships, or simply take the notionally ‘easier’ option of licensing their rights to specialist third parties.
Some of the commercial and strategic decisions being made now will define the winners and losers in this space over the next few years. We expect these conversations to continue through 2023, with some businesses appointing dedicated metaverse ‘champions’ to explore potential future scenarios and drive innovation.
The three themes explored were the most consistent topics of interest in 2022, but a fourth theme has captured significant attention: the great leaps made in AI. With all the current talk about chatbots (including ones with the capability to write forewords to industry publications; please note that this piece was very much crafted by human hands!), it seems that we are on the cusp of a new phase in the development of artificial intelligence, particularly generative AI. This advancement is ushering in a world where algorithms can drive the creation of new content based on patterns in so-called ‘training’ data. Our industry has moved quickly from talking about demos to demonstrable use cases. So, what might this mean in the near future for the media industry and the technology which underpins it?
AI has the potential to drive transformative change in media technology and to drive the application of this technology across the content production value chain, including:
• Production: more powerful use of metadata and AI-created content, emerging 'on the fly' in broadcast, is already enabling new levels of creativity
• Post production: we are beginning to see small glimpses of AI’s potential in transforming content’s look and feel (take the de-ageing process of the cast in Martin Scorsese’s The Irishman) and its role in driving new efficiencies in the post-production process
• Distribution: where AI-driven technologies are optimising content distribution, driving greater efficiencies and allowing for enhanced monetisation
As we enter 2023, there will no doubt be many more questions to answer, which is what makes working in our industry so exciting! We look forward to reflecting in 12 months' time on another fascinating and innovative year.
“The battle for eyeballs and attention is intense, with content still considered to be the primary competitive tool”
When I came into the role of SMPTE president in 2020, in the midst of the pandemic, indicators of significant change were readily apparent. Already, the digital transformation of our media industry was in full swing. The pressures of the pandemic forced an accelerated shift to software, the cloud, remote work and remote production, and even more remote consumption of media. By the time we entered 2022, all of these factors had made an impact on SMPTE, as they had on every other volunteer-based, membership-driven association.
Due to distancing and travel limitations, the Society had been unable to host in-person meetings; we were forced to pivot and quickly become an organisation capable of reaching our members online. This was a real challenge, but we immediately created the new hybrid interactive and on-demand SMPTE+ events, each focusing on the latest advances in media technology, to continue providing connection and value to our members.
SMPTE delivered further value to members by offering educational tools and seminars that would be especially helpful to them during this time of drastic change. We’re proud of how we supported engineers, who turned future concepts into real-world solutions so that individuals could communicate with one another and collaborate across the globe to keep media on-air and in production.
Working closely with SMPTE Sections, we pulled ideas from the membership that would benefit the Society as a whole in dealing with new ways of
working. At the same time, we focused our ongoing education efforts more intensely on the emerging technologies driving the industry’s transformation. We extended the SMPTE Virtual Courses to cover not only the cloud but also the essentials of IP media transport, SMPTE ST 2110, imaging system fundamentals, colour-managed workflows, ATSC 3.0 and NextGen TV, networking, compression, DCP, and HDR workflows. More recently, we joined forces with MovieLabs and EBU to publish a Media in the Cloud primer on the use of ontologies and other semantic web technologies within the media domain.
Responding to nascent development in immersive media, in the world of on-set virtual production (OSVP) in particular, the Society created SMPTE’s Rapid Industry Solutions (RIS) programme, designed to give the Society an agile, responsive framework through which to address emerging industry needs. Drawing participants from across the industry –including greater representation from creative sectors – to curate knowledge resources, build partnerships, connect people and organisations with resources, and shape educational programming and training opportunities, the RIS programme made an immediate impact.
While programmes such as RIS are helping the Society engage more broadly with professionals across the industry, fresh engagement with SMPTE Student Chapters is fostering greater connection with younger members of our industry.
Several existing chapters have been reinvigorated, and new chapters have been created too. Just this
past autumn, three universities of applied sciences – the Hochschule der Medien (HdM), Hochschule RheinMain (HSRM), and Hochschule Hamm-Lippstadt (HSHL) – earned approval to launch Germany’s first SMPTE Student Chapter.
Other areas of SMPTE have grown – and grown more agile – as well. The SMPTE Executive Committee now meets more often, with shorter, more focused monthly meetings. Though the group has begun rotating time zones to accommodate membership around the globe, attendance has remained strong. Subgroups have formed to study the implications of new technologies and concepts including Web3 and the metaverse.
One of the larger challenges for SMPTE over the past year has been managing the transition to a new executive director. Any leadership change of such magnitude shakes up an organisation.
Barbara Lange was a wonderful leader for the Society, and we conducted a thorough search process to find a successor, a leader who could take SMPTE forward. In David Grindle we have found the right person for this role. (We are enormously grateful for all the work the SMPTE home office has done to see the
Society through day-to-day operations during this transition!)
As we move into 2023, and I move into the role of past president, SMPTE also welcomes Renard Jenkins as the Society’s new president. Renard is a talented and visionary leader – and founder of SMPTE’s Committee on Diversity, Equity, and Inclusion and its first chair – and as he shifts into this new leadership role with SMPTE, we’ll all be working together to continue building connection with the larger community behind the art and science of moviemaking and television.
As our industry and its needs change, SMPTE as an organisation must listen to what our members tell us about their own needs and experiences. For that reason, we expect that 2023 will be a year of assessment and change.
We will build services that are informed by data but also driven by our younger members; the future of the Society. We will focus on eliminating barriers, real and perceived, to participation in our core value products – standards work, education, and networking – and we will meet SMPTE members when and where they are with the tools and resources that help them settle into our industry's new reality and thrive.
“As our industry and its needs change, SMPTE as an organisation must listen to what our members tell us about their own needs and experiences. For that reason, we expect 2023 will be a year of assessment and change”
Key trends in Europe and the world indicate that streaming’s disruptive influence on video distribution will keep transforming the traditional video landscape. Netflix and Disney are among the streaming giants working on advertising-supported subscription tiers. Broadcasters are finding out that a digital-first strategy is critical to success. Meanwhile, pay-TV operators are integrating more of the essential OTT services, while also shifting from IPTV to OTT via third-party platforms for the content and technology.
The coming year will see Netflix and Disney securing their spots as the world’s leading streaming providers, with combined paid subscriptions potentially exceeding 84.5 million in Europe and 450 million across 200 markets. Warner Bros Discovery (WBD) and Paramount Global will remain on the podium but switch to supporting roles as they follow a more targeted approach to launching direct-to-consumer services in select markets.
In 2022, Netflix, Prime Video, and Disney Plus generated twothirds of total subscription streaming revenues in Europe. We expect that trend to continue in 2023, fuelled by price hikes as streaming providers push to mitigate rising operational costs. Regional media firms such as Viaplay Group and RTL Group are looking to scale as an answer to US media and tech group dominance. They have expanded throughout Europe and have grown their paid user bases by investing in local programming and key sports rights.
S&P Global projects RTL and Viaplay to outperform the total SVoD market in 2023 and grow at a 14 per cent annual rate versus 6 per cent for the industry.
With subscriber growth rates down to single digits for the overall SVoD market in Europe, international streamers such as Netflix and Disney are incorporating ad-supported tier plans into their strategy mix, following the example of local broadcasters and TV providers. The so-called hybrid subscription-advertising model is part of a broader trend in the streaming business that will become more prevalent during 2023. After a decade of fast growth, streaming video providers face intense competition from peers such as tech firms (Amazon, Apple, Alphabet), studios (Warner Bros Discovery, Paramount), as well as a worsening macroeconomic environment. Based on the popularity of ad-tier subscriptions for Hulu, Peacock
and Paramount Plus in the US, around 40 per cent of Netflix’s total members could be on a ‘basic with ads’ plan by the end of 2025, boosting lagging growth.
Free-to-air broadcasters that are available locally, as well as Alphabet, with the widespread success of YouTube, traditionally controlled the AVoD market in Europe. Free ad-supported TV services such as Rakuten TV and Paramount Global’s Pluto TV should continue to gain traction as the growth of SVoD plateaus.
According to Magna Global's June 2022 Global Advertising Expenditure forecast, digital video's share of total digital ad spend for the UK, Germany and France should grow from 8.1 per cent in 2021 to 12.3 per cent in 2026. Digital video's gains signal a shift from linear TV, radio and print to digital formats, in line with the increased usage of streaming services.
Inflationary pressures and the rising cost of living may result in consumers reducing their streaming spending. Combined with increased borrowing costs and lacklustre stock market performance, that is putting pressure on media groups to focus on profitability rather than scale, after results showed loss-making direct-to-consumer segments at Disney, Warner Bros Discovery and Paramount Global.
The consolidation of European broadcasting could continue in 2023, with all eyes on MFE (MediaForEurope), which in late 2022 took its ownership of ProSiebenSat.1 to 29.9 per cent, just below the 30 per cent threshold that would force a mandatory buyout offer.
At the very least, MFE will influence the direction of ProSiebenSat.1, but gaining control would create a €4.8 billion per year advertising business, the largest in Europe, with a top two position in Italy, Spain and Germany.
Next year’s earnings will also show how well ITV Plc’s hybrid streaming service ITVX performs. Currently, its advertising business generates 17.1 per cent of revenues outside of traditional TV spots in addressable inventory and on-demand or FAST TV channel mediums. This figure could race close to 25 per cent very quickly. At the same time, its paid subscriber base that opts out of on-demand ads should also rise. The challenge is how efficiently ITV can monetise its viewers’ total streaming hours.
With the decline in pay-TV service revenues and TV subscribers, operators are increasingly decoupling from pay-TV. Although multichannel providers are well placed to continue to integrate and aggregate services into a seamless user experience to maintain appeal and support content discoverability, ownership of content rights and interest in traditional pay-TV offers has been shifting away from traditional operators.
Traditional US content powerhouses like Disney want to retain their content rights for their proprietary OTT services. As new entrants in sports rights, such as DAZN and Amazon Prime, erode operators’ historic dominance, the appeal of traditional pay-TV diminishes.
Spanish cable operator Euskaltel, now owned by Másmóvil, sold its TV segment in 2021 to virtual TV/OTT provider Agile Content. Másmóvil subsidiaries now provide Agile TV bundled with fixed and mobile broadband services. The platform uses an Android TV box, which offers access to all other OTT services. Only one pay-TV operator survives in Italy, Sky Italia, while Sky Deutschland is potentially up for sale in Germany. In the UK, Sky remains the dominant and only viable pay-TV offer in terms of content value and appeal, with Virgin Media rapidly losing pay-TV subscribers. Part of the pressure on pay-TV comes from operators' need to invest vast sums in fibre and 5G networks, making the ever-increasing cost of premium content hard to justify.
Pay-TV operators will continue to support the popularity of third-party set-top devices and systems such as Android TV, Apple TV, Roku and Amazon's Fire TV stick. That strategy in turn allows operators to serve younger segments and hit the market faster with integrated services and the latest TV functionalities without taking on the cost of platform development. In 2023, this trend will continue, especially as global financial pressures persist, pushing migration to third-party set-top platforms. Some operators have shifted to entirely virtual TV offerings, launching TV apps that run on smart TVs. At the same time, a handful, including Comcast and Sky, have begun developing their own branded smart TVs.
In the early autumn of 2022, Netflix introduced its new ad-supported tier, offering a cheaper alternative for viewers who wanted to keep their streaming service during a time of economic uncertainty. Netflix laid down the gauntlet, and in true 'streaming wars' fashion, the other giants are also set to roll out their respective ad-supported tiers in order to win the hearts of consumers and avoid losing subscribers, with Disney's now available in the United States.
The cynics out there might argue that Netflix's ad-supported model simultaneously creates an advertising revenue stream while mitigating subscriber churn – amid its widely reported loss of 1 million subscribers – but all streaming services are taking measures to keep customers on side.
As we look forward to 2023, the streaming wars are likely to continue and streaming services must have the right content, technology and business models to keep their subscribers happy.
In a flooded market with too many choices resulting in subscription fatigue, the current cost of living crisis has seen a sharp increase in consumers cancelling their streaming services. Having multiple streaming entertainment subscriptions is no longer tenable, and viewers are culling their subscriptions to cut costs.
The streaming industry now has the challenge of not only delivering the highest quality content for consumers but also delivering a high-quality viewing and customer experience.
Finding this golden ratio between the quality of content and user experience is essential for a streaming service to emerge triumphant. The quality of video streaming infrastructure needs to match the quality of content, or the experience degenerates. You wouldn't serve steak with ketchup, so why have a great library of content with sub-par technologies? With 2022 having been dubbed 'the peak of the peak TV era' for the calibre of content available, it is essential for streaming services to reconsider how they also deliver a stellar viewing experience.
In practice, there are a number of ways streaming services can up their UX game:
Modern codecs are highly efficient and compatible with most forms of video technology used for web streaming; however, not all streaming providers have them in place. Those yet to migrate from AVC to more advanced codecs like AV1 are falling behind.
However, things are moving in the right direction: our recent video developer report found that the percentage of those planning to implement AV1 has doubled since last year, with AV1 adoption steadily increasing over the past few years.
The CTOs of the streaming giants should be looking at how low-bitrate streaming can be provided when internet or data speeds drop, to keep the viewing quality high. For those without access to high-speed WiFi this is a real problem, and one that can influence which main streaming subscription they choose.
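The mechanism behind this kind of quality fallback is adaptive bitrate (ABR) streaming: the player measures available throughput and switches to the highest rendition the connection can sustain. A minimal sketch of that selection logic is below; the bitrate ladder and the 0.8 safety margin are hypothetical illustrations, not drawn from any particular service.

```python
# Minimal sketch of adaptive bitrate (ABR) rendition selection.
# The ladder values and the 0.8 safety margin are illustrative only.
LADDER_KBPS = [235, 750, 1750, 3000, 5800]  # hypothetical rendition bitrates

def pick_rendition(measured_kbps: float, safety: float = 0.8) -> int:
    """Return the highest-bitrate rendition the connection can sustain,
    falling back to the lowest rendition when throughput drops."""
    budget = measured_kbps * safety  # leave headroom for throughput variance
    viable = [r for r in LADDER_KBPS if r <= budget]
    return max(viable) if viable else LADDER_KBPS[0]

print(pick_rendition(4000))  # fast connection -> 3000
print(pick_rendition(500))   # slow connection -> 235
```

Real players layer buffer-occupancy heuristics on top of throughput estimates, but the core trade-off is the same: drop the bitrate before the buffer empties, rather than stall playback.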
This should apply to each and every device, not just be ‘translated’ or ‘ported’ onto another device for the sake of cost efficiency. And it applies not just across different types of devices but also within them; for example, Android vs iOS, or particular brands of smartphones. Our research found that as the number of streaming services grows, device fragmentation continues to rise, and developers should be testing video content on all devices because consumers want seamless playback everywhere. In fact, our research found that the ability to stream on any device was one of consumers’ top three reasons to keep a service.
There is no doubt that 2023 will see more ad-supported tier options appear as a solution to keep subscribers on side, and to carve out an advertising revenue stream in an increasingly competitive space. We will see a convergence and conflation of linear TV, FASTs, CTV, SVoD and AVoD to form a new idea of what ‘TV’ is. With this in mind, however, what will ultimately determine success is the quality of experience for the end user.
The ad experience has yet to be perfected and will become a major headache for streaming services that don’t seriously consider the timing or impact an advert has on a viewer midway through watching their favourite blockbuster.
Once the next wave of churn subsides, and consumers have separated the wheat from the chaff, we will see the consolidation and acquisition of the major streaming players. In order to survive, streaming platforms must ensure their viewing and customer experience is in order and ahead of the rest, and the best way to get ahead of this is through strong technology.
This last year was a pivotal one for the industry in terms of IP adoption across both broadcast and pro AV. Facilities deployed and managed IP media workflows with greater confidence and ease, with better support than ever from the open-standards ecosystem. Working through various projects and initiatives to support those advances, AIMS had its share of successes in the past year too. Through this ongoing work, the alliance has not only facilitated broader implementation of SMPTE ST 2110 media workflows but also fostered a significant increase in interest in the IPMX set of open standards and specifications for AV-over-IP deployments.
While 2022 was punctuated by several key achievements, one of the most notable for the alliance was the launch of our Educational Working Group (EWG). When our board gathered in 2021 to discuss goals for the upcoming year, we identified one industry challenge as paramount: the urgent need for qualified professionals to design, build, and operate SMPTE ST 2110 systems as they expand beyond live production to studio workflows and the pro AV industry. To address this challenge, we founded the EWG, and it was no small undertaking.
We defined the different kinds of users – technicians, maintenance staff, implementers, software developers, and others – who interact with a SMPTE ST 2110 system at any stage from design to operation, and then created a custom curriculum suited to each of them. In doing so, we organised a vast amount of information from resources such as technical sessions, pandemic webinars, training events, and material developed by our member companies. We also collaborated closely with SMPTE on the curriculum to ensure the delivery of a clear, consistent message. The result is a solid starting point for anyone new to the SMPTE ST 2110 community. We have received a great deal of positive feedback, as well as numerous requests to expand the curriculum beyond introductory material into more advanced subjects. Further development of this educational resource is something we’ll certainly be exploring in 2023 and beyond.
Beyond the launch of the EWG, 2022 was also a big year for IPMX, our proposed set of open standards and specifications for the carriage of compressed and uncompressed video, audio, and data
over IP networks. We created IPMX in 2020 to address the interoperability challenges in the pro-AV ecosystem, which was dominated by a few proprietary standards from the industry’s top players. We spent 2021 defining the core requirements of IPMX, held an interoperability demo at InfoComm 2022, and began to see market adoption by the close of 2022. We built IPMX through collaboration with leading companies to ensure it serves the needs of manufacturers, integrators, and end users, and now these public interoperability demonstrations are boosting awareness and interest in the open standard for AV over IP.
The InfoComm 2022 demonstration truly highlighted IPMX as a critical enabler for the pro AV environment, allowing the sharing of content between different network profiles, brands, and hardware and software nodes. A working demo of an open and standards-based system where equipment from a dozen or more companies interoperate in a single environment rightly attracts some attention, and we’ll continue to provide demonstrations such as this throughout 2023, including at ISE. In Barcelona, visitors to AIMS’ booth 5J550 will see HDCP running on IPMX, IPMX bridging to HDBaseT, and the capabilities and benefits of AMWA NMOS IS-04 and IS-05 for discovery, registration, and connection management.
And while IPMX is still new to the market, things are moving in the direction of broader adoption. If you compare the adoption curve of IPMX in the pro AV market to that of SMPTE ST 2110 in the broadcast industry, IPMX is at the same stage SMPTE ST 2110 was in 2017 and 2018. The standards are complete, interoperability has been introduced, a few early projects have been launched, and the industry is starting to regard it as a serious solution.
SMPTE ST 2110 exploded in 2020 and over subsequent years became the core of many new IP infrastructure projects. IPMX is heading in the same direction, and 2023 is going to be the year where we see more projects adopting the standard as their foundation. You’re going to see some bigger rollouts and higher profile installations, and then it will really take off in late 2023 and early 2024. So, stay tuned; it’s going to be an exciting year.
A few months ago I went from London to Los Angeles to facilitate a business dinner. Around the table were senior executives from a number of the world’s most famous media organisations, most of them based in LA. The dinner was being held on behalf of a DPP member company, a major service provider also headquartered in the city.
At one level this sounds crazy. Why would I need to be in LA to bring together people who already live and work in that city? The answer is worth sharing, because it not only reveals what the DPP is all about but also why the ownership of the DPP has recently changed hands.
Some people still find it difficult to understand what the DPP is for. They are familiar with trade bodies, which have the primary purpose of representing the interests of particular sections of the industry. The DPP isn’t that. They are also familiar with standards bodies. The DPP isn’t that either. So what the heck is it?
The purpose of the DPP, put simply, is to generate the insight and connections that media companies need to do business better. Sometimes we do that on a small scale, with ten or twenty people brought together in working dinners or workshops. Sometimes we bring one or two hundred customers and vendors into the room, such as at our European Broadcaster Summit in Germany, or our Media Supply Festival in the USA. And sometimes we do it on a huge scale; notably at our Leaders’ Briefing in London, when dozens of media organisations share their strategies with an audience of more than 600 colleagues. The scale varies; the purpose is always the same.
The other reason people could be forgiven for not understanding the core purpose of the DPP is that what we are today isn’t quite the same as when we first started.
The DPP began its life in 2011 as an informal initiative between the BBC, ITV and Channel 4. Its aim was to hasten the transition to end-to-end digital production.
The Digital Production Partnership, as it was then called, brought together all parts of the media supply chain with the primary purpose of identifying technologies and workflows that would promote interoperability. Its first great achievement was the definition and implementation of a common file delivery specification for UK broadcasting. That specification – known as AS-11 DPP – was implemented by all UK broadcasters on the same day in October 2014.
That was impressive. The Digital Production Partnership’s ability to unite the media industry around not only a common technical specification, but also the business change required to implement it, looked like a blueprint for the future. No wonder there was considerable interest in the Partnership becoming a more formal, long-term entity.
And so, in April 2015, Digital Production Partnership Ltd was formed, with ITV, Channel 4 and the BBC as its shareholders. Its stated purpose was consistent with the public service perspective of its founders: to promote common specifications, interoperability, and end-to-end digital production; and, in so doing, to bring cost savings.
It’s easy to forget that many greeted this noble aspiration with huge scepticism. Some assumed the three broadcasters must have created this new industry association as a devious way of creating a preferred supplier list. Others were honest enough to admit that the DPP’s mission to reduce complexity would be bad for business; on more than one occasion it was pointed out to me by the CEO of a technology vendor or service provider that they made money precisely from sorting out the chaos inadvertently created by their customers.
In reality, those of us responsible for the creation of the DPP were doing something much simpler than either our company statement or critics suggested. We were backing a hunch.
We believed the key to business success in the future would be openness. In the rapidly changing and fragmenting world of media, no
company, we believed, could possibly do everything itself. Knowledge would remain power, but that knowledge could now only come from peer to peer exchange.
As the DPP found its feet, and grew its membership, this hunch started to be borne out. Companies consistently reported that network building and strategic insight were what they got from being in the DPP.
As the globalisation of media gathered pace, the DPP, quite logically, extended its reach outside the UK. Within a few years, the typical DPP member company was headquartered in North America or on the continent of Europe. And although a technical dimension to the DPP’s work has always remained important, it has become increasingly focused on informal collaboration and information sharing, rather than on defining specifications or workflows.
The DPP’s founding shareholders were extraordinarily generous in enabling the organisation to evolve its remit and reach. But with time it became clear that two fundamental – and related – things had changed. The first was that the DPP, while still based in London, was not a UK
body. And that meant that UK-focused shareholders were not best placed to direct the DPP’s international expansion. The second was that the new-found clarity of purpose in enabling media companies to do business better meant that commercial interests needed to be served just as strongly as public service ones.
And so, in a move entirely consistent with their selflessness in enabling the DPP to evolve, the founding shareholders facilitated the transition of the DPP in October 2022 to becoming a fully independent entity.
What began as an informal UK partnership is now an international media organisation with the mission to create encounters and content wherever and however they will deliver tangible business value for both customers and suppliers.
As incessant change makes work ever busier and less predictable, the media industry’s need for a neutral, independent organisation laser-focused on providing business connections and insight looks set only to grow.
It’s easy to understand what the DPP is for, isn’t it?
On Air helps music artists reach audiences at home. Having worked with artists such as Becky Hill, UB40 and Years and Years, the company recently announced a partnership with Dolby to add enhanced sound and video to its platform. CEO Jakob Krampl (JK) and COO Marco Verhoek (MV) tell us more…
HOW WOULD YOU DESCRIBE ON AIR?
JK: On Air started out as a streaming service for on-stage entertainment. We started it as a company to create content for music artists, and for them to be able to stream that content in the highest possible quality on a robust streaming platform.
It’s an end-to-end service for the artist to create content and ultimately distribute it to their end customer. We help them with the entire production process, filming their content, post production, and also helping them with monetisation and distribution throughout the world.
ARE YOU USING TECHNOLOGY THAT’S ALREADY AVAILABLE, OR HAVE YOU BUILT YOUR OWN?
JK: We use AWS as our CDN, so it’s very much existing infrastructure. The one thing that we do innovate on quite a bit is pushing the boundaries of existing and new products. So we are talking about delivery of 4K, with Dolby Vision and Dolby Atmos technologies. To our knowledge, there aren’t that many companies that would be pushing the boundaries of AWS and some of the other partners and the technologies that we use in such a way. We have some proprietary in-house technology, specifically on creating the content. But on the delivery side, we are using existing technologies.
MARCO, CAN YOU TELL US ABOUT THE TECHNOLOGY YOU USE TO CAPTURE THE CONTENT?
MV: We don’t have the recording equipment ourselves; we use local vendors based in London, Amsterdam, or Los Angeles who are able to film in 4K and record the sound in Dolby Atmos. We do post production in-house, using DaVinci Resolve for editing as well as grading.
IS THIS A SERVICE THAT WAS AVAILABLE PRE-PANDEMIC, OR HAS IT GROWN OUT OF THE PANDEMIC, WHEN MUSICIANS WERE TRYING TO REACH THEIR AUDIENCE IN THE ABSENCE OF TRADITIONAL LIVE SHOWS?
JK: We started development all the way back in 2019, and actually the whole business started even prior to that. The pandemic has changed the dynamics quite a bit. We have seen quite a few businesses pop up specifically in this space, which in a way is good in that it shows there is an appetite for something like this. However, you then also have a few competitors without a longer-term outlook. We were here pre-pandemic, and that’s why we are ultimately here post-pandemic as well.
HOW ARE YOU DEALING WITH ANY ISSUES AROUND LATENCY?
MV: A lot of our shows are pre-recorded. We usually have a delay of around 30 minutes.
JK: It’s important to stress this is not like sport where you would expect to see things live as they are happening. With music, there is a little bit of leeway in terms of how much delay you can have. And ultimately, it can also improve on the quality of the output. So, as much as we would want to have everything live, I think it’s a much better end result with a little bit of delay. It gives you a little bit more creative control over the output as well.
OTHER THAN MUSIC, WHAT OTHER AREAS DO YOU SEE YOUR PLATFORM WORKING WITH?
JK: Definitely theatre, opera, those kinds of on-stage entertainment genres are really what we are focusing on. We’ve been in quite an interesting situation over the past few years, because when the pandemic hit, people were asking, ‘should we be open to streaming?’ There was a lot of back and forth. I think, right now, as the dust settles a little bit, they understand that there is a future in streaming as well. So I think we will see more and more of that, but it just takes a little bit of time.
Did you know that in the early 1970s NASA launched its first space station? It was called Skylab, and its job was to probe whether the Sun could be used as the main source of energy for man to live and work in space for long periods of time. It remained in space until 1979, when it eventually disintegrated in Earth’s atmosphere. Fast forward 40 years to 2019, and Skylab’s largely forgotten story was told in the documentary Searching for Skylab.
The film was directed by Dwight Steven-Boniecki, who can remember the first time he got excited about the possibilities of space. “I remember watching the Apollo 17 landing on television as a three-and-a-half-year-old. That’s how big the impact was of seeing the astronauts walking on the Moon,” he says.
“At that time in Australia, we still had black and white, so it wasn’t in colour. We watched the Apollo 17 astronauts, Gene Cernan and Harrison Schmitt, walk on the lunar surface, and then my father took me outside, pointed at the Moon and said, ‘Can you see the astronauts walking up there?’ I remember looking at the Moon and looking at the TV and thinking, wow, this is fantastic.”
Those early grainy pictures not only led to a love of all
things space, but also helped Steven-Boniecki on a career path into TV. It was this background that helped him on his own mission of telling the story of Skylab.
The journey to making the film began while he was watching a special telecast marking the 20th anniversary of the Apollo 11 landings. “I sat there riveted to this thing, and that’s where the impetus for my interest in NASA first had its real proper birth,” Steven-Boniecki explains.
“The more I started to learn, the more I wanted to learn and being in television I became interested in the television technology that was used on Apollo,” he adds. “I got in touch with Stanley Lebar, who was the manager of the Westinghouse lunar television department. I would speak to him two, three times a week, and our talks would go on for hours.”
From those discussions, Steven-Boniecki wrote his first book, Live TV: From the Moon, which details the key role TV played in such an integral part of history.
Lebar would often mention Skylab, which Steven-Boniecki remembered as a thing that crashed into the Australian Outback when he was ten. “Somebody decided to sell the VHS collection that they had gotten
from Johnson Space Center on eBay, but it wasn’t in sequential order,” he explains.
“I thought after Live TV: From the Moon, I would write a second book focusing on live TV from orbit. So I sequentially arranged these segments to get an idea of how the missions ran out in real time. I was sitting there thinking, this is actually really, really good stuff. If I’d watched this when I was in school I would have understood physics so much better; watching astronauts demonstrate orbital mechanics or liquid in zero-G would have been fascinating.”
That of course led to the idea of the film. “I thought, ‘oh that sounds like a good idea’. Now I wish I had listened to my gut instinct that said ‘you don’t know what you’re getting into’,” he laughs.
He describes his work on the film as a “rocky road” involving years of tracking down footage from both the launch and Skylab’s time in orbit, a lot of which was only available on 16-millimetre Kinescope. “I’ve got, I think, 40 canisters of film, several of which I have found absolutely no record of with NASA, so there is some ultra-rare stuff there,” he explains.
“The quality ranged from pretty good to absolutely
shocking. One thing I noticed when I put them on the projector is the colour is all over the place. A lot of them were engineering copies. Before they had domestic video, they had to have a 16-millimetre film of the experiments and so forth. The engineering department would watch these films to have a record of what was going on.”
Among the issues was a red fade across the footage of the first manned mission to Skylab, SL-2. “Luckily on that Kinescope there are colour bars at the head of the videotape, so I had that as a reference. Somebody had a photo of the same EVA telecast, and so I spent three months tinkering with this thing until I finally managed to get the colour right. When you boost the colours you get the noise from that which is inherent in the video, so I had to noise-reduce a particular channel and then boost it again. It was a difficult process, but it looks better than the original colour reference video. That was a lot of work!”
Steven-Boniecki used Adobe Premiere Pro throughout production, employing a free plug-in from Red Giant to help with the colour issues. He also used AI image quality software from Topaz Labs to enhance some of the stills and video. “There was a lot of tinkering to make it look good,” he adds. “We followed the standard documentary thing: if the only source footage you have happens to be on a VHS tape, so be it.”
Of course the problem with cleaning up VHS is that it can easily end up looking artificial. To get around that, Steven-Boniecki reintroduced film grain into the footage.
During the course of his research, he found footage that even NASA wasn’t aware it had. While looking through a number of DVDs sent to him by the Kennedy Space Center Media Office, on the last disc in the box, he unearthed footage that no one had seen in 45 years. “It was the news feed of the launch of the space station and it looked like it was shot yesterday; the only giveaway was that the suits the people were wearing looked like they were from the 1970s,” he laughs.
In total Steven-Boniecki amassed over 500 hours of footage for the 90-minute film, working with archive producer Stephen Slater to find as much as they could. The duo began working on the project in earnest in 2016 when Slater sent over some footage that filled in the blanks left by the Kinescope films. “That’s when I thought, ‘yes, we can do this’,” recalls Steven-Boniecki, “it pushed it from a home-made production into something a little bit more professional. I’m a typical first-time director, I watch the film and see every single mistake that’s in there. The hardest part of making this film has been to listen to critics who saw rough cuts and to swallow the bitter pill where I’m thinking, ‘well it looks fantastic’.”
It wasn’t just the video that revealed long-hidden stories of Skylab’s journey. During the making of Searching for Skylab, Steven-Boniecki came across some audio from SL-4, the last manned mission to the station, where the pilot, William R Pogue, became ill and the crew discussed whether they should tell everyone back on Earth what was going on. “What they didn’t realise was the onboard tape recorders were recording and they would automatically dump down this audio to Earth to be transcribed later,” he explains. “Their whole little insurrection was recorded for everyone to hear,
and the next day they got a message from Alan Shepard, who was head of the Astronauts’ Office, giving them a reprimand.
“I had been looking for this audio for five years. About six weeks before we were ready to press our Blu-rays, I went to the US National Archives and did a search for Skylab, and I saw all these audio recordings. I thought, ‘OK, I’m onto something here’. I was sat there listening to a three-and-a-half-hour-long file, and it was getting closer and closer to the end and I was thinking, ‘please let it be there’. And then I heard Bruce McCandless say, ‘Crew, we’ve got somebody who wants to talk to you’, and I heard Al Shepard. It meant I could include what actually happened in the film, because over the years a lot of hype had built up about it.”
Asked if he has plans to make any more films, Steven-Boniecki laughs and says he’s going to make one about how he survived making a film about space history. “If I do, I’m not doing it independently,” he states. “I’m going to have a studio behind me. It has taken so much energy to do this on our own.
“There were times when I felt like throwing in the towel. But I’d say to anyone who wants to make an independent film to just follow your dream; you’re going to learn one way or the other. I’ve always had the philosophy that I would rather be lying on my deathbed thinking ‘I’m glad I did it, even through all the hassle’.”
He adds that the thing he’s most proud of is bringing the stories of the Skylab astronauts to new generations. “I remember going to autograph shows where I would talk to the Skylab astronauts for 40 minutes before I’d get a tap on the shoulder because it was someone else’s turn. In those days Buzz Aldrin would come out and there’d be a queue going right around the street to see him.
“Now, the ‘Skylabbers’ have an equal length of people waiting for them as the Apollo moon-walkers. That’s really cool. I am so happy that they’re finally getting their time in the spotlight.”
You can find out more about Dwight Steven-Boniecki’s film at searchingforskylab.com
Last year saw the end of Jodie Whittaker’s time as the Doctor. Her final episode has more visual effects than any other in Doctor Who’s history. Visual effects supervisor Dave Bannister talks to Jenny Priestley about DNEG’s work on The Power of the Doctor
Any show like Doctor Who is always going to be visual effects-heavy. From its first episodes in the 1960s through to today, the show isn’t afraid to use effects to tell its story, although the modern version is a lot more tech-savvy than those early episodes.
Doctor Who helped mark the BBC’s centenary last November, with the feature-length episode, The Power of the Doctor, also serving as Jodie Whittaker’s swansong in the TARDIS. Visual effects company DNEG began working with producer Bad Wolf in 2018 during the production of season 11, so it was a natural fit for them to continue their work on the centenary episode.
Dave Bannister worked as visual effects supervisor on the feature-length episode which meant he spent a lot of time on-set ensuring that the company had all the data it required. “I work with the director, executive producers and the crew to make sure we shoot everything we need to shoot and then acquire the data that we need to be able to do the effects that we’re going to create,” he explains.
That data includes the camera position, tilt, lens focal length, height, and start and end points. DNEG also lidar-scans sets in order to create a point cloud that it can generate geometry from. “We also did photogrammetry of props and Jodie herself,” Bannister continues, “which
means taking loads and loads of photos from different positions so the computer can extrapolate a point cloud from it. There are lighting references, reference photography... it’s all a part of the data that we have to acquire.”
All of that data was then used to help create the 200 VFX shots used in The Power of The Doctor, with around 150 people across DNEG’s offices in London, India and Canada involved. But surprisingly, many of those who worked on the special were not based in the office. “Personally, I’ve not been back to the office for two-and-a-half years,” reveals Bannister. The DNEG team were able to work remotely because the company
already conforms to a high level of security due to its work on a number of major blockbusters. “When the pandemic happened and it became clear that we were going to work this way, we had to go through a rigorous security systems check and we’ve all got a particular system on the computer,” explains Bannister.
“We log into our machines in London and then everything gets run on the machines there so that there’s no data coming to my computer, everything is done at a distance. We have a tool where I can have an instance running on my computer and I’m looking at it on my monitor, and then everybody else will have an instance running on their work
computer, so they’re seeing everything first-hand and not via Zoom. It’s an in-house tool that we have developed.”
Bannister first heard about the plans for the special episode while on-set in the Brecon Beacons. He says everyone at DNEG was excited to be involved both because it marked the BBC’s centenary and because it was Whittaker’s final episode.
“It was exciting because it was a feature-length show, but we were also slightly concerned because the turnaround was going to be pretty quick,” he explains. “We were still in the middle of doing episodes seven or eight in the latest series when it was confirmed DNEG would be working on the special.”
The team were able to develop some assets beforehand, while the actual on-set production took around three months. “There were some late nights but it was fun,” says Bannister. “We have an ethos at DNEG, we don’t want people to have to work late nights or weekends, but we kept it to a minimum. People love Doctor Who. We had people working on it who have worked on Oscar-winning films. We had heads of department coming in and quietly working with us because they love the show and just wanted to be part of it.”
As well as its in-house technology, the DNEG team employed Photoshop for DMP (digital matte painting), Maya for build and lighting, Clarisse for rendering, and Houdini for disintegrations and “blowing things up”.
Bannister describes the relationship between the creatives at DNEG and the Doctor Who team as a “real synergy of talented people”, adding that the producers give DNEG a lot of leeway in creating what the audience sees on screen. “Unless they’ve got a real problem with a story point or if they really don’t like it, they just seem to love everything we do,” he continues. “We’ve worked with them for a long time now; I worked on the last series as on-set supervisor, so we have a good idea of what they like.”
It’s not just the final 200 visual effects shots that DNEG creates. Each of those shots goes through multiple iterations before the final version is signed off. “We would very rarely go over 15 versions,” says Bannister. “A lot of shots were less than ten versions. It depends on the complexity of the shot, of course. We didn’t have time to go through a lot of versions, unfortunately, so we had to get it right straight away, which meant we had to talk to the clients to understand what the director wanted.
“That’s part of being on-set, understanding exactly what they want from the shot and making sure that we tick all those boxes and tell the story the way they want it to be told. It’s the biggest one-off episode that I’ve supervised for the company, certainly in terms of the timeframe, and we had to be so careful on the budget. This is trying to be very ambitious with not as much money as you might have on other projects, but with lots of great ideas.”
The Power of The Doctor takes viewers to brand new planets that haven’t previously been seen on screen, which meant the DNEG team had to create them from their own imaginations.
“There were so many different environments, so many different things to come up with. The Qurunx [an enslaved energy being], which was shot in the quarry, had to be designed from nothing. That was very difficult because there were lots of disintegrations and explosions and models to make.”
Another sequence that tested the team was the shot with Ashad the Cyber Zealot and all the Cybermen. “I turned up on-set not expecting to do that,” laughs Bannister. “We shot it and I had to make it work.
“The planet where The Master was abandoned was quite last minute design-wise. The producers didn’t like the first look we did, so we had to have another one and we eventually managed to make it happen.”
Bannister continues: “The bit where Ace jumps off the top of the building, we had to create lots of 3D assets in order for her to jump off, fly through the air and suddenly end up in the TARDIS. That plate was shot when I was in the hotel not expecting to be on-set that day. Somebody showed it to me on their phone the next day! There were just so many different, but really creative and interesting challenges to get our heads around.”
Asked about his favourite sequence in the episode, Bannister says it’s hard to choose just one. “There is a shot when we’re on the Cyber Planet where Yaz is carrying the Doctor into the TARDIS that is just so cinematic, and it’s just so emotional,” he says. “Visually it is really striking and just looks incredible. I also like the train sequence at the beginning because that was one I was involved with and it was just a lot of fun.
“The thing about Doctor Who is that there are so many different aspects to it. There are so many different challenges and they all have different assets. Being able to add something to the canon of Doctor Who is a real privilege,” he adds.
“DNEG didn’t work on the final sequence where Jodie was on top of Durdle Door, but I have to admit the first time I saw that, even in the rough cut, it was a real ‘it’s dusty in here’ type moment, because it was beautiful. When she says ‘right then, Doctor whoever I’m about to be: tag, you’re it’, it was just such a Jodie moment.”
The DNEG team were able to get a sneak preview of the episode before it was broadcast on BBC One last November, and Bannister says they’re all very proud of what they accomplished. “Nothing’s ever perfect on this sort of timescale or budget but we’re pretty happy with the storytelling. We love the episode,” he concludes.
Broadcast into 800 million homes in 188 countries, the English Premier League continues to attract TV viewers around the world. The majority of the international feed is produced at the headquarters of Premier League Productions (PLP) at Stockley Park in West London.
For the start of the 2022/23 season, PLP decided to move to its first-ever fully virtual studio, working with international broadcast solutions company AE Live. The multiple virtual sets have been built in Unreal Engine by AE Live’s extended reality team and are used for live match coverage, as well as magazine shows.
AE Live’s team of virtual set operators oversee the on-air playout of each set using Zero Density engines and camera tracking data from Stype’s Red Spy system. The sets also feature augmented reality content, containing live data and driven by AE Live’s software control applications.
AE Live works with all of its clients on the design of a virtual studio, and with PLP’s new studio work began
around a year before it debuted on screen. “It’s normally quite a collaborative process with quite a few different parties involved,” explains Scott Marlow, head of virtual studios at AE Live. “There’s the virtual set design aspect as well as the physical set design. You might think because it’s virtual that there’s not any physical aspects, but there’s obviously the desk, chairs, all that kind of stuff.”
Part of every virtual studio build involves bringing all of the parties together at AE Live’s offices in Hemel Hempstead to help get them away from the “hustle and bustle” of a live studio, adds Marlow. “We brought everyone together on the PLP project to map out camera plans, and look at details and things like that. Everybody had input on it. Everyone said what they thought was the right thing, but ultimately the final decision was client-led.”
It’s especially important to know where the cameras will be because just one tiny movement can give the viewer a completely different experience compared with what was originally planned. “It helps the client feel
comfortable with exactly what pictures they’re going to be putting out, as well as letting us know where we need to add that really fine detail to make the biggest impact,” adds Marlow.
As with anything of this nature, there are always ideas that a client might want that aren’t possible with the technology currently available. The team at AE Live always try to offer a suggestion of what is achievable. With PLP the plan was to not just create the inside of the studio virtually, but also outside. “We don’t ever go outside of the virtual building,” explains Marlow, “but we have created things like training pitches, lakes and dozens and dozens of buildings that can be seen through the ‘windows’. There are also things like arcade games in the background, and even the changing rooms.”
The AE Live team also created a completely different look for the studio depending on whether a match is taking place during the day or evening.
Once everyone is happy with the initial designs, the studio is created using Zero Density’s engine, which is based on Unreal Engine. But Marlow is keen to stress that the augmented reality graphics are just as key to any project as the studio itself. “The environment side gets a lot of attention and love because it’s the wow factor,” he states. “But a lot of people are doing two-dimensional screens where video is piped in from another render engine so they have very flat two-dimensional graphics. What we try to do is create properly three-dimensional dynamic graphics that are fully within the environment.
“That’s really hard to do in Unreal Engine, because it’s not really designed for that. So we’ve created our own plugin that lets us create graphics that would traditionally be done in a Vizrt engine or something similar.”
Marlow admits the PLP project really pushed what was technically possible. “We were really bumping up against the edges of technical feasibility coupled with client ambition to get it as good as we could for launch.”
In fact, the studio has already had a “lick of paint” as it underwent a number of changes during the World Cup break at the end of last year. “We made dozens and dozens of improvements to the level of reality that we were able to put into it,” Marlow explains.
For Alastair Robbins, commercial director at AE Live, the PLP project is a “flag in the ground” of what the company is able to achieve. “Everyone’s amazed by it,” he says. “It highlights what our team can do, that these are the kinds of projects that we can deliver. As well as providing this amazing studio, we’ve got people based at Stockley Park who are working on every single Premier League game that’s broadcast across the world.
“The Premier League is a huge project for us,” he continues. “We also provide services for Premier League coverage on the BBC, BT Sport and Amazon Prime.”
With the transformation of Glasgow’s Kelvin Hall into a state-of-the-art commercial studio facility, BBC Studioworks is investing in the future of the Scottish broadcast industry with one of the city’s most historic buildings. Kevin Emmott reports
Kelvin Hall is one of Glasgow’s most loved and iconic buildings. Opened in 1927 as an exhibition centre, it has served as a factory, a library, a transport museum and a circus, and has hosted some of Glasgow’s most exciting concerts and events. Following redevelopment, sections of Kelvin Hall reopened to the public in 2016 as a multi-use sport, culture and education building.
Following a two-year renovation, another vacant section of the building began a new chapter in September 2022 when BBC Studioworks threw open the doors to unveil a 10,500 sq ft studio, the largest sound and lighting galleries that Studioworks has ever built, and one of the most environmentally sustainable production studios in the UK.
With established studio spaces at Television Centre, BBC Elstree
Centre, and Elstree Studios, Studioworks was already looking for a studio facility outside of London.
A growing demand from producers and industry bodies to regionalise content, combined with an opportunity to exploit a niche in the Scottish market for more multi-camera television (MCTV) studios, made a convincing case for taking the project on.
For BBC Studioworks chief executive officer Andrew Moultrie, however, the development was bigger than that. It was not only an opportunity to make a sound commercial investment in Scotland’s broadcast infrastructure, but an opportunity to make an impact on the future of the industry.
“Given Ofcom’s strategy of regionalisation, we looked at a number of
locations across the UK to ascertain which facilities could pick up the work we saw coming,” he says.
“It was clear that there was a demand for more space and opportunity in Glasgow. We had a meeting with Glasgow City Council (GCC) just before Covid, and they were looking to do something similar; the multi-camera and soundstage sector is burgeoning, and projects like this are a way to develop the creative sector in the region to provide more opportunities. This was clear in every conversation we had. GCC had already bought-in to the idea and were looking for a partner that could operate and manage it.”
Kelvin Hall had two immediate advantages. First of all, it’s enormous. Unlike Studioworks’ London facilities where the company worked around the space limitations in an existing building, Kelvin Hall enabled the Studioworks team to build a brand-new facility inside an existing cavernous space, which they took advantage of in a big and very literal way.
“It’s like a box within a box,” continues Moultrie. “I’ve been to so many new builds which squeeze everything in to maximise floorspace, but what is always very clear is you need space to create good working environments for sound and vision, and you want to provide a production with space to accommodate everyone in comfort. Kelvin Hall had that space in abundance.”
Kelvin Hall’s other big advantage was a local infrastructure which was already there to support it, with the BBC’s public sector studio Pacific Quay also located in the city. The appointment of BBC Scotland’s Stuart Guinea as studio manager in April 2022 was a key acquisition, as he already had deep operational and technical knowledge of local crews and production companies, and at launch Kelvin Hall already had a booking from STV Studios for its Bridge of Lies quiz show.
Within a few weeks the studio had produced more than 30 programmes, all of them supported by local crews, local facility support teams and a core local team to manage the whole facility.
This focus on inspiring, nurturing and developing the next generation of Scottish production talent was a priority for Studioworks, but as Moultrie explains, it contributes to the wider creative sector.
“There is a clear skills shortage and not a lot of new people coming into the industry. The UK creative economy is in a period of flux and if we don’t provide the infrastructure and the skills at a high level then our creative sector will start losing overseas investment. We’ve created our own barriers to entry based on a historic view on qualifications and experience, and there are financial barriers preventing people getting onto some of the accredited courses. We were looking for a way to overcome that.”
Studioworks already has a well-established relationship with the National Film and Television School (NFTS), offering placements for students on the NFTS Camera, Sound and Vision Mixer for Television Production course with its craft teams in London. NFTS Scotland, Studioworks and Screen Scotland collaborated to create Scotland’s first multi-camera TV conversion programme at Kelvin Hall. Designed to strengthen existing skills, provide on-the-job training and encourage new entry points into the industry, a four-week pilot in August created opportunities for 12 people to gain their first break into multi-camera TV, with 21 days of paid training already completed since the programme began.
Moultrie adds: “The aim of the scheme is to kickstart a self-sufficient freelance community which can float between the different studio houses. It’s vital that we create roles and jobs for local people to drive our output in the volumes we will require. The key to this is the development of an indigenous workforce to create the ecosystem. Our aim is to do lots more training to encourage fresh talent into the industry.”
Creating a sustainable skills infrastructure and new opportunities are both admirable, but sustainability as a whole was an overarching theme across the project. Environmental sustainability has been a focus for Studioworks for some time; in 2022 it secured the Carbon Trust Standard for Zero Waste to Landfill accreditation for a third consecutive year and it is a founding collaborator of albert’s Studio Sustainability Standard which helps studios measure and reduce the environmental impact of their facilities.
Kelvin Hall gave Studioworks a blank canvas to create new standards of environmental sustainability and as the host of the COP26 climate change conference in 2021, Glasgow already had a keen focus on the effects of climate change.
“Kelvin Hall gave us the opportunity to create something new. We have worked with all our suppliers to pilot techniques which work
towards lowering emissions, and in Glasgow we have a very supportive council who have encouraged us to invest in the right long-term technologies to make a significant difference to carbon neutrality; they have worked hand-in-hand with us throughout,” says Moultrie.
Not only did the studio repurpose an existing building, but all its electricity is sourced from renewable sources. Other initiatives are also in place, such as the absence of dimmers to encourage LED and low energy lighting technology. (Studioworks’ London facilities are also embracing LED lighting and are hoping to phase out tungsten lighting by April 2023.)
The reduction in heat generation from lighting also enabled the use of air-source heat pump technology for heating and cooling, while the ventilation plant has class-leading efficiency using heat recovery systems.
Studioworks’ vision to support sustainability across all these aspects has won support from the city and has also attracted interest from clients outside of Scotland, which means it expects to see the facility used by a range of productions from all over the UK. Studioworks’ aim of attracting more investment and growing the creative economy to bring in more shows seems to be paying off.
Cameras, vision mixers and monitors
• Six Sony HDC-3200 studio cameras. The 3200 is Sony’s latest model, which has a native UHD 4K image sensor and can easily be upgraded to UHD operation
• Sony XVS7000 vision switcher, LMD and A9 OLED monitors for control room monitoring
Hardwired and radio communications systems
• 32 Riedel Bolero radio beltpacks with the distributed Artist fibre-based intercom platform, plus VoIP codecs for external comms
• A SAM Sirius routing platform solution to support the most challenging applications in a live production environment and to ensure easy adoption of future technology innovations
Audio
• Studer Vista X large-scale audio processing solution that provides pristine sound for broadcast
• Calrec Type R grams mixing desk. A super-sized grams desk providing ample space for the operation of the Type R desk and associated devices, such as Spot On instant replay machines
• A Reaper multi-track recording server
Lighting
• ETC Ion XE20 lighting desk and an ETC DMXLan lighting network
• 108 lighting bars with a mix of 16A and 32A outlets (if tungsten is required)
• 48 Desisti F10 Vari-White Fresnels
• 24 ETC Source 4 Series 3 Lustr X8 coming early 2023
Outside broadcasting veteran James Poole recently joined Televideo as its new chief operating officer. He talks to TVBEurope about why he decided to join the company, and its plans for both IP and remote production
WHAT MADE YOU WANT TO JOIN TELEVIDEO?
I’ve got a long history working with Televideo; from years and years ago, when I was a young vision engineer. I used to work with them across lots of jobs. When I started my own company, we used to hire the Televideo trucks to do ballets, operas, music jobs, sports jobs... so I’m very familiar with the company and we have lots of history together.
It’s a really strong management team and we have some amazing staff, which is a solid foundation to build on. Like many companies, it has some issues to work through, but at its core there’s a fantastic team who all want to pull together to get things moving the right way and take the company to the next level, which is really exciting for me.
WHAT DO YOU BRING TO THE COMPANY?
My background is primarily music, arts, and entertainment, as well as lots of sport. Televideo wants to diversify in terms of the genres of programming we’re working on. At the moment, it’s a lot of football, a lot of rugby, we do bowls, badminton, etc. We want to find other income streams for the business, look at where we could perhaps adopt things like remote production, if that is practical, and places where we can innovate.
The industry has changed so much recently; demand outstrips supply. We’re in a really good position where we don’t have to be ‘bargain basement’; everybody has enough work, there’s no shortage of work for any company in the industry. So for me, it’s just all about ensuring that the quality is there in everything we do.
We need to innovate too. There’s so much change going on at the moment in terms of 4K and HDR. We’re looking very heavily at 8K as well. It’s about taking the company to the next level.
During Covid we weren’t able to invest a lot of money in new developments for the fleet. But our flagship HD8 has just been upgraded to 4K HDR, and we’ve recently been working on making it Dolby Atmos ready as well. That is going to be a baseband truck rather than IP. I did look at IP very seriously but we just felt for where we are and what we’re aiming for, it doesn’t really give us any added value at the moment. The next truck to be upgraded will be HD7, and that might be a different story; it may be that we do go fully IP in that truck.
We’re sort of taking a similar route to that of Telegenic and just going for large baseband trucks. The technology is simpler to understand for engineers, and we’re not doing anything on the scale of an athletics world championships or Wimbledon where it totally makes sense to use IP.
We’re always looking at what the next thing is, and we fully anticipate that within the next year, things will take a change again. We’re very interested to see what other vendors release in terms of kit at NAB and IBC in 2023. I expect there’s going to be a lot more 12G kit arriving in the marketplace.
IS REMOTE PRODUCTION PART OF TELEVIDEO'S PLANS?
Definitely! Coming from Timeline, we led the way in terms of remote production. So I bring all that experience having led the work on the remote BBC cricket production, working with BT Sport as well on remote football production. That’s definitely something that we are looking at. Remote production is still very much driven by the availability of good connectivity, or having lots of matches or events at one venue, which would make it worthwhile for us to install our own connectivity. We do work on remote production with Sky Sports for netball. That’s very much driven by Sky, and that’s worked really well so I expect there’ll be more of that in 2023.
It’s interesting to see where people are going in terms of consolidation in the marketplace. EMG with Telegenic and CTV coming together, seeing where they’re going. I think they’re looking at smaller vehicles that will be able to do remote production, rather than sending out a triple-expanding articulated truck. We’re all trying to be greener, we’re all trying to be more sustainable.
We’re also looking at things like power solutions. How
do we minimise our use of hydrocarbon fuels? How do we perhaps use hydrogen cells in future when the technology becomes a bit more settled? So we’re always watching what people are doing.
WHERE DO YOU THINK TELEVIDEO WILL BE?
I want to see our staff really engaged in what they’re doing, be really motivated, and have a really strong base of customers who want to keep coming back to us, because we’re doing a really good job delivering high quality OBs. I think our fleet upgrade package will be nearly complete, so we’ll have a very strong complement of 4K HDR, Dolby Atmos trucks, and a fleet that’s able to go toe-to-toe with any other company in the industry.
I’m also hoping that we’ll see a bit more movement with the whole situation around freelancer rates and things like that, because that’s a big point of discussion at the moment. In terms of our research and development, we’ll be doing some innovative things for our customers and having a lot of fun doing them.
Jenny Priestley finds out how MARS Academy is helping train TV and film content creators for the future of virtual production, both in the UK and abroad
Over the last few years, the UK has seen a huge rise in companies opening virtual production facilities. From major studios aimed at huge blockbusters to smaller facilities that are used for commercials or music videos, virtual production is the major disruptor in the production industry right now. But with that growth come complications, because virtual production is still so new that there are many
content creators who don’t yet understand or work with the technology.
Last summer the UK’s largest independent virtual production facility, MARS Volume, launched a new training academy to bring the next generation of content creators into the TV and film industry. Based in the London suburb of Ruislip, MARS Academy offers a range of vocational virtual production training programmes focusing on a
broad and general understanding of the virtual production pipeline from render node to screen; as well as a course covering LED systems for film crews.
Joanna Alpe, commercial director at MARS’ parent company BILD Studios, admits that the company wasn’t certain how popular the courses would be when they first launched. “There was a lot of hope that if we built it, people would come,” she says, “and we really did see that happen. There was quite a lot of demand to get onto the courses.”
Applications ranged from those just starting out to some who had been in the industry for over 20 years. “I sat in on some of the interviews, and just to see the enthusiasm and the excitement about these new skills was brilliant,” she adds. “It was just remarkable to see people pick them up so quickly. We always see the penny drop when people are on the Volume, and they actually get their heads around it.
“When we had our graduation party in October, just to see everyone come together and be busily talking about a project that they’re looking at doing, I felt really proud that we provide a way for people to really level-up their knowledge in this area.”
The team at MARS Academy has been approached by companies from across the world to find out more about what they’re doing, and now Studio City Norway intends to launch its own virtual production Volume that will facilitate the delivery of MARS Academy on a global scale.
“We were able to come over to London and visit MARS Academy and were really impressed by what they’re doing and also their plans for the future,” explains Dag Hvaring, chief executive officer at Studio City Norway. “Obviously, we wanted to tap into that because we don’t have education for professionals. So to be able to join forces here was really important for us.”
Studio City Norway is currently building a full-service studio complex which it aims to open by the end of 2024. The plan is to establish an initial Volume with MARS Academy and begin training and facilitating smaller productions while the build is ongoing, with a larger Volume opening next year. The first students are expected to be on-site this summer.
Asked why BILD chose Norway as the first non-UK location for MARS Academy, Alpe says it was the passion of the Studio City Norway team that attracted them. “There was a lot of international interest in what we’re doing,” she states. “When we met with Dag, and we understood their plans, how they had a forward-looking approach, and their ability to put resources into innovation, it was clear we could form a partnership of very like-minded people. We really see the vision within the team that Dag is leading and we’re really impressed with that level of ambition.”
The aim of both facilities is to give students the experience of working on a real Volume, using the exact technology they would in a real-world environment. A key aspect of the training available both in London and Norway is that the students get the opportunity to work with best-in-class technology, such as Unreal Engine. “We’re definitely going to keep developing our curriculum,” adds Alpe. “I think what’s really important is that we keep iterating and developing so that it’s really up-to-the-minute in best practice techniques.”
The collaboration between MARS Academy and Studio City Norway is backed by the British Film Commission, which recently signed a memorandum of understanding with its Norwegian counterpart. Gareth Kirkman, the Commission’s business and industry development UK lead, says the organisation is keen to showcase how the UK is shining a light on this new way of working.
“We’ve always been at the forefront of things in terms of VFX and post production services,” he explains. “We have incredible crews, incredible artists, incredible technicians; a long history. So it’s great to see companies like BILD with the MARS Volume who are really leading the way. We’re thrilled to support them wherever we can.
“Our whole remit is to be collaborative with our fellow commissions and territories, and really bolster those kinds of creative partnerships,” adds Kirkman.
“It’s very much about crossing those borders and we’ve had a long, great relationship with the Nordic Film Commission. There have been lots of great projects shot between the two countries, including No Time to Die, Black Widow and, one of my personal favourites, Ex Machina,” he continues.
“We think it’s brilliant when industry specialists and experts like MARS give back to the industry and share those learnings with other territories.”
Richard Clarke, head of content operations at global distributor Banijay Rights, knows the importance of effective media operations management and the challenges arising from the industry’s ongoing migration to the cloud
Banijay Rights is a company whose reputation and reach are such that it requires little in the way of introduction. One of the leaders in international digital distribution, the company’s catalogue of more than 146,000 hours includes a host of top titles from Banijay’s 120+ in-house labels as well as third-party producers, encompassing drama, comedy, entertainment, factual, reality and more.
Overseeing content operations for the entire Banijay Rights business is Richard Clarke, whose career has taken a not untypical path to his current role at the heart of a global content business. “I had always been very interested in technology, but my starting point was a degree in music production... I wanted to be a sound engineer,” he recalls. Concluding that opportunities in the music industry were in decline, he took up an offer to spend a day at a post production company observing the work of a colourist, an experience that set him on a new trajectory of “applying for TV jobs, and ultimately starting my career as a QC operator.”
After a spell as an assistant editor at RDF Media, Clarke joined Zodiak Rights as a digital content coordinator in September 2010. Zodiak Rights was subsequently folded into Banijay Rights where, a brief period at Technicolor aside, he has been employed for the last eight years, latterly as head of content operations. As Clarke indicates, it’s a remit that has become increasingly challenging in light of both business-oriented changes – notably the incorporation of Endemol Shine International in 2020 – and the need for optimum flexibility of content delivery.
“As a distribution company, we need to ensure that the asset can be repurposed to service any deal,” he says. “That means it needs to be the best version and of the highest quality, which inevitably results in large file sizes as well as significant storage requirements and transfer costs. How to do that efficiently and cost effectively is always at the top of our list of priorities.”
Like any distribution operation, the operational model has inevitably evolved dramatically during the past decade. When Clarke came into the business, “we were still making and sending out DVDs [for localisation services] that people could be watching on their laptops.” At Banijay, one of the next major steps comprised “starting to use transfer software such as Aspera, buying a Gigabit internet line, and starting to accept delivery from production departments into storage.” From that point onwards, “we started getting more and more into digital content and the development of processes” to enable and support it, explains Clarke.
However, it soon became apparent that a major obstacle was looming: the lack of standardisation in content transfer and management. “It was a bit like the Wild West before the DPP began to get involved,” recalls Clarke. “So, we made a few steps along the way with different formats, and we also kept on buying more storage, which proved expensive. Finally, with the huge growth in metadata
requirements, we started to think about consolidating everything into a central asset management and archiving system.”
With Banijay also growing rapidly (“we went from a group of 40 companies to more than 120”), the choice of system could hardly be more critical. Having outsourced its asset management for a while, Banijay elected to invest in the second generation of Blue Lucy’s BLAM system, which combined media management, workflow orchestration and task management. Blue Lucy subsequently helped the company migrate to BLAM 3, arranged as a hybrid on-premise/cloud operation.
There is no doubting the scale of Banijay’s asset management requirements, which include around 415,000 assets in total. Like many other broadcast and media organisations, it also constitutes a moving target, and as such the precise calibration of cloud vs on-prem is bound to evolve over time.
“We are now firmly planted in a hybrid approach,” confirms Clarke. “With the first iteration of BLAM, we were very much working on-premise for all of our operations. The latest version is cloud-oriented, and consequently our back-up and disaster recovery functions are now based in the cloud. We have also evolved to the point where our distribution and serving of content to other vendors is largely cloud-based, too.”
There is one primary area of workload where Banijay Rights has yet to move away from an on-premise ethos: editing. It is here, implies Clarke, that the drawbacks of cloud are most clearly observed. “The problem with cloud is that we were all sold the dream that it’s cheaper, and the fact is that it’s not cheaper, whichever way you look at it. In particular, it’s really easy to lose control of how much you are paying for egress fees. Having a degree in maths is certainly an advantage in terms of how to plan for, and allocate, egress costs on a granular level!”
Whilst cloud-based editing is “one of the big issues we are still grappling with” says Clarke, Banijay has managed to curb these expenses for back-up and disaster recovery by using cheap cloud storage for related workflows. “If we egress, we do it only once and we don’t incur repeat fees.”
Clarke does not hide his reservations about the industry’s steady march away from software licensing towards the SaaS (software as a service) model. “It can be a real pain,” he admits. “There is a business sense to buying software and owning it, then finding the most cost-effective solution at that point. Now there has to be a much greater focus on adapting and budgeting for what you might require in the future.”
Consequently, media companies must ensure that they invest in solutions which offer ease of integration on a continual basis. “It’s definitely advantageous to us that Blue Lucy has ongoing conversations with all of the software vendors we are using,” says Clarke. In Banijay Rights’ case, the company benefits from the “integration of Blue Lucy with Signiant and Aspera, which we use for file transfer, with Amazon Web Services.”
Banijay continues to implement other software tools and utilities as required (“we no longer need those big ‘all in’ solutions from day one”), so this emphasis on integration will remain constant. “Historically, when implementing asset management and storage, we have tended to spend the most time on integrating different bits of software to make our lives easier,” says Clarke. “The biggest one for us is our rights management system, and whether you are talking about metadata or coding, you need to have one source of truth where content can be recognised. With our evolving use of BLAM, we have been able to ensure that our hierarchy in asset management is the same as our rights management system, which is also our billing system. If you can have all of that working seamlessly together, then you are going to have a much more efficient and effective workflow experience.”
With BLAM 3 as the foundation of its ever-growing content asset management infrastructure, Clarke is able to look ahead with confidence to the probable migration of more workflows, including editing, to the cloud. “It is very difficult to pick up everything and put it in the cloud, so the approach we’ve taken is that we have trickled up into the cloud,” he says, with a chuckle. “I would imagine the next step will be editing. We have already implemented PC over IP at the office, and I suspect that if we replaced on-premise with cloud editing our editors wouldn’t actually notice.”
A pragmatic philosophy of incremental change shaped by doing what is right for Banijay and its clients has served the company well to date. The sheer volume of data that Banijay is managing, and “the bandwidth and time involved in getting that to and from the cloud”, will inevitably determine the nature of its future cloud migration. Moreover, the adoption of any new technologies has “to be in line with our policies regarding sustainability and support. It can’t be emphasised enough that decisions about software are always informed heavily by the availability of support.”
More generally, Clarke has plenty to occupy his thoughts, from “new formats” to the possibilities for dubbing and localisation heralded by AI and ML. “Because of the way that we operate, it’s essential that I keep up to date with new technologies and consider how these can be applied to our business,” he says. And whilst he is yet to be fully converted to SaaS, he does admit that “the ongoing relationship it brings with vendors means that we can develop things more closely as partners moving forward.”
How did you get started in the media tech industry?
It was 25 years ago. I’d seen an opportunity to improve VFX and automate the combination of digital elements in film. I researched and found a specific analogue sensor which could be used to sense the position of light sources, and I thought this would be useful for camera tracking. Unfortunately, there was no market for camera tracking at that time but I was confident the market would be coming, and it did, albeit 20 years later. I had developed an early outside-in camera tracking system at a time when the term virtual production hadn’t been coined. Then 15 years ago, Mo-Sys developed its first real-time compositing system but at a time when Unreal Engine hadn’t yet evolved. The earliest time we were involved with LED walls was in 2011 on the film Gravity.
How has it changed since you started your career?
It has changed drastically from a technology point of view, and our camera tracking systems are so much more precise than they were 20 years ago. The virtual environments and digital backgrounds are so much more photo-realistic and finally usable as realistic backgrounds. It has also changed in the sense that it is becoming more mainstream: what was a very exotic tool could now become a serious alternative to location filming.
What makes you passionate about working in the industry?
It feels like a pioneering age as things are always changing and Mo-Sys can make a real difference in the way things are filmed, and that is exciting. There are endless possibilities in improving workflows and we are at the point of changing the way people work by giving them better tools. I see enormous scope for innovation.
If you could change one thing about the media tech industry, what would it be?
People need to be more inquisitive, asking for reason and background rather than looking for brands. For example, people don’t see how the technology has changed; it’s the same as someone walking into Apple and asking for an iPhone 1 instead of a 14.
How inclusive do you think the industry is, and how can we make it more inclusive?
The film industry is made up almost entirely of freelancers; the focus is on hiring people based primarily on their ability and experience. This naturally creates diversity within the industry. It doesn’t matter what you look like, where you’re from or how you choose to live your life. We just look for people who can deliver.
How do we encourage young people that media technology is the career for them?
I think we have to show them that these careers are accessible. We also need to open the doors and provide meaningful training opportunities for young people. This is why all of our Academy courses are based around small groups and hands-on practical learning. It allows students to explore and build confidence, which means they leave with valuable experience in a high-demand and continually growing segment of the media industry. From a training point of view, virtual production is less focused on experienced individuals and more on curiosity and fast learners. There are hardly any people with hard skills in VP, so the industry is more open to soft, transferable skills, and there are endless possibilities. Mo-Sys also supports universities in a collaborative way, which helps to shape curriculums and ensure students have access to expert training alongside industry-standard, professional-grade equipment.
Where do you think the industry will go next?
I see a great future in the potential of virtual production, which has had a first phase of becoming popular with large LED volumes pushed by manufacturers. There is now a second phase that is more economically viable and enables higher production values for the same budget.
What’s the biggest topic of discussion in your area of the industry?
How to teach and train people in an area that touches all disciplines and workflows of the industry while it is still evolving and changing.