Guild of Television Cameramen
Televising The Ryder Cup 2014
Drama Focus: Nick Dance and Andrei Austin
COVER IMAGE © BBC/COCO VAN OPPENS
Outlander: Drama in the Scottish Highlands
by Paul Mellon
GTC member Andrei Austin operated on the whole series of the new epic drama Outlander that has already boosted Scottish tourism
“We must do this again” – Televising the Ryder Cup ... and more ‘drama’ from Scotland – last September the Ryder Cup was contested at Gleneagles with both European and US crews out in force
An interview with Nick Dance BSC
DoP Nick Dance talks cameras, crew, lighting, filters and octocopters … all in a day’s work for a leading cinematographer
Springwatch: Reality TV in the natural world
How remote cameras – and lots of them – have transformed the working day of the wildlife cameramen on Springwatch
A variety of options... from the new VariCam
It’s been a long wait but then two brand new VariCams come along at the same time – introducing the VariCam 35 and HS
High-speed developments in slow and ultra motion
The pace of progress in this technology has been anything but slow! Mel Noonan gives an overview of slow-mo developments for sport
GTC sponsor companies
The GTC is grateful to all its sponsor companies for their ongoing support
A 4K toolkit
Along with other camera manufacturers, GTC sponsors Sony are gearing up for the 4K revolution and have put together quite a ‘toolkit’
A shooting star: the Sony A7S
A cracking contraption: the new Gimbal Vest from Easyrig
The Easyrig team have further expanded their back-saving range to include a new vest that supports the latest gimbal rigs
Still Game – Transforming a TV sitcom into an arena event
What does it take to engage a stadium audience of 10,000 with a live sitcom recording? Michael Hines, director of Still Game Live, explains
Codex – A smooth production flow
If you’re not familiar with the name, it’s time to get acquainted with this British company whose products are leading the world
The last bastion
GTC member Paul Francis explains the trials of reporting an international news story when circumstances are against you
Don’t stay in the dark about the stills/video hybrid camera that’s creating a stir – GTC member Mark Langton reviews the Sony A7S
Take 2: a second chance at life
GTC member Mark Print – or Pronto to his friends and colleagues – tells his remarkable story of survival in the face of a life-threatening illness
How and when might we see UHD distribution to the home?
Amidst all the industry hype surrounding 4K, there are some practical issues we should be aware of: Dr Bill Garrett enlightens us
The Bill Vinten GTC University Awards 2015
Entries are open for this year’s Bill Vinten GTC University Awards – two of last year’s winners explain why this is such a great opportunity
Spatial movie production – a glimpse into the future
Want to know what’s at the cutting edge in imaging technology? Zerb has the latest news from Fraunhofer Alliance Digital Cinema
CONTACT ZERB
Guest Editor: Paul Mellon | Managing Editor: Alison Chapman | Tel: 07976 938784 | Email: firstname.lastname@example.org
Advertising Manager: James French 07855 743845; email@example.com
GTC Administration and Membership Enquiries: Roger Richards 0300 111 4123; firstname.lastname@example.org Briar Cottage, Holyhead Road, Llanfairpwll, Gwynedd LL61 5YX
Print: Information Press Ltd 01865 882588; www.informationpress.com
Design: Toast Design 01295 266644; www.toastdesign.co.uk
Zerb Subscriptions: Alison Chapman 07976 938784; email@example.com
Zerb App: https://itunes.apple.com/gb/app/zerb/id581339247?mt=8
Views expressed in Zerb are not necessarily those of the GTC. Zerb is always glad of feedback from readers and members of the GTC. If you have any comments or questions about Zerb, contact Alison Chapman at: firstname.lastname@example.org. Zerb is published twice a year. The magazine is free to GTC members and is also available on subscription: UK £5.50 per issue; overseas £8.50 (or equivalent in own currency) per issue. Both inland and overseas subscription rates include post and packing. Contact Alison Chapman for information on subscriptions to Zerb at: email@example.com or on 07976 938784. Or see the Zerb app: https://itunes.apple.com/gb/app/zerb/id581339247?mt=8. No part of this publication may be reproduced or transmitted in any form, or by any means, electronic or mechanical, including photocopying, scanning, recording of any form or by any information storage and retrieval systems without permission in writing from the Managing Editor. The GTC logo is a registered trademark. Zerb copyright 2015. All rights reserved.
Editorial by Paul Mellon
A warm welcome to Issue 81 of Zerb, which has a ‘touch of tartan’ running through it by virtue of my being based in Scotland. Fifteen years and 29 magazines have passed since my previous attempt at being guest editor. In that time the variety of means by which the pictures we create can be viewed has changed for ever. This edition coincides with the 10th anniversary of YouTube, founded 12 years after the first live streaming of video online. Yet the quest for ever higher picture definition continues apace. If 4K or ultra high definition workflows do indeed form the next chapter in the development of our industry, then it’s fair to say the distribution chain is at a crossroads for the major broadcasters; a subject very eloquently addressed in this magazine by Dr Bill Garrett.

In mulling over what to write here, what has struck me most about our roles now is the incredible variety of what we do as GTC members and sponsors, in terms of both equipment used and as practitioners in different branches of filming. I hope there is a happy balance of technical and human interest across the articles for all of you. We have contributions from two of the UK’s leading drama cinematographers; sports, wildlife and news camera operators; directors; engineers; manufacturers; equipment reviews; news and updates. May I express my most sincere thanks to everyone who has supported and contributed to all the various subjects. You have, I hope, made this issue as relevant to as large a cross-section of the readership as possible, and I am grateful to you all for your time and effort.

The GTC’s publications are the product of dedicated and selfless input by a very small number of working professionals and I would urge you to get involved in whatever way you can, to help with these endeavours. Whether it’s a single photo or a complete article, it all helps. The subject waters are wide, so please do remember that Zerb and GTC In Focus would like to hear about your work.
The smartphone and tablet age in which we now live means the public expect to receive pictures and information immediately from almost anywhere in the world, usually without much consideration as to how this is achieved. News operators are now often responsible for arranging, filming, editing and sending the stories they work on. The BBC’s Paul Francis writes for us about his work in Afghanistan as he and the last international troops left Camp Bastion. I can only express my admiration for what he does and the understated way in which he describes the ‘difficulties’. My respect to Paul and to TV crews working on news around the world. Take care people… wherever you may be. It remains for me to thank James French and Alison Chapman for the power of work they have put into this issue of Zerb, and in keeping the GTC at the forefront of our craft. The Ryder Cup article could not have happened otherwise, and I’m hugely grateful to Ali for battling against flu in January to complete everything as the delivery deadline loomed. One story I would encourage everyone to read is from Mark Print, or ‘Pronto’ as he is known to many. It is something I hope none of you will ever have to contemplate, but I am indebted to Mark and Samantha for sharing their story with us. Here’s to many more updates of that T-shirt!
It has been a pleasure to be part of this edition and I hope you will consider being involved in future ones. I would like to dedicate this Zerb to the memory of BBC Scotland Camera Supervisor and GTC member Alistair W Hendrie, and to Martin Singleton, one of the founders of our freelance sector, both of whom sadly passed away last year. Fine cameramen both, and the finest of people, from whom I learned much.
Alistair W Hendrie
Fact File Paul Mellon began working professionally as a camera assistant in 1988. After 12 years as a staff cameraman at a facilities house, he went freelance in 2002. He works as a Lighting Cameraman, and on multi-camera in studio and OBs. Recent career highlights include covering Rowing & Canoe/Kayaking at the London 2012 Olympics, various events at the 2014 Commonwealth Games, and Still Game Live, which you can read about in the magazine. He lives in Glasgow, and has been a GTC member since the mid-nineties. Contact Paul on: firstname.lastname@example.org
To join the GTC please go to: www.gtc.org.uk/join-the-gtc.aspx
Spring 2015 ZERB
OUTLANDER
Drama in the Scottish Highlands
On 6 March 1988, a professor of scientific computation began writing a novel, simply to learn ‘what it takes’. Diana Gabaldon’s writing experiment was published in 1991, the first of a series of eight novels that have gone on to sell more than 25 million copies worldwide. American cable and satellite network Starz commissioned an adaptation of 16 TV episodes, and work commenced on one of the biggest productions ever filmed in Scotland in September 2013. With plot lines and appeal compared to those of Game of Thrones, the airing of the first series in the US last autumn has already led to a surge in tourism to the Scottish Highlands. GTC member Andrei Austin shares his experiences of working as a camera operator on this major television series.
Outlander plot summary A mixture of historical drama, romance, science fiction and swashbuckling adventure, Outlander follows the fortunes of Claire Randall, an army nurse at the end of World War II, who encounters a druid ritual and is transported back in time to the Highlands of Scotland in 1743 as war rages between Scottish clansmen and the English.
A long contract
I was already known to the Outlander production company Left Bank Pictures because I had filmed in Hungary earlier in the year for their series Strike Back, shown on Sky. As soon as I heard I’d been selected as one of two camera operators for Outlander, I got hold of the book, which I always do as prep so that I will be familiar with the storyline. Often the book is heavily adapted but at least it gives you a ‘heads up’ so that you can hit the ground running from the off. Initially my contract was from September until just before Christmas. I would be working with David Higgs BSC as my DoP. To start with we were both contracted to do just one block, which is not uncommon on long runs. In the event though, they asked me to come back; I asked how long for and they said “All of it!”. I immediately had to rethink what I was about to do for the whole of 2014. We are all hostages to fortune… your whole month, day or even year can change with one conversation. However, this was a very well put-together production so I was happy to sign up.
Equipment
As on many dramas these days, we shot on the ALEXA, with two complete camera channels paired with a complete set of Cooke S4i lenses, from 14mm right up to 135mm. For zooms, we had a 15.5–45mm Fuji Alura plus a couple of 18–80mm Fuji Aluras and a 45–250mm Fuji. When we had a third camera available, this would often be used with the 45–250mm to ‘fish for shots’ or to pick up a specific shot at 250mm, while fellow operator Ossie McLean ACO and I covered the other shots.

Typically we would have the master camera on the dolly. The B camera might be on a slider and we carried a complement of these, from 3ft to 8ft. Set-ups might involve a dolly shot, Steadicam and/or crane shot (either Technocrane or jib) – or it could be a MoVI rig; we used anything and everything you can imagine! The most common set-ups, however, were on a Peewee or Hybrid dolly. Originally a lot of handheld was planned but that didn’t transpire in the end. Things evolve and some shots require a more stable studio kind of camera set-up, some Steadicam, some handheld, others the MoVI… it’s all contextual. You can’t always prescribe how the camera is going to move.

We had two Steadicams available to us all the time, as Ossie McLean and I are both Steadicam operators. There were about 18 days in total, out of 203, when we had two Steadicams on the same scene at the same time, so the Steadicam was used quite a lot. We had loads of Technocrane days and a couple of drone shots as well. We had the full gamut of kit!
Looks and LUTs
Before block one we had about three weeks’ testing and various Look Up Tables (LUTs) were designed. Even though there might be a LUT applied to the camera, that LUT is ‘non-destructive’ so it’s not a permanent change to the colour scheme of the camera. Even if the LUT is applied during the grade, you don’t have to use it because it’s not ‘burnt in’ to the signal. Throughout the year we were sent various LUTs for different situations and applied these during the filming, knowing they could be scrubbed and a different LUT applied. However, that would be an extreme example of what might happen: if you are using a camera platform like the ALEXA, which is perfectly suited to this sort of shoot, there’s no point in designing LUTs that are completely different, so the changes were very subtle.

We used the LUTs all the time on set. I have my camera set up so that I can view either the LUT or the Log C image via a selectable button. So, if I am operating on something that is, say, very dark and low key, where you can’t (with the LUT applied) see into the shadows or detail in the highlight areas, I will flick the switch temporarily over to the Log C image. This allows me to see what the camera will actually be recording and to spot any ‘gotchas’ like a flag stand or tape mark in the shadows, or a light stand in a highlight through a window. On low-key scenes, I might only look at the Log C picture, whereas on others I will flick over to the LUT, whether that’s Rec.709 or whatever is applied. If I have an idea of what the director and DoP are aiming for in terms of mood, I can then adapt my operating accordingly.
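As background to why an ungraded Log C picture looks so flat next to a Rec.709 viewing LUT, here is a sketch of the ARRI Log C (v3, EI 800) transfer curve. The constants are the parameters ARRI publishes for this encoding; treat the snippet as an illustration to check against ARRI's own documentation, not anything from the Outlander workflow.

```python
import math

# Published ARRI Log C (v3) parameters for EI 800.
A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537
E, F, CUT = 5.367655, 0.092809, 0.010591

def linear_to_logc(x: float) -> float:
    """Encode scene-linear reflectance (0.18 = mid grey) to a Log C code value."""
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F

def logc_to_linear(t: float) -> float:
    """Decode a Log C code value back to scene-linear reflectance."""
    if t > E * CUT + F:
        return (10 ** ((t - D) / C) - B) / A
    return (t - F) / E

if __name__ == "__main__":
    # Mid grey (18% reflectance) lands at roughly 39% signal level, with
    # highlights squeezed into the top of the range - hence the washed-out
    # look until a viewing LUT maps it to Rec.709 for the monitor.
    print(round(linear_to_logc(0.18), 3))  # ~0.391
```

The point of the log curve is exactly what Andrei exploits on set: the shadow and highlight detail is all still in the recorded signal, and the LUT is only a reversible view onto it.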
Changing directors and DoPs
On this production the director and DoP changed from block to block while the operators and other crew members remained. This is a common pattern on long-running series, certainly in the US and some parts of Europe. The continuity of the crew is retained and the director, who has worked out and planned a particular storyline or script, comes along and ‘plugs in’ to the crew. Each DoP has their own viewpoint; everybody sees the world individually and this is no different for DoPs. The way in which you are required to frame, operate and move the camera can vary enormously. So, for instance, Neville Kidd, who has done lots of Sherlocks and programmes like that, is very innovative in terms of camera mounts and has built some of his own camera rigs. Then you have David Higgs who loves Steadicam and handheld. Martin Fuhrer, on the other hand, is not a big fan of Steadicam and prefers the camera to be in studio mode for structured shots. So it changes depending on both the director’s storytelling and the way in which the DoP interprets this. Director Anna Foerster, who did blocks five and eight, loves wide-angle lenses, so you might do a wide master on a 21mm lens or even wider, whereas Martin Fuhrer (DoP on block six) doesn’t like wide-angles so might do the wide on a 50mm lens. There can be huge differences, but all equally valid.
Technical rehearsal with the crew looking on
American influences
On American productions, the producer will be on set all the time so there is instant feedback if a line needs to be changed or differently emphasised. They can and will whisper in the director’s ear and suggest things. If the director has a query, then an instant response is available and there is that ‘steady hand on the tiller’. It’s not always necessary, but can be reassuring to have that person there. Outlander was probably one of the best set-up TV productions I’ve worked on. They gave themselves time. We shot a number of wardrobe and hair colour tests that were sent to the relevant people in the UK and US to make sure everybody was happy with what they would be getting. We also shot tests for key scenes; for instance, there was a scene involving dancers around some ‘Magic Stones’. These were fibre-glass standing stones that would be placed on location but we also had them available to us in advance in a studio. We needed to know whether the dance in and around the stones could be contained in our high-angle shots, so I did the maths and worked out what height the crane would need to be and what lenses would be required and at what angles, to capture the dance they had been rehearsing for weeks and weeks when we arrived on location in the Highlands. It would be disastrous to go all the way up there and discover we couldn’t get the shot. So, we put the stones up as they would be situated, and shot a test to make sure it would work. The testing wasn’t always this extensive but it was certainly helpful to test some scenes – especially the special builds – to guard against any surprises when we came to do it for real. An awful lot of planning goes into making stuff like this work and you can’t just turn up on the day and hope for the best.
I didn’t find this shoot any more or less intense than other big productions I’ve worked on. You have to nail it. If you’ve got 30 horses running down a track and it takes 15 minutes to reset, you’ve got to be switched on. All those people – wardrobe, make-up, lighting – they’ve all worked their socks off; if you don’t see their work, it all counts for nothing. On Outlander there were amazing sets. One in particular, Castle Leoch, was huge, complete with flagstones, corridors and something like 800 candles, all of which had to be lit and tended, as well as gas chandeliers with gas piped in. All this had to be set up and all those candles extinguished between takes because of the heat. Even though we had proper ventilation, filtration and air conditioning, you don’t want that many burning sources when you’re doing a reset or relight. So, you have to get organised and get your shots right. You don’t want to be wasting people’s time.
The Taurus vehicle with telescopic crane mounted – Andrei is operating from the trailer behind
Subtlety in SFX
The majority of special effects used were to ‘enhance’ the scenery, as much of the story takes place in the Highlands and, while we were in Scotland, we weren’t necessarily up in the mountains all the time. Also, sometimes there were vistas we needed with perhaps a castle in the background where there was none, and this was added in by Special Effects (SFX). This was very subtly and intelligently used, and absolutely brilliantly executed.

Another great use was for the character of Colum, the Laird, who in the story has ‘Toulouse Lautrec syndrome’ with very malformed legs. For the scenes in which his legs can be seen, the actor would adopt a gait that made him look malformed, and he would wear special green socks with fiducial marks on them so that SFX could track the movement of his legs and replace them with disfigured ones. This was absolutely brilliantly done. These weren’t static shots either; we made no compromises in our camera movement and it is testament to the skill of the SFX team that it looks seamless.

Some of the SFX shots were storyboarded but not all. A typical scene was when we shot in the village of Falkland, where Claire and her husband arrive at their second honeymoon guest house and are walking across a street. We wanted to show the hills and mountains in the deep background but there were none. The shot was done on a jib and framed very low to the ground with lots of headroom so that we could insert the hills and mountains into the background. The visual storytelling was achieved in that one shot and the audience knows from it that they will be going up into those hills in the deep background.
Gripping stuff
To enable us to use the Technocrane on the rough terrain, we mounted it on a Taurus vehicle, owned by Bickers. This combination was absolutely brilliant and is something I would use again because it allows you to position the base of the Technocrane, or any crane for that matter, anywhere you want. We went up some ridiculous slopes with the Technocrane. Otherwise we would have had to use a crane that could be broken down, but you pay a time penalty for that and it means bringing in a crew early. With the Taurus you could just drive it into position. The Technocrane is such a useful device for getting reach and height, and for achieving camera movement over rough terrain without laying decking and tracks. If you can get a Technocrane out there, you’ve got a lot of movement available. The grip Tim Critchell was fantastic.
Tim Critchell: “On the recce, standing at the foot of the hillock, they told me this was where the Standing Stones would be and that we would need telescopic crane movement. From my experience, and that of the DoP, we knew straight away that this location, with fairly steep banking all around and multiple positions required, would need the Taurus. We mounted a Moviebird 44 telescopic crane onto the vehicle and technician Andy Thomson found a viable route and drove it straight up. Once we were there, repositioning the crane only took about 10 to 15 minutes. We also used the Taurus with 30’ and 50’ Technocranes in other locations.”
The Camera Team
A-Camera:
Camera/Steadicam Operator: Andrei Austin (associate BSC, ACO)
1st AC: Anna Benbow/James Harrison
2nd AC: Erin Currie
Camera Trainee: Scott McIntyre/John Young
Key Grip: Tim Critchell
Assistant Grip: Jon McCormick
B-Camera:
Camera/Steadicam Operator: Ossie McLean (ACO)
1st AC: Luke Coulter
2nd AC: Chris Maxwell
Camera Trainee: Daniel Hill/James Hogarth
Grip: Cassius McCabe
Assistant Grip: Henry Stone/Ronan Devlin
Andrei’s apps I have a suite of software that I use to help me on productions. I use pCAM, Sun Scout, Shot Designer, RoomScan and Artemis. When the director is on set with the actors blocking, I will fire up Shot Designer on my iPad and plot the actors’ movements. When it comes to showing the crew, I’ve already got a head start; I know where the actors are going to be and can start plotting camera positions and even lenses with the DoP and director, straight off my iPad. It’s a really useful tool. With Artemis, which is absolutely accurate, once you’ve programmed in the camera information and suite of lenses you’re using, you can jump to a virtual lens and angle of view, and plot your shot with a high degree of accuracy. This works for any shots but is particularly useful for dolly shots. Once you’ve committed to a particular ‘track lay’ you don’t want to have to rip it up. For a long time now I’ve used either a director’s finder, or now Artemis, to very accurately plot the track, checking where it needs to be for a particular focal length, angle of view and height. The grip is there as you discuss it with the director and can mark it on the floor. Height is vitally important as this determines which camera mount you need. By the time the artists come back on set you are 90% there. With a director’s finder, only one of you can be looking through it at any one time so, in order to agree a shot, you have to hand it to the DoP and/or director. With the shot on your iPad,
everyone can see at the same time and hopefully agree it. If you’re not in agreement though, you can take a screenshot and say: “Look, there’s an angle over there… this is what it would look like.” So we might then put the B camera over there or do that angle later. Software is a huge part of drama production now.

The other software I use is RoomScan. With this you can go into a particular location on a recce and very quickly get the dimensions of a room on the iPad or phone, by going around the room and tapping it on the walls. It generates a plan of the room, which means you can start to look at where things might go. You can then import that room plan into Shot Designer and plot in your actors and props etc. With Shot Designer you can actually animate moves in plan view so you get a dynamic representation of where your actors are going to be and how your cameras can move in relation to that. Again, this is a really useful tool for sharing with the rest of the crew and also going through with the director exactly what they have in mind. It saves a lot of misunderstandings on set.
Andrei in full flight with Steadicam
Filming on board a rib with the ALEXA mounted on a ‘Perfect Horizon’ stabiliser
The other thing I do, which is software related, is upload a schedule onto iCal and populate that schedule with any bits of equipment we might need, above and beyond what we normally carry. So, if we need a special lens on a particular day, or we’re doing anything out of the ordinary, I’ll note that in iCal. Then I invite everybody on the crew, as well as the production office, to share that calendar so we can all see what’s happening at any particular time. I give the office and certain members of the crew ‘read and write’ permissions. This means there are no surprises. If the crew want to print it out in month view, list view or week view, they can do so themselves.

As we go to print there is still no confirmed news of a UK broadcaster or transmission dates, but the US is gearing up for the second part of Series 1 to start airing in April.
Useful production apps
pCAM FILM + DIGITAL PRO by Thin Man Inc – £20.99
24 bundled cinematography and photography apps including: depth of field; field of view (picture sizes); angle of view; relative sizes (for different sensors); colour correction; mired shift; and much more!
Shot Designer by Hollywood Camera Work – Free
Includes: Camera design – speeds up camera diagram creation; Animation – animate characters and cameras in real-time; Shot list – writes itself while you work; Director’s viewfinder/storyboards – lens-accurate camera angles; Sync and team sharing – sync scenes across devices and share with team.
Artemis Director’s Viewfinder by Chemical Wedding – £20.99
Artemis is a digital viewfinder for the iPhone and iPod Touch. It works in much the same way as a traditional director’s viewfinder. Great for location scouting and making storyboards.
RoomScan by Locometric – Free
RoomScan draws floor plans all by itself – touch each wall with your phone for dimensions.
Sun Scout by Benjohn Barnes – £6.99
Sun Scout uses your iPhone’s compass to tell you where the sun will be at a specific time.
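The depth-of-field tables that calculator apps like pCAM produce come from standard optics formulas, which are simple enough to sketch. The 0.025 mm circle of confusion below is a commonly quoted Super 35 value assumed for illustration, not anything pCAM itself specifies.

```python
import math

def hyperfocal(f_mm: float, t_stop: float, coc_mm: float = 0.025) -> float:
    """Hyperfocal distance in mm for a given focal length and stop."""
    return f_mm ** 2 / (t_stop * coc_mm) + f_mm

def dof_limits(f_mm: float, t_stop: float, focus_mm: float,
               coc_mm: float = 0.025) -> tuple[float, float]:
    """Near and far limits of acceptable focus in mm (far may be infinite
    when focused at or beyond the hyperfocal distance)."""
    h = hyperfocal(f_mm, t_stop, coc_mm)
    near = focus_mm * (h - f_mm) / (h + focus_mm - 2 * f_mm)
    far = math.inf if focus_mm >= h else focus_mm * (h - f_mm) / (h - focus_mm)
    return near, far

if __name__ == "__main__":
    # A 50mm at T2.8 focused at 3m holds roughly 2.77m to 3.27m -
    # about half a metre of usable depth for the focus puller.
    near, far = dof_limits(50, 2.8, 3000)
    print(round(near / 1000, 2), round(far / 1000, 2))
```

Running a few of these numbers before a take shows why a long lens wide open leaves so little margin, and why the 1st ACs on a show like this earn their keep.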
If you want to see more photos of Andrei and the crew in action please go to: http://instagram.com/outlander_starz
GTC member Andrei Austin started his career as a TV cameraman at the BBC, covering everything from current affairs, sport and State occasions to documentaries and commercials. Since leaving the BBC to go freelance, he has operated A-camera and Steadicam on numerous film and digital drama productions, including the Emmy award-winning BBC series Waking The Dead, Silent Witness and, of course, Starz/Sony’s Outlander. He was also DoP on indie films Wayland’s Song and His Heavy Heart. Andrei is an associate member of the BSC and also a member of the Association of Camera Operators (ACO). Website: www.andreiaustin.com
Tim Critchell worked for Axis Films in Glasgow, before going freelance as a grip in 2003. He works on a variety of commercials, features and drama productions.
The Ryder Cup
We must do this again
Televising The Ryder Cup 2014
Ninety-four years ago, the first unofficial golfing contest between Great Britain and the United States was held at Gleneagles. Five years later, in 1926, a spectator at the second challenge at Wentworth enthused in the clubhouse bar afterwards: “We must do this again!” His name was Samuel Ryder, and the inaugural competition bearing his name took place the following year. The Ryder Cup visited the famous Perthshire resort in the autumn of 2014, for the latest chapter in the history of a tournament that has become one of the most tense and emotional dramas in sport. With a global audience watching, excellence in the television coverage is the minimum expectation, so what is involved in bringing this giant of the sporting calendar to our screens? Camera supervisors Keith Gibson and Rick Fox, and technical manager Hamish Greig share their insights with Zerb…
In 2014 the ‘World Feed’ coverage of The Ryder Cup was provided by European Tour Productions and facilitated by CTV Outside Broadcasts. In addition, CTV serviced the needs of other broadcasters, including the BBC, the Golf Channel and TNT, along with facilities to other organisations connected to the event. Television rights in Europe and the United States are held by Sky and NBC respectively, and outside broadcast companies Telegenic and Visions looked after their specific requirements on site. Visions worked with the Americans, while Telegenic provided the facilities for Sky, which included studio presentation cameras, some RF cameras and, notably, 4K coverage from cameras on some of the holes.
A giant replica of the famous trophy stands guard at The Harris Pavilion
For CTV technical director Hamish Greig, however, discussions had begun much, much earlier. “One of the most challenging aspects of the Gleneagles Ryder Cup was to engineer all the required cabling and RF prerequisites when you are situated in a compound that is just under 2km away from the 1st tee (T) and across public roads. For the cabling, over winter, we had three main fibre pipes installed to three locations on the course – 592 cores to the 1st fairway (F) hub and 192 cores to 6th green (G) and 16th fairway hubs respectively. From these hubs we further extended cores over ground to create camera, sound and RF hubs as required at 3T, 9G, 14G and 18F. At the hubs we installed our own purpose-built camera TEDs (camera interface over two fibres), of which we used over 110 to service CTV clients’ camera cables, plus six audio Hydra systems for all our on-course stereo FX mics.”
For GTC member Keith Gibson, CTV’s camera supervisor, involvement with the event began around the time of The Open Golf Championship at Hoylake in July. Planning by the producer, director and unit managers had started some time before, which included on-site meetings with the NBC director to determine camera positions and tower heights. “My involvement started when I received a camera plan from Jim Storey, the World Feed Director. I didn’t start work on it until after The Open, but then transferred the information from Jim’s plan onto the CTV camera sheet format normally used for the golf throughout the year. This format combines the necessary production and technical information relevant to each camera.” Meanwhile across the Atlantic, Rick Fox, senior cameraman and ‘FAX TD’ (facilities technical director) at NBC Golf, was similarly preoccupied with camera plans. “Although I don’t do the surveys of the golf courses and get involved with the actual planning of the camera positions (that is all handled during extensive surveys by our executive producer Tommy Roy and director Doug Grabert), I do get involved from that point on with the actual implementation of the setup.”
Multi-camera on a grand scale!
Keith Gibson had 53 CTV cameras on the course, broken down as follows:
• 6 RF cameras with 22x lenses
• 1 Xmo (RF camera) with 22x lens
• 1 Steadicam (RF camera) with 14x lens
• 1 GF-9 Jib (cabled) with 14x lens
• 44 cabled cameras:
 - 22 box lenses, mainly 86x
 - 16 40x lenses
 - six 22x and two 11x lenses
Keith Gibson explains: “A couple of the tee cameras had the option of a 22x or an 11x, which is why the total number of lenses doesn’t tally with the number of cameras. We had big lens cameras on towers as the main coverage at the greens, and big lenses on the four crane cameras (cherry pickers). The 40x lenses were used on fairways and low cameras on the greens, with most of these commuting between positions. The 22x and 11x lenses were mainly used looking from behind various tees, with one 22x camera starting at the practice putting green, followed by two fairway positions. The tee cameras were used in conjunction with the Pro-Tracer system (www.protracer.se). The Pro-Tracer cameras were mounted on specially fabricated vertical scaffold tubes with a pod adaptor and leveller. The Sony camera plate bolted straight onto the leveller, using it to tilt the camera down. For the World Feed coverage, we had 44 camera operators, some of whom commuted between pre-rigged, fixed camera positions. In addition, we had four assistants and myself, as non-operational supervisor. I was assisted by two other supervisors: Phil Gilbride looked after the RF cameras and Dave Matthews was responsible for the cranes. The six RF cameras were double-crewed on the first two days, which are particularly long, with two rounds a day – one each of the foursomes and fourballs. Six of the RF cameramen moved to cabled cameras for the singles matches on the Sunday.” For the US coverage, Rick Fox’s NBC crew had operators at all the greens sharing the towers with the World Feed cameras,
plus a number of fairway and low camera positions. They also had their own team of RF camera operators and studio crew. The output of both teams of RF cameras was shared by NBC and the World Feed, but the Perthshire countryside isn’t the ideal environment to provide uninterrupted signals to the worldwide audience. As Keith Gibson explains: “The Ryder Cup is normally on an inland course, not a seaside links course. The surrounding countryside at Gleneagles made a great backdrop to the event but, beautiful as it is, the hills and hollows are not very RF friendly! The course covers a very large area, which made getting around very time-consuming, compounded by the enormous crowds during the event. We had to have a number of ‘receive and transmit’ points for the RF cameras and radio talkback.” Hamish Greig elaborates: “RF was a huge challenge. We had four main RF cranes for a combination of either camera receives, talkback transmit and receive, or high-power radio mic reception for 100% course coverage. We had a 44m crane at 6G, a 72m crane at 3T, a 72m crane at 16F and a 56m crane in the TV compound. For CTV’s clients we provided:
• 39 duplex radio talkback channels plus 340 radios
• 33 high-power radio mic kits for either commentator use or FX mic coverage
• 14 radio cameras.”
Yep … keep track of that lot at the derig (or ‘teardown’ as our colleagues across the pond would say)!
Differing styles
From the camera operators’ point of view, the foursomes and fourball formats on the first two days of competition don’t make a huge difference to the way in which play is covered compared to the more familiar singles golf format. Matchplay, however, means that putts can be conceded, so
IMAGES: 1 Guy Chadwick using a HJ40 lens 2 Russell Dawson at work in the sub mix gallery 3 16th Fairway: Tigger Gray (left) at his first Ryder Cup and Andrew MaClenaghan 4 Graham Keyte’s Gator with GF-9 Jib in hi-vis travel mode 5 12-channel camera hub with audio Hydra interface above 6 The production gallery hot seats 7 Steadicam RF crew – L to R: assistant Chris Crowe, operator Phil Walker and rigger Jim Ritchie
it is important to keep an eye on all the players as well as covering the player putting on the green, in case a ‘game-changing’ moment occurs. While the approach to covering golf is very similar in Europe and the USA, small differences are apparent. Rick Fox explains: “There are some differences in the coverage between the European channels and the American network (NBC). As an NBC cameraman, and especially when we travel overseas, we tend to take fewer breaks than our European counterparts, and we normally do two cameras each due to the smaller field of golfers. For example, in Scotland this past year, I covered holes 9 and 18. We tend to take our breaks while transitioning from hole to hole, if possible. “As for the coverage, I have noticed over the years that the European cameramen tend to follow the ball in flight by zooming out – and then zoom into the ball after it lands. In the USA we tend to zoom in to the ball in flight and zoom
Rick Fox: “It was a very quiet and reserved event from our point of view. We all commented on how respectful and nice the spectators were to all of us. While talking to a policeman one day, he asked me what I did back in America. I explained that I was a sports camera operator for a television network in New York City and told him how I travelled the world covering various sporting events, including the Olympics. He then told me I was ‘Leaving the leaf of railey’. It took me a few moments, but I finally realised he meant I was ‘Living the life of Reilly’... I’m sure he got a kick out of my Philadelphia accent too!”
1 Above the 17th green 2 It wasn’t just the spectators showing their support - this for the European side 3 … and here for the Americans 4 Overnight accommodation or POW camp? 5 Actually the Snoozeboxes were very comfortable and great for early starts
Snoozebox vs Hilton
out to show the relationship of the ball to the pin once the ball lands. 4K is not currently a huge part of any golf coverage that I know of in the USA. I was first introduced to the quality of 4K pictures while covering the 2014 Winter Olympics in Sochi, Russia, earlier this year. I am sure, however, that 4K will become increasingly popular as the ‘next big thing’ in the quality of TV coverage and will become more prevalent as time goes on.”
Keeping track of everything
Managing such large quantities of camera and other equipment is clearly a major undertaking, so how do the supervisors keep track of everything, including breakdowns, swap-outs and all the daily adjustments to plans that keep the coverage going? Keith Gibson explains his method: “The camera department had a portacabin for an office next to a marquee-type tent, about 12m square. The CTV Head of Cameras, Dave White, was the mainstay looking after and organising the kit, both with the prep at CTV’s base and on site at Gleneagles. We were able to put together individual camera kits in the tent prior to taking them out onto the course, all the big lenses being rigged in position for the duration. The other cabled cameras were assembled in the tent and went out onto the course daily, being stored overnight in the tent. On the Sunday, all the cameras came back to the tent before being loaded onto various trucks for transport to the next golf job, the Dunhill Cup, which would be held across the three courses of St Andrews, Carnoustie and Kingsbarns. One truck with the surplus equipment returned to CTV’s base. Everything came off the course on the Sunday evening, but some of the truck loading was left until the Monday morning.”
There is, of course, much more to working on an event of the Ryder Cup’s magnitude than just filming the protagonists on the fairways and greens. It’s fair to say that members of the opposing continents’ crews had a rather different experience of staying in Scotland. Most of the UK crew stayed at the Premier Inn in Stirling, but those who were there for the rig and the RF camera crew stayed in Snoozeboxes, an ingenious ‘portable hotel’ that is erected on site (www.snoozebox.com/home/index.html#welcome).
Ed Nash: “I was filming for PGA Turner. I’ve worked on golf regularly, but never with an atmosphere like this – it was truly electric. On the first morning, out of the haze, a terrified deer bolted past us, up the 1st fairway to the tee, where the golfers were waiting. A wave of noise at the shock of seeing this gave way to cheering and clapping. A magical moment to kick off this amazing event! It ended just as memorably. The Ryder Cup is a ‘free-for-all’ at its conclusion. We’d just interviewed Jamie Donaldson about his winning wedge shot, with players still hitting up to the green, when everyone suddenly backed away, leaving me alone. I didn’t see the shot, but I knew it was headed straight for me. The ball landed, luckily about a metre away. Chaos began. The media rushed in and suddenly I was getting soaked with bubbly, camera held above my head, right at the heart of the scrum. It was fantastic madness and I was so glad to be involved. I’ve done a lot of memorable jobs, but the Ryder Cup is definitely up there with the best of them.”
IMAGES: 1 Some of the 2014 production team 2 Crew receive their planning sheets in the equipment marquee
3 Dawn breaks over the pavilions
4 The peace of the Perthshire countryside … soon to be shattered by thousands of exuberant spectators 5 The galleries around every green were packed 6 A beautiful sight to greet the Snoozeboxers
Keith Gibson: “With the early starts and long days, staying next to the compound was a huge advantage. It was good for me as I was able to go out on the course before breakfast on a couple of mornings to check things while it was still very quiet. It was lovely to see the sun rise over the hills around the course, in what, at that time of the day, were very peaceful surroundings.” Meanwhile, the US crew were being bussed into Glasgow each night. Rick Fox describes this: “Our accommodation was actually quite good, with no complaints from the crew as far as I know. We stayed at the Hilton in the centre of Glasgow and found it to be very nice from beginning to end. Although it sounds like it had some advantages, the Snoozebox option wasn’t available to us due to union and company regulations.”
Established friendships
Since the introduction of European golfers to the tournament in 1979, The Ryder Cup has become a treasure for the viewing public on both sides of the Atlantic. It is also a special time for the camera crews, as Rick Fox illustrates: “The very best part of the entire trip for me was running into old friends that I only see every two years or so. Some of the European broadcasters are old friends of mine that I have known since about 1992. Keith Gibson tops my list, but there are others: Nick, Badger, Peter, Roger, Steve – all only first names – but when you’ve known old friends like these for almost 25 years, first names are all you need.”
Fact File
Keith Gibson: Freelance for about 35 years, having started in the business at Television Centre followed by BBC Outside Broadcasts.
Rick Fox: After 14 years working for a small TV station in Philadelphia, Rick joined NBC in New York as a cameraman in 1984, working predominantly in the studio/field division. Although he doesn’t play golf, over the last 30 years he has operated in every camera position on the course, and is NBC’s Facilities Technical Director for golf coverage.
Hamish Greig: In 1986, after 11 years working abroad, Hamish Greig joined CTV (Carlton), and four years later took over technical management of large projects such as the Barcelona, Lillehammer, Athens and Torino Olympics. In 1992 he pioneered the first CTV flypacks for European Tour productions of golf. Since 1996 Hamish has been Director of Engineering for CTV’s OB fleet, overseeing builds, budgeting, engineering operations, projects and the technical crewing of CTV’s fleet and business.
An Interview with Nick Dance BSC
Director of Photography Nick Dance had a great year in 2014. He was busy throughout: firstly shooting the well-received BBC1 series Our Girl (nominated for a GTC Award for Excellence), then a feature, Dartmoor Killing, a psychological thriller set in Devon – and lastly a couple of episodes of the forthcoming BBC1 crime series The Interceptor. The icing on the cake was the richly deserved news that Nick has been accepted into the BSC. Zerb Managing Editor Alison Chapman chatted to him about his recent work and approach to shooting drama.
What is your current preferred camera for shooting?
I generally go for the ALEXA; I haven’t used the AMIRA yet. We were offered one by Films at 59 in Bristol, who did the pre- and post-production on Dartmoor Killing, but with only one day off between Our Girl and starting the film, there wasn’t time to do tests, so I didn’t feel comfortable using it at that time. For Our Girl we had a couple of ALEXAs from Take 2 in Cape Town with Cooke primes and Angénieux Optimo lightweight zooms. On The Interceptor I used the RED Dragon. I didn’t initiate that series; there were four DoPs doing two blocks each and I did the last block of two episodes. The first DoP and director chose to use the RED. They wanted to go for a different look, which included shooting in a 2.40:1 ratio (unusual for a BBC1 show), so it will transmit in a letterbox format. We shot 5K on the RED, but ultimately it’ll be down-converted to 1080 for broadcast. I find the viewfinder on the ALEXA far better for lighting through than the RED’s; it’s bigger and a truer image. On The Interceptor that wasn’t an issue as I had an operator, so I could use a monitor, but on Our Girl I operated A camera,
mostly handheld, plus we had a B camera mainly on Steadicam, so I needed to be able to gauge the lighting and exposure through the viewfinder, often for both cameras simultaneously. With the ALEXA, you can monitor the B camera by switching feeds in the viewfinder and we did this wirelessly, a very useful feature – I’m not sure what I would have done without it. I also prefer the ALEXA for handheld. With the RED you need a rig and it’s hard to get everything in quite the right place. Although the ARRI starts out heavier, by the time you’ve bolted everything onto the Dragon it actually becomes the heavier of the two and can be a bit unwieldy. Nevertheless, the pictures from the RED were very impressive – especially the dynamic range, which was proven in the grade.
Shooting Our Girl The story is set partly in Afghanistan, partly in the UK. The Afghanistan scenes were shot in South Africa. Some days we were working in 40+ degree heat, which was very tough on the actors in army uniforms. We couldn’t keep them in the gear in that heat for long, so it was almost like doing a period drama – always waiting while they got in and out of costume!
Did you try to reproduce the sunset and sunrise look?
It’s hard when you’re on a schedule. I’ve had this situation on documentaries in places like the Sahara when the director says: “We’re going to get up early for the sunrise, shoot for a couple of hours and then go back to the hotel. We’ll have a break and go out again to catch the evening light.” In practice, there’s never time for the hotel break and you end up still shooting at midday!
Actually, for Our Girl, we found the harsh midday light helped. Normally you would silk it down and try to soften it when shooting actors, but we actually wanted the harshness to show the heat of Afghanistan, so we shot right through the day. We would schedule scenes that were supposed to be early morning or late afternoon at the beginning or end of the day, and there were some pre-dawn scenes so we’d get to location at 4:00 or 5:00am, well before sunrise. Most of the time the actors had helmets on, which would shade their faces, so you didn’t have the harsh light directly on the face but still felt the heat; that was the look we wanted.
Our Girl was shot very much in a documentary style. The first director, Anthony Philipson, like me, had come from documentaries and he wanted to shoot it that way. In prep we watched several Afghanistan war documentaries, including the award-winning Hell and Back Again. This was made by a photojournalist who was out there to take stills but decided to shoot some video on his DSLR. There was some great footage, with lots of shots taken at sunrise and sunset with wonderful golden light. This is what we wanted to try to achieve. I mentioned this to our ex-Army military advisor and he explained: “The reason the light’s so good is that the Taliban don’t go out in the midday sun. It’s far too hot to fight – early morning and late afternoon are much more comfortable!”
How did the ALEXA fare in the heat?
The ALEXA was incredible. It got very, very hot – you couldn’t touch the camera at one point; it was like touching an iron – but it just kept going. We would try to shade it but it wasn’t really practical when we were handheld. We never had an issue with the cameras going down. Another problem was shooting scenes involving helicopters because of sand blowing around. You couldn’t put any covers on the cameras because they would just get blown off. At one point we could hear sand rattling around in the fan – not great for the camera – but again it just soldiered on. The only real casualty was a filter; we had an optical flat that turned into a very heavy Pro-Mist because it was literally sandblasted. It was like frosted glass in the end.
Feeling the heat in the Med tent on Our Girl
Shooting outside the Forward Operations Base – much of Our Girl was shot handheld
We had wonderful support from Take 2 in Cape Town. There’s an incredible infrastructure down there, with a great choice of kit and amazing crews. Generally they’re shooting movies and commercials; our gaffer went straight onto Homeland and they’re also doing a series for Starz called Black Sails. It’s busy down there; they have both the infrastructure and great crews.
You used local crew?
At my initial interview for Our Girl, I was told we had to use local crew; only HoDs would go from the UK. I was slightly nervous about this as I’d only worked in South Africa on documentaries, so not with any drama crew. However, after a bit of homework, we put together an amazing crew. Justin Hawkins is one of the best focus-pullers I’ve ever worked with. It was very tough for him because so much was handheld and everything was moving – both camera and actors. The whole point of having a large sensor is to go for a shallower
depth of field, to help focus the audience’s attention and allow them to be more intimate with the characters, so letting the background fall away. We also often used longer lenses, so I really don’t know how Justin did it – the focus was always absolutely spot on. He would often get a big hug from the director! The grips were great too; nothing was too much trouble. Here, we struggle sometimes if we want to get out a 12x12 or especially a 20x20 silk. To be fair to grips here, it’s often because of the lack of manpower, but in South Africa it never seemed to be an issue. The gaffer would say, “Do you want the 20x20?” and I’d reply, “I’d love it but it’s far too windy isn’t it?” “Don’t worry, it’s fine,” he’d say. They’re just used to working in those conditions!
Tell us about your approach to lighting
My approach for the lighting on Our Girl, as with the shooting style, was based on the documentary aesthetic of naturalism. This means when you go into a location you look to see where the main source of light comes from – maybe the windows or practical lights. It’s this that motivates my lighting sources and makes the drama feel real. If you’re doing fantasy, you can do what you want as it’s not meant to be real, but for Our Girl we wanted realism. Even for Dartmoor Killing, which is a psychological thriller that gets darker and darker, I still tended to base the lighting on natural sources because we wanted the audience to believe what they are seeing. This approach not only feels real but can be quite a simple way of doing it. In 10 years of shooting drama, I’ve learnt that simpler is often better. The more lights you use, the more shadows you get and it generally takes longer without necessarily getting better.
I did several series of Skins, which involved a lot of inexperienced actors who were still in sixth form. Because of that they learnt very fast and were soon very film-set savvy, but nevertheless we wanted freedom for them to move around and not to have to hit exact marks. I don’t like too much clutter on the set, so the actors have more freedom to move around.
It’s always a compromise; you want to do as well as you can photographically, but at the end of the day it’s really about the script and the performances. It doesn’t matter how beautiful it looks if it’s badly acted and the script is poor. Script and performance are king and I don’t think any department should distract from that. As soon as the audience notices something – maybe the design or a costume isn’t right, or the lighting jars – it can take them out of the moment and you lose them.
Some directors (like Kay Mellor whom I worked with on The Syndicate) don’t like to rehearse; she likes the actors to have free rein and to catch the resulting variations in performance, maybe capturing something new and surprising each time.
At the other end of the scale, other directors like lots of rehearsals and do precise set-ups, which is what we did on The Interceptor. Some directors are more theatre-orientated, so they’re primarily interested in the performance and leave me to do the camera blocking. Others are more technical and know exactly what lens they want and precisely where the camera is to go. I enjoy the variety – that’s the fun of it – but really I like directors who give you that little bit of freedom.
I always like to be on the set for the director’s rehearsal if possible. I usually sit quietly in the corner, but I’m watching and thinking about how the scene is playing out, how we are going to cover it, where I am going to put my lights, etc. To come in at the crew rehearsal stage is far too late; I can do a much better job when I’ve had that thinking time and can be ahead of the game. In TV, the schedules are so tight this is very important.
The experience I’ve gained since moving into drama has helped greatly. Perhaps to begin with I overcomplicated things – I had a truck full of lights so I felt I had to use them! Outwardly I might seem quite calm on set, but internally, especially when the clock is ticking, I’m thinking “How are we going to do this in the time and make it look good?” Previously, I might have panicked, but with experience I find I can relax and anticipate how it will work.
Faster cameras help to a certain extent. Where you might have needed an 18K lamp, it might now be a 2.5K because you don’t need as much light to get an exposure – but you still need to convey the mood of the scene, give it interest and make it feel real with lighting. If you base it on reality, the audience will believe it. We can shoot in pretty much any light condition now, but it’s not just about getting an exposure, it’s about how you enhance the emotion of the scene through composition and lighting.
The changing skies played a big part in the Dartmoor Killing shoot?
Nick and crew on one of the many trips up Sharp Tor
Yes, many scenes were filmed up on Sharp Tor. We had to climb up and down the Tor for 10 evenings because the finale of the film takes place there. Originally these scenes were scripted as night, but I pointed out to the director, my good friend and BAFTA award-winner Peter Nicholson: “That’s going to be quite a challenge shooting on the Tor for night. If you light it for night, you’re only going to be able to light a small area, especially on our budget. If you do light a small area there’s no point in going up the Tor because you’ll see nothing around it, and the whole reason we’re there is to see Dartmoor beyond. It’s an important character in itself. Otherwise you may as well shoot in a studio against a black cyc. Even Spielberg can’t afford to light an entire valley!” I came up with the idea of setting it at deep dusk so it looks very dark but you can still see the landscape behind the actors.
Night is always tricky to get right. When I first used the ALEXA, we did tests in Leeds for Sirens, a series about paramedics. The results were amazing just with street lighting and the lens wasn’t even wide open. Previously those streets would have had to be lit, with cherry pickers up etc. In an urban environment, I like all that natural light of varying colour temperatures you get from street lights and shop windows, especially if it’s raining or you have a wet down – you get wonderful colours and reflections. But once you go out of an urban environment, where’s the motivation for your light source? Well, of course, it’s the moon – and it’s always a full moon in the movies! It’s often too blue and overdone, and can look stagey and theatrical. It’s difficult, especially somewhere like a forest. I did another army series a few years ago where we put up a number of Airstar balloons, which worked pretty well because you could shoot 360 degrees and move fairly quickly. It gives a nice soft top light.
We shot Dartmoor Killing in June, over the time of the longest day. This gave us maximum twilight for the ‘night’ scenes, but even then we would be lucky
to shoot a minute of screen time per evening and we had about 15 minutes of ‘night’ to cover. The main problem with shooting a scene over 10 nights is weather continuity; it might be dry one day and wet the next. We had planned to shoot in May but I couldn’t in the end because I was still on Our Girl. However, the production moved the dates and they thanked me in the end because in May we had terrible weather (which actually worked in our favour for the UK scenes for Our Girl as it was a good contrast to the sun of Afghanistan). Otherwise we’d still be in Dartmoor shooting now! We were blessed with fine, mainly sunny weather for the whole shoot. Each evening we would start on the shadow side of the Tor so the sun didn’t spoil the illusion of twilight. Ironically, if it was a cloudy evening we could start earlier because there weren’t any sun issues. Another challenge was that one of our main actors needed to use a torch and we had a car stunt with the headlights on. We had to shoot those scenes at the very end of the twilight, otherwise they wouldn’t show up, so we had to move fast before the light completely vanished. That’s where the ALEXA is amazing – it sees into the twilight and I only had to up the ISO once. Although the final footage needed to appear dark, I didn’t want to underexpose, especially when there was plenty of light early in the evening (we started with NDs and pulled them as the light dropped). I exposed normally because I didn’t want to lose any information by stopping down.
The final light levels and colour temperature were set in the controlled environment of the grade, which was done in the Films at 59 grading theatre by colourist Tony Osborne, who did a great job. We also shot time-lapses on a Nikon D800 over two or three days, mostly sunrises and sunsets. That worked very well. We were able to catch some wonderful moody skies, adding another dimension to the film.
You used an octocopter for some shots?
The octocopter was fantastic. We used Gifford Hooper of HoverCam in Plymouth. He has won Academy awards for innovation. I used them on Time Team years ago, when it was a substantially bigger remote helicopter that could take a film camera. Now it’s much easier with smaller cameras and drones. In South Africa we’d used a drone for a fairly straightforward shot, just rising up, but for Dartmoor Killing Peter wanted to try more movement. However, we soon realised they are better going in straight lines, especially when you’re trying to choreograph something with actors and moving vehicles. Doing 360s was quite tricky. Obviously the first thing is you’ve got to hide the pilot who flies the ‘copter and the camera operator, not to mention the rest of the crew. This is not really possible because the pilot needs to have visual contact with the aircraft at all times, so he can’t hide under a bush! There are a lot of elements to get right, especially if you’re cueing actors who have to run ahead of a moving Land Rover – but it actually all worked very well in the end. We definitely got shots we couldn’t have achieved any other way. We did one shot where we started virtually on the ground, revealed two actors running down a hill, went up to see a guy walk across a farmhouse courtyard and then kept on rising to see the Tor beyond. You couldn’t have used a conventional helicopter for that because it couldn’t have started low enough and it would have been too close to the actors; it would have blown them away and been dangerous. I see drones used more and more on documentaries as well as drama these days. They really are a great tool and you can get shots that would otherwise be impossible – say, starting inside a house and flying out through a window – but I think there’s a danger of them being overused and losing their impact. We used it sparingly on Dartmoor Killing. There were three or four big shots that lift it out of the ordinary without being gimmicky and it was perfect to show the incredibly cinematic landscape of Dartmoor. The other issue is weight of course. You can’t put an ALEXA up on an octocopter, so we used a Panasonic Lumix GH4 in 4K. We hadn’t tested it, there just wasn’t time, but it cut into the ALEXA footage really well. I could slightly see the difference, but considering it’s a budget camera, it was incredible. At the end of the day, it’s a £1,000 camera compared with a £50,000 camera, so the dynamic range isn’t as good, but then the ALEXA is hard to beat for dynamic range.

Do you use filters much these days?
With the previous generation of HD cameras, the dynamic range was so limited I really don’t know how we used to cope, especially on sunny daylight exteriors. That’s the great thing about cameras like the ALEXA and RED; it’s so liberating, not only because of the fast ISO but also the amazing dynamic range. Where in the past we had to use grad filters to keep sky detail, now with RAW and
Nick and focuspuller Ben Oliver shooting on Dartmoor
Spring 2015 ZERB
Shooting scenes at ‘Camp Bastion’ on an ALEXA from Take 2 in Cape Town
even Log C, all the sky and foreground detail is captured without grads. So I’m using fewer ND and colour grads than I used to. Although it’s great to do things in camera, we have all these tools available to us now in post, and sometimes it can be quicker and more precise to do it in the grade. I do still use straight NDs to reduce the depth of field, though, and diffusion.

On Our Girl I used a White Pro-Mist to reduce the contrast, lift the shadow detail and slightly soften the image. With HD, whether it’s 2K, 4K or beyond, we’ve got all this resolution and that’s great for documentaries and wildlife where you want to see all the detail – the feathers on a bird or the scales on a snake – but you don’t necessarily want that level of detail showing up an actor’s face, warts and all! We DoPs often live or die on how we light actors. I did the pilot last year for the forthcoming Lionsgate show The Royals with Elizabeth Hurley. We did screen tests with Elizabeth, not just for lighting but also to see which diffusion worked best, and we liked the Hollywood Black Magic. It’s an in-focus diffusion; in other words it softens the skin but doesn’t make the entire image look soft. Elizabeth is amazing – she’s nearly 50 and has great skin – so we were lucky, but it’s not always the case and we have to be very conscious of how we shoot actors, especially if we want to work with them again!

The irony with HD is that we often end up degrading the image, especially on close-ups, by adding filters to make it softer and more filmic. Maybe for wides and landscapes I’ll shoot clean. I recently saw a test using a new 4K camera with brand new lenses; when it was projected my eyes were practically bleeding, it was so sharp! I’m not a Luddite, but I don’t think 4K is necessary for everything; it very much depends on the subject matter.
Have you ever owned a camera and would you now?
I once owned a 16mm ARRI SR and an Aaton XTR, but HD video and the BBC killed off 16mm film for television. I wish I’d kept them though, as I have a collection of old cameras at home, including an ARRI ST and some 8mm, 9.5mm and other 16mm cameras, so they would have ended up in my display cabinets for a bit of nostalgia! When I worked in documentary you would generally be hired with kit, so that’s why I had my own. I did also own a Beta SP camcorder – the workhorse Sony BVW400 – and I used that for the first and subsequent Time Team programmes for about eight years, an amazingly long life for a video camera.
I don’t own equipment now because in drama today it seems de rigueur to have a pantechnicon full of camera bodies, lenses and a plethora of accessories. It’s a vast investment and, with technology changing so fast, I don’t think it’s worth it. Also, the rental companies are doing such good deals for production, I really couldn’t compete. I also don’t want the responsibility if equipment fails. I’ve been in that position before and it’s a nightmare. Now I can just pick up the phone and someone else deals with it. I have enough on my plate just getting the show in the can. So for me, it’s peace of mind. It’s fine if people want to buy their own equipment but I’ve been there and done that, and this way I can always have the very latest kit. What I don’t really understand these days is the economics: we seem to have less money for rental than we had when it was just a 16mm camera kit in the back of an estate car, yet now we have about 20 times more kit in a vast truck!
Nick, thank you for sharing your experiences with Zerb.
Fact File
Nick Dance BSC
Nick shooting The Royals at Blenheim Palace
Nick Dance BSC started his career in documentaries, shooting in over 60 countries, from the deserts of the Sahara and Atacama to the pyramids and the Great Wall of China, and from flying with the Red Arrows to diving into the ocean on nuclear submarines. He made documentaries for the BBC’s QED strand about Falklands War hero Simon Weston; Monty Roberts, the horse whisperer; the Elephant Man; and many others. He was part of the team that created the C4 series Time Team and went on to shoot many of the early episodes. He has shot commercials for Saatchi & Saatchi and promos for Ridley Scott Associates, including Jamie T’s Sheila with Bob Hoskins, and visuals for the concerts used in The Chemical Brothers: Don’t Think feature film. His recent drama productions include the feature film Dartmoor Killing and The Interceptor, a new series for BBC1. Nick has worked on many BAFTA and RTS award-winning programmes including Bodies; Pompeii – The Last Day; Terry Pratchett’s Johnny and the Bomb; Nuremberg: Goering’s Last Stand; Call The Midwife and Skins – for which he was personally nominated for a BAFTA and an RTS award for Photography and Lighting, as well as receiving a GTC Award for Excellence. Other credits include Mansfield Park, The Body Farm, Death In Paradise, The Syndicate, The Paradise and Our Girl.
4K developments at Sony
With 4K the current hot talking point in the industry, many manufacturers are busy researching, developing and testing new products to meet the potential demand for this new technology – none more so than Sony, who have thrown resources not only into developing the products but also into running comprehensive training programmes at their Digital Motion Picture Centre to back the new products up. Peter Sykes, Strategic Technology Development Manager for Sony Europe, explains where the company is up to in this area.
The live 4K broadcast of War Horse from the New London Theatre, screened in 4K at the Curzon Cinema in Chelsea, was a world first
Industry developments and standards
With image quality at the very core of film and television entertainment, the growth of 4K or Ultra High Definition (UHD) production has been a key topic of discussion in the last few years. Both production companies and single shooters are realising how 4K images can work for them and are embracing the flexibility that 4K brings. This sits against a backdrop of intense competition in the television and online content marketplaces, with viewers fragmenting across multiple delivery platforms and companies looking to secure their attention with the best, most immersive experience possible.

More encouraging news for television is the recent approval by the DVB Steering Board of the DVB-UHDTV Phase 1 specification. This new specification allows for the over-the-air transmission of 3840 x 2160 resolution images at up to 60Hz and is a significant milestone towards widespread UHD broadcast. It highlights that now is the time for content producers to start thinking seriously about 4K, if they haven’t already done so.
4K: Tangible benefits, today and tomorrow
For production and broadcast, there are various potential benefits beyond merely the creation of outstanding 4K images. Content shot in 4K can look sharper and richer than footage originated in HD, even when the 4K content is converted to and distributed in HD. Shooting in 4K for HD play-out today is something that many production companies have begun to do, with the added benefit that the acquired 4K images are future-proofed for when more 4K distribution platforms become available.

The creative flexibility that 4K brings to post-production is also a major attraction for TV and film production. With 4K comes higher resolution footage with better colour fidelity, which in turn makes keying, resizing and image cropping easier without compromising quality. For cameramen shooting on their own, or productions capturing one-off events, the ability to shoot a wide 4K view and then crop for the desired image in post-production means that multiple angles are possible within one shot. Rather than being restricted to using the image as it’s framed at the point of capture, the editor can move within the 4K image to reframe the shot in post-production.

As the options for delivering content to consumers increase, another benefit of capturing content in 4K is becoming more apparent. With a multitude of aspect ratios across television, laptop and mobile screens, the requirement to re-version original content for these platforms is growing. Multiplatform content provision is an issue that the media industry is facing and 4K acquisition is one potential solution.
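To put a rough number on that reframing headroom, the following back-of-envelope sketch assumes a UHD (3840 x 2160) capture and a 1920 x 1080 HD deliverable – the standard raster sizes, not figures quoted by Sony:

```python
# Reframing headroom when shooting UHD "4K" for an HD finish.
# Assumed raster sizes: UHD 3840x2160 capture, 1920x1080 deliverable.

uhd_w, uhd_h = 3840, 2160
hd_w, hd_h = 1920, 1080

# Maximum punch-in before the crop falls below native HD resolution
max_zoom = min(uhd_w / hd_w, uhd_h / hd_h)
print(f"Maximum lossless punch-in: {max_zoom:.0f}x")  # 2x

# At full punch-in, the 1920x1080 crop window can slide anywhere
# inside the UHD frame, giving this much repositioning room:
print(f"Repositioning room: {uhd_w - hd_w} x {uhd_h - hd_h} px")  # 1920 x 1080 px
```

In other words, a single locked-off UHD camera can yield both a wide shot and a 2x 'second angle' of the same take in the edit, which is exactly the flexibility described above.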
4K for TV drama
Cinema has always been a powerful and immersive experience and cinemas have been presenting feature films in 4K for several years. Over 18,000 Sony 4K projectors have been installed around the world by leading chains, including Vue, Showcase and Everyman in the UK, which are using the technology to provide rich cinematic images to audiences. Sony’s first 4K-capable camera, the CineAlta F65, was introduced in 2011. Featuring an 8K image sensor, it has been used to capture features including Belle and The Second Best Exotic Marigold Hotel, both shot by cinematographer Ben Smithard BSC, as well as Ex Machina, shot by Rob Hardy BSC, After Earth, captured by Peter Suschitzky ASC, and Luc Besson’s Lucy. In 2012, the F65 was joined by two new CineAlta cameras, the PMW-F5 and PMW-F55. Both cameras feature 4K image sensors and their introduction paved the way for widespread adoption of CineAlta and 4K around the world. This increased choice and creative freedom has been picked up in TV drama production, with shows such as The Blacklist, Masters of Sex and The Big Bang Theory all shot in 4K with CineAlta cameras.

“The F65 represents a big step for the future of filmmaking. The amount of detail and colour it captures is breathtaking. I wanted the audience to feel the beauty of the 4K images and be involved in the story, and the F65 made that a reality.” Ben Smithard BSC, DoP for Belle and The Second Best Exotic Marigold Hotel

4K for live production
4K has also come to the fore for live production across a range of sport, theatre and historic events. In the arts world, National Theatre Live (NT Live) broke new ground in February 2014, broadcasting a live performance of War Horse from the New London Theatre to cinemas around the UK. Captured on PMW-F55 cameras as part of a 4K live production chain, the signal was also transmitted to the Curzon Cinema in Chelsea, London and, in a world first, screened there live in 4K. In a ceremony in St Peter’s Square, Rome, in April 2014, Pope Francis raised Pope John Paul II and Pope John XXIII to sainthood. Working with the Vatican Television Centre (CTV), Sony and facilities partner DBW Communication produced a two-hour broadcast of the ceremony that was simultaneously transmitted in HD, 3D and 4K. Feeds from six PMW-F55 cameras were used for this historic event, another world first.

DoP Ben Smithard shooting Belle on the F65
As with the beginnings of live HD before it, live 4K has become most strongly associated with sport production. Two of the most recent examples come from high-speed sports. In July 2014, TVN Mobile Production used a full 4K workflow to capture the Polish leg of the Red Bull Air Race World Championship in Gdynia. More recently, the 2014 Hertz British Grand Prix MotoGP was shot in 4K at Silverstone in September, a trial that also used a live production chain featuring Sony PMW-F55 cameras. But last year the biggest test came in the form of the world’s largest sporting competition. In June, the 2014 FIFA World Cup marked a watershed moment for sports broadcasting. Not only did the tournament see more than 2,500 hours of HD footage captured from 64 matches, in 12 Brazilian stadia, across two timezones, but a team of 30 people captured and transmitted three matches in 4K, live, across the world. Having been tried and tested at the FIFA Confederations Cup in 2013, as well as the Spanish league earlier in 2014, for the very first time football fans themselves were able to enjoy an unparalleled 4K visual experience, as live cinema screenings took place in London. As Host Broadcast Service’s (HBS) delivery partner for the 2014 FIFA World Cup, Sony worked together with HBS and FIFA TV to deliver the 4K matches. Twelve PMW-F55s and an F65 were used as part of the live 4K chain, proving the viability of 4K live production on a global scale. To make the production a reality, Sony selected UK-based outside broadcast (OB) company Telegenic to harness the experience and technical expertise gained at the FIFA Confederations Cup.

“The Vatican’s goal is to develop productions that enhance the involvement of people and to provide wider archive formats. Ultra HD gives incredible detail and a real emotive quality. We are looking at 4K as the highest quality to store material in future.” Stefano D’Agostini, Technical Director at CTV
This was in turn underpinned by Brazilian OB and programming company Globosat (part of TV Globo Group), responsible for the provision of on-the-ground technical facilities to deliver a visually spectacular viewing experience to fans of ‘the beautiful game’.
Workflows for 4K
Production workflows for 4K have matured over a number of years and now offer choices in recording format, storage, on-set and post-production processing, and delivery. Establishing a workflow suitable for the specific requirements of a feature film or TV drama is critical, and it will need to provide the necessary combination of image quality, robustness, storage capacity, data throughput and speed.

For acquisition, the F65 (and cameras such as the F55, F5 and PXW-FS7 when used with an external RAW recorder) provide exceptionally high-quality RAW recording capability. For users looking to maintain very high image quality at reduced data rates, Sony has also developed the XAVC recording format. XAVC complies with H.264/MPEG-4 Part 10 Level 5.2, with video essence encapsulated into an industry-standard MXF OP1a wrapper, accompanied by audio and metadata. The development objective for XAVC was to create a family of professional production tools to efficiently handle high frame rate (HFR) HD and 4K imaging formats. XAVC has been incorporated across Sony’s wider camera line-up, from the PMW-F55, PMW-F5 and PXW-FS7 4K cameras, through the PXW-X70 4K-ready compact camcorder, to broadcast workhorse HD camcorders such as the PXW-X500.

Providing a consistent format from acquisition through to playout, XAVC has also been adopted by a substantial number of companies via the XAVC Alliance Partner programme. At the time of writing, 69 companies have signed up to the programme, with the biggest names in on-set dailies, editing, grading, storage and playout all licensing the technology. This open approach has helped simplify workflows at all points of the production chain, providing efficient HD and 4K workflows for all.
The canonisation of Pope John Paul II and Pope John XXIII in April 2014 was simultaneously broadcast in HD, 3D and 4K
The practicalities of shooting 4K
Recognising the appetite within the industry for practical hands-on training in all aspects of 4K production, Sony opened its European Digital Motion Picture Centre in October 2013. Located at Pinewood Studios, the centre houses Sony’s 4K camera family, a selection of on-set and post-production tools for features, drama and commercials, and an end-to-end 4K live production chain. Visiting cameramen are encouraged to stop by or make a dedicated visit, taking the opportunity to speak with technology experts and colleagues. Regular training sessions are also scheduled at the centre, making it a hub for learning and the exchange of ideas about developments in TV and cinema, and the future of filmmaking.

Richard Lewis is Sony’s Chief Engineer at the facility and has witnessed the way that visitors to the centre have embraced the opportunities offered by the new technology. “One of the concerns that some television cameramen have had is that the majority of 4K cameras have a shallower depth of field than the tools they’re currently using. While there is an adjustment from HD to 4K cameras, it’s effectively the same as the transition from SD to HD some time ago.
The difference is that this time the transition can be easier, as the supporting tools have come on in leaps and bounds. Pin-sharp viewfinders, distance calculators such as Cine Tape and remote focus control devices are all available to productions using large-format sensor cameras, making the job of focusing easier and more accurate. Essentially, the full range of 35mm shooting tools is now available to TV productions as well as films. We’ve reached a point where cameras such as the Sony PMW-F55 have a similar number of stops as a film camera. This means that shooting digitally is very similar to shooting with film, as we’re seeing the recording of a true ‘digital negative’. The differences in approach and technique are swiftly melting away.

Looking at the detailed practicalities of working with 4K cameras, two areas immediately crop up: storage and power. Fortunately, progress in both areas has been remarkable in recent years and months. Sony’s latest AXS cards – ideal for recording RAW footage – are now the same physical size as SxS cards, marking a substantial reduction in size whilst retaining the same high data storage capacity. When it comes to batteries, 4K cameras have been accused of being power-hungry, but we’re now at a point where the F55, F5 and the new FS7 consume the same amount of power as – or less than – many HD cameras. Power and storage are no longer pressing concerns when it comes to 4K.

As a specific example, even when shooting RAW 4K, Sony’s 512GB AXS cards can record around one hour of footage at 25fps, or 25 minutes at 60fps. The AXS card reader can then download this via USB3 at approximately 160MBps (around real time). With the average amount shot being around 45 minutes to an hour per camera per day, that means roughly 500GB per camera per day. On-set storage would typically be a 3TB drive, which can comfortably cope with backing up rushes on location.

We’re also coming full circle when it comes to the craft of the camera operator.
With the similarities between 4K and film acquisition, operators can take advantage of the creative possibilities that shooting with a large sensor brings, many of which are the same as shooting with film. Those who were trained on film cameras are often surprised to find that their original skills are directly applicable to the latest generation of 4K cameras.”
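The storage figures quoted above hang together arithmetically. As a quick sketch – using only the numbers from the article (512GB card, roughly one hour of RAW at 25fps, ~160MB/s offload); none of these constants come from a Sony spec sheet – the implied data rate and daily totals work out as follows:

```python
# Back-of-envelope check of the 4K RAW storage figures quoted above.
CARD_GB = 512            # AXS card capacity quoted in the article
HOURS_AT_25FPS = 1.0     # quoted recording time per card at 25fps
OFFLOAD_MBPS = 160       # quoted USB3 reader throughput, MB/s

# Implied average RAW data rate at 25fps (decimal GB -> MB)
rate_mbps = CARD_GB * 1000 / (HOURS_AT_25FPS * 3600)
print(f"Implied data rate: {rate_mbps:.0f} MB/s")  # ~142 MB/s

# Offloading a full card: close to real time, as stated
print(f"Full-card offload: {CARD_GB * 1000 / OFFLOAD_MBPS / 60:.0f} min")  # ~53 min

# 45-60 minutes shot per camera per day works out at roughly 380-510GB,
# i.e. the article's "500GB per camera per day" ballpark
for minutes in (45, 60):
    print(f"{minutes} min shot -> {rate_mbps * minutes * 60 / 1000:.0f} GB")

# A 3TB on-set drive therefore holds around six such camera-days
print(f"Camera-days per 3TB: {3000 / (rate_mbps * 3.6):.1f}")
```

The ~53-minute offload for an hour of material is what makes the "around real time" claim plausible, and the six camera-days of headroom explains why a single 3TB drive suffices on location.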
4K and the future
The move towards UHD is very much underway. Around the world, broadcasters are rolling out trials and services.
Tools of the Trade
Each of Sony’s native 4K cameras has been developed from first principles with 4K in mind. The camera line-up has been designed to cover the full range of productions – from Hollywood blockbusters to small, independent projects – and all have 4K at their core.

F65
• 20 Megapixel camera with an 8K Super 35mm sensor, the flagship of Sony’s large-sensor CineAlta camera family
• With a wide colour gamut, high dynamic range and high sensitivity, the F65 gives directors the freedom to produce images exactly as intended. Other features include a high frame rate mode for RAW recording at 120P to deliver exceptional 4K images in smooth slow motion
• PL mount and rotary mechanical shutter
• Developed for ultimate quality cinematic projects
PMW-F55
• Modular 4K camera with Super 35mm sensor and multi-codec support
• RAW recording using dockable AXS-R5 recorder
• Includes Frame Image Scan feature to eliminate rolling shutter skew and flash band, and a wide colour gamut for exceptional colour reproduction. Offers 14 stops of exposure latitude, high sensitivity and low noise
• In use around the world for feature films, TV drama and commercials, and also within Sony’s 4K live production system

PMW-F5
• Modular 4K camera, with same chassis design and ergonomics as the PMW-F55 (without Frame Image Scan and wider colour gamut facilities)
• RAW recording using dockable AXS-R5 recorder
• PL-mount capability for use with cinematic lenses from suppliers such as Angénieux, ARRI, Canon, Carl Zeiss, Cooke, Fujifilm and Leica

PXW-FS7
• The latest addition to Sony’s family of 4K-capable cameras
• Great combination of 4K imaging, ‘run-and-gun’ flexibility and quick and slow motion capture
• Supports a variety of recording formats including XAVC Intra and Long GOP and MPEG HD422 (Apple ProRes 422 codec scheduled for early 2015 via a firmware update)
• Ideal for film-making and documentary work

NEX-FS700R/RH camcorders
• 4K Super 35 Exmor CMOS sensor, providing 11.6 million pixels in total
• Can capture 4K RAW footage with an AXS-R5 RAW recorder over 3G HD-SDI using an HXR-IFR5 interface unit. Subsequent RAW workflow is the same as with the PMW-F5
• Ideal for videographers and documentary-makers

PXW-Z100
• Professional handheld XDCAM camcorder
• Features a 1/2.33-inch Exmor R CMOS sensor with 16 million pixels
• Can capture 4K images at 50P or 60P. Also supports remote control via WiFi through a smartphone or tablet
• Designed for 4K production across a range of budgets

PXW-X70
• Palm-sized, compact XDCAM professional camcorder with 20 Megapixel 1.0-type Exmor R CMOS sensor
• Can record High Definition in XAVC Long GOP, with 4:2:2 10-bit sampling at 50 Mb/s
• Upcoming release will provide 4K UHD recording
• Ideal for a variety of applications from newsgathering and documentaries to event shooting

For more information about how to get the best use out of 4K technology please visit: www.sony.co.uk/pro/products/broadcast
Spring 2015 ZERB
The 4K Live Production Chain
Behind each of the live 4K productions with which Sony has been involved is its 4K system, providing a full live 4K workflow that takes the images from initial capture through mixing to storage and distribution. Sitting at the heart of the solution is the PMW-F55, which uses the CA-4000 camera adapter to integrate into the live production chain. The adapter incorporates a SMPTE fibre interface with a full set of connections for audio, tally and prompter. The action is then relayed to – and stored on – the PWS-4400 4K server, via the BPU-4000 baseband processor unit, which debayers the F55’s image into 4K baseband video. The MVS-7000X or MVS-8000X 4K switcher is then ready to add 4K effects and transitions before completing the 4K live production cycle, ready for distribution. Throughout the chain, the PVM-X300 30” 4K LCD monitor can display the 4K signal with pixel-for-pixel accuracy.
Other developments include the new 30-inch BVM-X300 TRIMASTER EL OLED 4K monitor, designed for critical colour grading applications and featuring unparalleled black-level performance, a High Dynamic Range mode and a wide colour gamut conforming to DCI-P3 and approaching ITU-R BT.2020.

Major upcoming international sporting events in 2016, 2018 and beyond will be key drivers for 4K adoption. But regardless of broadcast plans, the tools to create engaging 4K images are available, and TV and film producers are building inventories of 4K content now.
Top: The F55 shooting 4K at the FIFA World Cup in Brazil Bottom: Prepping for the Red Bull Air Race World Championship
The first test broadcasts in South Korea began in 2013 and that country plans to have 4K terrestrial broadcasting within the next year. Similarly, in Japan, 4K broadcasting began in June 2014 and 8K satellite tests are planned for 2016. Online content providers have launched services and, in the consumer world, sales of 4K UHD televisions and portable devices continue to increase. The Blu-ray Disc Association has announced that the specifications for 4K Blu-ray are expected to be finalised in the first half of 2015, paving the way for commercial availability by the end of the year.

A next step in the professional 4K live production journey has also been identified and a milestone set out: 4K over IP. At IBC 2014, Sony, BCE (Broadcasting Centre Europe) and Level 3 Communications demonstrated a live 4K link delivered over an IP infrastructure between Luxembourg and the RAI Exhibition and Conference Centre in Amsterdam. Captured using F55 cameras at IBC and at BCE’s live production studio, the footage was screened throughout the show and highlighted the suitability of the technology as the next logical step for broadcasters looking to migrate from today’s SDI-based systems. First implementations based on this technology are scheduled for 2015.
Peter Sykes is the Strategic Technology Development Manager for Sony Professional Solutions Europe. He has worked in the content and media technology industry for 30 years and has been involved in the introduction of some of Sony’s most exciting professional developments including the Digital Betacam, XDCAM and HDCAM production formats. Peter first became involved in the product management and marketing of Sony’s High Definition cameras and systems over 10 years ago, when he was tasked with widening the discussion beyond the traditional cinematography audience in readiness for widespread broadcast roll-out starting in the mid-2000s. He then worked on the introduction of file-based HD technologies such as XDCAM HD, before taking up the role of Strategic Marketing Manager for Digital Cinematography in 2010, leading the introduction of the CineAlta F65, F55 and F5 cameras and workflows. As Strategic Technology Development Manager, Peter is currently involved in the introduction of key technologies such as Ultra High Definition for the European professional market. Based in the UK, he recently led the project to create Sony’s new Digital Motion Picture Centre for Europe located at Pinewood Studios. Contact Peter on: Peter.Sykes@eu.sony.com See more about Sony products at: www.pro.sony.eu
A cracking contraption: the new Gimbal Vest from Easyrig
Swedish cameraman Marcus Johansson using the new vest with Flowcine’s Serene attached to the rig
When guest editor Paul Mellon last edited Zerb way back in 2000 (yes, he came back for a second go, thank you Paul!), the magazine carried news of Easyrig, an invention to ease the strain on the backs and shoulders of cameramen operating handheld. Shortly before that, GTC member Dudley Darby had first come across the rig and its inventor, Swedish cameraman Johan Hellsten, at IBC. Since then the range of products has gone from strength to strength, receiving a GTC Seal of Approval in 2012. The latest incarnation features a new gimbal vest and Dudley has been talking to Johan and early users about this latest addition to the Easyrig family.

Gimbals everywhere
Over the past few years a large number of small ‘broadcast quality’ cameras have appeared along with, in some cases, the perception that everything could be shot handheld, with electronic or optical stabilisation removing the lumpiness. As that myth was exploded, a vast array of accessories appeared to make these lighter cameras more usable in demanding situations. Over the past couple of years handheld gimbals have come to the fore, not just for the low-end cameras, but for 4K and high-end cameras as well. While improving the stability of shots, the penalty is weight, and an increased risk of back pain and muscle injury for the operator.

Very early ‘lightweights’ from the likes of Ikegami, Philips, Bosch Fernseh and Link were anything but light, but they offered a degree of freedom to the TV cameraman that had previously been the preserve of the film fraternity. However, these could be a source of pain or injury to those operating them for long periods. I met one such cameraman at IBC in 1996. He appeared to have what looked like a hook strapped to his back. He stared at my badge (they were cardboard in those days), which declared I was a cameraman. “Ah, you are a cameraman.
I have something to show you that I think you will like.” That was my first introduction to both Johan Hellsten and Easyrig, and yes, I did like it. Johan had developed Easyrig in 1994 to support the weight of the camera at the hips rather than the arms and shoulders, via a cord attached to an arm fixed to the lower back part of a lightweight vest. It worked. The end result was less back pain and a lot less fatigue during long stints of handheld operation. The largest obstacle to getting it widely accepted was the reluctance of many cameramen to being seen sporting such a strange-looking contraption. That changed though as more tried it and realised the benefits, first in Scandinavia,
Top: Toby Strong using the Gimbal Rig vest in Patagonia Bottom: Patrick Weir using DJI gimbal with a RED camera
then further afield in Europe and beyond. In 2012 it gained a GTC Seal of Approval.
Adapting to need
In the 20 years since the first Easyrig, there has been continued development as cameras and styles have changed. The original rig, capable of supporting loads of up to 16kg (Easyrig 2.5), has had a few refinements over the years, but its limiting factor when it came to adjusting for camera weight was the way the gas shock-absorber mechanism worked: there are five gas shock absorbers matched to load, each covering a range of about 3kg. A requirement emerged for a more heavy-duty version capable of supporting larger film cameras and 3D rigs with their accessories, and Easyrig Cinema 3 was born, with two additional shock absorber options to cater for loads up to 25kg. Specialist rigs to take loads far in excess of 25kg have also been built from time to time. Early on it became evident that not all cameramen are the same size, so small and large hip belts also emerged as alternatives to the standard fit. Contrary to the supposition that small, light cameras wouldn't need anything like Easyrig, a need emerged based not so much on weight as on the ergonomics of using small cameras for video work. The Turtle X, built around a Petrol backpack with a foldable arm, was the first version,
which evolved into the Easyrig Mini with sprung shock absorption taking up to a 4kg load. The Mini Strong was added for loads between 4kg and 6kg. A Shoulder Mount was also introduced to allow DSLRs to be used with the Easyrig Mini in a conventional shoulder-mounted position.
The new Gimbal Rig vest
And so to gimbals. A gimbal doesn't just add weight; it tends to be held further from the body. This pulls a standard Easyrig forward, putting strain on the front of the lower part of the vest and, to an extent, the shoulders. This drove Johan and the Easyrig team to develop a new vest to take account of the conditions encountered with gimbals. The result is now available as the Gimbal Rig vest. To combat the greater leverage with the load further away from the body, the vest has been redesigned with a reinforced aluminium back support, now constructed of thicker gauge metal, and the vest extends over the hips. A wider front strap with some 30cm of adjustment and quick-release buckles spreads the increased load across a larger area at the front of the body. The shoulder straps are also longer. The new vest is available as an individual item or as part of a complete Easyrig system, with standard or small front straps; large ones will soon also be available. Before release, the vest was field-tested in Norway, New Zealand and the USA, and recently Toby Strong, winner of a GTC Award for Excellence last year for his work on Bill Bailey's Jungle Hero, took it on a shoot in Patagonia for a three-part BBC series.
The Gimbal Vest in use
Toby had built his own motorised 3-axis gimbal, which he'd used previously with the standard Easyrig vest. For the Patagonian shoot he used the Gimbal Vest with a 230mm Cinema 3 arm and 400N shock
Connor O’Brien, AFI, using a RED camera rig with the Gimbal Rig vest
absorber. The camera was an ARRI AMIRA. I caught up with Toby shortly before he left for another adventure with the new vest, involving climbing a volcano.

"My homebuilt gimbal is a bit larger and weightier than some of the commercially available ones, but it does what I want it to. When I'd used it with the standard Easyrig Cinema 3 vest it worked, but there was a noticeable pull on the front straps and shoulder straps. The new vest is an absolute game-changer! The improved shoulder support and wider front belt that allows more adjustment have made a massive difference, both in terms of comfort and support. One of the sequences in Patagonia involved following people down a steep mountain track. Using a handheld gimbal alone for it would have been precarious, but with this new Easyrig Gimbal vest I had absolutely no qualms in taking it down the mountainside. The 400N shock absorber allowed a very impressive vertical range, from eye-level to close to the ground. The handy pockets on the vest let me safely carry a V-lock battery as well as a Leatherman, the pockets being arranged such that you barely notice the extra weight. It's this sort of attention to detail and build quality that make this vest outstanding. One thing the Easyrig doesn't do – and was never intended to do – is take out vertical movement when walking, but it does exactly what it says: it supports the weight."

Flowcine, another Swedish company, have developed the Serene, a spring-loaded attachment to fit the Easyrig arm that will take out some of the vertical motion errors that Toby mentions.

Toby continued: "The opportunity to use the Gimbal vest came very late in the day, shortly before I was due to leave for the location. I thought it was too late, but the guys in Sweden pulled out all the stops and made sure it was with me in time. Their customer support is superb; they listen and act. If I had to sum up the Gimbal Vest in one word, it would have to be 'Brilliant'!"
Fact File
See more about Easyrig at: www.easyrig.com
The UK agents for Easyrig are: Production Gear, Millennium Studios, Elstree Way, Borehamwood, WD6 1SF
Phone: +44 (0)20 8236 1212
Website: www.productiongear.co.uk
Still Game Live
Transforming a TV sitcom into an arena event
Last September Zerb Editor Paul Mellon worked on a live show comeback of the popular Scottish sitcom Still Game. No one could have predicted what a phenomenal success this was going to be. The process of presenting the comedy dialogue and action (usually viewed close-up at home on a small TV screen) to a large audience in a very large arena threw up some interesting questions around how we view and enjoy such productions, highlighting the important role camera coverage plays in a whole range of live events. As well as forming a key part of almost all rock concerts, 'live event' screenings of opera and ballet performances are now regularly beamed to cinemas and open spaces. Events such as these are posing new questions of how best to cover them in order to optimise the audience experience. Still Game Live was no exception.
Still Game, which ran for 44 episodes between 2002 and 2007, is written by Ford Kiernan and Greg Hemphill, who also play the main characters, elderly widowers Jack Jarvis and Victor McDade. Set in the fictional area of Craiglang in Glasgow, the programme is one of the best-loved Scottish sitcoms of all time, with an ensemble cast of characters appearing regularly throughout the series. The live show threads together various plot strands, including Jack and Victor's discovery of FaceTime, with disastrous results. A hallucinogenic Bollywood musical extravaganza concludes the show!
the venue in question was the 12,000-seater SSE Hydro in Glasgow. Over 210,000 people flocked to the show – about a third of the population of the city. Extrapolate those numbers and that would be the equivalent of 2.8 million Londoners watching a show at the O2 Arena. Not bad for a wee Scottish telly programme.
MICHAEL HINES
The advisability of showbiz comebacks can be highly debatable, so for the creators of a sitcom that had been off air for seven years and was familiar mainly to a regional audience in Scotland (although it did air nationally on BBC2), there was most certainly an element of risk involved. The internet had dramatically changed the landscape for all branches of entertainment, so would there be any appetite for the return of a sitcom in which the principal characters are all OAPs? The decision to create a live stage version of the TV programme was taken though, and seven shows over four days were announced. The box office did not so much crash as completely melt down! Extra dates were added and, in the end, a phenomenal 21 sell-out shows were scheduled. All quite unremarkable, you might think, until you consider that
View from middle level seating, as Jack and Victor prepare the audience for a ‘Craiglang Wave’, funnier than any Mexican version! Front-of-house cameras are at the bottom of frame
MARC TURNER PFM PICTURES
How do you visually enhance a live sitcom in a huge arena?
The very first discussion I had with Ford and Greg was that the cameras would just do a very basic wide and tight frame of each set, so that the people at the back could see roughly the same frame with their own eyes as if they were near the front. However, I quickly realised that, although this would be like sitting closer, you would nevertheless lose a lot of the essence of Still Game, such as the cutaway reactions during dialogue that enhance a joke or maybe set it up for later. These really need to be seen in close-up, or at least mid-shot. With Still Game being a TV series, I also wanted to make sure that the level of performance wasn't too theatrical. The King's Theatre in Glasgow has around 2000 seats but no screens. We could have put it on there but, not only would we have had to go on for 150 consecutive nights (instead of 21!) to get the same audience, it would have inevitably
become a theatre piece and the performance levels would have had to be so high that it would have ruined the inherent nature of the dialogue and characters. I always knew that the big screens were going to need to offer something closer to the TV experience, which is why the Hydro was such a great place to work in. Because of the size of it, we had to have big screens, and those screens were so giant we could afford to make the performance more like TV than theatre. Also, part of the script involved ‘FaceTimes’ and it was going to take a while to work out how to display these to best effect.
I don’t think anyone is entirely sure how to define what the live show became: television, theatre, panto, sitcom or a new hybrid form?
MARC TURNER PFM PICTURES
I was booked as part of the crew to ‘do the big screens’ and, other than being delighted to be involved in the return of a show I’d enjoyed watching, I didn’t give much thought to what would subsequently transpire. A 2.5-hour live episode, in effect, of a hugely popular TV programme was the outcome, and expectations were suitably high. I don’t think anyone – cast, crew or critics – is entirely sure how to define what the live show became: television, theatre, panto, sitcom or a new hybrid form that will acquire its own name over time? It’s undoubtedly a piece of entertainment history though, and I’m proud it took place in my home city. I caught up with TV series and live show director Michael Hines to gain some understanding of the thought processes and considerations around creating this ambitious and very successful show, and also gained some insights into what directors expect from camera operators in this kind of situation. The following is extracted from my chat with Michael.
Top: A typical moment from a pub scene, showing the live cut and rear scenic projection above Bottom: Show characters Tam, Boabby the barman and Winston discuss the technical wonders of the iPad
cameras to have been spread out more evenly but as it was they were bunched together quite tightly at the back.
Landing gags on the small screen vs the arena
As I wrote the camera cut, I realised I could actually do a complete television cut, and this allowed the actors to react as they would normally, to keep the lid on the performance levels. I could see that this would work, but the actors had to trust me completely and, to start with, some of them wanted to give a much higher performance level until they saw it on the screens and realised: "OMG, look at the size of that!" Cutting becomes organic when you are responding to live laughs and it is the edit points that make it. We had a 201-page script for a 2.5-hour show and there were 340 shots in Act One, but that's because I knew exactly where I wanted the cuts in order to create the laughs. Sometimes we would delay the cut and at others we would drop in a little extra cut, but this was not a new shot number. So, the camera operators learned to stay on the shot – in fact we all learned as we went – not to go instantly to the next shot, as I might take an extra cutaway if it had created a laugh. Sometimes, if we went at a different point, it would cause a bigger laugh. We only learnt that through trial and error and the audience reactions. Also, the editing would be different from an afternoon to an evening show, or from a Friday to a Saturday. Remember though, the actors couldn't see when they were on screen, so wouldn't wait for the screens to change to say their lines. They'd just
COURTESY OF ANDY GIBBS BROOME PRODUCTIONS
The Hydro has about 130-degree sight lines, so it's not quite flat. This meant that each of the three sets had to be 130 degrees so that everyone could see everything on the stage. Designer Ben Stones did an amazing job with Production Manager Andy Gibbs; everyone, even those in the shallowest of angled seats, could see. The only thing they couldn't see was the opposite side screen – there were three screens: camera left, centre and right. People on the edge couldn't see the screen on the other side and this did compromise me a little, but on the other hand it also made life a bit easier as it meant the two side screens effectively became duplicates and I could have some fun on the middle screen; for instance, changing it to the graphics for the end song. The FaceTimes went up on that section also, because everyone could see the middle screen. It meant you could have that lovely thing of pre-recorded footage on the playout in the centre with live camerawork and reactions on the outside screens, as well as live FaceTime cameras (minicams embedded in the set at certain places) on the outside screens. The biggest compromise though was camera placement because of the number of seats we had to sell. I couldn't always get the best eyelines and sometimes you'd shoot into blackness rather than against the sets. I would have liked the front-of-house
The camera plan and screens configuration
sense it and we'd have to ride with them. So, we had a vision mixer who wasn't used to doing this, me who wasn't used to doing it, a cast who didn't know when they were on screen, and a different kind of audience for every show! It was like surfing a wave – there was always a right moment to make the cut, but you might crash and burn sometimes! What I discovered, fascinatingly, is that we created extra laughs that weren't in the script merely by cutting to a different shot. The camera crew actively contributed to this and generated laughs through their shots. They became part of the content and, although I've been directing for more than 20 years, that got me thinking: "I wonder if I can do that when I go back to the small screen?" So, I've actually learned about the small screen from doing the live show.
MARC TURNER PFM PICTURES
The SSE Hydro (right) on the banks of the River Clyde, next to another impressive auditorium, 'The Armadillo'
Jack and Victor: Jack (Ford Kiernan) and Victor (Greg Hemphill), co-writers and principal characters of Still Game
What did we just do there?
This was a comedy show that we wanted to present through the big screens in the live arena. Because I came to it with a TV background, the screens didn't faze me and I blocked it bearing that in mind. Yes, the theatrical sets were big, but we did have the massive screens and the important thing was that we could see the characters in close-up. You don't go to a concert to see your favourite pop star in long shot! For me, this was a hybrid of television and theatre. There is a moment, early on, when Jack (self-referentially mocking himself) says: "You can't put a television show on in a theatre." Actually you can, but it was almost like a big studio recording. What you can't do is put the theatre on TV easily. So I don't know if there is a new term for it? The Stage said we were "redefining theatre for the arena", which is a lovely phrase to hear; very pretentious and quite funny. The Guardian said it was "a thrilling but narrow theatrical victory" because they were instinctively being protective of the 'theatre thing'. This wasn't Monty Python, which just did the favourite sketches; and it wasn't a rock 'n' roll sketch show like Little Britain on tour, because it couldn't have been. We were trying to recreate a world – Craiglang. Still Game is about the small moments towards the end of your life, and it worked very well for that.
10,000 living room sofas per show
Because of the screens and projections, everyone in the audience could make their own individual cuts: 10,000 people saw 10,000 different shows, each time. I don't think this show could have existed even five years ago. We've got used to self-editing – we'll have the news on but we'll be
The ‘get in’ on Day One. Left of frame is Gregor Tulloch facs checking the ‘gods’ handheld camera.
on our tablets at the same time and, if the news is boring, we'll self-edit and cut away without turning over. That ability to take in lots of information, process it and extract enjoyment from it is fascinating to me. I don't think we could have done this before because it would have overwhelmed us. There was no need to watch the stage at all if you just wanted to watch the screens. But then the reason people came was that they wanted to be part of something that was a huge event. What I think we did was create a unique way of doing that size of thing. No one has done a sitcom to that extent, at that size, so intricately. I didn't realise that until the end and it has become a source of pride for me. We did create something unique and those in the know in theatre thought: "I haven't seen this before." We gave them something new and more than they could have expected, between the content, the revolving set, the way we treated the show and also the screens. The screens meant that the 'media generation' (or whatever you want to call it), who look at screens all the time, were completely satiated. As the thing evolved we came to understand how much the camera coverage was contributing to the event, way beyond just magnifying the action for those at the back. With stand-up, the screens don't generate 'new' laughs, just the same laughs you would have had if you were sitting closer. With a Monty Python-type sketch show, it's the same because everyone buys into the fact that, in a second, you're about to create another world, and another, and so on. But, with Still Game, we went to Craiglang, and stayed there, until the Bollywood finale inside Isa's head. I'd never thought of it like that before. Yeah, it's weird!
It’s rare to have the opportunity to work on something that utilises craft skills that have been in place for many years, yet still takes you into new territory. Still Game Live achieved that. The indefinable nature of the show led me to ponder on how we describe ourselves bearing
in mind the variety of what we now do as ‘television’ camera operators in this internet age. What do directors make of us? I asked Michael for his thoughts.

The type of material and the range of equipment camera operators need to know about these days is incredible. This week alone, I’ve used a GoPro, PMW-800, Panasonic 101 and Canon 5D, all on the same tiny little shoot, and I expect my camera operators to know all of it, inside out – why it’s not doing something or why it is – and to be able to balance it all to look good in the grade etc, etc. That’s crazy! It’s like learning to drive five different machines that all do the same thing ultimately. I think this kind of stuff is what impresses me about camera operators these days – the amount of different gear and types they’re supposed to know about at any one time.

Regarding multi-camera, I talked to my pal Hamish Hamilton, who is a world-famous director – he does the Oscars and the Super Bowl half-time show – about this. He knows far more about multi-camera directing than I do, but we both agree there are different types of operator and it’s important to know and use them accordingly. There are always some who are happy to take the money and are not in any big rush – often towards the end of their career. You can put them on the autocue camera – the flat, mid-shot of the presenter – and they’ll do it all day long and it will always be pin sharp. They’ll do it really, really well and they’re not interested in anything creative beyond that; they’re quite happy that this is their role. These are the ‘traditional’ multi-camera operators and they’re worth their weight in gold. Then there are the ‘young guns’, the ones who want to offer you different stuff, a little bit more ‘rock ‘n’ roll’. You put them on the side angles, the 45-degree angles, the drums and guitars. They’re the ones who are a bit more cheeky and will chance their arm doing the low or unusual angles.
Then you have the guys who really know how to track or jib, and you put the experts on that gear. I can only speak from my experience as a director but what I like from a camera operator is someone who wants to contribute craftwise but knows intuitively how to frame. I don’t necessarily mean that every frame has to be exquisite like a painting. I don’t quite know what it is; it’s like sound,
you notice it when it’s wrong! Is it headroom? Is it the eyeline? Or selecting the best height? I also expect to be able to have a conversation with the operator about what I want and for them to be able to offer, creatively, the best way to achieve that. My operators also need to be able to hit the shots quickly and accurately. For multi-camera, it’s vital that the operators understand: a) who the characters are; and b) what the next shot is and to be able to hit it quickly. Camera operators expect us (directors) to turn up with half a clue (sadly I know that’s not always the case!) and to shoot things efficiently. So, in return, I expect a multi-camera person to have read and understood the script so that they can find the shots quickly. If I ask for something different, then they should be able to offer up an alternative. Quite often in comedy on a live thing, something will come up that you didn’t expect, so you have to be on your toes. I’m not always watching your camera output unfortunately – I know that’s hurtful! – but if you offer something and I don’t take it but ask for something else then move on to that. That’s really important. Multi-camera people need always to be aware of what is going on around them, both shot-wise and from other departments, which is why ‘return’ is so important. If you get six location camera people in, they’ll all offer you a beautiful shot, but it’s the same blooming shot, and that’s no use to me. I always need something to cut to, so that would be my best advice to someone going into multi-camera: “Always remember the director and vision mixer need something to cut to!”
Fact File
Michael Hines is an award-winning director with over 20 years’ experience. He trained in children’s BBC, learning both single- and multi-camera directing. He alternates between single-camera comedies, commercials etc and live OBs, including the two-hour live Gaelic New Year show every year. He produced two series of Still Game and has directed every episode. Still Game Live at The Hydro was his first major theatre production. He also has his own production company, The Woven Thread, which makes comedies for BBC Alba, winning Best Entertainment at the Celtic Media Festival for his first commission, a comedy documentary about the Royal National Mod. He is a member of the DRS (Directors’ Rights Society), on the BAFTA Scotland Committee and is in Equity.
Camera Crew Supervisor: Mark Jason Cruikshank
Operators: Gregor Tulloch, Humphrey Tauro, Paul Mellon, Fraz Raheem, Iain White (H/H)
Vision Mixer: Richard Ellis
Codex Director of Operations Ben Perry
A smooth production flow
Hundreds of press releases land in the Zerb inbox weekly and, much as we’d like to, we can’t follow them all up. GTC sponsors get priority, but we endeavour to keep a ‘weather eye’ on all news that might interest GTC members. Last year, one name kept surfacing: Codex, manufacturers of high-end digital recording and production workflow solutions. Partnering with the likes of Panasonic, ARRI and Canon and, more out of this world, sending a 4K recorder to the International Space Station, this company and its products are clearly going places. Time to find out more, so GTC members Mark Langton and Martin Hammond were dispatched to meet Codex’s Director of Operations Ben Perry.
A thoroughly British company
All Codex products are designed and manufactured in England, and much of the software development occurs at the company’s London headquarters in Poland Street, Soho, where a five-storey building houses R&D, management, administration, sales and support. There is also a brand new, state-of-the-art grading theatre to assist in the evaluation of images. The idea for Codex was hatched by Marc Dando and Delwyn Holroyd, who had previously worked together at British VFX software developer 5D Solutions, where they had experimented with data capture from the Thomson Viper FilmStream camera and identified the need for digital recording and workflow
systems. Delwyn has been Codex’s technical director since the business was founded in 2005. Marc became Managing Director in 2009 and, under his auspices, Codex has significantly grown its team, product range, corporate footprint and technology partners that now extend worldwide. In 2011, a US office supplying R&D, support, sales and marketing opened in Hollywood, close to many of their studio and post-production partners. Codex currently employs around 50 people between London, Los Angeles and Wellington, with additional staff, representatives and partners based in Asia, Europe and South America who support marketing and productions. The founding ethos, which remains true today, was to design top-end equipment for the motion picture, broadcast and advertising industries that is easy to deploy
and streamlines production into post-production. The team at Codex has always liked to work closely with cameramen and cinematographers, both on specific projects, and on an ongoing basis, to ensure that they consistently come up with products that are required.
The Codex VFS
At the heart of every Codex video recorder is a custom-built software infrastructure to manage data, providing optimum-quality video files regardless of the destination. This is the Codex Virtual File System (VFS). Here’s how it works: when a Capture Drive with, for instance, ARRIRAW footage is loaded onto a computer, it shows up like a normal external drive. Operating underneath is the VFS. The VFS can present readily processed DPX files, MXF, DNxHD or QuickTime/ProRes proxies next to the original .ARI files on the Codex volume. Except for the recorded data on the drive, none of these additional files actually exists. It’s only when these files are requested that they are generated, ‘on demand’ and on the fly; hence the term ‘virtual’. The file formats, file naming and directory structure presented by the VFS are fully configurable through the Codex Platform software. This makes the VFS a highly flexible tool for providing exactly the material you require, when you want it, without redundant processing and storage overhead on your drives. Ben Perry explains: “It means you’ve not restricted yourself to a particular file format at that point. You can take that RAW data with our transcoding engine and make any file format you need; for example, you could deliver DPX for visual effects, Avid or Final Cut files for editorial, and then add ‘burn-ins’ like LUTs. You can adjust metadata… but we retain a pristine ‘digital negative’ you can go back to.”
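The on-demand principle behind a virtual file system like this can be sketched in a few lines of Python. To be clear, this is purely illustrative – the class and method names below are invented for the example and bear no relation to Codex’s actual software – but it shows the general idea: only the raw ‘digital negative’ physically exists, while derived files are advertised up front and generated the first time they are read.

```python
# Toy sketch of an on-demand "virtual file" layer (NOT Codex's VFS code).
# Only the raw clips exist; proxies are listed immediately but are
# transcoded lazily, on first access, then cached.

class VirtualVolume:
    def __init__(self, raw_clips):
        # raw_clips: dict of clip name -> raw sensor bytes
        # (the only data that really exists on the drive)
        self.raw = raw_clips
        self.cache = {}  # lazily generated derived files

    def listdir(self):
        # Advertise the original plus every derivable format,
        # even though the derived files haven't been made yet.
        names = []
        for clip in self.raw:
            names.append(clip + ".ari")        # original 'digital negative'
            names.append(clip + ".dpx")        # virtual: made on demand
            names.append(clip + "_proxy.mov")  # virtual: made on demand
        return sorted(names)

    def read(self, name):
        clip, _, ext = name.partition(".")
        if ext == "ari":
            return self.raw[clip]  # pristine original, never altered
        if name not in self.cache:  # generate only on first request
            source = self.raw[clip.replace("_proxy", "")]
            self.cache[name] = self._transcode(source, name)
        return self.cache[name]

    def _transcode(self, raw_bytes, name):
        # Stand-in for a real debayer/encode step.
        return b"transcoded:" + name.encode() + b":" + raw_bytes
```

A caller sees three files per clip in `listdir()`, but asking for `clipA.dpx` is what actually triggers the (here, pretend) transcode; reading `clipA.ari` just returns the untouched raw data, mirroring the pristine negative Ben Perry describes.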
Expanding from film into TV production
Initially, Codex technology was geared mostly towards film production and has been used on hundreds of digital features, including 007 Quantum of Solace and 007 Skyfall; Gravity; Life of Pi; X-Men: Days of Future Past; as well as several of this year’s BAFTA/Oscar hopefuls like Birdman, Mr Turner, Selma and Paddington. Increasingly, high-end TV shows are the customer, as Ben Perry confirms: “We are very much known for the feature film work, but there is a shift toward the big, episodic TV series, which essentially are mini feature films: Game of Thrones is a great example.” Interested in how the expansion from film into TV might drive new technology, Mark asked Ben whether there are any plans to produce recorders capable of recording direct to ProRes or DNxHD. Ben Perry: “We have always focused on
uncompressed or RAW capture for the big screen, but compressed recording is not something we say is necessarily a bad idea. Recording direct to ProRes or DNxHD is something we have looked at and continue to look at in terms of new products. We do recognise the demand for robust, efficient and cost-effective solutions for TV capture and workflow.” Pushing Ben a bit further, Mark asked what the benefits of recording RAW are over improving codecs like XAVC. Ben explains: “These are compressed formats which, depending on what you’re delivering, are perfectly acceptable at the moment. Compressed formats have their place for sure, but for some... and this is being driven from the big American studios for their episodic productions... there’s a need to protect the highest quality negative for the future. We have just witnessed a major TV show in the US, Marvel’s Agent Carter, shooting everything RAW. You can shoot ProRes now, and that’s fine for today’s purposes, but then you’ve got a compressed 4K master. What happens in 10 years’ time, when you want to do a redux for the new box set? TVs will have moved on, viewers might have 8K TVs and fast pipes into their homes, getting less compression on delivery. All of a sudden your ProRes masters probably won’t look so good when they get upscaled. There’s a hike in cost to shooting RAW, but it’s not that much for bigger budget productions. So they are shooting RAW and delivering what they need to now – but safe in the knowledge they have the highest quality negative already banked. They have absolute RAW files that have never been changed since they came off the camera sensor.” In another exciting venture, Codex equipment is currently recording out in space.
Paired with a Canon Cinema EOS C500, the Codex recording system will be used by astronauts onboard the International Space Station to capture a set of pre-determined shots at 4K for an upcoming IMAX production, with the working title A Perfect Planet. After rigorous testing, including radiation testing, the Codex system was chosen for its rugged reliability as well as its proven compatibility with the C500.
Working with the big players
With advances in technology, Codex has been able to design smaller and increasingly powerful products. Following its early models in 2006/07, the Codex Onboard Recorder arrived in 2011. As digital cameras gained rapid acceptance, Codex collaborated with ARRI, and the Codex Onboard Recorder
Codex key people
Marc Dando – Managing Director
Marc is well-known among cinematographic and production communities around the world. He has over 25 years’ experience in cutting-edge technologies, having held senior sales, marketing and management roles with Softimage, Discreet and 5D before joining Codex. Today, Marc travels the world to deepen existing collaborations and start up new technology partnerships.
Delwyn Holroyd – Technical Director
Delwyn has been developing innovative products for the broadcast television and film industries for 20 years. Prior to Codex he was a senior developer with 5D, where he was responsible for the 5D Commander, the first PC-based real-time 2K preview system. Before this, Delwyn was at Lightworks, where he was lead designer on the Newsworks product and the revolutionary, next-generation non-linear editing product known as Lightworks Touch.
Ben Perry – Director of Operations
Ben has been with Codex from the very start of the business. He is actively engaged in the day-to-day operations of the business – sales, support, marketing and administration.
Jens Rumberg – Director of Product Strategy
Jens joined Codex in 2013 following a successful 10-year spell in senior technology roles at ARRI, Germany. Prior to joining Codex he was the technical supervisor of ARRI’s ALEXA XT digital cinema camera, whose pioneering in-camera recording and workflow capabilities were developed in collaboration with Codex.
Sarah Priestnall – VP Market Development
Based at Codex in Los Angeles, Sarah has over 25 years’ experience in production and post-production, working both for manufacturers and post facilities. She was deeply involved in Cineon software product development and managed the first digital intermediate projects, including O Brother Where Art Thou, while at Kodak and Cinesite.
quickly became the de facto standard for recording ARRIRAW with the ALEXA. The next step was to build recording into the camera, in the ARRI ALEXA XT, allowing for the first time uncompressed RAW capture at 120fps. The custom-made digital magazines or ‘mags’ contain very high-end solid state drives (SSDs). The XR version can handle a data rate of 6.7 Gigabits per second, fast enough to record ARRIRAW at 120fps. As well as working with ARRI throughout their transition to digital cinema, from the D-21 to the recently announced ALEXA 65, Codex also offers workflow support for the Sony F55 and F65, and worked closely with Canon during the development of the C500, pairing it with their Onboard-S external recorder to provide a reliable, high-end solution for capturing 2K RGB 4:4:4 and 4K RAW up to 120fps in the form of 10-bit Canon Cinema RAW.
“The team at Codex has always liked to work closely with cameramen and cinematographers, both on specific projects and on an ongoing basis, to ensure that they consistently come up with products that are required.”
To refine their products the Codex team rely on close collaboration and feedback from both engineering teams and end users. “Through knowledge of what happened with early recorders for the ALEXA, one of the
weak links was found to be the BNC connection between the two units. This is why (on the later ALEXA XT) we fitted our recorder into their camera. That meant a much more secure data path,” explains Ben Perry.
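The 6.7 Gbit/s figure quoted for the XR mags can be sanity-checked with simple arithmetic. A minimal Python sketch, assuming (the article does not state these figures) that an uncompressed ARRIRAW frame is 2880 x 1620 photosites at 12 bits each:

```python
# Back-of-envelope check of the quoted 6.7 Gbit/s XR mag throughput.
# Assumed frame geometry (not stated in the article): 2880 x 1620
# photosites, 12 bits per photosite, effectively uncompressed.
WIDTH, HEIGHT, BIT_DEPTH = 2880, 1620, 12
FPS = 120

bits_per_frame = WIDTH * HEIGHT * BIT_DEPTH
gbit_per_sec = bits_per_frame * FPS / 1e9

print(f"{bits_per_frame / 8 / 1e6:.1f} MB per frame")  # 7.0 MB per frame
print(f"{gbit_per_sec:.2f} Gbit/s at {FPS} fps")       # 6.72 Gbit/s at 120 fps
```

The result lands almost exactly on the quoted 6.7 Gbit/s, which suggests the XR mag’s rating is sized tightly to a 120fps ARRIRAW stream.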
Codex Vault
Smaller recorders and the need to handle ever more data led to the launch of the Codex Vault – a workflow hub enabling cloning and archiving of media, the creation of dailies and other deliverables, plus playback and visual quality control for material from ARRI, Sony, Canon and RED cameras. It is a ruggedised unit with a flip-up touch screen and interfaces for all the different memory cards, and it’s modular so you can configure it for your particular needs. There is also a linear tape open (LTO) backup bay – currently the most secure digital storage option. For those not familiar with the format, LTO is a cartridge containing magnetic tape, developed for the computer industry in the late 1990s. Its current incarnation, LTO-6, is capable of storing 2.5TB of data per cartridge. There’s an irony that the ultimate backup solution for digital video in this age of Flash memory, optical discs and multi-platter spinning hard drives is good ol’ ferric tape!
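That 2.5TB figure translates into a surprisingly modest amount of uncompressed camera negative. A rough Python estimate, assuming for illustration ARRIRAW-style frames of 2880 x 1620 photosites at 12 bits each (my assumption, not the article’s) shot at 24fps:

```python
# Rough capacity of one 2.5 TB LTO-6 cartridge in uncompressed RAW terms.
# Assumed frame geometry (not from the article): 2880 x 1620, 12-bit.
CARTRIDGE_BYTES = 2.5e12                     # 2.5 TB native, per the article
bytes_per_frame = 2880 * 1620 * 12 / 8       # ~7.0 MB per frame
bytes_per_sec = bytes_per_frame * 24         # ~168 MB/s at 24fps

hours = CARTRIDGE_BYTES / bytes_per_sec / 3600
print(f"~{hours:.1f} hours of 24fps RAW per cartridge")  # ~4.1 hours
```

Around four hours of footage per cartridge explains why a dedicated, automated backup bay earns its place in the Vault.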
Panasonic VariCam
At the end of 2014 it was revealed that Codex had been working with Panasonic on the development of the new VariCam 35 Dock (see pages 56–59). The launch of this camera marks Panasonic’s foray into the digital cinema market. The compact recorder will capture uncompressed 4K VariCam RAW at up to 120fps. The Codex V-RAW recorder connects directly with Panasonic’s VariCam 35, eliminating cabling completely to facilitate greater efficiency and higher mobility whilst shooting. The Codex Vault system supports the rapid transfer of digital camera originals for post-production and archiving. Ben Perry explains how this collaboration came about: “They’d seen what we’ve provided for other big companies, like ARRI and Canon, and I think they decided they needed an absolute, no compromise recording solution. Yes, they could have gone and made it themselves but, if you’re trying to launch a camera into the cinema market, it makes sense to work with people who are already active in that market. They get a Codex workflow and a lot of people understand that workflow and are already comfortable with it.”
The Codex Vault: a fully integrated solution for secure copy and backup of data cards from ALEXA, C500, F55 and RED. Its modular design means it can be configured for most workflows.
Spring 2015 ZERB
ALEXA Classic with external Onboard-S recorder
The Codex Action Cam with PL lens
Meanwhile, the close relationship with ARRI has matured even further, with Codex being ARRI’s chosen partner to provide the in-built recording technology and Vault workflow system for the ARRI ALEXA 65 camera, launched in December 2014.
Action Cam
In early 2014, Codex entered the camera market when they unveiled their Action Cam. Similar in size to the now largely obsolete Toshiba TU-series miniature cameras that once ruled the in-car/skateboard/dog-cam roost (now firmly occupied by GoPro), the Action Cam has a C-mount lens fitting that will accommodate adaptors for B4, PL and EF lenses. It boasts a dynamic range of 13.5 stops and a single 2/3” Kodak CCD sensor that doesn’t suffer from skew like its CMOS competitors. Its main differentiating factor is image quality: while GoPros and other action cameras are great for TV production, when you try to match them with a higher-end camera like the ALEXA, F65 or F55, their limitations begin to show. Footage from the Action Cam can comfortably sit alongside that from these A cameras. Codex designed the Action Cam to work with a custom-built Camera Control Recorder capable of handling two camera heads for 3D acquisition. The Action Cam fits into the RAW video workflow, making it easy to deliver rushes and archive the digital camera negative. Completely portable (the recorder can be carried in a backpack or connected to the camera head at up to 85m via BNC) and able to run autonomously from battery power, it can record 1920 x 1080 12-bit RAW video at up to 60fps. That RAW video can then be converted to a choice of delivery formats, including ProRes and DNxHD, via the Codex Dock or Codex Vault workflows. Early users of the system have included Belgian DoP Stijn Van Der Veken, who used the Action Cam for a stunt sequence in the movie Alles Voor Lena; and Oona Menges on Social Suicide, a modern-day retelling of Romeo and Juliet. Radiant Images in Los Angeles have built helmet rigs and used the camera on several commercials. There are now moves to use multiple Action Cams in 360-degree rigs.
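For a sense of what that backpack recorder has to sustain, the quoted format implies a hefty but manageable data rate. A quick Python sketch, treating the 1920 x 1080 12-bit RAW stream as uncompressed (an assumption the article implies but doesn’t spell out):

```python
# Data rate implied by the Action Cam's quoted recording format:
# 1920 x 1080, 12-bit RAW, up to 60fps (uncompressed assumed).
bits_per_frame = 1920 * 1080 * 12
mb_per_sec = bits_per_frame * 60 / 8 / 1e6   # megabytes per second
gb_per_hour = mb_per_sec * 3600 / 1e3        # gigabytes per hour

print(f"~{mb_per_sec:.0f} MB/s, ~{gb_per_hour:.0f} GB per hour")
# ~187 MB/s, ~672 GB per hour
```

Well under the XR mag’s 6.7 Gbit/s headroom, but still far beyond what a GoPro-class compressed recorder writes, which is the point of pairing the head with a dedicated recorder.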
Fact File Codex London Office 60 Poland Street, London W1F 7NT Tel: +44 20 7292 6919 Fax: +44 20 7990 9906 Los Angeles Office 3450 Cahuenga Boulevard West, Unit 103, Los Angeles, CA 90068 Tel: +1 323 969 9980 Fax: +1 323 417 4969 www.codexdigital.com
Looking into the future
While Codex products have made great strides in streamlining the transition of images and metadata from production into post-production, other challenges lie ahead. There’s a need to establish secure colour pipelines, so that the look created on set is exactly what appears in the DI grading suite. There’s also a need for VFX vendors to have key production and technical data to assist and speed their work. Movie and TV producers are also looking at how best to harness the ‘cloud’ for production and post purposes. Codex is active in all these areas, developing new solutions that will integrate with other complementary products. www.gtc.org.uk
TV News from Helmand
Deserted Camp Bastion
In 25 years of news camerawork, GTC member Paul Francis has filmed in most of the world’s hotspots, including multiple trips to Afghanistan. These shoots have been consistently fraught with danger and difficult terrain, and yet it was with a sense of poignancy and some sadness that, as the last British TV cameramen in Helmand, Paul documented the final pull-out of British troops from the vast Camp Bastion. This assignment had got off to a very bad start…
In Kabul – with no bags
Since 1989, when I joined BBC News covering breaking stories around the world, I have on occasion had to soldier on with the odd bag or box of technical equipment missing on arrival at a foreign airport, but this time things were worse. My flight had been delayed leaving Heathrow but BA and Emirates staff were waiting for me on my arrival at Dubai airport and pulled out all the stops to whisk me through the vast concourses onto my connecting flight to Kabul. I only wish the same treatment had been afforded to my baggage. On arriving in Afghanistan, everyone else collected their bags from the rather primitive baggage carousel and continued on their journeys. My boxes and bags were nowhere to be seen. So there I was, separated from the satellite Bgan kit, my personal body armour, tripod and all other essential technical equipment, not to mention my personal bag containing my clothes and so on. All I had with me was my PMW-400 camera, Li-ion batteries (thanks to the new transportation regulations, I had been carrying these in my hand luggage) and a basic FCP X editing kit that I’ve always hand-carried on flights in case of
just such an eventuality. This was going to be a real challenge on a four-day embed in Helmand Province with the British and US military, filming their handover to the Afghan forces and subsequent pull-out from Camp Bastion. I was to be the last British TV cameraman to film with the final few hundred soldiers leaving the huge base for good. I’d also been elected the pool cameraman–editor–producer for not only BBC Defence Correspondent Jonathan Beale, but also ITN and Sky correspondents John Irvine and Alistair Bunkall. We would be syndicating our coverage to all our respective affiliates around the globe, so just about the whole world’s broadcasters. No pressure then! It was vital that I manage to purloin some technical gear from somewhere in the couple of hours remaining before flying to Helmand at 06:00 the next morning for the start of the embed. Luckily, the BBC has a Newsgathering Bureau in Kabul, so I raided their kit and took their tripod, Bgan and minimal other kit to cobble together a working system.
The toughest of environments
Afghanistan has undoubtedly been one of the most challenging countries in which to work over the last decade. In my career as a news cameraman I have been ‘lucky’
is tinged with sadness; somehow I always look forward to my next trip and especially to meeting up with the wonderful people I have become acquainted with there. Afghanistan will continue to endure its problems of course, but I do believe that day-to-day life has become safer, especially for the population of Helmand. Imagine the reality of not being able to leave your house to go shopping in the street outside for fear of the Taliban; or not being able to sleep, frightened that they will come knocking in the middle of the night; or most poignantly the impossibility of young women attending school to further their education. Everyone’s hope is that coalition forces will not have to return there any time soon.
You may find yourself out on patrol either getting soaked by torrential rain or wading waist-deep across a wadi or alternatively baking in the sweltering sun weighed down with 14kg of body armour capable of withstanding high-velocity 7.62mm rounds.
enough (if that’s the correct adjective) to be asked to go on assignment to a fair number of the world’s hotspots. I have covered the downfall of President Ceausescu in Romania in 1989; both Gulf Wars; the Balkan conflicts in Bosnia, Croatia and Kosovo; South Sudan, Angola, Somalia, Pakistan, Egypt, Israel, the West Bank and Gaza; and latterly three trips to Ukraine and Crimea. In July 2014, whilst being shelled in Sloviansk to the north of Donetsk, I had my camera taken from me at gunpoint by Russian-backed rebels, who accused the team of being Ukrainian spies. I’ve also done my fair share of more peaceful everyday assignments over the past 25 years, allowing me a bit more time to be creative. In Afghanistan though, everyday existence on military embeds throws up some major challenges – from coping with talcum powder-like sand finding every route into the £30,000 camera and lens; to hauling equipment on and off helicopters with gravel and dust from the landing zone spraying up like a smokescreen all around you, while at the same time being pelted with stones catapulted outwards by the chopper’s downwash; to being disgorged out of the back of a Chinook or Merlin into an unknown corner of Helmand littered with IEDs and concealed Taliban. Back on terra firma you may then find yourself out on patrol either getting soaked by torrential rain, wading waist-deep across a wadi (an Afghan river that can one moment be dry, the next flowing furiously and swollen) or alternatively baking in the sweltering sun weighed down with 14kg of body armour capable of withstanding high-velocity 7.62mm rounds. To compound this, the wet dust on the equipment then bakes solid in the fierce sun. All nightmares for professional cameramen, whether they own their equipment or just cherish that which is entrusted to them.
Top: C-130 Hercules flight to Camp Bastion Bottom: Troops arriving safely in Kandahar airbase
Of course, these acts of nature have contributed to the excellent solid construction of the compound walls surrounding most rural Afghan houses, many of which have existed for hundreds of years and have lasted many times longer than the average new-build home in the UK. Afghans have many things they could teach us about existence but this time it’s been the turn of the British and Americans to help train and create an Afghan National Army (ANA) fighting force whom I have witnessed being pretty gutsy in the defence of their own country against Taliban insurgents. I’ve been coming and going from this most desolate but also stunningly beautiful country for some years now and although I’m always relieved to leave, at the same time this
Learning from each other
TV News from Helmand
Embedded
So, back on the embed, after the flight from Kabul to Bastion on a Hercules C-130 transport plane, we set off on a whirlwind tour of the vast base, some 4 x 6km in area. We were given a confidential briefing by the top brass as to how the pull-out would happen, then had just two hours to hoover up most of the shots and pieces to camera (PTCs) that would fill the bulletins when the embargo was lifted 36 hours later after the
It felt just like the US withdrawal from Saigon must have done back in 1975 for Brian Barron, the BBC correspondent there, and I felt very privileged to be the last TV cameraman filming this historic event.
Two things I crave are the glorious sunsets and the panoramas of the snow-capped Hindu Kush Mountains that can be seen for miles. I never tire of looking out of the window on the flight into Kabul as Afghanistan has some of the most spectacular scenery in the world.
last troops were safely back on the ground at Kandahar airbase and had all been accounted for. At dawn the next day, in preparation for the embargo being lifted the following day, I set up the Bgan satellite dish and pre-fed the general shots that wouldn’t compromise operational security and the secrecy of the plan that the generals and brigadiers had been drawing up for some time. To those of us in Bastion, it felt just like the US withdrawal from Saigon must have done back in 1975 for Brian Barron, the BBC correspondent there, and I felt very privileged to be the last TV cameraman filming this historic event. When the final US Marine Corps Hueys and British Chinooks arrived at Kandahar, escorted by Apache attack helicopters, it was an amazing sight, just like a scene from the film Apocalypse Now. Anything that had any monetary value or could assist the opposition had been packed into planes shuttling backwards and forwards between Kandahar and Bastion, including everything from auxiliary ground power units for the planes, tow trucks, fork lifts and, on our last flight out, the final boxes of blood and plasma from the Bastion field hospital that had finally closed its doors for good. Its medical staff had saved countless lives and pioneered medical life-saving procedures that 10 years ago hadn’t even been thought of.
Experience counts
In preparation for assignments like these, every three years we go through a Hostile Environments training course concentrating on Combat First Aid and awareness of different environments, ordnance and scenarios that might be encountered in different situations. But there are some things that only experience brings; for example, wearing goggles to protect your eyes when boarding a helicopter; carrying dry bags and plastic bags to protect the camera and lens from being sandblasted; carrying a headtorch for trips to the ablutions after dark; having earplugs handy for the noisy plane and chopper flights; and being able to differentiate between essentials and ‘nice to haves’ that sometimes have to be ditched to lighten the load.
Keeping track of time
Top: Filming final flag ceremony, Camp Bastion Bottom: UK union flag lowering
I’ve learned the value of leaving your MacBook clock on UK time in order to keep the edit buzzing along and ensure packages are fed in time for the news bulletins. Afghanistan is 4.5 hours ahead of London time, which can get a little confusing when you’ve not had enough sleep. Covering news can be a pretty exhilarating experience but if you aren’t prepared there are many potential pitfalls along the way. With tight deadlines and ‘today’s news being history tomorrow’, good shoot–edits need equipping not only with good HD acquisition equipment but also with a quick and reliable editing platform. To this end, BBC News Field Operations have put together a Final Cut Pro X conversion training course for the newsgathering camera crews, along with equipping them with the latest MacBook Pros that can have software updates pushed to them in the field from the London Operations team, as and when appropriate. We have put a lot of work in with various potential software suppliers over the past couple of years testing different versions and running numerous workshops. In the middle of 2014 we arrived at the point where we were able to announce that we were going to upgrade our edit software from Final Cut Pro 7 to FCP X. I have been using this for over a year now, soak testing it around the world, doing fast edits for the BBC News at Six and Ten o’clock programmes, and it is performing
A long career with one employer
So, nearly 34 years after starting out as a camera trainee in BBC Wales at the Llandaff Studios (now up for sale), before moving on to Outside Broadcasts for four years at Kendal Avenue (demolished a while back), I am one of the lucky ones to have been with one employer for my whole career. And just this year, after too long a gap, BBC News is now spearheading the broadcast industry by once again taking on Technical Apprentices. A few weeks ago, while running an FCP X course, I had the privilege of seeing the benefits to both sides of recruiting keen youngsters. One of my ‘trainees’ was a recently appointed apprentice and he brought a fresh face to the course with lots of ideas and enthusiasm. Long may this continue in an industry that has sadly neglected training for far too long. And in a nice twist I am now back based at the Park Western News Operations hub at Kendal Avenue – the place I left in 1989 to become a news cameraman.
Sending video footage back via Bgan satellite terminal from Camp Bastion
extremely well, able to ingest nearly every format thrown at it and to output speedily in the correct format appropriate for real-time baseband playout or with our own in-house developed Jupiter File Exchange (JFE) FTP software. We are continuing the dialogue with the Apple application developers in Cupertino, California to further enhance the application in terms of speed and features appropriate not only for news but also to benefit the wider editing community.
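The timekeeping habit described earlier – leaving the MacBook clock on UK time, with Afghanistan running 4.5 hours ahead of London – is easy to model with the standard library. A small Python sketch using the stdlib zoneinfo module (Python 3.9+), with an arbitrary January date chosen as the example deadline:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

london = ZoneInfo("Europe/London")
kabul = ZoneInfo("Asia/Kabul")   # UTC+4:30, no daylight saving

# An example winter deadline: the News at Six, London time.
deadline = datetime(2015, 1, 15, 18, 0, tzinfo=london)
local = deadline.astimezone(kabul)

print(local.strftime("%H:%M"))   # 22:30 local time in Afghanistan
gap = (local.utcoffset() - deadline.utcoffset()).total_seconds() / 3600
print(gap)                       # 4.5
```

Note the half-hour offset is real, and because Kabul does not observe daylight saving, the gap narrows to 3.5 hours while London is on British Summer Time – another reason pinning the edit machine to one reference clock is the safer habit.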
GTC member Paul Francis has been a cameraman for 34 years and a news cameraman since 1989. He is now Global Lead, BBC News Field Cameras & Editing. Twitter: @manuelfocus See some of the Camp Bastion news reports: http://www.bbc.co.uk/news/uk-29776438 http://www.bbc.co.uk/news/uk-29776437
The equipment you need to get the shots you want
The Coach House, Ruxbury Road, Chertsey, Surrey KT16 9EP Tel: +44 (0)1932 570001 Fax: +44 (0)1932 570443 broadcast-services.co.uk
Springwatch: Reality TV in the natural world
Freelance wildlife cameraman Jo Charlesworth is now in his tenth year on the BBC series Autumnwatch and Springwatch and during this time has seen the camera technology used evolve into its current sophisticated multicamera operation based on multiple remote-controlled cameras supplied by GTC sponsors Bradley Engineering. With the luxury of this technology, the days of sitting around in an uncomfortable hide for days on end waiting for something to happen are, to some extent, a welcome thing of the past.
Path to wildlife filming
Like many I suppose, I got into my chosen career – wildlife filming – entirely by accident. I grew up in Devon, where there wasn’t a lot to do when I left school, so I tried my hand at various kinds of work. I soon found out that I wasn’t cut out to be a chef and instead got a job mixing lime mortar. I then went through a string of jobs including scaffolding, working for a welder and fixing cars in a mate’s garage, before eventually getting work as a builder, mainly doing plastering. I even had my own van! Then, while I was working on a barn conversion for a lady who had been a TV producer at the BBC Natural History Unit, I happened to meet Charlie Hamilton-James, a very good wildlife cameraman and producer with a BBC background. At the time, Charlie had his own production company making wildlife films for the BBC Natural World strand amongst others. He and his then assistant, Jamie MacPherson, now a wildlife cameraman himself, were staying in the cottage next door and filming at night, but they were occasionally around
to chat during the day. As I looked at the work they were doing, I decided it looked way more interesting than mine. Around that time, some mates were moving to Bristol, and as I was getting bored in Devon, I put my stuff in the van and set off to the city. On the motorway on the way up, I phoned Charlie to ask if he had any work and, as it happened, he needed a tank built for filming underwater shots of his tame otter for the film he had been making in Devon. I said I would do this. Funnily enough, it turned out that many of the jobs I had been doing were surprisingly applicable to wildlife film-making. I became Charlie’s assistant, mostly set-building for him, and worked for him on Wye: Voices from the Valley, another in the Natural World series. I still think this is probably the finest film I’ve worked on and Charlie was the best person I could possibly have been with. I will always be grateful to him for singlehandedly giving me a start. It was through Charlie that I was recommended to the producer of Springwatch and I’m now in my tenth year with the series.
to the programming of the cameras that would allow us to track focus in infra-red. David managed to reprogramme the cameras in just 24 hours, in time for our departure. We moved to using the Bradley cameras the next year on Springwatch at Pensthorpe Nature Reserve in Norfolk, and the quality was amazing. It did involve a big jump in cost though, so we had to use them carefully. At that time we had 15 to 20 cameras instead of the 40 or so we now use and Springwatch was still an SD production. By changing to these HD cameras at that time, my department was actually ‘HD-ready’ a year ahead of the rest of the programme. For filming nests, the Camball is my preferred camera as it allows reframing and shooting whole sequences. If I can’t use this, I will use the HD10, but it doesn’t pan and tilt, so I try to use at least two of these if possible to allow the editors to cut. If space is even tighter, then one locked-off camera and dissolves it is (sorry, editors!). We use around 17 to 20 Camballs and 13 HD10s. The cameras tend not all to be out at the same time and we would never broadcast from all 30 at once.
Birds are, on the whole, not scared of cameras. If you leave a camera on the floor, the bird will happily sit on it, but if you sit on the floor yourself, it won’t sit on your head!
Coping with the elements
Early gear on Springwatch
The first year on Springwatch I used the little 4:3 Board CCTV cameras with a C-mount lens on the front, not dissimilar to those you can now buy already fitted to nest boxes, and also larger pan-and-tilt cameras like those you see around town for the Council to watch you – all of which were pretty ordinary quality and all 4:3. In the aspect ratio conversion we had to chop off a considerable amount of what little resolution we had. As I became more involved with the series, we started to enhance what we were doing. A very kind man called Jim Dods, a BBC electronics services engineer in Bristol, gave up a lot of his time to help us design and build some of the equipment we needed. With my vague grasp of GCSE Physics, and Jim’s patient help, we started sending volts hundreds of metres down a bit of bell wire to dim the lights on a nest a kilometre away, meaning we didn’t have to go there to do it manually. Our first remote-controlled iris worked with a rubber band and a little motor on a manual iris lens, which I cut a groove in with a Dremel, and you had to keep flicking a switch at the other end to move the iris, which would inevitably not stop in the right place until the sixth attempt, or would sometimes end up unscrewing or defocusing the lens. It was very Heath Robinson but it felt like a triumph at the time.
All cameras are problematic in unpleasant environments. We often have to leave them outside in the rain for long periods (weeks, sometimes months); in Kenya we were out in lashing rain and it was boiling hot as well; or we could be working in snow or sea spray, say, filming a seal colony. The Camballs are tough and they probably get harder treatment with me than they would on other kinds of work, as our work is not like sports or entertainment. For the work I am doing, I like to use remote cameras only in places where I can’t actually go myself, either because it’s not safe or because it would disturb the subject, and to give perspectives you couldn’t get with a larger manned camera. Close and wide is my preferred position for filming with a remote camera. We have found that birds are, on the whole, not scared of cameras, but they are scared of people. If you leave a camera on the floor, the bird will happily sit on it, but if you sit on the floor yourself, it won’t sit on your head! That said, you have to take enormous care with any changes you make to the birds’ environments, because they are generally wary of change. Some species are easier than others, but as a rule you have to introduce things slowly and, frustratingly, even so, you still sometimes have to admit defeat and give up if they won’t accept what you are trying to do.
Moving to Bradley cameras
In 2008, we started using David Bradley’s cameras. I had already used them on Big Cat Live, which had been enormously hard to film. That series was shot in the Masai Mara and had to be filmed in total darkness to be broadcast live here, so much of it was infra-red. David had shown himself to be very willing to co-operate with us to make the necessary changes
Bradley HD10 mounted on bird box
What makes a good camera for this job?
It’s very important to have excellent fluid pan-and-tilt motion. This isn’t particularly a requirement on a lot of jobs but for us we use the whole range of the zoom and we need to be able to make very gentle, tiny movements, say, from the feet of a bird to its head. These movements need to be perfectly smooth, even at the long end of the lens, all done by remote control. Not many cameras are good enough to do this at all and, just to add to the pressure, we do it live on air. I have put cameras in all sorts of places, everywhere from inside a fox’s den to on the lip of a sparrowhawk’s nest. There is no other way you can get that kind of shot. You could try to get it with a long lens from a camera in an adjacent tree, but it’s not the same as getting in close with a wide-angle lens; I think that is more immersive for the viewer. The traditional camera with a large lens is more detached, but a mixture of the two makes for the best sequences. Bradley’s cameras were also used extensively for the last big flagship BBC series, Africa, on which I used them for a sequence filming wildebeest. They were also used to film lions, shoebill storks, a crowned eagle nest and a sequence on silver ants, amongst others. We were able to cut this footage seamlessly with shots captured using more traditional techniques, so that you wouldn’t even know that we weren’t using a ‘proper camera’. I even put one in the middle of a hyena’s den and got a shot from in between the horns of a dead wildebeest. The hyena looms over the camera in beautiful light and it is incredible – there is no other way you could have got that shot. Unfortunately that gem didn’t make it to air.
The control room with Bradley multi function controllers and multi camera selectors
Multicamera operation
The unique thing, in terms of wildlife TV, about Springwatch and Autumnwatch is that they allow multiple cameras to be relayed back to one place, maybe as many as 20 at once, just like a sporting event or reality TV series. The remote camera operation is housed in a VT truck with two multi function controllers (MFCs) controlling 30 cameras, with the option to switch monitors manually. This is all recorded to a four-channel EVS and the feeds into this are monitored and switched to capture the most interesting behaviour by a team of three 'story developers', 20 hours a day. This is a complete luxury; it's not like traditional wildlife filming where you might sit in a hide for all the daylight hours and for 17 out of 18 of them there is absolutely nothing happening at all. On Springwatch you can sit in comfort on a nice chair with a cup of tea and it's pretty much guaranteed there will be something of interest happening on one of the cameras almost all the time. Compare this with filming lions for the BBC where we would sit for hours on end in a Land Rover by a pride of sleeping lions, waiting for them to wake up and kill something. The key to this kind of work is the ability to remain happy while both bored and uncomfortable, and yet still be able to spring into action and not make a mistake when something interesting finally happens.

Sitting in a tiny hide for hours on end, with a VariCam, long lens and big tripod, there isn't much room left for you; it can be very uncomfortable and is usually very boring. On another occasion I remember being asked to film a long-eared bat for Autumnwatch as a standalone shoot away from the location. I was filming from inside a Toyota Hilux covered in a black drape to stop any light leaking out. We wanted to see the bat fly into a barn, catch moths and eat them. With this kind of filming, you can sit there for hours on end with the animal not doing anything. In this case, the animal was not even there, so it was a question of looking at an empty barn through infra-red and the bat may or may not fly in. I did this for four or five nights and it was pretty bleak – and still no bat. Finally, it did fly in, just once, but chose to land behind a beam, so the camera could only see its ears! The following year, I went back and tried again; still no bat, despite plenty of evidence of moth wings on the floor. It eventually took us 110 hours to get about 30 seconds of film – and that wasn't even used. As I see it, there are massive advantages to using remote cameras from a nice, comfortable OB truck, with a catering tent full of tea and biscuits nearby!
Fact File Contact Jo on: firstname.lastname@example.org See more about Bradley Engineering at: www.bradeng.com
Spring 2015 ZERB
A Variety of Options from the new VariCam
It’s been a long time coming and perhaps it’s a little ‘late to the table’ but it looks as if the wait for the recently launched third-generation VariCam from GTC sponsors Panasonic has been well worth it. Sometimes holding back a while and observing the learning curve of others can be a smart move and in this case Panasonic seems to have done just that, resulting in a system that addresses many of the production niggles that have been issues with the current crop of top-end 4K cameras.
Panasonic VariCam 35mm
Panasonic VariCam HS
Unfortunately at the time of writing, cameras were in hot demand and so we were not able to get our hands on one for a proper 'Zerb road test' so, for now (and we're sure we'll be hearing much more about this camera system in future issues of Zerb), a brief overview of what the new VariCam offers will have to suffice. Much of the information is gathered from a useful set of videos from the Digital Cinema Society (DCS) in California, at which both cinematographers and post technicians gave the camera a broadly enthusiastic reception. The videos can be viewed at: http://vimeo.com/digitalcinemasociety/videos. It is perhaps no surprise to see initial reactions emanating from Hollywood since, with the 4K VariCam, Panasonic is looking to pitch for top-end productions, measuring the new camera up against the likes of the ALEXA, AMIRA, F55 and RED. When the first generation VariCam was the hot new camera, over 10 years ago, one of its biggest selling points was its 'filmic look', at the time leapt upon by the wildlife fraternity; now, the new iteration not only retains those aspirations to beautiful cinematographic images but also offers an impressive array of workflow options required by productions, extending right through from image capture to post.
VariCam 4K and HS
The first thing to note is that there are two front-end forms of the new VariCam: the VariCam 35 (native 4096 x 2160 sensor with PL mount allowing the use of 35mm lenses) and the VariCam HS (1920 x 1080 high-speed version with a 2/3" mount). The main difference, apart from the resolution and lens mounts, is that while the 35 has a top speed of 120 frames per second (fps) in full 4K, the HS will record up to 240fps. Both versions record to the same splittable modular back end, so not only can either the 35 or the HS be separated from the record module (up to a distance of 100 feet) to make it lighter and smaller, for instance when used on a jib or in a tight space, but it will also be possible to take advantage of both sets of features on the same production without having two whole cameras – for example, using the 4K PL version for shallow depth of field 'beauty shots' and then swapping to the HD front end for slow-motion sequences.
Chips and codecs
The new super 35mm MOS sensor has been developed and built in-house by Panasonic and records internally to the AVC-Ultra codec which, as Michael Cioni, CEO of post-production and workflow specialists Light Iron, explains, offers significant advantages over other leading codecs. Firstly, it offers two flavours in the VariCam: 10-bit 4:2:2 and 12-bit 4:4:4. Cioni goes on to make comparisons with ProRes 4:4:4 1080p, which as he puts it "pretty much rules the world", whether it be for features, commercials or TV productions. When it comes to data transfer and storage, though, AVC-Ultra is about 45–50% more efficient than ProRes, meaning important cost savings in both storage and processing time. That said, an option to record to ProRes is promised for a future firmware upgrade.

Multiple formats
One area in which the new system really is unique is the ability to output several file formats and quality standards simultaneously. While 3D may not have taken off in the way that some of the camera manufacturers, Panasonic included, might have hoped, all that research was not completely in vain as it produced a sensor called the 'DYNA chip', which allows two stereoscopic streams to be simultaneously recorded. This chip has been incorporated into the new VariCam allowing dual recording, meaning the camera can capture a full-quality 4K 4:4:4 'digital negative', a 2K (conformed to 1920 x 1080) 'daily' with Rec.709 colour space, plus proxy video files on SD card, all at the same time. DoP Theo Van De Sande ASC, who shot the first project on an early prototype of the camera, sees definite advantages to this multiformat output and expects to regularly use all three: "The 4K can be the negative and never touched; the 2K will be used for editing as it's fast to work with; and the SD cards will be very accessible on set – you can quickly check back a shot for continuity, for instance. The proxy feature is a good cinematographer's tool while the 4K is an asset for the production to fall back on in the future." The camera provides for various card sizes: P2 (including a very sophisticated and powerful new expressP2 card offering 72 minutes of recording at transfer speeds up to 2.4Gbps), Micro-P2 and SD card. As mentioned in our article on pages 44-47, Panasonic has also collaborated with RAW experts Codex to create a bolt-on (cable-free) VRAW recording unit, allowing data-heavy RAW images (up to 120fps at 4K) to be recorded straight to a hard drive, effectively offering a fourth capture option.
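For readers who like to see what that 45–50% efficiency figure means in practice, here is a back-of-envelope storage sketch. The bit rates used are hypothetical placeholders for illustration only, not published codec specifications; only the efficiency ratio comes from the figure quoted above.

```python
# Hypothetical storage comparison; the ~45% saving is the figure quoted
# in the text, while the 330 Mbps base rate is a placeholder for illustration.

def storage_gb(bit_rate_mbps, minutes):
    """Gigabytes needed to store a clip at a given bit rate."""
    return bit_rate_mbps * minutes * 60 / 8 / 1000

prores_rate = 330                      # Mbps -- placeholder, not a spec
avc_ultra_rate = prores_rate * 0.55    # ~45% more efficient

shoot_minutes = 120                    # a two-hour shooting day
print(round(storage_gb(prores_rate, shoot_minutes), 1))     # 297.0 GB
print(round(storage_gb(avc_ultra_rate, shoot_minutes), 1))
```

Over a long shoot, that difference compounds into real savings on cards, drives and transfer time, which is the cost argument Cioni makes.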
The VariCam 35 on set for the demo reel
Wireless in-camera LUTs
Like most modern cameras, the VariCam has the ability to apply a LUT (look-up table) in camera to one or more of the video streams without affecting the high-resolution 'negative'. This graded video file remains alongside the 'negative' and can be used as a reference on set or later on in the edit. What is unique about the VariCam is the ability to wirelessly transmit full HD video via the optional AJ-WM30 wireless module complete with LUT applied, so the director, cinematographer, gaffer etc can instantly view high-quality images and make necessary adjustments without the need for long cable runs and conversion boxes on set. The 'negative', whether it's recorded as Log or RAW, is not affected.

Whereas in other systems the camera negative and its associated LUT are essentially created and stored on separate systems until they are later combined in post (allowing the possibility of the wrong LUT getting attached to an image), in this case all the files are stored together – not only making it harder to mix them up but also opening the possibility of creating and attaching different LUT options for each shot. American DoP Suny Behar has been closely involved with the development and assessment of the new camera. He explains: "What the VariCam has done is given the ability to paint the camera wirelessly but, more importantly, to associate that LUT directly with the file, in the file structure, so that LUT goes all the way from set to post. The fact that the camera will make all your deliverables for editorial with all the timecode burn-ins with the LUT management in one pass is a huge time-saver."

DoP Suny Behar shooting for the demo reel with the VariCam 35
5000 ISO
Another exciting feature is the ability to change the baseline ISO. Along with many other cameras on the market, the normal working ISO of the VariCam is around 800 – fine for everyday exteriors and lit sets, but not enough when attempting to shoot in darker environments. Of course, the ISO can be pushed, but when getting up to ISO 4000 noise creeps in to the extent that the image is visibly degraded.
This camera, however, offers the option to ‘recalibrate’ the baseline ISO and simply switch to an impressive 5000 ISO allowing noise-free exposure at very low light levels. When this was demonstrated recently at the Digital Cinema Society it was met with enthusiastic applause.
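For those who like the arithmetic spelled out, the gain from moving the base from 800 to 5000 ISO is standard photographic maths, nothing Panasonic-specific:

```python
import math

# Exposure gain, in stops, from raising the working ISO:
# each stop is a doubling of ISO, so the gain is log2 of the ratio.

def iso_gain_stops(base_iso, new_iso):
    return math.log2(new_iso / base_iso)

print(round(iso_gain_stops(800, 5000), 2))   # ~2.64 stops
```

Roughly two and two-thirds stops of extra sensitivity at the new baseline, without the noise penalty of pushing gain, is what drew the applause at the DCS demonstration.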
Usability
Because of the critical focus required for 4K, special attention has been paid to the newly developed OLED viewfinder, with a handy focus-check feature that allows the cameraman to zoom into the viewed image on the fly. This viewfinder can be flipped to the other side of the camera if required. The camera also has an in-built ND filter wheel with 0.6, 1.2 and 1.8 densities.
The VariCam look
Summing up, Suny Behar remarks that it is exciting that Panasonic has managed "to keep the VariCam look, keep the film idea but take it to higher bit depth, more colour definition and more spatial resolution. It's a huge improvement because we're talking about being able to shoot very high contrast scenes and resolve the shadows and the highlights… The ability to record onboard 12-bit 4:4:4 is enormous – that's a ton of colour resolution the camera never had before."
Fact File For more information about VariCam and other Panasonic professional camera solutions visit: business.panasonic.co.uk/professional-camera To watch the Digital Cinema Society videos: http://vimeo.com/digitalcinemasociety/videos
High-speed developments in slow and ultra motion
There's no doubt that slow-motion recording and playback hugely enrich the viewing experience of sports broadcasts and beyond. Use of this production technique is advancing rapidly and so too is the technology behind it, with significant steps forward even since we last looked at the subject in Zerb two years ago – in particular the ability to handle ever greater quantities of data, at faster rates, in shorter periods of time. Technology writer Mel Noonan looks at recent advances made by some of the high-speed camera manufacturers, particularly for sports OB work.
Slow motion everywhere
Slow-motion capability is increasingly available on the proliferation of cameras out there in the marketplace. As with its origins in film, to produce slow motion in video you record at a higher frame rate; when the footage is played back at the normal frame rate, you have slow motion – the higher the recording speed, the slower the motion on playback. Of course, this is a simplistic explanation. You also need to consider, amongst many other things, that the more images you shoot in a given time, the less exposure each image will get, so light and sensor sensitivity come into the equation, not to mention motion blur and the sharpness of a moving object in the frame. Many of the current cameras are aimed primarily at the production areas of TV and cinematography. A camera like RED, for example, can be set to shoot at an increased frame rate to produce smooth cinematic moderate slow motion in post. There seems to be an average upper limit for these
cameras of around 200 to 300 frames per second (fps), sometimes available only in short bursts, depending on the camera and storage.
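The basic arithmetic is easy to sketch. The figures below are illustrative, assume 25fps playback, and are not tied to any particular camera:

```python
# Illustrative slow-motion arithmetic: record fast, play back at normal rate.

def slowdown_factor(capture_fps, playback_fps=25):
    """How many times slower the action appears on screen."""
    return capture_fps / playback_fps

def playback_duration(event_seconds, capture_fps, playback_fps=25):
    """Seconds of screen time needed to replay an event."""
    return event_seconds * capture_fps / playback_fps

def max_exposure_per_frame(capture_fps):
    """Upper bound on shutter time per frame (a fully open shutter)."""
    return 1.0 / capture_fps

# A 2-second event captured at 75fps (triple speed for 25fps playback):
print(slowdown_factor(75))            # 3.0 -> one-third speed on screen
print(playback_duration(2, 75))       # 6.0 seconds of replay
print(max_exposure_per_frame(500))    # 0.002s -- why sensitivity matters
```

The last line shows the exposure problem in one number: at 500fps each frame can gather light for at most 2 milliseconds, a fraction of what a normal-speed frame receives.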
Slow motion for sport OBs
But these types of camera are not going to be found set up around a football or rugby stadium connected to an outside broadcast (OB) scanner. For this type of usage, the cameras need to offer the remotely operated, quickly accessible, fine adjustments required to enable close matching to the other OB cameras on the vision engineer's control panel and, moreover, the ability to be recorded, played back and fully integrated into an existing – often live – workflow. They also have to be 'ruggedized' to survive widely varying operational conditions. Finally, the camera operator has to have all the facilities required to do their job – communications, the usual zoom and focus demands and, most importantly, a decent high-resolution viewfinder/monitor.
Super Slow Motion (SSM)
Grass Valley LDX XtremeSpeed covering soccer in the USA
This is the term generally applied to camera systems that will record at 2x or 3x speed and integrate with a current live workflow. These broadcast camera channels started to appear many years ago, originally recording in SD, then later in 16:9 SD. Super-motion dual and triple 'phase' cameras are still widely used for sports today, now in HD, and coming soon in 4K. Sony pioneered these cameras and launched the first high-speed scanning camera back in 1984. Then the second-generation BVP-9000 was introduced in 1993. When combined with the BVW-9000 Betacam SP high-speed VTR, this was the world's first SD Super Motion (triple-speed) system. Later the VTR would be replaced with disk recorders such as Tektronix Profile, EVS and BLT.
Editec’s X10 I-MOVIX camera in rugby action
There have been various improvements in the sensitivity and image quality of these Sony Super Motion cameras, but the frame rates have never exceeded 75fps, resulting in one-third speed on-screen replays. The latest such camera in the line is the HDC-3300R, based on the proven HDC-1500R HD multi-format camera system. In addition to Super Motion output, the HDC-3300R camera also provides real-time, normal-speed images. This output is available simultaneously with the Super Motion output, allowing users to employ the HDC-3300R for both slow motion and standard shooting purposes. There are similar cameras available from other manufacturers such as Grass Valley and, more recently, Hitachi, with the SK-HD1500 High-Speed Broadcast System Camera. Klaus Weber, Senior Product Marketing Manager for Cameras at Grass Valley (a Belden brand) explains: "We always had cameras to support Super Slow Motion such as: the LDK 23, a 16:9 camera with SSM; LDK 23HS mkII; in 2006, the LDK 6200, a 2x speed HD SSM camera; then the LDK 8000 Sportcam. Then we had the Grass Valley LDK 8300 3x speed HDTV SSM camera with Anylight flicker reduction, which was introduced at UEFA Euro 2008 in Austria."
Ultra Slow Motion (USM)/Ultra Motion/Hi Mo
Different manufacturers use various terms and specifications for this. Steve Cotterill, MD of Editec, a specialist in the sales and supply of the I-MOVIX range of Ultra Motion cameras, comments: "There are many claims of ultra motion capability but, in my experience, there is little increased detail in the slow motion until you get down to, say, one-tenth speed (250fps), and you get the real 'art' when you exceed 500fps (one-twentieth speed). Also be aware of what the 'fps' actually refers to – frames per second or fields per second.
Views from the cameramen
Ben Seaman is a freelance who mainly works on sport and is a high-speed camera specialist. He occasionally does handheld work but prefers using long lenses. "Regarding the speed you can run the camera system at, you have to come to a compromise depending on what you are shooting. If it's a fast-moving sport, you have to lower the frame rate, so with football you have to settle for around 300fps; if you go up to, say, 500fps, you'll struggle to get a replay in because it takes too long to play the sequence through – you'd be missing the next action in the game. 300fps looks quite different to the super slomo 75fps, so that's quite fast enough for football anyway, and it works very well. For the World Cup we set it to 400fps. On the show jumping I change the frame rate many times during the day. I love doing the horses because when you slow it right down you see a lot of movement that you just don't see with the human eye. The I-MOVIX camera allows you to record up to 2,400fps. With the antiflicker and noise reduction circuits, the pictures that are produced are truly outstanding and add that wow factor to any production."
I do get annoyed when manufacturers quote their ‘ultra motion’ cameras as shooting at 300fps, when this is actually 150 FRAMES per second, just half the speed of existing supermotion replays (i.e. one-sixth speed). What I am interested in is the actual speed of replay the viewer sees on the TV screen, in relation to full speed.”
SSM/SM vs Ultra Motion/Hi Mo
Super slow motion has become firmly established for sports broadcasting because, over the years of development, it has become a very reliable way of inserting smooth slow-motion clips quickly into a live production, even in a fast-moving sports game like soccer. If a clip is recorded at 3x speed, a two-second clip will take six seconds to play back; if it is 10x speed, replay will take 20 seconds. If you stay with this, you could miss a goal. Cotterill comments: "Even at 250fps (one-tenth speed) you can get replays in, but they need to be very accurately cued. Don't forget, at 250fps, a replay cued just ½ second early will in real time be 5 seconds early – certainly resulting in a derogatory comment from your replay director!" However, the higher speeds that Ultra Motion offers are superb when used for analytical shots, or so-called 'glory' shots when there is a break in the play, or in the highlights. In sports like athletics and equestrian events they offer a new viewer experience and, as high-speed camera systems improve with advances in technology, we are seeing more and more of these amazing pictures on our screens.
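Cotterill's figures are easy to verify with a quick sketch (25fps playback assumed; the function names are ours, purely for illustration):

```python
# Reproducing the replay-timing figures quoted above (25fps playback assumed).

def replay_seconds(clip_seconds, speed_factor):
    """Screen time needed to play a clip recorded at speed_factor x normal rate."""
    return clip_seconds * speed_factor

def cue_error_in_action(cue_error_seconds, speed_factor):
    """How much real-time action a mis-cue of the replay point represents."""
    return cue_error_seconds * speed_factor

print(replay_seconds(2, 3))            # 3x speed: a 2s clip takes 6s
print(replay_seconds(2, 10))           # 10x speed: the same clip takes 20s
print(cue_error_in_action(0.5, 10))    # at 250fps (10x), a 1/2s mis-cue = 5s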
Advances in OB sport high-speed cameras in the last two years
Sony appears to have been concentrating on cinematography and programme production in the last couple of years, so for the moment there is no Ultra Motion OB camera system from them (yet). However, last year they developed the PMW-F55 Live system (with CA-4000 and BPU-4000) that is able to provide 6x speed. For recording, Sony has also developed the PWS-4400, a solid-state memory video server, which is capable of recording high-speed content in HD but also up to 2x speed in 4K. Meanwhile, Grass Valley's Weber explains their latest offering: "At NAB last year we
1 The new I-MOVIX X10 UHD camera system 2 PICO camera in FishFace underwater housing on Polecam 3 The NAC HI-Motion II 4 Editec’s Steve Cotterill with the new I-MOVIX system on test in Norway 5 Infiniti Red Bull Racing F1 pit stop analysis with Polecam and PICO slomo minicam
introduced the LDX XtremeSpeed, an Ultra Motion camera system producing up to six times the number of images but with a live workflow. It has already become very popular for sports broadcasting in the USA. "The 3x slow motion market had one particular advantage for many years: live workflow. For some time, this couldn't be done with Ultra Motion. It wasn't possible to get all those pictures straight out of the camera base station onto an external server. The cameras had an internal on-board recorder inside the camera head, typically based on RAM memory and a loop recorder. So you recorded the images for a fixed time in the loop recorder, running for, say, 20 seconds. After that the material was overwritten by a new 20 seconds and only if the operator actuated a trigger would the material of interest be frozen for a moment and downloaded in real time." Cotterill agrees this was the case in the past but adds: "With software development, most camera storage RAM Ultra Motion cameras now have the ability to offer 'multi-loop' recording by effectively segmenting the camera RAM into a number of 'blocks'. When one loop is stopped for slow motion layoff purposes, another continues recording, and so on. Consider an operator setting a camera to have the RAM segmented into four blocks. If the camera is triggered once, the three remaining blocks are left recording. If some material of interest is noted on the 'live' camera output, the operator simply triggers once again. At this point, two blocks are not recording. When the first loop is played back, the block now finished with is 'freed' and the material noted and triggered in the second block is made available for playback etc. Obviously, if the operator triggers four times when in four-block mode, then the camera is not recording. This may sound a little complex and does require a decent slomo operator, but there are many out there who can do this exceptionally well. With a good operator, the second the
replay is triggered and layoff initiated, the EVS can be cued to allow the material to be played out. There seems to be some thought that the process of laying off causes a great delay in availability – simply not true – the Ultra Motion replay can be the first in a sequence, if so desired. A point often not realised is that the replay output from the high-speed camera, certainly with I-MOVIX, is genlocked and can thus be fed directly to the vision mixer for even faster availability of action replay."
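The block-and-trigger logic Cotterill describes can be sketched in a few lines. The class below is a toy illustration of the idea, with names of our own invention, not any manufacturer's actual firmware:

```python
# Toy sketch of 'multi-loop' RAM recording: the camera RAM is split into
# blocks; a trigger freezes the current block for replay layoff and
# recording moves to the next free block. Illustrative only.

class MultiLoopRecorder:
    def __init__(self, num_blocks=4):
        self.blocks = [{"frozen": False, "frames": []} for _ in range(num_blocks)]
        self.active = 0   # index of the block currently recording, or None

    def record_frame(self, frame):
        if self.active is None:
            return False  # every block frozen -- camera not recording
        self.blocks[self.active]["frames"].append(frame)
        return True

    def trigger(self):
        """Freeze the active block for playback; move on to a free block."""
        self.blocks[self.active]["frozen"] = True
        free = [i for i, b in enumerate(self.blocks) if not b["frozen"]]
        self.active = free[0] if free else None

    def release(self, index):
        """Block played out: wipe it and make it available again."""
        self.blocks[index] = {"frozen": False, "frames": []}
        if self.active is None:
            self.active = index

rec = MultiLoopRecorder(num_blocks=4)
rec.record_frame("goal")
rec.trigger()                  # block 0 held for replay; block 1 now recording
print(rec.active)              # 1
rec.trigger(); rec.trigger(); rec.trigger()
print(rec.record_frame("x"))   # False -- four triggers, nothing left recording
rec.release(0)
print(rec.active)              # 0 -- the freed block resumes recording
```

The final two lines mirror Cotterill's warning: trigger four times in four-block mode and the camera is no longer recording until a block is played out and released.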
NAC/Ikegami
NAC Image Technology is a Japanese company that has collaborated with compatriots Ikegami to produce an Ultra Motion camera called the NAC Hi-Motion II, which they say will operate at speeds approaching 1000fps in HD, and is a RAM-based system that can record and replay at the same time. The first units were delivered in early 2012. Prior to that, the Hi-Motion I was a collaboration with Panasonic and marketed through ARRI. The Hi-Motion II is of standard broadcast camera handheld design with three sensors, a B4 lens mount, and offers a continuous live as well as Ultra Motion output. As with all high-speed camera broadcast integrations, industry-standard SMPTE fibre (up to 2km) is used between camera head and CCU. A flicker suppression feature was introduced in mid-2012 and is standard on all the 100 or so Hi-Motion II systems worldwide. There were 22 Hi-Motion II systems in use at the London 2012 Olympic Games.
Polecam/LMC – Antelope PICO
This is a new and recent development that is an offshoot of specialised Ultra Motion for sports in a very small package. The camera part of this is a partnership between Polecam in the UK and LMC in Germany. LMC Technical Director Christian
Schreiber explains: "We have a range of high-speed cameras under the Antelope name, but in the past couple of years, besides getting higher and higher frame rates, our goal has been to reduce the size of the camera to a minimum and basically this is our main advantage compared to other companies providing high-speed cameras. With the Antelope PICO, it is possible to put the camera into many new environments to shoot high-speed footage that has not been seen before. We recently collaborated with Polecam to fit the PICO into a modified Polecam FishFace underwater housing and are now able to shoot high-speed footage not just underwater but using the remote head of a Polecam, so with a moving POV capability both above and below water. This is a milestone in high speed, I think, which has never been seen before on live broadcast. We also integrated the deflicker solution of the Antelope PICO as an option, so basically no additional hardware is needed to shoot high-speed pictures under artificial light. We have a broadcast remote designed especially for broadcast operators. We can also do a zoom, focus and iris remote for the PICO, for C-mount lenses as well as B4-mount lenses." The Antelope PICO operates at up to 350fps in HD. Steffan Hewitt, Polecam's designer, adds: "I've just returned from covering the 25 metre World Swimming Championships in Doha, Qatar. I started the shoot with a Toshiba HD camera on the Polecam but once FINA saw the footage from the Polecam/PICO tests we did in Luton (www.polecam.com/latestnews/679-fishface-pico-slow-motion-swimming) they asked XD Motion and Polecam/LMC to see what they could do in the short time left. The last two days saw the PICO in action generating some fantastic footage. Recently, quadruple Formula One World Champions Infiniti Red Bull Racing have enlisted the help of Polecam for pit-stop analysis and have shaved 2/10ths of a second off a pit stop. So many critical events happen in typically 2.5 seconds and they are all important.
When it all happens in the blink of an eye, but we want to see it instantly in fine detail, the Antelope PICO on the end of a Polecam makes it easy."
I-MOVIX
Located in Mons, Belgium, this is a company which has put a huge effort into R&D to move Ultra Motion forward for the broadcast market. They work in partnership with Vision Research in the US, who make a range of high-speed cameras right up into the Hyper Motion category. I-MOVIX CEO Laurent Renard discusses the advances made over the last two years: "New technology has allowed us to increase both bandwidth and data rate to the point where we can now transfer, analyse and process the signal in real time; we are now able to work with a really huge amount of data. It means that I-MOVIX is able to process RAW video with a signal that can be more than 50 gigabits a second. Now we are able to work with RAW images going out of the memory of the camera continuously. We are able to work directly with the sensor of the camera, and we no longer need to use the internal camera memory. We work continuously with the camera as if it were a standard broadcast camera, and we can work in HD at up to 600fps continuously."
Views from the cameramen
Steve Cotterill at Editec has been working closely with I-MOVIX since purchasing their very first Sprintcam Extreme Motion SD camera, based upon the Japanese Photron industrial camera. “This camera, it has to be said, was a proof-of-concept device… Although it was used on some live productions, the image quality was such that matching into existing cameras set-ups was not really possible. Operationally though the camera was a good performer, offering speeds up to 10,000fps at SD resolutions.” Now the operational control panel (OCP) for the new X10 UHD camera offers OB engineers the ability to quickly and automatically match the referenced colour matrix settings to the standard cameras in the OB scanner, e.g. Sony or Grass Valley.
Flicker
Flicker has been a problem with slow-motion recording, especially under stadium lighting. In Europe, with the 50Hz AC supply, the positive/negative swing of an AC cycle means that the lights are actually flashing on and off 100 times per second. We don't see it because of our persistence of vision, but if you are recording pictures in this environment with a high-speed camera, as you operate at faster speeds, the sequence of images you record will have different amounts of illumination and eventually you can even get an image that is recorded while the lights are dark. This causes a flicker effect on playback. It can also happen with electronic signage and hoardings. All the manufacturers have been working to incorporate flicker reduction technology, and it's improving all the time. Most say they can now reduce this to small or negligible levels, but I-MOVIX has recently launched a high-level processing system called d-flicker which claims to eliminate flicker altogether. Cotterill from Editec is enthusiastic: "We now have d-flicker here in the UK and it does exactly that!"
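The underlying sampling problem can be shown numerically. The lamp model below (a rectified sine pulsing at twice the mains frequency) is an idealisation for illustration only, but it demonstrates why consecutive high-speed frames collect different amounts of light:

```python
import math

# Why 50Hz lighting flickers in high-speed footage: lamp output pulses at
# 100Hz (twice the mains frequency), so a frame exposed near a trough of
# the cycle collects far less light than one exposed near a peak.
# Idealised lamp model, for illustration only.

def frame_exposure(frame_index, capture_fps, mains_hz=50, steps=1000):
    """Light collected during one frame, numerically integrating the
    rectified-sine lamp output over that frame's interval."""
    start = frame_index / capture_fps
    dt = (1 / capture_fps) / steps
    return sum(abs(math.sin(2 * math.pi * mains_hz * (start + k * dt))) * dt
               for k in range(steps))

# At 500fps there are five frames per 100Hz brightness cycle:
exposures = [frame_exposure(i, 500) for i in range(5)]
print(round(max(exposures) / min(exposures), 1))  # brightest vs darkest frame
```

The brightest frame in the cycle collects roughly three times the light of the darkest one, and that periodic variation is exactly the flicker that d-flicker and similar processing set out to remove.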
Viewfinders GTC member Jim Cemlyn-Jones is a high-speed specialist freelance cameraman. “As in wildlife photography, which has improved dramatically as camera speeds have increased, the Ultra Motion/Hi-Mo cameras have enabled us to improve the coverage and analysis of sport, with amazing images that up until now we haven’t been able to achieve with the Super SloMo cameras. It has opened up a whole new world of creativity. From an operating point of view, sometimes it’s the unexpected or mundane that becomes exciting through seeing it in this way. As an operator of the system, you have your own blank canvas – you’re on a plane above the normal slomo. Suddenly you get these stunning images of simple things elevated to another level. I’ve been amazed by the flex of a cricket bat, how much it actually bends when it connects with the ball; a galloping horse slowed right down – a thing of beauty. The bounce and spin of a snooker ball is fascinating to watch in Ultra Motion; detail you just won’t see with the human eye. Sometimes the wider shots work too. As a cameraman with an Ultra Motion system, it’s tempting to zoom right in, but I’ve found that sometimes it works well on the wide shot too – like the horse galloping. It’s exciting to have such a creative tool as this, and to seek out the shots that it’s able to provide. We’re unravelling a new world of physics that we’ve been unable to see until now, and it’s very enjoyable as a cameraman to work with it.”
Viewfinders All the manufacturers now say they offer OLED viewfinders on their high-speed cameras. Editec, when supplying I-MOVIX cameras on rental, offer the latest Sony OLED viewfinder (PVM 640) but also the earlier Panasonic LCD (BT-LH80) that some cameramen seem to prefer.
Final comment A final comment from Steve Cotterill: “Producing the magical images these cameras can give you is a real team effort – the operator pointing the camera, the vision engineer and, of course, the slomo operator are all very specialist roles. Here in the UK I believe we are gifted with some of the best TV professionals in the world.”
Fact File See more about all the cameras and technology discussed:
www.pro.sony.eu/broadcast-solutions www.sony.co.uk/pro/hub/home www.grassvalley.com www.nacinc.com www.antelope.tv www.polecam.com www.i-movix.com www.editecuk.com
Spring 2015 ZERB
The GTC would like to thank all the sponsors for their generous support
Camera review: Sony A7S
The Sony A7S One of the most talked about new cameras on the GTC Forum recently, notable above all for its exceptional low-light performance, has been the Sony A7S. Luckily we didn’t have to go hunting around for a review model to find out more about it as GTC member Mark Langton had already taken the plunge and invested in one. So, does it live up to all the claims?
Many of you will already be aware of the Sony Alpha ILCE-7S camera (or A7S to its friends) released in mid-2014. This is a curious little marvel of photoelectric engineering with amazing low-light sensitivity. In the flesh, the first thing you’ll notice is its diminutive size: it’s significantly smaller and lighter than the Canon 5D MkII or III. Technically it’s a DSLM (digital single-lens mirrorless) camera, which is relevant because not having a reflex mirror like a traditional DSLR means the camera body is slimmer and the sensor can sit nearer to the lens. As a consequence, almost any stills lens can be attached to the camera’s E-mount via an appropriate adaptor. If you currently own nice full-frame Canon or Nikon glass, for example, you will be able to use it on the A7S, and lenses designed for APS sensors can also be used, thanks to the A7S’s ‘crop mode’. Whether you love or loathe hybrid stills cameras, the A7S is an important development because I believe its low-light capability is a very significant indicator for broadcast and digital cinema camera technology in general. It could be a clue as to where we will see cameras heading in the very near future, and when I say ‘low-light’ I’m talking about a useable ISO of 10,000 and higher. With that kind of sensitivity you could light a whole set with a few candles and still have room to spare; you could shoot a night scene using moonlight as your background source; night-time cityscapes sizzle with vibrant colour – you get the idea. By comparison, 64,000 ISO on the A7S shows a similar amount of visible noise to +18dB on a broadcast 2/3” camera. There is another real benefit to having an extremely low-light-capable, full-colour, low-noise sensor: it adds yet another level of creative freedom to the camera operator’s toolbox.
Being able to shoot clean video at lux levels previously deemed unworkable means you are no longer slave to the camera’s sensitivity/noise limitations: you can finally choose your preferred iris setting and dial in the gain to match.
Real-world use Okay, that’s all sounding good so far but is there really a place for the A7S in professional camera work? How would it fare in a broadcast environment?
Well, technically, it can satisfy the requirements for broadcast acquisition but not straight out of the box. Although the A7S ships with a superb implementation of Sony’s XAVC codec, similar to that found in the F5, F55, FS7 etc, it is in fact the slightly watered-down XAVC-S variant. In the case of the A7S at least, this is limited to a data rate of 50Mb/s (megabits per second) with sub-sampled colour at 8-bit 4:2:0, and files are ‘wrapped’ as MP4 rather than MXF, although it’s the same H.264/MPEG-4 AVC level 5.2 file structure. The video this camera records internally is really nice, with lots of detail and smooth tonal range. Flesh tones do lack a bit of warmth, but in my experience that’s a general Sony trait with their single-sensor CMOS cameras, and a simple bit of tweaking will get you your desired look. To get a compliant broadcast image, however, you will need an external recorder to take advantage of the uncompressed 8-bit 4:2:2 output via the micro-HDMI port in either full HD (1920 x 1080) or 4K (3840 x 2160). For all other purposes the internal codec is perfectly capable: when XAVC-S is compared side by side with ProRes at 150Mb/s it’s actually hard to tell them apart.
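To put that 50Mb/s cap in context, a quick back-of-envelope calculation (assuming 1080p25 and ignoring audio and container overhead) shows how hard the codec is working:

```python
# Rough data-rate arithmetic for the A7S's internal recording
# (illustrative figures only; assumes 1080p25).

width, height = 1920, 1080
bits_per_sample = 8
fps = 25

# 4:2:0 carries one Cb and one Cr sample per 2x2 block of luma pixels,
# i.e. 1.5 samples per pixel overall.
samples_per_pixel = 1.0 + 2 * 0.25

uncompressed_bps = width * height * samples_per_pixel * bits_per_sample * fps
xavc_s_bps = 50e6  # the A7S's internal XAVC-S ceiling

print(f"Uncompressed 8-bit 4:2:0: {uncompressed_bps / 1e6:.0f} Mb/s")   # 622 Mb/s
print(f"Compression ratio: {uncompressed_bps / xavc_s_bps:.1f}:1")      # 12.4:1
```

So even before grading enters the picture, the internal files represent roughly a 12:1 squeeze on the raw sensor output, which is why the uncompressed HDMI feed matters for demanding work.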
Sensor Sony has created something truly impressive with this full-frame Exmor, 12.2 megapixel, high-sensitivity sensor boasting 15.3 stops of dynamic range. Behind the scenes is a Sony-built BIONZ-X processor that is fast enough to process either HD or 4K video straight from the sensor without line skipping or pixel binning. For a primarily stills-based DSLM, 12 megapixels doesn’t sound a lot by today’s standards but therein lies the secret to the A7S’s remarkable video performance: it means each photosite (single pixel) can be bigger and therefore capture more light. Pair this with a maximum 409,600 ISO and it’s verging on night-vision! For our purposes as professionals we’re realistically looking at a maximum of 20,000 ISO before things start to get visually noisy – comparable to the noise at +9dB gain in a broadcast camera. In fact the only problem I found with this camera was remembering to take ND filters for daytime shoots because the image is too bright even at 100 ISO.
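Those ISO-to-gain comparisons follow from a simple rule of thumb: each doubling of ISO is one stop, or +6dB of video gain. The sketch below works backwards from the figures in the text; the reference ISO of 8,000 is my own back-calculation from those comparisons, not a Sony specification.

```python
import math

def iso_to_gain_db(iso, ref_iso):
    """Video gain in dB for a sensitivity ratio: +6 dB per doubling
    (20 * log10 of the signal ratio)."""
    return 20 * math.log10(iso / ref_iso)

# If a broadcast camera's 0 dB noise floor corresponds to roughly
# ISO 8,000 on the A7S (an assumption inferred from the article):
print(f"{iso_to_gain_db(64000, 8000):.1f} dB")  # ~+18 dB, matching the text
print(f"{iso_to_gain_db(20000, 8000):.1f} dB")  # ~+8 dB, close to the quoted +9 dB
```

The two quoted comparisons (64,000 ISO at +18dB, 20,000 ISO at +9dB) are at least self-consistent under this rule, which is reassuring.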
1 An active EOS to NEX (also known as EF to E-mount) lens adaptor is required to communicate with the Canon EF lens iris and to power the IS (image stabilizer). This Commlite model costs £58 and works very well, but if you’re feeling flush you can always opt for the £400 Metabones model.
2 The internal battery life is poor, but if you’re handy at soldering you can build this external solution for around £20. It uses good old Sony NP-L series batteries for hours of run-time. 3 A7S front view
Rolling shutter skew is present when the camera is panned quickly. It can be reduced somewhat in APS-C crop mode, but you will never get rid of it entirely.
S-Log 2 Another unique feature is the inclusion of Sony’s S-Log 2 gamma curve, which maximises the dynamic range of the sensor in video mode. Based on the same curve found in the F5, F55, F65 and FS7 cameras, it gives the option to capture much more highlight and shadow detail than would be possible with a standard video colour space based on Rec.709. But this does mean the recorded video will need to be graded to restore contrast and colour. Also, shooting S-Log on this or any camera will show more noise than you’re probably used to because S-Log was originally designed to be used in digital drama production where it is normal to add noise reduction in post. In TV we’re expected to hand over clean pictures to the client, so use Log curves with caution and make sure your client understands them. Send them a sample video file if necessary. Unlike the bigger cameras, there is no option on the A7S to record Log while viewing with a LUT (look-up table); everything you see on screen is being ‘burnt in’ to the recorded picture and, more importantly, if you are trying to light your scene using S-Log 2 as a visual reference you may get unexpected results. So if you intend to go down the S-Log 2 route you may need a monitor with built-in LUTs to ensure you’re not over/under-lighting your set, and to stop
your director/producer asking why it’s all looking milky. Fortunately there are seven ready-to-use preset ‘PP’ (picture profile) ‘looks’ and, thanks to a deceptively powerful submenu hidden within the camera’s software, these ‘looks’ can be customised a fair amount and stored in the camera’s memory. You have access to settings never before seen on a camera of this size: black level, knee, colour curve, gamma curve, colour phase, colour depth etc. Plenty to fiddle with if you’re the sort of person who can’t leave things alone! Overall I’m not convinced S-Log 2 is a good idea if you are recording to the internal XAVC-S because of the 4:2:0 colour: three-quarters of the chroma information has been discarded from a video signal that only has 8 bits of colour per channel to start with. S-Log 2 is intended to be graded but there is very little data for the computer to work with, so record externally if you intend to perform anything more than the mildest of colour tweaks. It makes me wonder why Sony didn’t provide a 10-bit output like Panasonic’s GH4 or Blackmagic’s BMCC.
Frame rates Unlike the North American version, which is limited to NTSC-centric frame rates, the European A7S has all available options from 24P to 60P in full HD, plus up to 120fps at 1280 x 720 HD. At 1280 x 720 the ‘slo-mo’ function is surprisingly good, although there is a small amount of aliasing. There doesn’t appear to be any limit on the amount of slo-mo you can record because it doesn’t need to buffer the video like the FS700 used to. However, this does mean you can’t review the slow-motion video in camera – you need to prepare it with an editing programme like FCPX or Premiere. 50fps is very useful as it matches the fluidity of 50i interlaced video and can therefore be used for fast action like sports, or for news and current affairs.
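The slow-down you get from that 120fps mode is just the ratio of capture rate to playback rate. A tiny sketch (assuming a 25p timeline, as is usual in the UK):

```python
def slowmo(capture_fps, playback_fps, real_seconds):
    """How much a clip stretches when high-speed footage is conformed
    to a normal playback rate in the edit."""
    factor = capture_fps / playback_fps
    return factor, real_seconds * factor

factor, screen_time = slowmo(120, 25, 10)
print(f"{factor:.1f}x slower: 10 s of action plays back over {screen_time:.0f} s")
# -> 4.8x slower: 10 s of action plays back over 48 s
```

Conforming the same footage to a 50p timeline would halve that to 2.4x, which is why 120fps material stays usable across both delivery rates.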
ALL PHOTOGRAPHS BY MARK LANGTON
4 A frame of HD video 1920 x 1080, 25P at only 4000 ISO. The scene is lit by a single candle. Lens: Sigma 28mm f1.8 prime. 5 A metal cage like this one from Movcam will protect your investment against damaged ports. It also allows for attachments like a radio mic receiver or external monitor/ recorder. The downside is trying to reach all the buttons and dials.
and exposure aid. There is also a small OLED viewfinder, which is good enough to get you out of trouble on a sunny day, but realistically if you are shooting video you are going to need an external monitor. The body is crammed with assignable buttons and dials, and being such a small camera things start to get cramped very quickly. Rather more of a negative though is the menu, which is, quite frankly, a mess. There are seven main pages containing a total of 26 subpages and I found myself constantly having to flick from page to page to change fundamental, frequently used settings. There is a customisable ‘Function Menu’ (a sort of
Things you should know The Good • Professional timecode and user bits. • Extra functionality like ‘time lapse’ can be added with apps from Sony’s website. • Moiré is well controlled, thanks to the direct readout from the sensor. • Internal XAVC-S produces excellent, detailed video – better than the Canon 5D MkIII. • Maintains good colour accuracy and dynamic range at high ISO.
• Sony S-gamut colour space option (but with limitations due to the 8-bit processing). • At least 13 stops of dynamic range in S-Log 2 mode. • 15 stops of dynamic range in 14-bit RAW stills mode.
The Not So Good
A half-decent preamp circuit inside the A7S means the sound quality is pretty good for a small camera. It’s capable of recording clean audio from a radio mic or self-powered shotgun mic directly into the 3.5mm mini-jack, useful if you have to travel light. If you require more control, you can buy the optional XLR-K2M adaptor kit. This includes a two-channel line/mic XLR adaptor and ECM-XM1 short shotgun-type camera mic. The adaptor is basically a mini mixer/preamp with two independent rotary controls and selectable 48V phantom power. It takes its power directly from the Sony proprietary ‘Multi Interface’ hot shoe mount with extra pins. The audio signals also travel into the camera via this route, which means no extra wires.
• Poor internal battery life: approx 30–40 mins in video mode.
Layout There is a 3-inch TFT screen on the back for real-time preview and menu display. It articulates up and down, but not to the side like the GH4, which is a shame. The screen’s resolution is okay but it’s just too small to be used as a focus
• Rolling shutter skew is present. • The EVF eye proximity sensor is too sensitive, switching off the screen when anything is within 5cm of the eyepiece. • Maximum video record time is 29 mins 59 secs per file (Sony says this is to protect the unit from overheating but I strongly suspect it’s to avoid the 5% Customs Duty for being classified as a camcorder.) • High-end SDXC cards required to record video in XAVC-S. • Can’t record 4K internally (like the Panasonic GH4). • Peaking function is average quality and not as good as in Blackmagic cameras. • Micro-HDMI port is very delicate. A port protector is included but you ideally need a cage to protect all the sockets. • LCD screen is too small for critical exposure and focus so you will need a monitor. • Can’t shoot stills in video mode like the Canon 5D.
quick launch page) but the camera doesn’t allow you to add certain basics like headphone volume, timecode set or APS-C sensor crop mode (essentially acting as a lossless 2x doubler). No Brownie points there!
Stills I know this is a review of the video functions but the quality of the stills is worth a mention. You may think 12MP isn’t enough for modern photography when other players are boasting 24 or 36MP sensors, but I have to tell you I underestimated the A7S. Those extra-large photosites on the full-frame 35.4 x 23.8mm sensor produce images of exceptional clarity, with superior low-light ability that the more densely populated chips simply can’t match. You could even argue that fewer pixels actually make for a better image. My Canon L-series lenses work perfectly (auto-focus is slow, but then I’m a manual focus kinda guy anyway). The photos pack plenty of vibrant punch; they don’t possess the organic warmth of Nikon or Canon as they’re a little too sharp out of the box, but not severely so and you can always dial it down to your liking.
Summary ‘Real’ cameras with good ergonomics like the Sony FS7 and ARRI AMIRA are making a comeback (yay!) but there is still a place for small, light, discreet cameras capable of producing flattering, broadcastable images. I will continue to use a larger camera when needed but there are plenty of occasions in the current climate where it is just not practical, either logistically or financially, to take a full-size kit on a job. I like the fact that I can continue to use the same favourite lenses I used on the 5D MkII and C300, and the visual quality of the A7S certainly stands shoulder to shoulder with the C300. Add a decent monopod and I can go all day without wrist and back pains.
Having the ability to take high-quality still photographs is a big bonus for me. It will also make a good B-camera to the larger Sonys due to its very similar colourimetry. I didn’t delve into the 4K ability mainly because there are very few 4K recorders out there at the moment but also because, although Rec.2020 is all polished and ready to replace Rec.709 as the standard set of technical guidelines, 4K is still in its infancy when it comes to broadcast. Currently it is sparsely scattered across a few corporate productions and archival documentary projects. There’s plenty of time before it becomes mainstream, but it’s good to know that, when it does, the A7S is ready for it. Yes, this camera has a few annoying quirks and is not perfect by any means – but it’s a lot of camera in a tiny form factor with incredible, unique features just not found in other hybrids.
Fact File GTC member Mark Langton began his television career in 1993, firstly as a video tape editor then as a studio and PSC cameraman. He taught himself lighting and cinematography through reading books, studying films and pestering other cameramen. He is now an established DoP/lighting cameraman with many documentary and primetime programme credits under his belt, including Tomorrow’s World, Bang Goes The Theory, Horizon and Top Gear. He also makes an occasional short film here and there. Contact Mark on: email@example.com See more about the Sony A7S at: www.sony.co.uk/electronics/interchangeable-lenscameras/ilce-7s
CREDIT: PA IMAGES
TAKE 2: a second chance at life Many Zerb readers will know Mark Print (aka Pronto), a long-standing member of the GTC (in fact a former Zerb editor, way back in 1996), who started his TV career in Scotland and, in recent years, has been the overall camera crew supervisor for Sky football. Some will also know that Mark was dealt a cruel blow a few years ago when he became very ill with a condition that threatened to end his career and even take his life. So, those of us who are lucky enough to have known Mark for a long while were delighted to see him taking an active part at the GTC ‘Day in the Country’ last year, not to mention back at work on the football circuit - heartwarming signs that he has been given ‘a second chance at life’. Mark tells his story for Zerb.
During the past four years the old saying ‘Life is not a dress rehearsal’ has come to mean a lot more to me than I had ever imagined it would. I have been extremely lucky to do a job I love since leaving school at 18. All seemed to be going well, but then in the space of a few months I came to realise just how fragile life can be and how we take the simplest things – like breathing – for granted. To give you a brief background, I’ve lived in Scotland since 1987, when I joined BBC Scotland as a trainee television camera operator. After a fantastic seven years working in the studios, on outside broadcasts and doing location filming, I took the leap into the freelance world. After dipping my toe into 16mm and 35mm film and various other things, I have spent the majority of the last 13 years working on the Sky football circuit and lately been the overall camera crew supervisor. But anyway... back to the ‘game changer’ as the commentators like to call it.
A slight cough After a couple of months of gentle persuasion from my lovely wife Samantha, I finally got around to visiting my GP to ask about a dry cough and slight breathlessness Sam said I had been showing for the past year. To be honest, I went along fully expecting to be told it was my age – 42 at the time – and nothing more. How wrong could I be! It took less than six months to go from GP’s surgery to local respiratory consultant to diagnosis. All the while I was still working hard and even went to South Africa as part of the World Cup host broadcast camera crew, at the time having no hint of the bombshell about to be dropped. On my return from South Africa, in July 2010, I was informed that my CT scans were showing up some anomalies and the medics would like to do a lung biopsy to nail down a diagnosis as they suspected I might have something called idiopathic pulmonary fibrosis (IPF). Basically this is a very nasty lung disease that hardens your lungs, with scar tissue preventing the exchange of oxygen into the bloodstream and eventually leading to your no longer being able to breathe… oh, and there is no medication that can cure it at the moment. Life expectancy from diagnosis is approximately five years. As a non-smoker all my life, to hear this about my lungs came as no small shock, but sadly the clue is in the name: ‘idiopathic’ means unknown cause.
Drug-fuelled My lung biopsy was performed quite soon, in August 2010 – not a pleasant experience – and it confirmed the diagnosis of IPF. That’s where the fun and games really began. My consultant explained that, while there was no recognised drug therapy, sometimes a three-drug cocktail would slow the progress, so I was started on a mix of drugs including some heavy doses of steroids. These tend to send you a bit wacky and hyperactive, meaning that one hour of sleep a day seems adequate. At the time this seemed like quite a blessing as I had agreed to supervise Sky’s 3D coverage of the Ryder Cup – and boy do those drugs keep you going! Don’t try this at home though as the downside is you tend to become highly emotional and crying becomes a regular occurrence.
After two months there was no noticeable improvement nor stabilisation of the disease, so I was slowly withdrawn from the drugs as long-term use is likely to cause a large amount of harm from the side effects. So, what next?
Transplant – the only option “Well, it looks like we’ll have to look at a lung transplant as the final option.” I do like my consultant – completely honest and to the point since Day One. He explained that referral and assessment could take up to a year and that, even then, I might not be accepted as a suitable candidate. It’s all to do with being ill enough to need a transplant and yet healthy and strong enough to survive the major operation, a decision that only the transplant team can make. Sam and I went home to take this on board. We had known that things might go downhill, but this was only two months since the full diagnosis and I was still working as normal. Was this all really necessary, we wondered. Over the Christmas period I underwent more tests and my file was sent to the Freeman Hospital in Newcastle-upon-Tyne where any lung transplant would be performed. Meanwhile, life went back to normal as we waited to see if Newcastle could offer any hope of a solution. Work continued, including supervising the Champions League Final at Wembley between Manchester United and Barcelona. However, I was beginning to be really aware that my breathing wasn’t as good as I would like and I was also starting to feel a bit light-headed on some days, with occasional bouts of pins and needles in my hands. Maybe things really were changing internally. July 2011 came around and we travelled to meet the transplant team to see if I passed their basic requirements. I seemed to tick all the boxes and so they were happy to progress with the proper assessment, which would involve four days in Newcastle for a full MOT. This was to satisfy everyone that I would be able to withstand all the pressures that transplant brings – both physical and mental. They also suggested I might benefit from using oxygen when going out, to ease the increasing breathlessness I was experiencing.
Left: Moscow for the Champions League Final, 2008 Right: August 1994 - last day as staff at BBC Scotland
It was after this meeting that Sam and I reluctantly agreed that continuing to operate a camera, especially outdoors, was becoming too tiring and might well cause my health to deteriorate even more rapidly. Thankfully, the financial pressures this decision would usher in were eased to some extent by an income protection policy I’d taken out when I left the BBC in 1994. This proved to be a godsend and I would encourage any freelancer reading this to ensure that they have some sort of cover in place should things go wrong.
With hindsight, I should also have accepted the advice from the GTC to contact the CTBF for help but I was stupidly reluctant to seek help from others – crazy, because this is exactly what they’re there for, as Chris Yacoubian pointed out in the last edition of Zerb. So, the MOT took place in Newcastle that August: I was poked, prodded, bled and examined from all angles; then, in a small office, we were told that I fitted all the criteria and was suitable to go on the transplant list for a single left lung. This, it was explained, was the best option for getting the quickest possible match for transplant and we were reassured I would be able to survive with one decent lung. Despite this significant step forward, I wasn’t quite sure how to feel – happy to be given the chance of a transplant or sad to think this really was my only hope of survival.
The waiting game It’s hard to put into words quite how it feels to be on the transplant list, waiting for that call to come at any time, knowing that this might be your only lifeline. But at the same time it would mean a major operation – one that was not guaranteed to be successful. Maybe I could just continue as I was and it would all blow over. My mind was for ever churning… Where’s that crystal ball when you need one? Four months on the list passed with no phone calls but plenty of sleepless nights and worry, especially when we were out and lost phone signal. Then it came… at 2.00 in the morning. The phone rang. There was a potential match, I was informed by my transplant co-ordinator. Within 10 minutes the ambulance was there – just time to get dressed and splash my face with water. Off I went, blue lights all the way to Newcastle, with Sam following in the car after waking up our lovely, understanding neighbours who had offered to look after our two dogs whenever we got the call. Seven hours later, after a raft of blood tests and X-rays and so on, the co-ordinator appeared: “I’m sorry, the donor lung is no good; the transplant won’t be happening.” We felt at rock bottom, but at the same time this was mixed with a slight tinge of relief. Was it right to feel happy that it wasn’t going ahead? The emotions were all mixed up; no wonder they had emphasised the mental strain of being on the transplant list at the time of the assessment. This pattern would be repeated another gruelling nine times during the following 18 months. Sometimes we would be told on arrival there were problems; on other occasions we would go through long hours waiting for the decision. This is unfortunately inevitable due to the fragility of the lungs when someone dies and the rigorous testing they have to go through before being accepted for transplant.
1 On camera at the South Africa World Cup, 2010 - just before the bad news
2 Out with Sam, oxygen bottle in tow
3 The day after the transplant, complete with ‘new’ lung
4 Day 500 at Newcastle post-transplant
All the while, my health was deteriorating rapidly, meaning I was using oxygen 24 hours a day: in the house hooked up to a machine by a 40ft hose, which the dogs loved to stand on, and out of the house with a liquid oxygen cylinder, which only lasted for two hours as it was set to deliver 6 litres a minute.
Happy Anniversary This was the state of play on 2 April 2013, when Sam and I managed to go out and celebrate our 20th wedding anniversary, oxygen bottle in tow. We enjoyed a superb meal and spent a lovely evening together. Then, the very next morning, the phone rang at 10.30. It was my co-ordinator: they might have a match. Tenth time lucky, we joked. At 20.30 that night, we were given the news – the lung was good and it was mine if I wanted it. How could I say no? This really was my last chance as life was becoming a real struggle. We had already begun to accept that Christmas might not happen for us. As I was put under and drifted off into a nice sleep, it was Sam who experienced the most stressful time waiting to hear how things were going. For me, however, the next thing I knew was that it was midday, 4 April 2013. I awoke to find myself surrounded by lots of beeping machines and I remember being offered a cup of tea once my breathing tube had been removed. I’d made it; I had a ‘new’ lung. It wasn’t all plain sailing post-op though. Three days later I went down with sepsis, giving everyone another big scare. This was followed by a slow and painful recovery, occasionally affected by bouts of rejection and viral attacks. However, the transplant team and medical staff know very well how to treat all these bumps and continued to help me get stronger and fitter.
Looking forward For those of you wondering if that’s me ‘fixed’ now, I wish I could say yes, but unfortunately there are many potential dangers and pitfalls ahead. The average life expectancy of a transplanted lung is seven years and only 50% of transplant patients survive past three years. There is the constant spectre of rejection and infection, as well as a chance of kidney failure due to the drug regime I have to adhere to daily. However, this is small beer compared to the fact that I am breathing without oxygen and starting to get my life together again. To me this is a miracle, with every day a joy – and I aim to prove the statistics wrong. Slowly I started to rebuild my strength and confidence, and managed to do my first camera operating in September 2013. From then on, I have slowly done more and more jobs, to the point where I now feel I’m back to being part of the team again, something I could not have believed would be possible, even after the transplant. In fact, I will be taking part in the 2015 British Transplant Games, which are being held at the end of July in Newcastle/Gateshead. This is an amazing event, involving nearly 800 transplant
recipients and proving there really is such a thing as a second chance. It’s so hard to put in this article my admiration for all those who have been involved in getting me to where I am today: the marvellous medical staff, my wife and family, all my friends and work colleagues, but most of all my donor and their family. They went through the worst of times on 3 April 2013 but were strong enough to take that decision to donate the organ that led to my being here to tell my story. I don’t know who they are, and I don’t need to know, but I thank them every day for giving me my life back. On a parting note, if you decide to sign the Organ Donor Register, or are already registered, please do let your family know of your wishes, as this is the biggest stumbling block at the time of death. Nearly 40% of families refuse the wishes of the deceased and this could be avoided by just taking a minute to tell them. And please enjoy life and those you love, as you really don’t know what’s round the corner.
Fact File From the age of 13, I had always wanted to be a TV cameraman after I saw an advert in a newspaper – I still have the actual ad. While at school I worked hard towards this goal by helping backstage for the National Youth Music Theatre and had a part-time night job at Fountain Television, based in New Malden. At the age of 18 I could at last apply for a job at the BBC and was offered a place at BBC Scotland. From 1987 to 1994 I had a fantastic time working in the studio, on outside broadcasts and assisting on location shoots. It involved a wide cross-section of programmes from classical music to mainstream drama, sport and seven series of Rab C Nesbitt. I took voluntary redundancy in 1994 and attempted to make my way in the freelance world. I had also just learnt to load 16mm, which enabled me to do several comedy dramas, as well as a Live Aid documentary in Ethiopia with Sir Bob Geldof (see right), before being clapper-loader on Regeneration, a 35mm film. During this time I was also keeping my hand in doing multi-camera jobs, which eventually led me to the world of football and Sky Sports. This started out as an odd job in Scotland but soon progressed to covering matches in
England as well and, before I knew it, I was a regular on the circuit. After many years at many grounds, I was asked to take over as crew supervisor, with all the responsibility that entails. It has been a tough time looking after the crews but some major highlights have included being in charge of two Champions League Finals (Moscow 2008 and Wembley 2011) and, thanks to others, I was invited to work at three World Cups – Japan, Germany and South Africa. Related charity links: www.organdonation.nhs.uk www.britishtransplantgames.co.uk www.ctbf.co.uk
Spring 2015 ZERB
How might we distribute UHD to the home? High Definition, Ultra High Definition, 4K (UHD) – the buzzwords are all around us right now, but how are these pictures going to reach the viewer? Bill Garrett explains the problems for broadcasters looking to capitalise on the potential image quality now available.
4K/UHD broadcast
How will 4K/UHD images arrive in our homes? This is a question many people in our industry are asking, often seeking a steer to justify 4K production. In reality it is not an easy question to answer. In my view it is unlikely that you will ever receive UHD through your conventional TV aerial and, to gauge why, we need to take a trip back to the classroom to understand modern broadcast distribution. To help break down the problem, I split the technical side of the television industry into two parts: Production, which I refer to as ‘upstream’, and Distribution, as ‘downstream’. Also, for ease, I will use 4K and UHD interchangeably, although I accept that UHD is actually a derivative of the 4K cinema standard. 4K production has some well-defined routes upstream; we might discuss certain camera and post-production toolsets, but these are mainly differing brands of product within the same process. Downstream technologies, however, are changing rapidly, and this uncertainty makes investment a much trickier issue for a broadcaster commissioning such content. We are seeing some 4K production intended for TV, but I would say this is little more than testing the waters, and the audience is only getting limited access.
Let us take a brief look at how we receive TV. Globally the most popular mechanism is terrestrial reception through a rooftop aerial, closely followed by satellite and cable. Clearly, internet distribution is growing at a phenomenal rate, but I want to explain conventional broadcast – ‘linear TV’ – first, as it helps demonstrate the challenge. At a broadcaster, most of the workflows that a UHD master recording would go through would need updating and, particularly where subtitles and closed captions are added, this would obviously require an overhaul of the broadcaster’s technology. I don’t want to oversimplify this issue but essentially it is one that can be resolved with investment. It is the battle with the principles of physics further downstream that presents the real challenges – ones that can’t be fixed by throwing money at the problem. So here’s the science bit. Analogue TV was parcelled up into pieces of spectrum approximately 8MHz wide (slightly less in some territories). This allowed for a single channel containing a colour picture, sound and teletext. When the first digital services started in the late 1990s, they operated in parallel to analogue and therefore in the same given channel allocations of 8MHz. Engineers developing digital terrestrial systems had to make them complementary to existing analogue TV. Globally there are five different digital TV (DTV) systems; however, the most popular in terms of number of countries is DVB-T/T2 (technically two systems). For this reason I will unpack DVB-T/T2’s workings to demonstrate the challenge for 4K broadcast on terrestrial.
I know UHD is more data but technology will get better, won’t it?
It is obvious that UHD requires more data to be sent; however, it is the sheer scale that potentially blocks terrestrial distribution. DVB defines how the signal is sent through the air from transmitter to receiver. Just as a wireless microphone might use FM (frequency modulation) to send audio across a gap, DVB uses a technique known as COFDM (coded orthogonal frequency division multiplexing) to send a stream of digital bits. COFDM is an efficient way of utilising the radio spectrum, and the typically implemented version can be thought of as approximately 7000 narrow concurrent digital streams sitting side by side within the allocated 8MHz. Remembering that DVB-T/T2 has to occupy no more than the 8MHz bandwidth, this allocation will limit the amount of data that can be passed. There is some very clever but rather complicated mathematics that predicts the theoretical maximum data throughput in a DVB-T/T2 channel, but it is not necessary to go into that level of detail here. In short, DVB-T allows for 32Mb/s and DVB-T2, the later standard, 50Mb/s. Practically, for reliable services in countries with varied terrain, or where transmitters are placed close together on the same frequency, the theoretical maximums are never achieved and the actual bitrates are often much lower. In the UK, DVB-T is typically 24Mb/s and DVB-T2 allows 40Mb/s. By now some of you may be grasping the scale of the problem! The DVB-T/T2 standards are well-engineered systems and as a result have already pushed the limits of physics very hard. Assuming the maximum that one DVB-T2 multiplex can
give is around 40Mb/s, to be shared between a minimum of seven or eight TV services to make it commercially viable, the available bitrate for a 4K movie channel could be as little as 6–8Mb/s. So how much bitrate might UHD need? Picture compression techniques are always improving. High efficiency video coding (HEVC), sometimes referred to as H.265, is the next big thing, and sample UHD material compressed by it is now available on the internet at various bitrates (see www.elecard.com/en/download/videos.html). HEVC claims a 40% improvement over MPEG-4 AVC and gives stunning performance for HD material at just a few Mb/s. It is an open question what the subscriber might accept as a decent 4K motion picture, but my money is on at least 15Mb/s. 4K has four times as many pixels as HD, making the 40% advantage of HEVC less than half a fix for the problem. There is also another interesting attribute of UHD. Early perception tests seem to indicate that viewers need almost double the frame rate to make viewing comfortable. This can be done with temporal upscaling, a mechanism in the TV set which effectively makes up the intermediate frames, but true quality may still need HEVC to compress between 60–100 frames per second (fps). If UHD is all about quality, and the driver of a decision to spend circa £2000 on a UHDTV, anything less than 15Mb/s would in my view shortchange the viewer, and I am not alone in this opinion. Here is the irreconcilable problem for terrestrial 4K: effectively, we may need more than double the bitrate that would be made available for such a service. Of course, there might always be a newer modulation standard – a ‘DVB-T3’, so to speak – but it is unlikely to wring much more data throughput out of a typical 8MHz chunk of spectrum; physics won’t allow it. To compound the issue, terrestrial broadcasters around the world are already on notice that spectrum is a limited resource and a valuable revenue opportunity for governments.
Accordingly, the likelihood that more spectrum will be sold off for mobile data/telephone services means no more space for terrestrial TV. I might be sticking my neck out a bit here but I don’t believe we will see significant 4K terrestrial services, if any, within the next 10 years. The steering group behind DVB in all its guises published a standard in July 2014, referred to as DVB-UHDTV Phase 1, which set out some basics for UHD television, including the resolution standard (3840 x 2160), and mandated the use of HEVC. As steps are taken towards making UHD a reality, this standard will most likely be adopted first on cable and satellite platforms. Most global satellite services use variants of DVB suffixed with S and S2. These are similar to the terrestrial system but are designed specifically to deal with the propagation of the signal across some 36,000km of space. So why are satellite, cable and internet delivery the most promising routes for 4K? The advantage for cable and satellite operators is that they tend to have more spectrum to play with. This is still limited by nature, but these distribution systems are either allocated more spectrum in the first place or, in the case of cable, have total control of the link between viewer and broadcaster. An indication of this was the introduction of 3D: not hugely successful commercially, yet satellite and cable operators were able to introduce 3D channels even with limited interest. 3D should not be seen as comparable to UHD in scale of technical challenge, as 3D bitrates are only marginally larger than those of native HD distribution. If you are a satellite operator you may have at your disposal 1 or 2GHz of bandwidth, which you can use very efficiently. Compare that to terrestrial, which might provide just 300MHz of spectrum, the utilisation of which has to leave huge gaps to prevent geographical co-channel interference.
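The terrestrial arithmetic above can be sanity-checked in a few lines of Python. This is a rough sketch using only the figures quoted in the article (40Mb/s per UK DVB-T2 multiplex, at least seven or eight services sharing it, a claimed 40% HEVC saving and four times the pixels); the function names are my own.

```python
# Back-of-envelope check of the terrestrial capacity argument above.

def per_service_bitrate(mux_mbps, services):
    """Average bitrate left for one service in a shared multiplex."""
    return mux_mbps / services

def uhd_bitrate_estimate(hd_mbps, pixel_factor=4.0, hevc_saving=0.40):
    """Naive UHD estimate: scale an HD bitrate by pixel count, then
    apply the claimed HEVC saving over the HD-era codec."""
    return hd_mbps * pixel_factor * (1.0 - hevc_saving)

share = per_service_bitrate(40.0, 8)   # even eight-way split of a T2 mux
need = uhd_bitrate_estimate(8.0)       # scaling up a good 8 Mb/s HD service
print(f"available ~{share:.1f} Mb/s, naive UHD need ~{need:.1f} Mb/s")
```

Even granting the 4K channel the larger 6–8Mb/s slice the article allows, the naive estimate lands well above the 15Mb/s quality threshold — which is precisely the gap described in the text.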
When can I get UHD?
In the USA, satellite provider DirecTV announced in November 2014 that it would launch consumer UHD services this year. It will be interesting to see what changes this will drive in content production. I would expect that we will see similar announcements in 2015 by other satellite and cable operators around the world. Clearly the biggest growth in media distribution has been the internet, and we will eventually see the majority of media consumed in this way. Curiously, the challenge for 4K internet distribution is similar to that of terrestrial TV. Netflix, one of the fastest-growing pay-for-content internet providers, subscribes to the idea that 15Mb/s is the quality threshold for 4K. ADSL internet speeds are improving but are starting to hit their theoretical maximum of 24Mb/s. Practically, the distribution of telephone exchanges and the subscriber homes around them means that in many countries ADSL2+ is achieving consumer averages of 12–14Mb/s. In metropolitan areas this may be higher, but then these areas are also served by higher-speed fibre-optic broadband products. When you will be able to receive an internet-delivered service will depend on where you live. Assuming the availability of reasonable ADSL2+ in your area, with speeds above 16–17Mb/s, then perhaps Netflix and other providers may be able to offer subscription services within the next 12 months on some limited content.
Fact File Dr Bill Garrett is a professional engineer and journalist. He has worked for national broadcasters in the UK and Australia, where he has been a technology leader overseeing broadcast operations and advising on broadcaster strategy. See more at: www.elecard.com/en/download/videos.html
Bill Vinten GTC University Awards
The Bill Vinten GTC University Awards 2015
Entries are invited for this year’s Bill Vinten GTC University Awards. The Awards are open to final-year students completing courses this summer. To enter, a portfolio of three films, shot by three students, should be submitted from each university or educational facility. The films should be from different genres and should each have a specific camera credit. Last year’s winning university was the University for the Creative Arts (UCA), who have kindly agreed to host this year’s presentation at their Farnham base in October.
In 2014 the individual prize was won by Vince Knight from Bournemouth University, with runners-up Daniel O’Flaherty and Thomas Read, both from UCA. Each year the winners of the Bill Vinten GTC University Award are offered work experience with GTC colleagues. For Vince and Dan this was on the set of the daytime drama Doctors, and below they share some of their experiences from this great opportunity.
Vince Knight
I’ve previously worked on sets for feature films and commercials, but this was my first time on the set of a television drama, so it was a wonderful insight into the way in which this kind of show works. Doctors uses a mixture of purpose-built sets and location filming, and I was able to experience both. The first thing that struck me was the rate at which everyone works; everybody knew exactly what they needed to be doing and the set was ready to go in minutes. Changing to the next location was like clockwork; I could tell these guys had done this before!
L to r: Dan, Vince, Chris Pinnock (UCA lecturer) and Tom Read
During my week I got to know people in all departments, observing different aspects of television production. Although I work in the camera department, I take an interest in all areas of production and feel it’s important to learn from them all. It was great to see everybody getting along so well, and everyone I met was so friendly and approachable. While my time there was short, I met some wonderful people and learned a lot that will help me in my career in film and television. It was definitely a workplace I would be happy being a part of in the future. When my film The Domestic Life of Mollusks was shortlisted for the Bill Vinten GTC University Award, it was suggested that I enter Golden Eye, the International Film Festival of Cameramen in Georgia, where I was delighted to also win the prize for best student work. I’ve also started my own film festival for the automotive film/TV/commercial industry. It’s the London Motor Film Festival for anyone who films with cars!
Dan O’Flaherty
The week I spent in Birmingham has greatly helped my knowledge of working in the professional television industry. I learnt about the expectations of the crew on a drama that films three episodes in one week. This is achieved by three crews shooting at the same time, each focusing on a different episode. I was able to get a better understanding of the whole work process by jumping between two crews, which enabled me to meet and learn from more crew members. The schedule is very tight, so the margin for error is extremely slim. I was told that dramas would normally aim to shoot between four and ten pages a day, but Doctors expects double that – and more. I quickly learnt how problems were solved swiftly and efficiently. It was a great way to learn a professional workflow at a fast pace. After being involved on a show like this I feel more confident about being able to keep up on other projects, whatever they may be.
The camera crew was very welcoming and always found a way to offer their experience and advice, even within their busy schedule. As soon as they heard I had experience in the AC role working on short films and corporate projects, I was entrusted to help the assistant cameraman. By being hands-on with the team each day I was able to adjust to working in the way expected in a television environment, which was both challenging and enthralling. One of the most beneficial things I learnt was just how meticulous you must be in prepping your kit, to avoid wasting time on set and to minimise surprises or problems with the equipment. As was stressed to me by the assistant cameramen, on a show like this where the turnaround time is especially fast, you cannot afford to be running late or have people waiting for you. A great way of emphasising this was: “If you’re on time, you’re late. If you’re late, you’re fired!” I also learnt the value of listening in on conversations between the DoP and camera operators to get an idea of the shots they are trying to accomplish. This way you can plan what equipment you’ll need to prepare and where to run your camera cables before being asked, helping to keep a step ahead of the game. Since leaving university I have been freelancing in camera and lighting on some great short films with Irresistible Films and Secret Cinema. I have also been fortunate enough to work with some very talented people at the NFTS and make some great contacts through these projects. It would be a great pleasure to work within the Doctors environment again as I
feel there is so much more to learn. I must thank the GTC for giving me the opportunity and the Doctors camera team for the experience and knowledge they have passed on to me.
Fact File Vince Knight Mobile: 07928 266321 Email: firstname.lastname@example.org www.vinceknight.com www.londonmotorfilmfestival.com
Dan O’Flaherty Mobile: 07748 687433 Email: email@example.com
Details and application forms for the Bill Vinten GTC University Awards can be found on the GTC website at: www.gtc.org.uk/bill-vinten-gtc-awards-2014.aspx. All last year’s shortlisted films, including Vince and Dan’s, can be viewed on the website at: www.gtc.org.uk/the-gtc-awards/the-bill-vinten-gtc-university-award/view-the-bill-vinten-gtc-award-shortlisted-films.aspx
Spatial movie production
Spatial Movie Production – A glimpse into the future Readers of Zerb Issue 79 may recall reading about R&D in ‘Light-field Capture and Processing’. Far from a futuristic pipe dream applicable only to special effects, the systems developed during the ‘Spatial AV Project’ were presented at IBC 2014, with equipment already having been trialled at events including the World Cup Final. Now, Project Manager of Spatial AV, Dr Siegfried Fößel of Fraunhofer Alliance Digital Cinema, brings us up to date.
Digital media production technology makes many things possible that were unthinkable just a few years ago. 4K image definition and spatial sound, even for portable devices, are available to users and promise enhanced enjoyment of media. In the meantime, 3D has become available for both movie theatres and home entertainment. Now the new challenges for future technologies are to increase picture quality even further, by offering high resolution, high dynamic range (HDR), a wider colour gamut and higher frame rates. Researchers from the Fraunhofer Alliance Digital Cinema have taken their investigations a step further and are looking into ways in which more creativity can be brought to three-dimensional movie production, and how new technologies like light-field will extend the possibilities for film-makers when creating new kinds of content. In the 33-month Spatial AV project they developed multi-sensory recording and production systems for video and audio, to pave the way for immersive media experiences for audiences both in theatres and in front of TV or portable screens. They developed and presented three scenarios that enable enhanced workflows and media experiences from today and into the near future. Part of the project involves hardware/software systems that allow stereoscopic cameras to be automatically calibrated. With the OmniCam 360, cameramen and viewers will be able to freely select
Camera array for light-field acquisition
the camera section from a panoramic image, turn full circle, or enjoy a full panoramic view while the camera system is standing in the middle of, for example, a concert hall. With light-field or multi-camera systems, it will be possible to work with recorded scenes as well as computer-generated scenes, or to add fascinating new effects. The days when scenes were shot with just one camera from a single position are almost gone. Nowadays, special effects are in demand, and even more so in 3D. Fraunhofer scientists have presented systems, some of which have already been used on various productions and others that are on the cusp of testing and evaluation by camera operators, to optimise the technology and tools available on set.
Top: Multi-camera system for light-field recording. Bottom: Plug-in for operating and working with multi-camera/light-field data for Avid
3D productions made easy
3D content has the advantage of being able to pull viewers in from their seats and whisk them away into an alternative fantasy world. But 3D drives up production costs, which are already high because of the necessary special effects. What makes 3D production so complex and therefore costly? Instead of one, the cameraman must operate and focus two cameras. This is because the left and right eyes have slightly different angles of view, and the two cameras imitate this effect. If that wasn’t stressful enough, the angle of inclination and the distance between the cameras must be constantly adjusted. In the future, camera operators will no longer have to worry about things like this: it will be enough to focus one camera, and everything else will follow automatically. This is made possible by software developed by the researchers at Fraunhofer. The second camera adopts the focus setting of the first one, and appropriate algorithms ensure that the camera alignments adjust to one another in an optimum manner. There is already a prototype of the camera system in which the software automatically carries out the recalibration once per second.
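As a toy illustration of the ‘focus one camera and everything else follows’ idea, the behaviour described above can be modelled in a few lines. All class and method names here are invented for the sketch; the actual Fraunhofer software is not publicly documented.

```python
# Toy model of an auto-calibrating stereo rig: the operator focuses the
# master camera, the slave mirrors it immediately, and the rig re-aligns
# itself once per second, as the prototype described above does.

class StereoRig:
    def __init__(self, recalibrate_every=1.0):
        self.master_focus_m = None        # focus distance in metres
        self.slave_focus_m = None         # mirrored automatically
        self.recalibrate_every = recalibrate_every
        self._elapsed = 0.0
        self.calibrations = 0             # count of automatic re-alignments

    def set_focus(self, metres):
        """Operator focuses the master; the slave follows at once."""
        self.master_focus_m = metres
        self.slave_focus_m = metres

    def tick(self, dt):
        """Advance time; re-calibrate whenever a full second has passed."""
        self._elapsed += dt
        if self._elapsed >= self.recalibrate_every:
            self._elapsed = 0.0
            self.calibrations += 1        # stand-in for geometric alignment

rig = StereoRig()
rig.set_focus(3.5)                        # one focus pull by the operator
for _ in range(8):                        # eight quarter-second steps = 2 s
    rig.tick(0.25)
```

After two simulated seconds the rig has re-aligned itself twice, and the slave camera still carries the operator’s single focus setting.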
Shooting with multi-camera systems
The camera revolves around the main character, who seems to be frozen in the middle of a jump – time seems to stand still for a moment and the camera shows the jumping figure from all sides. Many will recognise this description of the effect originally known as ‘bullet time’. What was possible for an instant of time in a scene by capturing images with an array of still image cameras is now possible for movie production, and not only for monoscopic but for stereoscopic movies as well. Shooting with multi-camera systems is a new technique for professional movie-making. For more complex special effects, two cameras are no longer enough. The Fraunhofer researchers have set up a system comprising 16 cameras, which can be expanded even further as required. The trick is in the software, which uses the 16 camera images to generate depth maps specifying how far the object represented in each pixel is from the viewer. This depth map is used to generate any number of intermediate views in between the 16 camera views – meaning that a virtual camera is created, similar to movies that are entirely computer-generated. This gives a great deal of freedom to the cameraman and producer, allowing, for example, moving shots without having to move the real cameras at all. This new technology is based on multi-camera systems that capture parts of the light-field. This means that one recording includes different views of the same object,
which can then be used in post-production to recover any creative opportunities that have been missed on set for whatever reason: an unforgettable sunrise, a complicated stunt or an unrepeatable emotion in an actor’s performance. Any number of things might happen to prevent the perfect image, event or performance being captured at the time. Working with multi-camera/light-field systems for special effects, or even for the main camera, is a new and promising trend for film-making. The editing flexibility will make complex and expensive retakes or additional shots a thing of the past. Intelligent computation of different views allows for refocusing, changes in perspective, 3D effects and changes in depth, as well as virtual camera movements in all spatial directions. As many additional views as required can be generated from the existing views. Experts at Fraunhofer IIS are currently working on putting this light-field technology and processing into practice in a useful way: auto-calibration methods can correct geometric distortions in the cameras without test charts or calibration; depth map algorithms compute and correlate depths from different camera views; rendering algorithms generate high-quality views for any screen size for 2D or 3D views. Part of this intelligence is integrated as a first step in a plugin for Avid Media Composer so the changes or modifications of parameters can be done through a familiar user interface. The adjustment of the effects or camera movement is carried out through a menu with timeline and slider functionality. The user gets a preview of the scene with the adjusted effect rendered continuously on the timeline.
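The depth-map idea can be sketched in a much-simplified form: shift each pixel horizontally by a fraction of its disparity to synthesise a viewpoint part-way between two cameras. Real light-field renderers also fill occlusions and blend several source views; this sketch, with invented names, shows only the core warp on a single scanline.

```python
# Minimal depth-based view synthesis on one scanline: each pixel is
# shifted by a fraction of its disparity to fake an intermediate camera.

def intermediate_view(row, disparity, alpha):
    """Forward-warp one scanline. alpha=0 keeps the original view;
    alpha=1 reproduces the neighbouring camera's viewpoint."""
    out = [0.0] * len(row)
    for x, value in enumerate(row):
        new_x = round(x + alpha * disparity[x])
        if 0 <= new_x < len(row):        # drop pixels warped off-frame
            out[new_x] = value
    return out

# A bright pixel with a disparity of four columns moves two columns
# in the half-way view.
scanline = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0]
halfway = intermediate_view(scanline, [4.0] * 8, 0.5)
```

Varying `alpha` continuously is what gives the ‘virtual camera’ its smooth travel between the sixteen real camera positions.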
These first tools make it possible to work with multi-camera systems in a known and comfortable way, so that the movie producer can explore the creative possibilities of this new way of shooting. Fraunhofer is carrying out first test shoots together with partners to learn more about the required workflow for acquisition and post-production. An additional effect that can be carried out with multi-camera systems is to increase the HDR of recorded
scenes. The array consists of several individual cameras with different neutral density filters in front of the lenses. By using a high-performance algorithm, the engineers can knit the various individual images together into one single HDR image. In recording situations where a conventional camera would be unable to cope with the full dynamic range of the scene, this is a clever solution for providing optimal conditions to achieve good imaging. First tests already indicate that a multi-camera approach for HDR with moving pictures is a promising method that will be explored and optimised in the future.
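The ND-filter merge described above can be sketched as follows. This is a simplified single-scanline model with invented names and illustrative numbers; a real pipeline would also align the images geometrically before merging.

```python
# Sketch of multi-camera HDR: each camera sees the scene through a
# different, known ND factor. Dividing out the attenuation and averaging
# only the unclipped samples recovers highlights that the unfiltered
# camera blows out.

CLIP = 1.0   # sensor full-scale in this toy model

def merge_hdr(frames, nd_factors):
    """frames[i] holds pixel values seen through nd_factors[i]
    (e.g. 0.25 = a two-stop ND). Returns linear radiance estimates."""
    merged = []
    for px in range(len(frames[0])):
        total, count = 0.0, 0
        for frame, nd in zip(frames, nd_factors):
            if frame[px] < CLIP:          # skip blown-out samples
                total += frame[px] / nd   # undo the known attenuation
                count += 1
        merged.append(total / count if count else CLIP)
    return merged

# A radiance of 3.0 clips the clear camera (it reads 1.0), but the
# two-stop ND camera reads 0.75, so the merge recovers the true value.
scene = [0.2, 3.0]
clear = [min(v * 1.0, CLIP) for v in scene]
nd_two_stop = [min(v * 0.25, CLIP) for v in scene]
hdr = merge_hdr([clear, nd_two_stop], [1.0, 0.25])
```

The same principle extends to any number of cameras in the array: the more ND steps available, the wider the dynamic range that can be reassembled.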
360° panoramic view for broadcast and TV
OmniCam360 (seen left of frame) will offer viewers of concerts and football matches selectable sections of the image
New features, however, are not only coming to movie theatres. How we experience movies and television on our own sofas is also going to be transformed. Soccer and concert fans, for example, will be able to freely select the camera perspective, turn full circle, and enjoy a panoramic view of the field and stands during live broadcasts. The OmniCam360 makes this possible: when this camera is placed pitchside at the centre line, it is able to capture a 360° panoramic view of the whole stadium. The camera from Fraunhofer HHI weighs just 15kg and is no larger than a normal television camera. This enables it to be carried by one person and fixed on a tripod. The OmniCam360 comprises a total of 10 cameras, but there is no need for complex calibration. All you need to remember when operating the OmniCam is: unpack the camera, plug it in and start filming. The camera has already demonstrated its stability in a range of test productions, including a concert with the Vienna Philharmonic and the 2014 FIFA World Cup production of the Final. The camera is now licensed and is being marketed. If, however, you would like to produce film material for a dome-shaped screen, additional cameras are needed to point skywards – otherwise a panoramic view will be projected around the edge but there will be a gaping black hole on the ceiling. That’s why researchers at Fraunhofer FOKUS have
developed a special process to merge image streams from individual cameras into a seamless picture in real time. This means that even dome-shaped movie theatres will be able to show live broadcasts in the future. The Spatial AV Project and its solutions were monitored and evaluated by a consortium of industrial partners, film and broadcast experts, as well as cameramen, to ensure that the solutions can be applied efficiently in media and production workflows.
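At its simplest, ‘freely selecting a camera section’ reduces to mapping a requested pan angle onto one of the ten heads in the ring. The real OmniCam pipeline stitches and blends overlapping views in real time; this sketch shows only the lookup.

```python
# Toy viewport lookup for a ten-camera 360-degree ring: each head covers
# a 36-degree segment, so a requested pan angle maps to one source camera.

NUM_CAMERAS = 10
SEGMENT_DEG = 360.0 / NUM_CAMERAS          # 36 degrees per head

def camera_for_angle(pan_degrees):
    """Index (0-9) of the head whose segment contains the pan angle;
    the modulo wraps angles outside 0-360, including negative pans."""
    return int((pan_degrees % 360.0) // SEGMENT_DEG)

for pan in (0, 90, 359, -10):
    print(pan, "->", camera_for_angle(pan))
```

A viewer panning smoothly through the stitched panorama is, in effect, sliding this index (and a sub-segment crop) around the ring.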
Summary
The goal of the Spatial AV project was to create more opportunities for creativity – for both 2D and 3D productions, covering both pictures and sound. This required a team effort by engineers and media experts from the Fraunhofer Institute for Integrated Circuits (IIS), the Fraunhofer Institute for Digital Media Technology (IDMT), the Fraunhofer Institute for Telecommunications, Heinrich Hertz Institute (HHI) and the Fraunhofer Institute for Open Communication Systems (FOKUS). It was an eye-opening glimpse into the future.
Fact File Fraunhofer Project Spatial AV Goal: to develop an intelligent, modular, multi-sensory recording and production system for immersive audiovisual media. Results and systems presented at IBC 2014, 14 September. Project duration: 33 months “We want to create new technical opportunities for creativity in cinematography through Project Spatial AV. The camera operator should be allowed to return to intensely concentrating on production of the story and be relieved of numerous technical adjustments and details that have been flooding the set since the inception of 3D.” Dr Siegfried Fößel, Project Manager for Spatial AV
NEO, THE NEXT GENERATION OF ON CAMERA / ENG LED LIGHTING
POWER ACCURACY CONTROL Key Features: BI COLOUR 6300-3150K - ACCURATE COLOUR TEMPERATURE DISPLAY
FADE DESIGNER AND ‘FSTOP’ PRODUCTION TOOL MODES
GORGEOUS SOFT LIGHT OUTPUT - 50º BEAM ANGLE
BEST IN CLASS CRI>91, SKIN TONE>98, TLCI 85
DELIVERS A POWERFUL OUTPUT OF UP TO 1077 LUX AT 3FT
POWER VIA 6 X AA (3 HOURS), AC OR DTAP CABLE
OUTSTANDING INVENTION, DEVELOPMENT AND INNOVATION CONTRIBUTING TO THE ADVANCEMENT OF THE INDUSTRY
Shipping Late March 2015
NOW AVAILABLE TO PRE-ORDER FOR JUST £29.99!! BVE SHOW 2015 EXCEL LONDON STAND F02
WWW.ROTOLIGHT.COM MADE AT PINEWOOD STUDIOS, UK
CABSAT SHOW 2015 DUBAI WORLD TRADE CENTRE HALL 3, STAND D3-32
Resolution Independence The power to work in HD and 4K from camera to post. AJA futureproofs your workflow to grow, as you do. Work at HD resolution and switch instantly to 4K at any time, using the same hardware. CION™
Science of the Beautiful
CION is the new 4K/UHD and 2K/HD production camera from AJA. Unite production and post by shooting directly to edit-ready Apple ProRes 4444 at up to 4K 30fps, ProRes 422 at up to 4K 60fps, or output AJA Raw at up to 4K 120fps.
Professional 4K and HD I/O
3G-SDI to HDMI Mini-Converter
Io 4K offers a full set of professional video and audio connectivity with support for 4K/UHD devices and High Frame Rate workflows up to 50p/60p, all powered by Thunderbolt™ 2.
Monitor your professional 4K workflow on affordable UHD displays. Supporting High Frame Rates up to 60fps, Hi5-4K is a single, portable device that converts 4K-SDI to HDMI for low-cost, full resolution 4K monitoring on set or in the studio.