GIANT SCREENS The big picture explained
MORE GRAND TOUR
A big difference from small cameras
WILD CONFESSIONS Tips from a wildlife veteran
THE FUTURE OF VIDEO PRODUCTION TODAY
PLANET EARTH REDUX
This time we get up close
Separating the facts from fiction
How Fantastic Beasts were captured
TECH REBORN AT LAST AMBISONICS FINDS A HOME, SEE PAGE 58
NEWS GRADE STORY
Colourist Greg Fisher gives us an exclusive insight into how he delivered the final grade for Mat Whitecross’s new music documentary, Oasis: Supersonic WORDS JULIAN MITCHELL
If you want to experience the ultimate rock ’n’ roll rags-to-riches story, look no further than the new feature documentary Oasis: Supersonic, an in-depth look at the life and music of Mancunian Brit-pop brothers, Noel and Liam Gallagher. Told in their own words, Oasis: Supersonic intimately charts the band’s rapid journey from the council estates of Manchester to global stardom, ignoring initial industry indifference to become the cultural phenomenon and 90s icon that we know today. As Noel Gallagher described it: “We were grafters who came from nothing and took it all.” The film features extensive unseen archive footage, rare clips, recordings and present-day anecdotes; seamlessly brought together by Greg Fisher, one of the world’s most sought-after colourists. He begins, “It was unusual and enjoyable to have to take different approaches in the grade with the varied material, both to get the best out of every individual shot and ensure they all came together cohesively as a whole.” Using Blackmagic’s DaVinci Resolve, Greg aimed to give the film a vintage look and feel, one that he believed would “be sympathetic to the band’s aesthetic”. It’s a little warm overall and the harshest clipping of the video has been somewhat
smoothed, but has hopefully not lost the vibrancy it sometimes had. With most of the footage being older archive material, Greg had to approach the grade partly as though it were a restoration. “By the time it got to me, the footage was all DPXs, but most of the material was originally some flavour of analogue video,” he continues, explaining that the quality of the footage varied greatly, from multiple-generation VHS filmed by a roadie in the dark to professionally shot Betacam SP. “There were also a number of shots from various film formats, such as Super 8, which had a distinct feel from the video footage.” To ensure that the mix of formats would not jar the audience’s experience when edited together in the final feature, Greg used Resolve’s noise reduction and sharpening tools to stretch images more than would otherwise have been possible, enabling each shot to conform as well as possible to the documentary’s overall quality.
WITH MOST OF THE FOOTAGE BEING ARCHIVE MATERIAL, GREG HAD TO APPROACH THE GRADE PARTLY AS THOUGH IT WAS A RESTORATION
ABOVE Using DaVinci Resolve, colourist Greg Fisher hoped to give the varied source materials used for the documentary Oasis: Supersonic a cohesive look.
“The trick was to find a line which pulled it all together without losing that inherent richness and variety,” Fisher reveals. “This involved leaving the noise in some shots, heavily de-noising and stretching others, and even dirtying a few up.” As well as the older material, Greg had to incorporate text and graphic elements into his grade, plus animated sequences illustrating key events in the band’s history. “There were a lot of titles to take care of, so being able to composite in Resolve and keep them on separate video layers within the project timeline was very helpful. “The only new sections are the excellent animations. They have a distinct feel from both the film and video archive, but do so in a complementary way. It helps the overall feature feel more textured and rich having varied sources. “Every job is a learning process to some degree,” he concludes. “Restoration and promo work are normally seen as quite different, but both informed the approach to this film. I hope the overall result is a documentary full of incredible reminders for fans of those times, and a way for anyone to get to know the band a little better.”
DEFINITIONMAGAZINE.COM JANUARY 2017
SHOOT STORY PLANET EARTH II
Planet Earth II, this time it’s personal. Or this time it’s not so much about the grandeur from above but about the experience on the ground WORDS JULIAN MITCHELL PICTURES BBC NATURAL HISTORY
Ten years ago, the original Planet Earth was all about megafauna. It was really about the planet as god sees it. Producer Chadden Hunter sees Planet Earth II as a much more immersive experience, however: “The proximity of the camera to the animals is a huge difference. The tag line for the original Planet Earth was ‘Planet Earth as you’ve never seen it before’. I like to think for Planet Earth II the tag line would be ‘Planet Earth as you’ve never experienced it before’.” As far as personnel, and consequently shooting schedules, were concerned, this time they also planned in a different way. Normally a producer might make two programmes in the series, with an assistant producer and probably a researcher on each programme, so there might be enough people to run multiple trips in parallel. For Planet Earth II they decided to have smaller teams, but give one programme to each producer with a really long time to do it. It meant that the producers went on location for a given programme more often than they would have done in the past, but the shoots had to run one after another rather than in parallel. The shoots became so technically sophisticated that the decision was made for a team to concentrate on one film only. Chadden, producer of the Grasslands programme, while feeling the public warmth of the series’ reception, puts some of it down to BBC scheduling. “The title and the brand are obviously a massive factor, that and the Attenborough publicity machine. I think another key difference was that we moved the slot from 9pm to 8pm. Often these Attenborough Christmas series like The Hunt, Life Stories and Africa have been pushed into that 9pm slot and it just makes such a difference with families able to watch it together. I think people really are buying into the new look and appreciating it. “Planet Earth was of course the first-ever HD wildlife series, it was
I LIKE TO THINK FOR PLANET EARTH II THE TAG LINE WOULD BE ‘PLANET EARTH AS YOU’VE NEVER EXPERIENCED IT BEFORE’
NATURAL HISTORY FILMMAKING
CONFESSIONS OF A WILDLIFE SHOOTER As natural history programmes increasingly base their cinematic sequences on the narrative, what part of the moviemaking box of tricks should you be looking to use? WORDS JOHN AITCHISON PICTURES JOHN AITCHISON/BBC
Wildlife shooting has become story based – or rather, the sequences themselves have become stories. In programmes like the BBC’s The Hunt each sequence is a strong story, and usually a loose theme – maybe habitat – joins them together. The stories are rarely about generic animals; more often they are about individuals that demonstrate something specific.
From my point of view, when I go somewhere, almost the first decision is ‘which one are we going to film?’ Usually the point of going somewhere is predetermined by someone else: producer, assistant producer, researcher. In an ideal world they have an idea of what they would like: ‘Can you film this species doing that behaviour? These are the interesting aspects, we think it might be good to do it filmically.’ This is about telling stories with moving pictures, so it’s about filming, right? Well yes it is, but first we should think about what a story is. Stories are not just accounts of events – they also have to hold the attention of their audience, which has led to storytelling becoming an art. ‘Show, don’t tell’ is a cardinal rule of storytelling. Showing the story is essential when you are filming, because the telling comes later and
IMAGE To encourage the audience to engage with an animal’s story, a filmmaker can invite them to be either on its side or against it. These hyenas were part of the Cities episode of Planet Earth II.
the narration will most likely not be your department.
RESEARCH AND ADVICE
Sometimes much of the thinking about how to tell a story will already have been done for you, by a researcher and a producer who have gathered information and written a shooting script for their programme. As you haven’t started filming yet, this is bound to be an idealised view of things. Sometimes these scripts can be very detailed, especially when you’ll have some control over the filming, for instance if you are filming plants. If you don’t have much control, then the script may contain a lot of wishful thinking. If you are filming a sequence for a programme that has a biological theme, say evolution, then the sequence may need to illustrate some
BELOW RIGHT Cameraman Louis Labrom rigging a cable dolly in the main location that he and John used for filming the hyenas at night. The hyenas came into the square to eat bones left out for them by the butchers.
specific behaviour or principle, in which case a detailed script will be necessary. In contrast, if your film is about what an individual animal does, there is little point writing a detailed script in advance, because the animal will shape its own story, as well as setting practical limits on what it’s possible for you to cover. Either way, you will have to make many choices about what to film, and equally importantly, what not to. Once you’ve done all you can from home, out the door you go with your camera, into the natural world to meet your subjects, and no doubt to face quite a few trials.
CHARACTERS
A good story needs a protagonist who is able to react to their changing circumstances and to the other characters they meet, usually changing themselves along the way. There is usually one main character about whom we are invited to care. Our films are far more effective if we can distinguish that character from the others visually. I mean ‘characters’ in the loosest sense. A character in
a wildlife film could be a bear or a bristlecone pine, or even the Earth itself. These characters will do things and things will happen to them. The audience can be invited to be on their side, or against them, or neither, or both, because interesting characters develop and evolve as the story unfolds, or at least our understanding of them does. You probably know already whether you will be trying to film a male spider searching for something dangerous but desirable in the leaf litter, or an ancient pine tree that has been alive since the time of the Romans, but you will still have to choose which spider or tree to film. Animals are individuals, and so are plants for that matter. They each live in more or less beautiful settings, and are more or less visible and accessible too. Some animals have been habituated by other people, or are well known by them for other reasons, perhaps because they are being studied by scientists. Sometimes choosing one comes down to such basics as whether it is hungry, young and naïve, old and
A GOOD STORY NEEDS A PROTAGONIST WHO IS ABLE TO REACT TO THEIR CHANGING CIRCUMSTANCES AND TO THE OTHER CHARACTERS THEY MEET
SHOOT STORY FANTASTIC BEASTS AND WHERE TO FIND THEM
MAGIC IN THE BIG APPLE DoP Philippe Rousselot talks about how his shooting vision for Fantastic Beasts was based in reality in order to make the fantastical leap out from the grimy streets of New York
“I have to admit to a sense of terror when commencing this project,” begins cinematographer Philippe Rousselot in a moment of patent candour. “Whilst I have worked on a number of large-scale, large-budget movies before, I knew Fantastic Beasts would be a much bigger affair before I
had even read the script. But once I had read it, I could see it was going to be absolutely enormous, with lots of key decisions to make and many challenges along the way. Nothing exists, not even the beasts, until a team creates it. That said, I count myself as very lucky to be invited to work on it.”
RIGHT Director David Yates on set with Queenie (Alison Sudol).
Directed by David Yates and written by J K Rowling, the Heyday Films/Warner Bros’ production Fantastic Beasts and Where to Find Them is the next instalment in Rowling’s Wizarding World, and the first in a series of five features. Set in New York in the 1920s, the story follows the adventures of writer Newt Scamander (Eddie Redmayne), who smuggles magical creatures into the USA, where they manage to escape. Scamander seeks to recapture the beasts, which are at risk of being destroyed by the Magical Congress of the USA (Macusa), and gets help from Porpentina Goldstein (Katherine Waterston), a witch working for the Macusa, and Jacob Kowalski (Dan Fogler), a No-Maj with no magic. Philippe shot the $225m movie, releasing in 2D/3D widescreen, IMAX 4K Laser and large-format 70mm film, over a 100-day production period. Whilst the final movie is set amongst the streets and towering skyscrapers of 1920s New York, it was actually shot on a series of vast sets built on the stages and backlot at Warner Bros. Studios Leavesden, and at locations in
Liverpool (doubling for Manhattan). Philippe had around three months of prep before commencing principal photography. Given that the cinematographer’s key task is turning the script into moving images, Philippe – whose credits include Charlie and the Chocolate Factory (2005) for Tim Burton, Sherlock Holmes (2009) and Sherlock Holmes: A Game Of Shadows (2011), both directed by Guy Ritchie, and Shane Black’s The Nice Guys (2016) – says he took the only route possible in the circumstances: “To immerse myself very, very slowly into the project by talking with all of the key collaborators, and try to enjoy a voyage of discovery.”
ALL IMAGES © WARNER BROS
WHILST IT'S A FANTASTICAL TALE, AND REALLY QUITE DREAMY IN PLACES, IT NEEDED TO LOOK AS REAL AND STRAIGHTFORWARD AS POSSIBLE
PORT OF CALL
For any cinematographer the first port of call is always the director. “I already knew David, having spoken to him previously about other projects. His drive, knowledge, talent and fantastic kindness throughout the production proved a real blessing,” Philippe recalls. “Our initial conversations were not definitive statements about doing one thing or another. The most important thing – which David decided from
SHOOT STORY THE GRAND TOUR - THE SMALL CAMERA STORY
THE GRAND TOUR –
THE SMALL CAMERA STORY While the ARRI AMIRAs give The Grand Tour the large vistas, the minicams shoot the in-car reviews and essential guy banter. On the outside of the cars there’s more of a balancing act WORDS JULIAN MITCHELL PICTURES BEN JOINER/JOE JAMIESON
If you have seen the fantastic new The Grand Tour car show, you will realise that the in-car shots are as important as all the other types. You get the oohs and aahs of how good or bad the cars are as the presenters drive them. Then there’s the banter, very important for this show. Within the workings of the new programme these cameras need to follow well-defined parameters. They need to be 4K for a start, they need to have a locked exposure for the changing lighting, you need to know they are recording and they need to survive whatever is thrown at them throughout the series. All this responsibility has been taken on by Jon Shepley of CreoKinetics. Jon is another ex-Top Gear operator; he worked on that show for about five years with Extreme Facilities, who are still working for Top Gear. “The biggest challenge was the fact that it was being shot in 4K. In my field of minicams that really forces your hand in terms of what equipment is available. Although 4K wasn’t particularly new, there’s still little in terms of small cameras that will actually shoot it. They also wanted a more cinematic look, so cameras like the HD Toshibas that we used to use weren’t good enough. Because they’ve got tiny sensors they’ve got that infinite depth-of-field and they look quite video-y and nasty. So we ended up using the Panasonic GH4s, which have been fantastic. They’ve never let me down and have done a tough year of filming. “We bought 12 GH4s, with three needed per car when the presenters are in there, plus some spares. With the GH4 you can dial in the frame rate we ended up using, but initially you couldn’t shoot for more than 30 minutes with this camera, which wouldn’t have worked. This was because of the European tax banding for video cameras, so I had to buy all the cameras from the USA, where they weren’t time limited. Panasonic has since brought out the R version, a Pro version of the camera carrying V-Log as a standard colour profile and offering unlimited record time.
ABOVE Jon Shepley, as a test, quickly attached his very new Ronin-MX gimbal and GH4 camera to a safety car. The shots were the best of the day. For series two, they will be using the gimbal with a RED camera.
LENS WISE WE GO FOR THE SHALLOWEST DEPTH-OF-FIELD WITHIN REASON
V-Log was another reason for going that way; I think we only shot the first couple of videos in Cine-D, which was the colour profile we started with. We got V-Log before it was generally available, so thanks to Panasonic, as it gives us lots of latitude.”
SERIES TWO PLANS
Although the GH4s record internally, there are plans in place for series two that will give Jon some redundancy and fewer sleepless nights. “I’ve got these recorder boxes which we will be using in series two. We will be cabling the cameras down to Sound Devices recorders. The main reason for that is actually to save post-production time, because we can record in ProRes 422. It will be the same as the main cameras, so the ingest is then a far simpler process. “The biggest challenge in using these cameras for the show is that you’ll start the cameras off for pieces to camera, and sometimes you won’t see the car again for two or three hours, so it has to keep recording for as long as possible. The cards in them record up to five hours, but the cameras have to be reliable and maintain exposure on the presenter’s face for all that time.
That was the main reason to go for the GH4, because there’s a movable spot meter: you can adjust the size of this overlay box and drag it around the screen. If you make the box roughly the same size as the presenter’s head and drag it over their face, it will automatically maintain exposure over that area. “It’s a fantastic camera – we looked at all the options – Sony was very keen for us to use the A7S II, which wasn’t even out at the time, but it had the same problem as the other cameras. It didn’t have this feature where you could guarantee reasonable exposure over the presenter’s face.”
SMALL LENS CHOICE
With usually such small spaces to shoot in inside these hypercars, lens choice becomes a major factor. Long gone are the days when a GoPro extreme wide-angle shot will do. “Lens wise we go for the shallowest depth-of-field within reason. We use variable NDs on the lenses so we can keep them fairly wide open. The lenses which we are using for the main piece-to-camera shot are Micro Four Thirds, so when you shoot 4K it does crop in slightly tighter,
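The spot-metering behaviour Shepley describes – a draggable box over the presenter’s face whose average level drives exposure – can be sketched as a simple feedback loop. This is an illustrative sketch only, not Panasonic’s implementation; the function names, the 18% grey target and the damping factor are all assumptions:

```python
import math

def roi_mean(frame, box):
    """Average luminance inside the metering box.
    frame: 2D list of luminance values in 0..1; box: (x, y, w, h)."""
    x, y, w, h = box
    pixels = [frame[r][c] for r in range(y, y + h) for c in range(x, x + w)]
    return sum(pixels) / len(pixels)

def exposure_step(current_ev, measured, target=0.18, damping=0.5):
    """Nudge exposure (in stops) toward the level that puts the
    metered region at the target; damping avoids visible pumping."""
    return current_ev + damping * math.log2(target / measured)

# A face metered one stop too bright (0.36 vs the 0.18 target):
frame = [[0.36] * 8 for _ in range(8)]
ev = exposure_step(0.0, roi_mean(frame, (2, 2, 4, 4)))
# damping of 0.5 pulls exposure down half a stop this frame
```

Dragging the box (changing `box`) simply changes which pixels feed the loop, which is all the “maintain exposure over that area” feature needs.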
DISPLAY DISTRACTION The new technologies that are here to underpin UHDTV displays, such as HDR, HLG and WCG, are still being misrepresented. We try and separate the facts from the fiction WORDS STEVE SHAW
UHDTV, combining HDR or HLG with WCG imagery, is gaining momentum as the next enhancement to our viewing experience. However, the whole UHDTV concept, using HDR/HLG/WCG, is as yet very undefined, and even the basics can be very difficult to get to grips with.
UHDTV
UHDTV has had something of a difficult birth, with different display manufacturers effectively defining their own UHD specifications. In response, the UHD Alliance has released a definitive (for now) UHD specification, linking together all the display parameters required to be accepted as UHDTV, although the individual aspects of the UHDTV specification can be, and often are, used in isolation. There is nothing to stop a standard gamut display (Rec. 709), with standard HD or even SD resolution, working with HDR/HLG contrast, for example. Within this article we are focusing specifically on HDR, HLG and WCG, and what they mean for display calibration and image workflows as well as the end image viewing experience.
HDR AND HLG
HDR and HLG are not just about brighter displays, they’re about using the greater available display brightness to enable extended detail within the brighter highlights. As such, the gamma curve needs to be set differently for displays with different peak brightness levels and HDR standards, such as SMPTE’s ST 2084 (as used with Dolby Vision and HDR10) and the BBC’s suggested WHP 283 HLG format. It’s also worth noting that Dolby Vision specifies 12-bit imagery, while HDR10 and HLG are 10-bit based. As a result, no 10-bit HDR/UHDTV material can be considered as being true Dolby Vision.
ST 2084 HDR
ST 2084 defines the gamma for the Dolby Vision and HDR10 HDR formats. It is based on a theoretical Golden Reference display with a maximum luminance capability of 10,000 nits, with all real-world displays referenced to this theoretical display, and has the gamma curve (EOTF – Electro Optical Transfer Function) shown in diagram 1. This shows that only a small portion of the image’s dynamic range would actually use the extended brightness capability, with the majority of the image being held very low. Note, these are relative display gammas, not conversions
DIAGRAM 1: SMPTE’s ST 2084 shows that only a small portion of the image DR would use the extended brightness capability, the majority of the image is being held very low.
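The shape of diagram 1 follows directly from the ST 2084 maths. A minimal Python sketch of the published PQ EOTF (the constants below are the standard ST 2084 values) shows how much of the signal range sits at low luminance:

```python
# SMPTE ST 2084 (PQ) EOTF: non-linear signal (0..1) -> luminance in nits
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """Map a normalised PQ code value to display luminance (nits)."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

# Full-scale signal hits the 10,000-nit reference peak...
print(round(pq_eotf(1.0)))   # 10000
# ...while a 50% signal sits at roughly 92 nits, which is why the
# curve holds the majority of the image very low.
print(round(pq_eotf(0.5)))   # 92
```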
THEY’RE ABOUT USING THE GREATER AVAILABLE DISPLAY BRIGHTNESS TO ENABLE EXTENDED DETAIL WITHIN THE BRIGHTER HIGHLIGHTS
UAVS DJI INNOVATIONS
DJI REINVENTS DRONE AERIAL SHOOTING New Inspire 2, Phantom 4 Pro and Matrice 600 Pro drones from DJI have improved proximity sensors and camera tracking capabilities, but it’s the new cameras that are the real story WORDS JULIAN MITCHELL
Chinese drone and camera manufacturer DJI has announced new aircraft and cameras across its range. There is a new Phantom 4, now with a Pro moniker, a new Inspire 2 and a new Matrice 600 Pro for the higher payloads. The new products come soon after a couple of pieces of significant drone news, with GoPro recalling all its new Karma drones and Sweden banning all drones unless a surveillance licence is acquired.
INSPIRE 2
Inspire 2 comes with a new magnesium-aluminium alloy body and claims a top speed of 67mph with a 0-50mph time of only four seconds. It doesn’t say how much battery life is left after this feat. With the new dual battery system, flight
time is claimed at up to 27 minutes. A new forward-facing camera offers the pilot the flight view for operating the master controller, while the camera operator receives a separate feed from the Zenmuse camera mounted on the main gimbal. The Inspire 2 supports the brand-new Zenmuse X4S and X5S cameras and will support additional cameras in the future (whether these are DJI’s own or others isn’t clear). A new image processing system, called CineCore 2.0, is embedded into the airframe, which allows large files to be processed faster than ever before; this has been a problem in the past, as DJI insists you use its software for Raw translation. Inspire 2 can capture 5.2K Adobe CinemaDNG Raw video at up to 4.2Gbps (with the 480GB SSD) to a new, faster SSD called CINESSD, coming in 120GB and 480GB sizes. A variety of video
IMAGES With electronic support for eight lenses from wide-angles to zooms, the Zenmuse X5S boasts a 20-megapixel Micro Four Thirds sensor and is compatible with the new Inspire 2.
compression formats are supported by CineCore 2.0, including Adobe CinemaDNG, Apple ProRes 422 HQ (5.2K, 4K) and ProRes 4444 XQ (4K), H.264 and H.265. When recording 4K video in H.264 and H.265, the bit rate is up to 100Mbps. This gives you the choice of recording UHD at 30fps in ProRes 4444, or at 60fps in 4K DNG Raw. You could save lots of post time by using ProRes for capture.
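Those headline numbers translate into tight recording windows. As a rough back-of-envelope check (assuming the quoted 4.2Gbps is sustained and the full 480GB of the SSD is usable – both idealisations):

```python
# How long does a 480GB CINESSD last at the quoted 5.2K Raw data rate?
ssd_bits = 480e9 * 8        # 480 GB expressed in bits
rate_bps = 4.2e9            # 4.2 Gbps CinemaDNG Raw capture
minutes = ssd_bits / rate_bps / 60
print(f"{minutes:.1f} minutes")   # ~15.2 minutes of 5.2K Raw
```

In other words, full-rate Raw fills even the largest CINESSD in about a quarter of an hour, which is part of the appeal of the far leaner ProRes options.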
With optional high-altitude propellers, Inspire 2 can reach a maximum service ceiling of 16,404 feet above sea level, and the self-heating dual battery redundancy system maintains good performance down to -4°F/-20°C. Through an optimised mode, Inspire 2 streams video at 1080i50/720p60. The redesigned propulsion system can carry Inspire 2 through vertical camera moves, climbing at almost 20 feet a second and descending at almost 30 feet a second.
NEW FLIGHT MODES
Spotlight Pro Mode offers the ability to lock onto a subject during flight while the aircraft flies freely in another direction – and then automatically rotates the aircraft to stay on the shot if the gimbal reaches its rotational upper limit. The forward-facing camera offers TapFly, which directs Inspire 2 to any point on the screen while avoiding obstacles along the way. A feature which will annoy the Swedish drone law makers is ActiveTrack. This recognises common subjects such as people, cars and animals, sending Inspire 2 to follow behind, lead in front, circle above or track alongside while always flying forward, ensuring the forward-facing obstacle sensing system keeps the aircraft safe.
INSPIRE 2 CLAIMS A TOP SPEED OF 67MPH WITH A 0-50MPH TIME OF ONLY FOUR SECONDS
ABOVE AND BELOW The Inspire 2 with magnesium-aluminium alloy body and dual battery system. BOTTOM The Phantom 4 Pro boasts FlightAutonomy and above it the two batteries of Inspire 2.
NEW ZENMUSE CAMERAS
DJI is expanding its line of interchangeable Zenmuse cameras. The new X4S has a one-inch, 20-megapixel sensor with a claimed 11.6 stops of dynamic range and a 24mm equivalent focal length. It offers aperture control (f/2.8-11) and a mechanical shutter to lessen rolling shutter distortion. The new X5S has a Micro Four Thirds sensor with 20.8 megapixels, no mechanical shutter and a claimed 12.8 stops of dynamic range. It electronically supports eight lenses from wide angles to zooms and shoots 20fps continuous burst DNG Raw (20.8 megapixels). A handheld mount will soon be available for the X4S and X5S, bringing the cameras down to the ground for DJI’s OSMO products.
HDR BRIGHT MONITORS
There’s also an optional DJI CrystalSky high-brightness IPS monitor, available in 5.5in and 7.85in versions, with peak brightness of up to 1000 nits and 2000 nits respectively – its dedicated system promises to reduce video transmission latency. Of interest to companies like Atomos and Convergent Design, these monitors also have the ability to record to dual MicroSD card slots within the screens.
PHANTOM 4 PRO
Phantom 4 Pro’s camera now has a one-inch 20-megapixel sensor. Its mechanical shutter helps eliminate rolling shutter distortion, and it can capture 4K video up to 60fps at a maximum bit rate of 100Mbps in H.265
compression. Phantom 4 Pro builds on DJI’s obstacle avoidance system with FlightAutonomy – three visual systems that build a 3D map of obstacles in front, behind and below the aircraft, plus infrared sensing systems on both sides. FlightAutonomy can navigate and plan routes, enabling Phantom 4 Pro to avoid obstacles up to 98 feet away at front and rear, even in complex 3D environments. DJI’s new Matrice 600 Pro offers improved flight performance, a more powerful battery charging system and better loading capacity for cinema cameras. The M600 Pro is compatible with DJI’s Zenmuse camera series, the Ronin-MX gimbal and DJI Focus. It supports a payload of 6kg, meaning it can carry cameras from Micro Four Thirds systems to the RED EPIC. It comes with an updated battery charging hub, enabling users to charge its six batteries at the same time.
PRICE AND AVAILABILITY
The UK retail price of the Inspire 2 aircraft is to be £3059. The Inspire 2 Combo, which includes one Inspire 2 aircraft, one Zenmuse X5S, and a CinemaDNG and Apple ProRes licence key, is available for £6398. Customers who order the Inspire 2 Combo before 1 January 2017 can get a special price of £6269. Phantom 4 Pro’s UK retail price is £1589 with a standard controller. The Phantom 4 Pro+, which includes a Phantom 4 Pro aircraft and a high-luminance display remote controller, will be available at £1819.
CLASSIC DIVA-LITE TURNS ON THE COLOUR Cirro Lite makes lighting simple with the new dual mode Diva-Lite and Select LEDs WORDS JULIAN MITCHELL
One of the world’s favourite key lights, and one of the best named, the Kino Flo Diva-Lite, is now shipping with new colour firmware on board. Cirro Lite’s Select LED light was released earlier this year and it too has been upgraded with the new colour feature. Good news for those who have already invested in the Select LED, as they can get their firmware updated for free. Both the Diva-Lite and Select LEDs come in two sizes: two-foot (‘20’) and three-foot (‘30’). The main difference between them is that the Diva-Lite has built-in controls, while with the Select you can take the controller off and use it remotely. The Kino Flo Diva-Lite now turns into a high-end LED fixture with new features which Cirro Lite thinks will ‘reshape the future of cinema and television production’ – bold claims indeed. It will offer flicker-free white light, smooth dimming without colour shift and full green/magenta hue
control. With the new firmware it allows you to switch to Color Mode, which will in the future include up to two million colour choices through its three parameters, as well as an expanded Kelvin range from 2500K to 9900K. The light won’t be loaded up with the two million colours initially; Cirro is finding out what people want and will then offer a firmware upgrade.
END OF GELS?
You can replicate any gel in the gel book too. An app called Stagehand will help you get the gel you want. All you need is the hue angle from the gel information to dial it in to the light – great idea. Diva-Lite LEDs display True Match quality soft light (CRI >95), operating on universal input from 100V AC to 240V AC, or 24V DC. At 150 watts, the electronics draw as little as 0.65A (at 240V AC) or 7A (at 24V DC). With the new firmware, Diva-Lite
can potentially go into any studio production or on location.
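The quoted current figures line up with basic I = P/V arithmetic for a 150W fixture, with the small excess over the ideal numbers plausibly covering driver losses (a quick sanity check; the loss attribution is our assumption, not Cirro Lite’s):

```python
def current_draw(power_w, voltage):
    """Ideal current for a given load, ignoring converter losses."""
    return power_w / voltage

# 150W fixture: ideal draw vs the quoted spec-sheet figures
print(current_draw(150, 240))   # 0.625 A vs the quoted 0.65 A (AC)
print(current_draw(150, 24))    # 6.25 A vs the quoted 7 A (DC)
```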
LEFT: A 90° honeycomb louvre is supplied with the lights, a 60° louvre is also available. BELOW: The Diva-Lite features a built-in control panel.
MATCHING CAMERAS
The Diva-Lite LEDs feature a green/magenta control to match the spectral sensitivity curves of the most popular cameras and other light sources on the set. That’s a very important feature that gives you full control when matching to different light sources and eliminates the need for colour correction gels. When switched to Color Mode, users can dial in nearly any colour, from deeply saturated violet blue to a visual effects 560nm supergreen, and upward into the warmer climes of light straw, flame and salmon red. The fixture comes with more than 100 different colour presets; their names correspond to the actual professional gels. If that’s not enough, you can dial in your preferred angle and saturation values. The Diva-Lite LED DMX includes a built-in LumenRadio receiver and can be operated wirelessly with a LumenRadio transmitter. Wireless is an alternative to using DMX cables. The lights come with a 90° honeycomb louvre; a 60° honeycomb louvre is also available as an accessory.
DEFINITIONMAGAZINE.COM JANUARY 2017
SPATIAL, THE FINAL FRONTIER DSpatial is a suite of Pro Tools plug-ins designed to allow immersive 3D above and beyond object-based audio systems. Rafael Duyos describes how it works INTERVIEW JULIAN MITCHELL

Definition: What are DSpatial's main features?

Rafael Duyos: DSpatial converts Avid Pro Tools into a native 3D immersive and binaural workstation, redefining production elements such as reverb, panning and mixing, while unifying them into a single space. It enables post-production engineers to digitally recreate indoor and outdoor real-world environments by visually locating, moving and rotating sound sources, all in real time. Physical properties like inertia and the Doppler effect are automatically calculated and applied to your source material as you move sources to the desired position in the three-dimensional sound field. DSpatial boasts full integration with Avid Pro Tools, integrating up
to 96 real-time convolution reverbs using a groundbreaking Impulse Response modeller. It offers a host of effects such as:
• Proximity and distance
• Door and wall simulations
• A wide variety of surface, shape and structure emulations
• Grouping of audio elements as new objects

To make this vast level of control easy and intuitive, DSpatial offers up to ten points of multitouch capability. A speaker set designer and a monitoring section are also provided. A dual reverberation mode allows real-time scene changes: the outgoing reverb blends seamlessly with the new one without any audible cuts, clicks or glitches. Any mix made with DSpatial can be played back in an immersive or
non-immersive format without the need to down-mix. A hyper-realistic Binaural Mode is also available for when the end product will be heard on headphones.
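Duyos mentions that effects like Doppler are calculated automatically from a source's motion. The underlying physics can be illustrated with the textbook moving-source formula – this is the standard equation, not DSpatial's actual implementation:

```python
# Doppler shift for a source moving towards a stationary listener:
#   f_observed = f_source * c / (c - v)
# where c is the speed of sound and v is the source's velocity
# component towards the listener. Illustrative only; DSpatial derives
# v automatically from how you move sources in its 3D sound field.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def doppler_shift(f_source, v_towards_listener):
    """Observed frequency for a source approaching at the given speed."""
    return f_source * SPEED_OF_SOUND / (SPEED_OF_SOUND - v_towards_listener)

# A 440 Hz source approaching at 30 m/s is heard noticeably sharp:
print(round(doppler_shift(440.0, 30.0), 1))  # ~482.2 Hz
```

A stationary source (v = 0) passes through unchanged, which is why the effect only becomes audible as you drag a source quickly across the scene.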
IMAGES: DSpatial's graphical user interface
Definition: Is it a plug-in for Pro Tools or a stand-alone DAW?

Rafael Duyos: It is actually both. DSpatial is a set of nine different plug-ins, which then interface with our stand-alone application.
SENNHEISER AMBEO AMBISONIC MICROPHONE Sennheiser has made its first Ambisonics microphone – making sense of this decades-old technology WORDS ADAM GARSTONE
THE LASER was invented in 1959 by Gordon Gould (probably). Sadly, nobody could really think of anything to do with it until the invention of CD players, Laser Quest and the Laser Interferometer Gravitational-Wave Observatory. The same thing happened when Peter Fellgett and Michael Gerzon invented Ambisonics in the 1970s. Everyone was terribly excited, but no one could think of anything to do with it. Here was a technique which allowed accurate spatialisation of sound recordings not only through 360° around the listener, but also above and below. It was wonderful technology, but recording it was expensive and no one had the speaker set-up to listen to it. So it remained firmly in its niche, with only a couple of specialist manufacturers making microphones. In the 21st century, digital recording and encoding, coupled with demand from the likes of VR and Dolby Atmos, has made the
technology interesting enough that a mainstream manufacturer has now made an Ambisonic microphone. Sennheiser's AMBEO VR combines four high-quality microphone capsules in a tetrahedron to capture true Ambisonic signals.

A NEW HEAVYWEIGHT? Sennheiser knows a thing or two about making microphones, and the AMBEO VR is very solidly made, with a chunky casting holding the four 14mm capsules beneath a removable cover. The entire unit oozes quality, not least in terms of weight. The four capsules are broken out into four standard XLRs. You need to provide phantom power to each connector and, obviously, four good-quality mic-pres and ADCs. The world of Ambisonics refers to this raw, four-channel signal as A-format. Sennheiser supplies a free plug-in for most major DAWs to encode these four channels to Ambisonics'
B-format, which is what most other plug-ins understand. B-format also has four discrete signals, loosely representing an omni mic and three figure-of-eights – left-right, front-back and up-down.

We used a Sound Devices 788T field recorder to capture A-format recordings. It's worth pointing out that many previous Ambisonic microphones had hardware units to convert to B-format, so the 788T has a rudimentary B-format-to-stereo matrix to allow some kind of meaningful monitoring on the headphone output. Unfortunately, you can't monitor a raw A-format signal like that produced by the AMBEO VR in that way – other than by selecting channels to check for signal presence.

The AMBEO VR's mic capsules have to be small enough to sit close together. Surprisingly for 14mm capsules, Sennheiser quotes a 20Hz to 20kHz frequency response – similar mics usually only manage 40Hz.
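The A-to-B conversion Sennheiser's plug-in performs can be sketched as a simple sum-and-difference matrix over the four tetrahedral capsules. The capsule ordering below (front-left-up, front-right-down, back-left-down, back-right-up) is an assumption – check your recorder's channel mapping – and real encoders also apply per-capsule filtering and calibration that this sketch omits:

```python
# Minimal A-format -> B-format conversion for a tetrahedral mic.
# Capsule order assumed: FLU (front-left-up), FRD (front-right-down),
# BLD (back-left-down), BRU (back-right-up). Real encoders, including
# Sennheiser's plug-in, add filtering/calibration not shown here.

def a_to_b(flu, frd, bld, bru):
    """Convert four A-format samples to B-format (W, X, Y, Z)."""
    w = 0.5 * (flu + frd + bld + bru)   # omni (pressure) component
    x = 0.5 * (flu + frd - bld - bru)   # front-back figure of eight
    y = 0.5 * (flu - frd + bld - bru)   # left-right figure of eight
    z = 0.5 * (flu - frd - bld + bru)   # up-down figure of eight
    return w, x, y, z

# A source directly in front and level appears in W and X,
# and cancels in Y and Z:
print(a_to_b(1.0, 1.0, 0.5, 0.5))  # -> (1.5, 0.5, 0.0, 0.0)
```

This matrix is why the raw A-format channels are meaningless to monitor directly: the spatial information only emerges once the sums and differences are taken.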
Delving further into the specs reveals rather lower sensitivity, and higher noise, than other field recording mics, but in practice we didn't notice any problems, and we appreciated the extended low-frequency response when extracting an LFE channel.

YOUR FLEXIBLE FRIEND Which brings me to the flexibility – it's possible to mimic the responses of other microphone patterns. The AMBEO VR records spatially accurate sound from a sphere around the mic, so with some clever maths, plug-ins can extract, say, a stereo signal. The resulting audio can sound exactly as if it was recorded with a crossed pair of cardioid mics, and you can point this pair in different directions after the recording. Or you could extract five channels for conventional surround sound. You can even mimic several microphones at the same time. There are a variety of free plug-ins available, but Harpex
HAS SOUND QUALITY AND FLEXIBILITY TO BE THE ONLY AMBISONICS RECORDING MIC YOU’LL NEED
Ltd's software is worth a look. It's expensive, but justifies the price with excellent spatial accuracy and outstanding flexibility. A single AMBEO VR recording can be decoded into stereo, surround or binaural – it can even mimic a highly directional mic, like a short shotgun but with a nicer off-axis response.

One benefit of recording with a tetrahedral mic like the AMBEO VR is that it minimises the phase effects you can get with multiple-mic recording techniques – the capsules are close enough together to (almost) represent a single point. If conditions allow, many recordists remove the basket to ensure it doesn't produce any unwanted high-frequency effects, but the AMBEO VR kit also contains a simple foam windshield for when conditions aren't so helpful, and a simple but effective Rycote shock mount. There can be phase issues on playback, however, when the listener moves, as the 'point source' recording is played back with high correlation through several speakers. In practice, VR usually uses headphones, and auditoria for full surround playback, such as Dolby Atmos, are seated, restricting audience movement.

So Sennheiser has produced an outstanding microphone with the AMBEO VR. It's built to withstand the rigours of field recording, and has the sound quality and flexibility to be the only Ambisonics recording mic you'll ever need – equally at home capturing forest sounds in the Amazon, the Staatskapelle Berlin at full chat, or battlefield sounds for the next big movie. The AMBEO VR is £1440 inc. VAT.
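The "clever maths" behind extracting virtual microphones from B-format is a first-order beamformer: blend the omni W channel with the figure-of-eight channels steered towards the direction you want. The sketch below assumes the classic FuMa convention, in which W is recorded 3dB down (a factor of 1/√2); the pattern parameter and function names are our own illustration, not Harpex's or Sennheiser's API:

```python
import math

# Virtual first-order microphone decoded from horizontal B-format.
# Assumes FuMa convention (W recorded at -3 dB, i.e. x 1/sqrt(2)).
# Pattern p: 1.0 = omni, 0.5 = cardioid, 0.0 = figure of eight.

def virtual_mic(w, x, y, azimuth_deg, p=0.5):
    """Sample of a virtual mic pointed at azimuth_deg in the horizontal plane."""
    az = math.radians(azimuth_deg)
    return p * math.sqrt(2) * w + (1 - p) * (x * math.cos(az) + y * math.sin(az))

def virtual_stereo_pair(w, x, y, included_angle_deg=90.0):
    """A crossed cardioid pair - steerable after the fact by offsetting angles."""
    half = included_angle_deg / 2.0
    return (virtual_mic(w, x, y, +half),   # left cardioid
            virtual_mic(w, x, y, -half))   # right cardioid
```

For a source dead ahead (w = 1/√2, x = 1, y = 0), a cardioid aimed at 0° returns full level while one aimed at 180° nulls it – which is exactly why the pointing direction of the "pair" can be chosen in post.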
RESOLUTION REVOLUTION! RED’s EPIC-W 8K camera gets a full review
Why would A-list DOP Claudio Miranda shoot solely on a drone?
We review Avid's new 'intelligent' and scalable media storage
We dissect the lauded £100 million production from Netflix
The state of wireless camera systems – for data and RF
All the latest from the pro side of UAVs, audio & lighting
ON SALE 5 JANUARY 2017 ALSO AVAILABLE ON THE APP STORE