

Cole Whitecotton
Deepfakes and media forensics — NIGEL JOPSON discovers CSI: Audio

A report from Deeptrace, the Netherlands-based cyber security group, identified 7,964 deepfake videos online at the start of 2019. After nine months the figure had nearly doubled to 14,678, and it has kept growing since. In March 2019 it emerged that criminals had used artificial intelligence-based software to impersonate a chief executive's voice and order a fraudulent transfer of €220,000: the CEO of a UK-based energy firm thought he was speaking on the phone with the boss of his firm's German parent company, who asked him to send the funds to a Hungarian supplier (according to the company's insurance firm, Euler Hermes Group SA). In April 2020, Extinction Rebellion (XR) activists released a deepfake video of the Belgian Prime Minister Sophie Wilmès making a speech linking Covid-19 to the climate crisis. "There is a broad attack surface here — not just military and political but also insurance, law enforcement and commerce," says Matt Turek, programme manager for MediFor, a media forensics research programme led by the Defense Advanced Research Projects Agency (DARPA), part of the US defence department. DARPA is also working on a programme named SemaFor, whose aim is to identify semantic inconsistencies in deepfakes. The detection of fakes is going to get more and more difficult.

While other areas of professional audio are in a state of flux, one field is experiencing rapid growth, with more and more source material produced every day — whether 'deepfake', surveillance, reportage or law enforcement. This is audio forensics: the use of advanced audio analysis and filtering to extract voices and meaning from badly captured, noisy recordings, together with methods of validating and presenting the resulting evidence to courts and other bodies. Strangely, there is currently no formal qualification in the field, so if your aim is to become 'CSI: Audio', what can you do? The University of Colorado Denver is trying to fill this training gap: sign up for a Master of Science in Recording Arts and you can focus on Media Forensics in the MSRA-MF degree programme offered by its National Center for Media Forensics (NCMF). The intention is to take students from a wide range of disciplines and prepare them for careers in audio and video forensics, as well as other areas of hi-tech crime fighting. We caught up with one of the NCMF team, Cole Whitecotton, to discover what it takes to be an audio sleuth.

How did you become involved with the NCMF at the University of Colorado Denver?
I'm an alumnus of the programme, having graduated with a Master's degree three years ago. At the time, the Center was doing work for DARPA — the US Defense Advanced Research Projects Agency — and I became involved right at the beginning of the project. DARPA is the organisation responsible for developing emerging technologies for use by the US military and, when I completed my Masters, the Center opened up a position that combined working on the project with classroom support and online teaching.

Is this part of the initiative to identify 'deepfake' audio?
We were one team out of many working on detecting deepfakes. Our job was to generate fake audio and video materials that some of the other teams then tested to try to detect. One of these was using straight-up machine-learning AI, and others were working with hybrid versions that were a mix of learned and trained algorithms along with traditional
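The 'advanced audio analysis and filtering' the article describes usually begins with noise reduction, estimated from a noise-only stretch of the recording. Below is a minimal sketch of one classic technique, spectral subtraction; the function name, parameters and NumPy-only implementation are illustrative assumptions on our part, not the NCMF's method.

```python
import numpy as np

def spectral_subtraction(noisy, noise_excerpt, frame_len=512, hop=256, floor=0.05):
    """Reduce stationary noise by subtracting an average noise magnitude
    spectrum from each windowed frame of the noisy signal.

    noisy: 1-D recording; noise_excerpt: 1-D noise-only stretch (e.g. a
    silence between words). Returns an enhanced signal of the same length.
    """
    window = np.hanning(frame_len)

    def frames(x):
        # Stack overlapping windowed frames as rows of a 2-D array
        n = 1 + max(0, (len(x) - frame_len) // hop)
        idx = np.arange(frame_len)[None, :] + hop * np.arange(n)[:, None]
        return x[idx] * window

    # Average magnitude spectrum of the noise-only excerpt, one value per bin
    noise_mag = np.abs(np.fft.rfft(frames(noise_excerpt), axis=1)).mean(axis=0)

    spec = np.fft.rfft(frames(noisy), axis=1)
    mag = np.abs(spec)
    # Subtract the noise estimate, keeping a small spectral floor so that
    # over-subtraction does not produce 'musical noise' artefacts
    clean_mag = np.maximum(mag - noise_mag, floor * mag)
    clean = np.fft.irfft(clean_mag * np.exp(1j * np.angle(spec)),
                         n=frame_len, axis=1)

    # Weighted overlap-add resynthesis
    out = np.zeros(len(noisy))
    norm = np.zeros(len(noisy))
    for i, frame in enumerate(clean):
        start = i * hop
        out[start:start + frame_len] += frame * window
        norm[start:start + frame_len] += window ** 2
    return out / np.maximum(norm, 1e-8)
```

Real forensic tools add far more — adaptive filters, voice-band equalisation, and careful logging so the processing chain can be defended in court — but the frame/subtract/resynthesise structure above is the common starting point.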


Resolution V19.6 Winter 2020  

