New Zealand Security - Dec 2019-Jan 2020


are likely to be considered unfair and to intrude to an unreasonable extent upon the personal affairs of the individual. Principle 8 is also particularly relevant: the information contained on a watchlist must be accurate, up to date, complete, relevant and not misleading. Watchlists based on mere suspicion are likely to fail the requirements of Principle 8, and trials of facial recognition technology have been shown to be biased and inaccurate, with extraordinarily high false positive rates. The limits on use described in Principle 10 may also present problems if state agencies intend to use driver licence images for a purpose other than the one for which they were originally obtained.

Accuracy

As stated earlier, the Metropolitan Police conducted six trials between June 2018 and May 2019. The trials generated 42 matches, and the people matched were approached by police officers in the field. Those matches resulted in an 81 percent false positive rate. In 2017, live facial recognition was used at the Champions League final in Cardiff, where 2,470 people were identified by the technology as criminals; subsequent research showed a 92 percent false positive rate. At the Notting Hill Carnival the same year, live facial recognition had a false positive rate of 98 percent.
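As a back-of-the-envelope check, note that the percentages quoted above measure the share of alerts that turned out to be wrong (false matches divided by total matches), sometimes called the false discovery rate, rather than false matches as a proportion of everyone scanned. A minimal Python sketch of the arithmetic, assuming roughly 34 of the 42 Metropolitan Police matches were incorrect (a figure inferred from the 81 percent quoted above, not stated in the presentation):

    def alert_error_rate(false_matches: int, total_matches: int) -> float:
        """Share of facial recognition alerts that were incorrect."""
        return false_matches / total_matches

    # Metropolitan Police trials: an 81 percent rate over 42 matches
    # implies roughly 34 incorrect alerts (inferred, not stated).
    print(f"Met Police trials: {alert_error_rate(34, 42):.0%}")   # 81%

    # Cardiff 2017: a 92 percent rate over 2,470 alerts implies
    # roughly 2,272 incorrect matches (again inferred).
    print(f"Cardiff 2017: {alert_error_rate(2272, 2470):.0%}")    # 92%

By contrast, the false positive rate in the strict statistical sense would divide incorrect matches by the total number of faces scanned, which for crowd deployments is far larger than the number of alerts.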

Although not used in the Metropolitan Police Service trials, live facial recognition software can be integrated into police body-worn cameras and city-wide surveillance camera networks, and that data may be subjected to automated analysis. It is technically feasible to create a database containing a record of each individual's movements around a city. The potential for this kind of use raises serious human rights concerns, because the data may involve false positives and could be used to identify unusual patterns of movement, attendance at specific events or meetings with particular people.

Bias

There is significant public interest in the issue of potential bias and discrimination. Concerns include discriminatory practices based on input data and watchlist composition, and bias built into live facial recognition technology on the basis of sex, race and colour. Different algorithms and different applications exhibit bias in different ways, which requires analysis on an application-by-application basis. Independent tests on 127 commercially available facial recognition algorithms have identified gender bias, with fewer false positives for men than for women, and racial bias, with higher false positive rates for dark-skinned women. An MIT study in the US showed that darker-skinned women are misidentified as men 31 percent of the time. False positive rates for black people are roughly twice those for white people.

The US Justice Department is using artificial intelligence, when somebody is arrested, to predict whether they are likely to reoffend in the next two years, and that analysis is used to determine whether or not they are granted bail. Studies have shown that such assessments of the likelihood that an offender will reoffend within two years carry a significant bias against African Americans.

I'd like to finish with the UK Surveillance Camera Commissioner's statement on the use of facial recognition and video surveillance. The Commissioner stated that overt surveillance is becoming increasingly intrusive on the privacy of citizens, in some cases more so than covert surveillance, because of the evolving capabilities of the technologies. The use of live facial recognition has the potential to affect rights under the European Convention on Human Rights (ECHR) and thereby influence the sense of trust and confidence within communities. The Commissioner said that it should be a fundamental consideration of any relevant authority intending to deploy live facial recognition that a detailed risk assessment process is conducted and documented, covering the operational risks, community impact risks, privacy and other human rights risks, and any other associated risks, prior to deployment. Such risks should be considered as part of the decision-making processes associated with the necessity and proportionality of its use.

This is an abridged transcription of a presentation delivered by David Horsburgh at the ASIS New Zealand Chapter Auckland Members Breakfast Meeting in November 2019.


