The Use of Artificial Intelligence under Data Protection Law

Vanessa Kodilinye*

The world is currently experiencing a digital revolution1, driven and supported by technological innovation enabling the collection, processing and formation of new data-sets on an unprecedented scale and at an incredible speed, in all scientific, business, cultural and societal fields. No country wishes to be left behind2 in the new digital, borderless world order, or to miss the benefits of technological trends that are shaping an open-data driven world in the 21st century, which promise increased efficiencies and competitiveness for businesses and societies3. Technological players4 in high-velocity data-driven ecosystems are no longer satisfied simply with accumulating5 and statistically analysing 'Big Data'6 tunnelled through clouds. Such unstructured data, whether created by users or produced as by-products of computing7, now have an aggregated and insightful value in which large business organizations8 and governments have invested9, and

*

LL M (UWI), LL M in IT and Telecoms (Strathclyde, Scotland), Dr rer publ (Leuphana, Germany), Attorney-at-Law (Barbados), Solicitor (England and Wales), CAMS (Association of Certified Anti-Money Laundering Specialists)
1 The Fourth Industrial Revolution.
2 Center for Data Innovation: 'How Governments are Preparing for Artificial Intelligence', available at https://www.datainnovation.org/2017/08/how-governments-are-preparing-for-artificial-intelligence . Recent efforts at governmental level identified in the article include those of Japan (2015), with a 'Robotics Policy Office' established in 2017; Canada (2017), where, under the Pan-Canadian Artificial Intelligence Strategy, a CAD $125 million programme will be used to support research and establish AI institutes; China (2017), with its 'Next Generation Artificial Intelligence Development Plan'; and the UK (2017), which, as part of its digital strategy, will be spending £17.3 million on the promotion of responsible development of AI. As of 2017, the USA is focusing on the role of policy makers in an AI economy to equip the workforce with suitable skills.
3 On the other hand, uncontrollable development of AI could lead to economic, social and political challenges, and displacement of labour, and could reduce fundamental freedoms, leading to oppression.
4 Services provided may be more closely aligned to essential utilities in the modern world.
5 No predefined purpose for collecting or processing data (often personal data).
6 There is no set definition of 'Big Data'. See the classification of 'Big Data' developed by the UNECE Task Team on Big Data contained in an IMF Staff Discussion Note, 13/09/2017, entitled 'Big Data: Potential, Challenges and Statistical Implications', available at https://www.imf.org/en/Publications/Staff-Discussion-Notes/Issues/2017/09/13/Big-Data-Potential-Challenges-and-Statistical-Implications-45106 ; Marr, B: 'Big Data: 20 Mind-Boggling Facts Everyone Must Read', Forbes 30/07/2015, estimates that big data will generate 1.7 megabytes every second for each individual by the year 2020. For a more comprehensive analysis of 'Big Data', see https://privacyinternational.org
7 Created as secondary use of data which has been mined.
8 Such as Microsoft, Google and Facebook, which all use harvested data, novel data and shared data from competitors as training sets for their algorithms. https://www.partnershiponai.org
9 'Open data' may include the release of personal information by governments to be freely used by private enterprise; it is now considered an 'economic asset'. For example, private/governmental partnerships such as Google DeepMind and the UK Royal Free NHS Trust share medical data of identifiable patients without their consent in relation to the development of the Streams instant alert application. In July, 2017 the Information Commissioner ruled such sharing to be illegal. However, it is possible that under Art 25(2) of the General Data Protection Regulation, Regulation 2016/679, effective 25th May, 2018, this type of scientific co-operation, with advanced pseudonymization techniques, may produce 'pseudonymous data' for which there are fewer restrictions on processing, as they may be seen as technical and organisational methods of implementing privacy by design. Another example is China's proposed 'Social Credit System', the pilot project for which began in 2015 and to date operates in 40 cities in China.
It is essentially a collaboration between tech companies and government to build personal digital profiles of each citizen through the mergers of private and public databases, so as to allow business to be conducted only after verifying scores through the social credit system. Such a system is also designed to account for political


the data are also exploited, analysed10, correlated and accessibly traded at unprecedented rates; and yet, at the same time, existing data management systems are capable of processing only 0.5% of all existing digital data. Apart from quantum11 communication technology, in which China is at least 10 years ahead of the rest of the world, a new race involving the internet and computing is the development of classical Artificial Intelligence (AI): advanced intelligent machines. Artificial Intelligence is software or hardware based on cognitive computing, ie the ability of a computer, through constant and self-learning algorithms (not programmed), to mimic the thinking, reasoning, perceiving, communicating and decision-making patterns of the human brain. AI will become more valuable in data-driven ecosystems with the increase in human/computer interactions12, and for cognitive computing when it achieves a better understanding of natural language and environment13. AI is still considered an emerging technology. The legal profession has not been insulated from disruptive technology14 caused by Big Data15, legal AI platforms and deep-dive learning16, nor from positive or negative transformation from using these systems. Document-intensive and rules-based aspects of professions, such as the legal profession, in which clients believe there is a lack of transparency, appear to be ripe for artificial intelligence's data extraction, pattern recognition and exploration. AI can also understand contexts and concepts within documents17. These capabilities are shifting power away from knowledge-based professionals18 and into the hands of a few powerful tech companies which are
beliefs. See also China's Network Security Law, proclaimed in June, 2017, regarding the requirement for localised data retention and storage for government inspection in China by all network operators which use China's public communications systems.
10 Data mining. According to the Irish Times, 600,000 corporate documents have been accessed from the Registry in Barbados and those documents form part of the investigative journalism surrounding the 'Paradise Papers' (2017): https://www.irishtimes.com/business/paradise-papers-how-the-rich-and-powerful-hide-their-money-and-avoid-tax-1.3280699 .
11 Microsoft announced on 25/09/2017 the development of a new quantum computing language. See https://news.microsoft.com/features/new-microsoft-breakthroughs-general-purpose-quantum-computing-moves-closer-reality/
12 McKamey, M: 'Legal Technology: Artificial Intelligence and the Future of Law Practice' (2017) 22 Appeal 45.
13 'Calm down, Elon. Deep learning won't make AI generally intelligent.' Available at https://www.theregister.co.uk/2017/10/09/deep_learning_unlikely_to_lead_to_artificial_general_intelligence/ , in which the interviewee, Professor Bishop, a professor of cognitive computing, opined that computers lack understanding of the real world because not everything can be expressed by mathematical functions, nor can computers truly understand knowledge or feel sensations like humans.
14 Businesses throughout the world have reduced budgets for legal services since the financial crisis of 2008. There has been disruption both as to the practice of law and as to law as a business model.
15 Blockchain technology works differently from artificial intelligence, though there have been moves to combine both technologies.
The former technologies 'verify, execute and record': see http://www.datasciencecentral.com/profiles/blogs/blockchain-and-artificial-intelligence-1 . AI, unlike blockchain technology, relies on big data repositories.
16 An algorithmic flight-risk calculation programme, called the 'Public Safety Assessment Score', was a basis upon which a convicted felon with parole violations was released on bail; after release, video surveillance systems put him at the scene of a murder: see http://www.npr.org/2017/08/18/543976003/did-a-bail-reform-algorithm-contribute-to-this-san-francisco-man-s-murder
17 An example is understanding requirements in a specific case in respect of e-Discovery.
18 Dixon, C on 'Artificial intelligence: preparing lawyers for new technology in practice', citing Professor Richard Susskind that there will be the need for hybrid legal practices such as 'lawyer-software engineer', available at


willing to invest in complex algorithms19. Legal AI relies on access to troves of personal or corporate data for the creation of its applications, while the tech company which processes the data offers back some form of processed data through expensive licensed access. Legal AI is used by in-house legal departments for due diligence and compliance reviews20, document/contract generation21 and negotiations, legal research22, identification and assessment of risks and liabilities, and benchmarking the legal costs of outside counsel23; and, in the United Kingdom, most recently in the ongoing reform of the civil justice system, by the creation of an on-line court24 where cases are decided by a judicial bot. Artificial intelligence has been the driving force in reforming and capping legal costs in personal injuries cases in the United Kingdom, by analysing and predicting possible outcomes in litigation25. Unexamined risk assessment algorithms and experimental technologies have also been used for analysing entirely new sets of circumstances in order to assist with sentencing under the criminal justice system in the United States26, developments which, some may argue, are making the justice system less fair.

http://www.lawsociety.org.uk/news/speeches/artificial-intelligence-preparing-lawyers-for-new-technology-in-practice-speech/
19 For instance, the alliance between IBM Watson and Thomson Reuters, and the alliance between Dentons' NextLaw Labs and IBM Watson. In the Caribbean, the greater the public availability online of laws and regulations and 'Open Data', the greater the opportunity for legal AI development; apart from any forthcoming new barriers to the definition of 'practice of law', legal tech companies may decide that AI development would be contrary to profitability, for the market share is too small and laws are too fragmented across the jurisdictions.
20 'Luminance' for mergers and acquisitions and document review. Finance industry AI compliance for AML is commonly referred to as 'regtech' and has been responsible for the loss of thousands of regulatory roles: see https://www.ft.com/content/3da058a0-e268-11e6-8405-9e5580d6e5fb ; though software may exist in respect of data protection compliance, the new GDPR mandates the appointment in certain organizations of a data protection officer, who is required to abide by statutory tasks under Art 39.
21 See 'Klarity Law', powered by AI, for non-disclosure contract review and risk profiling of contracts, advertised as 97% more accurate, 15 times faster than a lawyer and 20 times cheaper than a paralegal. See also self-executing, dynamic blockchain smart contracts.
22 The creators of the DataLex Project (1984-2001) chronicle their experience of creating expert knowledge-based legal research databases in Greenleaf G, Mowbray A, Chung P: 'Building Sustainable Free Legal Advisory Systems: Experiences from the History of AI & Law' [2017] UNSWLRS 53.
23 Examples are the Serengeti Tracker and LegalVIEW Bill Analyser.
24 Briggs LJ: 'Civil Courts Structure Review: Final Report' (July, 2016).
25 Premonition.ai
26 State of Wisconsin v Loomis 2016 WI 68 - though the court in this particular case lacked understanding of how COMPAS (a risk assessment electronic tool used in sentencing) analysed risk. The raw data, source codes and weighting of inputs were not disclosable, due to trade secrets; see the amicus brief filed in Sept, 2017 by the American Civil Liberties Union in California v Billy Ray Johnson (2015) Case No F071640 regarding the use of the results of experimental DNA technology, TrueAllele, in sentencing to life without parole: https://www.aclu.org/legal-document/california-v-johnson-amicus-brief .


Tech companies are attempting to change not only how law is practised but also law as a business model27, and investors in legal AI systems are also attempting to influence the making of law28, how it is learned and how it is applied in the real world29.

Algorithms: the backbone of automated processing

Algorithms30 are essentially the backbone of any automated processing31 system, and are put to use in many different types of applications and prototypes. There is no established definition of the term. From an automated processing point of view, these are the source codes or raw data inputs32, however imperfect, which are responsible for the performance of the software, making certain decisions which could be based on inferences from the raw data. Algorithms are instructions for the programme architecture; they are often designed, non-neutral, written-in codes33, involving training sets34, parameters of data to be included, choices35 of learning, and a host of other instructions which are within the selection of the programme designers, engineers, developers and technologists36. By the supervised input of codes, the AI programme should yield computational outputs based on 'intelligent analysis' of all the available data, either yielding outputs based on pre-assigned assumptions and values37 or outputs (predictions or discoveries) based on learned data. Thus, if input data are labelled inappropriately, or contain irrelevant or inaccurate data or variables, any predictive output may be misleading or unjustified, and could contain cognitive biases. Algorithms possess limitations in their very designs, yet are touted as capable of making more accurate and objective decisions38.
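The dependence of outputs on the selection and labelling of training data can be pictured with a minimal sketch. The following Python fragment is a hypothetical illustration, not drawn from any system discussed in this article: the features, labels and the use of the scikit-learn library are assumptions for demonstration only. Two models are trained on the same applicants; in one training set a single label has been skewed against residents of a particular district, and the model learns that bias.

    # Hypothetical sketch: how biased training labels skew a learned model.
    # Features and labels are invented; scikit-learn is assumed available.
    from sklearn.linear_model import LogisticRegression

    # Features: [income_in_10k, years_employed, district_x_flag]
    # district_x_flag is a proxy variable that ought to be irrelevant.
    X = [
        [3, 1, 0], [5, 4, 0], [7, 8, 0], [2, 1, 0],
        [3, 1, 1], [5, 4, 1], [7, 8, 1], [2, 1, 1],
    ]
    y_fair = [0, 1, 1, 0, 0, 1, 1, 0]    # labels depend only on income/tenure
    y_biased = [0, 1, 1, 0, 0, 0, 1, 0]  # one district-X applicant mislabelled

    applicant = [[5, 4, 1]]  # same profile as an approved non-district applicant

    for name, y in [("fair labels", y_fair), ("biased labels", y_biased)]:
        model = LogisticRegression().fit(X, y)
        print(name, "-> approve probability:",
              round(model.predict_proba(applicant)[0][1], 2))

The point of the sketch is that nothing in the code looks 'discriminatory'; the bias enters silently through the training set, which is why the selection of training data (footnote 34) matters as much as the code itself.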

27 Billable hours, for instance, could be driven downwards with the help of research software or 'AI robo-lawyers'. Law firms in advanced countries are entering into joint ventures or licensing arrangements with legal tech start-ups. Legal disrupters who are intent on 'de-lawyering' will always view legal professional rules as 'protectionist' and against fair competition.
28 The objectives of Blue J Legal/University of Toronto are to create AI to make better laws and regulations for human activity by predicting case outcomes, reforming and improving upon algorithms which will then become the law, the outcomes of the AI system being accepted as law. The ambitious objectives of Blue J Legal are to delegate and to privatize the decision-making of law through algorithmic outcomes rather than through due process, the rule of law and the social and cultural context under which laws are usually made.
29 Greenleaf G, Mowbray A, Chung P: 'Building Sustainable Free Legal Advisory Systems: Experiences from the History of AI & Law' [2017] UNSWLRS 53, in which the authors identified the computing limitations of modelling legal expert systems, especially regarding the relationship between legal sources and legal reasoning and the lack of training sets for different legal problems. They suggest that, in computing, law as a domain is not a tabula rasa.
30 In a manual form, this would be equivalent to surveys or questionnaires.

31 For the definition of processing, see s 1(1) UK DP Act, 1998 [same in s 2(1) Jam DP Bill, 2017]. Covers even a combination of manual (Lindqvist [2003] C-101/01) and automatic acts of processing.
32 Could be obtained from inaccurate or obsolete sources, or from manipulated data.
33 Miscoding through errors could alter results.
34 The selection of data for training sets should be informed by data minimization principles.
35 The purpose limitation principle should be critical.
36 Advanced, unedited version of the UN Report of the Special Rapporteur on the Right to Privacy (19/10/2017).
37 This could also include social constructs; so, for instance, a credit rating algorithm may decide to rate place of birth and residence in a way which may produce discriminatory outcomes.
38 Depending on the purpose for which they are put to use and the level of financial gain for the creators.


For machine learning, algorithms rely heavily on access to 'big data' in order to learn, to adapt and to infer new knowledge, often through pattern recognition, though there are a few types of machine learning that rely more on self-taught algorithms than on 'big data'39. Algorithms in automated processing can also be designed solely for decision-making without human intervention. Thus, for example, failure to pay a landline phone bill within the grace period permitted by the telephone company may be the cause of an automatic pre-programmed disconnection of service, and such information may, unknown to the consumer, have a detrimental effect on his credit score rating. Targeted digital TV advertisements40 based on previous purchases or viewing habits are similarly often based on automated decision-making algorithms. A further example of automated decision-making which may have consequences for an individual is online recruitment: failure to input key words pre-set by an e-recruitment programme, or typing errors made by the applicant, could have adverse consequences for any individual who does not exercise his option of requesting an explanation of the automated decision-making.
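A minimal sketch of such fully automated decision-making follows. The grace period, function and field names are hypothetical; the point is only that both the disconnection and the opaque downstream credit report can flow from a pre-programmed rule with no human in the loop.

    # Hypothetical sketch of a fully automated decision with no human review.
    from datetime import date, timedelta

    GRACE_PERIOD = timedelta(days=14)  # assumed grace period

    def process_account(due_date: date, paid: bool, today: date) -> dict:
        """Pre-programmed rule: disconnect if unpaid past the grace period."""
        overdue = (not paid) and today > due_date + GRACE_PERIOD
        return {
            "disconnect_service": overdue,       # immediate, automatic effect
            "report_to_credit_bureau": overdue,  # opaque effect on the consumer
        }

    print(process_account(date(2018, 1, 1), paid=False, today=date(2018, 1, 20)))
    # {'disconnect_service': True, 'report_to_credit_bureau': True}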

Profiling

Automated processing is usually the source of automated profiling of data subjects through predictive outcomes or insightful recommendations. Profiling41 relies on pattern recognition of behavioural habits, interests and locations, or of the personal characteristics of data subjects, most often for purposes of evaluation in matters such as work performance, credit worthiness and propensity to commit crimes, and for direct marketing. Such data are correlated from various sources without the knowledge of the data subjects. These types of information nearly always 'relate' to identifiable42 data subjects. An everyday example of correlation of profiles is the
39 A step closer to 'Singularity'. Using its own intuition, 'AlphaGo Zero' devised its own novel and winning strategy by playing against itself in the game 'Go', a 3,000 year old Chinese board game: https://phys.org/news/2017-10-selftaught-superhuman-ai-smarter-makers.html
40 See https://phys.org/news/2017-11-ready-ads-digital-tv.html . Objection to direct marketing in Art 14(b) Directive 95/46/EC; s 11 UK DP Act, 1998; s 11 Jam DP Bill, 2017.
41 Current Art 15 of Directive 95/46/EC; ss 12 UK DP Act and Jam DP Bill, 2017. Profiling had always been implied within these sections, and personal data can only be lawfully processed under the conditions set out within the section. Art 4(4) of the GDPR defines profiling. Examples of allowed and disallowed types of profiling are given in Recital 71. Automated decision-making and profiling may take place within the justifications included in Art 22(2), with data controllers required to implement suitable safeguards for the data subjects' freedoms and legitimate interests and for them to exercise their right to contest decisions so made, especially where those decisions produce legal effects for those data subjects or significantly affect them. Human intervention on the part of the data controller to explain why decisions were so made may be necessary. Note that Art 8 of the GDPR requires parental consent for the lawful processing of data of children below the age of 16 (the UK will be choosing to lower the age to 13) and Recital 38 specifically mentions special protection in the use of personal data of children for marketing and the creation of personality and user profiles. The right to be forgotten under Recital 65 also applies to adults who once provided consent as children for an internet presence.
42 Under s 1(1) of the current UK DPA, 1998 [Jam DP Bill, 2017, s 2], indicators which could make a person directly or indirectly identifiable include 'personal name' as an identifier by itself (unless a common name is considered as personal data): Edem v IC & Financial Services Authority [2014] EWCA Civ 92, departing from the approach taken by Auld LJ in Durant v Financial Services Authority [2003] EWCA Civ 1746; on-line identifiers such as ISP addresses (Scarlet Extended SA (2011) Case C-70/10), dynamic IP addresses (Breyer v Germany (2016) C-582/14), cookie IDs, location data and RFID tags are all considered to be within the definition of personal data. The GDPR codifies existing


revealing of consumers' personal shopping preferences by means of data gathered from the use of supermarket loyalty cards. The particular supermarket may be part of a group of companies offering not just retailing of goods but also insurance, financial, ICT and energy services. A record of the purchasing of unhealthy foods may indicate a poor lifestyle and may cause health insurance premiums to rise. Such companies may also allow third party data analytic43 companies access to personal data through third party maintenance licences, unknown to data subjects, with such companies further processing, repurposing and aggregating personal data. This is easily ascertainable, because it is known that the loyalty card of one supermarket may be accepted by various other unconnected businesses having no legal relationships with each other. Another example is profiling through personal and sensitive data provided without any explicit or well-informed consent, where data are gathered by automated processing through observation or inference from telemetry metadata stored on users' devices. Such obscure processing of personal data can lead to manipulation by data controllers of online behaviour or choices. In October 2017, the Dutch Data Protection Authority44 made an investigative finding that the processing activities of the Microsoft Windows 10 Home and Pro operating systems were in breach of several sections of the Dutch Data Protection Act, in that those operating systems, through their default settings, were continuously collecting, at intervals of between 15 minutes and 4 hours, full telemetry metadata of Apps usage and web browsing which were in the nature of personal and sensitive data. These types of collection were for processing activities of which users often were either only informed in a general manner or which they did not understand, and the purposes of the processing were changeable through experimental research by Microsoft engineers. The Authority found that users were informed in a general manner of just three out of five purposes, and were informed only in a limited way of the categories of data to be collected through the processing of their personal data. Further, Microsoft failed to inform users that their Apps usage was being monitored in order to show personalised advertising and recommendations. The Authority also found that, through the lack of transparency regarding pre-determined purposes of processing and the technical impossibility of ascertaining the categories of data collected and allocated to particular purposes, Microsoft could not be shown to have been processing personal data, as required under the Dutch Data Protection Act, for 'specified, explicit and necessary legitimate purposes', having had no clear pre-determined purposes for the processing of the collected data; nor could users, by failing to switch off the default settings, be taken to have given specific and informed consent. Thus, the processing could not have been conducted fairly or even lawfully within the legal grounds for processing of personal data under the Data Protection Act, nor could it have been done on data protection principles. Data protection of personal and sensitive data45, where automated decision-making and automated profiling are concerned, is likely to be invoked more widely under the General Data Protection
case law into its definition of personal data, and so it includes online identifiers and identification numbers which could make a person identifiable (see Recital 30).
43 Big data analytics must also comply with the accuracy principle, from the time of collection through analysis and application.
44 See https://autoriteitpersoonsgegevens.nl/en/news/dutch-dpa-microsoft-breaches-data-protection-law-windows-10
45 Additional requirements must be present for processing data within the special categories (sensitive data) under Art 9 of the GDPR, as processing of such data is otherwise prohibited. Key derogations from the prohibition include explicit consent,


Regulation by data subjects, because of the proliferation of artificial intelligence in big and open data analytics46. Data controllers who are not able to rely on consent47 provisions for processing may seek to claim that they have necessary and legitimate interests in processing such data which override the interests of the data subjects48. Any decision made solely on automated processing or profiling based on the use of self-learning algorithms must be susceptible of meaningful explanation as to how it was arrived at, so that data subjects are given meaningful opportunities to exercise their rights of access and to contest49 any such decisions. If controllers were unable to provide meaningful explanations50 concerning the points at which personal data were considered and the reasoning 'why' a self-learning algorithm made a decision, a decision arrived at by automated processing would not be considered to have been arrived at by fair, lawful and transparent51 means, especially where the decisions based on those self-learning algorithms were to produce legal consequences for, or significantly affect, the data subjects concerned.
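To make concrete what a 'meaningful explanation' might look like for a simple scoring model, consider the following hypothetical sketch, in which the weights, threshold and feature names are invented: a linear model's decision can be itemised into per-feature contributions, telling the data subject which personal data were considered and how much each mattered. Genuinely self-learning, non-linear models resist exactly this kind of decomposition, which is why explainability remains contested.

    # Hypothetical sketch of a per-decision explanation for a linear scoring
    # model. WEIGHTS and THRESHOLD are invented for illustration only.
    WEIGHTS = {"income": 0.6, "years_employed": 0.3, "district_x": -0.9}
    THRESHOLD = 2.0  # assumed approval cut-off

    def explain_decision(features: dict) -> None:
        # Itemise each input's contribution so the reasoning is inspectable.
        contributions = {k: WEIGHTS[k] * v for k, v in features.items()}
        score = sum(contributions.values())
        print("decision:", "approve" if score >= THRESHOLD else "refuse")
        for name, c in sorted(contributions.items(),
                              key=lambda kv: abs(kv[1]), reverse=True):
            print(f"  {name}: contributed {c:+.2f}")

    explain_decision({"income": 5, "years_employed": 4, "district_x": 1})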

Two prominent and topical examples of automated profiling are:

(1) Automated travel profiling: the Passenger Name Record Agreement between Europe and Canada52

necessity for exercising rights of the controller in the field of employment, legal claims, substantial public interest, preventative or occupational medicine, public health, etc.
46 Recital 89 of the GDPR clearly states that the focus is now on large-scale processing involving new technologies, where there have been no prior data protection impact assessments and where these types of processing pose a high risk to the rights and freedoms of data subjects. This is a shift away from the notification of processing to DP Authorities required under Directive 95/46, and Parts III of the UK DP Act, 1998 and Jam DP Bill, 2017.
47 Consent under the GDPR has a more rigorous threshold. Data controllers will be required to keep records to show consent, for which notice must be unbundled from other documentation and must be given for each type of processing. Consent must be genuinely based on free choice; the data subject must be informed of the purposes for the data collection; and processing must be made known to the data subject beforehand. No imbalance between data subjects and controllers must exist (Recitals 42 and 43). Data subjects also have the right to withdraw consent at any time.
48 The principles of the reasonable expectations of data subjects and of proportionality are taken into account when balancing competing interests (Recitals 47 and 49).
49 Art 18 GDPR on the rights of the data subject to restrict processing on grounds of inaccuracy of personal data, unlawful processing, etc.
50 The data controller has to justify the decision.
51 Art 13(2) GDPR.
52 Under the existing Directive 95/46/EC, Arts 25 & 26 address transborder flows of personal data to third countries. The general requirement is that the third country must have an adequate level of protection of the rights and freedoms of data subjects in relation to the processing of personal data. Whether a third country has met the adequacy test for the level of protection of rights and freedoms (both general and sectoral laws) has been the subject of important rulings by the ECJ, with the courts making pronouncements on the legality of agreements between countries: the Safe Harbour scheme was ruled illegal in Schrems (One) (2015), and the Privacy Shield agreement is currently being challenged before the ECJ in Digital Rights Ireland v Commission Case T-670/16 and La Quadrature du Net and Others v Commission Case T-738/16. The legality of binding corporate rules and standard contractual clauses has been called into question in Schrems (Two) (2017). More particularly, these rulings have arisen in the wake of Snowden's revelations, with the courts now prepared to take a closer and holistic examination of the level of protection of personal data in the laws themselves and in the practical implementation of protections and remedies available for citizens. See Arts 44-50 GDPR.


(Opinion 1/15 of the Grand Chamber of the Court, 26th July, 2017)53 Under a proposed Agreement between Canada and the European Union, all passenger name records were to be transferred to Canada, where the personal data of Europe-to-Canada air travellers were to be analysed and processed by way of pre-established automated criteria through 'various' databases. The stated purpose of the Agreement was the combatting of terrorism and serious transnational crime before the passengers' arrival in Canada, in the interest of assessing public safety risks to Canadians. The PNR to be collected included, among its 19 headings, 'all available contact information', 'all available payment/billing information', 'travel itinerary' and 'all advance passenger information data collected for reservation'. It was admitted during the course of the hearing before the Grand Chamber that the automated system had a significant margin of error, and that, when the PNR information was taken as a whole, the unlimited scope of automated processing, without regard to objective necessity, provided for interferences with the fundamental rights and freedoms of European travellers without the appropriate safeguards against unlawful access and unauthorised use of data. The proposed Agreement, in the form submitted by the EU Parliament for review, was not approved by the Court for a number of reasons, one of which concerned whether the automated processing models were reliably non-discriminatory and purpose-specific for the context within which the Agreement was to be applied, in so far as the consulted databases were to be specific to the fight against terrorism and serious transnational crime.

(2) Political profiling: a US resident attempting to reclaim personal data under the British Data Protection Act from a British-headquartered data controller

An American professor discovered that he, along with over 240 million registered voters in the US, had been the subject of politically targeted advertisements relating to the 2016 US elections through the opaque collection and matching of personal data from a multitude of public and purchased anonymised private consumer database sources compiled by a company called SCL Group Ltd, located in London, which was the parent company of a Delaware company called Cambridge Analytica. The Professor made a data subject access54 request under section 7 of the UK Data Protection Act 1998 to the London data controller (SCL Group Ltd) to know what personal data Cambridge Analytica held about him. The parent company revealed that it had processed his personal data for the purposes of behavioural polling and communication outreach, as well as for predictive algorithmic development. It also revealed that the company had obtained sensitive data such as ethnicity from data vendors, research survey data which formed the basis of

53 An incompatibility exists between the EU Parliament's vote on 25/10/2017 in favour of a smart border entry/exit system, retaining biometric data of non-EU passengers for a maximum period of 5 years, and the CJEU's Opinion 1/15 of July, 2017 on the European/Canadian PNR Agreement.
54 Australian resident beneficiaries of a Bahamian trust were successful in the English Court of Appeal in bringing a data subject access request against UK solicitors under section 7 of the UK DP Act, 1998 for information regarding the exercise of the Bahamian trustee's discretion, on the ground that the UK solicitors were agents of the Bahamian trustee: Dawson-Damer v Taylor Wessing LLP [2017] EWCA Civ 74.


its predictive modelling, and modelled data or predictions which the system had made about the Professor. He was scored on a number of matters based on those proprietary algorithms, and his data were then further disclosed and processed by third parties. He was not provided with the psychometrics that were used in profiling him, nor with the sources of information used in the psychometrics. It is believed that he is currently seeking disclosure, or algorithmic accountability, in the UK court.

Importance of greater protection

Personal data yield their highest value to investors in big data when individuals can become identifiable55. Individuals become identifiable56 not only by providing the information or data themselves, but also indirectly, through the non-disabling surveillance sensors which technology companies use in smart devices in order to opaquely and continuously gather behavioural data which can then be converted into transmittable electronic signals57, or through 'open data'. Though IoT58 technology has been in existence for some time, it is within the last five years that there has been a proliferation of deployment by technology companies; always-on, always-connected devices59 pose huge data security risks and difficult compliance challenges, even within the new GDPR framework60. In the borderless digital age, in which our data are monetized and are an economic asset to governments61 and commercial enterprises, an appropriate balancing reaction to the challenge of frequent non-transparent interference with individuals' rights would be to allow individuals

55 De-identified data can often be re-identified. Anonymised data, to which data protection principles may not apply, are different from pseudonymisation and encryption, which are security measures.
56 For instance, in relation to browser-generated information, once browsing histories (usually processed for tailoring advertising) can be linked to a specific device (as is usually the case, for instance, with third party apps), the data will be considered personal data regardless of whether the search engine company has exercised data segregation. Damages may be awarded for distress without the need for showing pecuniary loss, by virtue of the horizontal application of the right to an effective judicial remedy contained in Art 47 of the EU Charter of Fundamental Rights: Google Inc v Vidal-Hall [2015] EWCA Civ 311. In that case, s 13(2) of the UK Data Protection Act, 1998, which required a claimant to show evidence of pecuniary loss, was by-passed. [s 71 Jam DP Bill, 2017 contains a requirement to show damage.]
57 Such as those embedded in many commonly used digital consumer devices, for example the A11 Bionic chip of the iPhone 8 and iPhone X (described by Apple as 'superhuman intelligence'), housing powerful neural network (deep learning) sensors: see https://www.apple.com/iphone-x/ . AI technologies have now also been embedded in the internet of things (IoT).
58 Newer technology is now at the Internet of Robotic Things (IoRT).
59 Audio recordings, text data and transcribed recordings from Echo voice assistants were the subject of US warrants in criminal cases. Recordings were 'accidentally' made before and after a command.
60 Challenges include data controllers identifying the location of personal data when answering data subject access requests; obtaining customers' informed consent (especially that of children) for each identified processing operation; the lack of inbuilt security against data theft for devices which may not have been built with privacy by design; and the requirement of informing the ICO of data breaches.
61 Unstructured data were reverse engineered by the US-based International Consortium of Investigative Journalists in the 'Panama Papers' (2016) leaks through unauthorised data exfiltration. Data illegally obtained by theft have been bought by governments throughout the world, including the UK, France and Germany. Similar data protection breaches, the 'Paradise Papers' (October, 2017), have been brought to light in connection with the offshore firm of Appleby in Bermuda.


greater control over, and protection of, their own personal and sensitive data, including what information about themselves should be made available to others62. In the USA63, individuals' personal data in the hands of commercial companies may be considered proprietary data owned by the companies which process and store them, whereas data ownership in Europe takes the opposite turn, as is evidenced by the stricter applicability of data protection legislation. More effective and meaningful protections may require a greater legal convergence of general data protection principles and applications across the world64, so that the level of protection afforded under different legal systems65 will no longer pose obstacles to the free flow of personal data. Under the current 1995 EU Directive legal framework66, it was felt that data protection of personal information, as a fundamental qualified right, was not effectively protected or fit for the purpose for which it was intended in the new digital world67, with the additional obstacle of inconsistent implementation throughout the Union. One innovation in the GDPR is contained in Article 25, which introduces the concepts of 'data protection by design' and 'data protection by default'. With regard to the former, the controller is enjoined to implement appropriate technical and organisational methods (such as pseudonymisation) which are designed to implement data protection principles, such as data minimisation, and to integrate the necessary safeguards into data processing, in order to protect data subjects' rights. As to 'data protection by default', the

62 Algorithms alone must not be allowed to predetermine on-line data about individuals or the right of the individual to his own self-determination, and so the 'right to be forgotten' is explicitly provided for under Article 17 and Recitals 65-67 of the GDPR. The right to be forgotten is implied under the current Directive 95/46/EC in Art 12, which allows data subjects to request the rectification, erasure or blocking of personal data where such data are inaccurate, inadequate, irrelevant or excessive for the purpose of the processing. Thus, there is the right of data subjects to require, on a case by case basis, the delinking of their names from search engines' structured information which is directly related to them, even if such information is controlled by third parties; search engine results of the same content sought can be made available through a query other than by name: see Google Spain SL, Google Inc v Agencia Española de Protección de Datos & Gonzalez CJEU (2014) C-131/12; Google v CNIL (French Data Protection Authority) (awaiting a preliminary ruling from the CJEU on the practical implementation of the right to be forgotten where sensitive personal data such as political allegiance or prior criminal behaviour are concerned, as Google is currently seeking to restrict its responsibilities regarding delinking to specific geo-locations, rather than delinking globally, which it claims will be in conformity with para 38 of the previous Google Spain decision).
63 A survey of US laws regarding the protection (or lack thereof) of personal data is contained in paragraphs 164 to 262 of Schrems (Two) [2017] IEHC 545.
64 See 'Exchanging and Protecting Personal Data in a Globalised World', EU Com (2017) 7 dd 10/1/2017. An example is the divergent outcomes of cases involving search engine algorithmic combination of words in search engine suggestions.
65 See HIPCAR: Privacy and Data Protection: Model Policy Guideline & Legislative Texts (2012). Art 50(d) of the GDPR encourages the exchange of data protection legislation and practices, including on jurisdictional conflicts with third countries.
66 In addition, the 1981 Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS 108) is to be replaced, together with a new e-Privacy instrument to replace Directive 2002/58/EC. A few instances of general DP legislation in the Caribbean include: Trinidad and Tobago's Data Protection Act, 2011 (currently under review); The Bahamas' Data Protection (Privacy of Personal Information) Act 2003 (never amended); St Lucia's Data Protection Act, No 11 of 2011, as amended by Act No 2 of 2015; the Cayman Islands' Data Protection Law, 2017 (to be brought into force); and Bermuda's Personal Information Protection Act 2016 (to be brought into force in 2018).
67 See 'Proposal on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)', EU Com (2012) 11 final dd 25/01/2012.


controller is required to ensure that personal data are not, by default, made accessible to an indefinite number of natural persons. Thus, the controller needs to pay attention to matters such as the quantity of personal data collected, the extent of the processing, the period of storage and the degree of accessibility of the data. The extra-territorial reach of the guarantees of data protection under the new General Data Protection Regulation (EU 2016/679) attaches once the online data of EU residents are being tracked or processed, or where they are being targeted for commerce by data controllers from outside the EU68; neither the data controller nor the processor need be within the EU for the GDPR to apply. Under the new GDPR regime, apart from an expanded definition of personal data, new categories of data have been created, while at the same time restrictions have been placed on the categories of data that may be lawfully processed. In addition, new data subject rights such as data portability69, the right to receive compensation without showing damage70, the right to be forgotten71 and the right of notification of security breaches72, together with expanded roles73 and powers of sanction74 of data protection/information commissioners, have been provided for. Certain principles75 are no longer implied but have been elevated to explicit legal requirements. One such requirement is the principle of Accountability.
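Returning to Article 25's 'data protection by design', pseudonymisation as a 'technical and organisational measure' can be pictured with a minimal hypothetical sketch: the field names and key handling below are invented for demonstration, and a real deployment would also need key management, key rotation and an assessment of re-identification risk. A direct identifier is replaced by a keyed pseudonym, and only the data the purpose requires are retained, illustrating data minimisation at the same time.

    # Hypothetical sketch of pseudonymisation by design. The key is held
    # separately from the working data set; field names are invented.
    import hashlib
    import hmac

    SECRET_KEY = b"held-separately-by-the-controller"  # never stored with the data

    def pseudonymise(identifier: str) -> str:
        # Keyed hash: stable pseudonym, not reversible without the key.
        return hmac.new(SECRET_KEY, identifier.encode(),
                        hashlib.sha256).hexdigest()[:16]

    record = {"name": "Jane Doe", "nhs_number": "943 476 5919",
              "diagnosis": "asthma"}
    working_copy = {
        "patient_ref": pseudonymise(record["nhs_number"]),  # pseudonym, not identity
        "diagnosis": record["diagnosis"],  # minimisation: keep only what is needed
    }
    print(working_copy)

Note that, as footnote 55 observes, pseudonymised data remain personal data: the controller holding the key can re-identify, which is why the sketch keeps the key apart from the working copy.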

68 Article 3 of the GDPR. Under Directive 95/46/EC, the main establishment of the controller is no longer determined by its registered address or headquarters, branch or subsidiary presence within the EU: see the CJEU decisions in Google Spain (2014) (the single appointment of a data controller in one EU country is not a bar to a DP authority in a different EU country exercising jurisdiction); Weltimmo SRO v Hungarian DPA (2015) C-230/14: a three-pronged test of (i) real and effective activity, however minimal; (ii) whether the activity is through a stable arrangement; and (iii) whether the processing of personal data is within the context of that activity; Schrems (One) (2015) C-362/14 concerns the transfer of Facebook data of an EU resident to US servers. The USA Safe Harbour Scheme was ruled by the CJEU to be invalid because it did not apply to US public authorities, who may interfere with the fundamental rights of people who have no real administrative or judicial redress. In Schrems (Two) [2017] IEHC 545, the Irish High Court on 3/10/2017 approved the filing of a complaint for determination of the validity of standard contractual clauses which have been approved by the EU Commission for the transfer of personal data of EU residents to the USA. Under Article 3 of the GDPR, territorial applicability covers processing performed outside of the Union.
69 Art 20; Recital 68 requires data subjects to be provided with structured, machine-readable and interoperable formats of their data, which can also be transferable from one data controller to the next. Data portability does not prejudice the right to request erasure of personal data.
70 Art 82.
71 Art 17 GDPR, which exists as the right of erasure under the current Directive 95/46/EC.
72 Directly to the data subject under Art 34 GDPR, and to the supervisory authority within 72 hours under Art 33 GDPR.
73 Art 58 GDPR, under which power is given to issue certification and accreditation (Arts 42 and 43) and to carry out investigations and data protection audits, among other powers.
74 Under Art 83 GDPR, the supervisory authority has the power to impose fines for data protection breaches, including security breaches, of up to 4% of total global annual turnover.
75 The eight principles under which personal data may be processed: (1) fairly and lawfully; (2) for specified and lawful purposes; (3) adequate, relevant and not excessive in relation to purpose; (4) accurate and kept up to date; (5) retained for no longer than is necessary for that purpose; (6) processed in accordance with the rights of data subjects; (7) technical and organizational measures against unauthorized and unlawful access, accidental loss, destruction or damage to personal data; (8) third country transfer of personal data only if that country provides an adequate level of protection in relation to the processing of data and the rights and freedoms of data subjects. (Also contained in Jam DP Bill, 2017 as 'Standards', ss 22-31.) These are reorganized into seven principles under the GDPR, with a new accountability principle requiring a controller to show that it is in compliance with the principles (Art 5).
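The 'structured, machine-readable and interoperable' format that footnote 69 describes for data portability can also be pictured with a small hypothetical sketch; the record fields and versioning label are invented, and JSON is merely one common machine-readable, interoperable format a controller might choose.

    # Hypothetical sketch of an Art 20-style portability export: the controller
    # hands the data subject their data in a structured, machine-readable
    # format suitable for transfer to another controller.
    import json
    from datetime import date

    def export_subject_data(subject_id: str, records: list) -> str:
        package = {
            "subject_id": subject_id,
            "exported_on": date.today().isoformat(),
            "format_version": "1.0",   # invented versioning field
            "records": records,        # the personal data actually held
        }
        return json.dumps(package, indent=2)

    print(export_subject_data("subj-0001", [
        {"category": "contact", "email": "jane@example.com"},
        {"category": "purchases", "item": "groceries", "date": "2017-11-02"},
    ]))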


Both controllers and processors will now be responsible for compliance with the GDPR. An important feature is the mandatory requirement, in specified cases, for the appointment of data protection officers who are suitably qualified, having expert knowledge of data protection law and practice, so as to ensure compliance, to conduct impact assessments when implementing new technologies, and to ensure that data subjects have suitable access to exercise their rights. Data protection officers have specified tasks to comply with under the GDPR76.

76 Arts 38 and 39.

