

JAMAICA BAR ASSOCIATION NOVEMBER CONFERENCE 2017

Big Data, Small Island: Protection, Privacy and Compliance
Grace Lindo and Sundiata Gibbs
10/25/2017


TABLE OF CONTENTS

INTRODUCTION
BIG DATA AND THE INCREASED IMPORTANCE OF DATA PROTECTION
NATIONAL IDENTIFICATION AND REGISTRATION BILL
PROPOSED DATA PROTECTION IN JAMAICA
APPLICATION OF THE DATA PROTECTION BILL IN THE AGE OF BIG DATA
TREATMENT OF PSEUDONYMISED/ANONYMIZED DATA
LITIGATION ISSUES: REMEDIES AVAILABLE IN THE SUPREME COURT
CONCLUSION

INTRODUCTION

1. In this paper we examine the emerging use of Big Data and the importance of implementing data protection legislation to protect against the dangers accompanying this new phenomenon.

2. Accordingly, we will discuss the various data protection standards to which data controllers in Jamaica will need to adhere. The paper will highlight the commercial uses of data and how, beyond data protection specific laws, the utility of data is becoming evident (e.g. in the Credit Reporting Act and the National Identification Bill).

BIG DATA AND THE INCREASED IMPORTANCE OF DATA PROTECTION

3.

It is difficult to appreciate the importance of the impending Data Protection Act without appreciating the prevalence of data processing in today’s society and its impact on our everyday lives. 1|Page


4. Companies are storing, collating and processing our personal data in large quantities for commercial gain. This is particularly true of the technology sector, given the ubiquity of mobile applications and other tracking-based technologies. Governments are engaging in similar practices to serve their citizens in more effective ways. To understand the magnitude of this phenomenon, consider that more information has been created, stored and processed in the past four years than in the entire previous history of the human race.

5. The trend of storing and processing large quantities of data has caused data to become a commodity as valuable as oil in modern society. In May 2017 The Economist published an article titled "The world's most valuable resource", in which it made this point. The authors of the article wrote:

A new commodity spawns a lucrative, fast-growing industry, prompting antitrust regulators to step in to restrain those who control its flow. A century ago, the resource in question was oil. Now similar concerns are being raised by the giants that deal in data, the oil of the digital era.[1]

6. Intangible assets (such as data) are notoriously difficult to value, and many companies whose most important asset is their large collection of data do not reflect that data as an asset on their balance sheets.

7. Facebook Inc. ("Facebook") is a good illustration of such a company. Facebook started trading on the NASDAQ stock exchange in May 2012. When it issued its initial public offering (IPO), its 2011 audited financial statements reflected its assets as being valued at roughly US$6.3 billion. However, its bankers valued its shares at US$38 per share, which equated to a market value of US$104.2 billion.

[1] The Economist (2017, May 6), "The World's Most Valuable Resource is No Longer Oil but Data". Retrieved from https://www.economist.com/news/leaders/21721656-data-economy-demands-new-approach-antitrust-rules-worlds-most-valuable-resource



8. While it is not unusual for there to be a divergence between a company's book value and its market value, this divergence was abnormally wide (a difference of approximately US$100 billion). Admittedly, one could not use this variance to conclude that Facebook's stored data was worth US$100 billion. However, it would not be unreasonable to say that the absence from Facebook's balance sheet of any monetary representation of its stored data likely played a significant role in the remarkable disparity.

9. The various ways in which companies are processing data give us insight into its value.

10. Retailers such as Target started using big data analytics over a decade ago to predict the future shopping habits of their customers. This practice resulted in Target being the subject of a 2012 New York Times article which revealed just how effective (and intrusive) its big data operation was.

11. According to Charles Duhigg, the author of the piece, Target has for many years been collecting as much of its customers' personal data as possible.

12. In 2002, its marketing department recognised that expectant parents, as a demographic, changed their shopping habits drastically during and after a pregnancy. The marketing team decided to process the large swaths of data it had at its disposal to predict which of its customers were likely to be pregnant.

13. The idea was simple. If Target could identify and market baby products to expectant mothers before they gave birth, it could foster brand loyalty to Target before any other retailers (who relied on public birth records) knew a baby was even on the way.

14. Target's data analytics team examined the shopping habits of women who signed up for its baby registry and ran an algorithm to see whether any patterns emerged.

15. They discovered that many of the women who set up a registry bought unscented lotion in their third month of pregnancy and various supplements a few weeks later.


16. Target was eventually able to identify several other products its pregnant customers bought during their pregnancies, which gave an accurate indication of how far along those customers were and their likely due dates.

17. With that information, Target developed a pregnancy prediction algorithm. Whenever the algorithm revealed the relevant purchasing pattern, Target would send the expectant mother coupons tailored to the stage of her pregnancy.

18. Mr. Duhigg described a conversation he had with a Target employee which explained how the system worked. He wrote:

"One Target employee I spoke to provided a hypothetical example. Take a fictional Target shopper named Jenny Ward, who is 23, lives in Atlanta and in March bought cocoa-butter lotion, a purse large enough to double as a diaper bag, zinc and magnesium supplements and a bright blue rug. There's, say, an 87 percent chance that she's pregnant and that her delivery date is sometime in late August."[2]
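To make the mechanics concrete, the following short Python sketch illustrates the kind of purchase-pattern scoring described above. It is a minimal, purely illustrative sketch: every product, weight and threshold here is invented for the example, and Target's actual model is proprietary and far more sophisticated.

    # A toy "pregnancy score": all products, weights and the threshold
    # below are hypothetical -- the real Target model is proprietary.
    PREGNANCY_SIGNALS = {
        "unscented lotion": 0.30,
        "cocoa-butter lotion": 0.20,
        "zinc supplement": 0.15,
        "magnesium supplement": 0.15,
        "oversized purse": 0.10,
        "bright blue rug": 0.10,
    }

    def pregnancy_score(purchases):
        """Sum the weights of the signal products found in a basket."""
        return sum(w for product, w in PREGNANCY_SIGNALS.items()
                   if product in purchases)

    # The fictional shopper from the quoted example.
    jenny = {"cocoa-butter lotion", "oversized purse",
             "zinc supplement", "magnesium supplement", "bright blue rug"}

    if pregnancy_score(jenny) >= 0.6:   # hypothetical marketing threshold
        print("send stage-appropriate baby-product coupons")

A real system would derive its weights statistically rather than by hand, but the privacy point is the same: individually innocuous purchases, combined, can reveal a sensitive fact.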

19. Mr. Duhigg reports that the precision of Target's pregnancy prediction algorithm eventually became the source of some controversy. An enraged customer stormed into a Target store complaining to management that Target had sent his teenage daughter coupons for baby clothes and cribs. His obvious concern was that Target was planting the idea of pregnancy in his young, impressionable daughter's mind.

20. The store manager apologized for the "mistake" in person but decided to call a few days later to apologise once more. During the telephone call, however, it was the father who was contrite. He had spoken to his daughter and could now confirm what Target's algorithm had already predicted: his daughter was in need of the maternity merchandise Target had marketed to her, and the coupons would soon come in handy.

[2] Duhigg, C. (2012, February 16), "How Companies Learn Your Secrets". Retrieved from http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html



21. With the commercial utility of big data also comes the risk of privacy intrusions. In December 2009 a woman using the alias "Jane Doe" joined a class action lawsuit against Netflix Inc. ("Netflix") claiming (among other things) damages for "public disclosure of private facts".[3]

22. Netflix is a company in the business of renting DVDs and streaming movies online. It allows its users to rate films and, based on those ratings (as well as other datasets), its algorithm recommends other similar movies the consumer would likely enjoy. In 2009 Netflix wished to improve the accuracy of the algorithm it used to make these recommendations.

23. The company launched a contest called the "Netflix Prize", in which it offered $1,000,000.00 to any person who could design an algorithm more accurate at predicting user preferences than the one it was using. Netflix gave the contestants a dataset of 100 million movie ratings, along with the date of each rating, a unique ID number for each of its 480,000 customers, and information about the rated movies.

24. The company carefully removed any personal identifiers from the data in an effort to anonymize its customers. Despite this attempt at privacy protection, a team of researchers from the University of Texas[4] was able to discern the identity and sexual orientation of one user, Jane Doe, who was a closeted lesbian in a conservative town in the US. The researchers later published a report in which they claimed that if an anonymised Netflix customer rated 6 movies and the dates of those ratings were known, their algorithm could identify that customer 99% of the time.[5]

[3] Nelly Valdez Marquez et al v Netflix Inc., Claim No. C09-05903-JW-PVT
[4] Arvind Narayanan and Vitaly Shmatikov
[5] Narayanan, A. and Shmatikov, V., "Robust De-anonymization of Large Sparse Data Sets", February 5, 2008
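The intuition behind the researchers' linkage attack can be sketched in a few lines of Python. An "anonymised" rating record is matched against an auxiliary, identified source (the researchers used public IMDb reviews) by counting overlapping title-and-date pairs. The data below is invented and the matching rule is a gross simplification of the published algorithm.

    from datetime import date

    # "Anonymised" contest record: no name, just titles and rating dates.
    anonymous_record = {("Movie A", date(2005, 3, 1)),
                        ("Movie B", date(2005, 3, 9)),
                        ("Movie C", date(2005, 4, 2))}

    # Auxiliary identified data, e.g. reviews posted publicly under real names.
    public_reviews = {
        "jane.doe":   {("Movie A", date(2005, 3, 2)),
                       ("Movie B", date(2005, 3, 9)),
                       ("Movie C", date(2005, 4, 1))},
        "john.smith": {("Movie A", date(2005, 3, 2))},
    }

    def overlap(anon, known, slack_days=3):
        """Count anonymous entries whose title appears in the known set
        with a rating date within slack_days days."""
        return sum(1 for title, d1 in anon
                   if any(t == title and abs((d1 - d2).days) <= slack_days
                          for t, d2 in known))

    # The identified user whose reviews best explain the anonymous
    # record is the likely identity behind it.
    best = max(public_reviews,
               key=lambda u: overlap(anonymous_record, public_reviews[u]))
    print(best)   # jane.doe

Because rating histories are sparse and highly distinctive, very few overlapping points are needed before one candidate stands out, which is why the researchers could claim such high re-identification rates.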

25. The privacy risk associated with big data processing is not limited to internet users. Personal data can be gathered in large quantities from various sources that are not internet based. In an article published in the International Journal of Law & Information Technology,[6] Nancy King and Pernille Jessen expressed concern about the privacy implications of using smart meters to measure electricity usage. Citing research from the US Department of Energy, they write that the potential use of smart metering systems to create consumer profiles gives rise to threats to personal data protection. They explain:

“Taken to extreme, 'deployment of smart metering [systems] may lead to tracking the everyday lives of people in their own homes and building detailed profiles of all individuals based on their domestic activities'.

Analysis of detailed electricity usage data may make it possible to infer or predict (based on deductions about the way in which electronic devices in the home work), 'when members of a household are away on holidays or at work, when they sleep and awake, whether they watch television or use certain tools or devices, or entertain guests in their free-time, how often they do their laundry, if someone uses a specific medical device or baby monitor, whether a kidney problem has suddenly appeared or developed over time, if anyone suffers from insomnia, or indeed whether individuals sleep in the same room'.

Potential consumer profiling related to smart metering systems gives rise to privacy concerns about undermining fundamental privacy rights as well as threats to personal data protection."[7]
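A toy example makes the concern tangible. Even a crude rule applied to hourly meter readings can guess when a household sleeps, leaves for work and returns home; the readings and cut-offs below are invented for illustration.

    # Hypothetical hourly smart meter readings (kWh) for one day.
    hourly_kwh = [0.2, 0.2, 0.2, 0.2, 0.3, 0.4,   # 00:00-05:59 low: asleep
                  1.8, 2.1, 0.3, 0.2, 0.2, 0.2,   # 06:00-11:59 spike, then flat
                  0.2, 0.2, 0.2, 0.3, 0.4, 1.9,   # 12:00-17:59 flat: likely out
                  2.4, 2.2, 1.7, 0.9, 0.4, 0.2]   # 18:00-23:59 evening activity

    def label(kwh):
        """Crude activity inference from a single reading (invented cut-offs)."""
        if kwh < 0.5:
            return "idle (asleep or away)"
        if kwh < 1.5:
            return "light activity"
        return "active (cooking, laundry, television)"

    for hour, kwh in enumerate(hourly_kwh):
        print(f"{hour:02d}:00  {kwh:.1f} kWh  {label(kwh)}")

Aggregated over weeks, labels of this kind become precisely the household profile the authors warn about: work schedules, holiday absences and, with finer-grained data, the use of particular appliances.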

26. Other observers have also commented on the dangerous trendlines of big data processing and warn that the legal response needs to stay innovative to keep up with the changing character of the risks. In their book Big Data: A Revolution that Will Transform How We Live, Work and Think, Viktor Mayer-Schönberger and Kenneth Cukier question whether data protection laws have remained effective in protecting personal privacy. They argue that new solutions are now needed because the nature of the risk has changed. They write:

[6] King, N. and Jessen, P., "Smart metering systems and data sharing: why getting a smart meter should also mean getting strong information privacy controls to manage data sharing", Int J Law Info Tech (2014) 22(3): 215
[7] Ibid., at page 228



"The important question, however, is not whether big data increases the risk to privacy (it does), but whether it changes the character of the risk. If the threat is simply larger, then the laws and rules that protect privacy may still work in the bigdata age; all we need to do is redouble our existing efforts. On the other hand, if the problem changes, we may need new solutions. Unfortunately, the problem has been transformed. With big data, the value of information no longer resides solely in its primary purpose. As we’ve argued, it is now in secondary uses.” 8

27. The authors then question the efficacy of requiring a data controller to obtain an individual's consent before processing personal data. They opine:

"This change undermines the central role assigned to individuals in current privacy laws. Today they are told at the time of collection which information is being gathered and for what purpose; then they have an opportunity to agree, so that collection can commence. While this concept of "notice and consent" is not the only lawful way to gather and process personal data, according to Fred Cate, a privacy expert at Indiana University, it has been transmogrified into a cornerstone of privacy principles around the world. (In practice, it has led to super-sized privacy notices that are rarely read, let alone understood—but that is another story.) Strikingly, in a big-data age, most innovative secondary uses haven't been imagined when the data is first collected. How can companies provide notice for a purpose that has yet to exist? How can individuals give informed consent to an unknown? Yet in the absence of consent, any big-data analysis containing personal information might require going back to every person and asking permission for each reuse. Can you imagine Google trying to contact hundreds of millions of users for approval to use their old search queries to predict the flu? No company would shoulder the cost, even if the task were technically feasible. The alternative, asking users to agree to any possible future use of their data at the time of collection, isn't helpful either. Such a wholesale permission emasculates the very notion of informed consent. In the context of big data, the tried and trusted concept of notice and consent is often either too restrictive to unearth data's latent value or too empty to protect individuals' privacy."[9]

[8] Mayer-Schönberger, V. and Cukier, K., Big Data: A Revolution that Will Transform How We Live, Work and Think, Chapter 8, "Paralyzing Privacy" (Kindle Edition, 2013)
[9] Ibid.

28. Because of this changing landscape, it is incumbent on the Jamaican Government to regulate the "data processing industry", and the recent Data Protection Bill ("the Draft DPA") is the first step to doing so. Other Caribbean jurisdictions have already enacted data protection laws but, as at the writing of this paper, we are unaware of how effectively those laws have been enforced.

29. Before taking an in-depth look at the Draft DPA, we will assess the various ways in which "data" and "privacy of data" have been treated, even indirectly, under Jamaican laws.

NATIONAL IDENTIFICATION AND REGISTRATION BILL

30. The National Identification and Registration Bill (2017) ("NIDS") is a bill mandating the collection of personal data on registrable individuals. NIDS defines registrable individuals as those who are citizens of Jamaica and those who are ordinarily resident in Jamaica. "Ordinarily resident" takes on a meaning often seen in revenue statutes, the test being whether the individual lives in Jamaica for a period of six months in a given calendar year. NIDS allows the Authority (designated under that statute) to collect and collate data for the purposes of creating an identification number, issuing a card to individuals and maintaining a database. We understand that NIDS will be a crime prevention and planning tool but, from a data collection standpoint, the means of collection, the means of correcting inaccurate data and the obligation to provide one's data to third parties have significant implications.

31. Section 15 of NIDS speaks to the creation of the National Civil and Identification Database. The type of information to be collected and collated is limited to the very wide definition of "identity information and demographic information regarding registrable individuals". The demographic data includes biometric data but excludes DNA (as defined under the DNA Act). The definition of biometric data includes photographs or facial images of individuals, fingerprints, eye colour (taken from retina scans), manual signatures, distinguishing features and blood type. These types of data fit within the definition of "personal data" under the Draft DPA, discussed below, as they would make the data subject identifiable. In essence, NIDS concerns personal data and will involve the processing of data in the manner defined under the proposed data protection laws.

32. Section 24 of the bill provides for the issuance of a National Identification Number to individuals. There is an expectation that, under section 41 of the Bill, registered individuals will be required to use this identification number when transacting business with public bodies and may be required to provide it if a private sector body asks for it.

33. One of the data protection standards under the Draft DPA is the requirement for accuracy.[10] NIDS provides for a similar standard of accuracy in section 19(1) of the bill. That section requires a registered individual to apply to correct inaccurate, incorrect, misleading or outdated information in the Authority's[11] database. However, section 19(2) uses permissive language in describing the Authority's "obligation" to act on the individual's application. The subsection states that the Authority "may" correct the information. This wording suggests that the Authority has great discretion when deciding whether to correct inaccurate information in the database. If the Authority were to fail to respond to an individual's application to correct inaccurate information, it would be in contravention of the fourth standard (which requires accuracy of personal data and is discussed further below). Further, section 13 of the Draft DPA speaks to rectification of records (which can include amendment, blocking, erasure or destruction as may be required). In fact, the data controller has, if it is determined that rectification is required, a period of thirty (30) days after a request to rectify the records. We will discuss this further when we look at the implications, especially whether a "Google Spain"[12] right to be forgotten obligation will arise in light of the promulgation of the Draft DPA.

[10] The Fourth Standard
[11] The Authority under the bill is the Registrar General's Department and is the regulator created under the bill
[12] The Right to be Forgotten being a right to have one's online history in respect of a particular piece of data erased. Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González (2014), ECLI:EU:C:2014:317

34. NIDS has created a multi-tiered regime for data types – something which is not consistent with the Draft DPA. Specifically, biometric data is treated as particularly sacrosanct, with section 44 of NIDS wholly restricting disclosure, even to the security forces, unless a court order is obtained.

35. India is in the midst of implementing a very similar national identification system called Aadhaar, which will see the assignment of a 12-digit Unique Identification Number (UID) to over 1 billion people[13] and the collection of biometric data similar to that which will be required under NIDS. Opposition to Aadhaar in India has sparked litigation on whether a right to privacy exists. The Indian Supreme Court has ruled that such a right exists despite not being explicit in the Indian constitution, though the right can be abridged if required for legitimate purposes. A separate court will now have to decide on the legality of the requirement for a UID in the light of this "new" right to privacy.

[13] https://www.morpho.com/en/media/aadhaar-unique-id-number-all-20151123

36. Prior to NIDS, Jamaica did have sector-specific data collection regimes, such as the regime created under the auspices of the Credit Reporting Act, 2010. That Act mandated the collection of customer credit and other data that can indicate creditworthiness. Licensees under that Act have an obligation to keep the data confidential and a further obligation to ensure it is accurate and complete. This accords with the fourth standard as to accuracy under the Draft DPA (discussed further below).

PROPOSED DATA PROTECTION IN JAMAICA

37. Jamaica will be one of the few Caribbean countries to implement a data protection law. The Bahamas was the first in the region to pass data protection legislation, in 2003. However, the Bahamian statute, from anecdotal evidence, appears to be underutilized. This is a delicate time to implement data protection legislation, since the EU (the source of risk-based data protection regimes[14]) has been taking a second look at how it approaches this area of law. Nevertheless, Jamaica has chosen to base its new law on European tenets rather than the sector-specific method used in the United States (and in which direction Jamaica had started when it implemented the Credit Reporting Act). Sector-specific regulation in the U.S.A. sees data protection being rolled up into anti-trust regulation, with specific laws on credit card/point of sale data collection by retailers,[15] health records and even video rental[16] information.

[14] EU Data Protection Directive 95/46
[15] Song-Beverly Credit Card Act of 1971 (Song-Beverly), Cal. Civil Code § 1747.08
[16] Video Privacy Protection Act, 18 U.S.C. § 2710

38. Nevertheless, the growth of digital data (and the transferal/processing of data) makes it timely for Jamaica to implement legislation regulating it. It appears that Jamaicans are becoming more aware of their digital privacy rights and online identities. Recently, a furor erupted regarding the application of certain provisions of the Cybercrimes Act, 2015 in respect of social media postings under the hashtag #SayTheirNames.

39. The rise of Big Data places a greater focus on data use as opposed to data collection.[17] Accordingly, our examination of any law on data protection will focus primarily on privacy and the upholding of the statutory standards. Nevertheless, our overview of the law will start with the ambit of its application.

[17] Article 29 Data Protection Working Party, 14/EN WP 218, Statement on the role of a risk-based approach in data protection legal frameworks, May 20, 2014

APPLICATION OF THE DATA PROTECTION BILL IN THE AGE OF BIG DATA

40. The Draft DPA takes a risk-based approach to data protection, hinged on the view that the data subject is able to give informed consent to the collection of his/her data. Importantly, however, the Draft DPA is not restricted to online/digital usage. The proposed legislation also captures "manual filing systems". Also, it does not cover domestic processing of data, which section 43 defines as an individual's processing of personal data only for that individual's personal, family or household affairs (including recreational purposes). However, as seen in the application of a similar provision under the EU Data Protection Directive, what one person may construe as domestic and seemingly harmless may nonetheless be considered non-domestic. In the Swedish case of Bodil Lindqvist v Åklagarkammaren i Jönköping,[18] the European Court of Justice found that the uploading of names for a church manual without the consent of the persons included in the manual was a breach of the data protection laws. The case also makes some seminal points on the meaning of a data transfer, since the European Court of Justice found that the mere uploading of the data online did not constitute a cross-border data transfer.

[18] (2003), C-101/01, EU

41. The Draft DPA is restricted in its application to data controllers who are established in Jamaica or, if not established in Jamaica, who use equipment for data processing in Jamaica. This may exempt bodies that process data overseas, such as foreign-based computer applications, or via cloud-based services.

42. Where a data processor is not based in Jamaica but uses equipment in Jamaica for processing (and not merely for transmitting) data, the data processor must establish a local representative. The requirement will apply to companies using data processors locally, such as BPOs, since the use of the data is more than transitory. An establishment in Jamaica includes a person who has even a branch or a "regular practice" on the island.

43. The lack of extra-jurisdictional reach may be good for internationally popular web/mobile applications which depend on the Jamaican market (e.g. Facebook or WhatsApp), though these entities are likely subject to international rules on data protection. These "mega applications" are heavy users of big data sets. To that extent, the Draft DPA will have little impact on the big data processors which process the data of Jamaicans. Limiting the Draft DPA to Jamaicans and those established here is unfortunate from the perspective of the Jamaican consumer. This is especially so given that these overseas processors are becoming increasingly subject to international regulations on data processing. Jamaican regulation would therefore not be shocking, from a compliance perspective, to large data processors such as Facebook, Google or WhatsApp.

44. The flip side of the limited application is that the big data processors likely to be captured are those in the business process operations/outsourcing sector, which will primarily process the data of overseas residents. To that extent, the Draft DPA does not significantly protect Jamaicans from the dangers of Big Data. Equally, there is no substantial protection of the right to privacy of Jamaican data subjects.

45. Nonetheless, to the extent that the Draft DPA will have an impact on commercial prospects, it is necessary to review the salient parts of the Act. Further, the Draft DPA does incorporate European data protection principles to which these mega data processors are subject.

46. With few other exceptions, the Draft DPA will apply to the "processing" of "data" belonging to a living "data subject". The proposed statutory definition of "data" includes not only electronically processed or recorded biographic and demographic information but also any data recorded as part of a "filing system". This means that even data that may not be mined electronically/digitally will still be subject to the Draft DPA. The Draft DPA only covers "personal data", which is data on a living individual who is identifiable from the data provided or from a combination of the data provided and other information. Personal data may also include opinions which the data processor/controller holds on the data subject, and the courts have had to determine the limits of what falls within the definition. This definition is important because there have been EU cases in which a data subject has used the statutory request for information (set out under sections 6-8 of the Bill) in an attempt at a "fishing expedition", and the English courts have been clear that this is not the purpose of the law.[19]

[19] Durant v Financial Services Authority [2003] EWCA Civ 1746


TREATMENT OF PSEUDONYMISED/ANONYMIZED DATA

47. In the age of Big Data, the benefits of data mining must be balanced against privacy rights. As the Netflix case mentioned in paragraph 21 above shows, anonymity is often a myth. Greater reliance must therefore be placed on purpose limitation and data minimization.[20] These principles are now set out amongst eight (8) standards under the Draft DPA. The standards are as follows (an illustrative sketch of pseudonymisation and data minimisation follows the list):

a. First Standard – Data must be processed fairly and lawfully. The first standard is intricately tied to the need to obtain a data subject's consent before processing. However, under section 23 of the Draft DPA, if no consent is obtained, the data processor has an obligation to show that the processing is necessary given the contractual relationship with the other party or pre-contractual steps. This could, for instance, authorize the use of social media information for prospective employment purposes. There are other statutory "exemptions" to consent which would make processing the data lawful, such as processing for the administration of justice or in the exercise of any government functions.

Sensitive data will require consent in writing or, if consent is not obtained, must be used only for very strict purposes, such as social security purposes. Sensitive data is defined under section 2 of the Draft DPA to include genetic or biometric data, political opinions, sex life, membership in a trade union, physical or mental condition and data on the commission or alleged commission of an offence.

b. Second Standard – The second standard, set out under section 25 of the Draft DPA, pertains to "purpose limitation", which means that the data collected must not be more than is necessary for the purpose. This may mean that the literal checking of boxes, without underlying reasons, by Jamaican companies will have to cease. The purpose of any data is the purpose known at the time of collection, but that purpose can be that of the data processor or of the person to whom the data will ultimately be disclosed. Herein lies the biggest limitation on big data, as persons interested in the mining or sale of smaller data sets must ensure that the consent of the data subject is obtained before sale to a larger data miner. The ability to use Big Data will require very wide terms of consent.

[20] Article 29 Data Protection Working Party, Statement of the WP29 on the impact of the development of big data on the protection of individuals with regard to the processing of their personal data in the EU, adopted 16 September 2014

c. Third Standard – The data collected must be adequate, relevant and not excessive in relation to the purposes for which it will be processed.

d. Fourth Standard – The data that is collected must be accurate and, where necessary, kept up to date. This standard is interlinked with the right to rectification for inaccuracies set out at section 13 of the Draft DPA. Under that section, the term "inaccurate" includes omissions. Section 13(1) leaves it to the data controller to determine whether rectification is required, which means that this is a severely restricted right of the data subject. In other words, the Draft DPA gives the right with one hand but takes it away with the other, since rectification is at the discretion of the data controller.

It will be very important for data processors to understand the ambit of this provision. Curiously, given the limited jurisdiction of the Draft DPA, it will not apply to Google or similar international search engines (which do not fall within the definition of "controller") that do not have an establishment or server in Jamaica; however, it may apply to local search engines and archives such as www.jamaicagleaner.com. This leaves little chance of a Google Spain type judgment coming from Jamaica, albeit that the EU has moved to have "the right to be forgotten" explicitly set out in its proposed regulations.

The Draft DPA is silent on how the controller will determine when rectification is required. The objective of the drafters in leaving this open is quite unclear (unless it was presumed that the controller will have reference to the standards).

In Google Spain SL, Google Inc. v Agencia Espanola de Proteccion de Datos and Gonzalez,[21] the European Court of Justice held that a data processor was obliged to erase the activity of a search engine consisting of the finding of information published or placed on the internet by third parties, indexed by the search engine automatically and made available to internet users according to a particular order of preference. However, the court held that, prior to erasure, the controller should balance the data subject's right to privacy against the legitimate interests of the public in receiving the data.

[21] Ibid., n. 12

e. Fifth Standard – There will be time limitations on the storage of data. This is a new concept under Jamaican law, as entities are normally guided by the time frames for a claim under the Statute of Limitations. The Regulations under the Draft DPA will set out specific time frames for data retention; those draft Regulations have not yet been issued.

f. Sixth Standard – The sixth standard will require data processors to process data with the rights of the data subjects in mind. The rights being referred to are the ones conferred in the Draft DPA.

g. Seventh Standard – Under section 30 of the Draft DPA, data controllers must employ "appropriate technological and organizational measures" to guard against unlawful processing and accidental loss and destruction. What is appropriate is determined by the state of technological development and the cost of implementation vis-à-vis the harm that may result from a breach and the nature of that harm.



The Information Commissioner (the regulator created under the Draft DPA) must be given notice of any breach which has occurred. The Draft DPA says that the notice must be given "without undue delay" and the information in the notice should include any actual or likely effects of the breach on personal data.

The omission of timelines for notifying the Information Commissioner is significantly more flexible than the strict notification requirement (for instance, within 72 hours) under the proposed EU General Data Protection Regulation ("GDPR"). It remains to be seen whether acting "without undue delay" will be dependent on the size of the controller.

Internationally, there is a growing recognition of the need to also notify the data subjects themselves, so that they can be sufficiently informed as to how their data is being processed, including whether it is being processed unlawfully by a third party. Unfortunately, however, there is no requirement under the Draft DPA to inform the data subject of any breaches. While this is consistent with the current EU approach, it is not consistent with the GDPR. Further, it does not really allow a data subject to give informed consent with a concomitant ability to withdraw consent.

h. Eighth Standard – The final standard seeks to regulate the international transfer of data. Specifically, this standard requires a data processor/controller to ensure that the data collected is not transferred to another state or territory unless that state or territory provides an adequate level of protection for the data.
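To make the pseudonymisation and data minimisation measures discussed in this section more concrete, the short Python sketch below shows two common techniques: replacing a direct identifier with a keyed hash, and dropping every field not needed for the declared purpose. This is illustrative only; the Draft DPA does not prescribe any particular technique, and all field names and values here are invented.

    import hashlib
    import hmac

    # Secret key held separately from the dataset; if it leaks together
    # with the data, the pseudonymisation is defeated. (Invented value.)
    SECRET_KEY = b"store-me-in-a-separate-key-vault"

    # Data minimisation: only fields needed for the declared purpose survive.
    FIELDS_NEEDED = {"parish", "year_of_birth", "electricity_usage_kwh"}

    def pseudonymise(record):
        """Swap the direct identifier for a keyed hash and drop every
        field not required for the declared processing purpose."""
        token = hmac.new(SECRET_KEY, record["national_id"].encode(),
                         hashlib.sha256).hexdigest()[:16]
        minimal = {k: v for k, v in record.items() if k in FIELDS_NEEDED}
        minimal["subject_token"] = token
        return minimal

    record = {"national_id": "0000000000",      # invented
              "name": "Jane Doe",
              "parish": "St. Andrew",
              "year_of_birth": 1990,
              "electricity_usage_kwh": 412.7}
    print(pseudonymise(record))

As the Netflix episode in paragraph 24 shows, pseudonymisation alone does not guarantee anonymity; it is simply one of the "appropriate technological and organizational measures" a controller might point to under the seventh standard.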

LITIGATION ISSUES: REMEDIES AVAILABLE IN THE SUPREME COURT

48. Section 71 of the Draft DPA creates a new cause of action which allows an individual to bring a claim against a data controller who contravenes the Act. Section 71(1) is worded in the following way:



An individual who suffers damage by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that damage.

49. The English courts have seemingly vacillated on the issue of whether the term "damage" should be interpreted as meaning "pecuniary loss"[22] only.

50. In Johnson v Medical Defence Union,[23] the claimant was an orthopaedic surgeon whose professional indemnity policy with the Medical Defence Union ("the Union") was terminated unilaterally. The Union's decision to end the policy was based on a risk assessment exercise which uncovered previous allegations of professional misconduct by the claimant.

51. The claimant alleged that his personal data had been unfairly processed in breach of the first data protection principle. He sued the Union for (among other things) compensation for the distress[24] caused by the alleged breach.

52. He lost at first instance as well as in the Court of Appeal. The grounds on which the Court dismissed the appeal made it unnecessary to consider whether the claimant suffered "damage" in the sense envisaged by section 13 of the English Data Protection Act 1998 (which is the equivalent of section 71 of the Draft DPA). The court considered the issue anyway and determined that "damage" was limited to pecuniary loss. Buxton LJ considered the EU directive which section 13 of the English Act was intended to implement and said:

There is no compelling reason to think that 'damage' in the directive has to go beyond its root meaning of pecuniary loss.[25]

53. In the later decision of Vidal-Hall et al v Google,[26] Lord Dyson stated that Buxton LJ's reasoning was obiter and therefore not binding.

[22] Financial loss
[23] (2007) 96 BMLR 99
[24] A type of non-pecuniary loss
[25] At paragraph [74]



54. In that case, Google had collected information about the claimants' internet usage without their consent or knowledge. The claimants suffered no economic loss but claimed compensation for distress under section 13 of the Data Protection Act.

55. Google filed an interlocutory application to set aside the first instance judge's decision to allow the claimants to serve the originating documents outside the jurisdiction. One of the issues that arose was whether the claimants had a good arguable claim for compensation under section 13 of the Data Protection Act.

56. Google argued that the claimants' claim was not arguable if there was no financial loss. The court determined that interpreting "damage" to be limited to financial loss would result in section 13 being incompatible with the relevant EU directive. It therefore relied on the case of Benkharbouche v Embassy of the Republic of Sudan[27] to disapply section 13(2). The Benkharbouche case was one in which the court explained that where a provision in a national law of an EU member state conflicts with EU law, the offending provision in the national law should be disapplied. The disapplication of section 13(2) in Vidal-Hall resulted in the term "damage" meaning any loss, whether pecuniary or non-pecuniary. In his judgment, Lord Dyson said:

"What is required in order to make s 13(2) compatible with EU law is the disapplication of s 13(2), no more and no less. The consequence of this would be that compensation would be recoverable under s 13(1) for any damage suffered as a result of a contravention by a data controller of any of the requirements of the DPA."[28]

57. Section 71(2) of the Draft DPA is the equivalent of the now disapplied section 13(2) of the English Act. It allows an individual to recover compensation if the data controller's breach of the Act causes him to suffer distress.

[26] [2016] 2 All ER 337
[27] [2016] 1 All ER 816
[28] At paragraph [105]



58. However, based on the wording of the subsection, such compensation for distress is only available if either:

a. the claimant has also suffered damage (pecuniary loss) under section 71(1); or

b. the breach in question related to the processing of personal data for artistic, literary or journalistic purposes.

59. The exact wording of the section is as follows:

An individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation, from the data controller, for that distress if-

(a) the individual also suffers damage by reason of the contravention; or

(b) the contravention relates to the processing of personal data for the special purposes.

60. A Jamaican court called upon to interpret this section would not need to consider its compatibility with EU law. So, notwithstanding the disapplication of section 13(2) of the English Act, section 71(2) of the proposed Jamaican legislation will probably be interpreted in the same way Buxton LJ interpreted the English equivalent in Johnson v Medical Defence Union.

61. This means that, in Jamaica, if a contravention of the Draft DPA does not relate to the processing of data for a special purpose,[29] a claimant would first need to show that he suffered economic loss before recovering compensation for distress. In the eight years that elapsed between the decision in Johnson (2007) and the decision in Vidal-Hall (2015), the courts in England seem to have found a way around this limitation.

62. In Halliday v Creation Consumer Finance Limited,[30] Arden LJ awarded nominal damages under section 13(1) so that the claimant could recover more substantial damages for distress under section 13(2). It remains to be seen whether the Jamaican courts will adopt this approach.

[29] i.e. artistic, literary or journalistic purposes

63. Despite there being limited scope for awarding compensation for distress, the cause of action created by section 71 of the Draft DPA could prove useful and wide-ranging.

64. This is certainly the view Peter Thompson QC expressed in his article titled "21st Century Problems".[31] However, the learned Queen's Counsel has noted that the provision is rarely used in England. He asks:

Why is this wonderful new cause of action so little used?

65. He then offers two explanations. First, he speculates that the English courts prefer to decide cases about the disclosure of personal data based on a litigant's privacy rights under article 8 of the European Convention on Human Rights rather than on a breach of data protection principles. He writes:

"There have been two problems. First, the seminal case about Naomi Campbell's visit to Narcotics Anonymous (Campbell v Mirror Group Newspapers [2002] EWCA Civ 1373, [2003] 1 All ER 224) started off as a DPA 1998 case but because a "public interest" defence was raised under s 32(1), the courts treated the issue as turning on whether the Art 8 right to respect for private and family life outweighed the Freedom of the Press under Art 10. In this case and in subsequent "kiss and tell" cases the courts have given decisions based on these two Articles of the European Convention on Human Rights, because this is territory with which they are familiar; and DPA 1998 has been in shadow."

[30] [2013] EWCA Civ 333
[31] Thompson, Peter, "21st century problems", 166 NLJ 7703, p 6



66. The second problem the learned Queen's Counsel identifies is the same problem we discuss in paragraphs 60 to 63 above; that is, relief for distress under section 13(2) is limited. He writes:

“The second problem has been the inclusion in DPA 1998 of s 13(2), which provides that there can be no general damages for distress unless either the publication was for the purposes of journalism or the claimant has also suffered “damage”, which seems to mean special damage or financial loss. As mentioned earlier, the journalism cases have been upstaged by HRA 1998; and this awkward little sub-s 13(2) seems to mean that, journalism apart, the cause of action under s 13 is conditional on proof of actual financial loss, which will rarely be present in a claim about the invasion of privacy or misuse of personal data, except in cases of identity theft.”

67. The Draft DPA's limits on the recovery of compensation for distress may cause litigants in Jamaica to rely on other legal avenues to recover for the misuse of private and personal data that causes no financial loss and is unrelated to any special purpose. Section 13(3)(j) of the Jamaican Charter of Fundamental Rights and Freedoms may be that legal avenue. That section states that the Charter guarantees the right of everyone to "respect for and protection of private and family life, and privacy of the home" as well as "protection of privacy of other property and of communication".

68. In Tomlinson v Television Jamaica Limited,[32] the constitutional court made it clear that the Charter is not only enforceable against the government but is also enforceable in a dispute between private citizens. Sykes J said:

"By any analysis the legislators have used words that make it plain that one private citizen can seek to enforce any right being infringed by another private citizen. For the first time in Jamaica's constitutional history we now have explicit horizontal application of fundamental human rights. It may be argued that under the old bill of rights horizontal application was possible. That would have been an argument from implication. Now, it is explicit and there is no need for an argument from implication."

[32] [2013] JMFC Full 5

69. We have not been able to find any cases in which a litigant invoked section 13(3)(j) of the Charter to recover damages against a private citizen. However, with the growth of data processing in our society and the increasing recognition of the dangers of big data, litigants in Jamaica may be willing to test how far the courts will go in granting compensation for distress under this section.

CONCLUSION

70. While the advent of Big Data may appear to many to be a developed world concern, the use of certain technologies means that big data sets very likely include the data of Jamaican nationals. However, the limited scope of Jamaica's Draft DPA does not necessarily provide protection from the risks of Big Data, given (a) the limited jurisdictional application of the statute and (b) the restrictions on the scope of the compensation the courts may award against a data controller.

71. Nonetheless, like NIDS and the Credit Reporting Act, the Draft DPA will give the living individuals to whom it applies a near-proprietary right to their data.


