
The Crisis in Data Privacy

Internet users casually offer up their valuable and sensitive personal data to any online entity that promises even a hint of convenience in exchange. Their carelessness is giving rise to theft, wasteful consumerism, political polarization and, ultimately, chaos. To assess the damages and search for antidotes, Luckbox constructed a panel discussion from the thoughts that five data-security experts expressed in separate interviews. Most found hope in a combination of regulation and personal initiative.

ALAN BUTLER is an attorney who serves as president and executive director of the Electronic Privacy Information Center. @ALANINDC

REBECCA HEROLD is CEO of Privacy & Security Brainiacs, belongs to the Energy Trends Working Group of ISACA (formerly known as the Information Systems Audit and Control Association) and is the author of Managing an Information Security and Privacy Awareness and Training Program. @PRIVACYPROF

BEN MALISOW is an IT data security instructor, writes study guides for data security tests and is the author of Exposed: How Revealing Your Data and Eliminating Privacy Increases Trust and Liberates Humanity. @BENMALISOW

CARISSA VÉLIZ, an associate professor at the University of Oxford in the United Kingdom, wrote Privacy Is Power: Why and How You Should Take Back Control of Your Data. @CARISSAVELIZ

PAM NIGRO is an adjunct professor at Lewis University and serves as Everly Health Solutions’ vice president of information technology and security officer. She is ISACA board vice-chair. @EVERWELL_HEALTH

How important is data privacy?

NIGRO: You’re being programmed by algorithms. At some point, the line gets really blurred between what was your decision and what an algorithm has suggested for you. Suppose I’m looking for a new TV. Based on algorithms that draw upon where I live, my job, my zip code and 500 or more other data points, they may direct me toward a TV with more bells and whistles than I need because they know I can afford it. The algorithms are like a used car salesman, except you don’t understand the tactics behind them.

B U T L E R : Algorithms and other automated systems tend to create echo chambers called filter bubbles that manipulate basic facts, polarize people and have an impact on mental health. The systems try to maximize engagement, which means either clicking on something or watching something for longer than a few seconds. An early version of YouTube was notorious for offering recommendation chains that sent people down these rabbit holes that led to distorted or even fictional conspiracy theories.

VÉLIZ: The loss of privacy that we are suffering is having a tremendous effect on democracy. Personal data is being used for personalized propaganda that is much more corrosive than old-fashioned mass propaganda. In the past, if the president said something, you would discuss it at the water cooler and the dinner table. Journalists and academics would publish articles about it. Now, companies create so many different ads that they’re never challenged in the public sphere because journalists can’t watch all of them.

HEROLD: I’m an expert witness in trials. In one of my cases, a smart device helped a defendant find his intended victim hiding in a hotel. He almost beat her to death in front of their shared 4-year-old daughter. And people can copy your profile and photo and then pretend to be you. Then they try to dupe your friends into giving them money or ask them about what they have in their homes so they can be targeted for physical break-ins and assaults.

MALISOW: I hate to say it, but it’s time to give up on regaining privacy. We’re going to have to go with full transparency whether we like it or not. I suggest we come to terms with that and understand what that means before it arrives—because we’re going to have growing pains that will make the Industrial Revolution look like a joke. What happens when you find out your loved one is taking money out of your bank account without letting you know? What happens when you find out that one of your political leaders is a paid lackey of a hostile foreign government? These things are all going to cause a lot of difficulties, and it’s going to happen within 20 to 25 years.

What does the public get wrong about data privacy?

HEROLD: Fewer than half of all Americans think about privacy very much. But after they find out their data has been misused, they often become some of the biggest proponents for privacy. Someone has pretended to be them, and now they have bad credit and they owe millions of dollars. Or their reputation is shot, and they can’t get a job anywhere because it looks like they’re a big crook. That’s when they start realizing data privacy’s not about hiding things.

NIGRO: We’ve equated privacy and secrecy, and they’re not the same thing. Just because you want to keep something private doesn’t mean it’s a secret. It just means it’s nobody else’s business. There are certain things people don’t need to know, like how much is in my checking account and what’s in my 401(k). I can provide you with information to buy your product, but I shouldn’t be on your website or in your database for the next 20 years.

VÉLIZ: This idea that personal data is your own property and that privacy’s your personal preference is completely misleading. Privacy is collective. Rarely can we share personal data without sharing things about others. If you share your genetic data, for example, you’re sharing data not only about the obvious people—like your parents, your siblings, your kids, your cousins—but also very distant kin, whom you might have never met and never heard of. When you share location data, you’re sharing data about your neighbors and about the people you work with. When you’re sharing data about your personality, you’re sharing data about people who share those personality traits.

Are people adequately compensated when they provide their data?

NIGRO: In 2013, the value of Google’s search engine service to a user was estimated at about $500 a year. I’m sure it would be more now. You get it for free because they gather and use the data that they uncover about you. They make thousands of dollars from that data, so you have to ask yourself: Do you really want to exchange it for so little?

VÉLIZ: What do I care if Facebook knows who my friends are or what kind of car I drive? People don’t realize that data is being used to infer very sensitive things, things like their sexual orientation, their IQ, their ability to pay back a loan—and that’s going to be used against them. Loss of privacy has implications for society at large, too. In the case of Cambridge Analytica, only 270,000 people agreed to give their data to the political firm. From that data, the company got access to the data of 87 million people who were friends of those 270,000 people. Those 270,000 people didn’t have the moral authority to sell their data for $1 because that had consequences for the whole of society.

How bad can data-tracking get?

BUTLER: People don’t like being tracked. It feels icky. Some are convinced Facebook is listening to their conversations through their microphone, but engineers have determined definitively that social media sites don’t actually hear the words you’re saying. The scary thing is they don’t need to hear your conversations to know what you’re thinking because of the amount of data they are collecting about your behavior all over the internet.

VÉLIZ: The filter bubbles that people live in on the internet are smaller than they used to be. They used to include thousands of people, but now they’re for really, really reduced communities. Advertisers used to guess what messages would resonate with audiences, but now data scientists are getting so close to predicting responses that it’s like voodoo. We’re fighting armies of people who are trying to mess with our minds.

Does data-tracking worsen inequality?

VÉLIZ: We’re not being treated as equal citizens anymore, and that should really worry us. It erodes the social fabric and our trust that the system is going to be impartial when we apply for a loan, or a job or an apartment. It means that when you call customer service, you’ll get a quicker answer if you’re likely to buy more things.

MALISOW: But things will change. Once we have total access to information about everything and everyone, the environment is going to become a lot more peaceful. If I know that the person presenting cash to me stole it, I’m going to choose not to do business with them. If I know that my stalker is in my neighborhood, I can respond before he shows up at my front door with a knife. If you’re doing something bad, you might stop because you’re going to get caught.

VÉLIZ: It just doesn’t work like that. You neglect the fact that there are so many things in life where we already have transparency. One example is race. Race is apparent to all and that doesn’t make discrimination go away.

The privacy paradox states that internet users value privacy but fail to expend the effort to protect it. What’s your reaction?

BUTLER: Since the early days of the internet, the government has let companies drive the process of crafting and implementing privacy rules. It’s essentially a self-regulatory model. And privacy policy leaned heavily on terms of service documents that grew in length and complexity—and obfuscation. No one really reads these documents because it’s not practical. The user is put into an impossible position of having to be a lawyer, an engineer, and an expert on systems design and user interface—and it’s untenable. What’s more, studies have shown that people mistakenly assume a privacy policy is set up to protect their privacy.

VÉLIZ: People say that they care about privacy but then don’t act in a way that shows that they have concern. There’s a lot of defeatism out there. People think, “Well my data is out there anyway. Why am I going to inconvenience myself if I can’t fight this?” It’s like saying you’re going to drive without a seatbelt.

HEROLD: On the 25th page of the consumer agreement, in tiny font, it may say you’re giving us the ability to contact you about other products. And you’re going to let us share your data with “trusted third parties” that are selling that data.

What’s your view of the General Data Protection Regulation, or GDPR, that the European Union imposed in 2018?

BUTLER: GDPR is still very much a work in progress, and enforcement mechanisms are still being developed and implemented. What’s changed is that companies that operate in Europe now have to examine their processes and actually think through what they’re doing. Sometimes CEOs didn’t know what data their companies were collecting.

VÉLIZ: It’s been a success because it’s changed the conversation forever, and it’s had a tremendous effect worldwide. It also has a lot of faults. We don’t have enough resources to police it. It says you can collect personal data if people consent, or if you have a legitimate interest. And a lot of companies have interpreted legitimate interest very, very, very broadly to cover basically anything that is convenient to them.

BUTLER: Post-GDPR, companies are using privacy as a method of competing in the marketplace. Apple, for example, has rolled out App Tracking Transparency, which severely restricts what developers can do with data and gives users the ability to prevent apps from collecting data that they don’t want collected or doing it under the radar.

HEROLD: GDPR goes beyond companies based in Europe. If you have workers, customers, consumers or contractors in the EU whose personal data you have collected and are responsible for safeguarding, then you need to follow the EU GDPR. Some very large fines and penalties have been applied for noncompliance.

MALISOW: Not being in compliance with the GDPR is a lot more frightening to businesses than having the data exposed. If you steal my clients’ personal data, that may expose me to some repercussions. But those repercussions aren’t on the same scale as what the regulators do as a punitive measure. And the weird thing is that those punitive measures don’t make any restitution to the damaged parties.

How does the California Consumer Privacy Act, or CCPA, differ from the GDPR?

BUTLER: The California law is still very much in flux. It’s only a couple years old and already it’s been substantially overhauled. It’s significantly narrower in scope than the GDPR. It empowers users to block the sale of their data, which is only a limited subset of the data privacy and data use ecosystem.

HEROLD: California is more specific. In Europe, almost every type of organization has to comply. The California law applies only to residents of California and only to organizations with over $25 million in annual revenues. So, they’ve cut out applicability to small- and medium-sized businesses.

Will we see federal regulation of data privacy? What obstacles stand in the way?

HEROLD: In Q1 of 2021, federal lawmakers introduced at least 128 bills that either wholly or in part addressed privacy issues, and that was an 83% increase from Q4 of 2020. Then when we got to Q2 of this year there were 152 more bills.

BUTLER: A comprehensive federal privacy law will be passed—it’s just a question of when and with what format. Consumer privacy advocates have argued that the federal law should set a baseline but shouldn’t prevent states from improving regulation, especially as technologies evolve and change. Technology companies have shown an appetite for federal legislation, in part because of the problem that a patchwork of different overlapping state laws would cause.

NIGRO: The scary part is that the federal government’s National Security Agency is just as guilty as all of the businesses when it comes to collecting this data on you. What if the government classifies you as a troublemaker? You’re put into that box and they’re watching everything you do.

MALISOW: Lawmakers in the United States are making laws with the best of intentions to protect the citizenry. But these are the people least qualified to be regulating technology because they’re the least qualified to understand it.

Should regulators enforce data privacy protection, or should regulation empower users to protect their own privacy?

BUTLER: These can’t be user-driven decisions because the users plainly don’t have sufficient information to make these choices. You often have to go to a third-party website and make decisions about turning these things on and off. It’s a recipe for disaster. We need new legal systems and new software systems that recognize that the user is not going to make a million nuanced decisions.

VÉLIZ: Because GDPR relies on consent, companies use it to put all the burden on individuals. The default is data collection, and then you have to say no if you don’t want it. That’s ridiculous.

What aspects of data privacy don’t get enough attention?

VÉLIZ: One thing that doesn’t get enough attention is the importance of forgetting. Once you’ve made a mistake and it’s on Google, you will never get over it. A society that doesn’t grant second chances is a very ruthless society. Another tricky thing about data is that it’s invisible. When somebody cuts your skin, it hurts, it bleeds. When all your data is stolen, you feel nothing. And then two years down the line, you have no way of getting a job because you’ve been blacklisted.

NIGRO: When things have changed in the United States, it has always come through grassroots critical mass. We need a consumer champion for privacy—like Ralph Nader was for cars.

Is it too late to save privacy?

MALISOW: Yes. That ship has sailed, and we can’t close Pandora’s box.

VÉLIZ: There’s no such thing as too late because degrees matter. At heart, I’m an optimist, so I think we’re going to end up regulating these companies. Facebook CEO Mark Zuckerberg said in 2010 that privacy norms have evolved and that people don’t care about it anymore and want to share. Many people actually bought that.

Final thoughts?

MALISOW: As much as I don’t like the idea of Facebook and Apple having my data, I’m more afraid of a centralized authority being the arbiter. A demigod or tyrant will abuse his authority and use that information against you.

VÉLIZ: Privacy is power. If we give up our privacy to governments, we risk authoritarianism. If we give it up to companies, we risk corporatocracy. Try your best to protect your privacy. Even if you fail, you will leave a paper trail of trying to protect it. That matters a lot, because in a year or two, when regulators look at that company, they will see it did horrible things.

NIGRO: The Constitution doesn’t explicitly protect privacy. The Fourth Amendment, which prohibits unreasonable searches and seizures, says the government can’t break down your door and search your papers. That’s it. There’s no amendment for privacy. I could legally sit out in front of your house with a camera and record everything you’re doing.
