
Jessica Gross ’17
Jessica Gross ’17 is only six years out of law school but has risen fast and is now Associate Corporate Counsel in Privacy & Data at Sony PlayStation. While at California Western, she was a Justice Kennedy Scholar, the Editor-in-Chief of the International Law Journal, and a member of the Moot Court Honors Board. After several years working in government tackling cybercrime, she made the transition to corporate cybersecurity— and could not be happier about where she’s ended up!
She sat down with California Western to discuss her career thus far and the complex challenges data privacy lawyers tackle every day.

How did you end up at California Western and how did your experiences here help shape your career path?
It was obvious from the tour! California Western far exceeded any other school I visited. They introduced me right away to the people in Career and Professional Development Services. They put me in touch with students and faculty. I could tell it was a school that had an obvious commitment to student success. And when I arrived on campus, I liked everything and I did everything. I had great professors, who taught me everything I needed to pass the Bar.
One of the most pivotal experiences I had was the Jessup Moot Court Competition with my coaches Professor Aceves, Professor Bobbie Thyfault ‘84, and Kate Clark ‘10. It helped me so much in improving my legal research. I learned how to break down a statute, how to write briefs, and how to fine-tune persuasive, analytical reasoning. The competition problem in my first year was actually modeled on Edward Snowden and WikiLeaks— it was called “The Frost Files.” It dealt with hacking another country’s cyberspace— which relates directly to the work I’m doing now with PlayStation on cybersecurity and data privacy.
That was when I started learning how the internet really works and how technology is continually becoming more enmeshed in our daily lives. It’s a little like Black Mirror, the cutting edge of things, where policy and law intersect with technology and business and how they influence our daily choices and behaviors. No one else was investigating this at the time.
While at CWSL, you clerked at the Department of Justice, with the Attorney General, and in the San Diego County Appellate Court. What drew you to government work originally?
To be honest, I’m a total nihilist— in the sense that I think not a lot matters unless you decide that it does. So you have to choose— do you want to do good or not? And I’ve always thought that if I’m going to be spending most of the hours of my life on something, I want it to have a positive impact. Initially, I wanted to work in the government and tackle cybercrime that way. After graduation, I did a clerkship with a magistrate judge. It gave me a certain pedigree that law firms like to see, and I learned a lot from my encounters with different lawyers and their lawyering styles.
But the truth is, there aren’t many cyber-focused roles in government. I interned with the Attorney General, and at the time they didn’t really have anyone working in that area. The feds are focused primarily on protecting children and preventing child pornography, which is essential, of course, but there’s very little activity on the other issues in cybersecurity, like privacy, incident response, wire fraud, and ransomware.
I would definitely recommend government work for anyone interested in getting trial experience. I think any law student, no matter your interest area, should clerk in an appellate court, because it forces you to do a lot of writing and to appear before judges. You have to be able to communicate effectively, to be appropriate, objective, and fair— convincing but not manipulative. And they will put you in there!
Why did you decide to transition from government work to corporate law?
Well, government officials just have no idea how the technology of the internet works. And what that means is that when privacy or cybersecurity laws are made, officials and regulators don’t understand how those laws can or will be implemented, or how companies will actually behave to get into compliance. I call it “cookie banner nonsense”— having transparency and choice about how your data is collected matters, but even I get annoyed by cookie banners everywhere. It’s a perfect example of how these laws don’t really fulfill their intent, and of how officials and regulators fundamentally misunderstand how the tech actually works.
But that makes for tons of work for lawyers. And when you’re on the corporate side, you get to work globally, because there is a whole hacking ecosystem with global operations. Plus, businesses operate globally, and pretty much everything on the internet can be accessed globally. On top of that, there are so many different policies across every state and nation, so when you’re developing contracts and policies for a company like PlayStation, you’re trying to achieve “demonstrable compliance” with thousands of competing laws and regulations.
It’s exciting and challenging, because for the most part these are all brand new laws. One of the jokes I used to have about the California Consumer Privacy Act was that it was the “choose your own adventure” law because it had so many typos and subdivisions to nowhere. There is no case law, there is no precedent, so it requires a ton of statutory interpretation. What I love the most is taking all the mumbo jumbo and putting it into practice, making it make sense.
Can you break down what your day-to-day looks like at PlayStation?
PlayStation has sixteen different studios all designing new games and tinkering with existing ones. And it’s a vast tech ecosystem, including the games themselves, the platforms they’re played on, and any third-party tech and software that might be integrated into a game. Whenever a person plays a game, data is collected about certain of their actions and activities (depending on their privacy settings).
So it’s my job to know— What kind of data is being collected? Where is that data going? Will it be stored or destroyed? Who will be able to see it, if anyone? What might it be used for? And will it be sold?
I work with developers to review games to make sure data is being collected according to an individual’s chosen privacy settings, not allowing leakage. I also help craft policies on the types of data that will be collected and where it will go, making sure our policies are aligned globally across the hundreds of games we publish.
What kind of data would a game be collecting? Like, how many times I turn left in a racing game?
Ha! Well, a game is essentially a really complex computer program. So, for it to work, the computer program will collect data. For the most part, that data goes into game development — making the games better. But the bigger question is about personal information— there are over a hundred different laws about what kind of personal data can or can’t be gathered.
PlayStation, like all game developers, will gather gamers’ personal identifiers. Depending on those gamers’ privacy settings, we might use that data to help inform business strategy or develop new products. For example, PlayStation recently developed controllers specially designed for people with certain disabilities. There is also a big initiative in the studios to include more accessibility settings in games— for color-blindness, hand-eye coordination problems, motion sickness, and deafness, for example. Given the sensitivity of this data, we take measures to ensure we’re being transparent with gamers and treating their data with respect.
But some companies out there have a business model that dictates getting as many data points as possible and then selling them to companies who ultimately create conglomerated profiles of people. So, if all your gameplay data were connected to a conglomerated profile and then sold to a third party, that would be a big concern.
And you don’t do that?
We don’t do that. Ultimately, PlayStation’s goal— which is very achievable— is to reach one billion active users, but to do so in a way that respects our users and the trust they place in us. And we don’t believe that you need to buy and sell people’s data to deliver premium gaming experiences. You just need committed developers and artists who make our games come to life.
Can you say more about the risks to consumers in terms of data privacy?
Sure. It’s all part of this vast ecosystem, unfortunately. This kind of data collection is sometimes called “surveillance capitalism.” Companies treat data like an asset, because they can make millions or billions of dollars off of it, usually by selling it to advertisers who are always looking for these specified profiles. It’s only more recently that laws have started to catch up to consumer expectations. People believe that they should own their data and decide what to do with it. So do I.
But then there are a lot of other parts of the ecosystem. Credit reporting is part of the ecosystem, and companies like Equifax and Experian have had massive data breaches in the past. There have also been cases of rental companies trying to buy data to determine if people would be reliable to pay their rent. In a previous job, I also saw one instance of a database of profiles that was part of a data trust for which members of a particular political party were named as the beneficiaries (and people’s data was the trust’s property).
The most nefarious cases, though, are identity theft and crimes against children. Children tend to use a lot of tech and play games just as much as anybody else, but they are the most susceptible. They don’t know the consequences of sharing their data, and they could be susceptible to influential advertising and other predatory behavior. So, there’s a lot at stake in privacy and cybersecurity.
What do you think the future of regulation looks like in cyberspace?
Every state and country is slowly catching on to the technical reality of what’s happening— that data is being collected, misused, and exploited all over the place. But there is a copycat effect in regulation, largely driven by Europe, where privacy laws are becoming more stringent and better enforced.
I expect that for some time it’s going to be a mess. It’s going to be a patchwork of laws across the globe. And eventually certain companies will lobby the right people for change. But we’re not going to have a federal privacy or data breach statute any time soon.
Things will change when a company like Microsoft— a global company that’s worth more than some countries’ GDPs— or an association of such companies is spending $500,000+ to get into compliance with a single state law, one that might change on a moment’s notice based on a particular legislature’s whim, and decides to put that money behind lobbying for standardized laws instead. Then legislatures will start to listen.
I know you only graduated from Cal Western six years ago, but what are your reflections on your career thus far?
It’s gone way better than I ever thought! When I first started law school, I was focused on the US News Rankings and thought, “How am I ever going to get a job?” But I am just as successful as I would have been if I had gone to the “top-ranked” schools. I took advantage of opportunities that were given to me. I worked hard. I talked to people. I stayed connected.
I feel grateful for the classes I was able to take at California Western on privacy and white collar crime. And California Western has some tremendous programs, like the New Media Rights clinic, of course. I had great career advisers, and I was trained incredibly well.
If you’d told me before law school that I’d be in the position I’m in, providing counsel to a global company, I would have said “Yeah, right.” But it turns out, you’re just as successful as you want to be.