Photo by Rory Earnshaw

Privacy may soon be a dated notion. In this tech-driven world, it has been argued that data is quickly becoming the most highly valued currency of our time. It has taken Lesley Weaver many years and a lot of dedication to understand the nuances of cyberspace in order to advocate for our very human rights. Weaver works to protect what little privacy we have left in the most public of arenas, tackling huge cases against the most famous of players – Google, Facebook/Meta, TikTok. You know them, you love them, you agreed to the terms and conditions when you clicked the box.

When Weaver began in the industry, most people thought of cookies as something to be enjoyed with a glass of milk. Today, the public is beginning to understand the complex ways in which seemingly minor incidents of data collection through cookies and other often hidden mechanisms impact us, and that they have potentially catastrophic ramifications. Weaver believes, “It is critical to our society to have the ability to decide what you are sharing and with whom.”

It wasn’t until the Cambridge Analytica case against Facebook that Weaver began to understand the scope of what we’re truly up against when it comes to privacy. In 2019, Facebook paid $5B to the Federal Trade Commission for consumer privacy violations. In late 2022, the company agreed to pay $725M to resolve a class action accusing it of sharing users’ private data with third parties, making user data available to entities like Cambridge Analytica, a data analytics firm that worked with the Trump campaign. Huge numbers and significant results, yes. Enough to make a dent in a behemoth like Meta? That remains to be seen.

Weaver is the head of Bleichmar, Fonti & Auld’s Antitrust and Consumer Litigation teams. She currently serves as co-lead counsel in Calhoun v. Google, a nationwide consumer data privacy class action: Google said Chrome would not collect users’ data if they did not “synch” their accounts – according to Weaver, the research shows it most certainly has. Tracking users’ activity across the internet is worth billions of dollars. “We are being constantly surveilled, watched, controlled and provoked,” Weaver says. “It's a pretty big demon that we have out there.”

Weaver is an esteemed member of the 2023 Lawdragon 500 Leading Plaintiff Consumer Lawyers.

Lawdragon: What brought you to the law?

Lesley Weaver: I grew up in Elkhart, Ind., a midwestern industrial town populated by the people who owned its factories and the people who worked in them. My family’s ethic emphasized education and hard work. I actually considered pursuing a career in the theater, but that did not fully answer my need to make a difference in the world in which I live. Now I have the opportunity to do both, because as lawyers we are putting pieces together and framing stories.

I started out litigating intellectual property matters, but it did not capture my attention. I'm somebody who has to be internally motivated. Then I was invited to interview in a plaintiffs’ firm and I thought, “Oh wow, this is fascinating.” 

I began litigating securities cases in 1999. By the early 2000s, many dot-com companies were going under because they were not complying with the most basic business fundamentals. Because the gatekeepers were not doing their job, accounting fraud in particular was widespread. 


LD: And what brought you specifically to privacy?

LW: Fifteen years ago, when people were talking to me about cookies, I wondered if this was really an important issue. Even as recently as five years ago, many people were saying, “I have nothing to hide. I'm not concerned about my privacy.” It was not until I took a case to trial on the issue that I understood the impact myself.

LD: What changed for people?

LW: It became a question of understanding what was happening with the information collected about people and that you could not put the genie back in the bottle once it was out there. The Cambridge Analytica scandal was a watershed moment because people caught a glimpse of what was being collected about them and how it was used. 

There is an immense asymmetry of information between the people collecting and using the data, and consumers themselves. This is an industry that grew up very quickly and way ahead of regulators. There are amazing things about technology. There are also issues if there aren't gatekeepers. There’s always a tension in any growth period between expanding new ideas and being reckless about the consequences.

LD: What was the first privacy case that you tried?

LW: It was in 2015 against a company called Positive Singles – an STD dating website. The owner of the company had created a dating site for people with certain diseases. However, without telling the people who paid to join the site, he mixed all of the people into one database, with their photos and information, regardless of what group they thought they had joined. There was a domain name called Positive Singles, but there was also a Christian singles website, as well as domain names such as womenwithgenitalherpes.com and swingers.com. There were hundreds of them. Our clients had signed up for Positive Singles, but they were all put in the same dating pool with people who entered through these different domains and thus were associated with them. One woman had a 14-year-old daughter whose face had appeared on a website for swingers.

The right to privacy on this issue is so compelling. If you have an STD and you’re dating, you have to “have the talk” at some point, to tell a prospective partner about the STD. People were on these websites trying to find people who would understand what they've gone through. It’s a very tender admission. It was not meant to be made available to the general public or to any audience they had not selected themselves.

We had four fabulous and compelling plaintiffs, and the jury and judge were respectful and thoughtful. We recovered 100 percent of economic damages – $1.7M. Then we got $15M in punitive damages, which felt like exactly the right number.

LD: How did you approach the case?

LW: We framed the case to say this is not about sex or disease. This case is about a broken promise – the company said it would not reveal people's identities to anyone who was not also on the same website. Instead, their names and faces were thrown out to thousands of people they had not chosen to identify with and they were labeled with those domain names.

It is critical to our society that people have the ability to decide what they are sharing and with whom. It's critical for self-development, and it is a fundamental right. I'm a lesbian. If I had been outed before I was ready to accept that identity and any consequences, that could have been a very unhealthy thing for me. Each person needs the right to determine what to reveal about themselves – health status, financial status, emotions, decisions – and when, how, or if to reveal it at all.


LD: Absolutely.

LW: The Cambridge Analytica scandal in 2018 was really the beginning of seeing real-world harm from privacy violations. It revealed how data can be used to manipulate people. Alexander Kogan was an app developer on Facebook’s platform who collected data about people through an online quiz and, unbeknownst to the users, sold it to Cambridge Analytica, which was serving both the Brexit campaign and Donald Trump. Cambridge Analytica used that data to target people in 11 U.S. states – people they called “lazy liberals” – encouraging them to stay home and not vote. Yet people did not know they were being targeted in the first place, let alone what characteristics made them targets.

This is what happens when you don't know who you're talking to or what they know about you – you don't know how to protect yourself or evaluate the information you are receiving. There are entire regulatory regimes focused on disclosing who is speaking to you – for example, in advertising, lawyers and politicians have to identify themselves. If the police are arresting you, you get your Miranda rights. That’s because you need to understand the consequences of what you’re sharing so you can make a decision about what to disclose.

LD: Meanwhile on social media…

LW: On social media, the goal of the company is to get a response from you. People respond when they're emotional, angry, sad. What the companies found is that these active emotions tend to keep people on the platforms.

So there are two layers to what is happening. The first is that the platform provokes strong reactions, primarily negative ones, to stimulate you and keep you there. The second is that they are trying to get you to take an action: to buy things, to vote, or even to engage in a pattern of behavior. But if you are being targeted without understanding why you are the target, or what someone knows about you, you are at an unfair disadvantage. The use of data can also mask discriminatory and predatory practices. It can allow people to be targeted for their vulnerabilities – addiction, financial distress, health challenges – without even telling people they have been put in those categories. Only now are regulators and people in general beginning to understand what’s at stake.

LD: So Cambridge Analytica was a real eye-opener.

LW: Yes. Facebook “hoovered up” so much information about people and paid little attention to what happened to it. They were moving fast and breaking things. And an entire ecosystem developed that was not respectful of individual integrity and choice.

There's a whole field of study about online disclosures, which describes them as “dark patterns.” Dark patterns are user interfaces that obscure rather than disclose how you are being targeted. For example, if you have spent time with privacy controls, you may have experienced what it’s like to click one button, only for another to appear, and now you have to switch a toggle. You may even end up accepting data collection you meant to turn off. It is often confusing and can be an enormous waste of time. Studies show that most people just click “accept” because they can't figure it out and can't be bothered. And if your children's school requires you to be on Facebook to communicate with their teachers, or to use a Google Chromebook, you have to accept the terms of service. But how is that really consent?


LD: Could you tell early on that the Cambridge Analytica matter was going to be a culture-shifting case?

LW: It felt important from the start. I started reading everything I could to try to understand the data flow and its value. There's a book by Shoshana Zuboff called “The Age of Surveillance Capitalism.” She was one of the first tenured women at Harvard Business School. The book is incredible, and walks through the economic import of data.

In 2018, even though we worked with experts, there was a lot less known about the data flow itself. Through our work, we have all learned a lot about data collection and how it is used. It was shocking to realize how little is actually disclosed about what is happening. That's a pretty vulnerable place for people to be. It's as if you are naked in front of a one-way mirror. The companies know everything about you, but you have no idea who is behind the mirror and who or what is presenting these stimuli to you, or why.

Studies show that living under constant surveillance, and experiencing a correlated loss of control, leads to increased levels of stress, fatigue and anxiety. Now at least people are more aware of the effect this is having on children, and of the addictive nature of our phones and social media. Right? So, wouldn't it have been helpful to have had more information 15 years ago, before everybody was letting their kids go on social media platforms?

LD: Right.

LW: At their core, many of these practices are not disclosed, and so these cases are about misrepresentation and fraud. Courts initially dismissed privacy claims, finding that there was no concrete harm. There are scores of privacy cases out there where the plaintiffs lost at every stage. In the Facebook case, we framed it instead as a contract claim, and ultimately settled for $725M, the largest privacy class action settlement to date.

LD: It is scary with these mammoth companies. That’s a great result, but is that going to change anything?

LW: I hope so. With our settlement, Meta submitted two sworn declarations identifying what the company is now doing to try to protect user data. And I would say they're doing a much better job. The main issue in our case was a practice called “Friend Sharing,” and Meta is no longer allowing that.

LD: What is that?

LW: The idea was that if my friend downloaded an app, the app could collect not just the data of the person who downloaded the app, but also all the data that the downloading person’s friends shared with that person. Here is an example: there was an app called Pikinis, which collected photos of people in bikinis. If one of your Facebook friends downloaded that app, and you had shared photos of your family on vacation with the friend, then your photos of your daughters in their bikinis ended up on Pikinis. Of course, you have no idea what apps your friends download, and you had no control over that. And unless you downloaded Pikinis yourself, you would not even know your daughters were on the app. 

LD: Wow.


LW: So, in the Cambridge Analytica example, around 270,000 people downloaded the “ThisIsYourDigitalLife” app. However, as many as 87 million people’s data was implicated, because of friend sharing. The goal, from the company’s viewpoint, was to collect as much data about people as possible, and they did so without regard to the impact on the human beings who were their “users.” The company’s defense was that people consented, because friend sharing was disclosed in the terms of service beginning in 2009. But most people did not understand that at all.

LD: What about your Google cases?

LW: Chrome clearly promises that it will not send your data to Google if you do not synchronize your accounts. The case is very simple: We tested it and you can see the data flow. Contrary to that clear promise, Chrome does send your data to Google even if you are not synching. And when Google takes your data and synchronizes it, you are living in a surveillance state – because Google is all over the internet.

In a similar case, Google runs a real-time bidding auction that happens billions of times a day. The auction is in response to a bid request that describes you and offers those who wish to target you the opportunity to send a message. You're holding your phone right now, you're sitting in this location and you just did X, Y and Z – who wants to send you an ad right now? They run an algorithm, participants bid and you get that ad – all in the blink of an eye.

These ads are targeted based on data and the categories they put you in. The case claims that many of those categories are highly explicit. If you look at the complaint, it includes specific categories, such as extremely sensitive health issues.

What if I'm not really in that category? How did I get in that category? Is my insurance carrier doing something? Shouldn't I get the right to decide who has that information?

LD: What is the status of the Google Chrome case?

LW: The Court found, on summary judgment, that users had consented to the taking of their data even when they were not synched. So we're hopeful that we'll have a different outcome in front of the 9th Circuit. I really think this should go to a jury.

LD: Yes, and there should be media covering every day of it.

LW: We are just beginning to get our heads around what it means to have an audience all the time for everything. What does it mean to society if you can't be private? What does it mean for the children?

The whole thing about humanity is holding each other with gentleness and tenderness and allowing each other, and ourselves, the time to understand the world and where we are in it. If instead we are being constantly surveilled, watched, controlled and provoked, it's inherently unhealthy. I believe the first step is that we should get to decide who is learning what about us.

LD: What does it mean to you to be part of this fight for privacy rights?

LW: We kicked off both the Facebook case and the Google Chrome case and then the pandemic happened. People on our teams had personal issues, family members were ill, family members died. In times like that you have to ask yourself what really matters. Is this really important? And it still felt very important, especially as the world relied even more on online interaction.

I'm not saying disclosures are the perfect answer, but if people have more information about what is happening with data aggregation, I think they might behave differently. They might lobby for more regulation, or they might choose not to have an entirely virtual life. I think it's really important that we tell people what's really happening and let them choose.