By Matthew Heller | August 25, 2025 | Lawyer Limelights, News & Features, Plaintiff Consumer Limelights
Carol C. Villegas and Michael P. Canty led the case for users of Flo Health, whose sensitive reproductive data was being shared with Facebook without their consent.
On August 1, a federal jury in San Francisco delivered a historic verdict in a class action lawsuit against Meta Platforms that has the potential to rewrite the rulebook around the use of private data. The social media giant that owns Facebook, the jury found, had violated a California privacy law by using confidential data it had harvested from Flo, the popular women’s health app, to sell advertising. Meta could now be on the hook for billions of dollars in statutory damages under the California Invasion of Privacy Act.
The case appears to have been one of the first in which Big Tech has been held liable for misusing consumer health information. “I think this is the first time that consumers have been given an opportunity to tell Big Tech how they feel,” says Carol Villegas, a partner in the New York office of Labaton Keller Sucharow, the co-lead counsel for a class of millions of women whose private information was funneled from the Flo Health app to Meta. “The people have spoken, and I think it needs to be a wake-up call for these major technology companies. They need to really change the way they're doing things.”
To try the case, Labaton teamed up with attorneys from two other boutique law firms — Lowey Dannenberg and Spector Roseman & Kodroff. During the two-week trial of Frasco v. Flo Health Inc. in U.S. District Court for the Northern District of California, they faced a formidable corps of defense lawyers from firms including Latham & Watkins, Gibson Dunn, and Dentons. But the plaintiffs’ side not only obtained the CIPA verdict against Meta but also secured a mid-trial settlement from co-defendant Flo Health.
The evidence was powerful: Five courageous women took the stand to share their experiences as Flo Health users, supported by expert testimony in computer coding and artificial intelligence, and reinforced by internal Meta documents. Together, they revealed how Meta illicitly penetrated the most intimate details of women’s lives – information about menstruation and pregnancy – through a package of tools, the “software development kit” (SDK), that it had supplied to Flo Health.
The testimony from the users was pivotal in the case. “Can you imagine going into a courtroom with a hundred people and talking about your period?” Villegas asks. “That's really hard. And it was hard for these women to do, but they did it. They stood up for all women and they did it.”
For Villegas, it was nothing less than a “David and Goliath” triumph. “When we got the verdict, it was an incredible feeling for all of us sitting at that table,” she recalls. “But I think the most poignant moment was walking out of that courtroom and seeing the women together, holding hands, crying.”
“This,” Villegas adds, “is a big deal. This case is a very big deal.”
* * *
The road to vindication was a long one, starting in 2021 when plaintiffs’ attorneys filed seven proposed class actions against Flo Health and Facebook. Later that year, a judge in San Francisco consolidated the suits and appointed Labaton as co-lead counsel. According to the consolidated complaint, Flo Health violated users’ privacy between November 1, 2016, and February 28, 2019, by disclosing “intimate details” about their sexual health and menstrual cycles to third parties through SDKs incorporated into its app, even though its privacy policy stated, “We will not transmit any of your personal data to third parties unless we have asked for your explicit consent.”
The claim against Facebook relied on a pre-internet statute that can be challenging to apply in the digital world, but one that this legal team used strategically and, ultimately, successfully. Under CIPA, whose eavesdropping provision appears at Penal Code Section 632, it is illegal to intentionally, and without the consent of all parties, use an electronic recording device to “eavesdrop upon or record” a confidential communication. The statute provides for penalties of $5,000 per violation. According to the plaintiffs, the Meta SDK embedded into the coding of the Flo Health app constituted a recording device that eavesdropped on consumers as they communicated with the app.
Meta argued, among other things, that any eavesdropping was done by the app and that because it tells app developers not to send it any information, it had no intent to collect data.
The case went to trial at a propitious moment as Big Tech companies are increasingly using AI to enhance their targeted advertising. Data collected from consumers helps to train “machine learning” advertising algorithms. At the same time, notes Christian Levis, a partner at Lowey Dannenberg and co-counsel on the case, Big Tech has been “lobbying across multiple states to … limit or substantially curtail privacy rights to allow them to collect information to fuel advertising businesses and other things.” A tech industry-supported bill currently pending before the California Legislature would exempt “communication intercepts for a commercial business purpose” from CIPA.
The Flo Health case “is one of the first times that a jury has been able to look behind the curtain of data collection from Meta specifically and Big Tech generally,” says Diana Zinser, co-counsel and partner at Spector Roseman. “I think it was eye-opening for them to learn about the sheer volume of data that's being collected and also how it's being used.” Consumers, she adds, can make an “informed decision about whether or not they want to use certain apps, certain platforms” but “they need the information to make that informed decision. And I think this case showed that right now they are not getting that information to make an informed decision.”
* * *
Villegas, a former assistant district attorney in Richmond County, New York, joined Labaton as an associate in 2008 and made partner in 2016. During her tenure at the firm, she has amassed extensive experience litigating complex securities fraud and consumer cases, and is currently overseeing cases against Boeing, DocuSign, Nike, and Amazon, among others. She served as co-lead trial counsel alongside Labaton partner Michael Canty, a former Assistant U.S. Attorney in the Eastern District of New York who has extensive experience trying high-stakes criminal and civil cases. Canty previously secured a $650 million Facebook BIPA settlement in Illinois, one of the largest consumer data privacy recoveries ever under the Biometric Information Privacy Act. Zinser brought her expertise in consumer protection and data privacy litigation while Levis was, according to Villegas, the team’s “technical guru. He is just really, really good at understanding the tech. He used to code apps, he knows the lingo, but he also knows how to explain it to a jury.”
The attorneys, Villegas says, set up a “war room with probably upwards of 18 people at times working almost 20 hours, 21 hours a day” and formed an efficient unit. “We collaborated, we exchanged ideas, and every decision we made, we agreed on. And I think that's just a testament to our firms and the fact that there was no ego. It was really about how are we going to get the best result for these women in this class.”
It was Villegas who delivered the plaintiffs’ opening statement. According to Canty, she “set the tone for the whole case. We always talk about primacy and sending a message to the jury. And we did it with a sober review of the facts and evidence that we were going to present.”
Among other things, Villegas explained to the jury that Meta gives SDKs to app developers for free and, in exchange, developers “agree to share data from their app and allow Meta to record it” for Meta’s advertising purposes. In Flo’s case, it allowed Meta to record “whether it was a goal for a woman to get pregnant or not get pregnant” and “information about whether she was ovulating and where she was in her cycle.”
“This code sat on the app and it recorded women's private information about their bodies, a lot like spyware would sit on a computer, listening in and recording all of the details of what a woman puts into an app,” she said.
After previewing the evidence for the jurors and walking them through Flo’s privacy policy, Villegas concluded, “You are the jury that gets to decide how seriously Big Tech takes women's privacy,” adding, “I'm confident that you will come to the undeniable conclusion that Meta and Flo violated the privacy rights of millions of women.”
* * *
Each side had only 12.5 hours in which to present their case. As a result, Villegas recalls, “One of the tactical decisions was whether we wanted all the [named plaintiffs] to testify.” Ultimately, they decided that the jury needed to hear from each of these women, a choice that set the tone for the entire trial as each recounted the experience of entrusting private information to the app and the violation she felt on learning it had been shared without her knowledge or consent.
The witnesses were varied: from younger women monitoring their fertility, to older women in perimenopause and menopause. They used the app for different reasons, but their testimony shared a throughline: an expectation of privacy that was upended. That cross-section of users ultimately proved compelling to the jury.
An exchange with witness Jen Chen lays out a relatable consumer expectation:
Q: Did you consider the answers you gave to the Flo app to be private?
A: Absolutely.
Q: And how did it make you feel when you found out that Facebook was recording this information?
A: I felt violated.
Q: Would you expect those health apps to keep your information private?
A: Absolutely.
Q: That's because it's your medical information, right?
A: Exactly.
Levis then handled the direct examination of Dr. Serge Egelman, a data security expert at the University of California, Berkeley, who, in preparing for his testimony, had downloaded 18 versions of the Flo Health app to help determine exactly how Facebook’s SDK was recording the communications of users. Perhaps most significantly, he emphasized to the jury that it was Facebook’s code that was illegally transmitting users’ private information to Facebook. "That's the actual transmission here, not the Flo app," he testified.
Egelman also explained to the jury that when Meta got that information, if the Flo Health user was also a Facebook user, it could match the information to all the other information that it had collected from the Facebook user and use that to target ads more effectively. “A key part of the case was taking something that could be very technical and seemed very boring and very dry and making it understandable and interesting,” Levis says.
The plaintiffs’ other expert was Dr. Jennifer Golbeck, a computer scientist at the University of Maryland who educated the jury about how Meta used the data collected from Flo Health to train its machine learning advertising algorithm and how that algorithm then delivers ads. As Villegas said in her opening statement, “No one at Meta will ever be able to tell you why you get one ad over another. They don't know. The algorithm knows, and it's very good at what it does.”
* * *
Another critical component of the plaintiffs’ case consisted of internal Meta documents that their lawyers used in their questioning of company employees and that helped to demonstrate that Meta had the requisite intent to “eavesdrop upon or record” a confidential communication.
The employees included Steve Satterfield, Meta’s VP and Associate General Counsel for Privacy, who initially testified under questioning by Canty that SDKs are merely “business tools” and that Meta tells app developers not to send information to it. But Canty produced a document in which Satterfield told staff that Meta uses “event data” from apps for ads. “I think it's a fair statement to say that we benefit from app event data that we receive,” he admitted from the witness stand.
Satterfield also testified that he was “aware of the risk that app developers would send us certain information that we didn't want to receive and that could have included health information.” In a May 2018 document, moreover, Meta software engineer Tobias Wooldridge warned that “advertisers can inadvertently or intentionally send sensitive information, including health information, in custom fields via website, app, and offline events. This poses risks, as we are unable to cleanly delete data, as well as policy and PR issues.”
“Meta could have said to Flo, 'You're forbidden from using our SDK at any time,’ correct?” Canty asked Wooldridge.
“Yes,” he replied, going on to admit that Meta did not disable the Flo app from sending data during the class period.
Such testimony gave Canty ample ammunition for a closing argument in which he said Meta could easily have protected the privacy of Flo Health users but “took no real steps” to stop the recording. “You don't get to come in here and say with a straight face you didn't have intent to collect [data] when all of the objective evidence points to the fact that you collected it, recorded it, used it, exploited it, profited from it,” he said. “That's intent, ladies and gentlemen of the jury.”
“A verdict for Meta here essentially sends a message that they can collect this private health information without consequence,” he concluded. “And that's not consistent with the law.”
In the event, the jury returned a verdict of liability under CIPA against Meta. “One of the questions they had to answer is whether or not Meta had consent to collect the information,” Canty recalls. “And it was a resounding ‘No.’ If you look at the verdict sheet, there was a very bold X through that ‘No.’ And I think that sent a message: Look, you didn't have consent. Despite your three or four examples where you claimed you did, you didn't.”
“I think that's what consumers want,” he adds. “They just want to be told clearly and succinctly, ‘What's going on with our data? Are you taking it? What purpose are you using it for?’ And that didn't happen.”
Canty also notes that Meta executives prevaricated in their testimony when asked whether information about a woman's menstrual cycle is private and sensitive. “Ultimately, the best we got was, ‘Well, it's contextual, it's the woman's information. It's really up to her on how and where that information is disclosed.’ And I said in my closing, ‘The problem here is that Meta stole that decision from these women.’ And I think that that really resounded with the jury. Yes, people may have different levels at which they want to share their sensitive information, but it's up to them. It's not up to Meta to just unilaterally take it across the board.”
* * *
According to Villegas, the damages award against Meta could be “in the billions just because of the number of people in California that are involved that were using the app.” Meta is expected to file post-trial motions and, if those are not successful, appeal to the 9th U.S. Circuit Court of Appeals.
“We feel very confident,” Zinser says. “We feel that the verdict speaks for itself. Everyone who worked on this case has put in a great deal of time and effort – the associates, the support staff, everyone who was prepping throughout the trial. I think that the work that we’ve put into it is going to pay off in the end with the damages that we hope to get to all the class members.”
“A couple more hills to climb, but we’re optimistic,” Canty says.
No matter what, though, the verdict has already resonated throughout the legal community: privacy attorneys are tracking the successful application of CIPA, and more consumers and legal teams are exploring ways to hold Big Tech accountable. The fact that the jury found intent in Meta’s actions could pave a path for holding SDK developers liable under CIPA in other cases.
For Villegas, the trial repudiated the idea that “this is how advertising works and this is just normal. It’s not.” Yes, people may like getting targeted ads for the brands that they like. But this case was about choice, privacy, and accountability. “What we want Meta to think about,” says Villegas, “is that when you're dealing with confidential sensitive information, we want you to do more. And now you know that consumers want you to do more. You heard it directly from them.”