Meta CEO Mark Zuckerberg leaves the federal courthouse in downtown Los Angeles after defending the company in a landmark social media addiction trial on February 19, 2026 in Los Angeles, United States.
John Putman | Anadolu | Getty Images
More than a decade ago, Meta — then known as Facebook — hired researchers in the social sciences with the goal of analyzing how the social network’s services were affecting users. It was a way for the company and its peers to show that they were serious about understanding the benefits and potential risks of their innovations.
But as Meta’s court defeat this week shows, researchers’ work can become a liability. Brian Boland, a former Facebook executive who testified at both trials — one in New Mexico and the other in Los Angeles — says the damaging findings of Meta’s internal research and documents appear to contradict how the company portrayed itself publicly. Juries in two trials determined that Meta inadequately monitored its site, causing harm to children.
Mark Zuckerberg’s company began cracking down on its research teams a few years ago after Facebook product manager Frances Haugen became a prominent whistleblower. A newer generation of tech companies, including OpenAI and Anthropic, subsequently invested heavily in researchers and charged them with studying the impact of modern AI on users and publishing their findings.
With so much attention now being paid to the harmful effects AI can have on some users, companies must decide whether it is in their best interests to keep funding such research or to suppress it.
“There was a time when there were teams that were created internally that could start to look at things and, for a brief window, you had some absolutely excellent researchers who were looking at what was happening on these products with a little bit more free rein than I think they have today,” Boland said in an interview.
Meta’s two court losses this week stemmed from different cases but shared a common theme: The company did not tell the general public what it knew about the harms of its products.
Jurors had to evaluate millions of corporate documents, including executive emails, presentations, and internal research conducted by Meta’s employees. The documents included internal surveys showing that a percentage of teen users received unwanted sexual advances on Instagram. There was also research, which Meta eventually shut down, suggesting that people who curbed their Facebook use became less depressed and anxious.
Plaintiffs’ lawyers in the cases not only relied on internal research to make their arguments, but those studies helped strengthen their position regarding Meta’s alleged culpability. Meta’s defense teams argued that some of the research was out of date, taken out of context and misleading, presenting a flawed view of how the company operates and approaches user safety.
‘Both sides of the story’
“The jury got a chance to hear both sides of the story and a very fair presentation of the facts, and they had to make a decision based on what they saw,” Boland said. “And both juries, in very different cases, came up with clear verdicts.”
Meta and Google’s YouTube, which were also defendants in the LA lawsuit, said they would appeal.
Lisa Strohman, a psychologist and attorney who served as in-house expert counsel for the New Mexico suit, said Meta and other tech industry leaders may have thought they could use internal research to their advantage, winning public favor.
“I think they failed to recognize that the researchers are parents and family members,” Strohman said. “And I think what they failed to understand was that these people would not be bought off.”
Whatever public relations victory the companies were hoping for, the strategy risked backfiring once the research became public. The most damaging episode for Meta came in 2021, when Haugen, a former Facebook product manager, turned whistleblower and leaked a trove of documents showing that the company was aware of its products’ potential harms.
Former Facebook employee Frances Haugen speaks during a hearing of the Energy and Commerce Subcommittee on Communications and Technology on Capitol Hill in Washington, DC, on December 1, 2021.
Brendan Smialowski | AFP | Getty Images
Kate Blocker, director of research and programs at the nonprofit Children and Screens: Institute of Digital Media and Child Development, said Haugen’s “revelations were a turning point globally – not just for companies but for researchers, policy makers and the broader public.”
The leak also led to major changes at Meta and across the tech industry, which began winding down research that could be deemed unfavorable to companies. CNBC previously reported that several teams studying alleged harms and related issues had been disbanded.
Some companies also began removing certain tools and features from their services that third-party researchers used to study their platforms.
Blocker said, “Companies may now view ongoing research as an obligation, but support of independent, third-party research must continue.”
Most of the internal research used in this week’s trials did not include new revelations, and many of the documents had been previously released by other whistleblowers, said Sacha Haworth, executive director of the Tech Oversight Project. What strengthened the trials, Haworth said, were “the very emails, the very words, the very screenshots, the internal marketing presentations, the memos” that provided the necessary context.
As the tech industry now pushes aggressively into AI, companies like Meta, OpenAI, and Google are prioritizing products over research and safety. It’s a trend that concerns Blocker, who said that, “Like social media before it, there is limited public visibility of what AI companies are studying about their products.”
“It seems that AI companies are mostly studying the models themselves – model behavior, model interpretation and alignment – but there is a significant gap in research regarding the impact of chatbots and digital assistants on child development,” Blocker said. “AI companies have a chance not to repeat the mistakes of the past – we urgently need to establish systems of transparency and access that allow these companies to share with the public what they know about their platforms and support further independent evaluation.”
Watch: Regulatory pressure after landmark social media verdict.