Court Papers: Meta Halted Research After Harm

Unsealed court filings allege Meta stopped an internal study that found short breaks from Facebook reduced anxiety, loneliness and negative social comparison. The documents form part of a broad lawsuit by U.S. school districts accusing major platforms of hiding risks to children.

Unsealed filings say Meta shelved a study showing harm

What Project Mercury reportedly tested

The portion of the case described in court papers alleges Meta worked with an outside survey firm to measure the short‑term effects of a forced break from the platform. Participants who temporarily stopped using Facebook — and in some descriptions, Facebook and Instagram — were then surveyed about mood and social comparison. Plaintiffs say the results were clear enough that internal researchers described a causal effect on social comparison, but the company discontinued the work rather than publish or scale it.

Allegations beyond the single study

Meta’s response and the company line

Meta has pushed back on the characterisation in the filings. A spokesman said the Project Mercury study was halted because of methodological problems, and reiterated that the company has worked for years on teen safety features. The company has also moved in court to seal many of the underlying documents, arguing plaintiffs are trying to unseal an overly broad set of materials. Plaintiffs counter that those documents are central to their allegations that the platforms concealed known risks.

Where this sits in a longer story

These new filings arrive against a backdrop of public scrutiny dating back several years. In 2021, internal slides and research from the company prompted wide debate over whether Instagram and other social services contribute to body‑image anxiety and other harms among teenagers. The company then publicly disputed some media interpretations of those materials while also releasing annotated decks and announcing product changes aimed at young users. The new court papers amplify that earlier debate by asserting that the company sometimes suppressed or sidelined findings it did not like.

Why a seven‑day deactivation study matters scientifically — and legally

A short deactivation experiment is a relatively direct way to probe causation: if users who stop using a service report better well‑being than a comparable control group, investigators can infer a causal link between usage and the reported outcomes. The strength of that inference depends on how participants were selected, whether the control group was genuinely comparable, and whether deactivating introduced other changes (for example, more sleep or reduced exposure to specific content) that could explain the effect. Plaintiffs argue these issues were weighed internally and that the findings were still meaningful; Meta says the methods were flawed. That dispute over methodology illustrates why scientific detail matters in court, and why a company's internal research can be legally consequential.
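To make the logic of such an experiment concrete, here is a minimal sketch in Python using entirely made‑up, simulated numbers (nothing here reflects Meta's actual data or results). It compares well‑being scores for a hypothetical deactivation group against a control group, then uses a permutation test to ask how often a gap that large would appear by chance if group membership made no difference:

```python
import random

random.seed(0)

# Hypothetical, simulated well-being scores (higher = better) for a
# mock one-week deactivation experiment. These numbers are invented
# for illustration only.
deactivated = [random.gauss(5.6, 1.0) for _ in range(200)]
control = [random.gauss(5.2, 1.0) for _ in range(200)]

# Observed difference in mean well-being between the two groups.
observed = sum(deactivated) / len(deactivated) - sum(control) / len(control)

# Permutation test: repeatedly shuffle the group labels and measure
# how often a random split produces a gap at least as large as the
# one observed. A small p-value suggests the gap is not chance.
pooled = deactivated + control
trials = 5000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[:200]) / 200 - sum(pooled[200:]) / 200
    if diff >= observed:
        count += 1
p_value = count / trials

print(f"observed gap: {observed:.2f}, p ~ {p_value:.4f}")
```

The same caveats from the paragraph above apply to any real version of this design: randomization, comparable groups, and confounding side effects of deactivation all determine whether the arithmetic supports a causal claim.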

Broader legal and policy consequences

The filing is part of a broader wave of litigation and regulation targeting social platforms’ responsibilities toward minors. School districts are framing their claims around the harm they say platforms cause students in classrooms and school communities, and the costs that schools absorb when students suffer mental health crises or are exposed to online predators. If courts accept the plaintiffs’ view that companies intentionally concealed internal findings, it could reshape discovery standards and accelerate regulatory pressure on content algorithms, age verification, and disclosure of internal safety research.

What to watch next

  • Pre‑trial motions about sealing and discovery: plaintiffs are pushing to make internal documents public; Meta is fighting those requests and seeking protection for sensitive materials.
  • Evidence of product‑level tradeoffs: the case hinges on whether internal discussions show executives accepted safety risks in pursuit of growth.
  • Regulatory fallout: lawmakers and antitrust regulators are already watching these developments closely; additional unsealed material could catalyse new legislative or enforcement action.

What this means for research norms

The episode highlights tensions between corporate research cultures and independent science. Companies routinely run experiments to shape products, but when those experiments touch on public health — especially the mental health of young people — the expectations for transparency grow. Independent researchers and advocates argue that when private experiments reveal population‑level risks, the findings should be subject to independent scrutiny; companies counter that raw internal work can be misinterpreted and that context matters. The courts will now be asked to weigh those competing claims as part of the discovery process.

Next procedural date

The district court has scheduled a hearing on discovery and sealing disputes for January 26, 2026, which could determine how much of the disputed internal record becomes public during litigation.

For journalists and policymakers, the immediate story is less about one slide deck or one experiment and more about whether large platforms will be required to treat safety research as a public interest resource when it implicates the welfare of children. The coming weeks of motions and hearings will decide how much of that evidence the public — and the courts — can see, and how accountability for design choices that affect young users will be pursued going forward.

Mattias Risberg is a science and technology reporter at Dark Matter, based in Cologne. He covers semiconductors, space policy and data‑driven investigations into technology companies.

