New evidence has emerged suggesting that Meta, the parent company of Facebook and Instagram, buried internal research indicating that its platforms harm users' mental health. The details come from unredacted court filings in a class action lawsuit brought by US school districts against Meta and other social media companies, and they raise serious questions about the safety of these platforms.
According to documents obtained during discovery, Meta's internal research project, codenamed "Project Mercury," found that users who deactivated Facebook for just one week reported significantly lower levels of depression, anxiety, and loneliness. Rather than publishing these findings, Meta allegedly halted further research, internally attributing the results to the "existing media narrative" about the company.
Internal communications reveal that staff members acknowledged the validity of the findings. “The Nielsen study does show causal impact on social comparison,” one researcher stated. Another expressed concern that suppressing negative results resembled tactics used by the tobacco industry, which concealed evidence about the dangers of smoking.
Despite this internal research linking its platforms to negative mental health outcomes, Meta told Congress that it had no way to quantify whether its products harmed teenage girls.
Meta spokesman Andy Stone defended the company, saying the study was abandoned because of methodological flaws and emphasizing that Meta has actively worked to make its products safer. "For over a decade, we have listened to parents and made real changes to protect teens," he said.
The cover-up allegations appear in a filing submitted by Motley Rice, a law firm representing school districts nationwide. The lawsuit accuses Meta, along with TikTok, Google, and Snapchat, of intentionally hiding the known risks of their platforms from users and guardians.
The plaintiffs argue that these companies not only encourage children under 13 to use their platforms but also fail to adequately address child sexual abuse material. The filing further alleges that TikTok sponsored the National Parent Teacher Association, with internal communications boasting about the company's influence over the organization and its ability to promote TikTok's agenda through it.
Specific allegations against Meta include:
1. Designing youth safety features that are largely ineffective and underutilized.
2. Setting an unreasonably high threshold for removing users flagged for sex trafficking.
3. Acknowledging that optimizing for teen engagement increased exposure to harmful content, and proceeding anyway.
4. Delaying efforts to protect minors from potential predators over concerns about business growth.
5. Downplaying child safety, with CEO Mark Zuckerberg indicating it was not his primary focus amid other corporate priorities.
Stone rejected these claims, asserting that Meta's safety measures are effective and that the company promptly removes accounts flagged for sex trafficking.
The internal documents referenced in the lawsuit are currently sealed, and Meta is seeking to strike them from the record. A hearing is scheduled for January 26 in the US District Court for the Northern District of California.
As the case develops, it is likely to intensify the debate over social media platforms' responsibility for the well-being of their users, particularly children and teenagers.