The European Union has preliminarily found that Meta and TikTok failed to adequately protect children on their platforms. The findings indicate that the companies created obstacles that hindered the reporting of child sexual abuse material (CSAM) and other harmful content. As a result, each may face fines of up to 6% of its global annual turnover.
According to the EU’s preliminary findings, both companies breached child protection provisions of the Digital Services Act (DSA). Specifically, they were found to have unlawfully obstructed researchers trying to assess children’s exposure to illegal or harmful material. Meta was additionally faulted for making it unduly difficult to report illegal content, including CSAM; the EU has accused the company of employing “dark patterns” that deliberately confuse users attempting to file reports.
Legal Challenges and Potential Consequences
Both Meta and TikTok will have the opportunity to review the EU’s findings and submit their responses. If those responses are deemed inadequate, the companies could face financial penalties potentially reaching €6 billion (approximately $6.4 billion).
In a separate development, Meta is grappling with lawsuits filed by multiple US states accusing the company of knowingly designing its apps to be addictive, particularly to teenagers. Reports indicate that Meta’s in-house legal counsel advised the company to keep certain findings related to teen harm confidential. Meta’s legal team contends that this advice is protected by attorney-client privilege, a claim expected to be scrutinized in court. The first of these lawsuits is scheduled to go to trial next year.
These findings and the ongoing litigation could significantly affect how Meta and TikTok operate, particularly their policies on user safety and content moderation. The scrutiny from regulators and courts underscores the growing focus on digital platforms’ responsibility to protect vulnerable users, especially children.
As the situation develops, both companies will need to navigate these challenges carefully, balancing compliance with regulatory expectations against the concerns raised by various stakeholders.