Meta, the parent company of Instagram, Facebook, and Threads, has taken significant action in response to Australia’s new legislation restricting social media access for users under 16. Between December 4 and December 11, 2025, Meta reported that it removed access to more than 500,000 accounts believed to belong to minors: over 330,000 Instagram accounts, approximately 170,000 Facebook accounts, and nearly 40,000 Threads accounts.
The age restriction law came into effect on December 10, 2025, marking a pivotal moment in digital safety regulation worldwide. Alongside Meta, platforms such as TikTok, Snapchat, X, YouTube, Reddit, and Kick imposed similar restrictions, deactivating or deleting accounts identified as belonging to users under 16. The move reflects a growing trend toward safeguarding young users online.
Impact of the Ban on Youth Engagement
Since the ban took effect, there has been a noticeable shift in how young people spend their time. According to data collected by News Corp, book sales have risen by 3.1 percent, suggesting that more young people are turning to traditional forms of entertainment. Retailers also report increased sales of board games, card games, and puzzles.
Prime Minister Anthony Albanese expressed optimism about these changes, noting during the announcement of the ban that other nations are considering similar measures. Albanese acknowledged the difficulty of removing children from social media: “It won’t be perfect because this is a big change.” He emphasized that success depends on ongoing conversations between parents and children about social media use, which he believes are crucial for future generations.
Meta’s Response and Future Considerations
In a recent update, Meta reiterated its commitment to complying with the new law while advocating alternative approaches to online safety. The company urged the government to explore methods beyond blanket bans, suggesting industry incentives aimed at delivering safer, privacy-preserving, and age-appropriate online experiences.
Meta also proposed moving age verification to app stores rather than the apps themselves. The company stated, “To ensure all teens are protected online, we believe legislation should require app stores to verify age and obtain parental approval before their teens under 16 can download an app.” This approach, Meta argues, would establish consistent protections for young users across applications and prevent the “whack-a-mole effect” of minors hopping to new apps to bypass restrictions.
As Australia’s measures influence discussions globally, the balance between youth protection and freedom in the digital landscape remains a critical issue. Meta’s call for a collaborative approach highlights the necessity for innovative solutions that address the complexities of online engagement for minors. The ongoing developments in this area will likely set precedents for other countries contemplating similar legislation.