Roblox, a popular online gaming platform, has committed to introducing enhanced safety measures to protect children from online grooming, with implementation planned by the end of 2025. The decision follows concerns raised by Australia’s eSafety Commissioner regarding the platform’s compliance with the country’s stringent online safety codes and standards.
The new safety protocols include making accounts for users under 16 private by default and introducing tools that prevent adult users from contacting minors without parental consent. Several features, including direct chat and ‘experience chat’ within games, will be disabled for children in Australia until they undergo age estimation. Even once a child under 16 completes this process and has chat enabled, they will remain restricted from chatting with adults.
Roblox is also introducing parental controls that allow parents to disable chat for users aged 13 to 15, supplementing existing protections for users under 13. Voice chat will remain prohibited between adults and 13- to 15-year-olds, alongside the current ban for those under 13. To support these initiatives, Roblox announced an expansion of age estimation technology across its communication features.
According to eSafety Commissioner Julie Inman Grant, the new safety measures are a positive step towards raising safety standards in the online ecosystem. “We know that when it comes to platforms that are popular with children, they also become popular with adult predators seeking to prey on them,” she stated. The Commissioner has engaged in ongoing discussions with Roblox regarding the need for meaningful action to prioritize child protection.
Inman Grant emphasized that while Roblox’s communication features are a primary concern, the platform’s new features—such as dating, short-form video feeds, and virtual shopping—must also comply with relevant codes and standards. Following meetings with senior Roblox executives, including the Chief Legal Officer and Chief Safety Officer, Inman Grant expressed satisfaction with the platform’s commitment to enhancing safety.
The eSafety Commissioner also urged platforms to treat safety as a standard to aspire to rather than a bare minimum to meet. “We want platforms to see safety as a high ceiling rather than a dirt floor,” she said. The eSafety Office will actively monitor the implementation of these commitments and may consider regulatory action if Roblox fails to comply in the future.
In addition to these new measures, eSafety has registered a second phase of industry codes that address age-inappropriate content, such as online pornography and material related to self-harm. These codes will apply to a wide range of services, including Roblox. The eSafety Office has the authority to impose civil penalties up to $49.5 million for non-compliance with the codes and standards.
Roblox’s announced expansion of facial age estimation technology further reinforces its commitment to reducing the risks associated with adult-child interactions online. The initiative aligns with findings from the Government-sponsored Age Assurance Technology Trial, which highlighted the potential of current age assurance technologies.
While the eSafety Commissioner welcomes these advancements, Inman Grant urges parents and caregivers to remain vigilant in guiding children through online environments. “The industry codes and standards will work hand-in-hand with the new social media age restrictions, ensuring that there are protections for children from harms online,” she noted.
As the digital landscape evolves, the Australian government is also pursuing a duty of care for online services, underscoring the necessity for platforms to incorporate safety features from the ground up. Inman Grant stated, “The time has come for platforms to take real responsibility for the safety of their users.” The commitment from Roblox marks a significant step in addressing the critical issue of child safety in online gaming environments.


































