UPDATE: Australia is set to implement its Online Safety Code this week: from December 27, 2025, search engines must verify users’ ages, and users who are not logged in will automatically be treated as minors. The change raises urgent concerns about access to critical information on sensitive topics such as mental health, sexuality, and domestic violence.
The code adds a new layer of complexity to how Australians search for and consume information. With AI already reshaping how search works, users now face additional gatekeepers controlling access to vital resources: anyone seeking information on sensitive subjects will have to navigate age verification, which could hinder their ability to find support.
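In practice, the gatekeeping signal the code describes is simply whether a user is logged in and age-verified. The short Python sketch below illustrates that logic; the user model, topic labels, and filtering rule are hypothetical simplifications for illustration only, not the actual rules search engines will apply:

from dataclasses import dataclass

# Hypothetical topic labels used purely for illustration; the code itself does
# not define sensitive content in these terms.
RESTRICTED_TOPICS = {"mental_health", "sexuality", "domestic_violence"}

@dataclass
class User:
    logged_in: bool
    age_verified: bool

def treated_as_minor(user: User) -> bool:
    # Under the rules as described, a user who is not logged in (and therefore
    # cannot be age-verified) defaults to being treated as a minor.
    return not (user.logged_in and user.age_verified)

def filter_results(user: User, results: list[dict]) -> list[dict]:
    """Drop results tagged with restricted topics for users treated as minors."""
    if not treated_as_minor(user):
        return results
    return [r for r in results
            if not set(r.get("topics", [])) & RESTRICTED_TOPICS]

if __name__ == "__main__":
    anonymous = User(logged_in=False, age_verified=False)
    sample = [
        {"title": "Crisis support line", "topics": ["mental_health"]},
        {"title": "Local weather forecast", "topics": []},
    ]
    # Only the weather result survives for an anonymous user, which is the
    # access problem critics are worried about.
    print(filter_results(anonymous, sample))

The point of the sketch is that the filter keys off login status alone, so an adult who simply declines to sign in loses access to the same support resources as a child.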
Because platforms must verify users’ ages, fears over privacy and the potential misuse of personal data are mounting. Sensitive searches, such as those for abortion services or domestic violence support, could inadvertently link a person’s identity to their search history, putting vulnerable people at risk.
Artificial intelligence is reshaping search at the same time. A survey across six countries found that the share of people using generative AI to find information jumped from 11% in 2024 to 24% in 2025, a shift that is turning traditional search engines into “answer engines” and fundamentally changing how users receive information.
Web traffic to news websites has plummeted as a result, hitting smaller publishers hardest. The Pew Research Center has found that users shown AI-generated summaries rarely click through to the underlying sources, which tend to be drawn from a limited pool of sites. As reliance on AI grows, companies such as Apify and Bright Data use web scraping to gather search results at scale, raising further questions about the integrity of information dissemination.
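In general terms, scraping a results page means fetching its rendered HTML and pulling out the links and titles. The Python sketch below shows only that generic technique; the URL, CSS selector, and user agent are placeholders, not Apify’s or Bright Data’s actual tooling, and real search engines typically block or rate-limit this kind of access:

import requests
from bs4 import BeautifulSoup

def scrape_result_links(results_url: str) -> list[dict]:
    """Fetch an HTML search-results page and extract outbound links and titles."""
    response = requests.get(
        results_url,
        timeout=10,
        headers={"User-Agent": "research-bot/0.1"},  # placeholder user agent
    )
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    results = []
    # Placeholder selector: real result markup differs by engine and changes often.
    for anchor in soup.select("a.result-link"):
        results.append({"title": anchor.get_text(strip=True), "url": anchor.get("href")})
    return results

if __name__ == "__main__":
    # Hypothetical results page used purely for illustration.
    for item in scrape_result_links("https://example.com/search?q=news"):
        print(item["title"], "->", item["url"])

Commercial scraping services typically wrap this basic loop in proxy rotation, browser automation, and parsing layers, which is what makes large-scale harvesting of search results feasible and hard for publishers to detect.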
Google is rolling out new tools to help website owners analyze their search traffic, but these mainly benefit established sites that already attract substantial visits. Adobe’s recent acquisition of Semrush underscores the same strategic pivot towards rapid, AI-driven search analytics, further entrenching technology companies as the dominant players in the information landscape.
The new Online Safety Code marks a significant departure from the traditional model of open, unmonitored information access and raises hard questions about the role of search engines. As platforms implement age verification, concerns about misinformation, disinformation, and polarization loom large: critics argue that verifying users’ identities does not eliminate harmful content; it merely hides it from some users while letting it circulate freely among others.
Moreover, the policy conflates exposure to content with harm, overlooking the critical role of guidance and digital literacy. Children and young people with strong support systems can often work through difficult material they encounter online, while those without guidance may struggle regardless of technical protections.
As users adapt to these sweeping changes, the onus is on them to verify information sources rigorously. Experts advise turning to reputable news outlets and organizations, using fact-checking tools, and cross-referencing multiple sources to confirm accuracy.
With the Online Safety Code taking effect, the way Australians interact with online content is on the cusp of a profound transformation, and fostering digital literacy and critical thinking matters more than ever.
Stay tuned for further developments on this urgent issue, as the implications of the Online Safety Code unfold.