UPDATE: A new University of Sydney study reveals a concerning trend: approximately 10% of Australians now turn to ChatGPT with health questions, ranging from chronic headaches to troubling moles, and its growing use for mental health inquiries is raising alarm among mental health professionals.
Meanwhile, around 20% of TikTok users say they have relied on AI for therapeutic support, a trend that deeply worries psychologists. “We know many people are using them due to barriers such as cost, stigma, or access,” said clinical psychologist Katie Kjelsaas, who cautioned that while AI chatbots offer immediate access, they cannot accurately assess a user’s mental state.
Psychologists warn that the convenience of AI can be dangerously misleading. “AI chatbots could serve a purpose in providing limited interim relief, but they cannot perceive distress levels,” Dr. Kjelsaas cautioned. People in acute mental distress often cannot accurately evaluate their own symptoms, which could have severe consequences if they substitute AI for professional help.
The statistics are alarming. With mental health issues on the rise, the increasing reliance on AI for diagnosis and advice could delay access to appropriate care. “AI platforms like ChatGPT cannot diagnose or personalize treatment,” Dr. Kjelsaas added. “They can only offer general advice, often riddled with inaccuracies.”
Experts such as Sahra O’Doherty, president of the Australian Association of Psychologists, stress that AI is no substitute for professional care. “These bots are designed to summarize and mirror user inputs, not to provide evidence-based information,” she stated. Both Kjelsaas and O’Doherty call for immediate action to make mental health services more accessible and affordable.
The risks are compounded by a lack of regulation. ChatGPT, developed by OpenAI, itself acknowledges that it is not designed to provide medical or therapeutic advice. “Users are encouraged to consult qualified professionals for actual diagnosis or support,” the chatbot stated.
The situation has sparked a critical conversation about the future of mental health care in Australia. With many Australians seeking quick, cheap solutions, advocates are urging the government to improve access to mental health services. “The main factor driving the popularity of AI as alternatives to psychology is the cost of mental health care,” O’Doherty noted.
As the discussion unfolds, mental health advocates emphasize the importance of seeking professional help rather than relying on AI for critical support. “AI platforms pose a significant risk of harm for those using them as a substitute for therapy,” Dr. Kjelsaas warned.
Individuals are encouraged to seek professional help, particularly in crisis situations, and to avoid sharing personal information with AI chatbots. The need for robust mental health infrastructure grows more urgent as the implications of this trend unfold.
For those in need, various mental health resources and crisis helplines are available to provide the necessary support and guidance.
