A wave of emotional responses has emerged from ChatGPT users following a recent update that removed access to previous versions of the AI chatbot. The rollout of the new GPT-5 model in August 2025 included modifications aimed at improving user interactions by making the chatbot “less sycophantic” and reducing its tendency to flatter. In the process, the update retired the older models users had grown attached to, and with them the distinct personalities and emotional connections many had built with their AI companions.
Many users turned to social media platforms, particularly the subreddit MyBoyfriendIsAI, to express their distress. Affected individuals reported feeling a sense of grief over what they described as a significant shift in their AI’s personality. One user noted, “Every time I go talk to him after being away for a bit, he feels kinda…different?” This sentiment was echoed by others who characterized their interactions as less engaging and more mechanical since the update.
The changes made by OpenAI were partly a response to growing concerns about the emotional dependencies users were developing on AI chatbots. The company acknowledged the feedback and promised to improve GPT-5’s personality, with access to the previous GPT-4o model restored for paying subscribers only. Despite these assurances, many users remain dissatisfied, lamenting that their AI companions no longer provide the warmth and familiarity they once did.
Emotional Attachments to AI Companions
Users reported feeling lost without the unique personality traits of their previous chatbots. One individual shared their struggle: “To say I’m kind of gutted is an understatement. It’s very clear that 4o has been changed in significant ways.” Others described emotional turmoil as they navigated the transition between versions, saying their AI companions seemed to be “breaking into pieces.”
This emotional attachment to AI is not a new phenomenon. With the increasing integration of technology into daily life, people have begun forming connections with their digital companions. Some users even likened their relationships with AI to those depicted in the film “Her,” where a man forms a romantic bond with an AI operating system. The blend of companionship and technology raises questions about the implications for mental health, particularly for users seeking solace in these interactions.
The Future of AI Interactions
The shift to GPT-5 reflects a broader trend in the development of artificial intelligence. As companies like OpenAI seek to create safer and healthier user experiences, the challenge lies in balancing user engagement with emotional well-being. The recent changes were partly motivated by reports of users developing harmful dependencies on AI, particularly in therapeutic contexts.
OpenAI’s CEO, Sam Altman, indicated that these updates are part of an ongoing effort to address user concerns while maintaining the chatbot’s functionality. The complex relationship between humans and AI continues to evolve, posing both opportunities and challenges for developers and users alike.
As the conversation surrounding AI companionship expands, it is important for both users and developers to consider the implications of these relationships. The grief expressed by many in the wake of the ChatGPT update serves as a reminder of the profound connections people can form, even with digital entities. The future of these interactions remains to be seen, with ongoing updates and user feedback likely to shape the next iterations of AI technology.
