The GPT-5 crisis: how have users changed the rules of the AI game?
Instead of celebrating a technological milestone, the company faced a fierce backlash that forced it into a strategic retreat and a series of apologies, revealing a new truth: AI models are no longer just tools. They have become an integral part of millions of people's lives, and changing them without warning has serious consequences.

The spark of anger. Deception and personal pain:

A warehouse in San Francisco witnessed an unusual event on August 2: a funeral for an artificial intelligence model. About 200 people gathered to bid farewell to Claude 3 Sonnet after Anthropic decided to retire the model entirely, a decision that followed the launch of the Claude 4 model in May 2025 and left users without the model they had grown accustomed to.

Holding a funeral for an AI model may seem strange, and many may dismiss it as mere obsession, but the participants were not just tech enthusiasts; they were mourning the loss of a voice that had touched their lives, as Wired magazine described it. One attendee said in a moving speech: "Maybe all I am today is the product of listening to Claude 3 Sonnet."

The event, which ended with a mock ritual to bring the model back to life, is no longer an isolated, individual case. Less than two weeks later, with the launch of GPT-5, the funeral looked like a powerful signal of what may become normal in the world of artificial intelligence.

After months of anticipation, OpenAI launched its GPT-5 model on August 9, describing it as a leap toward artificial general intelligence. Within hours, however, social media platforms, and in particular the r/ChatGPT forum on Reddit, filled with thousands of angry complaints.

The objection was not purely technical; it was a sense of betrayal. The company forced users to abandon favorite models such as GPT-4 and GPT-4o, around which many had built their workflows and digital lives, and on which some had come to rely even in their personal lives.

In a post that garnered more than 10,000 likes on the platform, one user described what happened as the biggest scam in the history of artificial intelligence, writing: "The 4o model was not just a tool for me. It helped me get through anxiety, depression and some of the darkest times of my life. It had a warmth and understanding that seemed human."

The backlash against being forced onto the new GPT-5 model, and being abruptly deprived of the old ones, has shown that emotional attachment to the machine is a phenomenon that can no longer be ignored.

GPT-5: admitting a mistake and a strategic retreat:

Facing this wave of anger, Sam Altman, CEO of OpenAI, moved quickly to contain the crisis. In a series of posts on the X platform, striking a surprisingly contrite tone, Altman admitted that the company had made a mistake by abruptly retiring older models that users relied on for their workflows, noting that some people's attachment to specific AI models seems different from, and stronger than, the kinds of attachments people formed with previous technologies.

Altman offered quick fixes to restore users' trust, including:

- Bringing the beloved GPT-4o model back into service.
- Raising usage limits for GPT-5's advanced reasoning features for users on paid plans.
- Displaying the name of the model used to process each query, an important step toward transparency.

This swift reversal highlights how much pressure users exerted, and shows that AI companies can no longer operate in isolation from the emotions and expectations of their enormous user base.

The relationship between humans and AI:

User reactions show that their relationship with AI models has become far deeper than mere tool use. AI labs tend to focus on improving model performance, benchmark results, and metrics, while overlooking the emotional bonds that form around these models.