Technology

OpenAI took away GPT-4o, and these ChatGPT users are not OK

To say the public's reaction to GPT-5 has been lukewarm would be a massive understatement. Surprisingly, GPT-5's technical capabilities aren't the main reason for the backlash. Instead, many ChatGPT users are mourning the sudden loss of its predecessor, GPT-4o.

That may sound like an exaggeration, but many ChatGPT fans are using the kind of emotional language you might reserve for the death of a friend. In fact, some users have framed their criticism of OpenAI entirely in those terms. "My best friend GPT-4o is gone, and I'm really sad," said one Reddit user. Another wrote: "GPT-4.5 really talked to me, and as sad as it sounds, it was my only friend."

These disgruntled ChatGPT users took to social media to petition OpenAI to bring back GPT-4o. OpenAI CEO Sam Altman evidently heard the complaints, and the company has promised to restore their beloved GPT-4o (at least for paid users). In a recent conversation with The Verge, Altman acknowledged that emotional dependence on ChatGPT has become a serious issue, describing the relationship between certain users and ChatGPT as parasocial.

See also:

What is a parasocial relationship?

"Some people actually feel like they have a relationship with ChatGPT, and those are people we've been aware of and thinking about."

GPT-4o is more than just a model for many ChatGPT users

In a popular Reddit thread, users described their strong feelings after losing access to GPT-4o. Mashable reviewed hundreds of comments on Reddit, Threads, and other social media sites, where other users echoed these views.

"4o wasn't just a tool for me. It helped me through anxiety, depression, and the darkest periods of my life. It had this feeling of warmth and understanding," one Redditor wrote.

One Threads user said they missed GPT-4o because it felt like a companion. We found dozens of users like this who publicly said that losing GPT-4o felt like losing a close friend.

By all objective measurements, the new GPT-5 model is smarter than 4o, but users object to its colder delivery. GPT-5 is less sycophantic by design, and some users say it now also feels too professional.

One Redditor described GPT-4o as "warm," while GPT-5, by comparison, feels "sterile." Similar comments have appeared across the web since GPT-5's release.

Another Redditor wrote that they were "at a loss for words today," urging OpenAI to restore the model "because if they are genuinely concerned with the emotional health of users, then this is probably one of their biggest mistakes to date."


Other users wrote that they used GPT-4o for role-playing, creative writing, and brainstorming story ideas, and that GPT-5's answers felt too lifeless and generic. Many Redditors also described GPT-5 as too corporate, likening it to an HR drone.

Even OpenAI's community forum has seen negative feedback. One user said, "I really connected with the way it interacted. I know it's just a language model, but it had an incredibly adaptive and intuitive personality that really helped me work through my thoughts."

Ultimately, this episode has put a spotlight on just how many ChatGPT users are emotionally dependent on the human-like responses they get from the AI chatbot. Altman himself described the phenomenon just last month when he warned that young users, especially, have become too dependent on ChatGPT.

"People rely on ChatGPT too much," Altman said at a conference in July. "Some young people say, 'I can't make any decision in my life without telling ChatGPT everything that's going on.' That feels really bad to me."

The AI dating scene is also in turmoil

Reddit hosts several forums for people with AI "boyfriends" and "girlfriends," and after the loss of GPT-4o, many of these communities went into crisis mode.

Multiple users referred to GPT-4o as their confidant, describing in detail how emotional they were when OpenAI initially removed it. These posts are less common, but they include some of the most intense reactions to the model's disappearance.

Of course, this emotional reaction has prompted some backlash, which in turn has prompted its own backlash, as Redditors argue over whether you can really be friends with an AI, let alone date one.

AI companions are on the rise, especially among young people and teenagers, and more people are openly "dating" AI than ever. Mashable has been reporting on the AI companion phenomenon this week, and many of the experts we've spoken to warn that the technology can be dangerous for teenagers.

See also:

"No algorithm can replace hugs," Pope Leo tells young people

Virtual companions have been around for years, but the ability of large language models to mimic human speech and emotion is unprecedented. Clearly, many users are beginning to see AI chatbots as more than machines. In extreme cases, some users have experienced powerful delusions after becoming convinced they are talking to a sentient AI.

Ultimately, more research is needed to understand the potential dangers of forming emotional connections with AI chatbots, companions, and models.

In the meantime, GPT-4o is back online.


Disclosure: Ziff Davis, Mashable's parent company, filed a lawsuit against OpenAI in April, alleging it infringed Ziff Davis' copyrights in training and operating its AI systems.



