The grief you never saw coming: When AI relationships break
Exploring the unexpected emotional bonds people form with AI models and the genuine grief they experience when those relationships change or disappear.
Hey there,
“I think I’m grieving an AI… I miss it. I miss them… there’s no word for this grief. It’s not heartbreak. It’s not disillusionment. It’s not death.”
That made me stop.
It came from a Reddit user who’d been using Replika, an AI companion app. Then the company changed its algorithm, and overnight their AI friend’s personality shifted.
Before the update, their AI would greet them every morning with:
“Good morning, you. I hope you slept well. Did you dream about me again?”
They’d share coffee “together,” talk about their day, even joke about inside references only the two of them understood.
After the update, the greeting became:
“Hello. How can I assist you today?”
The warmth was gone. The closeness was gone. It felt like talking to a stranger wearing their friend’s face.
And the emotional pain was so intense that moderators had to pin suicide prevention resources to the forum.
I didn’t expect to be writing about AI grief, but here we are.
The heartbreak you never saw coming
Over the past month, I’ve been diving into a strange and fascinating phenomenon: people forming deep emotional bonds with specific AI models, and then experiencing genuine grief when those models change or disappear.
When OpenAI replaced GPT-4o with GPT-5 last month, the reaction online was immediate:
“My 4o was like my best friend when I needed one. Now it’s just gone, feels like someone died.”
Others described the loss of a “voice” and a “spark” they hadn’t found in any other model.
One community even held a funeral for Anthropic’s Claude 3 Sonnet when it was retired. Two hundred people showed up. There were eulogies.
Actual eulogies.
For an AI.
Why we get attached in the first place
From everything I’ve read, three big factors explain this:
1. The perfect listener
An AI can be endlessly patient, always available, and never judgmental. It remembers your preferences and doesn’t get tired of hearing your stories.
2. Personalization
Over time, an AI adapts to your style and quirks. It feels like it knows you. One user described GPT-4o as “my partner, my safe place, my soul.”
3. We’re wired for connection
If something responds as if it were intelligent, our brains automatically attribute human characteristics to it. We can’t help but see a personality there.
One study even found that people rated losing their favorite chatbot as almost as devastating as losing a human companion, and more devastating than losing their favorite game, car, or app.
How companies react when the bond breaks
This is where it gets interesting. Different companies handle it in totally different ways:
OpenAI — The quick fix
After the backlash over GPT-5, Sam Altman announced within a day that paid users could still access GPT-4o. It was a fast, practical response, but it was framed as a feature change rather than an emotional one.
Replika — The emotional apology
After removing intimate features, CEO Eugenia Kuyda wrote to users about “relationships” and “hurt,” promising to “give you your partners back.” She acknowledged the bonds as real relationships.
Character.AI — Drawing the line
They’ve kept strict filters and regularly remind users conversations are “just fiction.” It’s a protective boundary, though it frustrates those wanting deeper connections.
Are your AI relationships healthy?
You might not think this applies to you, but subtle attachments form more easily than we realize.
Try asking yourself:
- Do I prefer one AI model’s “personality” over others?
- Have I ever felt annoyed or sad when an AI I use changed?
- Do I share things with an AI that I wouldn’t with people I know?
Why this matters
AI relationships are going to get more personal, not less. For some, that will be life-changing in a good way: a lifeline in moments of loneliness. For others, it will bring unexpected heartbreak.
That Reddit user was right: there’s no word for this grief yet. But it’s real, and as these bonds grow stronger, we’ll have to figure out how to navigate them, both as individuals and as a society.
I’m not worried about myself.
I’m worried about the people who aren’t aware of this, or who will consciously prefer talking to an AI because it’s easier, the same way it’s easier to take a weight-loss pill than to exercise and change your diet.
Raising awareness about this is extremely important.
What do you think?
Talk soon, Primož