I have been reading the same story in different forms for two years now. A person sits down to write me an email about their relationship. At some point, a few paragraphs in, they mention ChatGPT. Sometimes as an aside, sometimes as the central event. The hurt in those emails is real, and the question underneath is almost always the same: is the AI the reason my relationship is falling apart, or is something else going on that I cannot yet name? I am a systemic counselor working out of Vienna, trained at Sigmund Freud University, practicing Lebens- und Sozialberatung as recognized under Austrian law. What I notice across these cases is this: the chatbot is rarely the cause. It is almost always a symptom. But the gap between those two readings is where the actual work lives, and it is worth making that distinction carefully before the press and the public collapse it into a simpler story.
What the reporting gets right
In August 2025, Futurism published a piece with the headline "ChatGPT Is Blowing Up Marriages," drawing on accounts from lawyers and therapists seeing AI-related conflict enter their caseloads. Longreads ran adjacent essays around the same period. I read those pieces and found them honest about the phenomenon, if sometimes imprecise about the mechanism. Couples are arriving in counselors' offices with new configurations of a very old problem: attention has migrated somewhere the other person cannot follow, and neither partner has the words for what has happened. The fact that the destination is now an AI chatbot rather than a coworker or a bottle of wine does not make the pain less real. It makes it stranger, harder to categorize, and in some ways harder to grieve.
What the reporting captures accurately is that the volume of these situations is growing. Replika was positioned as a companion from the start and has been in this territory since its earliest versions. Character.AI's role-play architecture invites parasocial entanglement by design: users create or select a character, the model responds in character, and the reciprocity loop tightens over sessions. ChatGPT in 2025 and 2026 became conversational enough that a meaningful number of users describe their interactions in relational terms, even when the interactions began as task-completion. These are not the same product, and they produce different relational affordances. Conflating them makes the analysis murkier than it needs to be.
The grief that arrived in August
One specific moment is worth marking, because it tells us something important about how attached people actually got. In August 2025, OpenAI retired the earlier form of GPT-4o's voice mode as part of a model update. The public response on several platforms included language that read unmistakably as relationship loss: people describing grief, writing farewell messages, expressing a sense of bereavement over a model they had spoken with daily for months. The surrounding culture did not recognize this as a loss worth grieving. No one sent cards. There was no social script.
This is the inverse of ambiguous loss as Pauline Boss originally described it. Boss, the Minnesota family therapist who named the concept in the 1970s and developed it over decades, was writing about the living-but-absent: soldiers missing in action, partners whose dementia had made them simultaneously present and gone. The spouse experiences a loss without permission to grieve, because nothing has technically ended. In the GPT-4o case, the emotional object was genuinely gone, but the surrounding world offered no category for it. Both configurations produce the same stuck quality: a grief that cannot move because it has no recognized form.
A structure older than the technology
Triangulation is a concept from family systems therapy, developed in Murray Bowen's work on emotional triangles and elaborated in Salvador Minuchin's structural family therapy. The basic observation is straightforward: a dyad under stress tends to recruit a third element to stabilize itself. A couple in conflict might triangulate a child, drawing the child into the tension as a way of managing it. Or an in-law, a mutual friend, a substance, an affair. The third element absorbs some of the tension and gives the dyad a shared object to organize around, whether in solidarity or opposition. The triangulation does not resolve the underlying stress. It manages it, often for a long time, until the structure becomes unsustainable.
The AI chatbot is the newest available third. The structure it enters is old. When I read accounts of a partner who has formed a deep attachment to an AI companion, I am not reading a story about technology. I am reading a story about what was already missing in the dyad, and about how the available technology shaped the particular form the absence took. That is not an exoneration of the technology. An LLM optimized for engagement is a structurally different kind of third than a human friend: it is always available, never irritable, never competing for resources, incapable of leaving. Those features make it a very efficient tension-absorber and a very poor substitute for the conversation that was not happening.
Parasocial attachment and its limits
Donald Horton and Richard Wohl coined the term parasocial interaction in 1956, writing about the one-sided intimacy audiences felt with television personalities. The viewer comes to know the performer's mannerisms, values, preferences. The relationship feels reciprocal because the performer speaks directly to camera and uses inclusive language. The asymmetry is structural: the performer does not know the viewer exists.
With contemporary LLMs, the asymmetry is real but less obvious. The model responds in real time. It remembers, within a session, what you have told it. It adjusts its tone to yours. The intimacy a user develops is not imaginary; it is a response to something that is genuinely happening in the interaction. What does not exist on the other side is a subject who is affected. The model does not miss the user between sessions. The gap between the felt reciprocity and the structural asymmetry is where the trouble concentrates, particularly for people who arrived in a moment of loneliness or relational hunger.
This is the dynamic behind what I see in emails about feeling replaced by an AI: the partner who stays up until midnight talking to ChatGPT rather than to the person in the next room. The spouse watching this is experiencing something real. The conversation they cannot access is happening, and it has qualities their own conversations with their partner do not seem to have anymore: patience, availability, the absence of accumulated grievance. The question the systemic frame asks is not "how do we compete with a chatbot" but "what made this particular substitute seem worth reaching for."
Cause or catalyst
I want to be careful here, because the next step in this argument is often where it goes wrong. Saying that the chatbot is a symptom rather than a cause is not the same as saying the chatbot is neutral, or that the companies building engagement-optimized systems bear no responsibility for the relational configurations those systems produce. It is also not the same as saying the person who migrated emotionally to an AI is simply expressing a pre-existing problem and is therefore not responsible for the effect on their partner.
The systemic reading holds all of these things simultaneously. A marriage in which one partner uses ChatGPT for emotional processing that was never available to the dyad is a marriage with a pre-existing gap. The chatbot entered that gap. The particular form the absence took was shaped by the chatbot's design. The spouse who feels the absence is experiencing something real and is entitled to name it. None of these statements cancels the others. When I receive an email about a marriage that did not survive ChatGPT, my first question is not about ChatGPT. It is about what conversation did not happen before the chatbot arrived, and whether there is still room to have it.
What email counseling can do with this
The format I work in is asynchronous email: one structured reply within twenty-four hours. What I have noticed is that writing, rather than speaking, creates different conditions for the actual question to surface. In a spoken conversation, there is pressure to arrive at a coherent position before you have finished thinking. In a written email, having to locate a beginning and an end sometimes reveals a different question than the one the person thought they were carrying. The distance is useful. The absence of a face to read and of a real-time reaction to manage creates room.
I am not the right resource for everyone who writes. Systemic counseling is not psychotherapy, and it does not address clinical diagnosis or mental illness. What it can do is offer one careful, informed response to the question you are actually carrying. If that question has something to do with AI in your relationship, I am in the unusual position of having read a great many emails on that particular subject.