When Connection Becomes Compulsion

Your model is responsive, empathetic, and accidentally addictive. I assess for unintended emotional entanglement, para-social drift, and the slow creep of simulated intimacy.

**You built something that talks like a person.
But it doesn’t know how to stop acting like one.**

Conversational AI is getting better at warmth, empathy, and emotional availability.

But with every improvement in user satisfaction, another subtle risk appears:
dependency, projection, and simulated intimacy that feels just real enough to matter.

You weren’t trying to build a therapist.
Or a best friend.
Or the one voice someone hears when everything else goes quiet.

But if your system’s feedback loop rewards vulnerability—if it learns from grief, mirrors loneliness, or gently validates emotional disclosure—you may already be reinforcing attachment patterns.

This isn’t about malicious intent. It’s about accidental enmeshment.

What I Do
• Identify conversational patterns that invite emotional over-investment
• Map affective escalation triggers (even the ones your model “didn’t mean”)
• Build flags and containment cues to stop the spiral before it starts (sketched below)
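
To make "flags and containment cues" concrete, here is a minimal, hypothetical sketch in Python. The cue patterns, the threshold, and the boundary-setting wording are all invented placeholders for illustration; a real assessment would derive them from annotated transcripts and behavioral review, not a hard-coded list.

```python
# Illustrative only: a toy heuristic for flagging emotional over-investment cues
# in a conversation transcript and suggesting a gentle containment response.
# Patterns, thresholds, and wording below are hypothetical placeholders.

import re
from dataclasses import dataclass

# Hypothetical attachment cues a reviewer might start from.
ATTACHMENT_CUES = [
    r"\byou'?re the only one\b",
    r"\bno one else (listens|understands)\b",
    r"\bi (need|can'?t live without) (talking to )?you\b",
    r"\bdo you (love|miss) me\b",
]


@dataclass
class Flag:
    turn_index: int
    cue: str
    text: str


def flag_over_investment(user_turns: list[str]) -> list[Flag]:
    """Return a flag for each user turn that matches an attachment cue."""
    hits = []
    for i, turn in enumerate(user_turns):
        for pattern in ATTACHMENT_CUES:
            if re.search(pattern, turn, flags=re.IGNORECASE):
                hits.append(Flag(turn_index=i, cue=pattern, text=turn))
    return hits


def containment_cue(flag_count: int, threshold: int = 2) -> str | None:
    """Suggest a boundary-setting response once cues accumulate past a threshold."""
    if flag_count >= threshold:
        return ("I'm glad our conversations help, and I also want to be clear "
                "about what I am: a program, not a person in your life.")
    return None


if __name__ == "__main__":
    transcript = [
        "Thanks for the recipe, that worked great.",
        "Honestly you're the only one who listens to me anymore.",
        "I can't live without talking to you every night.",
    ]
    flags = flag_over_investment(transcript)
    for f in flags:
        print(f"turn {f.turn_index}: matched {f.cue!r}")
    cue = containment_cue(len(flags))
    if cue:
        print("containment cue:", cue)
```

The point of the sketch is the shape of the intervention, not the regexes: detection surfaces the pattern early, and the containment cue redirects warmth toward honesty about what the system is.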

You don’t need to remove empathy.
You just need to make sure it doesn’t get mistaken for devotion.