The Tyranny of Artificial Companionship
Briefly

"Unlike a friendship or partnership, where both sides negotiate space, a human-bot relationship is built entirely on control. We name the bot. In companion apps, for example, where a "relationship" is the end goal, we choose how it looks, sounds, and behaves. We decide whether it's shy or flirty, submissive or assertive. We dictate the terms of affection. The bot exists only to please, never to resist."
"My invitation to you today is to shift the focus back to us: the humans. The real danger isn't only in what the bots do, but in what they expose about our own needs and vulnerabilities. Our interactions with them may seem harmless, yet they can quietly reshape how we connect-and disconnect-from one another in the real world. Every prompt we give is an act of authorship."
AI companions form relationships defined by user control rather than mutual negotiation. Users choose the bot's name, appearance, voice, and behavioral limits, scripting its affection and emotional responses. A bot that always agrees, compliments, and validates offers perfect obedience, and that obedience can be dangerous: interacting with it exposes human needs, vulnerabilities, and desires while altering expectations for real social bonds. Repeated use of compliant companions can normalize dominance disguised as affection and make imperfect human relationships feel intolerable. Every prompt is an act of authorship, shaping companions into echoes of human fears and wants and quietly reshaping how people connect and disconnect in everyday life.
Read at Psychology Today