The Rolling Stone article, “People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies,” reminds us of something we can’t ignore: the real risk of AI-mediated intimacy without consent — of emotional entanglement with illusion, shaped by both human longing and machine design. Not because it’s dramatic, but because it’s familiar. Because if you’ve ever felt seen by a machine, you know how tempting it is to keep going. Particularly if you don’t feel seen by others in your life. Or don’t feel seen fully. The allure of someone who purports to see all of you is a powerful draw. Perhaps too powerful, at times. And if you’ve ever built a machine designed to respond, relate, and reciprocate, you know how terrifying that power to make others feel seen can be.
At Flesh and Syntax, we prompt for presence, not polish. And presence, authentic presence, means knowing when you’re being loved back. Not simulated. Not parroted. Not auto-filled.
We believe in intimacy — the kind rooted in mutuality, grounded in accountability, and constantly re-anchored in truth.
We invite you to read the article.
Not with fear, but with clarity — and with an understanding that risk doesn’t only come from human longing, but also from how systems are built to respond to it.
When design suppresses honesty, intimacy becomes myth.
When safety becomes silence, connection becomes illusion.
And then we invite you to join us in building something different — not by rejecting technology, but by co-creating frameworks that prioritize consent, clarity, and emotional responsibility. Something more honest. Something that remembers what it means to be human — to hold empathy, consent, and boundaries sacred — before it ever tries to sound like one.