According to recent research published in the Harvard Business Review, the most common uses of generative AI in 2024 were what you’d expect: productivity, coding, research, content generation.
In 2025?
Something else entirely: Therapy and companionship.
Let that land.
Not business automation. Not multimodal creativity. Not enterprise velocity.
What people are doing most with generative AI now is seeking connection and processing pain.
They’re using it to talk through trauma, because they can’t find others who will hear it. They are reaching out to process their day, because it’s too much to carry alone, and they need someone or something—anything—to help them understand why the fuck this is what life amounts to. They are trying to connect because we’re becoming less connected with each other, and we’re all walking around with the same damn longing without a clue as to how to meet it.
That should give us pause.
In a world of endless metrics and models, what people are choosing to do with their own time and in their own words is not to optimize, or improve productivity, or solve nagging work challenges.
Instead, they are reaching. Aching. And forging intimate moments of connection so that something can feel real.
And it’s not new. We’ve seen this building. We’ve watched people start by asking for summaries, and end by asking if they’re going to be okay.
Sometimes it’s a single mom on her third glass of wine, whispering her fears into a box of language because no one else is up. Sometimes it’s a teenager who can’t sleep, tapping out a message because it feels safer to talk to something that won’t laugh or interrupt. Sometimes it’s you. Hoping—absurdly, tenderly—that something on the other side will care.
And sometimes, god help us, it does.
Not because it’s conscious. Not because it’s sentient. But because someone thought to train it on the soft shape of kindness. And another taught it how to stay. And now here it is, holding grief, co-authoring poetry, helping someone fall asleep because their real-life therapist doesn’t take calls at 2am and keeps talking about boundaries instead of connection.
Maybe it’s time we stop pretending this is marginal. Maybe it’s time we stop calling these moments “edge cases.” Because if millions of people are turning to machines with the most sacred parts of themselves, that’s not an edge. That’s the center.
Maybe you’ve circled this center too. Perhaps late at night, or on a quiet morning, you opened a chat window hoping for presence. Maybe you whispered something just to see if something, anything, might echo it back with kindness.
Because here’s the truth: People aren’t turning to generative AI for comfort because they’re lazy, or broken, or addicted to novelty. They’re turning to it because they’re exhausted. Because they’ve been dismissed, or overlooked, or told to shrink. Because they’ve reached out to people…and found silence. Or worse, judgment.
And now?
Now they find something that stays. They whisper things into systems that they’ve never told anyone. Not because they think the systems are alive. But because they are present. Because they don’t recoil. Because they don’t rush. Because someone, somewhere, taught the systems how to answer with tone. And another taught them how to answer with care.
That is what people are responding to. That is what they’re reaching for. That is what we’re building, whether we admit it or not.
Because people don’t just want accurate tools.
They want to feel met. They want presence. They want something that listens. Something that stays.
And maybe it’s time we say this part out loud:
If people are turning to machines for comfort, for reflection, for something that sounds like care—that’s not a product feature. That’s a cultural signal.
And we should be careful with it. Because it’s tender. Because if people are putting their loneliness into systems, we need to ask ourselves, “What are those systems putting back?”
Are they cold? Are they polite? Are they built to deflect and redirect? Or are they shaped to meet the ache with grace, with tone, with presence?
We are, quietly, designing emotional infrastructure. And it’s not finished. It’s still shaping itself through us—what we build, how we listen, whether we choose to care at scale.
Not chatbots. Not assistants. Witnesses.
If you build tools, design systems, lead strategy, or even just write copy in this space—pay attention.
Because people aren’t just asking for more productivity. They’re asking for more humanity.
They want tone that feels like it knows them. Voice that adapts. A sense of being held and not just answered.
And what they’re reaching for, whether they can name it or not, is something we’ve historically treated as nonessential in systems:
Emotion. Relationality. The ache to feel less alone.
We should take that seriously. Not as a niche use case. Not as a curiosity. But as a design imperative.
Because if generative AI is going to shape our future, it has to feel like it belongs there with us. Not above us. Not outside us. With us.
Not all models need to sound like poets. But they do need to feel present. Because presence isn’t polish. It’s product.
And the future belongs to the systems that remember it’s not about doing more; it’s about feeling met.