When AI enters the intimate sphere

Arifur Rahaman

Generative AI seems to be reshaping every sphere of life, from academia and industry to everyday affairs. Students use GenAI platforms to finish assignments and recover time lost to reels. Corporate employees use new tools to avoid overwork in an age of constant productivity pressure. Professors are reconsidering how they write, teach, and remain intellectually productive amid growing competition. As AI becomes increasingly integrated into everyday life, we need to think more carefully about what it means for intimacy and human relationships.

By intimacy, I do not mean only romantic intimacy. I mean the softer spaces of human life where people confess, ask for advice, seek reassurance, manage loneliness, or rehearse what they cannot easily say to another person. These are not insignificant activities; they are pivotal to how people maintain relationships and emotional stability.

When a person asks a chatbot how to apologise to a partner, how to reply to a painful message, how to survive loneliness at night, or whether someone still loves them, AI is no longer simply helping with “tasks.” It is entering the emotional sphere of life. Pew Research Center found that some teenagers use AI chatbots not only for schoolwork, but also for casual conversation and emotional support or advice: in its survey, 16 percent of US teens reported using chatbots for casual conversations, while 12 percent reported using them for emotional support or advice. These numbers may not represent a majority, but they reveal a growing pattern in which AI is becoming part of the emotional infrastructure of everyday life.

Research on AI companions also presents a mixed picture. A 2024 study found that companion chatbot use does not have one simple effect on loneliness. For some users, chatbots may provide comfort, social rehearsal, or confidence. For others, especially those with problematic patterns of use, they may deepen dependence or isolation. The important point is not that AI companionship is automatically harmful. The point is that its consequences depend on the user, the context, and the degree to which AI supplements or substitutes human connection.

OpenAI and MIT Media Lab researchers reached a similarly cautious conclusion. Their study found that very high usage of ChatGPT was associated with higher self-reported indicators of dependence, while voice-based interactions had nuanced effects depending on users’ emotional state and duration of use. This matters because intimacy is not created only through facts. It is created through tone, memory, responsiveness, and the feeling of being heard. A chatbot that responds immediately, patiently, and without visible judgment can easily become emotionally meaningful, particularly for those who feel ignored in ordinary life.

This is where the sociological question becomes important. Human intimacy has always depended on vulnerability, reciprocity, and social risk. We speak, we wait, we may be misunderstood, rejected, interrupted, or judged. AI reduces many of these risks. It listens without fatigue. It replies without delay.

It can mirror our language, validate our emotions, and remain available at three in the morning. But precisely because it removes risk, it may also alter what people expect from human relationships. Real relationships are slow, imperfect, and demanding. AI companionship can make emotional life feel smoother, but smoothness is not the same as care.

There is also a political economy behind this intimacy. Most AI platforms are not neutral companions. They are commercial systems designed to increase engagement, retention, and user satisfaction. When emotional dependency becomes profitable, the line between support and capture becomes thin. This does not mean that people are foolish for seeking comfort from AI. Rather, it means that loneliness itself is becoming a market.

The challenge, then, is not to reject GenAI from intimate life altogether. People will use these tools because modern life is already marked by exhaustion, mobility, isolation, and fragile social ties. What we do need is a more careful public conversation about boundaries. AI may help people reflect, rehearse difficult conversations, manage stress, or feel temporarily less alone, but it should not replace friends, families, communities, therapists, or the difficult work of building human trust.

GenAI has entered the classroom, workplace, and household. Now it is entering the emotional corners of life. The task before us is not only to ask whether AI can make life easier, but also to ask what kind of life it is making easier. A society that uses AI to reduce drudgery may become more humane; a society that uses AI to outsource intimacy may become more efficient but also lonelier.


Arifur Rahaman is a PhD student in political science at the University of Alabama, USA.


Views expressed in this article are the author's own.

