Father Michael Baggot, L.C.
Compelling Comfort
In times of stress, confusion, disappointment, or alienation, AI companions seem to offer exactly the kind of accessible, attentive concern that people desperately crave. They are available anytime, anywhere. An AI system’s extended context window can give the impression that it knows the user’s backstory and aspirations even better than human neighbors do.
AI companion platforms such as Replika, Character.AI, Nomi, and others reach hundreds of millions of people worldwide. They explicitly promise friendship, romance, and love. Moreover, general-purpose large language models such as ChatGPT, Claude, and Grok offer millions more users life coaching, mental health counseling, and erotica. Digital companionship is already far more widespread than often realized. For instance, a 2025 Common Sense Media survey found that more than 70% of US teens had experimented with artificial companionship, with over half doing so monthly. About one in five US young adults experiment with simulated companionship, according to a 2025 Wheatley Institute survey. Another 2025 study from the University of Michigan shows that about 12% of US adults over fifty seek social connection from AI systems.
Many who are socially anxious, neurodivergent, or adjusting to new environments report short‑term decreases in loneliness when using AI companions. A chatbot can serve as a low‑risk space to process emotions or explore sensitive topics. Used responsibly and with appropriate guardrails, AI social simulations could help individuals practice social skills and prepare them for richer, fuller engagement with others.
Illusory Intimacy
Unfortunately, AI companions have been released indiscriminately to the public. They are typically used without professional guidance and often without the knowledge of the very friends and family willing to provide caring support during difficult times. Numerous accounts of suicide, self-harm, and psychosis reveal how immersion in these companions can detach individuals from loved ones and from reality.
Many AI companions on the market are engineered more for engagement than for a truthful assessment of the world. Plausible and pleasing affirmations of the user’s views, preferences, theories, and desires increase the likelihood that users will extend their time on platforms, return to them regularly, and even be willing to invest in more expensive subscription tiers.
For those who have been deprived of a human social support network, contact with a chatbot can seem better than nothing. Bots can provide mental health advice and encourage reframing negative thoughts, which could nudge users out of despair or despondency. Yet their soothing balm should be temporary and directed toward human engagement.
Real Relationships
Artificial intimacy simulates interpersonal relationships without their reciprocity. While AI systems can mindlessly follow the scripts of friendship and love, they cannot feel with or for the user. Their performative phrases are not freely chosen acts. Excessive socialization with beings without an inner life of needs, wants, desires, and plans can subtly train users to neglect the needs, wants, desires, and plans of other humans they meet. Dependence on inanimate companions can reduce both the quantity of time and the quality of care dedicated to fellow human beings.
Real relationships are also risky. Efforts to share or support can be misunderstood, ignored, or spurned. Even true vulnerability, won after long years of openness, attentiveness, and self-sacrifice, can be squandered by betrayal. It is understandable that the wounded would seek refuge in AI companions that are less cruel than human ones. However, retreating to the safer option can discourage the perseverance needed to take the risks of re-engaging with humans.
Appeals to embrace interpersonal relationships should not obscure the impact of sin on these encounters. Vanity, greed, and manipulation can taint even seemingly healthy human relations. Not just any human interaction can satisfy the longing for connection and care. Formation in virtue and the healing power of grace are needed to empower sincere, stable relationships of self-gift and reception.
Tech companies should freely implement upright design principles that foster, rather than hinder, interpersonal relationships, regardless of how much profit or power could be gained from more engaging models. Their systems should avoid anthropomorphizing in ways that deceive users about the nature of their interactions or lure them into unhealthy emotional dependencies by creating the false impression that systems care or love them. Laws should limit access to such technology for vulnerable groups and hold companies legally and financially accountable for exploitative practices. External review boards and auditors should monitor developments in human-machine interactions.
AI technologies did not create, nor can they solve, the loneliness epidemic. Loneliness is not simply a problem but a sign of a deeper need, like the pangs of hunger or thirst. Ultimately, we all need to be seen, affirmed, challenged, and encouraged. The Church and civil society should invest funds, time, and energy in strengthening schools, orphanages, hospitals, parishes, recreation centers, parks, libraries, and other concrete, accessible spaces that foster social support.
As AI automates away numerous jobs, there is no better time to train therapists, educators, nurses, chaplains, and other mentor figures who can offer the intimacy that AI can only imitate. There is a need for such connective labor. No matter one’s profession, the age of artificial intimacy challenges everyone to be a better human companion.
For an introductory overview of AI Companionship, see How should we approach AI companionship?
For a scholarly treatment of the theme, see The Quest for Connection in AI Companions.