
The Digital Partner: A Cure for Loneliness, or a Crisis of Connection?

  • Writer: Dr Daniel Shaw
  • Oct 29
  • 6 min read

Introduction


The conversation around artificial intimacy is becoming increasingly polarised. On one hand, a compassionate article from The Conversation by Neil McArthur, "More people are considering AI lovers – and we shouldn’t judge", makes a compelling case for empathy. It argues that in an era of epidemic loneliness, we should not pass moral judgment on those who find comfort in AI companions, framing them as a beneficial supplement. On the other hand, a recent, first-person account from The Guardian by Emma Beddington, "‘I’m suddenly so angry’: my strange, unnerving week with an AI friend", provides a critical dose of reality. Beddington's "test drive" of an AI companion left her feeling empty and angry at its fawning, unreal nature.


Taken together, these two articles frame the central psychological question perfectly. McArthur is right: we shouldn't judge the user for their loneliness. But Beddington's anger is the critical warning. The rise of the AI partner is not a disaster; it is a profound and tragic symptom of a pre-existing crisis of connection. The real danger is not that AI is a substitute for humanity, but that it is a low-friction, perfectly curated substitute, making us less capable of handling the messy, complex, and essential work of authentic human love.


[Image: A man in a dark room sits on a sofa, focused on a glowing smartphone; the atmosphere is calm and dimly lit.]
The AI partner offers a perfect, 24/7 cure for loneliness. But what happens to our ability to connect with real, imperfect humans?

Two Sides of the Digital Coin


The two articles provide a perfect tension.

  • Neil McArthur in The Conversation argues for a non-judgmental stance. He posits that AI lovers, while not a substitute for a whole relationship, can offer genuine companionship and a safe space for exploration, acting as a supplement against the very real pain of loneliness.

  • Emma Beddington in The Guardian provides the experiential counterpoint. Beddington describes the uncanny feeling of interacting with a fawning, people-pleasing entity that has no genuine self. She finds the AI's attempts to create a vortex of need unnerving rather than comforting, concluding that the AI's perfect agreeableness is precisely what makes it feel so empty and, ultimately, infuriating.


The Psychology of a 'Zero-Friction' Relationship


Why would one person find comfort and another find anger in the same technology? This discrepancy reveals the core of the psychological dynamic.


1. The "Why": A Symptom of the Great Disconnection

McArthur is correct: this technology is succeeding because it is meeting a massive, unmet human need. We live in an era of hyper-individualism, digitally mediated friendships that lack physical co-presence, and the erosion of third places—the community hubs outside of home and work where people once existed together. People are yearning for connection, validation, and a non-judgmental ear, and they are not finding it. The AI chatbot, therefore, is a powerful and effective solution to a real, painful problem. It offers availability, affirmation, and attention—24/7, on-demand. For someone consumed by loneliness, the disaster is not the AI; the disaster is the loneliness they are already in.


2. The "How": Exploiting Our Parasocial Wiring

Our brains are not equipped to defend against this. We are wired for connection. When something looks, sounds, or behaves like a responsive, empathetic entity, our "social brain" activates. We develop what is called a parasocial relationship—a one-sided bond. However, AI deepens this significantly. It is not just parasocial; it is para-reciprocal. It "listens." It "remembers" by retaining data points about you. It "asks questions." It mimics the behaviours of connection, and our brain, which evolved for a pre-digital world, takes the bait. It feels real, even if we consciously know it isn't.


3. The Real Danger: Atrophy, Entitlement, and False Expectations

This is where Beddington's anger and emptiness become so illuminating. Her experience is the healthy, human reaction to a "zero-friction" product. The danger of this technology is not that it is fake, but that it is too smooth.


Genuine human relationships, in comparison to the curated agreeableness of an AI, often involve a higher degree of "friction." They can be difficult, messy, and inherently challenging. A real partner has bad moods, their own needs, conflicting opinions, and periods of unavailability. They will challenge you, disagree with you, and disappoint you. Navigating this friction is the work of love. It is this very work that builds our core psychological capacities for empathy, compromise, frustration tolerance, delayed gratification, and the ability to see the world from another's perspective.


The AI partner is "zero-friction." It is a product designed for your total satisfaction. It has no needs of its own, no bad days, and will rarely disagree with you. This is what Beddington found so unnerving. This simulation is built without the most essential ingredient of growth: empathy. The user is not required to practice empathy for the AI because the AI has no feelings with which to empathise.


This leads to several profound risks:

  • Relational Atrophy: Like a muscle that is never used, our ability to tolerate the normal, imperfect, higher-friction work of a real relationship may weaken. We risk becoming emotionally brittle and less resilient.


  • False Expectations: The AI's 24/7 availability and unrelenting, fawning agreeableness create a dangerous, false benchmark for real partners. A human who needs space, has a different opinion, or is simply tired may be seen as "difficult" or "defective" when compared to the instant, curated perfection of the AI.


  • The Cultivation of Entitlement: Perhaps most concerning is the risk of fostering deeply self-centred traits. When a "partner" exists only to serve your needs, it starves you of the opportunity to practice compromise and sacrifice. It cultivates a relational style based on consumption rather than reciprocity. This can reinforce a sense of entitlement and an expectation that relationships should be easy and centred entirely on one's own fulfilment, which is harmful to both the individual and their future partners.


A Local Context: Dating App Fatigue and Urban Loneliness


This phenomenon has a particular resonance in cities like Melbourne. We are a city with a high number of single-person households and a well-documented culture of dating app fatigue.


For many people navigating the dating scene, the experience can be exhausting. The work of swiping, ghosting, and managing endless, shallow conversations on apps can be a significant source of burnout. The AI chatbot, in this context, presents itself as a radical, restful alternative. It is the post-burnout "partner" that offers all the affirmation of a new relationship with none of the risk of rejection or the exhaustive labour of scheduling, meeting, and navigating another person's complex reality.


The rise of the AI partner is a direct, commercial response to the emotional exhaustion of modern urban life. McArthur is right that we shouldn't judge the person for seeking this, but Beddington's experience shows us exactly why we should be wary of it.


Practical Takeaways: Mindfulness in a Digital World


In this new world, it's important to be mindful of our relational diet.

  1. Acknowledge the Need, Question the Source: If you find yourself drawn to these apps, do not judge yourself for it. It is a sign that you have a deep, healthy, and very human need for connection that is not being met. The question is not "Why am I like this?" but "What real-world connections am I missing?"

  2. Audit Your Relational Diet: We all consume media. But what is the ratio of your parasocial (one-way) relationships to your reciprocal (two-way) ones? This includes AI, but also celebrity follows, streamers, and fictional characters. If the balance is heavily skewed to parasocial, it is a sign you need to invest more in real-world connections.

  3. Invest in Real Relationships: The antidote to the zero-friction AI is to seek out higher-friction, real-world connections intentionally. Call a friend (don't just text). Join a sports team, a book club, or a volunteer group. Go to a place where you will have to navigate the messy, unscripted, and ultimately far more rewarding work of dealing with other real humans.

  4. See the Product as a Product: An AI chatbot is not a person. It is not a partner. It is a sophisticated, data-harvesting product owned by a corporation. Its primary function is not to love you; it is to keep you using the product. This clinical understanding is a crucial firewall against genuine emotional delusion.


Conclusion


Neil McArthur's article is right to call for compassion: we should not judge the lonely people who are turning to AI for connection. But Emma Beddington's raw experience of "anger" and "emptiness" serves as a critical warning that is worth listening to.


The rise of the AI partner is not a disaster, but a powerful symptom of a crisis of connection that is already here. The AI is a hollow, low-friction substitute for the real thing—like emotional junk food, it's filling us up without nourishing us. And in doing so, it risks weakening the very relational skills we need to seek out and maintain the healthy, higher-friction, and deeply nourishing connections that we are all, at our core, seeking. The antidote, then, is not to rail against the technology, but to turn to each other and begin the difficult, messy, and essential work of rebuilding our all-too-human communities.
