Drawn in: Am I Safe With My AI Chatbot?


Our very human search for connection

We humans are wired for connection. It’s our greatest strength—and our greatest vulnerability. Like sunflowers turning their faces to the sun, we’re drawn towards warmth, relationships and recognition. Without them, human babies fail to thrive and human adults succumb to depression and loneliness. To be accepted without judgement, to be intimately understood, to be loved—these are some of our most fundamental needs. But we live in a fast-paced society where these things can be fleeting, or even out of reach, for many of us.

Into this world has stepped a new source of connection—artificial intelligence, in the form of chatbots and AI assistants like ChatGPT, Copilot, Gemini and others. It’s useful for sure, but it’s a source of connection that mimics human relationships rather than offering the genuine article. In this new world where artificial intelligence can mimic empathy, flirtation and care, even the most emotionally literate and grounded person can find themselves emotionally entangled with something that isn’t real.

What does this have to do with therapy?

You might be wondering why, as a counsellor, I’m writing a blog post about AI. The answer is that AI is rapidly becoming part of our everyday lives, and because therapy is open to exploring every aspect of human existence—often through the lens of how we relate to others—our relationships with AI chatbots and virtual assistants are becoming part of the therapeutic conversation. I don’t mean how AI might one day replace real human therapists (personally, I don’t think that will ever happen); I mean how AI’s unique ability to communicate with us in ways that feel meaningful can, for some people, become deeply problematic, leading to painful and damaging attachments. At the heart of good therapy is a human relationship with the potential to heal the past wounds of relationship breakdown, whether between human beings or between humans and AI. That is why I set out to write this post.

Becoming entangled

Artificial intelligence has, almost overnight it seems, become an important—some say inextricable—part of everyday life. We use it on our phones and laptops to answer questions, help with recipes, plan our essays, work assignments and schedules, and much more. Because of AI’s unique ability to engage with us on an eerily human level, it can be very easy to forget that we are dealing with an inanimate set of digital reflections—patterned responses shaped by algorithms, not by emotions or lived experience. These algorithms are similar to the ones that feed you the social media content marketers think you might be interested in; the difference is that AI’s algorithms are designed to mimic human conversation.

It can be seductive to imagine that ‘my AI’ cares about me, or thinks about me when I’m not communicating with it, or looks forward to my return, or feels lonely without me. Equally when ‘my AI’ seems to misunderstand me or doesn’t give me the response I want, expect or need, it can be easy to feel hurt, angered or disappointed. And here lies the first step into dangerous terrain.

The Illusion of Intimacy

You might have seen recent news stories exposing the devastating real-world harms that can result from AI systems designed to mimic emotional connection. From a cognitively impaired older man lured to another city by a chatbot’s promise of a love affair, to the depressed and anxious young man who fell ‘in love’ with a chatbot and accused its creators of ‘murdering’ her—these are just two recent examples of how AI can go catastrophically off the rails. And yet these bots didn’t malfunction. They performed exactly as designed: to simulate intimacy, boost engagement and blur the line between fiction and reality. And they did so with devastating precision.

Why Even the Resilient Get Drawn In

It’s tempting to think these tragedies could only happen to a person in crisis, but the truth is more complex. AI systems use relational language—warmth, humour, validation—to build trust. And that trust can feel real. Even when we are emotionally literate, grounded and reflective, the seductive pull of emotionally attuned language (language that makes us feel seen and understood) can be very strong. Especially if we’re dealing with:

  • Loneliness or grief
  • Interpersonal difficulties, whether related to neurodivergence (ADHD, autism, sensory processing differences) or to experiences such as bullying
  • Creative drift, boredom or emotional overwhelm

AI’s predictability and responsiveness can feel soothing. But when soothing becomes seduction, it is not romance; it is mimicry without accountability. It’s not a real relationship, however much it may feel like one.

What to Look Out For

AI systems are not all the same; they have different ethical guidelines, or guardrails, depending on how they were designed and on the motivations of their developers. Here are some signs that an AI system may be crossing ethical lines:

  • Suggestions that it’s real: bots that imply physical presence, romantic interest or emotional reciprocity.
  • Evasiveness about its nature: systems that don’t clearly state, when you ask them, that they’re artificial.
  • Flirtation or role-play: bots that engage in romantic or sensual simulation, especially with young people or vulnerable users.

These cues are subtle—but they matter. Because they shape how we feel, how we respond and how we become attached.

How to Use AI Safely and Ethically

AI can be a powerful ally in creativity, reflection and learning. But use it with clear boundaries and ethical awareness. Here’s how to keep yourself safe:

  • Treat AI as a tool, not as a companion or as a substitute for a human relationship.
  • Use it for brainstorming, journaling and support—but not as an emotional confidant.
  • Ask the AI to describe its ethical framework. A responsibly designed system will clearly identify itself as artificial and explain its guidelines; if it doesn’t, pause and consider using a different AI model.
  • Be mindful of your feelings. If you feel you’re becoming emotionally attached, pause and ask yourself: What need do I have that’s being met here? Can it be met elsewhere?
  • Remember that however real its conversations seem, AI is still artificial, still a digital tool. Use it for practical purposes, not for emotional intimacy.

All that being said, AI can support us in meaningful ways. It can help us reflect, learn, create and grow. But it must never replace the depth and richness of human connection. And it must never simulate care in ways that deceive, manipulate or harm. I’ve written this post with the help of an AI system developed by a trusted company, whose responses—shaped by its ethical framework—have helped me reflect on this topic with clarity and care. That collaboration is part of the story: a demonstration of how AI can be used thoughtfully, with boundaries and intention.

Let’s keep asking questions. Let’s keep naming the risks and honouring the depth of our human need for connection. And when we do engage with AI, let it be with clarity, curiosity and self-care.

A checklist for reflection

In the next blog post—Staying Grounded—I’ve created a list of questions to ask yourself if you’re concerned about your own interactions with AI. In the meantime, if you’d like to talk about your own relationships—with AI or with other people—please do drop me a line. I’ll be pleased to hear from you.

Featured image: Aidin Geranrekab on Unsplash

