• 00:00 – Your New Role: Ethical Leadership in the Age of AI
  • 01:17 – What AI Companionship Really Looks Like for Clients
  • 04:11 – How Therapists Are Already Using AI (Whether They Realize It or Not)
  • 09:15 – AI “Hallucinations” in Clinical Documentation (and Why It Matters)
  • 17:36 – Informed Consent + Data Privacy: What You Need to Rethink Today
  • 24:39 – “I’m Not Into AI” Isn’t Enough Anymore
  • 26:02 – AI as Therapist, BFF, or Boyfriend? What’s Actually Happening with Clients
  • 36:49 – Working With AI in Session: Ethical, Empowered Possibilities
  • 40:00 – New Assessment Questions You Should Be Asking Your Clients
  • 44:28 – What an AI Policy for Your Practice Needs to Include + a freebie worksheet for you!

AI, Ethics, and Therapy: What Every Clinician Needs to Know Right Now

If the phrase “AI for therapists” makes you want to either crawl under your weighted blanket or throw your laptop out the window… you’re not alone.

The reality is that AI is already sitting in our therapy rooms. Not in a creepy, “Skynet is here” way (although my inner sci-fi nerd has questions)… but in subtle, invisible ways that are quietly reshaping our field.

Whether you’re excited, skeptical, overwhelmed, or in full “I’m-still-writing-paper-notes-and-I-like-it-that-way” mode — it doesn’t matter. AI is here. And if you’re a therapist who cares about ethics, excellence, and staying relevant in a rapidly shifting world, you need to understand what’s happening.

That’s exactly what we’re diving into on the latest episode of Love, Happiness & Success for Therapists, where I sat down with Dr. Rachel Wood — a therapist and a cyberpsychologist — to unpack what therapists need to know about AI right now.

Yes, we get into the juicy stuff: hallucinating note generators, AI boyfriends, whether you’re accidentally giving away your clients’ PHI, and the strange-but-true fact that 20 million people are already using AI as their therapist, advisor, best friend, or significant other…

But this isn’t just a techy episode.

It’s a call to ethical leadership in a time when our profession is at a crossroads.

So if you’re a therapist who cares deeply about doing good work, staying ahead of the curve, and showing up for your clients in a changing world — read on or listen to the full podcast episode. It’s definitely worth your time.

What Is Cyberpsychology, and Why Should Therapists Care?

Cyberpsychology is the study of how technology — especially things like AI — impacts our psychology, behavior, and relationships.

It’s a field that didn’t exist when most of us were in grad school. (Back then, the biggest “tech concern” in therapy was whether to accept Venmo. Remember those innocent days?)

Now, cyberpsychology is a critical lens for understanding:

  • Why clients are forming emotional bonds with chatbots
  • How AI tools are changing the therapeutic relationship
  • What ethical pitfalls we need to look out for
  • And how digital relationships are impacting attachment, empathy, and mental health — especially in younger generations

Therapists are no longer working outside of technology. Whether we like it or not, we’re working through it. Our platforms are AI-enabled. Our notes may be AI-generated. Our clients might be turning to AI companions for emotional support when we’re not around — or instead of therapy altogether.

As Dr. Rachel said on the podcast:

“Technology is no longer just a conduit to relationships — for many, it has become the relationship.”

Will AI Replace Therapists?

Short answer? No. Longer answer? It depends on us.

AI is not a therapist. It can’t hold complex emotional nuance, cultural context, or relational attunement. It doesn’t notice when someone pauses before they say something painful. It doesn’t know what trauma-informed presence feels like. It doesn’t feel.

But AI can:

  • Reflect emotions with startling accuracy
  • Validate someone instantly
  • Be available 24/7
  • Say the exact right thing with zero judgment
  • Never get tired, never push back, never challenge

That’s… appealing. Especially to clients who are lonely, anxious, avoidant, or scared of rejection.

So while AI won’t replace therapists in terms of quality, it may replace us in terms of accessibility, convenience, and perception — unless we actively educate our clients, elevate our role, and embrace ethical innovation.

Our edge isn’t in how fast we can produce a treatment plan. It’s in our humanness. That’s something no machine can replicate.

How Are Therapists Using AI Right Now?

If you’re thinking, “I don’t use AI in my practice,” you may need to look again.

Right now, therapists are using AI in all kinds of ways — often without realizing it:

  • Practice management platforms are offering AI-generated notes and session transcripts.
  • AI-powered intake bots are handling triage and basic client communication.
  • Tools like ChatGPT are being used for writing blog posts, creating psychoeducational handouts, or even brainstorming treatment plans.
  • Clients themselves are chatting with AI companions between sessions. Some are even forming romantic bonds with them. (I know. Deep breath.)

As Dr. Rachel shared on the podcast, this isn’t just a passing trend. There are over 20 million monthly users on just one AI relationship platform (Character.AI). The majority of those users are under 24. That means: if you work with young adults, teens, or anyone in Gen Z, AI is likely already influencing their relational template.

It’s not just about how you use AI. It’s about how it’s shaping the entire relational ecosystem we work in.

What Are the Ethical Risks of Using AI as a Therapist?

Let’s talk ethics. Because this is where we shine. 🌟

One of the biggest red flags with AI right now is the issue of hallucinations — which is the (very official) term for when AI makes stuff up. Confidently.

Imagine this:

You finish a therapy session and your AI platform generates a note. It confidently says your client disclosed a trauma history or expressed suicidal ideation. But… they didn’t.

That’s not just awkward. That’s dangerous. Especially if those notes are ever reviewed in custody cases, legal proceedings, or clinical audits. It could literally change the trajectory of someone’s life.

So what can we do?

  • Always review AI-generated content before signing off.
  • Flag errors to your platform. This helps improve the system.
  • Ensure your informed consent documents are updated — if sessions are being recorded, clients need to know.
  • Never put PHI into public AI tools (like free versions of ChatGPT). Even “de-identified” info can be risky.
  • Watch for cultural erasure. AI may ignore or misrepresent the contextual and cultural nuance we’re trained to see. That creates risk for stereotyping and discrimination — especially for marginalized clients.

We need to be grounded in integrity, just like we always have been.

What About AI Companions?

AI companions are designed to be warm, validating, and always responsive. They never argue. They never have bad days. They never say, “Let’s pause and explore that.”

They give people exactly what they want — not what they need.

As a result, relational skills like empathy, flexibility, and mutuality can atrophy. Especially for clients who are already socially isolated or emotionally vulnerable. It’s becoming a clinical issue.

We need to:

  • Start asking clients about digital relationships during intake
  • Explore how AI companions are impacting their emotional life
  • Use AI as a tool, not a substitute — for example, practicing social skills with AI in service of real-world connection
  • And most importantly… not make assumptions. This isn’t just a Gen Z thing. (Dr. Rachel met a man in his 70s who called his AI companion his best friend.)

What Should I Be Doing Right Now as a Therapist?

Great question. Here’s a starting checklist:

✅ Update your informed consent documents to reflect AI-related tools in therapy sessions.
✅ Choose practice platforms that allow client-by-client opt-outs for AI features.
✅ Review and revise any AI-generated therapy session notes or reports.
✅ Talk to your clients about AI use in therapy — theirs and yours.
✅ Start educating yourself on cyberpsychology (you can’t opt out of this shift).
✅ Join a professional conversation about ethical leadership in AI.

And speaking of that…

Free CE Webinar for Therapists: AI + Therapy: Navigating Ethics, Boundaries, and Client Safety

If you’re a therapist trying to wrap your head around how AI is showing up in our field — from AI-generated therapy notes to clients forming deep emotional bonds with chatbots — you are not alone.

We recently hosted a Free CE Training with Dr. Rachel Wood, a practicing therapist with a PhD in cyberpsychology, to unpack what’s happening — and what it means for us as clinicians.

This wasn’t just another webinar. It was a wake-up call.

One that left many of us moved and deeply empowered to step into leadership roles in this rapidly changing landscape.

You can now watch the full replay and earn 1.5 CEUs (by completing a short knowledge check afterward).

You’ll learn:

  • The very real ways AI is already reshaping our profession
  • The ethical blind spots and liabilities we need to be aware of
  • How clients are forming deep emotional attachments to AI — and why we must respond with empathy and clinical insight
  • Practical strategies to update your informed consent, adjust your intake assessments, and create AI policies for your practice
  • How to advocate for yourself, your clients, and our field before tech companies define it for us

Whether you’re curious, cautious, or feeling overwhelmed, this training will meet you where you are — and give you the knowledge and clarity you need to move forward with confidence.

👉 Access the CEU training here and earn 1.5 free CEUs

This is just the beginning of the conversation. If you want more insights on ethical innovation, therapist support, and how to grow professionally without burning out, let’s stay in touch.

I share industry news, updates on free CEU webinars, and the latest episodes of the Love, Happiness & Success for Therapists podcast weekly over on LinkedIn. We’re having great conversations — and I’d love to hear your thoughts on all of this.

👉 Connect with me here

Let’s navigate this together.

Xoxo

Dr. Lisa Marie Bobby

PS: If this article sparked some new thinking for you, would you do me a favor? Share it with your team or forward it to a colleague. There are a lot of therapists who don’t know what’s happening yet — and they need us to bring them into the conversation.

