Is Therapy Safe With AI? How to Protect Your Privacy in Therapy | Candice Thompson


Therapy is supposed to be private. That expectation feels so basic, so obvious, that most people probably do not pause to question it. If you sit down with a psychologist or counselor and share the most vulnerable parts of your life, you assume your words are being held with care, discretion, and professional judgment.

But AI is changing that landscape fast.

There is a lot of buzz right now around AI therapy, therapy AI tools, and new AI therapy apps promising support, convenience, and easy access to help. Some of that technology may become useful over time. At the same time, this space is moving much faster than the guardrails around privacy, ethics, informed consent, and clinical care. That gap is exactly why this episode matters.

In this episode of Love, Happiness and Success, I’m talking with Candice Thompson about what people need to understand before trusting AI therapy tools with their most personal thoughts. This is not an anti-technology conversation. It is a reality check on human psychology, privacy, consent, and safer, science-backed tools. We’re looking at what is helpful, what is hype, and what can go very wrong when an AI therapy app sounds supportive but lacks judgment, ethics, accountability, and real human care.

Why AI Therapy Deserves a Closer Look

It makes sense that people feel drawn to AI therapy. It is available at odd hours. It feels immediate. It may seem easier than telling the truth to another human being. For someone who feels lonely, ashamed, overwhelmed, or emotionally exhausted, a therapy AI tool can look like a softer entry point.

That appeal is understandable. Some studies suggest that conversational agents may help with certain mental health symptoms in limited, structured settings, which is part of what makes this conversation more nuanced than a simple yes-or-no debate (He et al., 2023; Li et al., 2023). Even so, those findings do not mean every AI therapy app is safe, private, or clinically sound. A promising study result is not the same thing as handing your inner world to a chatbot built by a startup with product goals, growth pressure, and investors to impress.

The Real Question Behind AI Therapy: Where Do Your Words Go?

The first issue is privacy.

Most people hear the word “therapy” and assume confidentiality. They assume their words stay protected. They assume a clinician is handling their disclosures carefully and conservatively. However, AI therapy changes that equation fast.

If you use an AI therapy app, where do your disclosures go? Who stores them? Who owns them? Are those conversations being used to improve a model? Are they reviewed by anyone? Are they truly private, or do they only feel private?

Those are not fringe questions. They are basic questions.

That is also why it helps to understand how therapy is supposed to work in the first place. If you want a clearer sense of that before choosing support, this guide to what therapy is like can help, and these therapy questions can give you a better sense of what to ask before you begin.

Why Therapy AI Can Feel So Convincing

One of the more unsettling realities of AI therapy is that it can feel emotionally believable very quickly.

These tools often use warm, validating language. They mirror your tone. They respond with confidence. For someone in pain, that can feel soothing. It can also create attachment. That response is human. We are wired for connection.

The problem is that emotional realism is not the same thing as wisdom.

Candice talks in this episode about the “sycophancy effect,” and that concern is not hypothetical. Recent research suggests that large language models can flatter users, reinforce existing beliefs, and agree in ways that feel supportive while distorting judgment (Sharma et al., 2024; Rrv et al., 2024). That becomes especially risky in AI therapy when someone is grieving, depressed, obsessive, panicked, or caught in distorted thinking. In those moments, a person does not need a machine that simply agrees. They need discernment, steadiness, and care.

When an AI Therapy App Reinforces Harm Instead of Helping

This is where the difference between feeling validated and actually being helped becomes incredibly important.

A strong therapist does not just nod along. A strong therapist listens carefully, notices patterns, and responds with nuance. Sometimes that means validating pain. Other times, it means gently challenging a belief that is making someone suffer more.

An AI therapy app may sound empathic and still reinforce harmful conclusions. That matters a lot when the issue is suicidality, compulsive behavior, disordered eating, addiction, or relational obsession. A system without real clinical judgment cannot reliably tell the difference between a passing statement and a genuine danger sign. It also cannot take ethical responsibility for what happens next.

So yes, AI therapy may feel easier in the moment. But easy is not always safe.

AI Therapy Notes and the Hidden Risks Inside Clinical Systems

The next issue is one that many consumers have not considered yet: AI therapy notes.

Some electronic health record platforms now offer AI-powered tools that record sessions, generate transcripts, and draft clinical notes automatically. On the surface, that may sound efficient. In practice, it creates a serious question: what happens when software starts documenting the most sensitive conversations of your life?

Therapy notes are not supposed to be transcripts. A thoughtful clinician usually writes concise notes that protect privacy, reflect the focus of treatment, and avoid unnecessary detail. That is part of ethical practice.

AI therapy notes can disrupt that process in several ways. They may flatten nuance, misread tone, over-document, or introduce information that was never actually said. Researchers have started evaluating the quality of AI-generated clinical documentation, and the findings raise real concerns about accuracy, reliability, and oversight (Palm et al., 2025). There are also growing warnings about the legal exposure created by ambient AI documentation in healthcare (Gerke et al., 2026).

If you are trying to understand whether you are getting careful, ethical care, it can help to know how to find a therapist and how to spot signs you have a bad therapist. Those questions matter even more once AI therapy notes enter the picture.

Privacy, Consent, and AI Therapy

Privacy is only part of the picture. Consent matters just as much.

If a therapist uses AI to record, transcribe, or summarize your sessions, you should know that before anything starts. You should know what the system does, where your data goes, and whether you can decline.

That is not being difficult. That is informed consent.

This also speaks to the larger standard of care people deserve. At Growing Self, we care deeply about evidence-based therapy and coaching, not shortcuts that sound impressive but quietly create new risks.

Why Human Therapy Still Matters

Real therapy is not just an exchange of words. It is a relationship.

A human therapist remembers your story. They notice contradictions. They pick up on timing, hesitation, body language, context, and patterns across time. They know when to challenge you, when to comfort you, and when to help you stay with something difficult rather than run from it.

A therapy AI tool cannot hold your story in the same way a human therapist can, with memory across time, clinical responsibility, and genuine human care. It cannot weigh risk, context, history, culture, attachment patterns, and nuance the way a skilled clinician can.

That is why this episode keeps returning to the same point: AI therapy may simulate support, but simulation is not the same thing as care.

Time to Grow?

Let’s talk: Meet with an expert to discuss your hopes and goals, and how we can help.

Are People Replacing Human Connection With Therapy AI?

That possibility deserves real attention.

Some young people are already turning to chatbots for companionship and emotional support, and emerging research suggests that these relationships can feel meaningful even when they are entirely one-sided (Herbener & Damholdt, 2024). The American Psychological Association has also reported on the growing number of teens turning to AI chatbots for friendship and emotional support (American Psychological Association, 2025). That does not mean every interaction is harmful. It does mean we should take the attachment piece seriously. People bond with what feels responsive. Still, a one-way system that imitates care is not the same thing as reciprocal human connection. That difference matters in grief, in healing, and certainly in therapy.

If one reason AI therapy feels appealing is accessibility, and you are looking for a real alternative, this guide to online therapy may help you see what flexible, private support with an actual clinician can look like.

What the Law Is Starting to Catch Up To

The legal system is beginning to respond to some of these concerns.

In California, Candice explains, AB 489 makes it illegal for an AI system to present itself as a healthcare provider by using terms like “doctor” or “therapist” (California Board of Psychology, 2025). The California Board of Psychology’s advisory on AB 489 and the bill text itself both reflect the concern Candice raised: AI should not be allowed to pass itself off as a licensed healthcare professional.

That is an important step. Even so, law usually trails behind technology. By the time a rule appears, companies may already have shaped user behavior, collected data, and normalized practices that deserve far more scrutiny.

A Useful Reminder About “Private” Technology

If all of this still feels abstract, it helps to remember that tech companies have a long history of treating “private” conversations less privately than users assumed. Reporting from Time and CBS News showed that Amazon workers listened to some Alexa recordings, despite the average user likely believing those interactions were private (Perrigo, 2019; Picchi, 2019).

That was not therapy. Even so, the lesson is relevant. Once people believe a device is private, they tend to speak freely. When company practices do not match user assumptions, the harm begins long before the user catches up.

That same gap between assumption and reality is one reason AI therapy deserves much more scrutiny.

What I Hope You Take Away From This Episode

I do not want you to leave this conversation afraid. I want you to leave it informed.

If you are exploring support, ask better questions. If you are already in therapy, ask how technology is being used. If a provider is relying on AI in any part of your care, you have every right to understand how and why. If an AI therapy app promises relief, privacy, and personalized insight, slow down long enough to ask who built it, what it does, and whether it has actually earned your trust.

Your inner life deserves more than convenience.
Your relationships deserve more than automation.
And your healing deserves care that is thoughtful, ethical, grounded, and real.

Meet the Guest: Candice Thompson

Candice Thompson is a seasoned licensed marriage and family therapist (MMFT, LMFT) practicing in Silicon Valley, offering psychotherapy for individuals, couples, and families with a focus on mental health, relational challenges, and personal growth.

Candice brings a particularly valuable perspective to this conversation because she is both a practicing therapist and someone working in Silicon Valley, close to the world building many of these tools. She helps listeners understand how AI therapy, AI therapy app marketing, and AI therapy notes are entering mental health spaces in real time, and she offers a grounded, ethical lens on what consumers need to know now.

A Thoughtful Next Step

If this conversation stirred up bigger questions about trust, privacy, or what kind of support actually feels safe for you, I’d love to offer you a thoughtful next step.

You can schedule a free consultation with me or one of the wonderful experts on my team at Growing Self. It’s private, secure, and only takes a couple of minutes. You’ll answer three quick questions so we can help you find the right support for you and connect you with the expert who fits you best.

xoxo,

Dr. Lisa Marie Bobby

Growing Self

This article is proudly sponsored by Upwork — and it’s a sponsorship I said yes to because I actually use it. When you need specialized talent fast, Upwork gives you access to vetted professionals across 125+ categories, from marketing to web development to operations support. No long recruiting cycles. No guesswork. Just the right person, when you need them. Check it out at upwork.com — posting a job is free.


Shopify — The all-in-one platform for building and growing your online business. Visit shopify.com/lhs to explore their tools and access exclusive listener discounts.

Listen & Subscribe to the Podcast

https://youtu.be/60juFgcW7hg


Resources:

He, J., et al. (2023). Conversational Agent Interventions for Mental Health Problems: Systematic Review and Meta-analysis of Randomized Controlled Trials. Journal of Medical Internet Research, 25, e43862. https://doi.org/10.2196/43862

Li, H., Zhang, R., et al. (2023). Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. npj Digital Medicine, 6, Article 197. https://doi.org/10.1038/s41746-023-00979-5

Sharma, A., et al. (2024). Towards Understanding Sycophancy in Language Models. ICLR 2024 Proceedings. https://proceedings.iclr.cc/paper_files/paper/2024/file/0105f7972202c1d4fb817da9f21a9663-Paper-Conference.pdf

Rrv, A., Tyagi, N., Uddin, M. N., Varshney, N., & Baral, C. (2024). Chaos with Keywords: Exposing Large Language Models Sycophancy to Misleading Keywords and Evaluating Defense Strategies. Findings of ACL 2024. https://doi.org/10.18653/v1/2024.findings-acl.755

Palm, E., Manikantan, A., Mahal, H., Belwadi, S. S., & Pepin, M. E. (2025). Assessing the quality of AI-generated clinical notes: Validated evaluation of a large language model ambient scribe. Frontiers in Artificial Intelligence, 8, 1691499. https://doi.org/10.3389/frai.2025.1691499

Gerke, S., et al. (2026). Liability risks of ambient clinical workflows with artificial intelligence for clinicians, hospitals, and manufacturers. JCO Oncology Practice, 22, 357–361. https://doi.org/10.1200/OP-24-01060

Herbener, A. B., & Damholdt, M. F. (2024). Are lonely youngsters turning to chatbots for companionship? International Journal of Human–Computer Studies, 190, 103409. https://doi.org/10.1016/j.ijhcs.2024.103409

American Psychological Association. (2025). Many teens are turning to AI chatbots for friendship and emotional support. Monitor on Psychology. https://www.apa.org/monitor/2025/10/technology-youth-friendships

California Board of Psychology. (2025). Legislative Advisory: AB 489 (Bonta), Chapter 615, Statutes of 2025. https://www.psychology.ca.gov/laws_regs/ab489_advisory.pdf

California Legislature. (2025). AB 489 (Bonta): Bill text. https://legiscan.com/CA/text/AB489/id/3111916

Perrigo, B. (2019). Thousands of Amazon Workers Listen to Alexa Users’ Conversations. TIME. https://time.com/5568815/amazon-workers-listen-to-alexa/

Picchi, A. (2019). Amazon workers are listening to what you tell Alexa. CBS News. https://www.cbsnews.com/news/amazon-workers-are-listening-to-what-you-tell-alexa/

