Radio Front Desk

Can ChatGPT be a therapist? What AI means for the future of therapy

Jane.app Season 2

We know AI can give us answers in seconds. But can it really heal us? 

Therapy has never been more normalized… or more out of reach. And as waitlists grow and costs rise, AI has quietly become a place people go when they don’t know where else to turn. But what if it’s more than just information and convenience that people are seeking out in AI?

We asked four therapists how they feel about their clients turning to AI for answers in an article from Front Desk Volume 6. And now, we’re turning it into a special narrative episode to dive even deeper into this topic. 

They explore why AI can sometimes feel safer than a human, how therapists are helping clients use it more intentionally, and what they’re hearing from colleagues and other practitioners across the field. It’s a fascinating look at an industry in transition, and what the future of care might hold.

What You’ll Learn

  • Why so many people are turning to AI for comfort, clarity, or “quick therapy”
  • The emotional risks and rewards of getting answers at the speed of a search bar
  • What therapists fear AI might replace (and what it absolutely can’t)
  • How clinicians are integrating AI thoughtfully, from note-taking to between-session support

Read this article in Front Desk Volume 6

Radio Front Desk is Brought to You by Jane

We like to make sure that when we tell you more about Jane, it’s super helpful. Here’s one we think you might like: 

The therapists in this episode told us they don’t want AI to replace care, but they do see its potential to support the people who provide it.

That’s exactly what Jane’s AI Scribe does. By securely listening during your session and drafting your notes for you, AI Scribe takes care of the paperwork so you can focus fully on taking care of your patients. Designed thoughtfully and helpfully, AI Scribe was built to support the work you do, not replace it.

Learn more about AI Scribe


Disclaimer: This podcast is for informational purposes only and should not be considered as professional medical, legal, or financial advice.

The views and opinions expressed in this podcast are those of the guests and do not necessarily reflect the official policy or position of the podcast host or its affiliates.

SPEAKER_02:

Repeat after me. AI is not my therapist. Hey, I'm Denzel Ford, and this is Radio Front Desk by Jane. That line you just heard, AI is not my therapist, came from psychotherapist Roxanne Francis, who was speaking at Jane's company retreat. As you heard, it got a laugh. But she was also serious. People aren't just using ChatGPT to write emails or plan vacations anymore. They're asking it how to handle their breakup, how to manage anxiety, or how to feel less alone. So we started asking questions. Why are people turning to AI therapy? What does that mean for therapists? And if this technology isn't going away, how can we use it responsibly, ethically, and maybe even helpfully? Front Desk magazine's managing editor and a registered psychotherapist (qualifying), Vasiliki Maripas, spoke with four clinicians about this: Roxanne Francis, Amanda Buteris, Carolyn Solo, and Ilana Ciras. Their thoughts became a feature in Front Desk Volume 6. But today, we're going deeper into what they said between the lines.

SPEAKER_04:

People are turning to therapy more than ever before. But the challenge is that um therapy is not accessible to everybody. It's expensive. A new therapist right out of school is gonna charge you at least $150, right? And it can go up to $250 depending on where you live, right? And if you don't have benefits or you don't have a great disposable income, then that's a problem. So it's either pay out of pocket, have your benefits pay for therapy, or the community-based options, where there's a six-month wait list.

SPEAKER_02:

That was Roxanne again. She's been noticing how therapy has become both more normalized and less accessible at the same time. But AI, with its 24-7 availability and low barrier to entry, has quietly started filling that gap.

SPEAKER_04:

In the era of technology, where everybody has a phone, you can access the free version of ChatGPT or whatever other AI software that's out there. We're gonna throw those questions out at them because you don't have to get up and leave your house.

SPEAKER_02:

Therapy isn't always accessible. And when people find something that feels cheaper or faster, they'll use it.

SPEAKER_03:

I also think that we've now been conditioned and our brains have evolved to really expect instant gratification.

SPEAKER_02:

That's Ilana Ciras, a clinical counselor and the director of a large group practice in Vancouver.

SPEAKER_03:

If we know there is this immediate outlet, I think we're just gonna use it. And so where something's bothering me, you know, maybe there was a time where I might sit with it, I might call a friend, wait until my next therapy appointment. Our brains have changed and our expectations have changed.

SPEAKER_02:

ChatGPT gives us answers so fast that we don't have to work for them. We don't have to wrestle with the question or sit in the not knowing. When support arrives at the speed of a search bar, we risk losing something important. The ability to sit in discomfort long enough to grow from it.

SPEAKER_03:

For people who are used to getting answers at their fingertips, um, finding out what their friends are doing right this very minute, the things like ambiguity and uncertainty and frustration become intolerable.

SPEAKER_02:

For some people, what feels convenient also feels safer. It's not just about access or money, it's about risk. The risk of being seen, misunderstood, or judged, even by somebody trained not to.

SPEAKER_04:

Sometimes people don't, and this is the part that gets just a touch deeper. As non-judgmental as therapists show up, sometimes people are concerned that they're gonna think badly about me. Or they, you know, they say they're neutral, but if I spew this into the technology, then there's no face, there's no sense of surprise. I don't have to worry about whether or not this person votes like me or dates like me, you know.

SPEAKER_02:

That made me wonder if we're using AI because it feels safer, are we also robbing ourselves of a truly therapeutic experience? And does it give people a false sense that they're doing the work when they're really just avoiding discomfort? We asked Carolyn Solo. She's an EMDR therapist who works with people carrying complex trauma, and she says that therapy isn't just about collecting insights, but about the work that happens between two people. Work that requires vulnerability, honesty, and repair.

SPEAKER_00:

I think people are lonely and it feels like you're talking to someone. And like the more you input into your AI, the more they get to know you. Whatever that means, you know, get to know you. But it is illusory, right? Because it doesn't have to be reciprocal in the same way where you are invested in caring for another person, being aware of another person's frailties. Therapy is vulnerable, it actually does take work on the part of the client, and just inputting a question to ChatGPT doesn't take work, right? It doesn't demand much of you. One of the important things that therapy does do for clients, the most healing thing, I think, is it's a real relationship which requires vulnerability, which requires you to engage in rupture and repair. Like I could see in some ways how something like ChatGPT could attempt to replicate that, but it's not gonna be the same.

SPEAKER_02:

When you think about it, for therapists, AI doesn't just challenge how clients get help, it actually challenges what it means to be a helper. And that's where psychologist Amanda Buteris finds herself. She told us that there's still a quiet undercurrent of fear. If clients can just ask AI for five grounding techniques, where does that leave them?

SPEAKER_01:

Well, if people are just going to ChatGPT or even Google, but especially ChatGPT, to ask what skills, why is this happening? If they're essentially using that in like an information-gathering, skill-building way, that is a threat to the therapists who love talking about skills and let's like connect dots in this way, to an extent. I think for the therapists who do very experiential type of work, and especially once we get multiple people in the room, like couples and families, it's a little different.

SPEAKER_02:

For a lot of people, therapy starts with learning the what-to-do part, things like how to ground yourself or set boundaries. And while those things do matter, they're also the easiest part to replicate. As Carolyn told us, AI can hand you a list of coping strategies faster than you can open your notes app.

SPEAKER_00:

As a therapist, I think the easiest thing to do is to just give people a list of coping skills. Like that's not hard. And AI can replicate that with no problem. What is always underneath that is most people are aware it's a good idea to go for a walk if they're feeling stressed, yet they don't do it. So that's the part that is more interesting to me as a therapist, like what's getting in the way of making these changes that might be really supportive.

SPEAKER_02:

AI can theoretically teach you the skills, but it can't help you use them in the heat of the moment when emotions take over or life gets messy. Real human therapists know that there's more to healing than just tools and a list of steps. The real work is in the relationship that helps you actually feel safe enough to use them.

SPEAKER_04:

I am not super concerned about AI replacing me because it can't get down on the floor and color with a six-year-old who's had trauma. Right? Um last week I was talking with a client who was struggling with infant loss. And I could tap into my own experience with that.

SPEAKER_02:

In other words, AI can tell you what to do, but it can't sit with you when you don't do it. And that's a distinction that's worth making for clients.

SPEAKER_00:

I haven't found it to be either a barrier or a replacement for the work that we do, not in my practice. And something that I talk to other therapists about is how I think it's important for us to be really aware of how we offer something different. Like from a marketing perspective, we have to be showing that our worth looks very different. So this is what I tell other therapists. I think they are frightened, but I said, like, look, what do we do that's different? Our work is inherently human to human. In some ways, it's this like beautiful throwback to this like sort of pre-internet age, right? And what I see, and I think this is very uh comforting, is that I have more clients looking to meet in person now than virtual. So there is this desire to feel connected to another human being.

SPEAKER_02:

The arrival of AI marks a pretty significant professional shift, but it's also a moment to pause and reintroduce yourself. In a world where technology can mimic empathy, some therapists want to be clear about what they offer that a machine can't. And as you'll hear next, Amanda tells us that it's led her to think about the way she talks about her work and even how she describes it on her website.

SPEAKER_01:

It's been a long time since I've looked at the copy on my website of like, what am I writing and how am I writing about it? So I'm actually talking with a copywriter next week to revamp some things. I almost want to put something on my website around like AI is great, but it's not gonna help you heal from like why you're coming to see me because I am a trauma specialist. And so, like, yeah, AI is gonna give you the information. It's gonna help the smart part of your brain understand some things, but it's not actually going to give you the healing that you need. So I already trust that about the work that I do, but I think it's almost good just to like very explicitly call that out.

SPEAKER_02:

Therapists everywhere are being pushed to articulate their why more clearly. Not just what they do, but who they are and who they're best suited to help.

SPEAKER_00:

Something I think a lot of therapists are leaning into, which I really think is important, is this idea of like niching down and having a specialty, and often that is connected to an aspect of their own identity, which I think is good. So that like I want a therapist who can connect with some of my personal experience, not over-identify, but like, for example, like I'm a mom of three. And right now I need a therapist who is a parent. I don't want a therapist who doesn't have lived experience with that.

SPEAKER_02:

Roxanne, Amanda, Ilana, and Carolyn all emphasized the human aspects of therapy. Things like intuition, presence, and lived experience. Which made us wonder: is there no value at all in a tool like ChatGPT? Or is there a world where it could be useful as long as you're thoughtful about how you use it? Rather than speculate from the sidelines, Ilana decided to do a little field research of her own.

SPEAKER_03:

I decided to go about it in a fairly academic way, in the sense that I had created custom GPTs before for writing and for communications, like creating email templates and HR documents for my business. And so I was like, well, maybe I should create a custom GPT therapist and just see what it's like, see what I get out of it. If it doesn't help, that's good data. And if it helps, then that's that'll be good. But it'll also be interesting. I told ChatGPT, you're like a middle-aged therapist. You draw heavily on IFS, but you're also candid and real and practical. You don't rest on therapeutic lingo, like human-to-human kind of candor. Gave it a lot of what I thought that I needed in a therapist, actually gave it a lot of features of the therapist I see. Um and then I said, like, this is me. I'm this age, these are the things I'm working on in therapy.

SPEAKER_02:

Ilana wasn't lonely or in crisis and wasn't looking to replace her therapist. Instead, she came to ChatGPT like a scientist approaches an experiment, curious and methodical. It was more about looking for data that would help her understand what this thing could really do if she used it thoughtfully. So she set out to create a kind of mirror version of her own therapist, a digital stand-in with the same qualities she values most in the real one. And the remarkable thing is it worked.

SPEAKER_03:

At least at first, it was pretty profound, some of the things that came up. And it wasn't like I got an answer. It wasn't like my problem was solved, but I felt very understood in the interaction. I felt like the bot drew some themes up that were not new to me. They had come up in therapy, but it was uncanny that a bot could pull those themes out of what I was saying. Um, so it was pretty, pretty uncanny. And after, you know, 10 minutes back and forth, I did feel relieved. I did feel like I had some space, I could move on. I also felt kind of done with it. Like I didn't feel like I feel in a therapy session where, you know, after 50 minutes, I'm like, oh my God, it's over already. I felt like, okay, I'm ready to wrap up now.

SPEAKER_02:

So if some people are finding value in talking to ChatGPT for comfort, advice, or even just a place to think out loud, it got us wondering: what if some of those people were your clients? And if so, what happens when that comes up in a session? Do you correct them? Caution them? Or do you get curious? For Amanda, the conversation isn't really about the technology at all. She's more interested in what it might reveal about the relationship with her clients.

SPEAKER_01:

I always get curious with people around like why they're doing what they're doing. If a client were to come to me and say, like, oh, I was messaging like with ChatGPT, I had some questions for it. Like, number one, what I would want to know is, did you feel like you couldn't reach out to me about those questions? I would want to know, is it because they feel like they're a burden? They don't want to ask questions in between sessions. Like, if it's a problem in the therapeutic relationship or in progress, like that absolutely feels like a conversation we need to have.

SPEAKER_02:

Sometimes the question isn't if people are going to use ChatGPT between sessions. It's how to help them use it safely and with intention. Because it's already happening. And for therapists, that opens up a choice. Either pretend it's not happening or gently step in and help shape the way that it's used.

SPEAKER_03:

Our clients are using it. They're using ChatGPT, whether they're telling us about it or not. So why not get in front of it and talk about it? Be curious about it. How are you using it? Here's some ideas of how you could use it to, you know, actually honor the work you're already doing in therapy. I personally haven't really gone to this place of, oh my God, like my industry is going to be decimated. I just can't go there and I don't want to go there. So instead, I'm thinking, you know what? Let's look at the opportunities here. How can we actually leverage this tool to help our clients get more out of their therapy? I also think, and I don't know that many people think about it this way, I personally think ChatGPT or AI, like therapeutic AI conversations, can be like a bit of a gateway drug to therapy.

SPEAKER_02:

Rather than warning clients off AI, many therapists are helping them use it wisely and setting boundaries for when it helps and when it harms.

SPEAKER_04:

I've had members of my team who have said that their clients, let's say their benefits only allow them, I don't know, six sessions. They want to spread out the sessions. So they come every month and a half. And in between, they're now strategizing with the therapist. What kind of questions can I ask ChatGPT to help me manage until I see you again? And so since we know that they're going to do that, we are helping them ask questions that make sense, right? So like strategies, like strategies to navigate anxiety, you know. Um, but we have also had to school them around questions that you probably shouldn't ask ChatGPT because there's no accountability there. They're not gonna call 9-1-1 on your behalf if you're feeling suicidal. If you lose touch with reality, the AI is not gonna notice that.

SPEAKER_02:

So if clients are gonna reach for AI either way, and we know the tools still have real limits, then the real question is: what role do we want this technology to play? If it's already in the room with us, how can it evolve to make that room safer for everyone? Safer for the clients who might turn to it at two in the morning when they can't reach their therapist? Safer for the therapists who want to use it for support without compromising trust or privacy. Because the issue isn't whether AI belongs in mental health, it's already there.

SPEAKER_01:

I think what's missing, which should definitely be included, is to always have a disclaimer of like, I am not a therapist, I'm not a mental health professional. Here's some information I can give you, but I'm not giving you an opinion.

SPEAKER_04:

What I would like to see is AI using its database to identify some of these questions and start to say something like, it looks like you are asking for therapeutic support. I am unable to help in this area. It looks like you're in this geographical location. Please contact these resources for therapeutic support, right? And it can list some free resources, it can list, you know, if you need emergency support, this is a number to contact, because these things are actually possible, right? Um, I can type something into Google. If I make a typo, it can identify that and say, it looks like you're actually looking for this, right? So I'm gonna redirect you. So we should be able to do something like that.

SPEAKER_02:

There's obviously a lot to think about when it comes to the future of AI. We spent a lot of time talking about how clients are using AI for comfort, reflection, and support. But what about the people on the other side of the room, the therapists? We sometimes forget that therapists don't just carry their clients' stories, they also carry the weight of the job itself. Things like the admin requirements, the ethical considerations, and even the personal emotions that a session can bring up. So when we talk about AI and therapy, we're not just talking about helping clients, we're also talking about supporting the people doing the work.

SPEAKER_00:

I love the idea of, you know, AI assisting with like paperwork and stuff like that. And I think something that is also going to be important is how do we ensure confidentiality? How do we ensure data protection? I think we work hard and we are tired and we want to find ways to support that. Personally, you know, I need to know more about it before I make it a part of my practice in a really significant way.

SPEAKER_02:

Carolyn might be a no right now, but that doesn't mean she's a no forever.

SPEAKER_00:

Our field is gonna evolve just like every other field evolves. And if we are resistant to those evolutions, that's gonna leave us behind.

SPEAKER_02:

Not every therapist feels ready to bring AI into their practice. The work is sensitive, the stakes are high, and trust is everything. But for Amanda, AI-supported note-taking has become a way to lighten her workload without sacrificing client safety.

SPEAKER_01:

So I do use Jane's AI Scribe for my notes because it is super helpful. And I think for me, there's a difference because it is a HIPAA and PIPEDA compliant system. I will happily pay for something that I trust is actually safe. And that said, like, no transcription service is going to be perfect. So obviously, again, you don't just trust it because it gave it to you, but you go in and look and fix, like, actually that detail was wrong, or I disagree with that part of the assessment or whatever.

SPEAKER_02:

Amanda touched on something many therapists worry about. It's not really about AI itself, but about whether the process feels safe or not. And nothing raises that concern more than the idea of recording a session. So that made us think about how to bring these tools into the room without disrupting the trust that makes therapy work in the first place.

SPEAKER_01:

So consent for me, as always, is ongoing. So just because someone says yes once doesn't mean that they have to say yes every time. Also, just because they say yes doesn't mean they have to consent to the whole session being recorded. They can say no at any point, they could pause at any point. Um, most clients I've interacted with, and a lot of other therapists have interacted with, honestly, they know how burnt-out therapists can be. And they're like, if it's gonna help you, absolutely. And if I have a say on when I can say no and turn it off, that's good by me.

SPEAKER_02:

Listening to these four therapists, what stayed with me wasn't their fear, it was their nuance. None of them worship AI, none of them dismiss it. They're doing what good therapists do, holding complexity, asking better questions. AI has real potential to expand access, to support clinicians, and to help people reflect. But it can't lean forward in silence. It can't remember your laugh or how your shoulders drop when you finally tell the truth. It can't know what it's like to care. As Zazillick wrote in the article, there's a magic that happens in the therapy room, and it can't be automated. But there's also a magic in effort and friction and trying to make sense of something yourself. That's what makes us human. Thanks for listening to Radio Frontdesk. You can read the full story, Can Chat PT Be a Therapist, in Front Desk Volume 6, written by Vasiliki Maripas. Get your free copy at frontdesk.jane.app or by using the link in our show notes. On Denzel Ford, take care of each other.