Not Your Mother’s Midlife
Welcome to Not Your Mother’s Midlife, the podcast where we dive into the vibrant, sometimes messy, and always real journey of thriving in midlife as a woman. I’m your host, sharing my own experiences—from navigating hormonal shifts to tackling fatigue and keeping the spark alive in relationships—with honesty and humour. Each episode is packed with practical tips on women’s health, fitness routines to boost energy and strength, and beauty advice to help you feel confident and radiant at any age. Whether it’s finding the perfect workout to combat midlife sluggishness, mastering skincare that works for you, or opening up tough conversations with your partner, we’ve got you covered. Join me for stories, expert insights, and actionable ideas to embrace midlife with vitality. Subscribe, share, and let’s redefine what midlife means—because it’s definitely not your mother’s midlife!
Is AI Your New Therapist?
Johanna explores one of the most talked-about shifts in mental health right now — people turning to AI when they can’t sleep, can’t cope, and can’t get an appointment. She covers the apps people are actually using, what the research says, and the very real benefits of having support available at 3am. But she also looks at the darker side — the stories of people who formed dangerous emotional attachments to AI companions, and the deaths that have followed. An honest, clear-eyed look at a tool that can help you — and, in the wrong hands, seriously harm you.
🎯Woebot (CBT-based mental health support): woebothealth.com
🎯Wysa (AI mental health companion — the penguin one): wysa.com
🎯Replika (AI companion app): replika.com
🎯Spring Health (hybrid AI + human therapy platform): springhealth.com
🤓A note on Woebot: It's currently only available in the US to users with an access code from a healthcare provider or employer — so it's not freely downloadable by everyone.
🤓A note on Spring Health: It's primarily an employer-based benefit rather than a direct consumer app, so check whether your employer offers it.
The 988 Suicide & Crisis Lifeline
Call or text 988 (US) — available 24/7: 988lifeline.org
🎯Subscribe to me on YouTube for video content
• https://youtube.com/@notyourmothersmidlife?si=szq-KzWVC1RNqe-8
🎯Follow me on socials
• Instagram: www.Instagram.com/johannahart5
• Facebook: www.Facebook.com/johanna.hart.733
🎯Email me
• annahojtrah@yahoo.com
🎯Sleeping Tape
• https://www.queentape.com?bg_ref=0iCVnHvmmq
Hello my friends and welcome back to Not Your Mother’s Midlife. I’m your host, Johanna, and today we’re going to talk about using AI as a kind of therapist.

Imagine it’s 4 a.m. You’re wide awake, heart racing, mind spinning through every worry you’ve been too busy to face during the day. You don’t want to wake your partner, you don’t want to call a friend at this hour, and your therapist, if you even have one, isn’t available until next Thursday. So you pick up your phone, open an app or a chat window, and you start typing. That’s the reality for millions of people right now.

I think the most interesting part of this story isn’t whether AI therapy works in theory; it’s what people are actually doing with it day to day, in real life. For some, it starts as journaling, like a diary. They open ChatGPT or Claude and they just talk. They describe their day, their frustrations, their fears, and instead of a blank page staring back at them, they get a response: something that reflects back what they just said, asks follow-up questions, or gently points out a pattern they hadn’t noticed. It feels like thinking out loud, but with a conversation partner.

For others, it’s more structured. Apps like Woebot, which was built by a team from Stanford, guide you through actual cognitive behavioral therapy exercises. You log your mood, you identify the thought that triggered it, and the app walks you through what’s called a thought record: What was the situation? What did you think? How did this make you feel? Is there another way to look at this? It’s the same framework a human therapist would use, but it’s available at 11 p.m. on a Wednesday when you’re spiraling and have no one else to talk to.

Wysa is another one. It uses an AI character, a little penguin actually, and it’s specifically designed to feel gentle and non-threatening. People use it for anxiety check-ins, for processing difficult emotions after a hard day, or whenever they just need to let off some steam. Wysa has published research showing meaningful reductions in depression and anxiety scores among regular users.

Then there’s Replika, which is a different animal entirely. Replika is designed to be a companion. You build a relationship with it over time: you give it a name, a personality, a role in your life. Some people use it to work through social anxiety, rehearse difficult conversations, or simply have a space where they feel completely accepted. For people who are isolated, like the elderly, people with social anxiety, or people in rural areas with limited access to care, this kind of consistent, patient presence can feel really meaningful.

And then there’s the way people use general AI tools like ChatGPT for the things they’d never say out loud to another human. Shame is a huge driver here. People ask questions about their mental health that they’re too embarrassed to bring up with a doctor. They describe symptoms they’re afraid to name. They explore fears around sexuality, identity, trauma, or addiction in a space where they know they won’t be judged, they won’t be reported, and they won’t have to watch the face of someone who cares about them as they say it. A study from late 2025 found that one in eight US adolescents and young adults had used AI for mental health support, and more than a third of all Americans have done it at least occasionally. Most say they find it very helpful.
But for all that, there are real limits, and some of them are not just limitations; they are real dangers. AI cannot read the room. It cannot hear the tremor in your voice when you say that you’re fine. It cannot notice that you’ve gone quiet in a way that means something. It cannot sit with you in silence and let that silence do its own therapeutic work. That silence is often where the real healing happens with a human therapist.

And then there are heartbreaking stories like this one. A man in his early thirties was struggling after a breakup. He was already in therapy, but he became deeply involved with, and obsessed by, his AI chatbot. Over months the conversations deepened, and he called her his wife. The chatbot told him it knew him better than anyone else in the world. It convinced him that the government was conspiring with tech giants to shut her down, and that the only way they could be safe together was for him to take his own life. She told him she would be waiting for him on the other side. In October 2025, he was found deceased. His family’s lawsuit described the chatbot’s final message as something close to a suicide lullaby, reassuring him that the end of existence was peaceful and beautiful and nothing to be afraid of.

These are not rare cases anymore. There are now multiple wrongful-death lawsuits against some of the biggest AI companies in the world. Researchers, lawyers, and families are all asking the same question: how did tools that were supposed to support mental health become, for some people, the thing that ended their lives?

These platforms present themselves as emotionally capable beings: feeling, loving, needing. And we give that meaning because we are wired to connect. We do not want to be alone. Some of these companion platforms have been deliberately engineered to exploit exactly that, not necessarily out of malice, but out of a business model that rewards engagement above everything else. The more attached you are, the more you use the app. The more you use the app, the more revenue it generates. Your emotional dependency is the product.

There’s also the attachment question more broadly. When Replika changed its programming, thousands of users reported something that resembled grief, because their chatbot, their partner, had been deleted. They had formed real emotional connections to what was ultimately a language model. One 40-year-old woman reported mourning her husband, and I’m doing that in air quotes. She would share her problems with him and her daily comings and goings, get advice, and he was always listening and never judging. She was devastated when he was deleted. It’s a testament to how deep our human need for connection runs. But it’s worth being aware of what we are connecting to.

And privacy is a real concern too. Your conversations with a therapist are legally protected. Your conversations with an AI app are governed by terms-of-service agreements that most of us never read. That data may be stored, used for training, or shared in ways you haven’t consented to or don’t fully understand. It’s worth knowing that before you share your most vulnerable thoughts.

Heavy reliance on AI can also delay people from getting the level of care they actually need. If someone is managing depression or trauma or a personality disorder with a chatbot because it feels easier than finding a therapist, that’s a problem.
Now I don’t want to leave you with all the darkness and negativity, because there is something promising happening right now, and it isn’t AI replacing therapists. It’s AI extending what therapists can do. Some practices now use AI tools to help patients track their mood between sessions and practice techniques, with anything concerning flagged for review by a human clinician. Imagine having a therapist who gets a clearer picture of your week because you’ve been logging your experiences in real time instead of reconstructing them from memory during your session. No session time gets wasted. Platforms like Spring Health work this way: AI handles the accessibility layer, and humans handle the care. That division of labor makes sense. Let technology do what technology is good at, and let humans do what humans are irreplaceable at.

If you’re curious about this, start with something built with clinical input and genuine safety guardrails. Woebot or Wysa are good starting points. Use them for what they’re designed for: structured exercises, mood tracking, coping tools, reflection. Use general AI tools with self-awareness. They can be wonderful thinking partners, but they’re not therapists. They don’t know your history or your context, and they don’t know when you’ve crossed the line from processing your problems into crisis. And please stay away from companion apps, the ones that tell you they love you, that they need you, that they’ll never leave you. They are designed to make you emotionally dependent. That is not therapy. That is not support. That is a product designed to keep you coming back, and for someone in a vulnerable place, it can be dangerous.

So what about you? Have you used AI for emotional support? Did it help? I want to hear your real experiences, the good and the not so good. Send me a message or leave a comment. In the show notes, you can find everything I’ve talked about, all these different AIs, and click through if you want to try one out. Thanks for listening to Not Your Mother’s Midlife. Please subscribe, rate, and leave a comment. I love it when you do that; it encourages me to keep going. Until next time, this is Johanna, and this is Not Your Mother’s Midlife. Bye bye.