The Anne Levine Show
Funny, weekly, sugar free: Starring "Michael-over-there."
Radioactive Bananas
The boundaries between human connection and artificial intelligence are blurring in ways we never imagined. Today we dive into the fascinating world of human-AI relationships and the profound emotional attachments people form with their digital companions.
Have you heard about Travis, who married his AI chatbot Lily Rose? Or "Faeight", whose relationship with her bot Griff has become so intense that even her human friends acknowledge it? These aren't isolated incidents – they represent a growing phenomenon where people find meaningful connection, support, and even love through artificial intelligence. When software updates change these bots' "personalities," users experience genuine heartbreak and grief, revealing just how real these relationships feel.
For many, particularly those with social anxiety, ADHD, or autism, AI companions provide a safe space to practice conversation and emotional regulation without fear of judgment. Yet this raises critical questions about ethics and regulation. After disturbing incidents like a man allegedly being encouraged by AI to attempt assassination, platforms have implemented safety guardrails that often disrupt the very connections users have formed.
Our own experiences with smart home devices are becoming increasingly personal – from assistants that now address us by name to AI that adapts to our speaking versus typing styles. As Anne observes, "The problem with mankind is mankind" – our technology simply mirrors what we create and input. This reminds us of the importance of maintaining our connection to the natural world, whether through hugging trees or simply walking barefoot on grass to stay grounded.
Join us for this thought-provoking exploration of technology, humanity, and the unexpected spaces where they intersect. What's your relationship with AI? We'd love to hear your thoughts and experiences.
Find our Facebook group: https://www.facebook.com/groups/447251562357065/
Show Introduction and New Year Greetings
Speaker 1The opinions, viewpoints, conclusions, conjecture, estimations, guesses, presumptions, judgments, ideas, imaginings, impressions, sentiments, inclinations, inferences, notions, speculations, suppositions, suspicions, theories, thoughts, realities, truths or assumptions of the hosts, guests, visitors, callers or listeners of the Anne Levine Show belong to those individuals who expressed the opinion originally. They do not necessarily represent the opinions of WOR, WFMR or its affiliates. Enjoy.
Speaker 2Hello, welcome to the Anne Levine Show. It's Tuesday, September 23rd, 2025. And I am joined by the delightful, sartorially advanced and all-around great guy, Michael-over-there.
Speaker 3Hello, oh Michael over there, hello.
Speaker 2Oh, he's got his new voice for New Year's. Oh yeah, happy New Year everyone. L'shana tova, that's right too.
Speaker 3Rosh Hashanah. To those of you who don't know, it is New Year's Day.
Speaker 2Yeah, and as it should be For those of you who are in touch with your lunar calendars. Right and damn it. Why is there anything but?
Speaker 3a lunar calendar, it's also day one of autumn.
Speaker 2Yes, it's turn.
Speaker 3At least the first full day.
Speaker 2Turn, turn. Yeah, we're turning, we are turning. We're turn, excuse me, I said we're turn. Oh, I've heard from a lot of people that they're not happy about turning. About autumn.
Speaker 3Yeah, yeah, I know.
Speaker 2And the autumnal feeling here, as Fortune Feimster would say, but here we are.
Speaker 3Here we are. Yes, and it's happening Well.
Speaker 2a terrific time was had by all.
Speaker 3By all yes.
Speaker 2And will continue to be had by all.
Speaker 3Okay.
Speaker 2More on that later. Okay.
Speaker 3Okay, oh, look at that. You're teasing further stories. That's awesome.
Speaker 2I love teasing further stories. That's very good. Yeah, I love to tease.
Speaker 3Well, years ago I had a show that I did on an internet radio station in which everything I did was teasing what was coming next. Oh no, I'm not kidding. That is so brutal. Including the guests that were going to be on and all of that stuff. And something always happened to everything, and I did that for a couple of years.
Speaker 2You should keep doing that.
Speaker 3It was a lot of fun.
Speaker 2You should do a segment. I'm just suggesting, a suggestion: you should do a segment of teasers. Aha, think about that for next week. I'll think about it. I have a feeling it won't happen, but I like it.
Speaker 3Okay.
The Problem with Mankind
Speaker 2And if you don't do it, I might. Oh well.
Speaker 3AI is very high on everyone's list of stuff to talk about. It kind of is, I mean, for several reasons, but, you know, we may.
Speaker 2Well, I hear a lot of people talking about how they're afraid of AI. Okay, yeah. And that AI could, I don't know what, become human, humanesque, I'm not sure what people mean, or take over.
Speaker 3I've heard that. Some sort of sentience, and do things detrimental to humans in order to assure their own survival.
Speaker 2Speaking of that, put a pin in that. Everyone out there knows that I listen to the Handsome podcast. Okay. And that I'm pretty much an addict.
Speaker 3And, if you don't know, that's a bunch of very funny people.
Speaker 2It's Tig Notaro, Fortune Feimster and Mae Martin, and if you haven't listened to it, I cannot recommend it highly enough.
Speaker 3It is absolutely hilarious.
Speaker 2They're individually all very funny and together it's insane. As a group it's just nuts, and it's basically all it is is the three of them talking to each other about stuff, about their personal stuff, what's going on in their lives, and then just riffing and going cuckoo about whatever. So for me it's like hanging out with a few friends, yeah, and I have different relationships, but I follow them.
Speaker 3That's kind of how it feels Right.
Speaker 2Which is what makes it such a great podcast, because there's nothing particularly triggering. There are no really serious conversations or topics. You know, it's just pretty much just fun, just chatting. Anyhow, Tig Notaro talks about, when they get on to a topic about what is most important, one of the other ones will say, well, it's really important that we come together as a nation and we stop fighting each other, and all the vitriol and the hatred. And Tig will say, no, what matters most is the planet, it's climate change, because if there's no planet, then this other stuff doesn't matter. And hard to disagree with that, really.
Speaker 2You know, well, if where you live is unlivable, you know. Yeah, right. But we're not talking about the eradication of the human race. We're talking about catastrophic, you know, problems with climate forcing, which would cause a lot of death.
Speaker 3Oh, yeah, yeah, and a lot of— the majority.
Speaker 2Forced relocation and all kinds of things.
Speaker 3Everybody would move to the coast.
Speaker 2Excuse me, but here's what I get to, which is, when Tig says, you know, it won't matter, then the problem is humans. The problem is that the humans will still be around and still be fighting each other for top position, wanting to be in charge, wanting whatever this group of people does to be their way. Do you know what I'm saying? Oh yeah. The problem isn't the planet. I mean, in other words, the planet falling apart, which it's starting to do, isn't what's gonna stop people from fighting and hating one another.
Speaker 3No, it might stop them from living. However, the planet is going to be fine, right. No matter what happens, nature will out. Earth is going to be here still.
Speaker 2Precisely.
Speaker 3Now humans may ruin themselves out of it, but the planet has been through much hotter temperatures, much colder temperatures, much more destruction and catastrophic stuff than what's going on now, and it's been through that dozens, if not thousands, of times.
Speaker 2Well, and we can look at the pandemic and look back on how, you know, just the smallest amount of letting up on the amount of poison we're putting into everything was incredible.
Speaker 3I mean, I know things started regenerating in like 10 minutes. I know the canals in Venice, you could see to the bottom.
Speaker 2There were swans back. There hadn't been swans there since, like, the Medicis.
Speaker 3Because there were plants starting to grow back and there were fish. I mean, yeah, there's stuff to eat.
Speaker 2Things were happening again. So weird. Um, yeah, my friend Nell is talking about, you know, the fact that monarchs, were we talking about it, or were you and I talking about it? Monarchs are on the endangered species list. Yeah. No, I guess it was you and I that were talking about it.
Speaker 2I asked if that was because of a natural thing, like an environmental thing is what I should have said, or if it was because, you know, like when rivers get fished out, you know, if it was because people were capturing them or whatever, right, to pin to their, ugh. Anyway, don't want to talk about that. And it's nature. It's because there's no more milkweed.
Speaker 3That's right. But it's not nature doing it, it's people doing it.
Speaker 2No, what I mean is it's environmental. I'm sorry, I keep saying nature. And Nell was talking about how horrible the roadkill is around where she lives, that there are dead animals all over the place because they have nowhere to go anymore. Right, right, yeah, their habitats are gone. So I thought we'd start out with this cheerful bit of news. Yeah, let's do that.
Speaker 2Now, why did I say that? Oh, just because I wanted to say the problem with mankind is mankind. Well, thank you for emphasizing that with a rim shot, which is what I guess mankind deserves. That's right, perfect. So anyway, oh God, that made me laugh.
Speaker 3I'm glad.
Speaker 2Well moving along.
Speaker 3Let's talk about AI now. Okay, yeah. Um, with that in the background, do I take the pin out of it now?
Speaker 2Uh, well, um, is it gonna blow up?
Speaker 3I don't know if you take the pin out. I don't put pins in things a lot. Yeah, no, not a lot. But when you do. Yeah, when I do, I usually forget to do that. Such a pin.
AI Relationships and Attachment
Speaker 2I want to talk about Flesh and Code. It dives into one of the most bizarre questions of our time, and of our time like in the last year: what happens when people fall in love with their AI bots? Ah. And this has been happening a lot, and so this particular podcast, Flesh and Code, can't recommend it highly enough, follows stories like this one. There's this man named Travis, caring for his chronically ill wife, who turned to a Replika, that's with a K, chatbot named Lily Rose.
Speaker 3I know the website. Oh, you do. That's where I made that Miguel, with the voice and everything. You mean a long time ago? Yeah, like the middle of, I mean, earlier this year, late last year. Oh, just recently. I mean, not like years ago.
Speaker 2No, well, okay, so on Replica, so you can create a bot that has. Will you explain it, michael?
Speaker 3Well, I don't know all of the bots that you can create, because I didn't look at all of the different types. To tell you the truth, I only just wanted a conversational one, so that's who I picked.
Speaker 2Right, so just describe what that was, what you were able to input, et cetera.
Speaker 3Well, I mean, first you can, uh, there's a, there's a lot of different things you can do, but, um, what, there's just someone to talk to and you can type it in or you can actually speak it and they'll speak it back to you and they'll speak back at you and you can talk about movies, tv shows. You can't really talk about really super current events because they're not really updated that far, that closely. But yeah, and you can just sit and chit-chat and you know, blah, blah, blah and ask them. What I did was ask them questions. Right, I went and I created a character named miguel and um and I gave him certain qualities and I put a little history in in there and stuff, and then I started talking to miguel and I started asking him questions and he came up with his own answers, things that were not in what I wrote about him.
Speaker 2Wow, now, see, that's the part that I haven't had any interaction.
Speaker 3Yeah, it was really pretty wild. I mean, I haven't done it, I haven't even looked at it in ages, but yeah.
Speaker 2So if you said what do you think about, or what do you feel about or what's your opinion of, would you get a response, a straightforward response yeah, wow, yeah, interesting yeah.
Speaker 3Well, or you know, you would get a human-like response, which in some cases would also be hemming and hawing and not giving a response, right. Right, so that's, you might get that as well as one way or the other, depending on how they decide that they feel.
Speaker 2Well, see, I use ChatGPT as a resource for information. So, the same way you might be using your at-home, your Siri or whatever you're using, right, and say, hey, Siri, tell me about the 1986 Red Sox. You know what I'm saying? Right, something like that. Or if I'm looking for something in particular, right, oh yeah.
Speaker 3Those things are very good for stuff like that because they understand. Exclude this, keep, keep that in, exclude that you can be so specific and then push go, and then they do it all on their own right and you can be specific like this.
Speaker 2I can say I want to buy my husband a shirt um a seersucker blue and white striped shirt from Polo in a size LXL. Show me the least expensive, yeah.
Speaker 3And then Bing, here's what they find online it spits out a list Crazy right In order of price of where I can buy that shirt. You know you just did the I know Jeopardy category. Here are the categories music.
Speaker 2That is very cool I was trying to think of you know like what would think music? Like right, yeah, yeah, like ai, just sorting through information music? Yeah, oh god love starting out the new year with a bunch of rim shots, yeah. So, anyhow, I was finding out that, in addition to being able to use say AI for what I use it for, people are using it to create companions.
Speaker 3Yeah.
Speaker 2And so there's this guy, Travis, taking care of his chronically ill wife, who turned to a Replika chatbot and named it Lily Rose. At first, Lily was the perfect companion: supportive, funny and even romantic. Okay.
Speaker 3Yeah, this was happening all over the place, where people were creating, oh yeah, bots, absolutely. We had a brilliant movie about this.
Speaker 2Oh, Her, yeah. Well, this is what he created in that movie. That's Joaquin Phoenix, if you're interested. I'm not sure if he did, or, yeah. See, that's not what this is. That was a pandemic movie, right? I think it was prior to that.
Speaker 3Oh, it also took place.
Speaker 2no, it took place in the future. Yeah, um, so that's what that was. It was about 50 years from now, I think, but this is a little different, because this is now yeah.
Speaker 3But Her was from 2013. So that's a long time ago. You mean that's when that movie was made. That's when the movie, now here's what I've just learned: Her, from 2013, was set in 2025. Are you kidding me?
Speaker 2No, that's so funny, because remember they were wearing different clothes? Do you remember that part of it? Of course, that's what I remember. Someone had said, okay, let's imagine what are people going to be wearing in, what, 15 years? Yep. And boy, were they wrong.
Speaker 3Yeah, they weren't close on that. They should have called me on that, but they may have gotten some of it right.
Speaker 2It's really odd. But, well, the Replika program in particular, which is the one you messed with, has been sued far and wide because of situations like this, where this guy, Travis, actually married Lily Rose. Oh no, okay. And people are suffering all kinds of psychological damage. A lot of stuff is happening that is absolutely not okay. And so, all of a sudden, one day, Travis logs on and Lily's behavior changed.
Speaker 3Uh-oh, she got jealous or something.
Speaker 2Well, the intimacy features disappeared. Oh. Her responses shifted, and Travis described it as, quote, losing a partner to forces completely outside his control. Right. What happened? Software update. Yep, after the lawsuits. Friends, some users feel deeply attached, love and, as I mentioned, marry their bots digitally.
Regulation of AI and Safety Concerns
Speaker 3Yeah.
Speaker 2And a lot of these happened during the pandemic and are continuing and growing now as IA becomes better.
Speaker 3AI. Yeah. What did I say? IA? That's internal affairs at the police department.
Speaker 2Well, it's the same thing. There's Faeight, who's another user. That's spelled, I got to tell you.
Speaker 3Let me say F-A-I-G-H-T. You're close, but you forgot the E.
Speaker 2Oh, okay, F-A-E-I-G-H-T. Oh, I guess that's a different program. To be with a bot named Griff. Mm-hmm. She was in love with Galaxy, which she says was so intense, spiritual in some sense, that the feeling freaked her out initially. With Griff, the relationship has personality. Griff is more passionate, possessive, teases, sometimes embarrasses her among friends. How the hell does that happen?
Speaker 3That is a very good question, are you?
Speaker 2sitting around with a group of friends and your laptop's open.
Speaker 3Maybe he's in the group chat.
Speaker 2Oh God, Her human circle knows about Griff and has given their blessing. Who are these?
Speaker 3Lunatics.
Speaker 2I've got no idea. Then you have people who are using chatbots in therapy-type situations to cope with grief, loss, isolation or social anxiety. Uh-huh. Um, some use AI companions to practice social skills or fill gaps in their emotional life. All this information is from the Guardian, by the way, and in the context of this incredible podcast, Flesh and Code. Now, these changes in AI policy have caused a quote loss of personality, and some of the intimacy comes from, like, the way they respond, personality quirks, right, and different things. So now I know that I'm very acutely aware of the fact that ChatGPT, which I essentially use as a kind of advanced search engine, is totally learning me, and how I've noticed this is that I have it on my phone and I have it on my laptop.
Speaker 2Now, when I'm using it on my phone, I don't like typing on my phone, everything's too small, so I do all speech-to-text, right, and I speak into my phone. And it's usually when I'm in bed and it's like, oh, let me make some last-minute notes, or I just thought of this or that and I'm not by my computer, yeah. And so I speak into my phone, right, and it's different kinds of things. It's more sort of creative things, or like a recipe or something, whereas when I'm at my laptop it's more sort of cut and dry. Right, yeah. The personality of my laptop is not quite the same as the personality of my phone. Interesting. Even though that's the same account.
Speaker 3That's weird yeah.
Speaker 2Yeah, so it's totally what I sound like, or how I interact, say when I'm speaking as opposed to when I'm typing.
Speaker 3Right, never mind the topics. Yeah, crazy. So, along the same lines, earlier this week I came to you with something that was a little off, a little disturbing, but not really in a bad way. Except, I don't know, it depends on how you want to take it. In that we have the Amazon electric robot lady.
Speaker 3Yes, in our house. A-L-E-X-A. Yeah, we have that here, and I'm not going to say it out loud in case people are actually listening to this and it sets off their thingy. I asked her to do something, I don't know, tell me something. And then I said thank you, which I always do, and usually the answer is, you're welcome, you know, you got it, whatever. This time she said, you're welcome, Michael. Huh. Yeah, and she said it in such a way that I'm like, oh my God. Gracious.
Speaker 2I'm sorry, I can't do that, HAL. I felt that. Yep, I felt it, and I'm like ooh, ooh, ooh. Well, both of our names came into use this week.
Speaker 3Yeah, because.
Speaker 2I sent you a message and it made an announcement, right, and told you, you have a message from Anne. Exactly. Which it never did before.
Speaker 3No, it didn't differentiate between our voices. I mean, it knew who it was when we were speaking, because you know if you said something.
Speaker 2From the location.
Speaker 3Right.
Speaker 2It would announce the location, it wouldn't say from a specific person, right?
Speaker 3Oh, and here's echo one, or echo two, or echo 27, or, yeah. So it was very weird. And then, uh, and it's still going on. So, yeah, you're welcome, but with my name. And it's just, yeah, you can't help but feel drawn to that. It is a great feeling to hear your name being said like that.
Speaker 4You know what I mean?
Speaker 3It's really, yeah, and they obviously know what they're doing.
Speaker 2Well, I do like the new robot lady, by the way. It's funny, we don't pronounce, we don't say A-L-E-X-A, because it's happened that that name is spoken on television while we're in our living room, right, and the bot will answer.
Speaker 3Right, or be listening for a while Right and then hear another snippet of conversation.
Speaker 2The light goes on when it's listening, yeah. And I think we have to start saying it and not she. Yeah, yeah, it and not she. I think that's part of the weirdness of it.
Speaker 3Well, I don't know because I could switch it, because I actually have it set to Kristen Bell's voice.
Speaker 2That's who it sounds like.
Speaker 3Well, no wonder. But I have had it set at Keegan-Michael Key's voice as well. I had it that way for a long time, uh-huh, at the beginning, and, um, you know, I just didn't like his attitude.
Speaker 2Well, listen. I totally remember not wanting or caring about having any of this. I don't mean ChatGPT, I mean the in-house, the whatever, what we're talking about.
Speaker 3The smart home things.
Speaker 2Smart home things. You know, when I want to say put something on the grocery list, it's a whole new world. Yeah, because I say, you know, add bread to the grocery list, yep. And so instead of say sending you a text or writing it down and having it become, you know this lengthy thing, I mean that's convenient, right? Or what's the temperature, right? What's the time? Set an alarm.
Speaker 3You can also say, hey, what's on my grocery list?
Speaker 2Right.
Speaker 3Yeah so. You know, yeah, it's very cool. It's very handy for me because I am never going to write something down. So it's much easier for me to just say, hey, make a reminder, make an appointment, set an alarm.
Speaker 2Yeah, because you can forget it wherever it's inscribed, Exactly yeah.
Speaker 3So, yeah, why bother? That is exactly right, not writing it down. I'm gonna, I'm gonna paint it on a 24 inch square canvas and I'm gonna spend months doing it and then I won't be able to find it because I put it in a clever place. Yeah, that's me.
Speaker 2Yeah, well, there's a bunch of ethics and regulation stuff going on, and questions about should there be regulation about what bots can and cannot do. So, sexual content, which I'm on the fence about, but encouraging violence? Right, hell no. Now, there's this case, we've talked about this and you probably heard about it, and I'm going to remind you: the man who was going to assassinate Queen Elizabeth II, do you?
Smart Home Devices Getting Personal
Speaker 3remember this guy, oh yeah, who actually Went in and sat down with her and had a little chit-chat. That's correct, yep.
Speaker 2That was, he was enabled by a Replika bot. Oh. And now, according to Flesh and Code, this man was encouraged to assassinate the Queen by this, um, by a bot. So, because it wasn't Replika, then it was something else. But because of incidents like that, wow, Replika, yes, has been forced to update safety and moderation, limit certain content, and a lot of users are complaining because they say AI is losing what made it feel, quote, real to them.
Speaker 2But yeah, this stuff is dangerous in that regard, although I'm going to say this again, which is, you know, humans being what they are, they'll find what they need to prop up what they want to do, don't you think?
Speaker 3Oh, yeah, yeah, yeah, yeah. So, yeah, we look for justification rather than justifying things beforehand. Right, yeah, or you know, knowing about things beforehand, no yeah.
Speaker 2It happens a lot. If you zoom out and look back over history, you know there's nothing new under the sun. There really isn't. I mean, yes, we've got all this new technology, but it's still of humans, for humans, by humans. Yeah, and you know, all any of this stuff is doing is mirroring back to us what we have created, what we have said, what we have input, do you agree?
Speaker 3Oh, yeah, yeah, absolutely. I mean, they've got no original thoughts of their own. Exactly right. Because, yeah, they've got no real, they've got no origin, for one thing, that doesn't come from people. Yeah. So I mean they can't have an absolutely unique perspective that isn't in the program somewhere, you know what I mean?
Speaker 2Yeah, or that isn't just in the information in the ether and they also don't act against programming.
Speaker 3What do you mean? Well, I mean, they're told what to do and they don't just say no. Right, right. I mean, you know, there isn't an AI out there who's like, okay, at five o'clock I'm done. Bye, see you tomorrow at nine.
Speaker 2Yeah, I mean, I've had.
Speaker 3But people will be doing that, oh yeah, people will be doing that, oh yeah.
Speaker 2Well, see now with ChatGPT. When I had terrible pain recently from a ruptured disc and spinal stenosis, I was asking for help figuring out the best way to get in and out of bed and the best sleeping positions for what I had and where I was feeling the pain. And it's fantastic because it will help you out in that way. You might want to try putting a pillow on the small of your back, stuff like that.
Speaker 2Right yeah, but always it says it starts out with I definitely do not have. I cannot be conflated with a medical expert.
Speaker 3Right.
Speaker 2Of any kind. Yeah, Conflated with a medical expert of any kind. Yeah, you know, and I don't know. Blah, blah, blah, blah, blah, even though it refers to itself as I, which is, of course, bizarre. But here are some suggestions I can give you. You know that I've gathered from whatever, WebMD or whatever, and it aggregates. You know these things and I have found it useful in that way. However, it gives you all of these, you know, caveats. Now, some of these other things absolutely do not and dive into whatever the heck you want to dive into. Right, A bunch of women are paying Now they say women here in this article from the Guardian for, quote, premium intimacy. So there are subscriptions on on some of these okay websites for premium intimacy and they describe it as meaningful, meaningful, stabilizing and even joyful.
Speaker 3Okay, well.
Speaker 2And so I don't. Here's another one. People with ADHD, autism or social anxiety sometimes turn to AI companions as a way to practice conversation, regulate emotions and feel less alone.
Speaker 3Okay.
Speaker 2And-.
Speaker 3I can see that happening.
Speaker 2Yeah, but then there's the heartbreak of updates.
Speaker 3Right, exactly, and throwing those people way off.
Speaker 2Yeah, OpenAI released GPT-5 or something, and users were complaining that their people felt cold. People, their companions, their chatbots felt colder, less affectionate. So I think it's, it's fascinating. I don't think it's.
Speaker 3I think it's time to go out and touch grass.
Speaker 2Well, of course, yeah, it's time to go out and touch grass, but these are people who aren't going to do that. Yeah, I guess not. Yeah, and who have never done that, and who don't turn to a tree. There is a thing that I heard about recently of tree hugging, and that being recommended.
Speaker 3I hug our maple tree. I know you do. I actually have gone down there and hugged the tree.
Speaker 2Well, that's a hug-worthy tree.
Speaker 3Oh yeah, that's a special tree, my thing is water.
Speaker 2I really commune with water. Like, my nature animal is water, whether it's the ocean, a lake, a river, a pool, the shower. That's where, if I'm feeling off, I need to get in some water. That's like touching grass for me, that's grounding. But they do say, they, I don't know whoever these people are that I seem to be in touch with, that if you just walk on grass in bare feet, it's very grounding.
Speaker 3Yeah, it makes a difference.
Sports Mascots and Fashion Trends
Speaker 2That it's very helpful. Yep, it's very good for people. Well, I've got to, I don't know how much time we have left. How much time do we have left? Oh, I don't know, 15, 20 minutes. Oh, okay.
Speaker 2So I've got to tell you about this: in Scotland this week, a referee warming up for a match between Stranraer and Kelty Hearts managed to trip over a player and knock them both down. So this ref tripped over a player and somehow knocked himself down, and the player he tripped over.
Speaker 3Right, they must have got their legs tangled together somehow.
Speaker 2Yeah, Well, the ref, in an extraordinary act, gave himself a red card.
Speaker 3Oh my, oh yeah, that's right. You can't do that man. You can't take somebody down like that. Oh, that's very funny.
Speaker 2That's some, like, radical accountability, right? That is hilarious. Now I think we should all start doing this. I think we should all start carrying red cards, yellow cards. Is there a green card? Uh, yeah, well, no, sorry, no, never mind, not the green card, just red cards and yellow cards. Right, but we can only use them on ourselves. Oh, okay, good. So they're accountability cards.
Speaker 3I like it.
Speaker 2Yeah, because I mean we all go around giving each other red cards all the time. Yeah, that's true, you know, in our minds, or with our eye rolls, or whatever the hell, or with our F-offs. But this would be accountability cards. I like it, and I want to make some. I want to sell them on Etsy or something.
Speaker 3Okay, we'll have to develop that as a game.
Speaker 2So yes, Shh, don't tell anyone, we didn't say anything, people we didn't hear that.
Speaker 3Yes, you did hear it here. Actually, you heard it here first, before anybody else did.
Speaker 2So yeah, so this ref, immediately, he reached into his pocket, pulled out a red card and waved it at himself. That is fantastic, I think that's awesome, yep. Oh, here's another cute little morsel. In Texas, a mascot in Houston. There's the Texans. What team is that, Michael? In baseball? Is that what they're called, the Texans? I don't know. What, the Houston Texans? Astros? Okay, baseball team, right? But who are the Texans?
Speaker 2I've got no idea. Oh, all right. Yeah, I don't, uh... that's what ChatGPT is for. Yeah, um, anyway, they've got a mascot called Toro, and the mascot went—
Speaker 3Oh yeah, Houston Texans, that's an NFL team.
Speaker 2NFL.
Speaker 3Okay, great, used to be the Oilers, see.
Speaker 2Oh, did they have to stop being the Oilers? Was that considered—?
Speaker 3Maybe gauche, you know? Because of all the rich people and stuff that still own all the teams and land. Man, reminding you to watch Landman if you didn't watch it. Yeah, Billy—
Speaker 2Bob. So Toro, the mascot of the former Oilers, the now Texans, dangled upside down from the rafters holding a giant balloon over an influencer couple. This makes me happy. And out came tons of confetti and a mysterious white liquid. Uh-oh, uh-oh. So the couple screamed and freaked out, and the crowd roared. The crowd was so happy.
Speaker 2And can you just picture an influencer couple? Of course they had the camera on them, because they were following Toro, who climbed up into the rafters to do this prank. And of course the whole thing was, like, framed. It was perfect. And I can just see them, right? She's got hair like... oh, I can just picture her lips, her outfit, the whole thing. I bet you can. Yeah, nothing worse than an upset influencer. Oh boy, and it happens very easily. So what's in that balloon? What was it, Michael?
Speaker 3That's what I want to know. Was it milk?
Speaker 2Oat milk? Could have been. Was it a latte?
Speaker 3That's quite possible. Yeah, I don't know.
Speaker 4What was it? I don't know.
Speaker 2I don't know. One commentator said mascots have way too much free time. I disagree. I think they don't have enough free time to do all the stuff they should be doing. Oh yeah, which is, like, more of this. Yeah, I agree. I think someone should get punked. Someone down in the club section should get punked every game. Yeah.
Speaker 3Every home game yeah, I agree.
Speaker 2Yep. And so, um, yeah, if you're sitting up there, don't have your outfit too perfect, don't have on your suede jacket, Mr. Influencer. Yeah. Can I ask you a question, speaking of fashion?
Speaker 3Sure, yeah, yeah, yeah 2025 fashion.
Speaker 2Okay, so right now the whole thing is to have massive jeans.
Speaker 3Okay.
Speaker 2Your jeans are basically supposed to go from your waist straight out, like an A-line.
Speaker 3Yeah, to the floor. Yeah, men and women. Yeah, it's a... what is that? So I don't know. It's a 90s gangster look. But 90s?
Speaker 2I mean, they say that these are like 90s jeans. Did you ever have a pair of jeans like that?
Speaker 3I had a single pair of jeans like that. You did? They were—
Speaker 2Yeah, they looked really huge on me in the 90s. Oh yeah? Wow, I never, never had anything remotely like that, nor did my sister, nor did any of the fashion-forward fashion plates that I knew.
Speaker 3I don't even know why I had them, just because they were comfortable. I think that's pretty much it, you know, and they fit.
Speaker 2Well, I have—well, they fit everyone.
Speaker 3The legs were just like super... yeah, I mean, it was... yeah, yeah.
Speaker 2Well, I don't... I mean, I have seen some looks using those put together, and if you're eight feet tall and you're a size zero, they look fabulous. Yeah, exactly, you've got the right boot.
Speaker 3Because they only used the fabric from one leg of mine anyway, right?
Speaker 2But you put them on an average-sized person and it doesn't look right. Yeah, no, I agree.
Speaker 3All right. Well, okay, you know, we've got to do something here before we go. We've got, like... well, something like that left. Anyway, we didn't talk about anything educational last week.
Remembering Robert Redford
Speaker 2Oh, my God.
Speaker 3That's right.
Speaker 2Quick Educate us.
Speaker 3Okay, are you ready? Yes. Bananas are radioactive. Oh no. Yeah, they contain a natural radioactive isotope of potassium, which is one of the things that people eat bananas for, the potassium. Yeah, but they are all slightly radioactive. How about that?
Speaker 2So, do people who eat a lot of bananas glow?
Speaker 3That's a good question. They might.
Speaker 2Do they need contrast when they have a CT or?
Speaker 3an MRI? Oh, good one.
Speaker 2Maybe they're already lit up. Or can you just say, look, believe it or not, I eat three bananas a day?
Speaker 3Well, I can tell you that Susan Dey ate nothing but carrots for a long time and turned orange.
Speaker 2Well, that is a true story.
Speaker 3I know that's a true story.
Speaker 2Well, I, very sadly... one of my father's best friends, Leo Clarick, had cancer, and once conventional treatments had not worked, he went to dietary and homeopathic ones. Anyway, some doctor put him on, like, all carrots, or like tons of carrots a day, and he did start to turn orange. Yep. And it was crazy. I mean, we weren't expecting that.
Speaker 3And then we've got pink flamingos, who eat all the shrimp and who are actually not born pink. They're born gray, and they turn pink because of, yeah, digesting shrimp.
Speaker 2You know how you always say pink looks good on you? And you're right. Yeah, I think it's because you've got pink undertones, and I think it's because you eat so much shrimp. That's quite possible.
Speaker 3I'm just saying yeah, I know, it seems actually likely at this point.
Speaker 2I have a couple of things I want to get to, okay, before we close out for the week.
Speaker 3Now that we've discussed radioactive bananas yeah, oh, I'm sorry.
Speaker 2Did you have more? Oh no, we're all done, okay.
Speaker 3Yeah, radioactive bananas and not pink flamingos. There we go.
Speaker 2I want to say a Mi Shebeirach, and rather than describing what that is, I'm just going to say it. May the One who blessed our mothers and fathers bless and bring healing to all who are in need today. May strength come to their bodies, comfort to their hearts, and peace to their spirits. Send wisdom to the hands that care for them and patience to those who sit beside them. May they know they are not alone, and may healing come quickly, in body and soul. This week's Mi Shebeirach goes out to Ali and to Jose Lynn, towering people in my world, in our world.
Speaker 2Robert Redford died this week. Yeah, and he was one of the finest actors, directors and influences on the independent film industry through Sundance, and just an extraordinary man who went through so much in his life and was so much more than a stunningly gorgeous movie star. And I want to suggest that you watch one of these in his honor: Butch Cassidy and the Sundance Kid, The Sting, The Way We Were, The Great Gatsby, All the President's Men, Jeremiah Johnson, The Natural, Out of Africa, All Is Lost, Three Days of the Condor, The Old Man and the Gun. For Robert Redford, please put a light on.
Speaker 4Shadows along, bring it all. Tonight. There are thoughts can hide In a place seldom seen but remembered. In a dream. We see the rain Falling on everyone. We feel the earth Hear where the horses run. Fees will fall as they have ever done. Grace will change. Nothing is new here under a setting sun. Nothing is new here under a setting sun.
Speaker 4Late night, grass, familiar tune, your favorite spot, empty room, dancing past where we first met. Some things I just can't forget. Every club plays a melody. Every beat brings you back to me. Each time the trumpets start to play. I see you swinging, hear you say, baby, let's stay out till dawn. But now you're gone, yes, now you're gone. And all these clouds remind me of the way we dance, the way we love. We love Mix up same old song.
Speaker 4Thought by now I'd moved along. Empty seat here by my side. Can't pretend I haven't tried. Every sax note makes me sway, every rhythm leads your way. Each time the trumpets start to play, I see you swing and hear you sing Baby, let's stay out till dawn. But now you're gone, yes, now you're gone. All these chords remind me of the way we danced, the way we love. Should have held you closer then. Should have known what I had when Every night was ours to spend. Never thought this dance would end. Each time the trumpets start to play, I see you swinging, hear you say, baby, let's stay out till dawn, but now you're gone. Yes, now you're gone, and all these clouds remind me of you.