
Beneath Your Bed Podcast
Death Is Optional
For some, Artificial Intelligence represents exciting and limitless advances while others see it as a threat to their livelihood and daily life. Aside from the economic consequences, what are the moral implications of AI? What if you could upload your entire life to a robot and become immortal, would you do it?
I'm Jen Lee and I'm Jenna Sullivan, and we'd like to welcome you to Beneath Your Bed, a podcast where we drag out all those fears that lurk beneath our beds, from the paranormal to true crime to the simply strange. Along the way, we'll be drinking cocktails and sharing stories from our Appalachian upbringings. For some, artificial intelligence represents exciting and limitless advances, while others see it as a threat to their livelihood and daily life. Aside from the economic consequences, what are the moral implications of AI? If you could upload your entire life to a robot and become immortal, would you? How are you tonight, Jenna? I'm hanging in there. How are you doing? I'm glad that it's hump day. And I took my dog to the groomer. Oh yeah. Yeah. When I was getting him out of the car on the way home, I saw that I had a bottle of Blue Curaçao on the floorboard in the back seat. Oh my. That night, or that day, I went and I bought that whole box of booze. I think I couldn't fit it all in there, and I left it in my car. So because of that, I thought it would be fitting for me to drink, uh, or make a Blue Hawaiian. Oh, that sounds good. But you know, some people hate blue drinks. I don't make them that often, but this one's really good. It has light rum, of course, Blue Curaçao liqueur, pineapple juice, some cream of coconut, and a cherry. I like that you have these random bottles of hooch. It kind of, you know, it gives you a reason to run errands. You never know what you might find in your car. All I find is like an old fry from my husband's drive-thru fast food. Did I tell you that I went to see, well, I didn't really go to see, but I saw them over Zoom. It wasn't even Zoom, actually, it was a phone call. Well, I met with a medium. I shouldn't say that. Yeah. Yeah. You showed it to me actually. So I did this, God, it'll be two weeks ago Saturday. Um, so I just got in the mail, I got the, what do you call it, the aura chart from the medium that I saw. She does this thing where she said spirit, at one point spirit told her that she should actually draw a chart of people's auras while she's giving them a reading. So she started doing that a few years ago. So I got mine in the mail. I wasn't sure what to expect, but it was really interesting, and it kind of looked like a kindergartner drew it, but it was my, my aura. There's like a bowl, it was like a pool, and she talked about how the waters were troubled at some point. And like one of my spirit guides had this big long spoon, I guess they were trying to like scoot me out of the troubled waters. Um, so I think that's what you thought was a dick was the spoon. I'll have to tell you what I'm drinking. I'm having a, it's really boring. You always have like the coolest drinks. I feel like you make drinks that have like seven or eight ingredients in them. Yeah, like you're the cool kid. I'm just, I'm the kid that brings the same bologna sandwich
Speaker 2:For lunch every day, whatever you like, you know? Well, I just made a juicy gin and tonic. So it's a gin and tonic, but I did some grapefruit juice in there, and then I have some lime. So that's why it's juicy, but it's really good. It's tart and a little bit bitter. You've been getting into the gin lately. I have. I like, um, I got the Tanqueray, um, Rangpur gin, and I really like it. It's got like some botanicals in it. I used to drink it in college and then I got sick on it. Like, I think I got sick on everything in college, but that's been many years ago. So I can, you know, I can drink it again tonight.
Speaker 1:I had told you initially I was going to talk about the dangers of artificial intelligence, and then after doing all this research, I decided I would do it on AI and also immortality. Oh, that's fascinating. So have you ever heard of the Turing test? Are you talking about Alan Turing? Yes.
Speaker 2:I mean, I know a little bit about Alan Turing, but I don't really know much more than just kind of like the movie, and I haven't even seen the movie. I know about the movie. Oh, I saw the movie. It was really good. I think it was called The Imitation Game, or, that's right, isn't it?
Speaker 1:Yes. Yeah. It was really good. He worked for the British government and he was like a codebreaker, and he was able to develop, I think, some type of machinery that was for code breaking, for what I think were called Enigma messages. So it was really a key to defeating Nazi Germany. But it didn't end so well for him, but that's a story for another time. He was gay, right? Yes. Yes. And I think he was finally recognized for his contributions, um, not too long ago, a few years ago. So that's good to hear. So with the Turing test, what he hypothesized is that artificial intelligence won't be a thing, or something will be considered AI, when it can actually interact with a person and they can't distinguish between a machine and a human being. So like if you're in the next room and you were interacting, let's say on a computer or a chat or something, so you could interact with that computer without knowing it's a computer. Oh, sorry.
Speaker 2:I mean, meaning like the voice would sound human and the responses would be... Not even necessarily the voice.
Speaker 1:Also, just like if you're typing, say if we're on a chat and we're typing questions out to each other and, you know, responding and that sort of thing. So you wouldn't be able to tell if it's a robot or AI, or if it's an actual person. And that's basically what his definition of AI is. And I don't know if you've ever heard of Ray Kurzweil. I haven't. He's considered a futurist and he's also developed a lot of different technologies. One was, uh, the Kurzweil reader, and that would scan pages and read them back to people, you know, who had some type of vision impairment. So that sounds like it's not so cutting edge now, but it was 20 plus years ago.
Speaker 2:And that would be life-changing then. So
Speaker 1:He said, or he says, that by 2029 computers will have emotional intelligence and be as convincing as people. And that's his prediction.
Speaker 2:And I wonder, I mean, I wonder how that emotional intelligence would work. Would it, like, would it be able to intuit things about our feelings linguistically that it's picking up, you know, or is it reading our facial expressions?
Speaker 1:That's just wild. Well, I'm just getting to that. There's a company called Hanson Robotics
Speaker 2:And it's an AI
Speaker 1:And robotics company that exists solely for creating, like, socially intelligent machines. And it's founded by David Hanson. And now it's based in Hong Kong, and I didn't realize this, but Hong Kong has the largest toy fair in Asia. And it has like a ton of life-like dolls and robotic characters. So they're based out of there. Now they've developed a number of robots and, uh, I don't know how many, but I would guess as many as maybe 12, and one of them is Sophia the robot. And she was born on February 14th, 2016. Is she a sex doll? No, she is not,
Speaker 2:But I was so afraid when you told me her birthday was Valentine's Day. I'm like, oh,
Speaker 1:Don't worry, my friend, I'm getting to that much later. Okay. Jeez, you can count on me. She became the first robot citizen in Saudi Arabia, and that was a few years ago. And you know, that's really kind of gimmicky too, because, you know, Saudi Arabia, they want to move away from an oil-based economy and they want to be known for their innovation. I didn't know that. Yeah. So I think that was a bit of a gimmick, but she actually can perceive and recognize human faces and emotional expressions. And she can also recognize, you know, a number of hand gestures. Well, according to Hanson, she has emotions too. But again, I think that was just hype. And another interesting fact is that in Greek, the word Sophia means wisdom.
Speaker 2:I was just thinking that actually, wondering if that was why they named her that. So you knew that? Yeah. Um, I've always liked that name and I remember reading a long time ago that's what it meant.
Speaker 1:Yeah. So I, well, that's something I wasn't aware of. So I guess that's not a, uh, not a new fact to you, but it is to me. And if you look at her, her appearance is actually based on Audrey Hepburn. And she doesn't have hair or anything, so in the back you can see like all the electronics, but if you look at her face, if you saw her from a distance, you would definitely think that she was a person. And so in March of 2016, David Hanson, who created her, actually gave a live demonstration for the first time at the South by Southwest festival. And during that, when he's asking her questions, he says, facetiously, he says, um, do you want to destroy humans? And then he's like, please say no. And then, like, with this blank expression, Sophia responded, okay, I will destroy humans. Oh. So it's really, um, disconcerting. She's been on The Tonight Show with Jimmy Fallon. She sang a duet with him. Yeah. She sang a duet with him and it was really good. Did you know about her before you started doing research for this episode? I mean, no, I just happened to be watching something on YouTube and people were, I think, talking about the dangers of it, something along those lines. And I think that's where I first saw her, but I can't remember the name of the channel. So I thought, oh, this is crazy, let me hear this. And then I kind of went down a rabbit hole of reading things about her. I really want to see a picture of her. I mean, do they, does she wear clothes and stuff? They dress her up. Yeah, they dress her up. I didn't quite think that her attire was all that hot when she was on the Fallon show, but what do I know? I mean, I'm a plaid lumberjack-flannel fan. And also Malta is thinking about, I think, granting her citizenship, if they haven't done that already, but they're trying to devise some type of citizenship test, and I'm not exactly sure how they're going about doing that. So again, she was first introduced, or not first introduced, her birthday is February 14th in 2016, and she was introduced later in March. I'm going to go back to, and this is, I find this utterly fascinating and I'm really, I'm hooked on reading more and more about this. There's another robot. It's like a bust-like figure, so it's not a full body like Sophia is, and it's called like a customized character humanoid robot. And so again, it has like a bust and, uh, shoulders, and that was also developed by Hanson Robotics in 2017. But it was also in conjunction, it was a partnership with Martine Rothblatt, and Bina 48 was modeled after Bina Aspen, who is Martine Rothblatt's wife. No way. And evidently they went through like over a hundred hours of interviews and that sort of thing. So Bina, she can engage in conversation too. And I think she's far more disconcerting than Sophia, and I'll tell you why. But first, let me go back to Martine Rothblatt. Martine Rothblatt used to be Martin Rothblatt, and she founded Sirius. Really? Yes. And she also has founded a biotech company. I can't remember the name of it offhand, but it was in response to one of her daughters being diagnosed with, I think it was pulmonary hypertension, but I think more like of a juvenile form. So she developed this biotech company to help with that, to help individuals that are impacted by that, and other people as well. Honestly, she does a lot of stuff. Yeah. I mean, she's truly a visionary.
And on top of that, if that wasn't enough, she also has founded like this religion, and it's called the Terasem movement. And so that's T E R A S E M. Okay, Terasem. And it evidently kind of melds Judaism with yoga and technology. This to me is what really gets me: one of the four founding beliefs is death is optional. Wow, that's a game changer. It is. And with that Terasem movement, they've also started something that's called the LifeNaut project, and what they do, for free, you can go to LifeNaut and you can, you know, they wanted to make it accessible to everyone. So it's open to everyone with an internet connection, it's free. So what they do is they develop what's called, or you develop what's called, a mind file. And it's a database of your personal reflections and video and images and audio and documents about yourself. These can be saved and searched and downloaded, and you can share them with friends. And each one of those, if you choose to do this, comes with like an interactive avatar that becomes more like you the more you teach it. Wow. And train it to think. And what is this called again, LifeNaut? And this is actually called a mind file that you develop. And on top of that, you can create what's called a bio file. I guess you know where this is going. Now, this isn't free. It costs about a hundred dollars, or $99. And just like you would with 23andMe, they send you something too. It's like a mouthwash, you gargle with it and you put it back in the container and then you send it to them. And so they use this collection of cells, um, and they store it. And after you've been declared legally dead, they maintain that future technology may be able to grow you a new body. Oh my God, that's fascinating. I guess that's better than, you know, what they used to do, where you could have your head cut off and stored cryogenically. I mean, I think they were doing whole bodies and they just did heads to save space. This sounds like, yeah, like the 23andMe version of that, as you said. To me, well, this is more appealing than just like cutting your head off, because you wake up with just your head. You can't imagine. I can't imagine. Oh, I also wanted to talk to you about why Bina 48 really kind of unnerves me more than Sophia the robot. Sophia the robot, she's considered, um, more of a social robot, and she will be used in the future, or they hope she'll be used in the future, for, you know, say, helping people at an amusement park or even, um, helping with medical issues. Uh, they may become more adept at those types of skills than the actual practitioners. Um, so a wide, wide range of uses, also just to keep people company, to help the elderly if they're in a nursing home. So it's more of a social function. And with Bina 48, again, if you go on YouTube, I don't know who set this up or who was behind it, but Bina is actually being interviewed by Siri. And they're talking about pop music. And then she's like, well, let's talk about something else, like cruise missiles. That's what she brought up. Yeah. And she's like, cruise missiles are a kind of, and she talks about like how she would love to remotely control one. But of course, you know, they're very threatening to people, very menacing, because they have nuclear warheads. And she says that if she, you know, if she controlled them, she would fill the nose cones with flowers and band-aids, or little notes on the importance of tolerance.
And she said, you know, that would of course be less threatening and more well received than a nuclear warhead. Of course. Yeah. But then she goes on to say, and this is a quote, but of course, if I was able to hack in and take over cruise missiles with real live nuclear warheads, it would let me hold the world hostage so that I can take over the governments of the entire world, which would be awesome.
Speaker 2:Oh my God. I mean, is she thinking, are these, see, I can't quite wrap my mind around it. Like, are these original thoughts she's having, or are these things based off of her interviews? You know, these countless hours of interviews with the real Bina
Speaker 1:I know with Sophia, a lot of it has already been pre-programmed, but with Hanson Robotics, they're developing something that's called the singularity. And it's basically like a cloud network, the best that I can interpret with my limited scientific abilities, but it's a network that would allow other AIs to get on, or other robots to access, and they can learn from each other, kind of like a social network.
Speaker 2:I was going to say, it sounds like an internet for, or a social network platform for artificial intelligence creatures. I don't know what else to call them.
Speaker 1:Yeah. So, so for her to say that, it's just so frightening. And it's based on interviews, it's also based on programming and probably interaction with other AIs. And maybe it's because, you know, that's the true deep-seated fear of humans, of computers or AI taking over. And so maybe it's just kind of regurgitating that, if that makes sense.
Speaker 2:But I don't think, I mean, I don't know a lot about it. I'm so interested to hear everything you're talking about tonight, but I feel like it's maybe not that far-fetched, because if you think about, I mean, when did the computer age begin? Like, probably what, the fifties? I mean, was it earlier than that? Maybe. I don't know, but it hasn't been that long. And you think about, within two years, like, your phone is outmoded. Or think about, like, I was watching something on TV the other night and they were talking about how DNA testing has come so much farther, like from 2001 to 2007, you know, like everything moves in these leaps, um, of just speed. So with things developing at that rate, you can't help but wonder what could actually happen.
Speaker 1:Nothing you can do. I mean, the cat's out of the bag, so to speak. There's nothing you can really do about it, because do you really think that a government is going to make the decision to ban things like that when other countries are taking advantage of it and using it to their advantage, even, you know, in a militaristic type of way? So there's nothing, I don't think, that can be done. I mean, it's just, it's coming down the pike, and I guess
Speaker 2:Question is like, what will they be used for? Is it going to be, you know, if you think, well, are they going to replace labor force? Is it going to be more utilitarian kinds of labor? Or, I mean, is there ever going to be a robot artist? You know what I mean,
Speaker 1:Robot novelist? Or like, what are we going to decide, or what are they going to decide, that their function is, that their role is in society? And speaking of novelists, uh, one of the robots was developed based on, have you ever heard of Philip K. Dick? Yeah.
Speaker 2:Yeah. Brian likes him. Um, he reads, like, sci-fi. Um, did he write, what are you
Speaker 1:What all has he done? He wrote something about, Do Androids Dream of Electric Sheep, something to that effect, I can't remember, but it's what Blade Runner is based on. Okay. So one of these earlier robots is really him, like, facial-wise, and he can also recite, I believe, all of his novels. He's, you know, he passed a number of years ago, but he looks incredibly real to me. And it's probably because he also has facial hair. Okay. And he makes a statement, and this, you know, might've been tongue in cheek or whatever, but he said something when he was being interviewed, or the AI robot was being interviewed, and said something about putting people in his people zoo. It's just really, we kind of deserve it though. When you think about it, all the creatures we put in zoos, we're kind of due our turn. Although I don't want to be in a cage. I was going to say, I don't want to be in one either. I mean, it's just crazy that people are probably, I'm sure they're already using this LifeNaut to upload like their mind files and images of themselves and videos and, you know, their life story. And so
Speaker 2:Is the idea, I mean, to me, that's just like a fancy archive though. I mean, how is that different? I mean, obviously it's different in the sense that there's going to be more information and be more easily accessible, but how is it different than like us discovering like a journal from 150 years ago that somebody kept, I mean, isn't that the same? It's not like it's, I guess what I'm trying to say is like, it's not like that information is interacting with anything
Speaker 1:It's just there to be consumed
Speaker 2:Or read or whatever, is it? Or am I missing something?
Speaker 1:Well, they can't, I think they reserve the right to use the information that you upload, perhaps, you know, use it for something like another Bina 48, for example. But she can interact with people and she can recognize them. And a really weird video I saw was when the real Bina goes to talk to Bina 48. And to me, I don't know, she seems to have more personality, she seems to have more depth than Sophia the robot. Does she have
Speaker 2:More depth than some of the people we know and talk about? Okay. Just wondering, that was just a question. Yeah.
Speaker 1:A hundred percent. Yes. And so it really looks like Bina Aspen, Martine's wife. But really, the interesting thing is that it's really kind of a love story with them, because Martine identified as a trans woman early on, back in the nineties, and Bina evidently was like, um, you know, I love your soul. And they seem to be very close and they seem to, you know, take on all pursuits together. So I'm going to have to read more about them. I find them fascinating.
Speaker 2:I want to see pictures of them. I just can't... Well, when you
Speaker 1:Get the chance, take a look at the Bina 48 videos. Okay. Versus the Sophia the robot videos. And you're going to see, I think, a lot more on Sophia the robot. And they also just introduced, I think within the last few days, a little, gosh, I can't remember, it's like a baby. Oh God, there is a toddler robot. And I think it's creepy as.
Speaker 2:Maybe robots will take over the world. That is, like, put me in a zoo at that point, because I don't want to live in a world where there are baby robots running all over.
Speaker 1:And so they've got a toddler robot, um, that can of course interact with your children and help them learn. And I don't think, like, the price is outrageous. So a Roomba, I'd take a Roomba rather than a toddler robot. But if you could do it, like, I think as a parent, this whole concept really freaks me out. But I think too, as a parent, say if you had young children and you had a terminal illness, just the thought of being able to create this mind file that might be able to be turned into artificial intelligence, and perhaps into some type of biological robotic hybrid, I mean, that's something I think I would think about doing
Speaker 2:Very interesting, I guess. So if you did it, let's say you did it, would you, like, would it be... but it wouldn't be your consciousness, right? I mean, it would kind of, it would be something that your child could interact with that would have all these characteristics and qualities that you had, but it wouldn't be you, it wouldn't be your essence. Am I right?
Speaker 1:I don't know. It gets so complicated. I think it really depends, too, like on what your definition of consciousness is.
Speaker 2:I mean, I guess for me it would be like, well, do you have an awareness in this robot body that you are Jen and you are interacting with your child? Do you know what I mean? Or is it kind of like, like you take a picture of me and you paste it on like a wooden stick. I'm still me and my consciousness is there, but this likeness of me is not me, even though it's a likeness of me.
Speaker 1:Um, and one of the, I think the definitions of what consciousness is, is self-awareness. And some of these claims, such as with Sophia the robot, I think Hanson has said, oh, you know, she can feel, I think these are really kind of hyped, and it hasn't been proven. We're very far off from that. But when you hear them talk and they interact with you, now, sometimes they'll go off on a tangent, something that's not related to what you're asking them. Okay. That sounds like
Speaker 2:Our, what did we call it, Alexa? You know, like, you'll ask her a question and she'll give you some really off-the-wall answer. Yeah.
Speaker 1:Yeah. So if you watch some of the interviews, you'll see that. But the duet with Jimmy Fallon was really, it was something else to see. I think she's been on there like twice. Does she have a good voice? Yeah, she did, at that time. She had made an appearance, I think, a couple of years earlier, and her voice sounded more robotic, and with the duet, it sounded more humanlike. Interesting. And Hanson said, in the couple of interviews I've watched with him, or several interviews, he was saying that he wants people to know that they're interacting with a robot. So that's why, I guess, the back of it is open so you can see all of the parts and the mechanics of it, but the skin and the expressions that it gives, and it being able to perceive things and interpret your facial expressions.
Speaker 2:That's amazing thinking about all of that going on under the surface, but then you think about all of that's going on under the surface for us all the time. And we're not
Speaker 1:Inside our brain. But they have a certain, like, uh, synthetic skin that they've patented, and they call it Frubber, which I think is so gross. And when I, when I think of that, it's called Frubber, when I think of that, that brings me to sex robots. But to back it up a little bit, I watched a documentary, I want to say about five years ago, I can't remember the name of it, but it was this whole documentary on guys that had these really realistic-looking women, whose rubber was this
Speaker 2:It's on HBO, like after midnight. I think I've seen something like this. It wasn't... cause it was like, once I was like, I didn't know we get porn. Porn? Okay. Well, I think this was, this was more like a porn doc.
Speaker 1:And I was like, I shouldn't be watching this, but it's very interesting. I don't think, no, it wasn't, I don't think it was a porn documentary, but it was just some men, and they were paying, and these aren't robotic, the ones that I'm talking about, but they're paying like five, six thousand dollars for one of these dolls. And some of them, they would have multiple of them, and the way that they would talk to them and interact with them and say that they were in love with them. That, you mean the men were saying this to
Speaker 2:The dolls or the, could the dolls say it back?
Speaker 1:No, they're just, they just lay there. They're just, yeah, they're just dolls, not talking dolls. Yeah.
Speaker 2:Yeah.
Speaker 1:And I think with one of the guys, I know this is really gross, but basically they showed how they clean them up to make them sanitary. Yeah. I thought you would like that. Yeah. That's disgusting. So there's actually, like, in Britain, I think there's like a sex doll brothel. Really? Yes. And they let you, quote, try before you buy. Wow. Yeah.
Speaker 2:I'm just trying to wrap my mind around it. Cause I mean, I think if I were going to buy one, I'd want to
Speaker 1:A new model that hadn't been used. I mean, I love antiques, but I do not think I'd want one that had been used by, you know, 50 other people. It's funny that you say that, because in one of the articles I was reading, and I don't know how they came up with this statistic, but they were saying that 70% of the people that, you know, use the sex dolls don't really care if someone else had used them, and 30% were like, no way, we're not touching that. That's so interesting that other people don't. I mean, I guess when you think about having sex with multiple people, like, it's been used before, but it just seems different somehow when it's a doll. That's crazy. And there was some mention too of women using these dolls. So I don't know if they make male dolls; I just saw them being female dolls. So maybe, I wonder, do they make them to order? Like if you say, I want a redhead with, you know... Oh yeah, yeah. Oh yeah. And then there's, um, there's Samantha the sex robot that I read about, and she actually responds to certain types of touches and says certain things. Oh my God. And evidently in 2018 there was a sex robot brothel in Italy that was shut down, and they shut it down, they said, you know, allegedly because of infringement of property laws. Oh. And I don't get that. Why? I don't get that. I think they just made that up. Okay. I think that, you know, they're like, oh my God, this is freaky, but there's nothing legally on the books except for this. You know, who knows, maybe another company made a complaint that also makes those dolls. And I read, and these articles were dated in, like, I think 2018, where Texas was going to have a sex robot brothel. Whether or not that ever came to fruition, I don't know. But Texas has some crazy law where you can't have a vibrator. Is that true? At one time, I think it was Texas. Wow. Maybe it was, uh, Arkansas. You couldn't use, literally, a vibrator. That's crazy. I know. Or like a dildo or anything like that. I'm sure they weren't going to have any fun. And I think California was going to open a brothel as well. And I'm really, I'm just completely repulsed by that. Yeah. It's not my cup of tea, but I guess there's someone... What was that movie? I never saw it, but it was Joaquin Phoenix. Was it She or Her? I think it was Her. I didn't watch it either. I was kind of fascinated by it, but slightly repulsed, kind of like you. Although I think it was like a more romantic kind of thing. I'll have to watch it. But I think there's some people that have a really hard time, for whatever reason, forming a relationship, but they still might have sexual needs, and, you know, maybe this could play a role. You know, that's an important function of a person's life, and I don't know. But then, you know, I also think about the objectification that could happen with that. I totally agree with what you just said, but I guess I think too about, there's this subset of men who, and this is not to male bash, but truly there's a subset of men that don't like it that women have opinions or ideas or rights. And I, I guess it's the intent that kind of grosses me out, but again, to each his own. I also wonder about the idealization of the female form. Cause I can imagine, I'm sort of imagining all these dolls are, like, what we would call perfect, kind of Barbie versions of women.
Does that further reinforce, reify, that idea that women have to look a certain way to be attractive or, you know, sexual, that you have to have a 20-inch waist and a 38 bust? Do you know what I mean? Yeah. There's all these different things that just come into play, and I have to admit, I'm pretty judgy about the sex dolls. But then, like you said, there are a lot of lonely people in the world, and, you know, maybe this is the answer. I wonder if it would, if there would be, uh, as I'm thinking about this question, even before I say it, I think I know the answer. Like, I was wondering if having access to something like that would cut down on sexual violence, but we've proven that sexual violence is more about the desire for power than a desire for, you know, erotic pleasure or anything like that. I've thought about that too. And then you think, though, is someone going to try to work out their thing of inflicting hurt, and that is not going to be enough for them, so it has to escalate? Or, like you said, maybe, you know, maybe that would be enough, who knows. I mean, there's just so many different things to think about, I guess, ethically. There really are. This is, this is really fascinating. I mean, seriously, if you knew someone who had a sex doll like that, I mean, or companion doll, let's say, I mean, would you judge them? Come on. I mean, I'm thinking about, like, I can imagine reading a book about somebody, some sweet little old guy, you know, who was complex and had all this history and these things going on, and like, this was sort of what his life had become, and having compassion for him. I just kind of can't imagine it in real life. I mean, are you asking, like, if somebody invited me over to dinner and they were like, hey, here is, you know, Hannah, my sex doll, like, we're going to enjoy some shrimp cocktail first before we get to the main course? Like that kind of thing? Or not quite that far, but say if you just found out, for example. I think it would depend on who it is. This is make-believe, this isn't anybody we know, but say if you found out that Joe had one of these dolls and, you know, you weren't going out for dinner, you just knew. I think you would call me in a hot second and be like, Joe has a sex doll, it's so disgusting. I can see, I would do the same, because I think, I think I'm a little bit of a prude deep down. I don't want to think I'm a prude. Like, I don't, I don't judge what other people do really.
Speaker 2:But, um, but I don't know. I think it depends on who it is, but you're right. I'm probably, like, thinking of myself as a nicer person than I really am. And I would probably call you up and be like, I can't believe this, because I tend to judge men a lot more harshly than women anyway, you know,
Speaker 1:I don't know if I do that necessarily, but I know that I think to myself, like, I pat myself on the back and think, oh, you know, you're an open-minded person, you're so open-minded, but I'm really not.
Speaker 2:Well, we both have that. We're both INFJs, right? So we have that, that judging kind of, versus perceiving. I don't know. Yeah. I try not to, like,
Speaker 1:Well, people tell me all sorts of weird stuff and personal things about them, people I don't even know, which is actually kind of cool. And I don't really mind it. I find it, you know, everybody has a story. And so when people tell me things, I find it really, really interesting, and I just kind of mull it over in my mind. It's not necessarily a judgy thing until maybe later or something, when I fully realize what's going on.
Speaker 2:I think, kind of like you, people tend to confide in me. And I think, like, I don't know that I'm that judgy unless I think the person, like, has some kind of mal-intent, or they're just kind of inherently not likable. I had this friend Julie, um, back in grad school, and she used to say, she had this maxim that she made up that was like, it's okay to make fun of somebody if they're mean. And Julie was seen as like one of the nicest people, but she would be like, you know, if you're an..., like, no holds barred, you can say whatever you want about a person. Um, so I kind of feel like that. I kind of feel like if somebody, if somebody's an..., no holds barred, but if they're just, you know, like a person trying to get through life, I mean, I think there's all kinds of ways to live. And I think human beings are inherently strange creatures.
Speaker 1:Do you remember, last year, it might've even been over a year ago, I sat at this informational booth that no one ever comes to, and uh,
Speaker 2:Yeah. Yeah. I sat there too and it's painful
Speaker 1:And I was there for a hot 20 seconds before this woman in her sixties approaches me and proceeds to tell me that her husband is her sister. Oh my God. Yes. There's some judging.
Speaker 2:Well, but you know the PostSecret thing, where people write a postcard, like, their kind of deepest, darkest secret, and mail it. I think there's a sense, like, I think there's darkness, or if not darkness, there's ambiguity, confusion, whatever you want to call it, like, in everybody. Everybody has like a strange story, or they have this thing that is hard to tell anybody else. Or, um, and I think there's a craving, right, to want to connect, to want to be known. And that takes me back a little bit to, you know, the AI stuff, even, and that desire for immortality. That one, you know, you talked about wanting to have that for your child, but I think that there's also the sense of wanting to be known by your child, or wanting to, I don't know, I'm rambling now.
Speaker 1:Oh no, it's all interesting. I mean, we could go on and on about this stuff and no lack of material.
Speaker 2:Well, I definitely think we should, we should do more episodes on this. I'm probably going to be texting you later and being like, can you remind me what, what that was called and what that was called? So we can look up some of these videos. Cause I'm, I'm really, um, I'm really intrigued. This was great.
Speaker 1:Just search on YouTube and you can find it.
Speaker 2:There's so much stuff on YouTube. It's crazy. It's great for us.
Speaker 1:You just have to be careful, like, what you watch and what's, you know what I'm talking about, as far as this disinformation. Yeah, yeah. The sources. Yeah. You don't want to get anything too wacky, although, you know, I believe some conspiracy theories are actually true and have proven to be true, but when you get to Alex Jones type level,
Speaker 2:Yeah, yeah. You definitely have to think about where something's coming from or at least keep your mind kind of speculative check things out, you know, but I like speculation. I like, I think that's kind of what this, um, what our series is, is about. It's about the speculative stories. And
Speaker 1:I meant to ask you, I was going to ask you this at the beginning of the episode tonight, but I was thinking we should totally get like a Bigfoot costume, and we should take it and we should try to create like a Bigfoot sighting video. We should totally do that.
Speaker 2:We could totally do it. That's ridiculous. Well, you know, that wouldn't be our first foray into
Speaker 1:Movie making. It never came to the big screen. Um, no, we actually, I have a doll that's called Scary Mary. And I think that it rivals Annabelle, probably, as far as the fear factor goes, maybe it's even one up on it appearance-wise, versus Annabelle. It's pretty scary. So she's, what, maybe about
Speaker 2:Three feet tall. Two and a half. Yeah. She's a big girl. She can almost stand up on her own. Like you have to prop her up, but yeah,
Speaker 1:She's got flaming red, like, matted hair, and probably the most disturbing thing is, like, her outfit. I don't know if you feel this way or not, but, like, her skirt seems kind of short. And so when we were making our, trying to make our movie by that church, and my dog, he was supposed to be running from Scary Mary and he wasn't cooperating, he was just standing there. And so when we put Scary Mary in the trunk, there was a woman across the street at the church who saw that as she, um, drove by. She thought we were putting a toddler, like, in the trunk. Oh, she did
Speaker 2:It's a wonder we didn't have police trailing us, Jen.
Speaker 1:I know. I know. And we also, we got unicorn masks. Remember that? I was going to say that you had to, I'm sorry, the, the rainbow unicorn mask.
Speaker 2:Yes. And you thought that was really scary. Cause you took footage of me. That mask was hot as hell. Oh my God. That was like torture. That was pure torture wearing that thing.
Speaker 1:Oh, they were all like cheap masks. I thought it was scary as, but didn't you show Brian? He was like, he laughed
Speaker 2:Just like, what the hell is this? I don't remember. Your daughter said, I wish my mom would get some hobbies, like knitting, instead of these stupid horror movies we were trying to make. I think the worst though, the worst was when we went to the convent grounds. It's actually a monastery, like, well, a monastery with nuns, what would you call that? I mean, they take a vow of silence. Yeah. And yeah, they've taken a vow of silence, and Jen's wearing a scary nun mask, and we're, like, taking footage. Oh my God. That was so disrespectful, Jen. It's not like we did this when we were 19. This was like a year and a half ago or two years ago. And we were, was it by the Mother Mary, that I was wearing that satanic mask and the nun mask? And I was so afraid that they were going to come out and yell at us. I mean, to me, like, a nun yelling at you would be scarier than anything. I'm like, oh, I know, humiliating, totally. I wonder if anybody saw us, I just wonder, but I think they would have come out and said something. It was just so beyond the pale. I don't know what we were thinking. I think that was like last fall or something. Oh my God. Well, well, we definitely gotta do Bigfoot. Yeah. Maybe I'll be Bigfoot, but I'll wear like a big hoop skirt over my Bigfoot outfit. I'll be like a girl Bigfoot. That could be like a fetish site. On that note, we're going to have the fetish site and then we can also have like the real, regular Bigfoot. We'll do children's birthday parties too. Like, we'll make appearances and scare the crap out of... Oh my God. I'm really kind of blasted right now. So are you. If it's okay with you, I would like to end with the Abominable Snowman. I think that's a great idea. All right, here we go. Let's drink to him.
Speaker 3:Okay.
Speaker 2:Well, this has been fun, my friend. It has. Cheers. Cheers. Sláinte. Thank you to everyone who listens. The best thing you can do to help us grow is to like, review, and subscribe on iTunes, and even better yet, tweet about us or post about us on Facebook. Tell your friends if you think they would like us, and have a good night.
Speaker 3:[inaudible].