Dispatch Ajax! Podcast

Artificial Intelligence in Pop Culture

Dispatch Ajax! Season 2 Episode 67

We begin by exploring the curious case of droids in the Star Wars universe – conscious beings treated as property and slaves despite their clear personhood. The moral contradiction is striking: characters form deep emotional bonds with these synthetic beings while simultaneously accepting their status as possessions. This paradox raises profound questions about how we define personhood and the ethical implications of creating sentient life only to subjugate it.

Speaker 1:

He must have felt really badly when Chow Yun-fat got a series in America.

Speaker 2:

Gentlemen, let's broaden our minds.

Speaker 1:

Are they in the proper approach pattern for today? Negative, negative. All weapons! Now! Charge the lightning field, man.

Speaker 2:

Part four, huh. At least part four. Will this be the end, Skip? Is it the end of artificial intelligence for us, you and I?

Speaker 1:

It is. It's at the end, my friend. Will it finally take over?

Speaker 2:

You don't even need us. Yeah, man, we are obsolete. Let's talk about them, the robots, now. We could go on and on, days and days, weeks and weeks, years and years, before the heat death of the universe even happens, just talking about robots and artificial intelligence in popular culture, specifically, mostly movies and books. Jesus, we'd be there time immemorial.

Speaker 1:

It would be literally the entire premise of our podcast.

Speaker 2:

Hey, we're moving and shaking folks, all right, we have more things to talk about, so we can't just spend forever on AI. But there were a few more things we wanted to discuss. We wanted to get out there, we wanted to lay it hard on wax, as they say, that nobody says.

Speaker 1:

We have a few more things to say, and by that we mean the main premise of the episode we were trying to do. That turned into many episodes.

Speaker 2:

You know, AI is an evolving topic and subject. True, it touches our real lives much more than Rocky does, and there are a lot more philosophical and theoretical implications, much more than Highlander. So I think there was room to talk about this, and it was something that was both vital and, in our minds, necessary. So let me just get us off the hook there. Now, completely apropos of nothing: droids in the Star Wars universe. This is a fantasy world, however much technology seems to be prevalent in it, but I think droids are maybe one of the most prevalent presences of AI in mass pop culture. Everybody knows Star Wars, and by knowing that, you know about at least C-3PO and R2-D2, and if you're a little younger than us, you definitely know about them. Droids, like the Droid Army that fights on, I don't know, Geonosis, and is it Kashyyyk? Or the Trade Federation, and blah, blah. I don't even know where those came from. Are they trade droids? You know what, did they

Speaker 1:

make those. Those movies don't make any sense or matter, so, I mean, sure. Yes, I think they are Trade Federation, made by the Trade Federation.

Speaker 2:

I mean, that's smart for them to make these droids and then, I'm assuming, sell them to their partners, because they seem to be fighting everywhere. Roger, roger.

Speaker 1:

Yeah, I think they're Trade Federation-built. Well, they're paid for, at least; I don't know that they built them, but they never tell you either way. I mean, the only real origin thing that you get into is the Jango Fett shit in that movie.

Speaker 2:

Right, but that's all about the cloning and the Clone Wars.

Speaker 1:

Lucas never actually gives you anything. I don't think they were explored, even in the Genndy Tartakovsky stuff or in the ancillary Clone Wars stuff. I don't think they ever tell you any of that. Probably not.

Speaker 2:

I mean, it really doesn't matter. It was a passing thought. But really, what I'd like to talk about is the weird depiction of droids in Star Wars.

Speaker 1:

So bizarre. It's so odd.

Speaker 2:

Again, a real Star Wars podcast probably knows many more examples and has a much more intricate understanding of the moral implications of droids and how they function in the plethora of societies in the Star Wars universe. But what we are exposed to are these seemingly conscious beings that also maybe are not, that maybe just run their specific programming, but that also definitely seem to be in constant slavery to everyone, except for the few times when they are given free rein to live their lives. So, okay: C-3PO and R2-D2 are definitely slaves. They are bought and sold.

Speaker 2:

They are owned by, well, Jesus, we don't need to go through that whole pantheon. But in the canonical, when we first meet them, which would be A New Hope, Episode IV, however you want to think of it, they were owned by Leia and then are sent on a mission, captured by Jawas, then sold to Luke and his uncle and aunt, and then Hold on a second. Weren't

Speaker 1:

they owned by Captain Antilles and Leia just reprogrammed them.

Speaker 2:

That's true. Yes, that was Captain Antilles.

Speaker 1:

Which I learned mostly from the Droids cartoon TV show.

Speaker 2:

That was a thing. But if you own a droid and then someone else reprograms that droid, how does that work?

Speaker 1:

Right? Is that legal? Trying to figure out how these things work technically in Star Wars is a fool's errand, I think, in general, but we should go down this rabbit hole. They were owned by Captain Antilles, reprogrammed by Leia, and then they found their way to Luke. I don't really want to factor in the idea that Anakin created those droids, because it's just stupid and it makes Empire not as good.

Speaker 2:

There are so many problems there. But despite their lineage, what really crumples me up is: how do people interact with them? How are they used? Are they just used? To me, it's almost like they're pets. Okay, sometimes they see them as friends, as something to care for, but they don't ever free them from bondage. We see that they can live their own life when we get to Empire. We do see an android bounty hunter, right? IG-88. Yeah, I used to have that toy.

Speaker 1:

I loved that fucking toy. That was so cool. But I lost, yeah, I lost the fucking breather thing that attached to him. The breather thing? I'm thinking of a different droid. Never mind, it doesn't matter. So many, so many droids. There are a lot of droids in Star Wars, just like everything in Star Wars. Plus, maybe that's for a different podcast.

Speaker 2:

But Star Wars is, I think, a real genesis for pumping out content. 100%. For marketability and saleability.

Speaker 1:

They weren't the first, but they were the most prolific.

Speaker 2:

Yes, I think they were a test case for "let's give everything a name, a title, turn it into a toy so that we can sell it," which has become rampant thereafter. It jumped off in mass pop culture, I think, with Star Wars. I think you and I both recognize that that is one of the biggest problems with comic books and, you know, with movies in general.

Speaker 1:

But Star Wars really did, you're right, set that definition. That is the baseline. I mean, you had Evel Knievel before that, and you had the Six Million Dollar Man, things like that. But Star Wars really is the reason that you have the economic structure for IP merchandising today.

Speaker 2:

That is why we know, like, IG-88. Are you making things to tell the story, or are you making things to sell a lunchbox? Be that your BB-8 or whatever cute little Baby Yoda thing. You can look at yourself and say, is this the right way to do things? Maybe not, but my bank account says that I'm going to keep doing it. So just because you find something immoral doesn't mean you're not going to do it. Yeah, that's true.

Speaker 2:

Speaking of, I'm sure there are a lot of people who find the behavior of, I guess you'd call them flesh beings, with the droids in the Star Wars universe possibly reprehensible, but you don't really hear a lot about that. There is a galactic constitution which declared all sentients equal. The Droid Rights Movement decried memory wipes, maintained to eliminate personality quirks, and questioned why they were recommended if droids truly lacked personalities. The movement also considered the use of restraining bolts a form of slavery and a practice to be outlawed. Sure, fine. None of that actually shows up in any of the movies, except for Solo, where you have L3-37, who is in favor of it, yes, the one that no one likes. Well, not the only one that no one likes, but one of the many Star Wars movies that no one likes. Sure. At least it does have the question coming up.

Speaker 2:

Here's a droid, a hodgepodge droid built from different parts of other robots or mechanical beings, and it has a goal to free robots from slavery. But essentially all droids in Star Wars function on some type of slavery basis. Whether they're sentient or not, I think, is debatable, but perhaps that doesn't matter. Perhaps we should focus less on sentience, and what matters more is the significance of the relationships that people form with them. I'd say "Is this robot sentient?" matters less than "Is this robot my friend, my colleague, a part of my family?" Perhaps the question of whether these relationships meet personhood is less important than whether they are sufficient to grant an important kind of moral status, based inherently on the relationship forming and how they are viewed.

Speaker 1:

You mean like the droid in Rogue One?

Speaker 2:

Rogue One? I don't remember the droid in Rogue One. The

Speaker 1:

Alan Tudyk droid, who sacrificed himself so that they could execute the mission.

Speaker 2:

All the protagonists in these movies seem to have a moral relationship with their droid, up until it comes to a point where they want them to do something. The droids take on the role of, like, the morally superior being, sacrificing themselves in some way to acquiesce to their quote-unquote owner's wishes, not because that's what they're programmed to do, but because it's what they've chosen to do for the good of the relationship and the benefit of their owner. Whether that's R2 constantly putting himself at risk to aid in the situation, that's L3-37, that's the Alan Tudyk robot. 3PO constantly acquiesces to whatever.

Speaker 1:

3PO is an interesting case because he doesn't fit into that archetype. He exists on his own, to survive. I make no moral judgments on 3PO for his actions, but he doesn't fit into some of those structures, although he does do things.

Speaker 2:

I mean, like when the Ewoks capture the band. Oh, he does lie to the Ewoks, pretend to be a god. That's true, even against his wishes in a way. I mean, he kind of needs to be convinced of it, and he gets R2's advice. But C-3PO has also, I think, at this point kind of become a classic archetypal character: the bumbling best friend who is a consistent coward in almost every situation.

Speaker 1:

Oh yeah, I mean, he's a vaudeville character.

Speaker 2:

But it's also, like, hard to point to. You know, I think there's that famous Eddie Izzard bit where he compares Scooby and Shaggy, yeah, as these protagonists who are constantly running away from any danger. You just don't get cowards as your protagonists very often, right, even in a secondary or tertiary situation, and C-3PO is one of those. It's very rare that he stands up to anything or takes action to save the day, but that's also not his role. And we come to the question: is he programmed? Is that self-preservation within his programming? Is that a personality trait developed by a sentient being? It's fascinating. I don't know. But again, none of these are given any weight or thought. Maybe in the comic books, in the novels, maybe in further series. I bailed out on Star Wars a long time ago. So, yeah, I mean, other than the main stuff that rises to the surface,

Speaker 2:

I don't partake. It's not for me, and that's fine. It can be for whoever. I like what I like; you can enjoy what you like. I'm not going to be like, hey, nobody should watch Star Wars. Hey, watch all the Star Wars you like. Obviously the prequels are not for me, but you know, the generations coming up, they love the prequels. I disagree, but I also don't see the point in arguing about it. Go ahead and like what you like; it's not hurting me at all. I mean, Star Wars, this is, we could get into, like.

Speaker 2:

I get it. Yeah, the many problems of Star Wars, that's a whole other podcast series if we wanted.

Speaker 1:

Trust me as a Star Trek fan.

Speaker 2:

I understand. We don't need to do that. But with robots and artificial intelligence in this extremely popular franchise, how do you make sense of it, and how do you find it morally distinguishable to treat these droids so obviously differently than you treat anybody else? There is a ton, a ton of slavery, a ton of oppression within the Star Wars universe. I think finding the moral high ground at almost any level with almost any set of characters is a flawed venture within the Star Wars universe, be that the Jedi, the Rebels, the Galactic Empire, Darth Vader, the droids, anybody. It's a very gray universe, for how distinctly black and white they try to paint it. But the nature of droids as lesser-than, but obviously, like, befriended. Is there a Jim Crow version of droids? Are they three-fifths of a humanoid, these

Speaker 1:

droids. It's a good question. You expect us as an audience to respect the personhood, though, even in the narrative that they construct. So, like, I think, unfortunately, by nature, this treatment of this kind of AI is rooted in imperialism, colonialism and racism, because you have to create these strata that you categorize conscious beings into. And it's easy for us to swallow, because, obviously, you know, white hegemony and American hegemony and whatever, but also classical colonial stratification of races and peoples. It's such an easy thing for us to just go, yeah, cool, that seems right. I think they have tried to address that later; I don't think Star Wars has done enough to address it. And I'm not trying to be a Trek-versus-Star-Wars guy when I say this, but I think Star Trek actually does do a lot to address that.

Speaker 2:

I think there's a specific edict within Star Trek that when sentience and theoretical personhood are ascertained within a being, they are then set aside as not being tools, not being enslaved; they are given the right to choose.

Speaker 1:

Except when the Enterprise-D became sentient and they were like, eh, whatever. I mean, again, yeah, not every time.

Speaker 2:

This isn't a hard rule.

Speaker 1:

I had a really good point there and I fucking lost it. Sorry.

Speaker 2:

That's fine, it's fine. And again, there is so much of Star Wars that we could talk about. It's kind of one of those things when you really break it down. I feel this a lot of times when I break down any fantasy thing. I was thinking about other fantasy things, and I was like, well, How to Train Your Dragon is a movie that's getting a live-action, quote-unquote, remake. You have these dragons who are kind of thought of as pets, but they definitely know language and have thoughts and wishes of their own, yet they are subjugated as riding and war vehicles because the story needs that. There's no, hey, we should let them be themselves, except for, like, the end of the series, when, essentially, you're letting your dog go live its own life after you've already, like, used it up for three movies.

Speaker 2:

But that's, like, a fantasy thing, you know. When we start breaking down problems within fantasy worlds, it's like, well, why does Harry Potter, why do they celebrate Christmas? None of that makes sense, right? If they can fly everywhere, why do they need Floo networks? What has stopped them from using this magic all the time? At least they do get into elf rights, with their own form of indentured slavery. That's true. Hey, you got one thing right.

Speaker 2:

Kudos for you.

Speaker 1:

She didn't get a lot of other things right. She can get fucked. Which of our favorite sci-fi or fantasy writers aren't extremely problematic? Maybe Asimov? It is tough.

Speaker 2:

It's a lot of them. Speaking of Asimov, we should at least discuss him a little bit. We won't get into a lot of the books; that's a whole other kettle of fish. But I think the Laws of Robotics definitely need to come into play.

Speaker 2:

We at least need to mention them, absolutely. Offhand: he made these Three Laws of Robotics. One, a robot may not injure a human being or, through inaction, allow a human being to come to harm. Two, a robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law. And three, a robot must protect its own existence as long as such protection does not conflict with the First or Second Law. Now, obviously, in his stories we find the conflict inherent in some of these laws, and that their implementation is not without its own consequences. But where I would like to take those laws and see them implemented in movies would be the Alien franchise. Now, the Alien franchise, I know you're scrunching your face, but this is my connective tissue. I'm just curious. I mean, one, you have:

Speaker 2:

In each Alien film there is a synthetic person, an artificial being, usually one. They almost always look and act and feel just like other humans, feel in a touch sense, not necessarily in an emotional sense, although that does tend to evolve as the franchise expands; we'll discuss that in a second. It starts off with Ash in Alien. Then you get Bishop in Aliens, and then Call in Alien Resurrection.

Speaker 2:

Alien 3 is the one that doesn't. I mean, you get a little bit of Bishop, but not really. But it's the one Alien film that doesn't, other than Alien vs. Predator, which we're not counting.

Speaker 1:

No. Or Alien vs. Predator: Requiem.

Speaker 2:

It is the one Alien film, right, that doesn't have a robotic person, an android, integral to the plot. You don't have synthetic androids, ones that look and act and seem just like people, to the point where the fact that they are not human is not a known fact. In the first one, Ash is essentially a secret android; nobody knows. And again, he breaks the Second Law through his programming, allowing a human being to come to harm, to specifically quote Asimov's setup, because we are led to believe that his programming has caused him to put the humans he's on board the vessel with in danger so that he can bring home a sample of the xenomorph for the company. The company knows; it sends them on this mission. They have Ash there to make sure that their will is carried out. Now, he is set up as a mini antagonist in the first one.

Speaker 2:

Bishop is a completely friendly protagonist in the second one. You see him, you know, evolve. He obviously doesn't want to put himself in danger, but when he is called to that duty, he recognizes his own desire for self-preservation, but the needs of his human companions outweigh it. He also seems to come to a culmination of ideas about humanity, seeing the love that Ripley seems to have for Newt and how far she is willing to go and what she's able to do. He specifically compliments her: you know, you did pretty good, for a human. But he is distinctly a positive force throughout the entire film. It is a twist, in that we are kind of set up to think that he will turn, because we saw the synthetic being turn in the first one. Him being a proactive, helpful force is an "oh wow, he was good." We can like Bishop throughout the whole second film. In the third film, you don't really get much.

Speaker 2:

He's just kind of set up to be an exposition dump, so that we can see what happened in between the second and third films. In the fourth film, again, you have a subversive secret robot in Call, Winona Ryder's character. She is, but I also think she's doing the least out of any of these synthetic beings in the Alien franchise.

Speaker 1:

That feels like one of the Joss Whedon things. Right? Yeah.

Speaker 2:

I mean, she kind of seems like his manic pixie dream android in a way.

Speaker 1:

Well, I mean, what do you think Buffy is? I mean, yes, he created a Buffy to live alongside Ripley because he couldn't deal with Ripley being Ripley. You know what I mean? He needed his own version of Ripley to exist alongside her. Yeah, but let's shrink her down.

Speaker 2:

Let's, yeah, you know, make her waif-thin, small. It always reminds me of, like, his Serenity character, where it's like, oh, I have to have the manic pixie dream girl, but she's the perfect weapon as well, right? You

Speaker 1:

know? It's like, OK.

Speaker 2:

But then I think where the AI question really pops up is in Prometheus and past that. That's when we're introduced to David, what one might call the evil AI, a very dystopian version. This seems to be a synthetic person who has chosen his own aims over those of humanity.

Speaker 1:

I still don't understand the motivation of that character. I'm sorry, I don't like Prometheus. I don't like any of the subsequent movies until you get to Romulus. I just don't. And I know we'll get to it, but I just don't get it. It doesn't make any sense to me.

Speaker 2:

It just I think there is. In the first film we definitely see that it is his programming by the company that has put him on a course to bring harm to the crew by letting them die, impregnation, all for the, the benefit of the corporation. And the second one, we don't have any of that. It's, you know, a flip the script moment where, oh, it's a, it's a good synthetic person. Three and four, we kind of flush those. It's not really relevant when we get to david. It depends on your interpretation of david within this world. I think at this point, even though it prequel, it seems to be this moment where, like he gets some, some almost always surpass humanity, another higher rung on the idea of perfection, that then, by creating that one, there is a horror element. When humanity gives birth to something beyond it, there's a natural framework of fear that is created. You know the, the frankenstein, frankenstein's monster. It's outside of our control and it it is beyond us. You know how do we deal with that? Uh, two, I think they're in this dystopian view of what AI can become.

Speaker 2:

You see that David is looking for perfection in himself. He obviously sees himself as beyond his creators, and he sees the xenomorph as an avenue for even further perfection. Thus his will does not align with his, I guess, programming. But again, it's hard to understand what he is quote-unquote programmed to do. It appears that he's programmed to be, like, an autonomous being of synthetic origin who is meant to bend to the will of the humans in his, not necessarily care, but his service. But, I don't know, it's set up as a flaw in his programming that gives him the will to do whatever he wants, which then turns him against the crew in pursuit of his own goals and aims, which brings about the destruction of everybody in the film. But the genesis of that, you kind of have to read into it, because it's not well explained.

Speaker 1:

You have to do the work for the movie. Yes, and that's a problem. It is.

Speaker 2:

I think I'm kind of interpreting it as a bit of a jump to conclusions, but, I mean, it's called Prometheus for a reason. Well, yeah, no shit. It's the creation of this thing that leads to the downfall of them all.

Speaker 1:

You know, I mean, he's literally helping his creator die because of the knowledge that he is trying to gain. That was during the Ridley Scott period where he did, like, Hannibal. I've read that book; it's not saying the things it thinks it's saying. And knowing, especially, the behind-the-scenes of how Prometheus was written and rewritten by Damon Lindelof, don't give it too much credit.

Speaker 2:

I think, again, it is a bit of a jump-to-conclusions mat for a poorly constructed and even worse written set of films. Yeah, it looks great, it feels like a legitimate Alien movie, and I don't understand why people like it. It retroactively ruins the Alien franchise.

Speaker 1:

I don't.

Speaker 2:

I don't know if it's retroactive. I can still like what I like for what it's bringing to the table itself. If we have to take it as a whole,

Speaker 1:

You and I will be upset that they retroactively ruined the things that were good.

Speaker 2:

Right, it is distinctly a bad idea, bad execution, and, if we're looking at it in totality, it does tarnish that image. But it's not going to stop me from still enjoying the good parts of it. Sure. By themselves, that's all I'm saying.

Speaker 1:

Maybe that's a journey people need to take. As a Star Trek fan and a Star Wars fan, I understand both of those.

Speaker 2:

I wonder if there are a lot of people who, like, saw Prometheus maybe before they saw Alien.

Speaker 1:

Yeah, you're probably right, I never thought about that. Yeah, there probably are, and for them maybe the experience is completely different. I don't know.

Speaker 2:

Exactly. I don't think these are key to understanding the ideas behind the Laws of Robotics or our views on sentience within AI in films or fiction, but what I wanted to do is point out that these are some distinct key ideas in the dystopian view of robots and artificial intelligence within pop culture. So much of what we see in how robots and AI are depicted is as a negative influence, an antagonist, something that's going to bring about the downfall of humanity. When you look at the totality of fiction, I think in a lot of books AI are seen as beneficial and quite good overall, but especially in movies, there is this need to depict them in a negative light. Obviously, you have plenty of good versions, you know, your WALL-E, that we don't want to talk about. You can talk about WALL-E. It's fine. We're not going to talk about WALL-E.

Speaker 1:

Okay.

Speaker 2:

I mean, you can talk about WALL-E, it's fine, we're just not going to talk about WALL-E. We have plenty of other things I would rather talk about.

Speaker 2:

These really are the more exciting versions of AI, and probably what leads to a more dumbed-down take on what mechanical beings might bring to humanity. But they are a, what am I trying to say, a cathartic expression of humanity's ideas of what creating artificial intelligence might bring. Now, again, this does happen with David or Ash, but I think you're going to hit some bigger ones that really kind of form the mold of what we see as dystopian versions of artificial intelligence. Obviously, there's HAL from 2001. And you have, you know, Terminator, Ex Machina, Westworld, Dune. But I think we're really going to have to break down, like, why do they have these versions in pop culture? Why is it so popular? What is it saying about us and our thoughts of our creation and what it could do to us?

Speaker 2:

Now, one, there is a natural, inherent fear of the unknown, I think. Creating artificial intelligence, creating a robot who can think and possibly feel and has a will of its own, we just don't know what that would bring. And throughout time immemorial, anything that humanity doesn't know or understand, we fear, and we tend to alienate and fight against. That is part of our nature. I think it also leads to the fact that humanity in itself is inherently violent; we are killers from an evolutionary standpoint. Throughout the totality of humanity's history, we have fought, warred, killed and othered constantly. And to think that what we create wouldn't have a similar purpose or course of action,

Speaker 1:

You're getting at original sin, essentially. Yes, I think there is a bit of that: in creating something else, the flaw of the creator carries over,

Speaker 2:

Right, right into his progeny. Right. I think that is a possibility. I think we also feel the need to tell these stories in black and white, and by attaching our thoughts, the way that we see everything else, to this entity that we don't know and don't understand, it's an easy story for us to tell: the way that we interact with everything around us might be the same thing that happens to us by this other creation. Man, Ronald Moore was onto something, wasn't he?

Speaker 2:

Yeah. Now, refresh my memory of Caprica, because it's been a while. Sure. But it is the creation of artificial life that then leads to the Cylons, because they then start making themselves.

Speaker 1:

Yes. So Caprica both helps explain Battlestar Galactica in a really great way and also completely ruins it in a lot of ways. So what happens is, there's a cybernetic engineer. He creates AI, essentially what we would consider droids, and he's, like, one of the most wealthy and famous people in the colonies. His daughter ends up dying, and he tries to upload her consciousness into a cybernetic life form.

Speaker 2:

Oh, it's a robot. A RoboCop/Terminator scenario.

Speaker 1:

Yeah, not Terminator, I would say, but RoboCop is in that vein. He uploads his dead daughter's consciousness into one of the cybernetic creations he made, because he modeled the brain structure on his daughter's. Even though he thought it didn't work, it just sort of, all of a sudden, worked. It's very Frankenstein in that sense: they did the thing, they pulled the lever, and then he walked away, and then later it turned out it really did work, you know, and then she comes to life. I'm glad you brought that up, because Caprica is a really weird example of how people don't know how to bridge this gap here.

Speaker 1:

You watch Battlestar, you can have all sorts of opinions about AI and philosophy and nature and what have you. Caprica does this weird thing where it tries to explain the unexplainable part of that, the part that is existential. It literally tries to quantify the existential, and it works from a writing standpoint and then doesn't work from a narrative standpoint. It gives you an explanation of how these things happen, but it also kind of ruins the existential part of the nature of consciousness that they try to get to in Battlestar. And it's still Ronald Moore, so you're like, dude.

Speaker 2:

Right, right. The Cylons are another key building block. The Cylons hate humans and are constantly attacking them, and it gets into, you know, again, another reason why AI would attack humanity, to try and wipe it out. I think the Cylons fear humans, and they also see themselves as superior to humans.

Speaker 1:

This is the whole philosophical thing they get into in Battlestar, like, I have to destroy my father. It's this Greek tragedy thing that people are obsessed with, right?

Speaker 2:

I mean, I think that's more of a rhetorical flourish that they provide there. Yes, they're trying to kill their father, but it's really, I think, about what is stopping them from being themselves, from evolving. It's humans, or whatever. I guess, do they call themselves humans? Oh, no? Well, no, not the Cylons, no, I know.

Speaker 1:

Yeah, humans call themselves humans in the BSG world. I don't remember, so I don't think they ever actually use it. Well, they might at some point. I don't think they do off the bat. No, they do, they 100% do, because they have to address it at the end when they go to Earth. They 100% do, even if it's later in the show.

Speaker 2:

In a lot of these versions, you see that humanity is placed as the enemy, and sometimes that is because humans are seen as the ones that can stop the AI's progress. Sometimes it is simply the ant on the highway: if we are building a highway, we recognize that there are ants living their lives, but we do not care about them in the construction of our own synthesis of our will. They are beneath us, and thus their aims, wills, and goals do not matter in the culmination of our own.

Speaker 1:

I can tell you've never read The Three-Body Problem, but they address that very, very clearly. They have a whole treatise on this exact concept.

Speaker 2:

Yeah, I haven't. Well, if you do have something to recommend, well...

Speaker 1:

No, I mean, if you want to be really depressed, then go ahead. But if you want... Uh, I'm already depressed.

Speaker 2:

I don't know if I need more. It's an extremely nihilistic work.

Speaker 1:

If you're in a place where you don't want to be really depressed and hate yourself and think there's nothing worth living for, don't read it. All right.

Speaker 2:

That's a glowing recommendation. Then, Terminator. Obviously, I think this is probably the holy grail of the dystopian version of AI and robots, where, essentially, AI is born in Skynet, Skynet turns against humanity, concocts a scenario where humanity helps to wipe itself out, and then proceeds to go on a killing rampage to exterminate humanity.

Speaker 1:

Well, the reasons why. This is one of the most foundational things about dystopian AI sci-fi: the AI becomes self-aware. Its programming is to best serve humanity, to stop war from happening. This is the case in at least Joss Whedon's version of Ultron, in Skynet, in the Matrix, and in many other fictional works we talked about in other episodes, and their conclusion is: well, just kill humans, and then there won't be any war. To protect humanity, you have to get rid of humanity, because humanity is inherently a threat to itself.

Speaker 2:

Right, exactly. So the logic dictates that we need to systematically remove that variable, right.

Speaker 1:

Because Skynet, specifically, is a Defense Department AI. It's invented by the American Department of Defense. It's not just some computer that somebody ambivalently made; it was specifically made for this reason, right, right.

Speaker 2:

Cyberdyne will help create AI, which will give rise to Skynet, which will then help humanity destroy itself, then reach back into the past to kill the mother of the resistance leader. But in doing so it leaves part of itself behind, which allows the leader to be created, and allows itself to be created, because the parts it leaves behind are found by Miles Dyson and used by Cyberdyne to construct the circuitous loop of time-travel logic that nobody needs to get into. When it comes to time-travel stuff, it doesn't really make any sense at all. But still, narrative-wise, those first two movies are pretty great sci-fi stuff, even though they're action films or, in the case of the first one, a total B movie.

Speaker 1:

Do they?

Speaker 2:

I don't remember, do they explain? Because, I mean, when you get to the end of Two, are we led to believe that Judgment Day won't happen? Because I feel like it's left ambivalent; they don't know whether it will happen or not, but at least we've grown as people. Yeah, they leave it up in the air for sure. Which, again, if you want to talk about a franchise devolving over time and tarnishing its legacy, you've got Terminator all over that. Yeah, but I would rather take Terminator.

Speaker 1:

You know, now, retrospectively, I would much rather watch T3, and take it in canon, over any of the other sequels that came out.

Speaker 2:

Would you rather T3 or Prometheus?

Speaker 1:

Oh, that is a great question. One's a sequel and one's a prequel, but you're right. I mean, we're dealing with time travel, so okay, okay, that's fair.

Speaker 2:

And is T3 your third?

Speaker 1:

Well, I mean, of the Terminator movies, because it's the third one that happened.

Speaker 2:

Well, no, I mean in your ranking.

Speaker 1:

Well, because I think we both agree that the Terminator franchise has diminishing quality. There has never been one that has risen; the quality has not gone up as the sequels have gone on. So, yes, by default, Terminator 3 is the third-best Terminator movie. But Alien, that's also a good question, because there are a lot of those that are not good quality either, because that movie went from a B movie to an extremely great genre film. And can we just blame James Cameron for all of this? Can we just, like, shoot him? Can we just put him against the wall and be like, stop it? Well, what did he do wrong? Well, he's the Sarah Connor. If you get rid of James Cameron, we don't have these problems.

Speaker 2:

Right, but you also don't have the greatness either.

Speaker 1:

I know, but isn't that the whole thing in the Terminators? Yeah, it gives birth to this great dude who does this thing that leads humanity, but then also all this strife. And if you just get rid of him, then, you know, the robots win. I mean, the robots would never exist if you kill Sarah Connor, according to Terminator 2.

Speaker 2:

Yes, you're right, that is time travel. We don't need to deal with time loops. Yeah, that's a time-travel conversation, not an AI conversation. Yeah, but I think it is. Again...

Speaker 2:

You know what Reese says: that it's an unthinking, unfeeling killing machine, and it will not stop, ever, until you are dead. Which is, quintessentially, what is so terrifying about an AI robot apocalypse coming to destroy humanity. It's these creations that we have made. Obviously, in the Terminator universe: one, they are these horrifying endoskeletons made of metal that we can't stop. Two, they have then made them into cyborgs, so you can't tell it's a Terminator, but it's still coming after you. Three, it has no empathy; we cannot relate to it on any emotional level. Like zombies, they will not stop. Good point. Ever. They're just going to keep going until you are dead. Right, it is one of these perfect villains that we can't empathize with, can't really even understand, and also can't stop, which is just so perfect an antagonist.

Speaker 1:

It goes to the core of our fears as a species. Yeah, it's great. And you know what? It's the only reason that that movie ended up becoming more mainstream, because that would have normally been a straight-up B movie. I mean, that is a B movie: low budget, coming out of the Corman camp, essentially.

Speaker 2:

Yeah, I mean, it's one of those things that has streaks of genius, coming from humble beginnings in a derided genre that it was able to rise above.

Speaker 1:

You know, it could have just stalled out at some point; Schwarzenegger wasn't even a star then. That should have just been relegated to I Come in Peace status. But because of those things that it hits on, it became an iconic thing, and it may have actually legitimized that genre more than anything else could have, because of the things that it appeals to, like you're saying. Had that movie not existed, I don't know that we would have the geek culture we have today.

Speaker 2:

Yeah, I think there's so much that really spawns from that, especially when we deal with our ideas of robots and AI. Whenever we think of the biggest downfall, we point to Skynet, we point to Terminators, as a collective ideology, and I think part of that is that they were able to put a face to our coming robotic destruction. Look at what HAL did; HAL is also a progenitor of this, but there is not the anthropomorphism that we get with the killer robots in Terminator. A big box with a red light and a monotone, soothing voice can only terrify so much. It doesn't hit on certain levels, and it's harder for us to identify with that.

Speaker 2:

I think that's another reason why that has trouble breaking through in the way that killer robots have, especially since Terminator, but even in pre-Terminator things.

Speaker 2:

I mean, most of the killer robots that we think of, your Westworld, your M3GAN, your Battlestar, and Terminator, have two legs, two arms, a humanoid body: something made in our image, in our likeness, that is out to kill its creator. A Promethean tale coming to its final, conclusive, combustive end, which I think is what drives a lot of especially mainstream thought in this genre. Now, you do have plenty of utopian, positive ideas about AI, from, you know, After Yang, or The Creator, or Data, or Her, or Blade Runner; there are plenty of positives we can see. But I think the dominant tale is of the dark possibilities of our creating something completely out of our control, this Frankensteinian creation that is beyond our logic, our understanding, or our control. It gives rise to the horror that we see could be, and I think that is what really captures the imagination: the dark side of our possibilities.

Speaker 1:

Mm-hmm.

Speaker 2:

Please go away.