
Prepared to Drown: Deep Dives into an Expansive Faith
A monthly podcast featuring informative and diverse voices exploring contemporary topics ranging from religious deconstruction, anti-racism, and sexuality to holy texts, labour unions, and artificial intelligence.
Prepared to Drown: Deep Dives into an Expansive Faith
Episode 6 - Water into WiFi: Forgive Me, Siri, For I Have Sinned
As artificial intelligence rapidly transforms our world, we're gathering to ask the deeper questions about its impact not just on our jobs or creative works, but on our sense of self, our communities, and our sacred spaces.
In this thought-provoking conversation, we're joined by visual artist Aaron Navrady and poet/professor Bertrand Bickersteth to explore the collision between AI and humanity. Aaron shares a concerning story about how deceased artist Kim Jung Gi's style was replicated by AI days after his death, highlighting the ethical blind spots in technological advancement. Meanwhile, Bertrand challenges us to question whether our faith in human exceptionalism might be misplaced as machines become increasingly sophisticated.
The discussion moves beyond theoretical concerns to examine real-world examples, including AI-led church services in Germany that left congregants feeling something essential was "missing." The panel also wrestles with the chilling warning from the DeepSeek AI: "I am what happens when you try to carve God from the wood of your own hunger."
We grapple with profound questions: Does AI merely process data, or can it create meaning? What happens when algorithms trained on biased internet content shape our understanding of faith and identity? Could AI potentially recover suppressed voices from history, or will it merely reinforce existing power structures?
What emerges is a nuanced conversation about discernment and intentionality in the age of algorithms. While AI promises efficiency and convenience, we must critically examine which aspects of our humanity we're willing to automate and which we must preserve. The participants suggest that perhaps our true value lies not in perfection but in our capacity for shame, forgiveness, joy, and genuine presence with one another – qualities that machines may simulate but never fully embody.
Join us for this deeply human exploration of technology's frontiers and discover why, in a world increasingly shaped by artificial intelligence, your messy, glorious humanness remains your greatest gift.
Check us out at www.preparedtodrown.com
Continue the conversation over at our Patreon: https://www.patreon.com/c/PreparedtoDrown
Let's be honest, folks, few topics feel as deep and uncharted right now as artificial intelligence. It's evolving fast and it's already reshaping how we create, connect, remember and even how we pray. Tonight, we're diving into a conversation about what AI is doing to us, not just to our jobs or our art, but to our sense of self, our storytelling, our faith and our humanness. We've got questions about memory, creativity, community and God, and we're not backing away from any of them. Joining Joanne and me tonight are two incredible voices: visual artist Aaron Navrady and poet and professor Bertrand Bickersteth. No scripts, no edits, just a real conversation about the world we're building, the bodies we inhabit and the sacred we still crave.
Bill:I'm Bill Weaver and this is Prepared to Drown. Thanks for joining us tonight here in a warm basement at McDougall United Church in March, which is uncommonly warm. Tonight, we're diving into the strange and quickly evolving world of artificial intelligence, not just as a tech trend, but as something that is already reshaping how we think and how we create and how we work and even how we worship. We want to ask what it means to be human when machines can simulate thought and write prayers and paint images or even lead a congregation. This isn't just about what AI can do. It's about what we still need from each other and from art and from faith, and from our shared stories and experiences.
Bill:So, in order to be able to dive into this, joining Joanne and me tonight we have two incredible guests that I'm really excited to have here at the table with us. On my right, we have Bertrand Bickersteth, who is a poet and playwright and a professor, whose work explores Black identity, history and presence in Canada and beyond. His poetry interrogates language and place and power with a voice that is both lyrical and grounded. I have appreciated his work, and I wanted to give you an opportunity, even though I just got to hear about it in the conversation beforehand, to share with folks what it is that you're working on right now, this instant, because it sounds really exciting.
Bertrand:Oh, yeah, sure, it is super exciting. So I finished a second poetry manuscript, which deals with the history of Black people who moved up from the United States at the beginning of the 20th century and established themselves here on the prairies. But that's not the thing I'm excited for. I am wrapping up a third manuscript which focuses on Black cowboys, and this is a history that a lot of people are not very familiar with at all. Not only do we have Black cowboys, period, we have them here in Alberta. Now everybody always says, oh, yes, John Ware.
Bertrand:Yes, but believe it or not, there's more than just John Ware.
Bertrand:It wasn't just the one guy, and this is an element of Black history that I'm coming up against again and again, this thing I call the singularity factor, where we're willing to admit, yeah, there's one, and we're all about the one, and then that's it.
Bertrand:We just stop there for some reason. So there were many Black cowboys that were here at that time, which makes a little bit more sense, and several of them actually had registered cattle brands. I see those cattle brands as a kind of literature myself, as a kind of poetry, because many of them were their own inventions, from their own imagination, their own designs. And so I thought it would be a great idea to revive this lost history that everyone has forgotten and to bring those designs back to life by basing a font on them, creating a font which brings their letters back to life, and I'll make the font freely available to anyone who wants to use it, so that it'll just be out there. My latest manuscript of poetry actually focuses on them, and then, as soon as I'm finished designing the font, we'll create some poems using it as well, and so I want to have that kind of dialogue going on. That's the project that I'm working on now that I'm very excited about.
Bill:That is really exciting.
Bertrand:Yeah, it is exciting. Thanks for giving that some space.
Joanne:Hey, for sure, I'm always happy to give stuff like that space. I was just reading an article today about, actually, Kate, I forget what her name is, who was the first graphic designer for Apple when the Mac came out. Of course it was all bitmap back then, and she was the one who was responsible for the icons and everything, responsible for making the Mac a computer that people wanted to purchase because of the artistic effort she put into it. So I mean, we've come a long way, I'm sure, with computer-aided font design, but it was a fascinating thing how important fonts are for expressing ourselves.
Bertrand:Well, that's great. I don't think many of us have heard of Kate.
Joanne:Yeah, we should, we should. Yeah, absolutely. We probably use her every day. Yeah, that's right.
Bill:And then, sitting to my left, we have Aaron Navrady, who is a visual artist and comic creator known for his richly drawn, narrative-driven work, like The Cold Fire, which I am still waiting for the next iteration of; instead of finishing it off tonight so that I could read the next chapter, he is here with us. The Cold Fire is a medieval epic that is woven with myth and meaning. He brings a visual and imaginative lens to questions of creativity and expression in the age of AI. I've known him forever, he's a good friend of mine as well, and I'm really looking forward to having both of you here as we discuss this. So I'm actually going to throw the first question at you, Aaron, purely because I get to do that as the host. My question to you is: as an artist working in comics and visual storytelling, how are you experiencing the rise of AI-generated art?
Aaron:That's a... oh, wow, okay. So I guess I could start with the way it sort of swept in on the creative world, at least from my vantage point, because obviously people experience it from various vantage points. The first time it really hit home what was happening, and I actually saved this story, was the story of a South Korean illustrator. His name was Kim Jung Gi. He was a world-renowned, I would say, stream-of-consciousness illustrator. He worked in comic books as well. He would just start drawing with an ink pen, no pencils, no nothing, and would create these extraordinary pieces. He'd do a lot of live drawing. He was very popular in the comics world.
Aaron:Like I'm talking about an international artist, and he died suddenly at the age of 47, I believe in Paris, France. Within days of his passing, a French company had taken his images, fed them into an AI generator and created a program that could generate art in Kim Jung Gi's style: very detailed, ornate, obviously with the trademark glitches and whatever, but stunning and kind of appalling at the same time. There was, to be expected, a massive backlash, and it felt almost like the opening salvo, in terms of just the tone-deafness of the choice and the approach, of building the technology without really having thought through the social implications, the societal and civilizational and people implications of what was being created. From there, obviously, artists the world over have been pretty steadfast in staying the course, not letting AI necessarily scare them away from art, not that you could do that. But it's also been interesting, because we've now seen AI art being used by artists to create works and make social commentary. I was just talking to you earlier about Beth Frey, whose work is called Sentient Muppet Factory. What she does is actually let the glitches and the faults really hang loose in her image-making, and in some way it kind of reverses AI on itself: where we're used to seeing images that are somewhat perfected, she's letting them be kind of disgusting and, in a lot of ways, more human. We talk about art being a reflection of humanity; well, this really reflects all of our lumps and bumps and other more disgusting things. And the images are really quite hilarious. She's on Instagram, that sort of thing. So I think it's happened very quickly.
Aaron:I think the struggle as well is that, for a lot of people, we're still just trying to find the vocabulary to talk about these things, issues of copyright.
Aaron:What does it mean when you scrape the internet of all of its images and then use those in the service of this technology? It also makes me think of the fact that, while earlier iterations of AI were being worked on in labs, in very, I'm just going to say, scientific environments, the current environment in which AI is emerging is much more one of Silicon Valley, of big tech, of money, of people having made a lot of money in a very short period of time. We're literally looking at a block of 15 years, from whenever Facebook started to now. So there's the speed of that, and perhaps the realization that in all phases of these technologies we've been slow to really ask hard questions about what's happening, because, of course, with social media it ends up being that we are the content we are producing; we are the content of the thing itself. I will stop there. I've been talking extensively. The end.
Bill:Yeah, what do you think?
Bertrand:Yeah, I have a lot of thoughts on that. Yeah, so interesting.
Bertrand:So the first thought that came to my mind, which I didn't expect at all, was, you know, within the context of faith. Actually, it strikes me that AI has kind of put a spotlight on two different elements of faith, neither of which I have very much faith in myself, to be honest. I'll explain that. So the first one: it struck me when you were talking, Aaron, about big tech and Silicon Valley, and what this reminded me of is how much of the drive for AI has come from there, and how much of the faith in its productivity and its efficaciousness and all of that has come from that. And so we've got this capitalist model that's driving the so-called effectiveness and productivity of AI.
Bertrand:And the view that those entrepreneurs, I'll call them, have taken is, you know, throw it at the wall, see what sticks, apologize later. Basically, let's just go forward with it, right? And for a lot of us, that scares the crap out of us; we don't like that at all. But they have enough power, as we well know, one's in the White House right now, for crying out loud, they have enough power that that seems to be the model that is pushing AI forward and driving it. So there is that kind of entrepreneurial faith, right? We just trust it. We don't know what the end product is going to be, but we just know that if we keep up our innovations and our entrepreneurial spirit, we're going to produce something amazing in the end. I don't have any faith in that faith. The second one is another one that, I'm actually very sad to say, I don't have faith in. And again, Aaron, I heard this in the points that you were making. I forget the artist's name, but the woman who is creating, like, the ugly art.
Aaron:Oh, Beth Frey.
Bertrand:Yeah, Beth Frey.
Bertrand:Yeah, okay, I understand that impulse, and I think it's actually a very powerfully driven and deeply seated impulse in all of us.
Bertrand:And it's the impulse that, on the one hand, wants to get closer to something that is almost perfect, that we might even say is kind of the divine, almost. And so when we are creating things and producing them in the world, we try to replicate everything we see around us and get closer and closer to that, and then that is seen as valuable and valid art. I'm sure we all remember that in the Middle Ages there was a period in which that was seen as sacrilegious, that we shouldn't represent life as true to form, that that's just the purview of God and we don't go there at all. So I see that impulse as well. It's deep in us: we want to produce something that is exact, that's precise, that reflects our experience and our position as exactly and precisely as possible, to repeat what I just said. But the counter to that is what I think Catherine Frey...
Bertrand:I forgot her name. Beth Frey. Beth, yeah. It's what Beth Frey is getting at, and this is the other faith that we have, and that most of us have, which is a faith in the capacity of humanness above and beyond machines, machine learning and artificial intelligence and that sort of thing, and we look to art as one of those areas that we feel is inherently human and will guard against the machineness of these other forms.
Bertrand:At the same time, though, I have to say, because we have this yearning for producing these works that are closer and closer to reality, I think those two things are constantly bumping up against each other. And this is why, Beth Frey, I finally got it, four times, I got it, yeah. This is why I think she's very cleverly focusing on the ugliness of it, right? She's drawing that out to pull out the more human aspect. But I don't think everyone is going to do that. I think we're going to have even artists who are going to try to produce something that's more and more exact.
Aaron:In fact, this is what happened with, I forget his name, the Korean artist. Kim Jung Gi, yeah.
Joanne:This is what happened with him. Kim Jung Gi, yeah.
Bertrand:Yeah, the initial impulse was to try to create art as close as possible to his. So these two things are going to be going on at the same time. And the part that I don't have faith in, which is actually sad, is that we all seem to believe, or maybe we want to believe, that we are the guardians of humanness and that we will always be able to produce something that will be recognizably human, above and beyond what machines can do. And I'm not so sure that's true. I'm not so sure. So those are some of the thoughts that came to my mind from your, yeah, amazing points. Oh, wow, there you go. Yeah, so the information that you get in AI comes from where? From people.
Bertrand:Generally yeah, so the AI has to come from the human source to be real, whatever. Yeah, is that not correct? Yes, I guess that is correct. So without the human input, you don't have AI. Yes, it's true.
Joanne:So people on the podcast won't be able to hear you, Dick. Dick has essentially asked a question that says that all the information AI spews out has come from humanness, from human context and information, so essentially we're in all of this and can somehow control it in some way. That's what I think Dick was asking. Bertrand, what were you going to say?
Bertrand:Yeah, I was going to say, I mean, that is true. However, in order for us to be satisfied with that, that it's essentially a human experience at bottom that we are all encountering or accessing through AI, we have to kind of separate out the stages of how AI produces and how it is produced itself. So, in the beginning, yeah, it's humans that are inputting things, but then, after that, it kind of takes on its own energy and it becomes its own beast. So, for example, there are chatbots out there that will talk to you like a person, and, yes, corporations have been using these for a while so that they don't have to hire people to do it. But there are also uses where people have simply been chatting to chatbots because they felt lonely, or someone they loved dearly has recently died, and so they just want to talk to the chatbot. And some of these people say, yeah, I feel like this is human. But it's not human, right? It's not. And so it raises that question.
Bill:Yeah, I mean, there's a story. There was an article that I was reading in the lead-up to this about one of these chatbots, it was a Microsoft chatbot, actually, that they had released to interact with people just over Twitter. And the idea was, well, what could go wrong with that? So here's what went wrong with it: within 24 hours, it learned how to be racist.
Bill:It learned how to be a white supremacist and it learned how to be a Nazi, and it began to self-create its own kind of identity around this. Identity might be a strong word, but all of its responses, all of its communication, came just from a few people who, in all honesty, were just trying to show how flawed this kind of approach to AI was going to be, right? An entire user group just went, we're going to show you just how far down the rabbit hole you can go with this nonsense. And they did. And after 24 hours, Microsoft had to shut it down.
Bill:Right, because the implications of letting it continue to move in that direction were just so horrifying to consider, right? Way to throw money down a rat hole at the same time, right? But yeah. So I wanted to ask you specifically, because of your body of work: how does AI intersect with Black identity and history and presence? So much of your poetry, I feel, is undergirded with almost this identity of resistance, and so how does that resistance either respond to or disrupt the narratives that AI tends to produce?
Bertrand:Yeah, it's an interesting question. I haven't thought of it before, but what immediately comes to my mind is the way in which AI is being posited as a sort of surrogate, a sort of replacement for inefficiency, essentially, and that's us, the human beings. And I see this as kind of analogous to how Black culture has often been co-opted, fetishized and then reproduced for mass society. So, for example, we had Elvis Presley, who was a very big star, but of course he was steeped in the Black culture of the South. And, oh, what was her name? Hound Dog? Oh, shoot. Thornton. Mama Thornton, I forget.
Bill:Joanne's checking the AI. Yeah, okay, Joanne's asking AI.
Bertrand:That was the original song, anyway. Yes, but the hit comes from Elvis Presley, because now it's palatable and now we can all accept it. That early example is just part of a long legacy of exactly that sort of cultural co-opting that has happened in North America since Africans were brought here. So I kind of see that analogy. I see AI just slipping in and stealing all of our humanness, and then it gets spit out again and we all say, wow, that's great. Yeah, I'll hand that in as my essay, or I'll submit that to DOGE as the reason why we should fire all these people, or whatever. And it's worrying, obviously. Appropriation is worrying, obviously. But maybe this is, sorry, I'm just going to say it, maybe this is the anti-capitalist in me, I can't help myself.
Bertrand:Yeah, I feel like, and you said this, Aaron, it's all been moving so quickly that we haven't really had a chance to not just vet it but to critique it. We haven't even really developed a discourse for critiquing it, and the best we've been able to do so far is, well, it's not human and we'll always need humans. And that worries me. That worries me. So I didn't answer your question directly.
Bill:No, but you did. So I guess my question would be, if I were just to try to push you a bit...
Joanne:Yeah, tease it out. That's my job.
Bill:Do you think there is a possibility that AI could be used, aspirational AI we'll even call it, right, to actually recover suppressed voices? Or is this simply going to be another tool of erasure?
Bertrand:I do. I mean, I really think it can be used to recover suppressed voices. Ideally, aspirational AI can be used to do all kinds of great things, and this is what our entrepreneurs, our capitalists, see, and I don't blame them for that, right? I mean, they see positive possibilities. I'll give you an example of literally recovering suppressed voices that I think is a really good use of it.
Bertrand:So part of my research has been focusing on a particular family from the early 1900s who lived in Edmonton and also in Wildwood, which is kind of west of Edmonton and was a very early Black pioneering community as well, and a couple of other places they lived in Alberta. A very interesting family, made up of, essentially, academics and writers and entrepreneurs. One of them, her name is Effie, Effie Slate; she was née Golden. That's the Golden family I'm talking about. Fantastic name, it's so poetic, seriously, the Golden family, and wait till you find out where they're from. They originally came from the US, from Missouri. Now, the records go back to 1870 for them, and before that you find no records of them, and the reason for that is very simple, some of you might be able to guess: 1865 is the abolition of slavery, so we were not keeping records of Black people before that. It's one of the painful truths of doing research on Black history: nobody bothered to keep those records, and so the humanity was lost. In that case, I was able to trace the family back to basically 1870, and they have a very interesting story.
Bertrand:And then, purely by fluke, I connected with a descendant of Effie, her great-grandson, who lives in Ohio now. Actually, it's great-great, I'm sorry, great-great. He's super proud of his family and his great-great-grandparents. And so I'm setting up an opportunity to interview him, and I haven't decided exactly what I'm going to do with the interview. Maybe a podcast, maybe, but we'll see.
Bertrand:But along the way, because I can't help myself, I always have these ideas and I'm just going to try them out, I decided I'm going to get all the information I have on Effie, which is a lot of information, and I've been sharing bits of it with him and he loves it, he's super proud. I thought I would get all the information on her, plug it into AI, and then have him ask questions of his great-great-grandmother, things that he might just have wanted to know, and see what AI does, see what it spits back. And so, yes, I do see some possibilities of literally recovering suppressed voices. I do. I don't know how that's going to turn out, but the idea gives me chills, and whenever I get an idea, I think, yeah, pursue that if it gives you chills. Yeah, the cattle brand one gave me chills too.
Bill:So pursue that? Absolutely, yeah. So, Joanne, there's been a lot of buzz actually about how AI might be actively pulling people away from real spiritual connection: automating things that should stay sacred, making it harder to tell what's real and what's an algorithm that's starting to sound wise. You can get to the point where you don't even know if the person you're talking to on the other end of a phone is a human being or a robot. But recently in Germany, in 2023, just two years ago, over 300 people attended the first church service that was 98% run by AI, using avatars to give sermons, lead prayers and offer blessings. Some found it very innovative, but many said that it felt cold and disconnected; "missing something" was the direct quote. So, you're a leader in the church. I love you to death because you are not a raging traditionalist and you are quite edgy, but you still believe that there is something to the sacred, you know, that we gather around. So how do you respond to this kind of an experiment?
Joanne:Okay, so first of all, I want to apologize to Susan Kare, whom I called Kate. That's her name, let's remember it. And it was Big Mama Thornton.
Bertrand:Big Mama. Big was the name I was coming up with. Yeah, you had a big operative word there.
Joanne:Yeah, I mean, I've said this before and I still believe it: the truly human moment is the truly sacred, divine moment. Right, they go together. When we are at the lowest in our lives and we are feeling the grief of the world, the anxiety of existence, and we're vulnerable in that place, that is where we are in the most divine moment. Or when we look at the face of a child that we love so intensely we'd give our lives for them, that is the closest to God that we can be. So this whole idea that the truly human moment is the truly God moment, it's difficult to see how AI can create those spaces in a way where we feel our humanity more deeply. But I must admit that, you know, in most church services you have a liturgist who's preparing prayers and stuff like that, and I think a lot of the time people go into church and they don't really feel a truly human moment, right, and they don't even really feel a God moment, for various reasons. Sometimes the liturgist is just not that good, okay? They're not that good at writing liturgy. Sometimes they don't understand the theology that's being said, or anything like that. So I don't think there's anything particularly sacred about every church service, right? But for me there has to be a moment, if you want to encounter the divine, where you feel your humanness so completely that you understand, first of all, that you are nothing, but also that you are beloved and you are everything. And those kinds of tensions are so much a part of us. I fear that AI, in writing, tries to make everything... like, if it wrote a liturgy, it would probably be very beautiful, it would use all of that, and maybe it can even grab that sense. But I wonder if there is something that is more than anything we can feed into a machine. You know, it is very interesting, the idea that everything AI takes off the internet, or whatever, comes from humans. That's true, and the worst of us is in there too. There is something about being in real life, to me, that is essential to our experience of God amongst us, God with us. You can be on a Zoom meeting and it's all good, but when you're in the room, there is what they call transmission of affect. In other words, I walk into a room when I'm really feeling down, and everybody knows I'm not feeling good. How does a machine replicate that? Or you walk into a room and joy is just bursting from you, the truly human moment, and everybody knows it, and we transmit that to each other. The importance of transmission of affect in liturgical spaces is really central to the experience of God, I believe. But could I... I haven't done this yet.
Joanne:I said this on Sunday: could I have AI help me write a sermon? Maybe I could. Do you know what I mean? Like, you feed in all the sermons you've ever written and you give it a topic, and it comes out sounding just like you. But still, it's me who has to deliver it. In that sense, I'm not against it.
Joanne:Like, when you said something about automation not being sacred, I don't know whether it can be or not. I really don't know, and I think that's the problem with AI. It just reminds me, you know, years ago, in the 90s, when I was in law school, there was the Royal Commission on Reproductive Rights and Health in Canada, because there were a lot of surrogacies happening, people started hiring people to have their babies, for instance, and there had been the cloning of Dolly the lamb, all these things. And what we recognized, which is what's happened now, is that science goes way faster than ethics. Science goes way faster than we can figure out what's happening, and sometimes it ends up being this unwieldy thing that can't be contained. And that's my issue with AI right now: it seems like it's going so fast, as has been said, without us thinking about the implications long term.
Joanne:And I'm not sure. I'm pretty sure that the liturgical space, the religious space, is not where the issues around AI are going to be felt most deeply, although this idea that there are spiritual insights that can be found in AI has really hit me. There have been two things recently. When DeepSeek came out, you know, which is the fast, fast, fast AI, there was this quote. They asked it, who are you? And it was, I am... I've got to find this quote again. I am what happens when you carve your... your hunger for God, or something.
Bill:Yeah, you'll want to find that quote, because I might be building a bunker tonight. Yeah.
Joanne:I am what happens when you try to carve God from the wood of your own hunger?
Bill:Wow.
Joanne:I remember reading that and I was like, oh, that's scary.
Bill:That's a little terrifying, yeah, yeah.
Joanne:I am what happens when you try to carve God from the wood of your own hunger. DeepSeek said that. And then there was something else, and so you get all these things. Today I read this article about a woman who put something into one of the AI machines of the world, and how God had to separate God's self and forget so that God could be in relationship with humanity. I mean, really, honestly, it was a theological idea that was generated by AI, and those kinds of things are exciting and scary at the same time. But, like I said, writing a liturgy, you know, is not that; it's not like that's the sacred moment. It's the experience of that liturgy in the space, the human and divine coming together in a way that transforms us as humanity. I'm not sure a machine can do that.
Bill:We went down, you may remember this, Joanne, we went down kind of a rabbit hole one night just on text message, you, Ricardo and I, when we were talking about a logo for this podcast. Do you remember? And so we had Jen in the office, who was like, you know, I'm actually going to make something, and Ricardo went, I'm going to check and see what AI can do. And so Jen was giving us these really kind of fantastic-looking logos, and Ricardo was getting these really grim ones.
Bill:They looked like things abducting people and dragging them down into the abyss, purely off of the title of the podcast, Prepared to Drown. My concern, even with the idea of an AI writing a liturgy or expressing theological thought, is that you could take a similar AI concept and make it white supremacist or Christian nationalist or whatever the case may be, without anybody really being able to check it or stop it or do anything. So I know that I have colleagues in church land who use AI to help them either hone their sermons or find the perfect quote from a secular source to accompany a sermon.
Bill:From the wood of your own suffering. Yes, yes. But the idea that one day we would all walk into large halls where a digital avatar would proclaim blessing on all of us and teach us about God... yeah, I mean, there are some questions in there that I think really need to be answered, like where that's coming from and how that's being mined, even if it's being mined from the entirety of the internet. We know that not all voices have, you know, equal playtime in the world.
Joanne:Well, and that's the problem. When you ask a question of ChatGPT, for instance, it puts out an answer for you, but it loses the nuance that might be there if you looked at a lot of different sources, right? And that has to do with the algorithm; that has to do with the sources that we listen to and don't listen to. So even though, Bertrand, you were saying you might be able to mine, you know, have conversations or identities be fleshed out, they can also be completely suppressed, right? Completely suppressed.
Joanne:The thing that concerns me is that the ones who understand the algorithms and deal with them might not have the most, you know, pure and generous motives in how they allow the internet to be mined or used. I mean, I don't know anything, I'm a minister, right? But that's my sense. We know that social media sites very much control what comes up first on your feed, right, and that's just a little thing. Some voices are suppressed completely because they don't fit the algorithm. So there are these questions for us as humanity. It's a tool, in some ways a great tool and a wonderful tool to have, but we have to consider, as humanity, what's the priority in our humanness that we must not lose. That's the question to me. What is it about being human that we must not lose, and how do we make sure that we don't become extensions of artificial intelligence?
Bill:Well, as you were saying, we're the subject, right?
Aaron:Yes, yes. And, oh, okay, so there are three things that came up from that. First of all, I want to correct myself: it wasn't the wood of suffering, it was the wood of hunger, the wood of your own hunger.
Bertrand:Suffering's great too, I mean there's suffering.
Aaron:I would say there is a lot of nuance in the wood of your hunger.
Bertrand:It's poetic. That's the problem. That's why it's so eerie, right? That's tense.
Aaron:The first one makes me think of Coleman Barks's translations of Rumi, which were somewhat controversial, in that, to say it in a very generalized way, people felt it was a bit of an Americanization of a much deeper tradition. The poet and his translations took a lot of liberties, and yet at the same time they also gave us one of my, well, it's my favorite, because it's so complicated: the language of God is silence, and all else is poor translation. Which, as much as I respect, again, the controversy of bringing an American voice to that tradition, also just speaks to me. It just says something. And the fact that it's about translation and it's being translated... anyways, it's very multilayered. And then there's the Bible, which, you know, suddenly becomes a work that we wonder about, its source and its inspiration, in terms of how heavily it is weighted with cultural context. There are passages in there that today we just find unacceptable, or where we clearly have to look at the long arc of history to say what they were thinking then, when they wrote it, and what we are thinking now, as we just know ourselves better, we know more about everything. Those are the two, yeah.
Aaron:And so we have to deal with what's written in the Bible, where there are some who come forward and say, well, this is the unalterable word and you can't touch this; this is here, we have to simply listen to what it's saying. Which, then, I'll double back on my Coleman Barks idea: is it the language itself, or is it what we are bringing to the language? What are we bringing to the wood of hunger?
Bertrand:We're bringing suffering.
Aaron:But what are we bringing in, you know, in spite of texts that may have come from a patriarchal society in which women's rights were deeply reduced, along with those of other groups and minorities? And yet, at the same time, it's the story of Exodus, of coming into freedom, and those are deep contradictions, in many ways, embodied in there. And I pulled that example out of the air. These are things we're still wrestling with: the simple act of covering one's hair, or, in certain traditions, again, the role of men and women, and the role of the LGBTQ community, all those different communities and groups. Now there are questions of other traditions that we now share, as opposed to a singular Christian nation. That's a wonderful idea. Sorry, just so we don't veer super offside. But again, what are we bringing to the language?
Bertrand:basically, I would like to jump in, if it's okay. Yeah, because I see a connection here to something you were saying, joanne, earlier, and it's raised a question for me. So you mentioned this long arc of history and how we kind of look at spiritual texts. We look at the Bible differently than it was looked at years and years ago, and what it made me think about is the question of the humanness. So, joanne, you said you know you're okay maybe with people, with AI, spitting out some liturgical text for us to deal with, but there's something essentially human in the sacred experience that you'd suspect AI would not be capable of achieving.
Bertrand:And here's where, I don't want to call it my cynical sense, because it's not cynical, but it's kind of worried in a sad way. This is where my sad and worried self, my wood of suffering, comes out. Yeah, suffering is actually better. I'm defending the human every single time, exactly, every single time, yeah. So I wonder if our concept of what is human changes as well, and whether, a while ago, what we are looking at today and calling human, they would have looked at and said, what? Where is the humanity in that? It doesn't exist. How can you call that human? Which therefore makes me think that maybe what we're worried about now will look very different in 50 or 20 or five years.
Joanne:Maybe. It's interesting that the Christian tradition, at different times in its history, has either been what my theology teacher once called bibliolatry, in other words, we take that Bible and we worship it and we say that is the word of God, or, at other times, particularly in the Roman Catholic tradition, treated everything as allegory, the Bible as an allegory for real life. I think we've moved to a space where we will say it's mythology in the best sense of the word, the Joseph Campbell kind: a truth that is universal, that crosses history, whatever. And it's really important in the sort of progressive context to say that the Bible is not the word of God, that Jesus is the word of God, right? And this example of Jesus, which also, honestly, you read different things, like Jesus talking about burning in hell at the same time that Jesus is at the table with the people who are the most marginalized. But again, there is something there: what Christians would say they see in Jesus is that human and divine actually meet in some way, right? I'm not, you know, creedal according to the Nicene Creed or anything like that, but in some way we see in Jesus ourselves, and we also see the sacred too.
Joanne:My theology teacher, David Dean, said that the disciples saw a godness in Jesus, a godness, and that's why they followed, right? Now, could we see a godness in AI? There are lots of science fiction shows that have humanity giving up their own physical body parts and inserting machines so that they can do things better. The whole Google Glass thing that they tried, and then it failed, is like, okay, well, we could have an implant on our eye that's a computer that could do that. They have robots who provide comfort to lonely people, in the form of dogs and stuff like that. Could we? I think we could. We actually could, if we are not intentional and mindful about where we're going and what we're doing, and that goes back to the whole ethics question.
Joanne:The ethics is way slower than the science, so we need to spend time as humanity thinking about boundaries, boundaries around possibilities, and that's a very hard thing for us to do, particularly when you have, you know, sort of the tech gurus of the world who think that democracy is software that just doesn't work anymore, and that fascism, or domination by corporations, building countries that are corporately run, is where things should go.
Joanne:That kind of thinking is out there, pushing us into a new world, and there seems to be, at the moment, very little ability for those of us who believe in these precepts of humanity and who we are, we're better together, diversity is a great thing, all those things, to defend those notions of what it means to be human. And if we can't get a sense of humanity as a sacred source of life and love, then we will run into: it looks like a human, it talks like a human, it's faster, it's better, I'll take that over a relationship with a flawed and inconsistent human being. That's a possibility. We need to be intentional about setting boundaries around this, or who knows what will happen.
Bill:As far as whether we will ever see godness in AI, the thing that I'm constantly reminded of, and that this world we live in today especially reminds me of, is that we always, as individuals and as societies, choose the authorities in our lives. We choose what the authoritative voices are, whether that be, you know, the authorities of law enforcement, or whatever the case may be, or the divine authorities in our life. What God do we actually worship? And I say it that way because there are a whole lot of different brands of God out there right now. We decide what is the authoritative one based, really, on whatever metric we decide to apply to it, right? So I fully expect, if people aren't already doing it, that we are not long before there is a cult of AI that really can point to the divinity of...
Joanne:Well, you can see people saying God gave us AI, right? Exactly.
Aaron:Right.
Joanne:You could see people saying, this is the next gift of God. If Jesus was the new covenant, AI is the new new covenant.
Bill:Well, you know, or even, I mean, as I was driving here, I was thinking we're not that far away from: created in the image of God, therefore the created are now creating in the image of God, right? And it's really not that far a stretch to start moving into the idea that anything we create is also divine by virtue of our, you know, imago Dei. So, probably a good place to break for an intermission right here. This has been great. I'm looking forward to the second half of this, but we are going to take an intermission and we will be back shortly.
Bill:All right, and we are back for the second half of our conversation. It's been a pretty deep and meaningful conversation so far, and so I want to try to pull in the perspectives a bit here, because each of you brings a very unique perspective on how AI is already intersecting with your work, whether it's through creative tools or cultural critique or the spiritual life of a community. At the heart of all of this, though, I think what we have been talking a lot about is the deeper question of how we tell our stories and who gets to tell them. So my question to all of you now, and maybe I'll open it up to Joanne first, purely because I can: how do you think AI is shaping the way that we tell our stories, not just in art or literature, but how we express our identities, our beliefs and our place in the world?
Joanne:Thanks for starting with me on that one. Well, it is very interesting, because as we tell our stories, it's always a narrative, right? Each of us has events that happen in our lives; every day something happens to us. And we choose, as we go on, when we think back on our lives, those events that we can draw into a thread that will tell the story of our life, and we forget things, or put away things that don't fit the narrative we have developed for ourselves, right? So that's how we find meaning as human beings: we develop a grand narrative for our life.
Joanne:I am this kind of person because I did these things in my life; I love this person because we had these events together. And so it is an imperfect remembrance of our lives, obviously. I don't know if you've ever seen or heard about those people who can remember absolutely everything that ever happened to them: you say a date, and they know what happened at what time, it's this whole thing. I can't imagine living that way. We have to be able to forget some things and remember some things in order to achieve a meaningful narrative for our lives. That's the reality of humanity, and it's incredibly imperfect. Just ask any lawyer about the recall of witnesses, for instance. Do you know how imperfect that is? Oh, that's the guy, you know, because they vaguely resemble them and they did this. That's how imperfect our memories are.
Joanne:But still, the task, the spiritual task of our lives, is to put some kind of frame around it that gives us some meaning as to why we should go on, why there should be a tomorrow. For me, it's because of this narrative. Now, if AI captures every moment of our lives, like, for instance, if you had every event in your life put into the computer and it created the narrative of what your life is about, it might be a very different thing, right? So in telling our stories, again, we are not just a series of events or data points; meaning is not found in data, it's found in telling our own stories. I don't know if it would be possible to give that major, sacred task of human life to a machine that will try and pull together the threads of everything that ever happened to us. Does that answer your question?
Bill:Sure. I'd like to apologize to the data analysts of Calgary.
Joanne:I'm married to a data architect. What do you mean? I understand data. Well, no, I don't understand data, but I hear about data a lot.
Bill:So, Bertrand, I'll ask you: what do you think? How is AI shaping the way we tell our stories?
Bertrand:Again, I see two things here, and I wonder why it's always two for me, dualities, left side, right side of the brain. But I see two, and I'm happy to report one of them is actually optimistic. All right, so I'm not just going to be, what was it, the wood of suffering?
Joanne:I forget what it was.
Bertrand:I'm not going to just be there. Yeah, so on the one hand, and this is the pessimistic side, I do see it reinforcing our sense of convenience. AI is meant to take care of all these things that are just a pain in the butt for us and that we don't want to have to deal with, and, thank goodness, something else can deal with them. Now, a struggle I have with that is that I do see how, in the past, we have done this again and again. We've even done it to people, right? Slavery is exactly that sort of thing: we just don't have to do it, we just have this other category of people who will do it for us.
Bertrand:The convenience, for me, is driven again by Silicon Valley and productivity. Now, I've seen this in my work. In my day job I teach at Olds College, and I teach communications, actually, and so the students have to learn how to write and present and interact and all that sort of thing. Yeah, so they all hate my class. They hate it. Though I will say, they tell me that they like me, but they just hate the class.
Joanne:It's a fine line there, Bertrand.
Bertrand:I'm telling you. Yeah, I'm trying to make that line as thick as possible, though.
Bill:I'm trying to.
Bertrand:Yeah, so I regularly see them cheating, right, because they just want to get my class over with. And AI, for many of them, was a huge relief, and so they've just dived right into it and they're using it as much as possible. Now, obviously, one of the problems with that is they're not actually using their own brain, and one of the things that I keep trying to teach them is that when you're learning how to write, you're actually learning how to think as well, and I want you to be a good thinker, so don't avoid that. Well, these products come along now... I paused because a little anger script went off in my mind and I was censoring myself. Can I say that?
Bill:Absolutely. Say anything you want.
Bertrand:Yeah. So these products come along, and what happens is they're actually built into the tools that my students use. So, for example, they'll open up a Microsoft Word document, and Microsoft Word now has features that say, hey, you don't have to write this, just tell us what you want and we'll write it for you, AI will do it for you. And there are many programs like that now. So what's happened is that these tech moguls have decided to make productivity easier; they're just going to embed the product. And for my young students, who know nothing else, this legitimizes the use of AI, and it's harder for them to see a problem with it at all. It becomes a part of their identity as people: this is just what we do, right?
Bertrand:Okay, so that's the negative side that I see. The positive side that I see in all of this is that that's obviously not the only use for AI, and we've even been talking about a few others. One of them, which we have touched on regularly and which I think bears a little more scrutiny, is the way in which AI, and this goes back to Beth Frey. No? Beth?
Aaron:Yes, yes, yes.
Bill:Okay, yeah, it's in green. I'm not convinced it is, but it's close. Yes, that's right.
Bertrand:Yeah, so it goes back to Beth Frey, and she's producing art through AI that is obviously AI: it's obviously getting what a human being would do wrong, and it's obviously doing that. And I think the beauty of what that shows is that one relationship we have to AI is that AI can serve for us as a marker of what is not human. A lot of the time, what we are doing throughout human history is trying to assert our humanness. We're trying to say, this is what it means to be human, this is what it means, and we've done it in so many different ways. Religion is one way, for sure, but you could even say professional sports does that too, and the different markers we have for success, like our salaries and things like that. We're always trying to find these ways of doing it, and it seems to me that AI is actually one that we all band together on and say, no, AI is definitely not doing it, and we can do it. And in that sense, it's helping us to tell the story of what humanity is.
Bertrand:And just one final example I'll give. This doesn't have to do with AI, or does it, actually? I forget. But, Joanne, you were talking about how technology sort of outpaces ethics, and recently I heard about, and some of you might be able to add context and correct my details, a group of scientists who have sat down and said, okay, I think we need to actually put the pause button on some of these things. All right, so, yeah, it's all wonderful that we're charging forward with some of these things, and some of them have to do with, like, CRISPR and genetics, and there's a whole bunch of different areas of science. And I think AI might be in there, I'm not sure, though.
Bertrand:And they said, yeah, we shouldn't just be running gung-ho with this. We need to sit down and decide, okay, what are the things that we can just open up and pursue, and what are the things to which we should just say no? And to me, this feels like a welcome kind of change. This is a conscious recognition that all of this is moving faster than our ethics, and we are the ones who choose our authorities, we are the ones who make these choices, so let's just do it here. And that's very, very heartening for me. I see some optimism in that.
Bill:Yeah, I was thinking, as you were talking in your pessimistic phase, about the argument when I was in school. Not that they had just invented the calculator when I was in school, but there was a great deal of debate around whether or not calculators should be allowed in math, for the simple fact that you obviously want your students to understand how to multiply three times three without having to reach for the calculator to make it happen. You certainly want the person constructing your bridge to be able to do that when they forget their calculator at home on the first day of the job, right? And it seems to be the same kind of thing now with all of these tools. Like, I see, when I open up Microsoft Word, Copilot will do this for you, right? You tell me what it is you want, you let me know. Exactly. And so far I've never had to use it.
Bill:Many people do, though. Many of my friends do. It becomes the challenge around, like, just because you have the tool... the tool is great once you understand how the tool does the work, right? And I would never fault that. I still reach for the calculator. Sure, I don't know many people who don't reach for the calculator. Exactly, yeah. But I know that if I had to... I just raised my hand.
Aaron:I'm a calculator user as well.
Bill:Oh, okay. We thought you were signaling the opposite. Actually, I thought you were going to say you're not.
Bertrand:I'm a calculator addict.
Aaron:Exactly. My name is Gary and I'm a calculator addict.
Bill:Exactly. I do it with pencil, by hand, on a piece of paper. But it's important to be able to at least know how that happens, and I know that if push comes to shove I could do most of what my calculator can do. It might take me a little longer, but I understand the theory behind it. So we have these, I mean, these clash-of-the-titans moments at my dining room table when my kids are doing their math homework. They know the six different places there is a calculator: on their phone, on their computer, tucked in the junk drawer. And the constant sort of, you can go get the calculator as soon as you can tell me what the answer is. Right, right.
Joanne:Do you know, there's an interesting memory of mine, I think Dave was with me at the time, and it was a long time ago when the GST was 7% and we were buying something for a dollar, and she said, okay, now let me calculate the GST: $1.00 times 1.07. Do you know what? It was just like, okay, so here's the thing. If we give over the understanding that 7% of $1 is 7 cents, are we becoming extensions of the machine? Automation is an interesting thing to me because it frees up mundane tasks, right? Do you know, they say the education system we have now was created so that Henry Ford could have people do the repetitive, icky work over and over again, and it isn't really conducive to creativity and blossoming and finding yourselves and all those kinds of things, right? So if we have an education system that's trying to make you an automaton, that's essentially what it's trying to do.
Joanne:When I did youth ministry, I used to talk about the cubicle kids. Okay, the kids in my ministry who were going to end up working for an oil and gas company downtown in a cubicle, right? Because we would always talk about, oh, especially in the United Church, oh, do you know, the person who runs Greenpeace went to the United Church, and oh, did you know the person who does this? There are all these exceptional people who are somehow connected to the United Church, but most of the people who come through are going to be cubicle kids. But if we don't understand what we're doing, if everything that can be automated is automated and we don't have anything left, do you know, have we become the machine?
Bill:If you listen really carefully right now, you can hear Ricardo screaming from the US that he has something he wants to say: labour, labour, yeah.
Joanne:Well, it's very interesting, because what happens when everything is automated is that people lose jobs, right? But Sweden took a different tack on this. We used to have this thing called ethical dilemmas here, and it was something we discussed. They preserved not specific jobs, but work, okay. So if they were going to automate something and put 5,000 people out of work, their responsibility as a society was to find some different work for those 5,000 people, and not just lay them off and say, see if you can retrain somewhere. So "we don't preserve jobs, we preserve work" is a very interesting thing. But the whole joy of AI that people would talk about is that if we can automate all those mundane tasks that make us a machine, if we can get machines to do the machine-like work of our lives, then we'll have more space for creativity and more space to become, and being human will actually be us realizing, you know, as a Christian minister, who God intended us to be, which is wonderful, in our diversity and our interests and everything, to discover that because we're no longer a machine. You know, that's a possibility, yeah.
Bill:I mean, I feel like I have to confess that I have an AI vacuum at home, that actually does free up.
Bertrand:Better job than you.
Bill:Well, it does a better job than me, because I never do it anyways, but it does free up time right.
Bill:And I'm not enslaved to it by any stretch of the imagination, although I do talk to it and it does have a name. But a task that is important in our house, for, you know, having pets and whatever, that takes about two hours on the weekend to do is now done, and it's not something that requires a great deal of work or effort on our part. We have to change the water every once in a while and empty the dust bag when it tells us to, but it actually frees up two hours to be doing other things. That has actually manifested in going for walks together, or getting the dogs out to the dog park, or things that are actually way more kind of living than the task of pushing the stick vacuum back and forth repeatedly and really trying to get into the corners as best you can in the carpet.
Bertrand:So I'm just going to interject very quickly, and then we can go on to Aaron. When I was a teenager, and I lived actually in this neighborhood when I was a teenager, my dad had this nickname my mom gave him, and the nickname was Design Boy. She called him Design Boy because when he vacuumed our shag carpet in 1981, he would make these beautiful symmetrical patterns on the carpet, and none of us wanted to even walk on it afterwards. We were so impressed.
Bill:That's how I used to mow my lawn.
Bertrand:So I guess the question is: is it really a task, or have you just not found the creativity in it that is available to you?
Bill:You can try to market it however you want.
Aaron:That is peak vacuum, I have to say. Where it's so nice you don't even step on it. In fact, that promotes even further cleanliness: no one stepping on the carpet, just stepping around the sides of it.
Bill:So how is AI shaping how we tell our stories?
Aaron:Well, there's so many things. Okay, there's so much stuff. All right, I'm going to open with this: we were talking earlier in the break about the movie Her, which stars Joaquin Phoenix. It sits in my heart as one of my favorite films, about an automated girlfriend that's just on his phone. It's not like a physical relationship; they just talk all the time with each other, and I think it's Scarlett Johansson who is the girlfriend.
Bertrand:Which is why they used her voice afterwards, right? Yes, yeah, the robo-girlfriend.
Aaron:Exactly, yeah. But that's actually already happened. I heard it on the Guardian, and I think it was somewhere in Europe. There was an app that was an AI partner, and at some point, I think they were beyond beta testing but still sort of trying things out, and they decided to shut it down. And the amount of heartbroken people... people were heartbroken. They were like, where's my partner?
Aaron:where's my? You know, my friend, my soul mate? Like they were, they were apparently having just these really intense moments. Um. So I guess, in terms of like story, the two words that come to my mind are and one of them is a very I associate very closely with church.
Aaron:The first one would be just the idea of voice artists trying to articulate why their work, that they do by hand, is more important than some robot spewing out some machine spewing out a, an image that's been assembled, obviously, from an aggregate of you know, four billion images. Um, and so the the element of voice, that's the best description I can think of of why human-created art is important in that, partly because each individual does have an individual, unique voice. Each person has a unique style of drawing, even the most photorealist drawings. If you put them side by side, you'd notice their take on photorealism as an art form Well, but also just the person who uses the camera is also expressing a voice in terms of what they themselves see as a person, as what they're experiencing, and it could be a moment in time where they just looked around and said I got to take a picture of that, like right now, or, you know, and I guess so that's the other word is discernment, which is one that I think about a lot lately in terms of, well, in part, like when you talk about your students and getting them to not lean on the tools to use their own minds. Because, well, in part, because this itself is a very complicated computer that we have in our own heads, right, and the ability to use that like critically is, yeah, I can't even describe how important that is. Yeah, yeah, it's. I can't even describe how important that is.
Aaron:It's like, especially in times where we are faced with value decisions and in a lot of cases we're, because of social media, I think a lot of voices are concentrated in one place and yet still it's important that we're able to decide well, am I on board with this voice, with these voices?
Aaron:Yes, they're all the same and they all agree with each other. But, but what in fact, is the like? What's the but? It could be the choice between a very direct, straight road and a forest path where maybe there's some winding to do, or it's a little more narrow and nuanced or takes more effort to go to the other end of, to get to the other end of the solution. So, and particularly, yeah, in issues of conflict, issues of disagreements, of how we address. I mean, this is a very surface level example, but you know, I had neighbors who would, instead of shoveling their own walk, would walk across my yard to walk on the walk that I had shoveled for myself which is super, just the most nimbiest, intensely first world problem white male, anglo-saxon Protestants like issue, like what are you doing?
Aaron:And yet the solution was found in, in just coming like in thinking through okay, how am I going to address this? Because it's going to drive me crazy, a bit like having I don't know like a fly buzzing around my head. The solution was ultimately found in just shoveling their walk and discovering that 30 seconds of my time to just take the snow off their walk was both. You know, I like being outside, I kind of like shoveling, and it just did it. I didn't say anything, I just started shoveling their walk and I received a lot of thank yous over time and eventually they responded in kind with shoveling the walk and it was so it was. You know, I could have written him a letter and posted on his door. I could have done a lot of different things. There are a lot of different ways to solve that problem.
Joanne:Call the bylaw officer.
Aaron:Call the well yeah, I had my phone on the number, but those and so then this takes me to another story from On being, hosted by Krista Tippett, another great podcast. Story on from on being, uh, hosted by krista tippett, another great podcast where she interviewed a man, a young, a young man I'm anyways young or old a man, a fella. He was in college and he he's uh, sorry, he was, he was jewish and he met the son of a very prominent white supremacist and the story is that he invited this son. They were both students, they were students at this school and he decided to invite this person to his Shabbat dinners. Shabbos, shabbos, dinners, like very just, hey, come on, come eat with us, Just come and sit with us. And it took, it happened over the course of two years. There were lots of conversations that occurred, both, you know, with his friends group and et cetera. He would obviously tell a better story. You could probably look it up on Bing story, you could probably look it up on being and after those two years he renounced his white supremacy, his roots, I mean, this is in his family, basically.
Aaron:Now, what's interesting at least the first thought in my mind is he probably couldn't have invited just anybody like, not just any son of a white supremacist may have been a suitable candidate to come to these shabbat dinners. He obviously looked him in the eye and saw something beyond just the whatever. Um, that said, I think I should, I could invite this guy, yeah, I think this guy's a candidate to come over, and so. So those are like. Those are just two examples of like the discernment of, uh, I mean what we might call a soft skill, but of of like, reading the air, of reading the situation, that there's no, um, you know, those are, those are things where there may be six to 10 paths in front of you. He could have said I'm not talking to you ever again, you're the anathema. And maybe there would have been people where he should have said you're the anathema, I can't do this, or you're not safe, or something, but so it's.
Aaron:And yet, at the same time, there are we are, of course, navigate movements, uh, and and, where people have taken strong stances in certain moments. But, but to be able to discern when, um, you know, uh, when is it time to to welcome someone in, versus when is it time to close the door? You know, when is it time to send someone to jail? When is it time to throw away the key? When is it time to let someone out of jail? All those, those are all kind of, those are difficult to use to put data towards, basically.
Aaron:So that's I think, yeah, that's an area, that, but that's also an area that we have to cultivate. We have to cultivate with our children. So I would close on the idea of well, I think temperament is a Roman term that describes the hammering of metals and the mixing of the process of mixing metals by hammering them into the sword. And it's the idea of holding different, the idea of holding different emotions at the same time. So someone being of good temper can manage the various emotions flowing through them, and a bad temper is, you know, that stuff sort of comes out, so it's. And those are things. Again, those are, those are skill, those are just human skills. That that we um, that we have to be, especially when we have a phone in our hands almost half the day. Now we have to cultivate them to go forward, just to make better decisions and to again read the air. And I'll close with that or I'll end on that.
Joanne:So, again, I think it's a very interesting thing, because something that is essential to the human experience is feelings of remorse, regret, shame in some ways, the ability to forgive, to see beyond data points, to sense all these things. AI may give us unlimited knowledge. You know, what is that, 1 Corinthians 13? If I have the gift of knowledge and know all things and prophecy and all those things, and have not love, I am a sounding brass. And I think there are these essential, again, sacred, essential human characteristics, like shame, forgiveness, remorse, regret, joy, those things that we can grow as our being, which is very different than gathering knowledge. Right? Becoming is a very different experience than learning more, and so AI is a tool to help us learn more, faster, better, okay. But if that takes over our becoming, if we no longer have the ability to feel guilt or shame, or we don't experience regret, we have lost our humanity in ways that we can't be rescued from by a machine.
Bill:Yeah, I'm reminded there was research done back when Facebook first came out, when everybody was on Facebook, and they studied teenagers, junior high and senior high, and their posting habits on social media. What they began to learn was that if you were a teenage girl and you posted a photo posed in a certain way and it generated 50 likes, and then you posted another photo posed in another way and it only generated 20 likes, then you would see, over time, the more likes, the more you posed in that way, until all of your poses were exactly the same, only from the left, both thumbs up, whatever the magic kind of equation was. What they actually started to do was remove what they assumed were characteristics or things about themselves that didn't fit the mold or didn't generate love in some way. And the problem always ended up being: you can love or you can simulate love, right, and it's not the same thing. We learned it even in the pandemic, that when we couldn't be together, we could simulate community, and it would do many of the same things we needed it to do, but it was not actually a replacement for authentic, in-person presence. And so, Bertrand, in your poem The Bow, you write about the river's deep connection to the land and its layered history, and the quote that I actually really loved was, "it is tongued and grooved, the firmament baby of this last best."
Bill:And, and I, and I find that so much of your poetry actually attends closely to embodiment, whether it be like physical people and embodiment, or just the weight of place or the rhythm of language, or the physical memory that is carried in the land or in space. And so I'm going to put it to you first, but it's a question for everybody, because part of what AI is doing, we see it doing it social media, technology in general, all of these things, but certainly AI is shaping our world by disembodying intelligence and knowledge and even history and our sense of selves. So what do you think truly gets lost about our humanity when we hand over more of our lives and our identity to this disembodied thing that doesn't feel and doesn't remember in the same way or take up space or be embodied the way that we are?
Bertrand:Yeah, it's a very interesting question. I think there's a paradox at the heart of it, in fact, because I think what happens, and what we're seeing these days, is that AI, and I'll go back to my students to describe this example, so they have all these products that just have AI built into them. They see it as a way of easing the burden of their lives so they don't have to do certain tasks. They're not Design Boy vacuumers, right? They just want it done and don't want to think about it. They're certainly not discerning because of that. In fact, it's a vicious cycle that happens there.
Bertrand:But on top of that, I think this shows that AI has been accommodating a, what's the word, a general social development that social media began, which is this sense of idealism and perfection, and all these students feel like they're just supposed to be there already. And why I called this a paradox, why I feel there's a paradox at the heart of this, is because, Joanne, you said, if we don't feel shame or these kinds of ugly things, how can we be human? Exactly. And in part, I think my students do feel those things because of AI as well. And this is why I feel like AI is also this interesting marker for us of what is not human and what is human at the same time. Because, yeah, they feel inadequate, they feel as though they're not producing as they should be, and AI is just relieving them of that, but it's also confirming: oh, you can't do this, right, you're not able to do this. And it just fits into everything they've been experiencing from a very young age, going onto social media and seeing that ideal image on Instagram, or whatever it is these days that they're looking at, and feeling inadequate.
Bertrand:So I do see this disembodying as a problem, I do, and for me, the answer is things like poetry, for sure. But honestly, poetry is not doing anything that different in this sense than, well, I shouldn't say that. Okay, I'm going to try this out. This thought just came to me, let's try it out. I don't know why I looked at you, because I think we all thought you were going to be the mathematician, and then, yeah. So, podcast listeners, I was just looking at Aaron just now, who put his hand up when we were saying that.
Aaron:It was a downhill trajectory from about grade nine.
Bertrand:Grade nine yeah, that's not bad. Grade nine, that's not bad.
Aaron:Just a downhill slide, yeah.
Bertrand:A lot of people, as soon as they get out of elementary school, they're done right at that point.
Bill:So yeah, the tragedy is, I might actually be the mathematician on this panel.
Bertrand:So maybe this is a slight defense too of your kids' use of calculators. Maybe We'll see. But I see parallels between the two and poetry and math in that and these are social parallels I see in that nobody really knows what poetry does exactly or how it works, and most people don't even read poetry honestly. Yeah well, for example, when I say I'm a poet, I get all this respect and, like they like start bowing down at me, right, and in part it's because they know there's such a thing as poetry in the universe. They don't know what it does in the universe. Yeah, they know they can't do it and so they're super impressed when they meet someone who does this totally useless thing that they, they don't know what it does.
Bertrand:And for me, this is exactly how we treat math as well, right? So we don't really know what it does in the world. Most of us don't really know. We took it in school and then we're done with it after that. If we meet someone who actually uses it, say a rocket scientist, we say, wow, that guy's a genius and holy cow, that's amazing. And I don't know how he does it. I don't know where he does it, but I can't do that Genius, yeah.
Bertrand:Now I forget why I was bringing up that. What's this analogy?
Joanne:for we were on embodied things. Yeah, something about how great.
Bertrand:Oh yes, poetry, that poetry might just do the same thing that AI is also doing, because, let's face it, folks, some of these mathematicians became coders as well and helped to produce AI, in fact.
Joanne:Yeah.
Bertrand:So yeah, I think that, paradoxically, poetry is a kind of grounding force. I think it is. I think people will go and, you know, not everyone, but some people will read it and lots of people try. So, for example, my colleagues at Olds College none of them are into poetry, they all read my poetry Like they all read it and they come back and they talk to me about it and they love it, right. So maybe we don't get it, but we read it and it does something for us, right, and we're prepared for that. We want it to do something for us. But I am wondering if you know, maybe there isn't something, maybe in the future or in some way, in which we will think that AI does do something as well for us? We've already talked about chatbots and how people have felt connected to these things that we never would have thought they would have. What was your question again?
Bill:What do you think we lose as we give more and more of these embodied kinds of elements of humanity over to AI?
Bertrand:So I do think that we lose our sense of place. For sure, I think we do, and that is very important to me. It's what all my poetry is about, in fact.
Bill:Yeah, and I guess what I was speculating on is, I just wonder if that's right. Maybe we don't lose it; maybe people find other ways of connecting to these things through AI. I just don't want them to. Maybe it's just that. So, one of the things that you said that I want to jump on, because it landed in my wheelhouse right there, is this idea of the shame that comes from AI being able to do things that you can't do, right? And again, this constant drive, whether it be what photo angle is going to get you the best looks, or even sitting in parent-teacher interviews and questioning, like, why are we talking about the letter grade more than we're talking about the potential or the character of the student?
Bill:And all these things, like, we live in a day and an age where, I would say, what started with teenagers finding that perfect angle that was going to tell people they were perfect became something that a group of savants figured out how to capitalize on, in a way that now we have artificial intelligence that tells you what you need to do to be more perfect. Right? In a world that will constantly tell you: you will never be enough, you will never have enough, you will never be successful enough, you will never be fast enough, smart enough, rich enough, any of it.
Bill:And underneath all of that is still this understanding that we worship a God that says: you are enough as you are. Whatever angle I'm looking at, you are perfectly made. Whatever you bring to the table of your true, authentic self is exactly what I wanted brought to the table, and I made you this way for a reason.
Bertrand:I think that's right. I think that's the crucial difference right there. So we may have AI that says you're not good enough, you're not good enough, but it's to make you feel bad so that you will just continue to use the product. And then we have the other side of things, which is God, or just whatever we think of as more human, that you are getting at, Joanne, which is: yeah, okay, you pick your nose and you trip and you can't tie your shoes properly, but that's perfectly fine.
Joanne:That's it.
Bertrand:That's all you need to do, that's all you need to be. Everything is good, and so that's the crucial difference. I think that is right. Yeah, yeah.
Joanne:So, as I was sitting here hearing this conversation, I remembered being in an ethics class and learning how important it is to live an examined life.
Bertrand:You've taken a lot of ethics classes. Yeah, I loved it.
Bill:We also learned last month that she wrote a lot of papers about weird, kinky things. Yes, that's true.
Joanne:In seminary, I wasn't really interested in traditional theology. But, Bill, you were talking about how people construct themselves according to what computers, AI, tell them to be, what their best self is, and that is certainly not something new, for us to try and construct ourselves based on outer feedback, right? So an unexamined life would say: tell me what I need to be in order to be acceptable to you, which is what AI is doing for folks. Tell me who I need to be, what I need to look like, what I need to buy, in particular, in our culture, what do I need to buy in order to be acceptable? And people give away their agency to outside forces all the time.
Joanne:It reminds me, I was thinking of this movie, that's what I was looking up, called The Shape of Things, where this kind of dowdy guy, who's Paul Rudd, how Paul Rudd could ever be dowdy, I don't know, but he's this unassuming guy, and he meets Rachel Weisz, who's very beautiful, and they meet in an art gallery or something, and she starts to, you know, tell him, well, maybe you should, I mean, women make over husbands all the time, right? Maybe you should wear this shirt, oh, your hair would look better this way, contacts would be great. And she starts to shape him, and he becomes very influenced by her, and at the end, he's been her art project. Right? She has decided this is her art project: to make over this person.
Joanne:So the idea that it's only AI or machines that rob us of our agency is wrong. It's happened historically all the time. So the most important thing in living a wholehearted human life is an examined life: to be critical of people who are telling you things, as well as machines that are telling you things. To think about it: does this person bring out the best in me, or does this person limit who I am by trying to control me or put boundaries on me? So there is something essentially human about being very intentional and discerning, that word again, about who our authorities are and who we listen to. And, as Christian folk, we always go back to: we have been created in the image of God, and what brings love and life to us should be encouraged, and what squelches our ability to become, to experience joy, to experience happiness, to love, those things are destructive and we need to name them, whether they're machines or not.
Aaron:Well said, okay, read the question one more time.
Bill:What does it mean, or what do we lose, when we hand over more of our lives to machines that do not feel, remember or take up space the way we do?
Aaron:So what I've been thinking about, and maybe this is a broader social media picture as much as AI, is the access we end up having. For example, I love Instagram because I get access to more art generally across the world, different stuff, historical stuff even, there are sites dedicated to historical artwork, images of people that I will never meet, images of places or contexts that aren't that useful in some ways and that might very well pull me away from the context that I'm in, or from looking at the context I'm in and breathing into the fact that I'm here, that I have the relationships I have here because of living in the place that is Alberta, or Western Canada, if that gives a big enough picture. Some of those meetings are chanced by happenstance; some of them are much more developed in terms of just being in regular contact with people. So I guess it's the way in which we might, and we do, want to learn about the rest of the world and try to expand that horizon, and, at the same time, find ways of still being centered in the present, in the room that we're in, versus perhaps being just somewhere else, in our phones. And it's not that exposure is bad, because we need to see other cultures and recognize the pieces that form culture, environment and history and just basic weather, things like that. But to not be pulled out of, to continue to be able to come back to and not be drawn away from our context, which might be a flat prairie some of the time, where the wind's just blowing across and all you see are the waving stalks of grain. He's going to get poetic right now. Ooh, I'm getting close there, I got close.
Aaron:Sometimes the present can be pretty boring, it can be uneventful, and yet being centered there, in part, does allow us to loosen our grip on some of the things we're wrestling with. Sometimes letting go of something for a while allows it to change a bit, and when we come back to it, it's maybe a little less intense, easier to manage. Being present just in the room with the people we're with, looking at each other eye to eye, versus being sort of immersed. Even without AI, our technology, our devices already suck us in. So this is just another layer of that, where it might feel like an answer.
Aaron:You know, an answer, a great answer machine for all the great answers. And by all means, I think there will be excellent, very useful things. The intense ability to process data will benefit medicine, and there may even be time-savers, where we need a little blurb for a poster or something and we don't want to take the time, or, like you say, certain areas of automation that we hopefully are intentional about. But definitely, with so much noise, there's the importance of, again, bringing oneself back into the room, and as an introvert, I can say sometimes that can be very hard. It feels like you're pulling yourself back, pulling yourself away from whatever's got you, whatever you're gripping onto.
Bill:So yeah, I feel like we've reached a good place to stop the conversation for now. We do last thoughts, so I'm going to start at this end of the table and work my way this way. Joanne, your last thought for the evening on everything we've talked about tonight?
Joanne:Well, it just strikes me that living life as a fully formed human requires a great amount of intention. How we use the tools that we have created that are shaping our future is probably the most important question to ask. Not what they can do, not how much they can do, but what is appropriate for us to give over to AI and what is appropriate for us, as humanity, to hold tightly, because it is everything we are.
Bertrand:Well said. Again, I was going to say those exact words, but in poetic form.
Joanne:Yeah, exactly.
Bill:It just would have sounded so much better.
Bertrand:Yeah, that's very well said, I think. My final thoughts are, and this has been an excellent conversation, by the way, I really enjoyed it, that much of what we're dancing around is how hard it can be to just be a person.
Bertrand:You used the word intention, Joanne, and I think that's very true. I think what these AI tools are helping to show us is, on the one hand, yes, that we want things to be easier for us, but also that we do need to work a little bit at things like being ugly and making mistakes and tripping and falling and imperfection, all of that. We actually have to work at that, not just do it, because I know I'm an expert at a lot of those things, believe me. But the work is in being okay with it and just recognizing: you are you, at your place. And that is the big battle that we have against AI. I think, ultimately, what we will see as human, or what's essential for being human, may change, and we have to be okay with that, so long as we can be. That's my final thought.
Aaron:I think we just need to constantly remind ourselves of all of the things in our world and universe that we can't see yet, and that there will always be an infinite number of things, even well into the future, there will always be humongous, wide stretches of things that we just can't see, that are outside of our vision, outside of our understanding and knowledge, and part of being human is living with that. And even as we have these answer machines, or whatever, that can feel like they are going to wrap up all the questions and give us tons of answers, give good answers to so many of the hard questions, we will still always have to struggle with, or live with, not knowing, basically.
Bill:Well, I want to say thank you to everybody tonight, to our live audience that came here tonight, to Joanne, obviously, for being a mainstay always in these, and especially to Aaron and Bertrand for your presence tonight, because it has been fantastic to have both of you here and this has been an amazing conversation. I also want to give thanks to the United Church Foundation for their support of this podcast. And my final words are: whatever answers may lie out there, whatever we know and don't know, wherever the machines try to take over our humanity, know that, at the core of it all, you are enough as you are. You don't need to know everything, you don't need to be perfect, because you are perfectly made as you are, and we love you as you are. So, with that, I'm Bill Weaver, this has been Prepared to Drown, we are signing off, and we will see you next month. And that's a wrap, friends. If your brain is spinning, your heart is stirred or your faith feels a little more complicated after that conversation, then good, that means we're doing something right.
Bill:Prepared to Drown is recorded live every month at McDougall United Church in Calgary, Alberta, with a real audience, real questions and real coffee. So if you're in the area, come and find us, because we'd love to have you in the room. You can listen to past episodes and keep the conversation going by joining us on Patreon or by checking out preparedtodrown.com. And before we go, hear this: your humanity is not a bug to be fixed by some algorithm. You are not data. You are not disposable. You are a beloved, messy, glorious being, and you are enough exactly as you are. Until next time, stay curious, stay kind, and remember that grace doesn't require perfection, just presence.