
The Current
We're seeking inspiration toward deeper discipleship through conversations with people working toward justice, cultivating deep spiritual practices, forming community and connection in significant ways, and helping one another heal from trauma. As we follow Christ to the margins of society, to the wounded and grieving, and into the hard work of peacemaking, we find that we are not alone on this journey. Join us to resist despair, and to regain some hope in the world, in the church, and in Christ.
Most weeks, Pastor Chris Nafis is talking with scholars and practitioners who are inspiring and faithful, and some weeks Pastor Chris is engaging with the book of Acts. Each week, we find the Spirit calling us deeper into the death and resurrection of Jesus, into a life with God, and into loving one another well.
This is a ministry of Living Water Church of the Nazarene, which gathers in San Diego's East Village, the epicenter of homelessness in this city. We are committed to meaningful worship, community formation, and service. Join us sometime :)
Derek Kubilus - What Would Wesley Do With ChatGPT?
As artificial intelligence weaves itself into the fabric of our lives, how should people of faith respond?
This conversation between two pastors explores the spectrum of reactions to AI in ministry settings—from those who eagerly use it for sermon preparation to those who refuse to touch it. Rather than settling for simplistic answers, they dig into the nuanced ethical terrain of this emerging technology, drawing parallels to previous communication revolutions like the Gutenberg printing press and radio that similarly upended society.
What makes this discussion particularly valuable is its focus on human formation. While acknowledging legitimate concerns about job displacement and environmental impacts, the deeper questions emerge around how these technologies shape us as people. Can we outsource our thinking to AI without sacrificing something essential about our humanity? Recent studies showing deterioration in critical thinking skills among regular ChatGPT users suggest the answer might be no.
The conversation takes a fascinating turn when examining how AI might amplify existing problems we've seen with social media—the illusion of connection leading to actual isolation, the absence of natural boundaries, and the addictive pull of technologies designed to keep us engaged. Through a Wesleyan lens of accountability and virtue formation, they suggest communities of faith might offer exactly the kind of intentional discernment needed to navigate this new frontier.
Whether you're curious about AI's implications, concerned about its impacts, or simply trying to develop a thoughtful approach to technology, this episode offers a compassionate framework for moving forward with both wisdom and hope. As one pastor notes, "The church has seen a lot of new things over 2,000 years, and we've come up with some pretty good wisdom for how to deal with them."
What boundaries have you set around technology in your own life? Join the conversation and share your thoughts on navigating faith in the age of AI.
Chris Nafis: All right, all right. Well, hey, Derek, it's great to see you. We were just talking off screen; it's been like 15 years since we've had a real conversation, and I'm so grateful you're willing to spend some time. Thank you for coming on the podcast.
Derek Kubilus: Thank you for having me. This should be fun.
Chris Nafis: I think it will be fun.
Chris Nafis: Yeah, I saw your Facebook post; that's how I ended up contacting you. You posted a New York Times article about a former OpenAI employee who was making all these doomsday predictions, and I was like, man, I have been thinking a lot about this. And you're someone who, at Duke, I knew as really kind of opinionated in some ways, but really sharp, intelligent, and thoughtful. There aren't many people with strong opinions who I've seen actually change their opinions as they've had conversations, and I've seen you go through some of that, so I just have a lot of respect for you and how you think through things.
Derek Kubilus: Well, thank you. I appreciate that.
Chris Nafis: I was like, yeah, I should talk to Derek about AI. This would be great; we could have a really fun conversation. How did you start thinking about AI? Or how did you come to focus on it?
Derek Kubilus: Well, I've been aware of it just through the cultural hype and such.
Derek Kubilus: I probably didn't interact with it until fairly late, but it was actually my wife who got me thinking really deeply about it, because she's an executive assistant for a CEO at a big nonprofit, and she's constantly taking classes and reading books to take her skills to the next level. One of those classes was on using AI, and I was like, wait a second, you're telling me you took a super practical, essentially continuing-ed course on using this thing? That was maybe a year or two ago, and I thought, whoa, this is already way more integrated into life than I would have guessed. We've all seen people make the funny pictures and things like that. So we've had several conversations, and that has led me to try to understand the technology itself, which is so complicated. I don't think I'm wrong to say that even the people who design the technology don't fully know how it works.
Chris Nafis: That's what I'm reading, that they don't really know how this thing is happening. They've kind of lost track of it.
Derek Kubilus: Well, it's kind of designed to do that. They call it a black box: you set it up and let it go, and it grows in ways that are almost impossible to chart. But even more than how it works, I'm interested in how we respond to it, and I have friends who are all over the gamut, especially in ministry. I'm a United Methodist clergy person, and I have one friend who is all aboard the AI train. He is unabashed about it. He uses it to write sermons and develop Bible studies and all of these things, and he's constantly posting pictures and graphics and videos that he's creating for his church using AI. And I have some friends who will not touch it for their own ethical reasons, who don't believe it should exist, and who, when they see their friends posting AI things on social media, will actually confront them in all capital letters and tell them to stop. So there's this huge, wide spectrum right now in people's feelings about AI.
Chris Nafis: Have you used it much yourself?
Derek Kubilus: I have. I think you could characterize it as having dipped my toe in the water. I refuse to use it in sermon writing or in book writing; those are the places I just don't want to be tainted.
Derek Kubilus: Where I have used it, and I feel like I'm sharing a deep, dark secret right now, but that's part of it too, because it almost feels a little dishonest: there have been a couple of times when I've had to write an email about an issue that could have a lot of conflict surrounding it. As you may recall from our days at Duke Divinity School, I don't always have that much tact with my words, so I've essentially written out everything I want to say in an email and then told ChatGPT, hey, make this sound warm and professional and inoffensive, but get all these points across. And it has done an okay job. Nothing I could copy and paste whole cloth, but I've definitely changed some things in light of what it said. I also feel kind of dirty about that. How about you?
Chris Nafis: Yeah. The first time I really tried it, I was trying to make a lineup for my son's soccer team that I was coaching, and it was complicated, because I wanted everybody to play three quarters of the game and each of them to play only two positions. I thought, this is basically a math problem; I wonder if Gemini, which was Google's model at the time, could do it. I put it in last fall, and it was incapable of doing it. It kept spitting out lineups where the same players played the same positions for three quarters and then it would mix them up for the fourth quarter.
Derek Kubilus: And it wasn't.
Chris Nafis: And then I would tell it, no, this is incorrect, you're not doing what I'm asking you to do. And it would say, oh, I'm sorry, I messed up, and then give me another version of the same thing.
Chris Nafis: So the first time I used it myself, I was very unimpressed, because it couldn't do this simple math thing. And then I learned that's really not what these are; they're language models, so that kind of task really isn't the strength of what they do.
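[A quick illustration of the point that the lineup really is "a math problem": the constraints Chris describes, everyone plays three of four quarters and nobody covers more than two positions, form a small constraint-satisfaction search that ordinary code can solve exactly, the kind of exact bookkeeping a language model handles poorly. The sketch below is a hypothetical example added for illustration only; the roster names, the six field positions, and the precise rules are assumptions, not anything used on the show.]

```python
from itertools import combinations

# Hypothetical roster and formation; the rules are assumptions drawn from the
# anecdote: everyone plays three of four quarters, at most two positions each.
PLAYERS = ["Ava", "Ben", "Cal", "Dee", "Eli", "Fay", "Gus", "Hana"]
POSITIONS = ["GK", "LD", "RD", "MID", "LF", "RF"]  # six on the field per quarter
QUARTERS = 4
MAX_POSITIONS = 2  # no player holds more than two distinct positions


def build_lineup():
    sitters_per_q = len(PLAYERS) - len(POSITIONS)
    # 8 players, 6 spots, 4 quarters: everyone sits exactly one quarter.
    assert sitters_per_q * QUARTERS == len(PLAYERS)

    schedule = []                            # schedule[q] maps position -> player
    has_sat = set()                          # players who have taken their quarter off
    pos_used = {p: set() for p in PLAYERS}   # positions each player has already held

    def assign_positions(on_field, idx, current):
        """Backtrack over the position slots for a single quarter."""
        if idx == len(POSITIONS):
            return dict(current)
        pos = POSITIONS[idx]
        for player in on_field:
            if player in current.values():
                continue  # already placed somewhere this quarter
            if pos not in pos_used[player] and len(pos_used[player]) >= MAX_POSITIONS:
                continue  # would give this player a third position
            current[pos] = player
            newly_added = pos not in pos_used[player]
            if newly_added:
                pos_used[player].add(pos)
            found = assign_positions(on_field, idx + 1, current)
            if found is not None:
                return found
            if newly_added:
                pos_used[player].remove(pos)
            del current[pos]
        return None

    def fill_quarter(q):
        if q == QUARTERS:
            return True
        resting_candidates = [p for p in PLAYERS if p not in has_sat]
        for sitters in combinations(resting_candidates, sitters_per_q):
            on_field = [p for p in PLAYERS if p not in sitters]
            snapshot = {p: set(s) for p, s in pos_used.items()}
            quarter = assign_positions(on_field, 0, {})
            if quarter is None:
                continue
            has_sat.update(sitters)
            schedule.append(quarter)
            if fill_quarter(q + 1):
                return True
            # Undo this quarter and try a different set of sitters.
            schedule.pop()
            has_sat.difference_update(sitters)
            pos_used.clear()
            pos_used.update(snapshot)
        return False

    return schedule if fill_quarter(0) else None


if __name__ == "__main__":
    lineup = build_lineup()
    if lineup is None:
        print("No valid lineup for these constraints.")
    else:
        for q, quarter in enumerate(lineup, start=1):
            print(f"Q{q}: " + ", ".join(f"{pos}={player}" for pos, player in quarter.items()))
```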
Chris Nafis: I have been using it, though, and I also feel like this is a deep, dark secret, but I've been using it on this podcast, actually, to make a transcript of each episode, which is helpful to have for people who are hard of hearing. Doing that by hand is very labor-intensive, but it does it super fast and easily.
Chris Nafis: There's a tool in our hosting site that does it, and then it writes a little synopsis that I've used. Again, I'm the same as you: for a book or a sermon or anything that is the content itself, I'm a hard no, I don't want anything to do with it. But since it's a summary of something I've already done, it feels a little more like grunt work, a place where it's actually useful, and it has saved me a lot of time. So I don't know if everyone's going to tune out of this podcast now that they know.
Derek Kubilus: It's all fake, yeah.
Chris Nafis: Actually, I have cloned myself. I'm just kidding. But other than that, I really haven't used it at all. I haven't really done emails and such, but I'm sure I've interacted with it a fair amount, which is part of what we're working through: this thing is coming whether we're intentionally engaging with it or not. We're getting spam calls from it, we're seeing it on...
Derek Kubilus: Oh yeah. I've read a couple of books and a lot of online stuff, and I've read in several places about the telltale signs of AI-generated text, and I feel like I'm getting pretty good at spotting it in the wild. I see it every day. I see the pictures, which I think are really easy to spot, although I'm told baby boomers might struggle with it; I see those on social media absolutely every day. And Pinterest, I have a few things I sometimes like to follow there, and that app is quickly becoming unusable because it is so full of AI slop. In some categories, just about every post becomes AI really quickly.
Chris Nafis: Yeah, some friends told me there was a John Oliver episode on exactly that, so I watched it before this conversation. He has a whole episode on it and specifically points to Pinterest. They show a video of a woman complaining, saying, I just searched "garden" and almost every picture is not of a real garden, it's all AI, and you can tell because of this, this, and this. Right now I mostly feel like I can tell, but I think it's getting better and it will be harder to tell. The videos coming out now are crazy.
Derek Kubilus: Yeah, they have a kind of softness to them that can sometimes key me in. But the New York Times just had a little quiz, five videos, or maybe it was ten, and I got exactly five right and five wrong, which means I essentially wasn't picking up on anything. And think about where we came from three years ago and what it was capable of then, and go out three more years. The big essay that got everyone talking was called AI 2027, a big thought experiment about where AI could take us through 2027, with lots of very serious doomsday scenarios and things like that. It was very sensational.
Derek Kubilus: The possibilities are just wide open, and I think that's what makes a lot of people nervous right now.
Chris Nafis: Yeah, that's what makes me nervous, because it feels like we're on the cusp of a major revolution in how we discern what's true and what's not, with the videos and the pictures and all that, where we've already kind of been headed.
Derek Kubilus: Yeah, absolutely.
Chris Nafis: But now, I had my first real moment with it this week. I was on Twitter, and someone had shared a picture of an apartment complex in Ukraine that had been bombed by Russia, and it was the first time I was looking at something and thought, I don't know if this is real, and I can't tell. I think that's going to be pretty normal now. There have been pretty mainstream news stories where a broadcast network was either fooled or didn't care enough to check and shared videos or photos that were AI-generated. Or, Rachel, my wife, told me about a newspaper, I forget which one, that published a whole summer reading list of books you should read this summer, and the entire list was AI-generated. None of the books were real.
Derek Kubilus: I heard about that. They published it, yeah.
Chris Nafis: So there's that side of it. And then it feels like there's going to be a lot of money and power at stake, because it's going to change how we work. It feels like we're heading toward an industrial-revolution sort of situation, where everything changes very quickly and we're all figuring out how to live in this new world that has not only the internet but artificial intelligence running around.
Derek Kubilus: Yeah, absolutely. There's a really great book called The Gutenberg Parenthesis that I think puts all of this in context really well. If you haven't checked it out, it's a good book on its own, but it has a lot to say specifically about the revolutions that happen in the wake of new forms of communication and what they do to society. Sorry, you can probably hear my dog in the background. If you break it down, the main thesis of that book has to do with what happened with the Gutenberg printing press. The early Renaissance came out of this technology people suddenly had to disperse information more quickly and more widely than ever before. So not only do you have the Protestant Reformation, obviously, you have the Peasants' Revolt in Germany, the wars of religion in England and France, the English Civil War, and I think there's a case to be made that all of these can be linked in some way to the advent of the printing press. They had these things called pamphlet wars, where people in communities would constantly hand out pamphlets on different sides of an issue, and those pamphlet wars would eventually turn into real wars. And coincidentally, around this time you also see conspiracy theories start to pop up.
Derek Kubilus: With the advent of new forms of communication, you always get the spread of these conspiracy theories, which are almost like a virus that attaches itself to the way we communicate. This is where you see a lot of the antisemitism, the Protocols of the Elders of Zion, the shadowy cabals of secret people; that's where a lot of it has its start. And then when radio is invented, you see something very different but similar. Radio causes another kind of revolution, but it's much more centralized, because radio happens on bandwidths that can be easily controlled.
Derek Kubilus: You see revolutions in nations. So this is not instability between people and tribes and towns and villages; this is instability created between nations themselves. That's when you have the rise of radical nationalism, fascism, communism. Those are made possible by the airwaves, and that's what we really have until television starts to work out the kinks, and then you wind up with Walter Cronkite and Woodward and Bernstein, in an age of ethics in journalism where it's a really big deal to get your facts straight and everything is well organized. And then the internet and social media come along and blow that all up again with this very egalitarian kind of communication style, so you get the internal pressure and, as we've seen, the conspiracy theories. Ultimately you get QAnon and COVID conspiracies and those kinds of things, and then people start taking the moral implications of the technology more seriously.
Chris Nafis: Yeah. Part of it feels like the internet revolution is still in progress. It feels like it's been a long time for us. I don't know how old you are, Derek; I'm guessing you're about my age. I'm a 42-year-old millennial.
Chris Nafis: Yeah, so we grew up in the time before the internet. Facebook came out when I was in college, so we've kind of seen the whole thing. It feels like it's been forever in some ways, but in terms of generations and in terms of history, the internet is still pretty new, and now we're adding a whole other layer, because AI isn't a thing without the internet, I don't think. We're adding another layer on top of it.
Chris Nafis: And the internet is a mixed thing with social media too, because on the one hand, like you said, it's very egalitarian: everyone can say what they want and be heard and build a platform and a voice for themselves. You don't have to go through an editor or a newsroom or journalism school. But at the same time, that unfilters everything, which opens things up to lots of wild stuff. And now AI feels like it will be a huge generator of fake things, because it's not going to be communicating true things in the sense of facts, we can talk about what truth is, I guess, but it's probably going to be used almost exclusively for conversations with things that aren't real people. Do you see it being different? What do you see?
Derek Kubilus: Yeah. Number one, what's really good right now, and why conversations like this are good, is that we have a little bit of an interregnum. Before AI completely saturates our lives, we have this moment where we actually know it's coming, which is something I don't think we really had with social media in particular. We were just kind of like, oh, this is a new thing, and before we knew it we were all using it like crazy.
Derek Kubilus: I remember being in seminary, having to force myself not to check Facebook while we were studying for exams, because everyone would post their page count of how many pages they had yet to write and things like that. It was just on us before we knew what we were doing.
Derek Kubilus: And it took, I don't want to offend any of your listeners, the twin crises of a Trump presidency and a pandemic to open all of our eyes and say, whoa, there is some really nasty stuff this can be used for. Only now do I feel like we're starting to catch up with it. We're starting to intentionally teach children how to use social media appropriately, how to spot fake news, things like that. Hopefully we can get a jump start on AI. Hopefully we can start having these conversations and have a little better footing, because we see the wave coming, if that makes sense.
Chris Nafis: Totally, and that's part of why I wanted to talk to you. We could go on, and I feel like I've done this with friends and even with congregants at times, just going down doomsday lists of what happens next with AI.
Chris Nafis: What happens politically, what happens to the workforce, what happens to creative work, all of these things. And we don't really know. We can conjecture and guess, and it seems like whatever happens is going to be big. But the real question I'm interested in is, how do we respond to it? How do we prepare for what's coming? How do we sharpen our tools for discernment as we figure out the ethics and morals of a new technology? What do we need to prepare for as churches, or as followers of Jesus, in order to do that work well and look out for people?
Chris Nafis: For example, I think AI is already beginning to whittle away at some white-collar jobs, the way technology has been whittling away at blue-collar and low-wage jobs forever, but I think the whole job market is going to change pretty rapidly. So how do we prepare for that? We get so much of our sense of identity and value from the work we do. What happens when a lot of people suddenly have their work taken from them? How do we re-understand what it means to be human, and how do we come alongside people? Those are the kinds of things I'm looking to figure out. That's a million questions in one, I guess, but what do you see as the main tools we need as we enter this new season in the church? How do we work through these things in terms of ethics and practice?
Derek Kubilus: Well, I'm a United Methodist, so as a fellow Wesleyan, and I'm not sure if you guys use this term or not, we're really into something we call holy conferencing. Is that a Nazarene term?
Chris Nafis: I haven't heard any Nazarenes use that term, but I've heard it around; we went to a Methodist school together. But yeah, tell us what it is.
Derek Kubilus: It's all about the power and the grace that's found in conversation, for lack of a better word, and we Methodists talk about it a lot. The problem is that we can be really bad at it. A good example is the LGBTQ debate in our church; we just had a major split over it. Through that whole process we talked about holy conferencing a lot, but I don't think we ever did it. Does that make sense? What we did was shout our talking points at one another; what we didn't do was sit in a posture of listening to one another's stories. In United Methodism, my own denomination, what I would like is for us to call a whole conference just to talk, not with the goal of making hard-and-fast rules at the end of it. Sometimes people think, oh, if you're going to have a meeting, the meeting has to produce a statement or a piece of paper. I don't even think we need that. I just want to hear from people and have discussions. We could invite some academics to present papers, and we could invite people just to tell their stories. Where have you used AI? What have you found? What has been good? What has been bad? Is there someone who has lost their job because of AI? Is there someone who has a career in AI? What do the environmental sciences have to tell us about it? What's their story?
Derek Kubilus: It's tough, though, because I can already feel the polarization happening around this issue. Like I said, I have some friends who are all about it and some friends who are dead set against it, so those sides are already forming up. But what I'm interested in is this fuzzy middle. I'm interested in getting a lot of people together who have the humility to say, we don't know. We don't know how this will affect us, we don't know what the environmental impacts are going to be, or the social and economic impacts, but we want to try to figure out how to respond carefully and faithfully to it. So I don't have a great answer for you, other than that I hope churches and denominations have intentional conversations about how to move forward.
Derek Kubilus: And it was interesting: my own denomination's Board of Discipleship in the United Methodist Church just released a statement on social media, just into the ether, as I was preparing for this conversation. They talked pretty flippantly about AI and why you should use it, why it's a tool, and so on, and they said, we'll be sensitive to the environmental concerns and misinformation and all of that, but essentially, go ahead and use it. And I thought, well, I don't know, I'm not there yet. I need to have more conversations with people and figure it out. I know someone on social media who makes their living as an illustrator, who says, look, I'm staring down the barrel of losing my job. I went to art school, and AI has learned from everything I've ever made and dumped on the internet, and it can do what I do very easily. I think we need to hear those kinds of stories.
Chris Nafis: Yeah, for sure. And to talk about success stories too, maybe, of how people have distinguished themselves from this AI juggernaut. I don't know that those stories even exist yet. But my wife runs a small business.
Chris Nafis: She's a flower farmer and has built this whole business around Instagram and such. She could tell stories about how she's built up a sense of connection and authenticity around locally grown flowers in the face of mass-market flowers that are mostly shipped from South America or Asia to Southern California, which is a long way and environmentally unhelpful, all those kinds of things. There is space to carve out something better if you can figure out how to tell the story well and how to connect. I don't know if that's there for illustrators.
Chris Nafis: I know my brother, who edits this podcast for me, is feeling the squeeze in the video-editing and photography worlds. So it would probably be helpful for people in those fields to have an open forum to talk to each other: how do we, "fight" is maybe not even the right word, but how do we distinguish real art from generated art, and how do we tell that story to the rest of the world so that people can support human artists, or whatever the case might be?
Derek Kubilus: I know that was part of what the big Hollywood writers' strike was about a couple of years ago: they saw the writing on the wall, so to speak, that we're going to be out of a job if we don't do something to protect ourselves. Now, that wasn't so much a conversation as it was sticking up for themselves, but it goes to show that some people have lost their jobs already, and some people are staring down the tunnel at it. I have seen advertisements on television that I know were made with AI; I can just see it. I read copy online all the time that is obviously AI-generated. But here's the thing, and this is where it gets difficult. When we ask that question about jobs and AI, we should be aware that there is a deeper question lying behind it, which is: how do you feel, not just about AI taking people's jobs, but about capitalism?
Chris Nafis:Yeah, right, right behind it.
Derek Kubilus: If you're uncomfortable with a new, more efficient technology putting people out of work, whether you're talking about the car displacing the horse and buggy, or solar power displacing coal workers, or AI displacing coders and illustrators and filmmakers, then your problem is with capitalism, which is totally appropriate. My problem is with capitalism, which, other than war, I think is probably the most evil thing that's ever been invented. So I think we should be asking those questions. But what we need to keep in mind is that CEOs and shareholders are not going to be asking those kinds of questions. They want efficiency, they want higher profit margins, they want fewer HR issues to deal with. With their support behind it, you're not going to get rid of it; there are too many stakeholders in the profit possibilities, and you're not going to get them to limit it when it could potentially make so much more money.
Chris Nafis: So, and maybe, I mean... oh, go ahead.
Derek Kubilus: No, please, please.
Chris Nafis: This reminds me of a conversation I had with a guy, kind of a stranger, at a camp I was at, and I was just asking him what he does for a living.
Chris Nafis: This was several years ago. He works in agricultural engineering, and he told me this story, and you could tell he was still processing the thing he does for a living. He goes to these agricultural processing plants and installs machines that displace all these workers. So, say, he'll go to a factory somewhere in the Central Valley of California where there are twenty women whose whole job is to put stickers on peaches or avocados or whatever they're growing, working all day, slapping sticker after sticker after sticker. He'll install a machine, and all of a sudden the machine can do the work of all twenty of them. And their job is miserable, they get paid crap for it.
Derek Kubilus: But now it's the only thing they have.
Chris Nafis: It's the only thing they have, and now they're out of work. It hit me in a new way that someone is benefiting from that increased efficiency, but it's never going to be the workers or the lower class. It's always going to be the ones who own the capital and the technology. He and his company installing these things are going to make a lot of money on these agriculture machines; it's a very rudimentary example, but it's still happening. They're going to make a lot of money installing these things all over the place, and whoever owns the farm or the production line is going to benefit from the efficiency. It would be nice if we could all benefit from the efficiency, if no one had to put stickers on oranges but we could all live a little easier and work a few fewer hours, because there's less work to do, because we can do it more efficiently.
Chris Nafis: But that's not how it works in our current system of capitalism. People will say capitalism has made things much more efficient, that it's brought a lot of thriving and wealth to the world, all those kinds of things. But the problem is that when it's unchecked or unregulated, when it's just left to the free market, the wealth always funnels its way up, especially when a new technology makes things more efficient. Everyone on the bottom end ends up suffering, and the only real check is that the people who were slapping stickers on the oranges aren't going to be able to afford to buy as many oranges, which might affect the bottom line if it happens on a mass scale. But generally speaking, the people making the money off it don't really care. That's what you're talking about, right?
Derek Kubilus: Yeah, and not to put too fine a point on it, but what you're talking about is who owns the means of production.
Chris Nafis: Yes, right, yeah.
Derek Kubilus: Yes, exactly. It strikes me that one of the most depressing things I ever read was about a huge bill debated in Congress, I think in the 1960s, that was all about moving the federal government to a four-day work week, with the assumption that private industry would follow the government's lead, as it does with holidays and things like that. The argument was that we have much more efficient means of producing things that require less labor, so we should be inviting people to have more leisure time. We have washing machines now, rudimentary computers, things that are supposed to help us by taking the time we would spend on those tasks and letting us direct it elsewhere. And it turns out the answer is no: you keep working just as hard as you always have, if not harder, and you just produce more. So if you were designing a society with this advanced technology for yourself, you might say, well, sorry, this got very sci-fi very quickly, if I could have robots that do everything for me, then I would be free to do what I wanted, and we should just do that for everyone. We make robots that can fix other robots, and it would be great. Except that when it plays out in a capitalist world, only some people get the benefit of the robots. And that's what we're seeing. It's not like those farmers are saying to that woman, oh, great news, we have a machine that puts the stickers on the fruit, so we can just keep paying you and you can go chase your dreams. That's what we would want, but we live in a world where people don't yet know how to share.
Derek Kubilus: There's a really good book, and I can't remember the author, please forgive me, it's very Googleable. It's called Habits of the High-Tech Heart, and I think it was written in the early 2000s as a response to the internet. There was another, more classical virtue-ethics book called Habits of the Heart written many years before, and this one tried to apply its lessons to what was then known as "the internet age," which sounds so naive. But the idea is the classical ethics, if you want to talk about the fruit of the Spirit: peace, patience, kindness, gentleness, self-control, those things.
Derek Kubilus: The task in every technological age is to apply those things to the technology you're using, and that's the big question. How do we use AI in a generous manner? How do we use it in a kind manner? How do we use it gently? How do we use it in such a way that it allows us to be better stewards of the creation that's been handed to us, and not just absolutely destroy it, which is what we're on the cusp of doing? The amount of power these servers that run this AI require is just incredible. I think it's Google that is building a nuclear power plant just to serve its AI.
Chris Nafis: Microsoft too, I think, or maybe it was Microsoft. There's a nuclear power plant near us that's been decommissioned, and I think they're working on bringing it back online, and some of the motivation has to be the AI stuff. It's going to need an incredible amount of power if they scale this thing the way it seems like they're going to; that might actually be one of the few limiting factors that checks the technology a bit.
Chris Nafis: But in the meantime, I've heard stories of people who live near some of these server farms whose own homes are in and out of power.
Derek Kubilus: Really?
Chris Nafis: Yeah. And maybe that's all AI-generated nonsense, I don't know. But I just think there's going to be an environmental cost to it, in how we produce that power cleanly, and then, in terms of sharing it, all the wealth questions get mixed up with those questions too.
Derek Kubilus: So when I was using it to write those emails, what I was thinking to myself was, okay, I am using this right now to make peace with someone, and therefore I judged it to be okay in that circumstance. You know what I'm saying? Now, as humans, our ability to rationalize anything is really high, but I'm holding that up as an example: can we use this with an eye toward virtue? Because we know it's not going anywhere, and because we know it's getting more powerful and more ubiquitous every day, can we talk about how to use it responsibly? Otherwise, is it our moment to become Amish? The Amish kind of pick an arbitrary moment in time, for them it was 1790 or whatever in America, and say, okay, technology pretty much stops here, and we have to give the okay for anything new to come in.
Derek Kubilus: Are we just saying that for us that date is 2025, or are we going to learn how to integrate this technology into how we live as Christians? For me, that means learning how to use it virtuously, which I need to have more conversations in order to figure out.
Chris Nafis: Yeah, that's an interesting approach, and I really like it. Part of what comes to mind for me is that most people are not going to do that, you know? But it's kind of like a lot of Christian practices: the world is not going to suddenly turn on its virtue and its care for actual justice and its love for fellow humans, but we can, in the midst of it, maybe in some ways at least be some salt. That's not going to change everything; it's not going to resolve all the issues that are going to come up with AI, but at least we can find our own ethic and work out our own way of discerning.
Chris Nafis: Is this an appropriate use? For example, I don't think I want to use it in sermon writing. It just feels like the wrong thing to do, and maybe that changes over time, I don't know. Others might do it; in fact, someone told me today about an AI pastor somewhere in Europe, a whole congregation with an AI person doing all of the things, which is insane to me. It seems like one of those things that's going to get lots of attention, and that's probably why they're doing it, but I'm not interested in that. But where do we draw those lines? We can collectively discern for ourselves, and maybe other industries or other churches don't do that, but having those conversations can at least help us figure out what we do.
Derek Kubilus: Yeah, and here's the thing too. You talk about the AI pastor and how that seems almost wacky, and I agree, it does. But here's the thing about AI: the wacky things are already happening, and so, as silly and as stupid as it is, we need to have those conversations. There are people having parasocial relationships with AI. There are people treating AI as if it's a religious oracle or something like that. Yeah, that sounds wacky.
Derek Kubilus: If you had told me in 2015 that one day a majority of conservative Christians would believe that Joe Biden and Hillary Clinton were drinking the blood of children, I would have said that's wacky, but then it happened, and millions of people just got on board with that story. So, as silly as it is, I think we need to talk about those things, and perhaps we need to hear from the people who have had those relationships, or who go to... I can't stop thinking about the movie Priest, about a vampire hunter in the future who confesses to an AI in a booth, and it offers him absolution. There will be people who go down that route.
Chris Nafis: Because, to me, in talking about how we use AI responsibly, one half is what you were talking about before: all right, what is a good use, what is a worthwhile use, this is a peace-building activity. And the other thing to think about, for me, is: what is this doing to me? How is this affecting me as a person? One of the things that came out recently was the first study with brain scans of people who have been using ChatGPT regularly for their work, and one of the things it found was that it's deteriorating their ability to think critically, to write, to do the tasks themselves. Their skills in the areas they're handing off to ChatGPT are beginning to atrophy.
Chris Nafis: And so if I'm leaning on this thing, sometimes that might be okay, if I don't need a skill anymore, like the sticker-slapping skill, I don't need to put the thing on the oranges anymore because there's a machine that does it. But other skills matter to my sense of purpose in the world and my ability to navigate the world well and love others well, and that is being robbed from me because I'm essentially outsourcing it to AI. So how are these things shaping us as people? I think maybe that's part of my sermon-writing thing: sermon writing, for me, is a spiritual practice.
Chris Nafis: It's part of how I stay sharp and discern my way forward. If I hand that off to AI, the study shows I'm not even going to remember what I wrote an hour later. So how is that affecting me? You know what I mean?
Derek Kubilus: I have felt it myself. Every once in a while, maybe once a year, I'll take a vacation, and then we'll have a guest preacher, and then maybe a service like a church picnic, and before I know it, it's been three weeks since I've actually sat down to write a sermon. And when I go to do it, I don't quite have that edge. I'm not quite as incisive with my thoughts; I have to do a little more research and have a couple of false starts, things like that. Relying on AI, I think, would put me in that place all the time, where I would just lose my ability to write altogether. Some people may not think of writing that way, but I do, and I'm grateful for it. I don't do many things well, so the one thing I pride myself on, I want to keep doing. And it forms us in other ways too.
Derek Kubilus: Think about the way social media has changed our relationships. I remember, years ago, I had been mentoring a kid, hadn't seen him for a couple of years, and met up with him for coffee. He had broken up with his girlfriend of two or three years, and I said, well, how did she take it? Was she crying? And he said, oh, I don't know. And I said, what do you mean, you don't know? It was because it happened over text, and that was totally fine for him. Maybe I'm just a stodgy old man, but that's not fine. When you break off a relationship with someone, you should be there to see their tears, because that will elicit your compassion.
Derek Kubilus: Increasingly there are stories about people having what I would call parasocial relationships with AI, whether friendships or something in some way romantic, and as AI gets better, those things are going to become easier and easier for people to have. When you use Gemini or ChatGPT or whatever, you know how obsequious it is: oh, how was that? Is there anything else I can do for you? Is that okay? Imagine how obsequious an AI boyfriend or girlfriend would be. That's going to teach you to use people like tools to meet your own ends, instead of treating them as ends in and of themselves. An AI doesn't need anything from you, so you can't truly have a relationship with it. So yeah, I think personal formation should have a lot to do with how we think about it. How does it form us as people?
Chris Nafis: Yeah, and I think that's where we can take some lessons from the social media revolution, because, I think most of us agree at this point, social media has turned out to be a sort of hollow social experience. It has this illusion of connecting us to all of these people, but in the end we have an epidemic of loneliness, because no one actually feels that connection. We're doing it in a way that's not actually real. We're not seeing people in person, we're not sharing our whole selves with people; we're sharing a facade of what we want to present to the world.
Chris Nafis: Which is creating all the FOMO stuff, the fear-of-missing-out stuff, where you just see all these people on their best day with a super flattering picture that may be edited, and then you feel like you're less, because that's the relationship we have with everybody; that's what we see from everybody's feeds. Instead of actually having friendships where we're walking alongside people on good days and bad days and actually seeing them. Some of the trauma work we've been doing in our church lately has highlighted the somatic significance of being around other people, the smells and the heart rates and all the things that align us with one another when we're in person.
Chris Nafis: So social media has kind of shown us: all right, you can go down this path, and as a society it's going to make for loneliness and hollowness and a divided people. So maybe we should be more cautious stepping into all these AI things that are coming, AI therapists, AI relationships, leaning on AI for all of these things, because we can see some of those patterns re-emerging with the new technology. And again, I don't know exactly how we navigate that well or perfectly, but it does give me some pause. This tool seems so great; this AI thing never judges me when I share my darkest thoughts with it. But is that good?
Chris Nafis: I mean, maybe some of that judgment and shame is helpful sometimes, so I can know: all right, I don't want to feel bad about myself because someone's judging me, but I also should know that this is territory I shouldn't let my thoughts wander and dwell in too much, because it's bad for humanity if I'm always thinking about this violent image or whatever it is.
Derek Kubilus: Yeah, we supply boundaries for one another.
Derek Kubilus: And too often we have used our technology to try to escape those boundaries in various ways. I guess the most obvious place is something like internet pornography. That is about boundary transcendence. It's about imagining yourself in situations that normally no one would ever be okay with you being in; they would put a little stop to you and say, well, you're indulging yourself a bit too much. AI would have no reason not to give you what you want, and in fact it would have every reason to give you everything you want so that you will keep using it.
Derek Kubilus: It will be interesting also, because internet things do get worse over time. Number one, like you said, we figure out how hollow they are. But number two, once technological industries get everyone using everything, that's when they start to monetize it. I was really shocked when I sat down to watch a movie with my wife on Amazon Prime after several months of not watching it, to find that they had added commercials, and if I wanted the commercial-free experience, I would have to pay more, so now I'm less likely to watch something on Amazon Prime. Eventually, when AI is monetized, however that's going to look, it will give us that kind of gentle boundary. But we can't rely on that. We need to be thinking about it right now and having conversations about how to respond to it.
Chris Nafis: Yeah. Part of what's coming out for me in this conversation is that self-control is going to be a super important skill for us, or fruit, it's one of the fruits of the Spirit, an important task for us to take on, because in our church we have a lot of folks who struggle or have struggled with addiction, and these are all addiction-type things.
Derek Kubilus: Oh yeah, they're all dopamine hackers.
Chris Nafis: Yeah, for sure.
Chris Nafis: The pornography stuff is definitely an addiction, and I think even the social media stuff has addictive elements, because, like you said, there's no natural boundary unless we're able to control ourselves, which is really hard to do. It's very hard to have self-control all the time in all areas of life. It takes work and effort. It takes habit building, which is not an easy process: repetition and reminders and discipline, sometimes imposed from outside ourselves, so that in our moment of weakness we have something keeping us from going to the liquor store and getting the drink, or whatever it is, something that helps us stay disciplined. I don't feel like we're very good at that. How do we foster those things?
Chris Nafis: I don't mean to put you on the spot, but how do you foster self-control in congregants?
Derek Kubilus: Well, as a good Methodist, I'm going to say a big part of it is holy conferencing. The method in Methodism is meeting with a small group of people, hopefully every week, talking about how it's going, and asking questions like: what harm have you done, either to yourself or to someone else, and what good have you done? Where are your bad habits? Where are your good habits? How are those things going? And it's tough right now.
Derek Kubilus: I think in this society there are some good things happening along with some bad things. We've seen lots of examples of behavior modification done poorly, relying on shame and fear and anxiety, and we're starting to call those things out now. I see that especially among young people, who talk a lot about the way people in authority try to shame them into doing not necessarily the right thing, but the thing those people want them to do.
Derek Kubilus: So in light of that, we need to start growing new ways, rooted in community, of setting boundaries that we enter into willingly in order to consciously shape ourselves. Essentially, what that means in the church is you say: I am going to make myself accountable to this community for the way I show up in the world. I recognize that this community has something to teach me about how to walk this journey of life and how to build virtue, over and against other communities that might shape me toward making money, or succeeding in the corporate world, or finding the perfect mate, or having the perfect body. This community exists only for the sake of helping shape my soul, and so I am going to submit to it and to the process of being formed by it, even as I help form other people. That's about the most radical thing I think you can do in a society like ours.
Chris Nafis: It really is, and it's a hard thing to do. In our Tuesday night Bible study, we just finished a book we were reading through, First and Second Thessalonians with a commentary, and I was like, all right, what are we going to do next? Let's try out the Wesley questions. I'm sure you know them, Derek, but for those who don't: John Wesley was the founder of Methodism, we're in a Wesleyan denomination in the Church of the Nazarene, and he was a hugely influential theologian; you can look him up if you don't know who he is. Hopefully most people know who John Wesley is at this point. He had these questions he would ask his accountability groups, his small groups, that they would run through, and they're very intense questions: were you honest with who you are, or were you putting on a front? In other words, are you a hypocrite? I think that's the first question.
Chris Nafis: I didn't say it perfectly, but it was interesting to present these questions to our Tuesday night group, because there were definitely people in our group who seemed to be triggered by them, because they've had what you talked about at the beginning, that negative, shame-based reinforcement of: you are bad if you can't answer these questions well. So I think we've got to find ways to ask accountability-type questions of ourselves, but it's got to be voluntary, and it has to be asked in a way where we're trying to shape one another positively, not just bash people, not set up another sort of legalism, another "you're not good enough unless...", but a sort of coaching.
Chris Nafis: This is us trying together to get better, to do better, to be shaped in the ways we need to be shaped by one another and by the Holy Spirit. But it's hard. It's a hard sell on people these days. Do you find it different in your context?
Derek Kubilus: Oh no, it's very difficult. I will say, we call them class meetings, a lot of people call them small groups or accountability groups, and in my class meeting we have some rules that make it a little easier. We take that whole list of questions Wesley had, which he himself actually boiled down into three more general questions that we answer: What harm have you done? What good have you done? And the third one is more complicated: how have you attended upon the ordinances of God? Which means, what have your spiritual practices been like this week? So first we answer the biggest question, a very famous Methodist question: how goes it with your soul? That's just a general, how are things going with you? And then we answer those three more specific questions.
Derek Kubilus: But the rule is no one is allowed to comment on your answers. You can ask a question, but you have to ask permission to ask it, and the other person has to give you that permission. Or, if you're the one answering, you can say, what do you guys think about this? I want to hear from you. So your vulnerability is totally up to you, and that works really well in terms of getting people to share more deeply as time goes on, as they've built up trust and become convinced that no one really wants to shame them or posture in front of them or anything like that. It's not foolproof, but it works a little better for me, I guess. Does that make sense?
Chris Nafis: I like that. I actually wrote down the questions, because it happens to be Tuesday when we're recording this and we've been doing the questions the last several weeks, so I'll probably bring some of that to my group tonight in a few hours. I appreciate that; that's good.
Chris Nafis: I know this conversation started around the AI stuff, but this is where I wanted to get to: how do we intentionally shape our church life, our own individual lives, our spiritual lives, in order to be ready to handle whatever AI throws at us? Because we don't know all that's coming. And I do think some of these kinds of practices help, learning how to be intentional about who we are becoming, thinking about the different influences on us, and then being able to make decisions based on our sense of ethics and harm. So in some ways the answer is that the church needs to just be the church and do discipleship well, and of course that's not foolproof either.
Derek Kubilus: You used the word intentional, and it is about making intentional choices; that's the number one thing I'm walking away with from this conversation. If we're going to learn a lesson from social media, it's that most people just aren't intentional about the way they use it. They use it the way they like to use it, the way it appeals to them, but they don't intentionally make decisions around ethics and virtue and holiness when it comes to the technology. With AI, I hope we can get to a place where it's not that we reject it outright or embrace it fully, but that everyone makes choices about how to use it and where to use it.
Derek Kubilus: And we can argue about those choices, but ultimately respect them.
Chris Nafis: Yeah, I think that's great. And, as I think we've mentioned, that's not going to solve all the problems that are going to arise with AI. It's not going to save everybody's jobs or anything, but at least it gives us a path as Christians trying to navigate a tricky new world that's emerging, a way to navigate it well. And then the politics of it and all the other sides of it, maybe we can have some role of advocacy or whatever, but I think most of it's going to happen whatever we do. I feel very powerless in the face of the onslaught of AI that's coming our way. So you've got to let go of what you can't control, in some ways, and try to be faithful in the decisions we do have.
Derek Kubilus: You got it. You got it.
Chris Nafis: Well, I feel like we could talk about AI for hours, and there's a chance I may talk more about it in future episodes, but I really want to thank you for coming on, Derek. It's been really good to get to spend some time with you.
Derek Kubilus: Yeah, it's been fantastic. Thank you for asking me.
Chris Nafis: Thanks for doing it. Hope you have a great evening over there. Any final thoughts? Anything you want to send us out with, or have the last word on?
Derek Kubilus: No, I just think, I'm scared too. I think it's okay to be scared, because there are a lot of scenarios that could go very poorly for a lot of people. But what gives me comfort is the fact that I'm part of a community that not only offers me support but offers me guidance on how to deal with things that seem very new. The church hasn't seen AI before, but the church has seen a lot of new things over the last 2,000 years, and we've come up with some pretty good wisdom for how to deal with them. We've talked about a lot of that here today, and that gives me a lot of comfort.
Chris Nafis: Yeah. Well, thank you, Derek, I appreciate it. That's a great final word. For those of you listening, thank you for giving us a little more of your time. Don't forget to subscribe if you haven't, or share this with someone who's thinking about AI, and we'll catch you next time. Thanks, Derek. Thanks, man.