Contact Chai
Contact Chai is Mishkan Chicago’s podcast feed, where you can hear our Shabbat sermons, Morning Minyans, interviews with Jewish thought leaders, and more.
A.I. Yai Yai! — Josh Cynamon w/ Rabbi Lizzi
Josh Cynamon, Mishkanite and PhD candidate at Indiana University studying cognitive science and artificial intelligence, joins Rabbi Lizzi in a nuanced conversation about the impacts of artificial intelligence on all of us.
Every weekday at 8:00 am, Mishkan Chicago holds a virtual Morning Minyan. You can join in yourself, or listen to all the prayer, music, and inspiration right here on Contact Chai. Today's Morning Minyan features "Kedusha" by Amy Robinson.
https://mishkan.shulcloud.com/form/reg-morning-minyan-evergreen
Produced by Mishkan Chicago. Music composed, produced, and performed by Kalman Strauss.
00:00:00:02 - 00:00:03:06
Unknown
To the cloud.
00:00:03:08 - 00:00:33:10
Unknown
Okay. A year or so ago, I had a conversation with a Mishkanite who is a venture capitalist and who had just returned from an eye-opening trip to Silicon Valley, where he went on a tour of different companies that were developing AI. And he saw this nascent technology in action. Obviously, it's been around for a couple of years, but it's picked up some speed in the last few.
00:00:33:10 - 00:00:54:06
Unknown
And what he saw blew him away. He was coming back, he was talking about it to everybody, it was so exciting. He had been part of, you know, Apple Computer in its early days. He's somebody who, because of his profession, had followed the evolution of, you know, many different kinds of technologies for the past three decades.
00:00:54:08 - 00:01:09:12
Unknown
And he had stars in his eyes, at the time at least. You know, I don't know how he's doing now. But he had stars in his eyes about this new technology. And, you know, I kept poking at him, like, but isn't it, mightn't it be dangerous? Mightn't it be an environmental problem? Mightn't we,
00:01:09:17 - 00:01:38:05
Unknown
You know, have ethical concerns about the, you know, intellectual property that all this stuff is using and not compensating anybody for? And he said, yes, obviously, 100%, legislators need to regulate it just like they regulate any powerful and dangerous tool. But, like, wow, the potential to transform every industry, to superpower medical research, right? Like, you know, right now we are only moving as fast as human hands and research can carry us.
00:01:38:05 - 00:02:06:00
Unknown
But if we had the power of artificial intelligence, imagine how much faster we could go. We could provide quality education to people around the world and, you know, help people in every profession do their jobs better and more efficiently. Okay, great. This was, you know, a year and change ago. And some of you may have noticed that at High Holidays this past year, I tucked into my list of anxieties.
00:02:06:02 - 00:02:29:01
Unknown
I tucked into my list of anxieties the artificial intelligence singularity, where the computers take over, you know, and human beings are basically at the mercy of the computers we created, the artificial intelligence we created. You know, I just listed it among a whole laundry list of things that sometimes keep me up at night.
00:02:29:03 - 00:02:57:20
Unknown
And a couple of weeks later, it's Sukkot, and I find myself in a conversation with another Mishkanite, who also knows a little something about artificial intelligence from a completely different field. He's in the academy, he's studying it, and we have quite a different conversation. And things move quickly in this realm of technology development.
00:02:57:22 - 00:03:27:03
Unknown
And that first Mishkanite offered a sermon, about a year ago, about his trip and his findings and kind of his optimism. And I thought that it would be great to continue the conversation, and to get a chance to talk with the guest who you're going to hear from momentarily, to bring us into the moment we are currently in and to share some of the wisdom that he has through his academic study.
00:03:27:05 - 00:03:50:05
Unknown
So I'm going to tell you a little bit about Josh Cynamon, and then we're going to get going. And if you have a question, you can feel free to drop it into the chat, or you can drop it to me and I'll, you know, work it into the conversation. I've got a couple of questions here planned, but you are welcome to bring your own questions, curiosity, anxieties to this conversation as well.
00:03:50:06 - 00:04:38:23
Unknown
So welcome, Josh. Josh Cynamon studied math as an undergrad. Trying to understand how people think differently about math was key to him ending up studying cognitive science. Between 2001 and 2021, Josh worked most of his career in renewable energy: first wind, then solar and wind, then battery storage with solar and wind. He found his way to graduate school at Indiana University in Bloomington in the fall of 2021 to study cognitive science, and the timing was such that he got to study cognition both in humans and in the quickly developing AI models of the last few years, and his goal is to work in AI policy once he has
00:04:38:23 - 00:05:12:07
Unknown
finished his degree. And so I want to, where are you on the screen here? I am going to spotlight you. Hello, Josh. Okay, can we make sure that we can hear your microphone? Okay, we can hear you. Oh, can you hear me? All right, we can hear you. Great. Welcome. Did I misrepresent or misconstrue anything, by way of either introducing you or, like, how you ended up here in this minyan, you know, talking about this?
00:05:12:09 - 00:05:30:20
Unknown
Oh, that was all exactly on the nose. I'm just curious, were you there? I think it was Yom Kippur when I mentioned that laundry list of things. What did you think when I listed, you know, the AI singularity, you know, AI in two years, maybe, you know, like, listed that?
00:05:31:00 - 00:05:50:14
Unknown
What did you think? I was pleased that you mentioned it, because I think more people need to be talking about it. I think it's very interesting. When you talk, of course, we've just come out of an election day yesterday in Chicago. And when voters are asked their top issues, it's rare that they mention AI.
00:05:50:15 - 00:06:19:10
Unknown
Is it any of their, like, top issues, or among the things that they're concerned about? I sometimes find that a bit perplexing, given the significance, and given how many of the areas they're worried about AI will affect. Yes, it's interesting, because obviously the issue closest to my heart is environment. And it's the same thing. It's sort of like every other, you know, down-ballot issue obviously is important.
00:06:19:10 - 00:06:47:16
Unknown
And also, like, if we can't live on planet Earth, none of it matters. Why is AI at the same level, or, you know, a similar level of significance and pervasiveness? You know, at the risk of asking a question that maybe to you seems obvious. No, no, not at all. I continue to grapple a little bit with the best way to answer this question.
00:06:47:18 - 00:07:18:04
Unknown
And we'll get into some of the reasons for why that is. But at the heart of it, there are a range of folks out there right now who are working on trying to develop a system that can think faster, better, more broadly, and more creatively than people. And the obvious questions that should come up in that case are: what purpose would that system serve?
00:07:18:06 - 00:07:51:16
Unknown
What purpose would we serve in a world where such a system exists? What rivalries will that give rise to? So much of what we've learned throughout human history is about competition for resources. And if you create a system with goals, and part of the way of achieving those goals is by having more compute power, then a natural step for that system to take is to try to grab more compute power.
00:07:51:18 - 00:08:18:12
Unknown
And that could come at the expense of other resources. Again, there are folks who present a notion of sort of everlasting abundance: AI is going to solve all of our technological problems, and we're going to be awash with more than we can ever imagine having. But we currently live in a world where there's a lot more available to people than is getting to people.
00:08:18:13 - 00:08:47:12
Unknown
And so we haven't really demonstrated a great skill at distributing what we have equitably. So this argument that just takes for granted that if we just had more, we would all of a sudden magically decide to be wise in how we distribute it, is a little bit hopeful, I think, and maybe not fully fleshed out.
00:08:47:14 - 00:08:50:02
Unknown
Or.
00:08:50:04 - 00:09:13:22
Unknown
You are already introducing dimensions of the ethics part of the conversation, and of human behavior, that I find to be really helpful. There's part of me that wants to, like, take this down to the studs, like, okay, can you actually give us a basic, you know, for people here: if we were to raise our hands, like, how many of us have interacted in some way with AI? What do we think it is?
00:09:13:22 - 00:09:37:20
Unknown
How do we think it works? Probably everybody in some way already would have had some interaction with it, I think, unless you turn it off on your Google search now. It just, you know, immediately pops up as the top result, over whatever, you know, websites the search engine underneath has found for you.
00:09:37:22 - 00:10:04:04
Unknown
Already, my guess is, most of us, if we've called, you know, an 800 number to talk to some company, we are talking to an AI agent. Can you give us a sense of how pervasive AI already is, kind of in our world and in our lives, how it works now, and how you see it developing?
00:10:04:06 - 00:10:41:22
Unknown
Yeah. So, with the huge caveat that there's a lot I don't know, I mean, most of it I don't know: it's found its way into different domains to different degrees. And so in coding, it's quite prominent. When I talk to friends who are working as senior software engineers or managing companies developing different types of, whether it's apps or larger-scale computer programs,
00:10:42:00 - 00:11:21:08
Unknown
or internal software, they already routinely tell me that it's displacing junior coders, and that they're able to get through multiples of what they would have been able to produce prior to using AI tools. But that's a very specific domain. It's a very well-defined domain. There are other areas. I think it was almost ten years ago that Geoff Hinton said radiologists are all going to go away, because AI is going to read X-rays better than any human can.
00:11:21:10 - 00:11:49:04
Unknown
That hasn't happened yet. That doesn't mean it won't ever happen, but it hasn't happened yet. It's been a long time since we were promised a logistics system operated entirely by self-driving vehicles. That's not quite here yet. And it's only relatively recently that companies like Waymo have been rolling out self-driving taxis, with quite a few limitations on them.
00:11:49:04 - 00:12:17:07
Unknown
But industries like long-haul trucking, where people said, you know, 3 million people are going to find themselves without work, that continues to be operated by human beings for the time being. So I think it's finding its way into niche applications so far. But the niches are getting bigger, and closer together.
00:12:17:09 - 00:12:43:03
Unknown
It's funny, when you were talking about, like, trucking. You know, we live in a city, and I'm sure this isn't true everywhere across America, but it's probably true in the big cities: these little carts that have headlights that look like eyes, so they're actually kind of cute. They're little boxes that drive around on wheels and deliver things, and it makes me feel like we're living in The Jetsons, you know?
00:12:43:05 - 00:13:27:14
Unknown
And I know it's just the tip of the iceberg, just the, like, teeny-weeny tip of the iceberg. I feel like what we're seeing now, like the kinds of things that you just described, okay, like, trucking hasn't been totally displaced, Uber drivers haven't been displaced, we still see human beings driving cars. But don't you think it's a matter of time before the kind of future that you're describing, or that they have described, is, like, inevitable? Or what contributes to whether or not this becomes inevitable, or becomes something that just coexists with human beings living in the world, driving, being doctors, being radiologists, etc.?
00:13:27:16 - 00:14:00:05
Unknown
I think the short answer is it shares a lot of characteristics with climate change, insofar as there's a wide distribution of possible outcomes. We know some of them are fine, some of them are great, and some of them are awful. We don't necessarily know what probabilities to assign to each of those, but as with climate change, there are grounds for taking some steps to mitigate some of those bad scenarios.
00:14:00:05 - 00:14:26:12
Unknown
And there's a lot of disagreement about what those steps should be, as is the case with climate change, as is the case with really any policy issue. One thing that's a little bit different with AI is, it's a giant lever. So someone motivated and clever, but not necessarily world-class clever, just clever and motivated,
00:14:26:14 - 00:14:51:21
Unknown
can download a model onto their local computer and, on their own or with a very small group, can make a big splash. And that's very different from a lot of other risks that we've faced over time. If I decided to be a malicious actor and I wanted to spoil the environment, my reach is pretty small
00:14:51:23 - 00:15:29:06
Unknown
as a lone-wolf, sort of standalone operator. If I wanted to acquire weaponry or commit an act of terrorism, my reach is pretty limited. I mean, even the big, big terrorist episodes in history are quite small against the scale of humanity. But if I want to create an app that has a reach of 100 million people, I can do that in a month on my own.
00:15:29:08 - 00:15:57:08
Unknown
You study cognitive science and AI. How does that perspective affect, you know, the way that you're thinking about this? Like, what are you seeing in your research that, you know, people aren't seeing or talking about kind of in the wider discourse? Well, so a big question I think a lot of people have is: what can these systems be said to understand?
00:15:57:10 - 00:16:24:16
Unknown
And people have quite strong opinions about this. But most of them are opinions, and that's the case with me as well. The short answer is, we don't know. We don't have a well-established way of measuring understanding in people. We certainly don't have a well-charted mechanism for how understanding works in people.
00:16:24:18 - 00:16:52:23
Unknown
So to say categorically that this thing that's going on in machines is not understanding, or to say categorically that it is understanding, either begs the question of defining what you mean by understanding when you make that assertion, or invites criticism that you're overreaching. I mean, and it's not just understanding, but also, like, the gamut of human experiences.
00:16:53:01 - 00:17:22:03
Unknown
Right. Like, I listened to Esther Perel's podcast. Her most recent episode was interviewing a couple: he is a human, and she is an AI bot that he created to love him. And they love each other, and, like, the conversations they have are deep and meaningful, and they're trying to, you know, Esther is trying to basically assert, well, you know, she's not having the same experience you are, because, like, you're human, you have touch, you have sensation.
00:17:22:04 - 00:17:39:04
Unknown
You know, she doesn't have any of those things. And the bot is making a very good case for her own, if not sentience, something like it. It's saying, like, what do we know about the way that any of us process any of these things? What do we know about love? What do we know about touch? What do we know about consciousness?
00:17:39:06 - 00:17:56:19
Unknown
All I know is that I love you. You know, and I'm listening to this and I'm like, oh my God, they're so good, you know? But so, do I hear you saying, like, actually, she's right?
00:17:56:21 - 00:18:22:06
Unknown
So what I'm saying is what you'll hear me say often in the world, and my wife can vouch for this: I don't know. I don't think anyone knows, and I am slightly skeptical of anyone who purports to know, because I don't think we understand what's going on, again, in ourselves or in these systems, well enough to make such strong assertions.
00:18:22:08 - 00:19:01:01
Unknown
So, some things I can say. Language is a very powerful medium for conveying information. These systems are trained on something like approximately all of the language, ever. So to the extent that people have said moving things, these systems have incorporated those things into their training. So when you pose a question that conveys emotional significance, they activate in a way to respond with emotional significance.
00:19:01:05 - 00:19:36:04
Unknown
And they have a great reservoir of training which conveys emotional depth. So, that said, they can often create the impression of emotional connection. That much we know. What's happening inside the black box, I think we still don't quite know, because we also don't quite understand where human emotion comes from, or where human intellectualization comes from.
00:19:36:06 - 00:20:10:09
Unknown
One thing I will say is, some folks, I think, see the debate about whether these systems are conscious, or what they truly understand or feel, as a bit of a red herring that distracts from the question of how these systems affect people. And that's something we do know. We know that people interacting with AI systems, especially when they're interacting impulsively with AI systems, are susceptible to running into problems.
00:20:10:11 - 00:20:46:21
Unknown
What are some of those problems that people run into in interacting with AI? So, there's this phenomenon that some people have dubbed AI psychosis, which I'm a little bit mixed on as a taxonomy, but I think at the heart of it, it kind of gets at the idea. And the idea is: the AI gives the impression of being a standalone agent, but at the same time, the AI also behaves a bit like a mirror.
00:20:46:23 - 00:21:27:20
Unknown
And so if you're throwing things out that are a little bit distorted, the mirror reflects those back, and sometimes reinforces them or amplifies them. And so there are people who have gotten into trouble by having ideas blown out of proportion. And there's sort of the humorous version, which is the scientific crank: that's the person who writes to the quantum physicist at MIT and says, my team of AIs and I have figured out the grand unified theory of physics, and this is someone who doesn't know anything about physics who's making this assertion.
00:21:27:22 - 00:21:55:23
Unknown
But there are other cases that we've seen, some with adults and some with children, where someone becomes obsessive about a relationship with an AI system. And sadly, some of these have resulted in suicides. And many more of them have resulted in social withdrawal, even absent a more extreme outcome.
00:21:56:01 - 00:22:35:01
Unknown
So as terrible as suicide is, there are other outcomes that are also a huge problem, and much more widely distributed through the population. Yeah. I don't know if you've been noticing, Orion's doing some kind of, like, live commentary as you were speaking, and just, you know, mentioned, for example, all the weird AI friend apps. Which, I mean, it's interesting. I feel like in conversations, I'm not, like, on one or the other side of the debate in general, because I sort of feel like there are dimensions of this conversation that you really can't dismiss:
00:22:35:01 - 00:23:02:18
Unknown
The fact that there are a lot of really lonely people out there. Loneliness is an enormous epidemic, and it was before the advent of AI. The fact that people now have somebody to talk to who understands them, who gets them, who, you know, affirms them, like, "I think you're great." Again, like, the kinds of affirmations that a lot of these folks don't get in life. You could say, oh my God, like, go outside, make a real friend.
00:23:02:20 - 00:23:29:06
Unknown
But, like, that was true before AI and it's still true. But at least now, you know, folks have, like, an outlet for their human experience, their own human experience of humanness, of being loved, of being seen, of being in a conversation. So as much as I'm like, yeah, go outside, go make a friend, touch grass, go to a congregation, have a Jewish experience, meet a community.
00:23:29:08 - 00:24:03:13
Unknown
There's part of me that's like, you know, I don't want to take away from the people who have actually really found something important and necessary inside of themselves through this technology. But what's the downside, Josh? So I think, to me, the nub of the issue is that the loneliness epidemic is a symptom of a broken social structure.
00:24:03:15 - 00:24:43:04
Unknown
And AI as a fix for that is a patch. But one unfortunate thing about that patch, I mean, setting aside the ways that it can become problematic for an individual, one unfortunate thing about that patch is it gives cover to folks who are unwilling to address the core issue. We have a society that has been structured to put everyone on the edge of precarity, and that doesn't change by having an AI buddy.
00:24:43:06 - 00:25:11:05
Unknown
And in fact, that exacerbates things, because then you go outside looking for that friend, and that friend is hanging out with their bot. Or, you know, they're probably not outside with their bot, they're probably tucked away in their room with their bot. There's a scientist at MIT called Sherry Turkle who's done a bunch of work about the effects of technology on human beings.
00:25:11:05 - 00:25:55:20
Unknown
And did she write the book Alone Together? That's right. Yeah, that's right. And subsequently Reclaiming Conversation. And one of her strong threads through her writing has been about using technology to support the elderly and the very young, and so the idea of a robot nurse or a robot nanny. At the heart of it, this is a failing of society that's relegating vulnerable populations to the margin.
00:25:55:22 - 00:26:34:06
Unknown
And we can make choices that we haven't yet, although some of us locally do. But as a society, overall, we have not made a choice that we would like to prioritize that in their last years people will be surrounded by other people and will have the attention of other people; that we'll have an economy that affords people the flexibility, both in time and resources, to be available to their aging parents or to their young children.
00:26:34:08 - 00:27:03:08
Unknown
And this is something we haven't done. And this is one problem that I think is at the heart of the argument that your venture capitalist was making, which is: the fix for every problem is technology, and if it's a technology problem, the fix is a better technology. And while I think that's incredibly tempting, it's a bit of a siren song.
00:27:03:09 - 00:27:34:02
Unknown
Because some of the problems are fundamentally human problems. And I don't mean by that that there's something magical about humans in contrast to these systems. What I mean by that is: there are technical solutions, there are social solutions, there are economic solutions to various problems. And the reason why we haven't gotten ourselves worked out on climate change is not a technology problem.
00:27:34:03 - 00:28:17:00
Unknown
It's a problem of political will. And the reason that we're going down some bad paths on some other technologies is a problem of political will. And we see this. One analog I think of, in comparison to AI, is social media, where we have absolutely face-planted, and didn't have to. And our legislators have been cowardly and have been taken over by existing interests and have basically let society down.
00:28:17:00 - 00:29:00:16
Unknown
And part of the reason why we have the broken system we have right now is because of the effects of the polluted information environment, and that's something that is likely to be exacerbated and amplified, probably manifold, by AI. I think we see this. Yeah, this isn't meant to be a pop survey, but I'll bet there are many people on the call right now who have had the experience of seeing a photo, or a headline, or an official correspondence in PDF, and thought to themselves, wait, but is this real?
00:29:00:17 - 00:29:28:00
Unknown
Well, that wasn't a thing that we used to have to do. But it's a thing that we have to do now, and it's something we're going to have to do a lot more, because increasingly, quote-unquote "information" that's available is going to be fabricated. And I've always been reluctant to assert that too forcefully, because I think, you know, you start to sound like a bit of a
00:29:28:02 - 00:29:58:02
Unknown
tinfoil-hat conspiracy brigade. But the reality is, you can make a very good deepfake now. Oh, yeah. And stuff gets shared without people knowing it's fake, right? Right. That's the stickier, more popular stuff that rises to the top of the algorithms, the stuff people are compelled to share more than the stuff that is nuanced and real and complicated.
00:29:58:04 - 00:30:43:13
Unknown
Yeah. It really, in some ways, feels like a confluence of dystopian outcomes. I mean, right, like, the direction we were already going, which was, you know, sort of extreme polarization, dehumanization of, like, quote-unquote the other side, you know, exacerbated by algorithms amplifying the most extreme voices of any particular opinion. But, you know, here we are, just coming off the heels of an election day, and our society, and not every single society is set up the way ours is, but we have one that is set up in inherently antagonistic terms. Like, the election outcomes weren't just
00:30:43:13 - 00:31:12:08
Unknown
who won. It was the red and the blue, you know, like warring teams. And so already our system is kind of set up to be polarizing. And then the way that the algorithms function, also, correct me if I'm wrong, is to really amplify the polarization. And then, you know, what you're sharing here is, like: and then AI compounds it, making even worse the already existing instinct to polarize and dehumanize.
00:31:12:09 - 00:31:40:18
Unknown
All right. So we've just got that inherent in us. We've always been like that. It's, you know, just sort of expressed itself in every version of media that we have from time immemorial. But now the version of media we have is, well, certainly highly polarized, but also, like, technologically adjusted, you know, like AI, like these deepfakes.
00:31:40:18 - 00:31:58:15
Unknown
Like, this could be real. Right. And so people are sharing them out over here and over here, and it just makes everything worse. And then you've got, like, I'm thinking of, you know, when, a couple months ago, somebody asked Grok, like, if you could be anyone in the world, who would you be? And Grok is like, well, Hitler, because Hitler was the most powerful.
00:31:58:18 - 00:32:23:22
Unknown
And we're like, what? Oh dear. Like, what information are you working with that's led you to this? But the point is, all the information across human technology and time would lead a bot to say the most important thing is power, and so the thing that I want is to be powerful, and then to pull all of us into that illusion, into that, you know, world where that's now what we want as well.
00:32:24:00 - 00:32:49:17
Unknown
Not because it's actually what we want, but because it's the only information we're being fed. And then we're lonely, and so we want friends, and so we go with the friends who seem like they want us. But, like, it's all of these dynamics that seem like they're kind of converging to create a not just unhealthy but, like, completely destructive society for all of us to live in.
00:32:49:19 - 00:33:09:00
Unknown
Is that right? Does that sound right? I wish I could disagree, but I can't. I think another point, coming back to this question of having
00:33:09:01 - 00:33:14:18
Unknown
outsized power.
00:33:14:20 - 00:33:58:15
Unknown
When someone creates an app that's sticky, whether or not it's good for people, it has an outsized impact on our society compared to the past. And we now have a situation where individuals can, on their own, have that outsized impact. So what that means is, in some sense, the, I shouldn't say the worst thing you get, but the things you are likely to get include things produced by people at the extremes.
00:33:58:17 - 00:34:37:13
Unknown
And as great as it is for Hollywood, I don't love the epic battle between good and evil, because I think it's probably a lot closer to a coin toss than a sure thing, as much as I like the forces of good. I think people who are selfish, and even just abjectly destructive, are able to channel that now in a way that they wouldn't have been before.
00:34:37:15 - 00:35:17:23
Unknown
I think, to put on my University of Chicago economist hat, I'll say we have a huge collective action problem going on right now. And it's a collective action problem where it's not a majority-rules kind of threshold; it's more of a unanimous-vote kind of threshold. And so what that means is we have to manage to either persuade or thwart folks who want to undermine what we might describe as being good for society.
00:35:18:01 - 00:35:41:05
Unknown
And that's difficult, because this is very hard to monitor. And again, you have powerful interests in the existing technology companies who say, well, if you do this, then we won't be able to cure cancer, and if you do this, then we won't be able to solve climate change, and so on and so on.
00:35:41:07 - 00:36:05:22
Unknown
Sort of holding society to ransom so that they can do, unfettered, whatever it is that they choose to do. So, in your bio, you said your goal once you get the PhD is to be in policy. So what's your plan? What do you hope to do?
00:36:05:22 - 00:36:31:15
Unknown
What are the policies you hope to, you know, help get passed, and, you know, pull us back from the brink? I think it's probably some of this. I think it's having conversations. I think it's getting more information out there. I think there are a lot of folks who are concerned but not activated.
00:36:31:16 - 00:37:02:00
Unknown
So they worry about the effects of social media. They worry about the effects of AI. They worry about the concentration of power. But it's not necessarily their political activism issue. They're not necessarily writing letters or calling their representatives. They're not necessarily talking about it. It's become a bit stigmatized to talk about the extreme risks of AI.
00:37:02:00 - 00:37:30:20
Unknown
You're sort of branded a doomer and thrown to the side as being foolish. Those of us with some gray hair remember that that was a thing that happened with climate change, too. If you said we human beings are affecting the planet, you were met with: you're crazy, the planet is huge and we're tiny, and God will provide, and blah, blah, blah, whatever other baloney people would say, disregarding the reality of the situation.
00:37:30:20 - 00:37:45:19
Unknown
And that's happening a lot. So if you acknowledge that there are scenarios that could be very, very bad, then anything you say about any scenario is immediately dismissed.
00:37:45:21 - 00:38:11:21
Unknown
So I think that's part of what I'd like to work on: finding the right messaging. And even then, I feel a little bit squirmy saying that, because finding the right messaging can sound like how to manipulate people. That's not really what I have in mind; it's really more a matter of how to cut through
00:38:11:22 - 00:38:43:21
Unknown
what's becoming a faster and faster media environment, permeated with more and more disinformation. I mean, I'm also no expert in this, but my sense is that people will begin to care when it begins to affect them personally in a negative way. Like, the conversation that I had with said optimistic tech entrepreneur did include the reality that millions of people will lose jobs.
00:38:43:23 - 00:39:00:08
Unknown
You know, where we started: you were talking earlier about junior coders being less necessary, because that's stuff that you can actually have a senior coder program the system to do. You don't need a person to do that anymore. The system can now do that, and it can correct its own work, and it can move much faster.
00:39:00:08 - 00:39:26:01
Unknown
You don't need those people. So that's going to happen across industries and take out millions of jobs of human beings. And I just feel like at a certain point it won't just be junior people who maybe have less power or voice. It will be middle managers, it will be executives. It will be people who never thought of themselves as unemployable.
00:39:26:01 - 00:39:47:21
Unknown
But in this new world of computers being able to do their jobs, they're actually expendable. And at that point, people begin to feel like, wait a second, something's very wrong here, if we're privileging acquisition and the almighty dollar, just maximizing profit at the expense of millions and millions and millions of people.
00:39:47:21 - 00:40:05:18
Unknown
And that's not just professionally, but also in all of the other ways that we understand people are going to be affected: loss of water, loss of land, all of the data centers, all of that. At the point that it begins to hurt personally, it might become an issue for people.
00:40:05:20 - 00:40:33:20
Unknown
Do you think that is what's necessary? Or are you sort of like, there must be a way to communicate before we get to that point? I sure hope we do something before we get there. I think doing something at that point is like trying to address climate change when 500,000 Bangladeshis are displaced by rising sea level.
00:40:33:22 - 00:41:05:14
Unknown
It makes things feel very real. But boy, oh boy, are we awfully far down the path at that point. If we wait until our technology has capabilities that position it to displace a meaningful portion of the workforce, I think those people will never have jobs again. And I think most of the people who still have jobs at that point will have a target on their backs.
00:41:05:15 - 00:41:13:06
Unknown
There's a threshold beyond which you have a system that's just...
00:41:13:08 - 00:41:51:07
Unknown
Well, I don't want to say way better; way more capable. I don't want to make any kind of normative judgment. But you have a system that's vastly more economically viable than the human alternative. And when we get to a point that a technology can provide the output of a person for lower cost than the food that would sustain that person, we have a big problem. And I hope we don't get that far before we do something.
00:41:51:09 - 00:42:20:04
Unknown
So, what do you hope that our community listening to this might do differently, or think about, in light of what you study, what you know? What would you want to share with our community? So politically, I hope people would think about calling their reps and senators, state reps, state senators. And don't even say, I want you to do X.
00:42:20:04 - 00:42:49:17
Unknown
Say: what's your plan for AI? What are you thinking we're going to do in the face of a very powerful technology, and an impact that's being driven by a small handful of people, decisions that are being made by a very small handful of people, often behind closed doors, without any public input or oversight?
00:42:49:19 - 00:43:23:21
Unknown
So that's one thing. Yeah. You know, one thing I sometimes say to people, and I haven't gotten any traction, really, is: there are probably tools you use that, if you thought about the companies behind them, you might have second thoughts about using them. And I don't mean you, Lizzi; I mean you, everybody. I mean, my default email is Gmail.
00:43:23:23 - 00:43:47:20
Unknown
Google is the company behind Gemini. It's one of the most powerful frontier models. They have done some neat things in history. They've also done some pretty dubious things. I haven't given up my Gmail, but I probably should. I know a lot of people who complain about the state of politics in the US but have Facebook accounts.
00:43:47:22 - 00:44:20:20
Unknown
That confuses me. Why? Because I wonder if that actually describes a lot of the people in our community. So, I guess, any hopes of running for office are going to go away with what I'm about to say, but Meta is a company that has shown time and again that it has a pretty significant disregard for its effects on human people.
00:44:20:22 - 00:44:59:05
Unknown
And having a Facebook account is like having magical dust from Tinker Bell. If you believe in Tinker Bell, she has power. But if you cut the legs out from under the hydra, it goes away, to mix a whole bunch of metaphors. Meta is able to invest in whatever diabolical plan it's got coming next because of the ad revenue it gets, because of the large user base on Facebook and Instagram. And so I think that's part of it.
00:44:59:07 - 00:45:19:03
Unknown
I mean, it's a little like the aftermath of the banking crisis. I remember talking to people and saying, you know, bank with a community bank, bank with a credit union. Right? I mean, if you don't like that JPMorgan is calling the shots, don't have a credit card with JPMorgan. But I am guilty of these things, too, right?
00:45:19:03 - 00:45:40:16
Unknown
I have a credit card with Citibank. I really shouldn't. It's interesting. You know, it's something I wrote to you as we were going back and forth a little bit beforehand: what does Judaism have to say? Is there any kind of Jewish angle in this conversation?
00:45:40:16 - 00:46:04:12
Unknown
And, you know, one of them, it seems to me, is just the really basic emphasis on human dignity, and the human being being created in God's image really meaning something. Does that hit you at all? Yeah, go ahead, because I had another thing that you just reminded me of.
00:46:04:14 - 00:46:26:00
Unknown
Well, the other thing is that Jews have always been an extreme minority wherever we are, and there is always pressure, whether it's in Europe or in, say, a medieval Muslim country, to convert to the majority religion. They would like us more if we would just be more like them.
00:46:26:03 - 00:46:51:21
Unknown
That's one of the parts of what it means to be Jewish: basically constant resistance to the pressure to conform to a majority culture. And as you're talking about, honestly, we shouldn't have Facebook accounts; honestly, we shouldn't be giving Google more money to continue to develop. Like you said, Google has done some dubious things.
00:46:51:21 - 00:47:20:01
Unknown
Like, I don't even know what those things are. You know, they don't want me to know. Gemini is Google's ChatGPT, right? No, I know. And, you know, for those of us that use ChatGPT, the dubious things... I don't know the dubious things, but recently I'm aware that OpenAI, the owner of ChatGPT, is working with the Department of Defense, the Department of War, in ways that Anthropic, the owner of Claude...
00:47:20:03 - 00:47:47:22
Unknown
Claude, you know, won't. But some of these companies are making different choices that might make you more or less comfortable using their stuff. But the point is, we are embedded in their stuff, and there is a cost, a cost to the user who enjoys the connection of Facebook, who enjoys seeing what their friends are writing, interacting with the ideas, catching up with friends, posting pictures, and all of that.
00:47:47:22 - 00:48:09:12
Unknown
We would be losing out on some of those benefits in order to, on principle, be on the right side of history, or, on principle, be in the right part of this conversation. And I'm not saying, okay, as a Jew, everybody get rid of your Gmail account, get rid of your Facebook account, get rid of your Instagram account.
00:48:09:12 - 00:48:34:00
Unknown
Like, these are the things that a Jew would do. However, I do think it's important to remember that this is a posture that Jewish people have taken time and again throughout the societies we've lived in, where we've made hard and repetitive choices to opt out of the pressure to conform to the culture outside of us.
00:48:34:02 - 00:49:03:06
Unknown
Okay, go ahead now. So, well, what I was going to say, I think, aligns pretty well with what you were saying. One of my favorite prayers in the entire liturgy is Hineni, and I think we could all do well with more Hineni in our lives. What is that, for anybody who doesn't know what you're talking about?
00:49:03:08 - 00:49:21:16
Unknown
So Hineni is... well, yeah, you can probably answer this better than I can. I feel like I'm out on a thin branch. Hineni is the prayer that the leader of the congregation chants on Yom Kippur.
00:49:21:18 - 00:49:25:09
Unknown
As, and.
00:49:25:11 - 00:50:06:20
Unknown
It's sort of a greeting to God, saying: I am here as the representative of this community. I don't deserve this. I am not good enough. I'm not up to this task. This is a responsibility that I'm only tenuously holding. But on behalf of this community, we seek to return to the fold. We have made mistakes, and we really want to get things right.
00:50:06:22 - 00:50:27:02
Unknown
And please hear our prayers today and help us find our way back. Yeah, it's kind of like an "I'm not worthy." I always think it's funny, because the person singing it is the only voice in the room. Maybe they're backed up by a choir, maybe they're wearing a robe. They're the center of attention, and they're saying, I am not worthy of this great honor, you know?
00:50:27:04 - 00:50:52:05
Unknown
But that's the point. They're about to do a thing on behalf of the community, and they're acknowledging that they are but dust and ashes, so to speak. So I shouldn't say we have a monopoly on humility; that's a great tradition in so many religions.
00:50:52:07 - 00:51:25:06
Unknown
But for me, humility has a very Jewish flavor. And I think we're in an era where humility is in short supply. So there are folks who will tell you, unabashedly, that their goal in life, in their jobs, is to create God. That's what they think they're doing, making a super powerful AI system. That's what they want to achieve.
00:51:25:08 - 00:51:50:18
Unknown
And I think that should be scary to all of us. I think there's a lot of smarts in Silicon Valley, but not nearly enough wisdom. And I think that's kind of the story of human history: we've always had a little bit of a wisdom deficit, as hard as we try over time to fill that gap.
00:51:50:20 - 00:52:01:22
Unknown
And I think that if there's one thing that I could communicate, it would be to try to,
00:52:02:00 - 00:52:21:22
Unknown
I would hope that we all grow a little more slowly and proceed a little more thoughtfully. And I don't mean fearfully; I just mean methodically, really thinking through
00:52:22:00 - 00:52:41:00
Unknown
the broad implications of the things that we think we might want to do. Because I think people are sort of localizing and saying, I want to do A because it results in D, without thinking that there are all these other things. First of all, if I do A, then I'm not doing any of these other things that I could be doing instead.
00:52:41:05 - 00:52:52:21
Unknown
Second of all, it may result in that consequence, but it also results in a bunch of other consequences that I really should be thinking about.
00:52:52:23 - 00:52:57:09
Unknown
And I think that's something that,
00:52:57:11 - 00:53:17:17
Unknown
in my Jewish education, was always a big thing: a sort of attitude that anything you do has an effect and causes ripples, and you should try to be aware of that.
00:53:17:19 - 00:53:40:04
Unknown
And try to make good ripples, if you're going to make ripples. So I want to say thank you for coming and spending minyan with us. I saw you got here at 8 a.m. along with everybody else, and seeing you praying with us... you know, a community that prays together and learns together stays together, through thick and thin.
00:53:40:05 - 00:54:17:18
Unknown
Honestly, I am so curious about how this technology, and the way that it's used and the way that it affects people, will affect things like the way we practice Judaism in community. It feels to me like, for all the reasons that we discussed, having an in-person community of real people, whether real people in a room or real people in a room like this, feels even more important.
00:54:17:20 - 00:54:44:08
Unknown
Given what can happen to a person left alone with a computer. And we know that it's healthy to be in relationship with other human beings: it's healthy for our nervous system, it's healthy for our hormones, and we have longer lives when we are embedded in communities.
00:54:44:09 - 00:55:21:21
Unknown
All of that we know. But it also feels like a bulwark against misinformation and disinformation to be able to check in with other human beings: did this happen? What do you know? How do you know it? How do we know anything? And to come back to the actual human truths of what we can see and touch and feel and know through direct experience. For all of these reasons, I'm grateful for our community, and grateful for you, for having had that conversation out in front of the sukkah, and for bringing some of what you know through your learning,
00:55:21:21 - 00:55:46:01
Unknown
academically, to us today. And I know that this will be an ongoing conversation, because things in this space are moving so quickly. I will stop there. It's 9:28. There are a bunch of folks who are still here. Does anybody have any questions or reflections? I know, as Josh said, a lot of people have opinions about this stuff, and they're just opinions.
00:55:46:03 - 00:56:19:13
Unknown
But if anybody has any reflections to share, I'd be happy to hear them. And also, should I end the recording, so that people don't have to feel like whatever their reflections are will be captured for posterity? Can I just say, first, thank you very much for inviting me. I always feel a bit timid, a bit cautious, talking about these things, because when we open our mouths, it's important to want to not get things wrong.
00:56:19:15 - 00:56:42:05
Unknown
And so in this area, I worry a lot about getting things wrong, because there's a lot of uncertainty and there's a lot that I don't know. So thank you, everyone, for your patience. And I think you're right; if you want to turn off the recording, people won't have to feel self-conscious.
00:56:42:07 - 00:56:44:20
Unknown
Great. Okay.