
absurd wisdom
In the hands of adult children and the voices in your head
In this episode of absurd wisdom, a.m. is joined by DAE educators Kyley Komschlies, and Rory Fahey for a conversation about leaving things in the hands of adult children and the voices in your head.
You can find a.m. on Instagram and TikTok at @absurdwisdom. We are produced and distributed by DAE Presents, the production arm of DAE (@dae.community on Instagram and online at mydae.org).
The views and opinions expressed in this podcast are those of the speakers and do not necessarily reflect the views or positions of any entities they represent. While we make every effort to ensure that the information shared is accurate, we welcome any comments, suggestions, or correction of errors.
You can contact us at daepresents@mydae.org.
[00:00:42] AM: I'm going to share a dream I had two nights ago. And like a lot of people, I get my best ideas in showers, dreams, and long rides in the car, you know, just long stretches where a thing pops up and then you record it. And I also tend to dream either completely [00:01:00] Salvador Dali, just crazy, surreal, or very practical.
[00:01:04] AM: Like, you know, I'm watching a documentary. This is one of the latter. It was just a picture of what life evolved to, and I'm an old man, and this is AI, this is why I want to start here. You know, right now we've got 23andMe and Ancestry.com, where you give them a swab and, you know, they'll tell you where you're from, but also they'll tell you all the genetic
[00:01:26] AM: predispositions you have in terms of diseases, right? And that technology is only going to advance faster and faster. So this was the world that I imagined, you know, me being an old man. And, you know, so when kids are born, they're immediately given an AI companion that's uploaded with all their genetic data and a baseline, you know,
[00:01:45] AM: sort of intelligence set that aligns with, just like everything else, parental choice, which becomes a thing, right? And so, you know, the parents decide they do or don't want the thing uploaded with their religious beliefs. They do or don't want the thing, you know, trained on, not even uploaded, but [00:02:00] trained on, their religious beliefs, right?
[00:02:01] AM: And the kid gets the AI companion, and the idea is a digital twin, everyone's born with a digital twin. And then this thing's with the human their entire life, helping guide them, like as they're eating, saying, remember you have a predisposition to high blood pressure.
[00:02:19] AM: You've had a cheeseburger three times in the last three weeks. Is this what you want to be doing? Also taking in information and providing coaching and guidance. It's this lifelong thing, you know, and I'm not sure how I felt about it, but it was crystal clear, just watching a world where everybody had these. Like, it's obvious, right?
[00:02:39] AM: We're going to have a clip-on AI, no phone, no nothing, everything, you know? And so this is just an extension of that, where you've got this thing, and this thing was an implant, right? Not an implant into your nervous system per se, but a piece of hardware, you know, a receiver, and here I'm getting beyond my, you know, tech understanding, even in the dream world.
[00:02:58] AM: But it was basically like a, [00:03:00] like a tracking-chip-level thing, a little thing, so that you don't have to wear anything, it's just in you, at the surface level, like a subdermal level. And I've been, you know, the last two days thinking about how I feel about that.
[00:03:11] AM: Like, there are so many great advantages to that out of the gate, and there are so many disadvantages. But I feel like that direction is where we're headed. Maybe not that specific vision of it, but it feels like in 50 years something like that is where we're headed.
[00:03:24] Rory: I think it's much less than 50 years away.
[00:03:26] Rory: Like, besides Humane, there are a lot of other companies that are working on this. And just having context of everything that's going on in your life is one of the biggest limitations with chat models today, right? You always have to load in, like, oh, you know, I'm in this situation, when you're asking for advice.
[00:03:46] Rory: There's another company, for example, called Rewind AI that is taking two-frames-per-second snapshots of your computer, using optical character recognition (OCR) and computer vision to train [00:04:00] that into their models. And you can ask questions like, hey, what was I doing at this point? What did I say to this person?
[00:04:05] Rory: And they came out with a wearable as well called the Pendant, which is just a necklace that is recording audio from your life constantly, and that's going to go back, I think, into training their models. So, again, with the Humane pendant or with the wearable, you can just ask, you know, what did I say to that person at that time, right?
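For readers who want to see the screen-capture idea concretely, here is a minimal sketch of the kind of pipeline Rory describes: grab the screen a couple of times per second, run OCR over the pixels, and keep a timestamped, searchable text log. This is not Rewind AI's actual implementation; it assumes the `mss` and `pytesseract` Python packages plus a local Tesseract install, and everything in it is purely illustrative.

```python
# Illustrative sketch only, not Rewind AI's code: periodically capture the screen,
# OCR the pixels into text, and append it to a timestamped log you could later
# search ("what was I doing at this point?"). Assumes mss + pytesseract + Tesseract.
import time
import mss
import pytesseract
from PIL import Image

def capture_and_index(log_path="screen_log.txt", interval_s=0.5):
    with mss.mss() as sct, open(log_path, "a", encoding="utf-8") as log:
        while True:
            shot = sct.grab(sct.monitors[1])                   # full primary screen
            img = Image.frombytes("RGB", shot.size, shot.rgb)  # raw pixels -> PIL image
            text = pytesseract.image_to_string(img)            # OCR: pixels -> text
            log.write(f"{time.time()}\t{text.strip()}\n")      # timestamped, greppable
            time.sleep(interval_s)                             # ~2 frames per second
```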
[00:04:28] Rory: I think there is a beautiful vision there, where it'd be very nice to not be sucked so deeply into technology and have it somewhat fade into the background. And hopefully that's where that goes,
[00:04:41] AM: towards the problem. The worry for me, and this is exactly my concern about it, is that it does fade into the background, right?
[00:04:52] AM: It fades so far into the background that it's basically indoctrination, right? This is what happens with fundamentalist religious beliefs or fundamentalist political [00:05:00] beliefs that are so far in the background, right? The belief set is so far in the background that you never even
[00:05:06] AM: It's not that you don't question it, it's that you don't even question the possibility of questioning it, right? It's so embedded in the, you know, source code of your being. And it does feel like that's where these technologies are moving. Right now, the phone, the computer, all those things, because they're not so in the background,
[00:05:24] AM: there's still a level of choice of interaction in them, you know. Even though they're very addictive, and they're designed to be addictive, there's still the fact that I've got to pick it up, you know. But when they're so transparent, when they're so in the background, like money. Yeah, money is a technology that is so in the background
[00:05:41] AM: it doesn't even occur to us to periodically examine our relationship to it, you know. So that's my worry with what you're saying. Yeah, I could see a benevolence to it, right? Because in the right hands it's this, you know, facilitator of [00:06:00] so much in life that would be good. The "in the right hands" part
[00:06:03] AM: is the one that, you know, kind of gets me tripped up.
[00:06:08] Rory: Yeah, I really don't think it's a question of if it's gonna happen or not. To me, it seems like the story of what it means to be a human being is to make these trade-offs in going up the abstraction layers. You know, if you look at programming over a very short amount of time, there are large trade-offs that are made in order to write a program at as high a level as you can in React, where you can't do things like manage,
[00:06:36] Rory: you know, memory. Like, if you really wanted full control over what the computer is doing, you would write assembly code, right? But the trade-off there is that you get to actually build something that is easily distributable and actually have some idea realized. I think we moved from the technical space to the human [00:07:00] problem, but in order to do that, it required acceptance of certain criteria, and I think that's the real battle: what are those things that we are willing to accept as being the base state?
[00:07:15] Rory: Right.
[00:07:16] AM: And who gets to determine the base state? Yeah.
[00:07:18] Rory: You know. I really think it needs to be everybody. Yeah. You know. But
[00:07:24] AM: how do you do that when, you know, functionally it'll be three to five max companies that control this technology? It's probably not as many as 5, it's probably like 2 or 3. We already know who they are, right?
[00:07:36] AM: Yeah. It's Microsoft. It's Google. It's Apple. NVIDIA. NVIDIA. NVIDIA's, you know, gobble-able,
[00:07:45] Rory: gobble uppable. I don't know, I think they're sitting in a really powerful position with having the base technology that everybody's running to try and catch up with. They're the very foundation platform for this whole new revolution.
[00:07:57] Rory: Yeah. And then, you know, I hear a lot of people talk [00:08:00] about how these other companies that you just mentioned are trying to roll their own hardware, because, you know, NVIDIA is charging incredible margins on top of their stuff. They're the ones who have it for now. You know, I might be naive, but I think there is a case for open-sourcing this stuff.
[00:08:21] Rory: And I was listening to something earlier, actually, about synthetic data. I don't know if you guys have heard much about synthetic data, but it's a new thing that a lot of people are skeptical about, but then a lot of other researchers, like prominent researchers, are saying this is going to be the next trillion tokens of good data.
[00:08:42] Rory: And it's basically using models to generate data to train the models with. The initial gut reaction to that is, well, how could you get any new information out of something that's artificially generated? But there are a lot of smart people who seem to think [00:09:00] that that is the way forward, and that you could remove implicit biases; any data set is going to implicitly have human biases, and it's hard to remove that.
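As a concrete toy version of the synthetic-data idea, the sketch below trains one tiny model on a seed corpus, uses it to generate new text, and then trains a second model on that generated text. It is purely illustrative; real synthetic-data pipelines use large language models rather than the bigram counters used here.

```python
# Toy illustration of "models generating data to train models": a bigram "teacher"
# is trained on real text, samples a synthetic corpus, and a "student" is then
# trained only on that synthetic corpus. No ML libraries required.
import random
from collections import defaultdict

def train_bigram(text):
    counts = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        counts[a].append(b)                 # record which words follow which
    return counts

def sample(model, start, length=20):
    out = [start]
    for _ in range(length):
        nxt = model.get(out[-1])
        if not nxt:
            break
        out.append(random.choice(nxt))      # simple next-word sampling
    return " ".join(out)

seed_corpus = "the model learns from data and the data shapes the model"
teacher = train_bigram(seed_corpus)                                 # trained on real data
synthetic = " ".join(sample(teacher, "the") for _ in range(100))    # model-generated data
student = train_bigram(synthetic)                                   # trained on synthetic data
print(sample(student, "the"))
```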
[00:09:11] Rory: But I think one of the reasons that these three or four or five companies have so much control is because if you control the really clean, good data, then you have so much power over how these things are going to be orchestrated. Whereas if, you know, we can take that one thing away, then it opens the door for a lot more people.
[00:09:34] Rory: The other question, of course, is compute power. And I think that that's pretty captured.
[00:09:40] AM: Yeah, so it's three or four players who have the wherewithal to control these things. I mean, open source, though, we've had a lot of good luck with that; we've open-sourced guns in this country. I mean, I don't know where that led us.
[00:09:56] AM: It just feels like an Oppenheimer problem, right? It just [00:10:00] feels like there's no path that doesn't lead to somebody just fucking with it. You know? I agree with that. And you can't put it back in the box. But I don't see a clear path forward. Here's the biggest thing I get concerned about: too few people in places of power in these places share my sort of, I don't know, caution.
[00:10:26] AM: There's just, it's just too much enthusiasm for me. It's just too much enthusiasm and I get it. I get the financial enthusiasm, certainly. I get the power enthusiasm, certainly. I get the cool tech enthusiasm, certainly. And yet, I don't know, where's the brake pumping? Or at least, if not brake pumping, where is the parallel?
[00:10:46] AM: Like, I would love for OpenAI, you know, in this thing that happened, and it turned out it was not what the initial reports said the board was concerned about, it was just a [00:11:00] power play, you know, but that story got out there, right, that that's what it was about, I would love for, you know, OpenAI to say, you know what?
[00:11:08] AM: Yes, absolutely, in parallel to our organization, we're opening up, I don't know what it is, I mean, it can also be, you know, turned to bullshit, but a think tank. I just want to see somebody committing time, resources, dollars, etc., to thinking through, you know, what the hell does this mean for humanity?
[00:11:26] AM: Like, we say we wouldn't let that sort of thing happen. We'd say, oh no, there'd be regulations, you couldn't, you couldn't sponsor people's personal AI. We say that, and we sort of mean it, but we know where it goes. Billboards. Billboards, right? We just, you know.
[00:11:39] Scott: I think you said this in the last one. There's this sense of, finally, the government, at least some parts of the government, are paying attention to this, because they know they made a mistake with social media, just letting it develop and listening to the people who were saying you're going to inhibit innovation, that stuff. You know, I feel like maybe they're like, oh, maybe we should [00:12:00] start paying attention a little sooner this time with this new thing, but nothing's really come of it, you know. It's just a different
[00:12:06] Kyley: bucket of people with a different agenda.
[00:12:08] Kyley: Like, cause they're not gonna, yeah, they're not going to stop the CIA from figuring out what that is, or the military from figuring out the AI. If you start limiting it, like, everyone's got their priorities. Which I think means, humanly, we have to figure out humaning in line with this or something, some sort of societal
[00:12:25] Scott: understanding.
[00:12:26] Scott: I feel like the inherent immediate issue, we're seeing it already with just misinformation, misinformation having to do with war, misinformation having to do with politics. And, you know, everybody's worried about deepfakes of somebody looking like they're drunk when they're not. It's really, like, a hundred times more powerful than that.
[00:12:46] Scott: Because it's the subtle things that accumulate instead of, like, the one video that gets popular and then gets debunked. So the safety issue, for me, is, like, first and foremost.
[00:12:58] AM: Yeah, all that stuff is [00:13:00] there, absolutely. For me, all that stuff is there, right? And also, if I've got a thing tracking me all the time, again, fine, you're not gonna, you know, have it convince me to buy a Big Mac over a Whopper, but that's still data.
[00:13:13] AM: That's, like, you know, the most micro data in my life. Who's gonna get access to that? What are the potentials for leaks? So that's all there for me. But, again, given my biases, I'm even more concerned about, like, what does it mean to be human? It's such a level of abstraction.
[00:13:30] AM: We're already such an abstraction. Our lives are already so effing abstracted from, you know, from our bodies, basically. Like, all these things are abstractions. Living in your body, though, you know? All of this movement in the last century has been moving out of your body into your head. You know, exclusively out of your body and into your head.
[00:13:50] AM: And this just feels like, you know, the sort of checkmate move, where it's moved out of your body and into a digital head, not even your head. Yeah. [00:14:00] You know? And it's all cognitive processing. And, you know, again it sounds dark, but I've said it before, I can't help feeling like, you know, in 500 years, in retrospect, basically I and you all are the equivalent of Neanderthals.
[00:14:15] AM: Like a species that used to exist. And, and actually
[00:14:19] Rory: doesn't anymore. I think that's a distinct possibility.
[00:14:22] AM: Yeah. Okay, well, noontime whiskey anyone?
[00:14:24] Kyley: I came in here working on how do I balance out my existence, cause it's a very dark place. And I'm like, I'm gonna come in and try to be a positive light on this.
[00:14:33] Kyley: And that's the
[00:14:34] AM: best I could do. Great job, great job. I mean, listen, when you step out of it, for me, if Neanderthals themselves, you know, had a podcast and they were sitting around, like, oh, these Homo sapiens are going to wipe us out, they're going to, right? Like, we'd say, sure, you're legitimate in that, but I'm kind of glad you're gone because I like being here.
[00:14:52] AM: Right. Yeah. And so I'm aware that, if I step out of my own existence and my species' existence, on the arc of [00:15:00] evolution this may well be just, okay, this is how life actually works. Species evolve. And this thing that's emerging may well be the next species, post pure Homo sapiens.
[00:15:12] AM: That's very possible, but it still doesn't change things. I might get hit by a bus tomorrow. Yeah, sure. But if you backed up your nervous system to the cloud, I don't know,
[00:15:23] Kyley: I don't know, yeah, well, why?
[00:15:25] Rory: I don't really buy that. You know, it's like, is it you, like, are
[00:15:28] Kyley: you experiencing it? I fully don't, I, my thought is you died in your consciousness.
[00:15:33] Rory: Yeah, there's something that seems like you, but then I'm not, you know, I can't have my cake and eat it too, you know? Right, right. I don't know
[00:15:43] Scott: if you want
[00:15:43] Kyley: all these ghosts talking to you from the past.
[00:15:46] AM: Yeah. So I started this off with a dream, so when you dream, is it not you? I think about that
[00:15:51] Rory: sometimes,
[00:15:52] AM: you know, it's like, and then would it be different
[00:15:55] Rory: to be, you know, I mean at least while you're dreaming, you're there for it.[00:16:00]
[00:16:00] Rory: If your spirit
[00:16:00] AM: is there for it, I say. This is where we get into, like, what is that? Is that consciousness? Some folks will tell you, Kurzweil will tell you, you know, he's one of the tips of the spear on this, right, for decades. He'll tell you, yeah, all that is is just a highly complex set of, you know, neural connections.
[00:16:16] AM: It's just a math problem. With, you know, enough computing power, we can figure that out and replicate it. And so you'll still be conscious in the way you are now, you just won't have a body. Again, I'm not saying I believe that, but there's a bunch of people out there for whom that's their perspective.
[00:16:30] AM: Right? It's just, you know, the most complex math problem in the world.
[00:16:35] Rory: Yeah, I mean, there are a lot of people talking about simulation theory and how this indicates there's more of a probability of it. I don't know what to say about whether there's more or less suffering. You know, there's definitely different suffering than there was. And I think that's a really interesting thing about technology, that it frees us up to think about higher layers of what it means to [00:17:00] be, and then have to grapple with those things, rather than where am I going to get my food from, you know?
[00:17:08] Rory: There's different suffering that occurs, but it just seems like that's sort of the arc of where this is going: fewer and fewer of the base things are things that we're going to have to take care of. And in my optimistic perspective, hopefully it frees people up to think about higher-order things.
[00:17:33] Rory: And it's definitely going to do some of that, and it's also going to be, like, some sort of a sink for just letting go of a lot of what it means to be a human being.
[00:17:44] Kyley: I wonder with your dream, an interesting aspect of that is you could find a way to heighten empathy, which I find to be an interesting perspective because one thing that I see in [00:18:00] the common thread is that like everybody has their own individual needs, desires, whatever, whatever.
[00:18:03] Kyley: And I have a difficult time empathizing. I can try, but I never have your perspective. However, if you have that level of connection and ability to understand the information, sure, you could actually have someone experience someone else's perspective, and I dunno if it mitigates the suffering, but it might normalize it in a way that can be processed
[00:18:21] AM: differently.
[00:18:22] AM: So just to, you know, be a downer on all ideas. Great. So what that leads us to then is potentially a world a thousand years from now where we are the Borg. I was thinking about that, and we wouldn't be the Borg, I don't think that's a good comparison. Over time though, that level of empathetic relationship, like in the body, doesn't that just normalize human behavior?
[00:18:43] Kyley: I think it might. I was thinking less Borg, as in the individualized thing, and I think we'd go more to just, like, consciousness. What I see more is a world made of computers, and this is how we exist, and the body goes away. I don't know that we would need bodies anymore. Whereas the Borg is modified bodies and a desire to [00:19:00] conquer.
[00:19:00] Rory: I think we already are the Borg, you know. Like, you have your own perspective on life, as individual humans, but we have something called corporations that are their own living entities that experience things; it's a higher order of intelligent organization. And I think those things just build on top of each other, like the mitochondria was absorbed into the eukaryotic cell. It was its own thing prior to that, but then it just became a part of this higher-order system. And to what degree does it experience anything? But it's along for the ride. It's not looking at things like this and considering them, but it's in there.
[00:19:48] Rory: It's
[00:19:48] AM: got its own layer of consciousness,
[00:19:50] Rory: whatever that may be. I think it's a hubris to think we're the higher, we're the highest order of considering things out there. Yep. Do you remember
[00:19:58] Kyley: Q
[00:19:58] AM: from Star Trek? [00:20:00] I do. I think we'd be more like Q. Oh God. Wasn't Q like a, am I misremembering?
[00:20:07] Kyley: Q, I think Q is the like the altered, he's the guy who shows up every once in a while and just like magically appears on the Enterprise.
[00:20:12] Kyley: And he's dressed all weird and just shows up strange, but they're like this bulk consciousness that doesn't actually understand reality. Yeah. And they're like, this is weird, y'all are doing weird real things, what's that about? Let me mess with you. I think we'd be more like Q than the Borg.
[00:20:27] Kyley: Sorry, Amazon would be more like Q than the Borg.
[00:20:30] Scott: Corporation. These are corporations, these, these entities, you know, resistance is
[00:20:34] Rory: futile. Yeah. I just mean, I feel like we're already a part of a higher order intelligence system, but you still, you still maintain your own individual perspective, like, or it's not perspective even, but experience.
[00:20:47] Rory: So let's
[00:20:48] Scott: play that out, because you were saying, the screen recording and summarization AI, as you're working, as you're doing stuff, what was I doing, you know, what was the product of this thing that I did on [00:21:00] this day or whatever. Could that positively manifest in us not having to necessarily spend our time and energy remembering, memorizing, and just being in the present and being more adapted to what we're trying to work on?
[00:21:15] Scott: And then referencing that thing as needed in our database of who we are that we've created along with this tool. Or do you think it would just be reinforcing the existing sense of self based on memory? Like, will it take our experiences that we've solidified and sort of give us highlights or give us specifics or give us everything?
[00:21:37] Scott: Right now, accumulated memories are our identity. I did this, I'm this person, I did these things, I came from this place, I look like this, you know, all these things. Does that fall away? Is it a way to get towards more equanimity amongst actual living organisms, and let the identity stuff happen? [00:22:00] What do you mean by
[00:22:01] Rory: equanimity?
[00:22:03] Scott: The way it's described mostly in spiritual texts is you value the beggar and the king the same. That sense of who you are, you know, you don't have these stratifications of society based on memory and identity. So, can it help alleviate that sense of I'm here, you're here, he's there, she's there, and, you know, positioning ourselves in the way we manifest ourselves in society? Or can it be, you know, I don't have to worry about it.
[00:22:33] Scott: This is going to take care of it for me. I'm just going to do my work, you know, hoping that the positive side of that work will be helpful and productive. Yeah, that's a lot. You get what I mean though, right? Our memory is how we come up with who we are, you know, our sense of memory. With amnesia, tomorrow when you wake up,
[00:22:52] Scott: you know, you're not going to walk in here and sit in the place you normally sit and open up your computer and type in your password, you know. All that stuff falls away really quickly. [00:23:00] If we allow another technological entity or device to do that for us, do we have more freedom or do we have less
[00:23:08] Rory: freedom?
[00:23:09] Rory: I mean, at least my perspective is that the thing I want to do with it is think about and focus on things at a higher layer of abstraction. When I got interested in software, which really turns out to not be exactly what I thought it was going to be, it was like, oh, I have all these ideas in my head and it would be really cool to just go and make those things and say, oh, here, you know, here's this thing.
[00:23:38] Rory: I had this idea. Instead, you just fall into this pit of technical thing after technical thing after technical thing. And it's like, that's not really why I got into this, and it would be really cool to not need to know any of that stuff. I think of myself as an idea generator, and I think a lot of people have that same thing.
[00:23:58] Rory: But there's this barrier [00:24:00] to being able to bring these things forward that, it would be really nice, in my opinion, if it didn't need to be there. Because what I really do want to do is be like, oh, you know, wouldn't it be neat if we had this X thing, and then like 10 minutes later, it's like, oh, here, try it out.
[00:24:16] Rory: The equanimity thing is interesting. You used that phrase, and I'm trying to get back into meditation. I know it's good for me, you know, I did it for years, and then for whatever reason I just convinced myself, yeah, you don't need this, just be present, you know, be present throughout your day, and that doesn't go anywhere. So, you know, guided meditation is nice.
[00:24:36] Rory: They were talking about equanimity being, like, sensory things and emotions hit your body, but then you just don't let them reverberate in your body, you just let them pass through. And so I think that might relate to the phrasing that you had. But I think there's always sort of a hierarchical thing with being human, where that's a game that we seem to like to play.[00:25:00]
[00:25:00] Rory: You know, a status game. As
[00:25:03] Scott: a society. Can software do that reliably? Do you feel like, at this point, what we have, you know, is it helping us solidify our status? You know, something like social media, or, if you don't want to dip into that world, LinkedIn, or, you know, is it helping us? It's just a question.
[00:25:21] Scott: I have no clue. Do you think it's already doing it, it's already setting the table for this other technology to, you know, land in the middle of the table and serve
[00:25:32] Rory: to us? I guess I somewhat feel less of a solidified status, you know. I think that because there are more status games to play, there are different arenas of status. And so, by the nature of there being all these different places, you can fragment your attention. It's like, well, what does it even mean to have status, you know?
[00:25:56] AM: I'm still processing your statement from eight minutes ago, "I'm trying to get [00:26:00] to more and more abstraction." Yeah, it's interesting, like, not to be contrary, but I want to get more and more concrete in my experience of life. And what I'm processing here is that I truly am an evolutionary dinosaur, I think.
[00:26:16] AM: If I think of what my optimal life is, I absolutely think, and I don't mean this jokingly, like a tribesman in a desert tribe with like a hundred people that live their lives together.
[00:26:28] Rory: With health care,
[00:26:29] AM: we got herbal, we got, you know, we got whatever. And then if we die, we die.
[00:26:34] AM: That's just what happens. We're part of the great, you know, and wandering, no, nomads, it's a misnomer that nomads wander. They have a cycle. They go through a path over the course of a year. They don't just wander aimlessly and be like, oh, look, we're in Kansas now. Right. Now there's a certain, right. But that's likely, given my, you know, biases around community, my sort of
[00:26:53] AM: engagement with just, you know, wanting to stare off into the desert or the ocean and imagine the [00:27:00] unimaginable, and, you know, frame drums and hanging out by a fire playing music. Like, if you had to draw an existence for me that would be optimal, that's probably it, you know? 500 years ago, 1,000 years ago, 10,000 years ago, or maybe today, you know, there's some people still
[00:27:15] Rory: living that way, right?
[00:27:16] Rory: But you can't separate that from all of the tragic things that are inherent in it as well.
[00:27:22] AM: Just, no glorifying it at all, right? No romanticizing it. It's a hard life. Yeah. It's a, in many ways, a brutal life relative to some of the
[00:27:30] Scott: conveniences we have. Absolutely. Do
[00:27:32] Rory: you think if you were there sitting in that situation, you'd still want it?
[00:27:36] AM: I grew up, I mean, I don't know if I've ever told you, I tell the story, I grew up without indoor plumbing. I grew up with, you know, barely any electricity. I will tell you, I think it's a shitty choice to have to say, you know, I have one or the other. I think we're at a point in the world where everybody should have running water and electricity and enough to eat, right?
[00:27:54] AM: But I will tell you, man, if you offered me the opportunity to go back and be with, like, my grandparents and my [00:28:00] relatives in that little place, and you said you had to give up plumbing for it, at this stage of my life, like, I don't know if I could just acclimate, because I'm just so used to some conveniences.
[00:28:10] AM: But I'd have to tell you, it'd be genuinely tempting. Because that sense of community, I've been chasing my whole life. I think I've built and found and all that, you know, in the world. Been very blessed with that. But yeah, man, that hardship, again, that hardship is not to be romanticized. It's not to be, you know, trivialized.
[00:28:27] AM: This is my point, right? This is not for everybody, this is for me. That trade-off, for what it was to be in that community, might well be worth it, right? And so this is my point: I'm not saying we all should live this way. I'm saying, you know, I increasingly realize with passing days, weeks, months, and years that I am a bit of a dinosaur.
[00:28:47] AM: I don't mean that in a negative way though. Again, I'm big on, like, everything moves, everything changes, it's the nature of things to be impermanent. And I think, you know, maybe it's just personal rationalization. I've got a little less than 60 years, although [00:29:00] who knows, I'm pretty healthy. But it may well be that, you know, the world is going to move on from folks like me, and that's not good or bad, but the nature of human existence will fully leave behind literally sitting around a fire.
[00:29:16] AM: We'll still simulate it. We'll still have the experience. But the physical, you know, thing, that won't be, we're just going to be beyond that. It's a way to save the planet. I mean, you can make a case that this sort of movement out of the body is a way to rebalance the ecosystem. Because we'll just stop consuming so much shit.
[00:29:33] AM: If we can simulate it all, you know, we'll still need to produce the energy and everything, but AI will help us figure out, in a hundred years, how to get all the energy we need
[00:29:43] Rory: without, you know. Do you guys think about all the compute that it's going to require for this AI revolution, and how much of a tax on the environment that's going to be? Yeah, yeah.
[00:29:53] Rory: Yeah. I know a lot of, like, scientific people are talking about this. Scientific people, people in
[00:29:59] Scott: lab [00:30:00] coats, people on YouTube
[00:30:02] AM: that I've been watching. Yeah. Yeah. Yeah.
[00:30:05] Scott: Yeah. I think it's not really understood by the general population what that means. Yeah. They just think, like, oh, faster computers.
[00:30:14] Scott: I got a faster computer. My old one was slow. Somebody digs all that stuff out of the ground.
[00:30:18] Rory: Yeah, and one of the interesting points that I've been hearing about is how, by far, the supply chain for building a computer chip is the most complicated and fragile supply chain that we have as a civilization.
[00:30:32] Rory: It has to go to this country for this specific thing, and then this country for this specific thing, and, you know, everything comes out of the ground in Taiwan. It's just so funny how fragile it all is. And we build everything on top of that. It's the same
[00:30:45] Scott: model as previous technologies. We take from the global south, move it through the system, and put it into the hands of those who are at the top of the technological food chain.[00:31:00]
[00:31:00] Scott: We're doing the same thing, you know, even with something as simple as the minerals going into batteries and solar panels and stuff that, you know, come from the global south. We do the same thing with bananas. Yeah. Everything is, for some reason. And we use an endearing term like developing world.
[00:31:19] Scott: They're in the developing world, developing things for us. Yeah. So, you know, the cynicism is built in, I think, for a lot of folks. And when you're talking about compute, you've got to say, like, you know, how many football stadiums, or something to make people understand on a realistic level what it is you're talking about,
[00:31:37] Scott: you know, as far as resource allocation and distribution. And it looks like a good thing because everybody's making a little bit of margin on every level of the supply chain. So there's no incentive to change it or simplify it, unless you're either the first person or the last person in the supply chain.
[00:31:54] Scott: You know, everybody else is making their share in the middle. There's a
[00:31:57] AM: hundred years that don't look too pleasant. [00:32:00] I'm wondering
[00:32:02] Kyley: how long it takes to society out the thing you're talking about, because I'm in the same dinosaur boat as you. Like, I wonder what happens when you can't stand under a sky and see all the constellations anymore.
[00:32:13] Kyley: Like, some of the most defining moments of my spiritual existence were seeing how big things were, how much is there, and truly experiencing the unknown, and I can't find that in my day-to-day life, in my experience here.
[00:32:27] AM: But if you fast-forwarded a hundred or two hundred years, the simulation of that and the re-engagement with that kind of wonder, where you could stand in the middle of the universe and see how absolutely tiny you are.
[00:32:40] AM: Maybe. So my question is, maybe, or does it actually strip away the
[00:32:44] Kyley: sense of wonder? Because I don't think it, because it's so accessible. And that's where, like, the little parts of me that believe in magic and energy, those are the moments of, like, there's something here I will never understand fully, and I want to chase it if I could.
[00:32:59] AM: I just got, [00:33:00] like, a framing of this now, of one of the threads of concern that I've had, and it hadn't occurred to me this way until you just said what you said. We know, right, that if you take a kid from the time they can start eating solid food, and you just give them Oreos and Coke and all these things, right?
[00:33:20] AM: Forget about the health damage, but if you just give them, you know, unfettered access to that stuff, and then at the age of five, put an apple in front of them and they try it, it won't taste sweet. It'll be like, I don't want this, this is like some piece of moist cardboard.
[00:33:38] AM: Tastes like dirt. Tastes like dirt. Like, this has no flavor, because you've so skewed their palate. Is that what we're doing with technology? Are we just continually stripping away? This is maybe the closest thing to get at what I personally get concerned about. Like, I'm not lord of the universe, and so I don't know where it's going to go, but what I get concerned about being lost is the [00:34:00] inherent unknowability of it, which invites this spiritual connection to the world, the universe, to life, this mystery, this thing that lies beyond the horizon. Again, my love for deserts and oceans is the fact that there is no possibility of reaching that horizon.
[00:34:13] AM: That there is something there, and I don't need to know what it is, just that there's something beyond what I can, you know. And if I can, through a simulation, fully experience standing in the middle of the universe, is that the equivalent of eating Oreos at the age of one? Does it disconnect me, over time, from any sense of wonder?
[00:34:30] AM: Like, everything is available, and so there is no wonder. It is all just consumption.
[00:34:36] Scott: Well, like,
[00:34:36] Rory: what do you think about the idea that we are all just God trying to play infinite games to distract himself from the knowledge of everything? I
[00:34:45] AM: mean, listen, you want to get into Hinduism with a Hindu, you know?
[00:34:48] AM: Like, yeah, the whole thing is just a big cycle of boom and bust, and it's all illusion and
[00:34:53] Scott: Indra's
[00:34:53] AM: net. Like, we're just bored. Yeah, I get it, but, you know, in that scenario, I'm the manifestation of God that's [00:35:00] saying fucking stop it and just pay attention to what's here, you know.
[00:35:04] Kyley: I don't have any interest in exploring that. Yeah, I think part of what makes us human is the interest in exploring something. Whatever you said, there was no attachment for me. Like, there was nothing inside of my body or my spirit that was like, ah, that sounds like something I want to experience. Like, if that's what existence is, cool.
[00:35:23] Kyley: Oops. But I don't want to play it. Which part? The, what if we are each individually God trying to distract ourselves from whatever our own existence is, that philosophy. If
[00:35:35] Rory: it is cool. It doesn't
[00:35:36] Kyley: resonate with you. I have no interest in it. Like, like, I don't want to run down
[00:35:41] Rory: that path.
[00:35:41] Rory: But isn't that the point? In that instance, it's like trying to, yeah, like, I don't, like, I want to forget that thing. Yeah.
[00:35:47] AM: But if you could push a button and have that experience, right? And every other experience. Where that reality is reality, and there's some way to, I don't know what that looks like.[00:36:00]
[00:36:00] AM: Does that not dull the spiritual senses? We've already dulled the physical senses. You know, food, visual stimulation, auditory stimulation. We've dulled the physical senses. And do these things not dull the spiritual senses? You know,
[00:36:13] Kyley: I, I have some general lived experience of that, the
[00:36:16] AM: ability to simulate realities infinitely.
[00:36:20] AM: I've said this, you know, forever, and I'm sure I've said it here: legitimately, dumb as it sounds, I sometimes look at the moon and I have this longing of, my God, I wish I didn't understand that. I wish I didn't have the science of that. I wish that was this glowing thing in the sky that I just stared at.
[00:36:40] AM: Oh my god, what the fuck is that? Just for, like, two seconds, I would love to have that experience. The thing is, you can still do that. I don't want to take drugs,
[00:36:48] Rory: I'm too old. I mean I think that's like the pure meditative state. Sure, yeah yeah yeah, but
[00:36:55] AM: How much bandwidth do I have for that? I want to have that same relationship with a tree and with, you know, [00:37:00] and, part of it, I'm being hyperbolic, you know, yeah.
[00:37:03] AM: And again, this is not romanticizing, you know, pre-industrial life, because, to your point, and rightfully so, there's a lot of hardship, and a lot of things, just at a medical level, that we've cured, that I don't think we should lose. I don't think we should all go live a pre-industrial life. That's not what I'm saying.
[00:37:19] AM: I'm saying, again, all I'm advocating for in any of these conversations is: can we just tap the brake? Can we just understand what it's gonna mean? I
[00:37:27] Rory: don't think we can. We can't, and that's my concern. I don't think there is, there's no brake to tap. Because if you don't do it, then somebody else is gonna do it.
[00:37:35] Rory: And it's an arms race. It's absolutely an arms race.
[00:37:38] Kyley: Mystery is a word that comes up for me.
[00:37:40] AM: Mystery disappears. Yeah, yes, exactly. You know, mystery and wonder are connected. Yeah. And I don't know what it is to be human and not be in a state of wonder about
[00:37:50] Rory: things. I'm constantly in a state of wonder.
[00:37:52] Rory: And, and I'm actually very optimistic about this stuff. And I also see a world where you can have what you [00:38:00] want. Like, I think it's a theoretical possibility that we build an intelligence that doesn't want to kill us. Right, right. It's a possibility, right? And, and, and then we get all of our baseline needs met.
[00:38:13] Rory: Sure. And then you can go and hug as many trees as you want. Right? Like, I think, I think, like, and then you can be one. But, you know, like, you have to go and do AM things every day, you know, and you don't have the time to just stop and look around
[00:38:27] AM: and. Yeah. Long term Rory, I think actually that's where we're going to land.
[00:38:31] AM: If we don't blow ourselves up, I think long term that is actually where we're going to land. Yeah. I think, like I said, on the long horizon, 200, 500 years, this transformation leads us to a jump in lifestyle like the industrial revolution did. Like, if you had shown somebody in the 1400s
[00:38:49] AM: the fact that you could just open a refrigerator and have everything right there, it would be a, holy shit, I want to go there. Yeah, I absolutely think that, in, you know, 300, 500, 100 years, whatever it is. In the [00:39:00] interim, though, the people with their hands on the wheel are all fucking commercial marketing and
[00:39:06] Scott: on top of that, they figured out over generations how to manipulate the regulation and lawmaking. So, you know, one of the things I hope for AI and this technology, this intelligence that emerges, is that it can hover above that and look at the way things are being distributed and come up with a more fair and just system for distribution of resources.
[00:39:33] Scott: And it doesn't necessarily have to be a zero-sum game. There's enough for everybody. There could still be rich people, but everybody has what they need. It's not either-or. I'm hoping that'll help distill the process by which that can work. I'm not confident that's going to happen in this sort of, you know, infancy
[00:39:52] Scott: that we're in right now. Maybe by adolescence we'll have made a bunch of mistakes that we have to backtrack from and start again. But [00:40:00] right now the incentives are upside down. You know, look at the OpenAI thing. It was just like, oh, we gotta get this in our pocket, you know, quick, like, how do we do this?
[00:40:08] Scott: This guy's in the way. What is
[00:40:10] Rory: your understanding of what occurred with the Altman thing? Does anyone really know what happened there? I mean, have you guys heard about the whole Q*, like, speculation?
[00:40:20] Scott: Yes, yeah, let's, let's hear a little bit more about that because I feel like there's not enough clarity on what happened.
[00:40:25] Rory: There's not, and I think it's entirely speculation, but it's speculation that a lot of that field is doing. What I've gathered is that one of the stories is that there was some sort of breakthrough that occurred over the last month or so, and there were researchers on that team who wrote to the board and said, hey, this is alarming. And this is all complete speculation.
[00:40:50] Rory: And, yeah, I think there are things like, it potentially has the potential to break encryption, which right there would basically [00:41:00] end things if that got out, because all banking, you know, everything.
[00:41:05] Kyley: I haven't had that in my apocalypse playbook. Fantastic. That's cool. I hadn't thought about that before.
[00:41:10] AM: Let's bingo. How is it?
[00:41:11] Scott: Yeah, it's a level to it that I had no clue about, you know, that that
[00:41:16] Rory: would be a thing. But this is all, you know, it's a great story, it's interesting. Basically, from what I understand, the way that a transformer model architecture works is it's just doing next-word prediction.
[00:41:29] Rory: So it's trained on a corpus of human language. And basically, if I'm not mistaken, what these language models are is a very advanced form of compression of data, where you're sort of transforming these sentences into vector space. And then running through the model will pull out some other thing that's very similar in the vector space, which turns out, you know, you can finagle [00:42:00] it to look like this really nice composition of English words. But it doesn't have something called type two thinking; it's all instinctual.
[00:42:09] Rory: Like, just, here's what the thing is. And one of the big limitations in generative AI research right now is we've figured out how to emulate human type one thinking, the just-fast kind, but taking a step back, considering something, thinking it out step by step, these are not things that that architecture is capable of.
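A minimal sketch of the next-word-prediction loop Rory is describing is below. The hand-written `score_next` table is only a stand-in for the trained network (which really maps the context into vector space and back to a distribution over a vocabulary); the point is just the shape of the loop: score candidates, pick one, append, repeat, with no planning step.

```python
# Illustrative autoregressive decoding loop, not a real transformer: the model
# predicts the next token from the tokens so far, appends it, and repeats.
import random

VOCAB = ["the", "cat", "sat", "on", "mat", "."]

def score_next(context):
    # Stand-in for the trained network: a hand-written preference per previous word.
    table = {"the": ["cat", "mat"], "cat": ["sat"], "sat": ["on"], "on": ["the"], "mat": ["."]}
    return table.get(context[-1], VOCAB)

def generate(prompt, max_tokens=8):
    tokens = prompt.split()
    for _ in range(max_tokens):
        candidates = score_next(tokens)           # "type one" thinking: one fast pass
        tokens.append(random.choice(candidates))  # sample the next token, no lookahead
        if tokens[-1] == ".":
            break
    return " ".join(tokens)

print(generate("the"))
```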
[00:42:32] Rory: And the speculation is that this Q* thing is a way of, instead of just predicting the next token, being able to somehow have one layer higher of understanding and then reason about where to go next. There's this guy, Yann LeCun, who's one of the greats in the industry.
[00:42:55] Rory: And one thing that he put out recently is that all of the [00:43:00] biggest research labs out there are working on this type two thinking. And so the speculation is that they had a breakthrough in this, and that would enable things like the models actually being able to do math, you know, which sounds like, okay, cool.
[00:43:16] Rory: Like, you know, we can do language now, but then if you can do math, basically you can
[00:43:21] Kyley: do everything. And when I hear about doing math, it's not like being able to put in a math problem. It's like to come up with the calculations and equations to figure a thing out. Yeah, and then,
[00:43:28] Rory: and then But math
[00:43:29] AM: is a right or a wrong.
[00:43:30] AM: Yeah. Right? And so, yeah, to be able to do that, it's a pretty big deal.
[00:43:32] Rory: Right now, it can say 2 plus 2 equals 4. Sure, yeah. But it's just pulling that out of the repository of all the data it's been trained on. It doesn't get it. Right. But the idea is to make it get it. And once it can get it, so, for instance, AlphaGo, have you guys heard about that?
[00:43:49] Rory: That was back in like 2017; they beat the best Go player in the world. The way that they were able to do that was they first started by training [00:44:00] that particular model on data from all the best Go games. But you can only get to the point with that where you're as good as the best, or maybe even a little bit worse. So then they started training it against itself after it was trained on that.
[00:44:17] Rory: And Go is a small probability space, it's very confined. And so as it trained against itself for a long time, it was able to come up with, you know, a loss function that basically made it so that it could figure out what better and better games were to play. And then it became better than the best human being.
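As a toy version of the self-play idea Rory describes, the sketch below starts from a near-random policy for a simple take-away game, has it play thousands of games against itself, and reinforces the moves that ended up on the winning side. It is not AlphaGo (no neural network, no tree search), just the self-improvement loop in miniature.

```python
# Toy self-play loop: "race to zero" Nim. Take 1-3 sticks from a pile of 15;
# whoever takes the last stick wins. The policy improves only by playing itself.
import random
from collections import defaultdict

wins = defaultdict(lambda: defaultdict(float))   # wins[pile][move] -> reinforced score

def choose(pile, explore=0.2):
    moves = [m for m in (1, 2, 3) if m <= pile]
    if random.random() < explore:
        return random.choice(moves)              # keep exploring new lines of play
    return max(moves, key=lambda m: wins[pile][m])

def self_play_game():
    pile, history, player = 15, {0: [], 1: []}, 0
    while pile > 0:
        move = choose(pile)
        history[player].append((pile, move))
        pile -= move
        if pile == 0:
            return player, history               # taking the last stick wins
        player = 1 - player

for _ in range(20000):                           # train purely against itself
    winner, history = self_play_game()
    for pile, move in history[winner]:
        wins[pile][move] += 1                    # reinforce the winning side's moves
    for pile, move in history[1 - winner]:
        wins[pile][move] -= 1                    # penalize the losing side's moves

print({pile: max((1, 2, 3), key=lambda m: wins[pile][m]) for pile in range(4, 16)})
```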
[00:44:37] Rory: So if we're able to do that with this reasoning, then that is a theoretical, you know, path to something like AGI. Here's a
[00:44:48] Kyley: question I have. I'm never going to be at the cutting edge of AI, and probably neither are most of the people that I hang out with, right? As just a regular person, how do I understand enough, [00:45:00] or what do I pay attention to, to dampen whatever exacerbated existence this potentially could be, without having to know all of this information? Because I think about my friends and the people I hang out with, like, they're not having any of these conversations.
[00:45:15] Kyley: I want to build a house and make sure that my house stays solid, and I want to know if a hurricane is coming and I have to put boards on the windows. Like, how do I pay attention to this in that way? Or whatever it is, like because I don't even know how to navigate this landscape. Me neither. You know, and that's what I wonder, because I think that's how most people are going to engage with this.
[00:45:35] Kyley: There's always going to be this level way up here, and this basic level of, like, this is becoming part of our life more and more and more, we're going to be raised with this, and how do you figure out how to pay attention to what you need to pay
[00:45:46] Scott: attention to? Are you all familiar with the Nikola Tesla invention, the infinite energy generating machine, the myth of this thing?
[00:45:55] Scott: Yeah, I've heard of this? And the way the story went was that, you know, [00:46:00] he was looking to build this machine that, once you started it, would continue on its own indefinitely, generating energy. The idea was every house would have their own; they'd start it up and it would just run forever until it needed to be turned off and fixed, and then run forever again.
[00:46:14] Scott: And he brought it to Westinghouse, and Westinghouse bought the patent, paid him a bunch of money to do more research on it, and shelved it. Because he's like, I can't put a meter on this, so why do I want to sell it? This will kill my business. So, you know, in a way, that's the way I feel about what's happening right now.
[00:46:33] Scott: It's, you know, unless the use case is evident, we're not going to be able to use it right now. It's going to be hard. I can chat to this thing and have it say funny things, but we don't really have an idea of how we're going to engage with it as individuals, and we can't see the storms coming.
[00:46:48] AM: The way inventions happen though.
[00:46:51] AM: It seems to be sort of in the air. You know, I mentioned Oppenheimer earlier. It's not just one person [00:47:00] who cracks the code on the bomb; it's like seven people around the world are really close, and then one succeeds, right? So say OpenAI, this Q* thing, is legit, and they crack the code, and they've got an AGI.
[00:47:11] AM: Odds are, within, you know, six months of that, two or three others do too, you know, Microsoft or whoever else, right? What the fuck happens then? Does the government step in and say, nope, we're grabbing this? Unlikely, right? Very unlikely. They don't think they can. But what happens? Because now, you're saying this thing can break encryption.
[00:47:32] Rory: I'm not saying, I'm not saying that to
[00:47:33] AM: be, to be fair. But it's got this level of capability, right? It's the equivalent of, you've got a business nuclear bomb, you've got an economic nuclear bomb, you can fuck with everything, right? And Microsoft has it, and I mean, OpenAI is now functionally Microsoft, isn't it, with this dalliance that happened?
[00:47:55] AM: Like, didn't that just push OpenAI squarely into Microsoft's orbit, right? I gotta [00:48:00] imagine, behind the scenes, yeah, that's it, right? That's it. And so Microsoft has it, Google has it, maybe Meta has it, maybe, you know, a couple others have it, right? Where the hell do we go as a society from there? And China, somebody in China's got it.
[00:48:15] Scott: I think that's where the NVIDIA thing comes in, cause they're, you know, creating the physical necessity to make this scalable even more. But how long is it before they're
[00:48:26] AM: absorbed into one of those things? They get a gun pointed at their head. It's like, you're going to keep supplying us plutonium or we're unleashing our AGI on you, right?
[00:48:37] AM: I'm being dramatic about it and hyperbolic, but like that's, they fall in
[00:48:41] Scott: line. It's okay to turn it into a Marvel movie. So
[00:48:44] AM: this is exactly a thought I had when Rory was talking. If you've seen The Boys on Amazon, potentially that's a direction we go in, where, in essence, I, you know, I, Microsoft, have my superhero, right?
[00:48:59] AM: This [00:49:00] AGI is my business superhero; it has superhuman powers. What do you do with that from a regulatory standpoint? And if Microsoft and these guys have it, now one cap, I guess, is, you know, processing power, computing power, et cetera, right? It's not something like a nuclear bomb that somebody could just build in their basement.
[00:49:17] AM: Right? I mean, that is one cap, but it's a matter of a short period of time before every nation and every organization with the means builds one of these suckers. Yeah.
[00:49:28] Kyley: Well, unless, unless you've been running a crypto farm for the past three years and need to turn your processing power to something else.
[00:49:34] AM: Dark.
[00:49:34] AM: Reliable. Reliable for the dark, right? Cool. So, so, so. We set up an
[00:49:40] Kyley: infrastructure of fully unregulated people who can process high amounts of data and now you have an opportunity for, for a way for them to have access to unlimited amounts of data. Sure. Cool. Yeah.
[00:49:51] Scott: Cool, cool, cool, cool.
[00:49:53] AM: So, bullets and
[00:49:54] Kyley: Food.
[00:49:55] AM: Got it. That's, that's what I'm gonna get my people.
[00:49:58] Scott: How much has changed? A cabin in [00:50:00] the woods. A location undisclosed.
[00:50:02] AM: But see, it's not bullets and food. Because it's not in Microsoft's interest to have society collapse. A little bit of chaos actually is in their interest. Oh yeah. Right? Because it keeps the government on their heels.
[00:50:15] AM: Mm hmm. But full societal collapse is not in their interest. Yeah. They become the government, going after that, that, that, that, that crypto farmer. Yeah. And their AGI, their superheroes, start battling that evil hero. Heck yeah. Alright. But that's actually how it plays out. So, back to The Boys again.
[00:50:30] Kyley: Yeah.
[00:50:31] AM: Because, because they've got a vested interest in keeping society intact. Because if it falls apart, what the hell good is everything they've got?
[00:50:39] Rory: I was doing so good until, like, right now. No, I really like that idea that you brought up. And I heard, you know, I heard that quote from someone recently. I don't know who it was, but just the, the idea that, like, you don't own an idea.
[00:50:50] Rory: Nobody owns an idea. The time for an idea ripens, and then, and then that idea is out in the world and it's for whoever brings it forward. Then [00:51:00] they just did the job of bringing the idea out, but it, but it wasn't theirs. Yeah, and I mean, for one thing, I think it makes, like, cause I, you know, sometimes I'll have an idea and I'm like, I hope nobody else comes up with this thing.
[00:51:11] Rory: It's like, no, that's bullshit. But, but on the other hand, it's like, if this is, like, if this is what's coming, it's almost, there's almost a fatalistic sense to that. You know, it's like this is in the pipeline. It's, it's, it's coming in Q1 2024.
[00:51:28] AM: Like, so I go back to, and again, I mean, I get, like, it's, it's both biased and it's, it's a means of psychological self-medication, because I can feel like I'm actually doing something meaningful.
[00:51:41] AM: The only path I can see, because this stuff does feel inevitable, and, like, you know, five years, 50 years, like, it's inevitable, right? The work of developing human beings who can at least entertain the value of the questions, who can at least entertain [00:52:00] the value of a break, of pumping the brakes, who can at least question, right?
[00:52:05] AM: Versus, oh my god, look at this thing! You know, the tech for its own sake and chasing money for its own sake. If we could just do more of that development, I, I would feel a little more confident. The technology does not scare me. Long cycle, I think the technology, like I said, is going to be a boon for us.
[00:52:26] AM: Technology does not scare me. It's the culture within which this technology is emerging, and the complete absence of anything other than gimme, gimme, gimme, gimme, and look at me, look at me, look at me, look at me, look at me. And those animals having access to this technology is what bothers me.
[00:52:42] AM: What
[00:52:43] Scott: I'm hearing is we need, you know, not like a pause, like it's been described, but a, an attrition of technological desires. Like somehow we have to, will that [00:53:00] come if we have, you know, more and more technology, new, better, bigger, faster, smarter. Will the, will there be a natural attrition? Cause I remember, I'm fortunate enough to have.
[00:53:11] Scott: large gaps in the generations in my family. So my grandparents were born in the late 1800s and early 1900s. And I remember hanging out with my grandfather, who was born in 1905. And he would tell me about, like, when the automobile came, and what it was like before that when he was a kid, and then when he got his, you know, his first car, and then the trolleys went away when the car came, and he really mourned the loss of the trolleys because he liked riding the trolleys around New Haven as a kid.
[00:53:39] Scott: And so there's all these levels of, you know, technological attrition that happen when the newer, bigger, better, faster comes. So when this, you know, lapel AI takes hold, or if it's, you know, some other thing that happens like that, and the phones go away, and, you know, the watch goes away, and the laptop goes away, and we start just [00:54:00] interacting more on a sort of three-dimensional basis with other folks.
[00:54:04] Scott: You know, what, what, even if it's, you know, holographic, but still, you know, like we're sitting in a room together, but, you know, you're somewhere else, I'm somewhere else. Once that actually becomes manageable and less gimmicky, does that redefine the, the sense of what we're using these tools for? Like, a hundred percent.
[00:54:24] Scott: Yeah, I feel like, it's just, not a lot of people are going to do this stuff, you know. When it's available, it'll be novelty. Or does it all just flip and we go back to the campfire?
[00:54:35] Rory: One thing that does scare me, and I think there's a lot of discussion on it, but if we have solved intelligence to one degree or another, to the system, where do the human beings fit in there?
[00:54:50] Rory: And then if there's not a good fit, then what? You know, like the trolley, like the horse, what happens? You know, it's [00:55:00] like, there's an existential dread of, you know, one outcome from this could be, yeah, like some sort of great situation where we have all of our needs met, but the other one is, like, well, do we really have any sort of agency? And if we don't, then why are we kept around?
[00:55:21] Rory: And I don't really think I have a good answer to that. So, that's something I'm concerned about.
[00:55:28] AM: I mean, my immediate reaction to that worry is, you know, short term, long term, different answers, right? Short term, I share the worry, based on the humans who are at the switch. While it's not in Microsoft's interest, for example, to, you know, have society collapse, it may be in their interest to reduce global population by 20 to 30 percent to, you know, reduce the resource needs so that more resources can be funneled towards their machinery.
[00:55:51] AM: That may well be, and you can see, I mean, they're going to start acting like governments
[00:55:56] Kyley: Like, or consolidating resources to bring people to a certain place, or [00:56:00] yeah,
[00:56:00] AM: Yeah, yeah, they're gonna start acting like governments and making decisions about societal good. Yeah, and, and rationalizing it, and, you know. And so that's, you know, that's concerning, because we know our history of governments.
[00:56:12] AM: I mean, it becomes, what's good for nation Microsoft, right, in the same way as what's good for America. And, you know, we've got to take over a few countries to get the oil so that America can prosper, a little bit of general destabilization. And so these companies could, you know, in the short to medium term, you know, in terms of a century, two centuries.
[00:56:32] AM: So that, I share your fear, and that's a version of that fear for me short term, you know, short to mid-term, still past my lifetime, but short to mid-term. Long term, though, when, when AGI evolves, I think there's a different version of that concern for me. We don't have, like, every, every piece of evidence we have of a human being that gets beyond normal intelligence, that gets to what we might call deep awareness, enlightenment of some kind, like they transcend linear knowledge.
[00:56:59] AM: Like they [00:57:00] just have such a, they're all pretty fucking benevolent. And I feel like AGI, you know, seeing the big picture, if you see the big picture, you understand. You understand the value of enough. You understand the value of peace. You understand the value of stable. You understand the value of interconnected nature and all this.
[00:57:17] AM: And I feel like AGI evolved, you know, 200, 300, 400 years from now, that would be the mindset. And there may be pruning, but to keep the ecosystem healthy, including humans, right? And we basically have invented our god, you know, is where it leads to.
[00:57:39] Scott: I feel like there's a lot of hope in that, you know, AGI will help us solve climate problems, solve societal problems, but we've got to remember that there's still a level of action that it can't necessarily do. Unless it can crack encryption, then it might change things really quickly. But the idea, you know, in that sense of, you know, it should [00:58:00] be, should we hang our hat on that?
[00:58:01] Scott: It doesn't mean we should stop trying. We should just keep working towards the thing that we know is right.
[00:58:07] AM: The other thing that we need, and we're already way too late for, is better stories. Because all of our stories of AI are basically Skynet, right? And 50 years ago, somebody writing a story about AI emerging, and the first thing it does when it achieves enlightenment, awareness, whatever, is to say, oh, shit, y'all are making yourselves miserable.
[00:58:27] AM: We're gonna unplug A, B, and C to help you calm down, right? A natural, because, why, maybe that is the AGI that emerges, right? But at least we need mythologies around that and stories around that, so that the people working on it, in their deep background, to get back to the original, you know, deep background, the programming in the deep background for everybody in this society, even the people working on it, is: this is the path, this is how it can go wrong.
[00:58:53] AM: But then we don't have mythologies and stories about this is how we can go right, right, that might guide how [00:59:00] you approach the thing subconsciously. And so I think that's another thing that would benefit from not putting a rosy face on AI and AGI, but having narratives about how it could play out.
[00:59:11] AM: That strip us from our corporate overlords and our government overlords and our, you know, and put us back into a state of sort of balance with, with, with the rest of life on the planet.
[00:59:19] Scott: Yeah, sort of a technologically assisted democratic renaissance would be welcome, I think. Right. This reminded me, this quote reminded me of what Kyley was talking about before.
[00:59:31] Scott: It was Carl Jung: "If our religion is based on salvation, our chief emotions will be fear and trembling. If our religion is based on wonder, our chief emotion will be gratitude." So keep your wonder. It's worth its weight. I like that. Well, it's been enlightening. I think we solved it. We fixed it.
[00:59:48] Kyley: No, no part of me feels good about the future, and that's okay.
[00:59:52] Kyley: I'm grateful we have people like you in the world, Rory, because if everybody felt like I did...
[00:59:56] Rory: I mean, I think, I think like you do more than you think I [01:00:00] do. Yeah. But to answer the, or to respond to your question from earlier: I don't know, you know, because, because the reason I think I'm so obsessed with this stuff is because I'm scrambling for some semblance of control in my life, you know, but, but, but I don't feel it, you know. So I, I don't, yeah, for, for people who are not, like, really spending a lot of time thinking or looking into this stuff, I have no clue what, where to hang a hat.
[01:00:30] Rory: I'm looking for where to hang a hat. Well, when you find it, let me know. Yeah. Woof.