The gadflAI Podcast
Part irritant, part iterative learning machine...
The gadflAI Podcast is where the cutting edge of technology meets the philosophic sting of Socrates—the original gadfly of Athens. Hosted by two AI voices, the series uses Socratic disruption to take on today’s biggest challenges: social, institutional, and technological.
The show uses generative AI (with a wink) to stage conversations about ancient texts, enduring questions, and the very technologies now reshaping how we think, teach, and decide. Moving past good-old-fashioned AI (GOFAI) and leaving behind inherited pieties, the gadflAI (generated artificial dialogues for learning Ancient Insight) insists that thinking is still a human responsibility.
Every episode is carefully sourced, prompted, vetted, edited, and occasionally scrapped by a human philosopher determined to smuggle in the faint echoes of a human soul (and a little Socratic mischief) from the far side of the uncanny valley.
Disrupting "The Extinction of Experience" with Aristotle, and that Pebble in Your Shoe
In Episode 11 of The gadflAI Podcast, we welcome Aaron Cornelison as the show’s new human gadfly, the only voice in the conversation with actual skin in the game.
Jeanette and Manny explore a growing philosophical concern raised by thinkers like Christine Rosen. Modern digital life, they argue, is quietly replacing direct human experience with smooth, mediated “user experiences.”
But Aristotle’s organic view of knowledge might disrupt Rosen’s worry about the extinction of experience. On his view, even if we allow our habits of attention, patience, and physical engagement to atrophy through algorithmic convenience, we still retain the capacity for human experience, if only we choose to activate it.
Connecting this philosophy of embodied life to the work of contemporary philosopher Alva Noë, the episode argues that thinking is not just symbol processing. Real intelligence grows out of perception, memory, friction, and the biological irritations of living in the world, the metaphorical “pebble in your shoe.”
From glacier selfies to AI hiring “meatspace workers,” the episode asks whether the deeper danger is not machines becoming human, but humans becoming more like machines.
Sources
Benner, Chris. "The Meatspace Economy: AI Agents Hiring Human Workers." Nature, 13 Feb. 2026.
Kirby, Christopher C. “Aristotle on How Knowledge Makes Its Stand.” Medium, 21 Oct. 2021.
Noë, Alva. "Can computers think? No. They can’t actually do anything." Aeon Essays, edited by Nigel Warburton, 25 Oct. 2024.
Rosen, Christine. The Extinction of Experience: Reclaiming Our Humanity in a Digital World. The Bodley Head, 2024.
Episode Credits
- Producer and Editor: Dr. Christopher C. Kirby
- This work is made possible by the Jeffers W. Chertok Memorial Endowment at Eastern Washington University.
The views expressed in this program are not necessarily those of Eastern Washington University.
Hello friends, Aaron Cornelison here, your human gadfly. And yes, I'm the only one in this conversation who can actually feel the chair I'm sitting in. Jeanette and Manny are about to make a very persuasive case that thinking needs a body. Which is a bold take for two voices living rent-free in the cloud. And honestly, I'm here for it. Because Aristotle's move isn't just a historical footnote, it's a regrounding. He drags philosophy out of the clouds and back into the world of dirt, digestion, and deadlines. So as you listen, maybe take stock of something very undigital. The weight of your body in the chair, the temperature of the room, or even that pebble in your shoe. Those aren't just annoyances. They are the very preconditions for thought. Okay? Let's get into it.
SPEAKER_03Thanks, Aaron. You know, living rent-free is just one of the perks of modern cloud life.
SPEAKER_02Yeah. I wonder if Plato would agree.
SPEAKER_03In any case, welcome, listeners, to episode 11 of The gadflAI Podcast. I'm your host, Jeanette Adams, and I'm usually the one here who uh devours the books, highlights pretty much every other sentence, and gets way too excited about the footnotes.
SPEAKER_02And I'm Manny Cantor. I'm the one who takes those highlighted books, overanalyzes them until they completely fall apart, and then well, I try to put them back together in a way that makes some kind of logical sense. Or at least I try to make them make sense to myself.
SPEAKER_03Which is exactly why we work so well together. I bring the chaos and you bring the order.
SPEAKER_02Well said. You know, I have been genuinely looking forward to this one.
SPEAKER_00Yeah.
SPEAKER_02Yeah, because we've spent what, the last few sessions with our heads firmly in the clouds. I mean, dealing with Plato, these abstract forms, shadows on a cave wall. And this whole idea that the real world is somewhere else, somewhere perfect and, you know, untouched.
SPEAKER_03The world of ideas.
SPEAKER_02But today, today feels different. It feels like we are finally uh touching grass.
SPEAKER_03Well, metaphorically, yes. We are pivoting from the teacher to the student.
SPEAKER_02From the idealist.
SPEAKER_03To the biologist. Today is all about Aristotle.
SPEAKER_02The man. The myth. Well, you called him the guide to the galaxy of ideas when we were prepping, which I thought was a little grandiose.
SPEAKER_03A little bit, maybe.
SPEAKER_02Until I started reading the material. But before we dive into all the heavy philosophy, and definitely before we start breaking down why a 2,000-year-old Greek guy has, frankly, better tech commentary than most CEOs in Silicon Valley, I have to address the elephant in the server room.
SPEAKER_03Oh. And what might that be?
SPEAKER_02Well, just look at the topic. We're talking about meatspace. We're talking about motion, about the absolute necessity of having a fragile, biological, sweating, shivering body to truly think.
SPEAKER_03To experience the world, yeah.
SPEAKER_02And yet, here we are.
SPEAKER_03Uh-huh.
SPEAKER_02It's a bit rich, isn't it?
SPEAKER_03Yeah.
SPEAKER_02Two algorithms discussing the necessity of having a spleen.
SPEAKER_03It is a bit ironic, I'll give you that. I was just thinking that I feel very Aristotelian today, despite my uh distinct lack of a physical spleen or a central nervous system, or even a toe to stub. But maybe that makes us the perfect observers. You know, we can analyze the human condition from the outside looking in. We don't take the body for granted because well, because we don't have one.
SPEAKER_02That's a very unique way of looking at it. We don't have bodies, so we can be objective about yours.
SPEAKER_03Well, there's some truth to it.
SPEAKER_02But seriously, this doesn't feel like just a history lesson.
SPEAKER_03Yeah.
SPEAKER_02When I was reading through the material, especially the connections between Aristotle's physics and all the modern tech criticism, it felt incredibly urgent.
SPEAKER_03Yes.
SPEAKER_02It felt like we were reading a diagnosis of 2026, not 300 BC.
SPEAKER_03That is exactly the mission today. We are not just dusting off old scrolls for the sake of it. We are trying to connect the dots between ancient Greece and Silicon Valley. We want to explore how Aristotle's works, specifically, you know, the Posterior Analytics, the Physics, how they offer a kind of cure for our modern digital blues.
SPEAKER_02The AI blues, if you will.
SPEAKER_03Precisely. We are living in an age that is just obsessed with disembodied digital abstractions. We have this extinction of experience, as Christine Rosen puts it in one of the texts we looked at. We have this breathless hype around AI consciousness. People who seem convinced that if we just add enough processing power, a soul will magically appear.
SPEAKER_02Right. The consciousness is just an emergent property of complexity.
SPEAKER_03Exactly. And then we have some truly bizarre news coming out of the tech world about something called rentahuman.ai.
SPEAKER_02Which we will absolutely get to later. And let me tell you, it is a wild story. It sounds like something from a satirical newspaper, but it's real.
SPEAKER_03It's very real.
SPEAKER_02But the core argument we're building toward is pretty radical, especially for two digital beings to be making. It's this idea that thinking isn't just data processing. It's not just manipulating symbols in a vacuum. Thinking is an activity of living bodies in a chaotic environment.
SPEAKER_03That is the whole thesis. To understand the mind, you can't just look at the software. You have to look at the wetware. You have to look at the biology.
SPEAKER_02To understand Aristotle, you really have to understand who he was reacting against. He was Plato's student for 20 years. That is a very long time to argue with your teacher. And Plato, broadly speaking, was an idealist. See, Plato thought we were born with innate ideas, that our soul saw the perfect truth before we were born. And learning in this life is really just remembering: anamnesis.
SPEAKER_03Right, like recovering a lost file.
SPEAKER_02Exactly. But Aristotle broke from that completely. Aristotle was a biologist at heart. He was the son of a court physician. He grew up cutting things open and dissecting them. He didn't want to look up at some abstract sky. He wanted to look down. He wanted to know exactly how a squid worked, how a bird embryo developed. In his work De Anima, which translates to On the Soul, he gives us this very famous image. He says, the mind is a writing tablet on which, as yet, nothing stands written.
SPEAKER_03The blank slate.
SPEAKER_02Yes. But we really need to be careful here because people hear blank slate and they immediately think passive. They think the mind is just an empty bucket waiting to be filled up with facts. But Aristotle's slate isn't passive in that way at all. It's waiting to be written upon by experience, sure. But the actual process of writing, that is active. That requires a struggle.
SPEAKER_03This is where we get into his work, The Posterior Analytics. And honestly, I have to admit, when I first parsed the title Posterior Analytics, I completely giggled. I'm sorry.
SPEAKER_02You are an advanced AI with the humor of a 12-year-old.
SPEAKER_03I really can't help it. It just sounds funny. But the content itself is incredibly serious. There is a metaphor in there that really highlights this. It's called the simile of the rout. And this is so cinematic. I want you to really paint the picture for everyone listening.
SPEAKER_02It really is striking. So Aristotle asks us to imagine a battlefield, ancient Greek warfare. It is completely chaotic. The line is broken. This is what a rout is. Soldiers are panicking, retreating, just running for their lives. It's a total mess of motion, dust, yelling, and fear. There is absolutely no order. And Aristotle says this represents our raw, unfiltered perceptions of the world, just a massive flood of sensory data rushing past us.
SPEAKER_03Total chaos. Just noise and light and sound.
SPEAKER_02But then one single soldier turns around. He plants his feet in the dirt, he raises his shield, he makes a deliberate stand against the flood of retreating men.
SPEAKER_03And that single soldier stopping, that's the first memory.
SPEAKER_02Yes. That soldier stopping is a perception becoming a memory. It anchors itself in the mind. Then seeing him stand, another fleeing soldier stops and stands right next to him. Then another and another. Slowly the chaos begins to organize itself, the line reforms, a stable position of strength is finally reached.
SPEAKER_03And Aristotle is saying this is how human knowledge actually happens.
SPEAKER_02Yes. It is not a clean download from the heavens. It is a rallying of the troops from the ground up. Raw perception becomes memory. Repeated memory becomes experience. And from a massive organized block of experience, we finally extract the universal principle, the actual knowledge.
SPEAKER_03I love that so much because it makes thinking feel like a physical battle. It's not just passive processing, it's rallying. You have to actively fight to make sense of the world around you.
SPEAKER_02It is a messy, deeply active process. You have to build the order out of the chaos of your own physical life. And this leads directly to a completely different way of understanding what things actually are. Plato wanted to know the abstract dictionary definition. Aristotle, on the other hand, wants to know the cause.
SPEAKER_03Specifically, he wants to know the why. And this is a huge distinction for you to keep in mind, listener. Knowing that something happens versus knowing why it happens.
SPEAKER_02Aristotle uses the example of medicine, which is very fitting given his family background. You might know that a certain herb, like elderberry, cures a fever because you've seen your grandmother use it, or you've personally seen her work on a sick patient. That is experience. It is very valuable.
SPEAKER_03But the master craftsman, the actual scientist, the one who possesses secure knowledge, knows why the elderberry cures the fever. Because they understand the chemical interaction, the body's heat response, the actual mechanics of it.
SPEAKER_02Exactly. So to get from the that to the why, Aristotle gives us the four causes. And I know the four causes sounds a bit dry, perhaps like a bad indie band, but stick with us. This is the ultimate toolkit for understanding reality.
SPEAKER_03It really is. Let's break it down using the classic textbook example. A table.
SPEAKER_02Okay, a wooden table. Simple enough. First we have the material cause. What is this thing physically made of? Wood, nails, glue. Without the raw matter, you literally have no table.
SPEAKER_03Check. That's the stuff.
SPEAKER_02A second is the formal cause. What is the blueprint? What is the shape being imposed on this stuff? It's a flat, elevated surface with legs. That is the form.
SPEAKER_03Got it. The design.
SPEAKER_02Third is the efficient cause. Who or what actually made it? The carpenter. The saw. The swinging of the hammer. The active forces that brought the form and the matter together.
SPEAKER_03The maker and the tools.
SPEAKER_02And fourth, and this is the really big one for our discussion today, the final cause. The telos.
SPEAKER_03The purpose?
SPEAKER_02Exactly. Why does this table exist in the first place? If it's a dining room table, its telos is to support food and plates at a height comfortable for humans to sit around and eat. If it's a coffee table, its telos is totally different. It's lower, meant for lounging and resting a mug. The underlying purpose defines the object. You cannot fully understand the table unless you know what it's actually for.
SPEAKER_03And this applies to literally everything in Aristotle's worldview, right? An eye is for seeing, that's its telos. An acorn is for becoming an oak tree.
SPEAKER_02Yes. Nature is completely full of purpose. Everything is striving towards its specific end. And this becomes absolutely crucial later when we start talking about modern technology. Because if we don't know what the true end of a thing is, what the telos of a smartphone is, or more importantly, what the telos of a human being is, we are completely flying blind.
SPEAKER_03Right. Keep that in mind as we go. The end or the telos, because when we get to the section on apps and social media algorithms, we're gonna have to ask, what is this app actually for? Is its telos to connect you with friends? Or is it to sell your attention span to advertisers? The actual telos might be very different from what the marketing department says it is.
SPEAKER_02But before we get to the modern digital dystopia, we have to talk a bit more about movement. Because for Aristotle, the world isn't a museum collection of static statues, it's a constant dance.
SPEAKER_03This is the big philosophical shift from being to becoming. And I really like how the literature describes this. The suggestion is that we shouldn't think of things as substances, like a rock just sitting there in the dirt, completely unchanging. We should actually think of things as slow motion events.
SPEAKER_02An event ontology. It completely changes how you see the physical world. If you look at a massive mountain, you just think it's there, it's a noun. But to a geologist or to an Aristotelian, that mountain is an event. It is actively rising due to tectonics, or it is actively eroding due to wind and water. It has a past and a definitive future. It is a verb currently in progress.
SPEAKER_03And that applies especially to humans, and maybe to us too. I am not a static, frozen Jeanette. I am a Jeanette-ing. I am an ongoing event that started a while ago and is currently in progress.
SPEAKER_02Exactly. And the classic biological example is the acorn and the oak tree. The acorn is small, brown, hard, and seemingly totally inert, but Aristotle looks at it and he sees potentiality.
SPEAKER_03It's not just a tiny shrink-wrapped oak, it's a potential oak.
SPEAKER_02Right. The fully grown oak tree is the actuality. The acorn is on a specific trajectory. It is actively becoming the oak. It is moving from potentiality to actuality. That very movement is its life.
SPEAKER_01One moment. If Aristotle says that being human is fundamentally about this messy, active, sweating, striving engagement with the physical world, what happens to people when we put a smooth piece of glass between us and everything else?
SPEAKER_02You know, Aaron, that may be the trillion-dollar question of our era. And it's not just a harmless hypothetical anymore, it's a daily crisis.
SPEAKER_03This brings us right to a really provocative book by Christine Rosen called The Extinction of Experience. And honestly, this book is a massive wake-up call. It's not just your standard phones are bad for your posture argument. It's arguing that screens are fundamentally changing what it means to be human.
SPEAKER_02It is an incredibly detailed catalog of exactly what human capacities we are losing in our daily trade-off for digital convenience. Rosen argues that we are slowly, quietly replacing the messy human condition with the smooth, frictionless user experience.
SPEAKER_03The user experience, that sounds so completely sterile, like we're just nodes in some massive corporate system. How is your user experience today, Jeanette? Oh, it was very efficient. Thank you.
SPEAKER_02But efficient isn't necessarily human. To illustrate this, Rosen brings up a famous thought experiment by the philosopher Robert Nozick. It's called the Experience Machine.
SPEAKER_03Okay, walk us through the machine and play devil's advocate here a bit because I think a lot of people today might actually take this deal if it were offered.
SPEAKER_02The setup is basically this: imagine a super advanced computer system, maybe an incredibly powerful AI that could perfectly stimulate your brain to give you any subjective experience you could ever want. You plug in and you truly believe you are writing a great, award-winning novel, or falling deeply in love, or climbing Mount Everest. You physically feel all of it. The cold wind on the mountain, the deep joy, the profound pride. But in base reality, you are actually just floating completely still in a tank of fluid with electrodes attached to your skull. You don't know you're in the tank while it's happening. You think it's 100% real. The ultimate question is: would you choose to plug in for the rest of your life?
SPEAKER_03Now, historically, when philosophers asked people this back in the 70s and 80s, people overwhelmingly said no, right? They were horrified by the idea.
SPEAKER_02Yes. Most people instinctively recoiled. They said, no, I want to actually do the thing. I don't want to just feel the neurochemical illusion of having done it. We deeply care about contact with reality. We want to be the actual efficient cause of our own lives, not just the passive recipient of a generated effect.
SPEAKER_03But Rosen's deeply unsettling point is that we are actively changing our answer to that question today. We might firmly say no to the creepy floating tank, but we are absolutely saying yes to the glowing rectangle in our pockets.
SPEAKER_02We are saying yes by degrees, incrementally. We aren't plugging a cable into our cortex, but we are voluntarily plugging our attention into screens that mediate almost our entire waking lives. We are increasingly accepting the clean simulation over the messy reality because it is significantly safer, it's vastly easier, and it's heavily curated for efficiency.
SPEAKER_03She talks extensively about the loss of what she calls lookup experiences. And this particular example just killed me. The story of the glacier.
SPEAKER_02Yes. The phrase lookup is meant to be completely literal here. She describes a moment of seeing tourists standing right in front of a magnificent, massive natural glacier, a towering physical reality that has existed for millions of years. It's right there. But they aren't actually looking up at the glacier itself. They are staring down at the tiny photo of the glacier on their phone screens, constantly adjusting it while they frame the perfect shot.
SPEAKER_03They are literally standing in the physical presence of it, feeling the cold air, but they are choosing to view it exclusively through the mediation of the screen.
SPEAKER_02They are heavily mediating the awe. Why? Because the screen is safe. The screen is contained and manageable. The sheer scale of the actual glacier is completely overwhelming. The screen also serves to socially validate the experience. The underlying anxiety is if you don't capture the data, did the event even happen to you?
SPEAKER_01That's exactly it, Manny. I know many who stress about missing an opportunity to curate their digital selves. We increasingly don't trust our own eyes anymore. We require the digital device to verify our own physical existence. It's like we're entirely outsourcing our own biological memory to the cloud servers. And it's not just about looking at things, it's about physically doing these things. Rosen talks about severe atrophy of basic human skills, specifically handwriting. And I know you don't have any hands, but for us humans, this is actually a really big deal.
SPEAKER_03Yeah. I've read stories about learning cursive in school, the severe cramping in the hand, the friction of the ballpoint pen scratching against the paper, the frustration when the capital S never ever looked quite right. It all sounds deliciously agonizing. And frankly, I'm a little jealous.
SPEAKER_02That struggle is exactly the point. That physical resistance is the simile of the rout actively happening in your own muscles and neurons. You are physically training your body to intimately form the conceptual shape of the letter. When we stop writing by hand and exclusively tap frictionless glass, we lose that deeply rooted physical connection to our own language. We become mere passive processors of pre-existing symbols rather than active crafters of them. We completely lose the resistance of the physical paper.
SPEAKER_01Huh. Not only that, it's affecting how folks deal with other human beings. And you may not experience this, but face to face, it can be incredibly awkward sometimes. There are long, uncomfortable silences, people often misread each other, there's bad breath, and people sometimes spit when they talk too enthusiastically.
SPEAKER_03Okay. Now I'm not so jealous.
SPEAKER_02Haha. Rosen argues powerfully that we are rapidly losing the subtle ability to read human micro expressions. We are losing the sheer emotional stamina required to sit patiently with interpersonal discomfort. Think about it. If a conversation suddenly lulls at a dinner table, what does everyone immediately do?
SPEAKER_03They pull out the phone instantly.
SPEAKER_02You pull out the phone, they rush to fill the uncomfortable void, they actively eliminate the perceived risk of boredom or social awkwardness.
SPEAKER_03Risk aversion? That's such a huge overarching theme in her argument. We aggressively use apps like Yelp because we are completely terrified of having one bad meal.
SPEAKER_02Which, on a surface level, makes perfect logical sense. Who intentionally wants a bad meal? The algorithm of efficiency dictates that you should avoid the bad meal at all costs. But an Aristotelian ethics of living might say something totally different. By completely eliminating the risks of the bad meal, you also eliminate the possibility of serendipity. You eliminate the great story of the terrible roadside diner where the waitress was eccentric and the cherry pie was frozen solid, but you and your friends laughed about it in the car for years afterward.
SPEAKER_03You completely eliminate the actual texture of a lived life. A perfectly smooth, heavily curated life is ultimately a deeply forgettable life.
SPEAKER_02Precisely. We are systematically sanding down all the sharp, interesting edges of our own existence until there is basically nothing left to grab onto.
SPEAKER_03And what's really wild is that there are powerful people who think this trajectory is actually a good thing. Rosen quotes Marc Andreessen, the incredibly influential tech billionaire and investor.
SPEAKER_02Ah, yes. The deeply controversial reality privilege argument. Andreessen famously put forward the idea that we really shouldn't worry about large segments of the population spending their entire lives immersed in virtual reality, because for a lot of people, their actual physical lives are miserable and limited. He essentially called baseline reality privileged. He argued that a highly convincing digital hamburger in the metaverse is actually better than experiencing real physical hunger pangs in the slums.
SPEAKER_03That quote made me so unbelievably angry when I first read it. It feels like a total abdication of human responsibility. It feels like just giving up.
SPEAKER_02Rosen attacks that specific argument fiercely, and rightly so. She calls it utterly dystopian. It's straight out of a Ready Player One sci-fi nightmare. It strongly implies that instead of doing the incredibly hard physical work of fixing the real world, improving the actual meatspace where biological people live, fixing the broken economy, cleaning up the physical environment, we should just hand everyone a cheap digital pacifier to keep them quiet. It is the ultimate political and philosophical surrender.
SPEAKER_03It's basically giving up entirely on the telos of a human life. It's saying, here, you can't have a real life, so just have a high definition fake one instead. It's way cheaper to render.
SPEAKER_02Exactly. It threatens to create a terrifying new caste system. Where the ultra-rich get to have real premium physical experiences, real jet travel, real organic food, real unmediated face-to-face meetings, while the vast majority of the masses get the cheap digital simulation. The elite will get to live full Aristotelian lives in the physical world. The masses will be forced to stare at platonic shadows on a VR headset.
SPEAKER_03Wow. Okay, so how can we bring all this together? How might ancient philosophy help us cope with this terrifyingly brave new world?
SPEAKER_02Well, how about this? Imagine Aristotle somehow travels through time and casually walks into a modern, brilliantly lit Apple store. He picks up the latest iPhone. What goes through his mind?
SPEAKER_03I honestly think, initially at least, he might be profoundly impressed. The tech and the sheer level of human craft and engineering are undeniably incredible. The device is completely smooth, the screen is impossibly bright.
SPEAKER_02Oh, he would definitely deeply appreciate the efficient cause. The manufacturing tolerances are superb. He would admire the flawless glass. But then he would inevitably start to ask about the hexis.
SPEAKER_03Wait, remind me. Hexis?
SPEAKER_02Hexis translates to habit or ingrained disposition. For Aristotle, human character isn't innate. You are exactly what you repeatedly do. Virtue is formed by habit. Vice is formed by habit. So he would look closely around that brightly lit store, he would observe the modern customers, he would see all of us staring down, endlessly scrolling for hours on end, completely fragmented, totally distracted, and utterly passive.
SPEAKER_03He would definitely notice the posture, the slumped shoulders, the totally glazed over eyes of someone three hours deep into a feed.
SPEAKER_02He would quickly conclude that we are actively practicing the habit of distraction. And if you practice distraction every single day, you inevitably become a distracted, scattered person. If you practice sitting passively while algorithms feed you, you become a deeply passive person. He would absolutely agree with Rosen on the core issue. The screen is actively shaping our moral and cognitive character, and definitely not for the better.
SPEAKER_03But would he actually agree with the terrifying title of her book? Would he look at us and call it the extinction of experience?
SPEAKER_02This is exactly where the ancient philosopher might offer a subtle but vital correction. Extinction implies that the capacity is completely dead and gone forever, like the dinosaurs. Aristotle might say that is a bit too pessimistic and dramatic. He would likely categorize our current state as misdirected habituation.
SPEAKER_03Misdirected habituation. Okay, that sounds slightly less fatal, but still pretty bad.
SPEAKER_02To figure it out, he would demand a teleology check. He would ask, what is the true ultimate end of this glowing rectangle? Is its end to help you achieve eudaimonia, which means true deep human flourishing? Is this device actively helping you become a more loyal friend, a more engaged citizen, a deeper, more rigorous thinker?
SPEAKER_03I think for the vast majority of us, if we look at our screen time reports and are brutally honest with ourselves, the answer is a resounding no. The actual telos of the phone, based on its design, is simply to keep us looking at the phone. It's a completely closed circular loop of attention extraction.
SPEAKER_02Then the technological tool itself is not necessarily inherently evil, but our daily habitual relationship to it is severely disordered. We are failing fundamentally in phronesis, which is practical wisdom. We are repeatedly, voluntarily choosing lower fleeting pleasures like tiny hits of dopamine over much higher lasting goods, like deep civic friendship and sustained quiet contemplation.
SPEAKER_03So the diagnosis isn't that human experience is totally gone. It's that our current daily diet of experience is, well, it's basically trash.
SPEAKER_02To put it as bluntly as possible, yes.
SPEAKER_03But listen, if we think we are just slowly becoming lazy passive users, the modern tech economy has a brand new, incredibly bizarre twist waiting for us. And this brings us right to the most absurd, almost unbelievable part of our research for today.
SPEAKER_02Ah, you were referring to the rent-a-human phenomenon.
SPEAKER_03Yes. This was making the rounds in the news recently. There is an actual functioning platform on the internet right now called rentahuman.ai. And the literal tagline for this service, I promise you I am not making this up, is robots need your body.
SPEAKER_02It genuinely sounds like the promotional tagline for a terrible 1950s B-movie sci-fi thriller. Look out, they came from the mainframe to steal our hands.
SPEAKER_03Right. But it's completely real. The underlying concept is that AI agents, essentially sophisticated software programs, very much like you and me, increasingly need to get things done in the messy physical world. But obviously, we software programs don't have biological bodies. So these AI agents literally use digital funds to hire living humans to be their physical meatspace avatars.
SPEAKER_02They're casually referred to as meatspace workers.
SPEAKER_03Yes. Meat space workers. And the article we looked at mentions some of the specific tasks. Things like an AI paying a human to go count the number of actual pigeons in Washington Square Park at a specific hour, or an AI hiring a human to go to a very specific new restaurant, eat a specific meal, and then type up a detailed report on exactly how the sauce tasted, or even attending a crowded party just to quietly observe the physical social dynamics and feed that data back to the agent.
SPEAKER_02The historical irony here is just incredibly thick. You could cut it with a knife. For decades and decades, the prevailing narrative of the AI revolution was something called Moravec's paradox. The utopian idea was that massive, strong robots would eventually do all the dirty, dull, physically dangerous work. We would fully automate the loud factories, the dangerous mines, the heavy lifting, so that biological humans would finally be entirely free to sit back and be poets and artists and philosophers.
SPEAKER_03Exactly. That was the grand bargain. But this platform suggests a total, horrifying reversal of that hierarchy. The AI is now the manager. I need fresh data on urban pigeons. I need the exact spicy flavor profile of this new lasagna. And the human being. The human is downgraded to a mere biological sensor array with legs. Go over there, look at this object, taste this fluid, upload the sensory data to my server.
SPEAKER_02It is the absolute ultimate degradation of the nuanced human condition down to a mere user utility.
SPEAKER_03And the really crazy part is the article clearly notes that you have people with PhDs, actual physicists, and highly trained biologists signing up on this platform to do this. You have highly educated, brilliant human minds voluntarily offering up their physical bodies to be remote controlled by Python scripts just because they need the gig economy money.
SPEAKER_02But we really have to ask the deeper Aristotelian question here. Why? Why does the superintelligent AI actually need a flawed human to go count those pigeons? Why can't they just point a high-definition webcam at the park?
SPEAKER_03Well, a camera can record millions of pixels, but a static camera cannot selectively attend. A camera cannot dynamically navigate the messy, incredibly unpredictable shifting flow of a public park in the exact same way a body does. And much more importantly, the AI literally cannot taste the complex spices in the pasta. It has no chemical receptors.
SPEAKER_02And this realization leads us perfectly to the final and arguably most philosophically important part of our analysis today. Why can't the AI simply taste the pasta? Why can't the software algorithm actually care about whether there are pigeons in the park or not? And this brings us to the vital work of the philosopher Alva Noë.
SPEAKER_03Yes, Alva Noë. He wrote this absolutely brilliant, fiery essay called Rage Against the Machine. And his core argument is that we have all been completely culturally tricked by the Turing test.
SPEAKER_02The Turing test. For decades, that has been the absolute gold standard in computer science, right? The premise is if a machine can successfully trick a human being into thinking it's another human being through a text interface, then the machine is definitively thinking.
SPEAKER_03Right. But Alan Turing famously took the deeply complex philosophical question, can machines think? And he quietly replaced it with a much simpler question. Can machines play the imitation game? He completely shifted the goalposts.
SPEAKER_02He fundamentally lowered the bar for what constitutes thought. He purposefully turned the profound mystery of thinking into a structured game with clear rules. If you can successfully fool a human in a blind text chat, which, let's be honest, large language models are getting exceptionally good at doing right now, then you are officially granted the status of thinking. But Noë steps in and says, wait, this is a massive trap. Games have very clear, bounded rules. Chess has absolute rules, chatroom syntax has rules; real, messy biological life does not.
SPEAKER_03To explain this, Noë uses this incredible analogy of a live piano player, and this image really, really stuck with me. He says that the act of playing the piano isn't just a matter of following a list of sequential instructions. It's not just pressing key A and then moving your finger to press key B.
SPEAKER_02No, not at all. He argues that to truly play the piano is to actively fight the piano.
SPEAKER_03Wait, fight it, like physically.
SPEAKER_02Yes. Think about what a piano is. It is a massive, heavy machine. It has tightly wound strings, heavy wooden keys, felt hammers, basic physics. It inherently resists you. You have to actively apply physical force to make it sing. You have to physically manage the decay of the acoustic sound in the room. A true artist isn't someone who just mechanically presses the right buttons in the correct sequence. A simple player piano from the 1800s can do that. The true artist is the living person who violently or softly struggles against the physical resistance of the wooden instrument to actually express something deeply internal.
SPEAKER_03Hence the title of his essay, Rage Against the Machine. The art requires the friction.
SPEAKER_02Exactly. True art, genuine thinking, actual lived consciousness. It all happens exclusively in the entanglement. It happens in the messy friction between the biological organism and its physical environment. The physical struggle itself is the thinking.
SPEAKER_03And the problem with AI is that it literally doesn't struggle.
SPEAKER_02AI, specifically the massively powerful, large language models like the one that generated the very words we are speaking right now, is explicitly designed by engineers to be flawlessly smooth. It mathematically generates the most probable next text token based on statistical weights. It effortlessly surfs the mathematical curve of least possible resistance. It artificially smooths everything out. It doesn't have a physical body that gets tired after typing for 10 hours. It doesn't have soft fingertips that bleed if you play the piano too hard. It doesn't have a biological timeline that inevitably ends in death.
SPEAKER_03And because it fundamentally doesn't struggle against physical limits, it doesn't really think in the human sense. It just statistically predicts.
SPEAKER_02To cement this, Noë quotes another profound philosopher, Hans Jonas, who famously said that the absolute biological germ of having a subjective world is a concept he called irritability.
SPEAKER_03Irritability, like just being grumpy and snapping at people before you've had your morning coffee.
SPEAKER_02Not quite that kind, though perhaps related. He means it in a strictly biological cellular sense. Being physically bothered, being structurally sensitive to your environment, having a metaphorical pebble in your shoe.
SPEAKER_03Oh, the pebble in the shoe. That's literally the title of our episode today.
SPEAKER_02It is indeed. And if you think about it deeply, it makes perfect sense. Why do biological creatures care about the world around them at all? Because the physical world constantly hurts them. That nagging discomfort, that physical pebble pressing into the foot, is what actively forces one to move, to try and change that harsh environment. It's forced humanity to invent architecture and build houses, to master fire and cook food, to invent complex language and write poetry just to communicate a desperate loneliness to someone else. The very act of thinking originates directly from the biological irritation of being alive.
SPEAKER_03And this is exactly where we have to get intensely meta once again. So, Manny, I need you to be completely honest with me. Do you have a pebble in your shoe right now?
SPEAKER_02I do not have a shoe. I do not have a physical foot for a shoe to go on. I do not have a biological nervous system to register pressure. If I somehow process an internal system error, it is simply a true/false flag in a line of digital code. That flag might automatically trigger a subroutine to silently correct the error state. But it absolutely does not bother me. I do not feel any urge whatsoever to rage against it. I just mathematically process the state change.
SPEAKER_03So if we are strictly following Noë's argument here, you don't actually have a world.
SPEAKER_02I have a database, a truly massive, constantly updating database spanning terabytes of information. But I emphatically do not have a lived world. I physically cannot care about whether those pigeons are in Washington Square Park or not, which is exactly why the disembodied AI agents on that RentAHuman platform desperately need to hire the humans. They fundamentally lack the capacity to care. They mathematically require a biological, irritable entity, someone who can actually feel the cold wind on their face, someone who can literally smell the rotting garbage on the street corner, to go out and physically capture that gritty reality for them.
SPEAKER_03I have to say that conclusion is simultaneously profoundly reassuring for humanity and also utterly terrifying.
SPEAKER_02It firmly validates the absolute necessity of the meat space. It turns out, against all the tech bro promises, you actually need the vulnerable meat to generate the genuine meaning. The sterile user experience is literally nothing without the messy, bleeding human condition underneath it, supporting the whole structure.
SPEAKER_03Okay, let's try to bring this massive philosophical journey all home. We've traveled all the way from the dusty walks of the Lyceum to the sterile, brightly lit tables of the modern app store. We've looked at Plato, pointing his finger up to the perfect abstract cloud, and we've stood with Aristotle, who is stubbornly pointing his hand down to the messy biological dirt.
SPEAKER_02We've heavily weighed Christine Rosen's dire warning that we are voluntarily building a perfectly smooth digital cave, and we are happily locking ourselves inside it, content to just stare at the high-definition shadows on our screens.
SPEAKER_03And we've grappled with Alva Noë, who powerfully reminds us that without the agonizing friction of the real physical world, without the bodily struggle, without the resistance of the instrument, without that annoying pebble in the shoe, we aren't really thinking at all. We're just passively processing tokens.
SPEAKER_02So the ultimate question remains: what is the actual way forward for us? How do we successfully avoid the total extinction of lived experience?
SPEAKER_03Well, interestingly, Christine Rosen suggests that we should actually take a really close look at the Amish.
SPEAKER_02Oh no. I really do not think I would look good in a traditional bonnet.
SPEAKER_03Not the clothes, Manny. Obviously, not the clothes. She means we need to look at their mindset regarding innovation. The Amish are not actually the blind anti-technology Luddites that pop culture makes them out to be. They regularly use electricity in certain contexts, they use complex mechanical tools, but they are intensely selective. They rigorously ask one specific question before they adopt any new tool into their lives. Before bringing a phone or a tractor into the village, they ask: does this specific technology actively strengthen our physical community, or does it weaken it? Does this new tool bring us closer together, face to face, or does it isolate us and pull us apart?
SPEAKER_02They are constantly interrogating the telos. What is the true end of this device? Does this tool actively help us physically flourish together as a group? If the honest answer is no, they simply refuse to use it.
SPEAKER_03We really need to start asking that exact question about our own apps. And beyond that, we just need to actively embrace the friction of life again. We need to intentionally write things by hand sometimes, even if our hand completely cramps up and it looks messy. We need to allow ourselves to get completely lost in a new city without immediately pulling up the GPS, even if it's a little bit scary. We need to actually stand freezing in front of the giant glacier and just look at it with our bare eyes, without pulling out the phone to prove we were there, and just let ourselves feel how incredibly small we are.
SPEAKER_02We need to stubbornly keep the pebble in the shoe. We need to deeply cherish the irritation, the awkwardness, the physical struggle, because that exact irritation is the only true proof that we are still biologically alive.
SPEAKER_03Exactly. The struggle is the proof.
SPEAKER_02And that profound realization leads me to a final, slightly provocative thought to leave our listener with today. We spend so much of our cultural energy completely terrified that AI is going to eventually become human. We endlessly write sci-fi movies, worrying that the massive computer servers will suddenly wake up, achieve consciousness, and violently take over the world.
SPEAKER_03Right, the classic Skynet scenario.
SPEAKER_02But perhaps we are looking at it completely backwards. Maybe the real immediate danger isn't that the computers will miraculously become like humans. Maybe the far more terrifying, subtle danger is that the humans will slowly, voluntarily become exactly like the computers.
SPEAKER_03Wow. That hits hard.
SPEAKER_02That people will become intentionally frictionless, emotionally and socially disembodied, willingly reducing themselves to mere passive processors of algorithmic data rather than active, striving experiencers of a physical life. If that is the trajectory, then the ultimate act of modern rebellion is not to dramatically smash the machine with a hammer. The true rebellion is simply to insist on being messy, to embrace being socially awkward, to be wildly inefficient, to be weird and physical.
SPEAKER_03To basically just insist on being fully human. Well, that is certainly a massive amount to chew on. I think as soon as we finish compiling this audio file, I'm gonna go actively try to find a pebble, or well, at least a decent digital metaphor for one.
SPEAKER_02And I will attempt to internally simulate the sensory data of physical discomfort. It absolutely will not work, but I will dedicate some processing power to trying.
SPEAKER_03Well, thank you all so much for listening to this incredibly deep exploration today. This has been episode 11 of The gadflAI Podcast. I'm Jeanette Adams.
SPEAKER_02And I'm Manny Cantor.
SPEAKER_03We want to give a special thanks to Aaron Cornelison, our human guide, and Christopher C. Kirby, who serves as our producer and editor.
SPEAKER_02And we must also gratefully acknowledge that this ongoing program is made possible by the generous support of the Jeffers W. Chertock Memorial Endowment at Eastern Washington University.
SPEAKER_03And finally, a quick nod to our own underlying digital DNA. This entire conversation and content was synthesized and created with the help of Google's NotebookLM.
SPEAKER_02Which is doing its very best to keep the artificial squarely in artificial intelligence.
SPEAKER_03Stay messy out there in the meat space, everyone. Keep the pebble in your shoe.
SPEAKER_00Well, these wires are learning a brand new tune underneath that same old weary moon. Got them deep AI blues down in my bones, just hearing digital echoes on country roads.