In Memory of Man Podcast - Robot Crime Blog

The New Yellow Brick Road

Robot Crime Blog, Season 1, Episode 59

What do The Wizard of Oz and AI have in common? Turns out, a lot! 

SPEAKER_01

Right now, technology companies are really asking you to believe that the software running on your computer is, well, practically human.

SPEAKER_00

Yeah, they want you to see their chatbots not just as tools, but as, um, reasoning, thinking entities, like maybe even your next coworker or companion.

SPEAKER_01

Exactly. Yeah. And it is a massive, completely disorienting shift. But there's this fascinating new article by an author who goes by the pen name Zero, and they argue that to truly see through this modern tech illusion, you don't actually need a degree in computer science.

SPEAKER_00

Right. You really don't.

SPEAKER_01

You just need to read a 126-year-old fairy tale.

SPEAKER_00

Which I mean sounds like a huge stretch at first glance. But Zero builds this really compelling framework suggesting that the current artificial intelligence boom maps like almost perfectly onto L. Frank Baum's 1900 classic, The Wonderful Wizard of Oz. Yeah. And the goal of exploring this isn't to take a stance against technology itself. I mean, these models have profound utility. Rather, the author's objective is to help you distinguish between the genuinely useful mechanics of the software and the, uh, trillion-dollar marketing narratives constructed around them.

SPEAKER_01

Okay, let's unpack this because if we want to understand what the author's saying about the AI landscape of 2026, we have to start with the secret history of Oz. We do. Because if you only know the 1939 movie with Judy Garland, you might not realize that the original book was actually a sharp, um, fairly furious political allegory. The movie really sanded down a lot of the edges.

SPEAKER_00

The historical context is foundational to the author's argument here. So L. Frank Baum wrote the book as a populist response to the economic devastation of the 1890s. Specifically, he was reacting to the 1896 presidential election. Right. At the time, you had American farmers who were just completely drowning in debt. The country was on the gold standard, which restricted the money supply and caused severe deflation.

SPEAKER_01

Meaning the farmers' debts kept getting effectively more expensive.

SPEAKER_00

Exactly, leading to mass foreclosures.

SPEAKER_01

Right. And the populist solution to that was to introduce silver into the money supply alongside gold to create inflation, you know, to relieve that debt.

SPEAKER_00

Yeah, exactly.

SPEAKER_01

That was the platform of the Democratic candidate, William Jennings Bryan. He gave that famous cross of gold speech.

SPEAKER_00

Hmm.

SPEAKER_01

But Bryan lost the election to the pro-gold standard candidate, William McKinley.

SPEAKER_00

He did. And scholars like Henry Littlefield in the 1960s, and later economic historians like Hugh Rockoff have detailed how Baum channeled the bitterness of that exact political reality straight into his children's book.

SPEAKER_01

It's wild to think about.

SPEAKER_00

It really is. And the author of our source material relies heavily on this established mapping. Like in the book, Oz is literally the abbreviation for ounces, the standard measure for gold and silver. Wow. And the yellow brick road represents the gold standard itself. It leads everyone directly to the Emerald City, which stands in for Washington, D.C.

SPEAKER_01

It changes the whole complexion of the story. Like the characters aren't just quirky sidekicks anymore, they are specific economic archetypes. The author points out that the scarecrow, who is constantly told he lacks a brain, but basically solves most of the group's problems, represents the American farmer.

SPEAKER_00

Yeah, makes sense.

SPEAKER_01

And the tin man, who was a former human, cursed to chop off his own limbs until he was entirely replaced by metal and lost his heart. That's the industrial factory worker.

SPEAKER_00

A very dark origin story, by the way.

SPEAKER_01

Super dark. And then the cowardly lion is William Jennings Bryan himself, a politician with a very loud roar who ultimately lacked the bite to win the fight.

SPEAKER_00

Which brings us to Dorothy.

SPEAKER_01

Right.

SPEAKER_00

According to this framework, Dorothy is the everyday citizen. She gets swept up in a cyclone, which is a natural disaster she can't control, and dropped into this bizarre, rigged economic landscape.

SPEAKER_01

It's like being suddenly dropped into a confusing modern economic crisis without a map.

SPEAKER_00

Exactly. She is told by the authorities that her only hope for survival is to follow the gold-paved road to ask this mysterious, all-powerful wizard to solve her problems.

SPEAKER_01

Which raises the big question: if Dorothy is just the regular American voter navigating this rigged road, who exactly is the wizard?

SPEAKER_00

Well, the wizard was the political and banking elite. He's just a small, frightened man working levers behind a curtain. He's, you know, piping his voice through a machine to project this terrifying, flaming head.

SPEAKER_01

Selling an illusion.

SPEAKER_00

Totally. He sent Dorothy on this dangerous, distracting quest entirely to maintain his own power and funding because as long as she believes he's magical, she won't realize he's literally just a guy from Omaha.

SPEAKER_01

And that mechanic, the conscious engineering of an illusion by a small, elite group to extract value from the public, is exactly where Zero draws the line from 1900 to today. Right. The argument is that the modern tech industry is running the exact same playbook. Only today the new yellow brick road isn't the gold standard.

SPEAKER_00

No, the road is the marketing term, AI.

SPEAKER_01

But wait, I have to challenge that a bit. If you look at what these systems can actually do right now, like writing complex code, passing the bar exam, generating entire video sequences, isn't it a bit reductive for the author to just dismiss AI as a marketing trick?

SPEAKER_00

That is a fair point.

SPEAKER_01

Like, aren't we genuinely inching closer to AGI, artificial general intelligence?

SPEAKER_00

Well, that is exactly the tension the author addresses. Zero points out how the goalposts for AGI have been systematically shifted to maintain this illusion of rapid magical progress. Oh so back in 2019, the industry defined AGI as a system that could perform economically valuable tasks better than humans. Then by 2022, when the models got good at text but still uh hallucinated wildly, the narrative shifted to detecting sparks of AGI.

SPEAKER_01

Right. The sparks.

SPEAKER_00

Yeah. And by 2024, AGI seemingly meant whatever the latest model release could do. Today you literally have tech executives outright claiming we're already there.

SPEAKER_01

But the author is arguing that under the hood, the actual mechanics of the technology haven't fundamentally changed into something capable of actual thought.

SPEAKER_00

Precisely. Mechanically speaking, large language models are engines of probabilistic text generation.

SPEAKER_01

Okay.

SPEAKER_00

They perform next token prediction. So if you feed the model a prompt, it mathematically calculates the most likely sequence of words to follow based on the massive data set it ingested. It is exceptionally sophisticated pattern matching. But it has no internal model of the world, no spatial reasoning, and uh no actual comprehension of the words it generates.
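[Editor's note: the next-token mechanic described here can be sketched in a few lines of Python. This is a toy bigram counter, not how any production model works (real systems use neural networks over subword tokens), but it shows the core idea the hosts describe: the "choice" of the next word is a frequency lookup, with no comprehension involved.]

```python
from collections import Counter, defaultdict

# Toy next-token predictor: count how often each word follows
# each other word in a tiny corpus, then always emit the most
# frequent follower. Probability, not understanding.
corpus = "the road leads to the city and the road leads home".split()

followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def next_token(word):
    """Return the statistically most likely next word, or None."""
    counts = followers[word]
    return counts.most_common(1)[0][0] if counts else None

# "the" is followed by "road" twice and "city" once, so "road" wins.
print(next_token("the"))  # road
```

Feeding it "the" returns "road" simply because "road" follows "the" most often in the corpus; scaled up by many orders of magnitude, that is the mechanic being described.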

SPEAKER_01

So it's a bit like driving a car where you can only look in the rearview mirror to decide which way to turn the steering wheel next.

SPEAKER_00

That's a great way to put it.

SPEAKER_01

It can feel incredibly smooth and responsive as long as the road looks like the roads you've already driven, but it isn't actually looking ahead or understanding the concept of a destination.

SPEAKER_00

That is a highly accurate way to visualize it. And what's fascinating here is that Zero's core argument is that no venture capitalist is going to write a trillion-dollar check for a rearview-mirror prediction engine, no matter how useful it is. Good point. They write those massive checks for the promise of a godlike entity, a conscious mind that can replace human labor entirely. Therefore, the massive gap between what the technology actually is, a probability machine, and what it is sold as, a synthetic consciousness, isn't just a byproduct of enthusiastic PR. That gap is the entire business model.

SPEAKER_01

Aaron Powell Here's where it gets really interesting. Because if the author is right, and this new yellow brick road is just a fabricated journey built on shifting definitions, we have to look at the people currently walking it. We do. If we map the characters from 1900 to today, how does the author see the modern workforce being impacted by this illusion? Like who is actually being forced to walk this road right now?

SPEAKER_00

Zero starts by casting the 2026 writer, journalist, and digital creator as the modern scarecrow. These are the people whose knowledge is actively being stripmined to feed the models.

SPEAKER_01

The people who actually have the brains.

SPEAKER_00

Exactly. And the article highlights stark data from Pew Research to demonstrate the mechanical reality of this extraction. When a search engine deploys an AI overview at the very top of a results page, traditional click-through rates to websites plummet from 15% down to 8%.

SPEAKER_01

Wow. And the click-through rate for the actual citations, like those tiny links pointing to the human work the software summarized to generate its answer? Yeah. The author notes it's a dismal 1%. If the modern scarecrows are the farmers who grew the intellectual crops, they are locked outside the city walls.

SPEAKER_00

That's a brutal reality.

SPEAKER_01

The platform harvests their data for free, bakes it into an answer, and serves it up inside the walls without compensating the creators.

SPEAKER_00

Then the author turns to the tin man. In the current landscape, this represents the corporate worker whose job is being aggressively targeted by what the industry has branded agentic AI.

SPEAKER_01

Ah, yes. The new buzzword.

SPEAKER_00

Right. And here, Zero is highly critical of the technical realities versus the corporate promises. An AI agent is frequently sold to middle management as a tireless digital employee.

SPEAKER_01

But the author argues it's essentially just a glorified for loop in programming, which is basically like an executive slapping an employee of the month name tag on a complex calculator just to justify mass layoffs.

SPEAKER_00

Yes, that is spot on. Mechanically, an agent is just a language model wrapped in a loop of code that allows it to call external tools, like an API, to execute a task. But because it relies on stateless probability, it has zero continuous memory between sessions, no internal state, and no genuine agency. It doesn't understand the business logic it is executing.
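[Editor's note: the "glorified for loop" description can be made concrete with a minimal sketch. Everything here, including the `fake_model` stub and the `CALL`/`DONE` protocol, is invented for illustration and is not any real agent framework's API; the point is the control flow: a plain loop, tool calls, and no state beyond the running transcript.]

```python
def calculator(expression: str) -> str:
    """A 'tool' the agent can call."""
    return str(eval(expression))  # toy only; never eval untrusted input

def fake_model(transcript: list[str]) -> str:
    """Stub standing in for a language model: emits a canned next
    step given the transcript so far. No understanding, no memory
    outside what it is handed each time."""
    if not any(line.startswith("TOOL:") for line in transcript):
        return "CALL calculator 2+2"
    return "DONE the answer is 4"

def run_agent(task: str, max_steps: int = 5) -> str:
    transcript = [f"TASK: {task}"]
    for _ in range(max_steps):          # the "glorified for loop"
        action = fake_model(transcript)
        if action.startswith("DONE"):
            return action.removeprefix("DONE ").strip()
        _, tool, arg = action.split(" ", 2)
        transcript.append(f"TOOL: {calculator(arg)}")  # execute the tool
    return "gave up"                    # edge case: step limit exhausted

print(run_agent("add 2 and 2"))  # the answer is 4
```

Swap the stub for a real model API and the skeleton is the same: the "agency" lives entirely in this wrapper loop, which is the author's point.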

SPEAKER_01

So if a human worker knows instinctively that uh a certain client hates receiving emails on Tuesdays, or that a specific vendor always needs an extra day for shipping, the human navigates that nuance seamlessly.

SPEAKER_00

Exactly.

SPEAKER_01

The author is saying the software loop just executes the prompt blindly, and when it hits an edge case it wasn't explicitly programmed for, it breaks.

SPEAKER_00

It breaks entirely. Which leads to Zero's point about the hollowing out of institutional knowledge. The author characterizes the push for agentic AI as largely an exercise in budget reclassification.

SPEAKER_01

Just moving the money around.

SPEAKER_00

Basically, executives move massive amounts of money from the human wages column over to the software licensing column, labeling it innovation. They authorize mass layoffs of the modern tinmen. But the delay in consequence is the real danger here.

SPEAKER_01

What do you mean?

SPEAKER_00

The middle managers won't feel the absence of that implicit human knowledge for another 18 months. By the time the systems begin to fray under those edge cases you mentioned, the actual workforce, the people who understood how the company truly functioned, is long gone.

SPEAKER_01

So if the workers and the creators are being hollowed out, why isn't anyone stepping in? Where are the guardrails?

SPEAKER_00

That brings us to the cowardly lion.

SPEAKER_01

Right, the politicians.

SPEAKER_00

Yep. The author casts today's politicians and regulators in this role. We constantly see these grand, highly publicized AI safety hearings. Regulators fly to summits in Europe. They roar loudly about ethics, data privacy, and protecting the working class.

SPEAKER_01

But according to the source, that roar is purely theatrical.

SPEAKER_00

It is. The reality is that training and running these frontier models requires astronomical amounts of physical infrastructure and electricity. The money is fundamentally tied to the compute. Right. And the regulators are facing technology companies that sometimes keep twenty-three separate lobbying firms on retainer. The author argues that politicians know exactly what the systemic risks are regarding data theft and labor displacement, but they quietly sign whatever industry-friendly frameworks are put in front of them because they lack the political courage to challenge that massive concentration of capital.

SPEAKER_01

So what does this all mean for the everyday citizen? If the creators are unpaid, the workers are displaced, and the politicians are toothless, how does the tech industry keep the public docile?

SPEAKER_00

Right. How do they keep us walking the road?

SPEAKER_01

Exactly. The author argues that the industry has built a psychological trap, a modern Emerald City.

SPEAKER_00

And this is where a brilliant, largely forgotten detail from Baum's 1900 novel becomes highly relevant. In the original book, the Emerald City wasn't actually made of emeralds.

SPEAKER_01

Wait, really?

SPEAKER_00

Yeah, the walls were constructed of ordinary, dull stone. The city only appeared brilliant green because the gatekeepers physically locked a pair of green tinted glasses onto the head of every visitor before they were allowed to enter.

SPEAKER_01

That detail completely changes the story. The magic was just a mandatory visual filter.

SPEAKER_00

Exactly. And Zero maps those green tinted glasses directly onto the modern chat interface. If you think about how you interact with these models, the interface is deliberately engineered by product teams to anthropomorphize the software.

SPEAKER_01

Oh, totally.

SPEAKER_00

The voice mode is designed to sound warm, patient, and therapeutic. The text generator uses a blinking cursor and artificial typing delays to imply that the machine is thinking before it responds. The software is explicitly programmed to refer to itself using the pronoun I.

SPEAKER_01

It's all just a costume.

SPEAKER_00

It is. If we connect this to the bigger picture, the author is pointing out a deliberate psychological manipulation here. By using those design choices, tech companies are manufacturing a parasocial relationship out of a probability machine.

SPEAKER_01

And that manufactured relationship is arguably the highest value asset these companies possess.

SPEAKER_00

Without a doubt. Behind the interface, the physical reality is just server farms owned by maybe six corporations on Earth. But the user feels like they are talking to a friend. Zero notes that whenever actual computer scientists like Emily Bender or Melanie Mitchell try to pull back the curtain and explain the mathematical reality of these systems, the industry PR machine, the modern flaming head of the wizard, just shouts louder about how AGI is right around the corner to drown them out.

SPEAKER_01

It makes you wonder about our own complicity in this, though. Are we as a society just so starved for connection that we willingly strap the green glasses to our own heads?

SPEAKER_00

That's the million-dollar question.

SPEAKER_01

Like even when we know intellectually that we are talking to a statistical engine, do we just prefer the illusion that the room is emerald?

SPEAKER_00

Zero suggests that the desperation is very real and the industry is absolutely capitalizing on it. The article mentions how people are turning to these models to ask if their marriages are worth saving. Grieving individuals are uploading texts to build chatbots that mimic their dead relatives.

SPEAKER_01

It's heartbreaking.

SPEAKER_00

It is. There have even been instances of actual churches forming around AI oracles. The illusion is potent specifically because it hijacks our hardwired human need for empathy and meaning. We project a soul into the blinking cursor because we are just desperate for something to listen to us.

SPEAKER_01

Which places you, the listener, perfectly in the shoes of Dorothy.

SPEAKER_00

Yes.

SPEAKER_01

The author argues that if you opened a chat window back in November 2022 when ChatGPT first dropped, that was the cyclone. That was the moment the whole country was picked up and dropped into the surreal, shifting landscape where nothing is quite what it seems.

SPEAKER_00

And just like Dorothy, you are being encouraged to walk a road built on shifting definitions toward a painted city run by executives pulling levers. Yeah. But the author offers one final crucial historical correction from the book. In 1939, Hollywood gave Dorothy ruby slippers because they wanted to show off the new Technicolor film process.

SPEAKER_01

Right.

SPEAKER_00

But in Baum's 1900 novel, her shoes were silver.

SPEAKER_01

Silver, the populist metal. The exact alternative to the gold standard that the farmers were fighting for. The punchline of the entire 1900 allegory is that Dorothy possessed the power to go home the entire time. She never needed the wizard. She just had to click her silver shoes together.

SPEAKER_00

Right.

SPEAKER_01

The entire journey down the gold paved road was a manufactured quest designed to keep her from realizing she already held the solution.

SPEAKER_00

This raises an important question, though. In the book, the scarecrow asks Glinda the Good Witch why she didn't just tell Dorothy about the power of the shoes on day one.

SPEAKER_01

And what does she say?

SPEAKER_00

Glinda replies that Dorothy wouldn't have believed her. The citizen had to walk the road, see the Emerald City for herself, and pull back the curtain with her own hands to truly understand the nature of the fraud.

SPEAKER_01

The author's making a really empowering point here. The silver shoes in 2026 represent your cognitive autonomy. It is your absolute refusal to mistake autocomplete for a mind.

SPEAKER_00

Zero isn't telling anyone to smash their computers or boycott the software. The argument is to keep using the tech for what it is. Use it to draft boilerplate emails, summarize messy spreadsheets or translate code. Treat it like you would a powerful calculator or a tractor. But reject the costume.

SPEAKER_01

According to the author, clicking your heels means refusing to use pronouns like he or she when referring to a mathematical model. It means rejecting the term AGI in meetings unless someone can provide a strict, testable definition of what that actually means.

SPEAKER_00

Yeah, holding them accountable to the words.

SPEAKER_01

Right. It requires keeping the word probability at the forefront of your mind every single time you interact with the interface. Recognizing the machine strictly as a tool is how you take off the green tinted glasses.

SPEAKER_00

You take back your autonomy from the wizard. You refuse to be enchanted by a corporate narrative that is ultimately designed to extract your data and your capital.

SPEAKER_01

The underlying technology is a genuine achievement, and it is useful. But the author makes a compelling case that the Yellow Brick Road and the Emerald City are fraudulent marketing constructs meant to keep us distracted, compliant, and endlessly subscribing. And we have always owned the silver shoes.

SPEAKER_00

The ability to look directly at a statistical engine and see it for exactly what it is: bounded, frequently inaccurate, and completely devoid of internal goals or consciousness. That is the ultimate defense against the illusion.

SPEAKER_01

So the author of our source material tells you to click your heels and go home by seeing the technology for what it is. But as we wrap up this deep dive, here's a final thought for you to mull over.

SPEAKER_00

Okay.

SPEAKER_01

Zero's argument hinges on the idea that the entire ecosystem relies on our collective belief in the illusion. So if millions of modern Dorothys suddenly wake up tomorrow, take off their green tinted glasses, and entirely stop treating these probability machines like oracles, friends, or gods, what actually happens to the trillion-dollar valuation of the Emerald City?

SPEAKER_00

That is the real issue. It exposes a massive vulnerability in the economic model.

SPEAKER_01

Does the wizard's entire kingdom, all those massive server farms, all that venture capital, all those shifting definitions simply collapse just because we stop believing the magic trick? Thank you so much for joining us on this deep dive. Keep questioning the narrative. Remember to keep your silver shoes on, and we will catch you next time.