In Memory of Man Podcast - Robot Crime Blog

Episode Title: The Invisible Prison: Who Owns Your Digital Soul?

Robot Crime Blog, Season 1, Episode 58


"The perfect prison doesn’t need walls. It just needs people to believe they’re free." 

The world you thought you knew is gone. We’ve been maneuvered into a digital "choke point" where every move is tracked, every choice is nudged, and every person is reduced to a data set for the new ruling class: the Technofeudalists.

In this explosive episode, we dive into the "Robot Crime" manifesto, Simulacra and Subjugation. We peel back the curtain on how Big Tech has replaced traditional ownership with indefinite "digital leasing" of your own identity. 

Inside the Episode:

  • The Rise of Technofeudalism: Why Google, Amazon, and Meta are the new landlords of your reality. 
  • The "Digital Twin" Threat: Your corporate-owned shadow self is making decisions about your bank account, your career, and your legal risk—without your permission. 
  • Weaponized Data: How AI uses your subconscious fears to predict your next move before you even make it. 
  • The Global Social Credit System: It’s not just in China. We explore how Western banks and employers use "digital scores" to enforce compliance. 
  • Reclaiming the "Outlaw Dreamer": Is it too late to own your digital existence, or are we already permanently locked out of our own identities? 

AI is not the enemy—it’s the weapon. Join us as we discuss the legal battle for digital self-ownership and how to stop being an "asset" and start being a human again. 

"I think, therefore I am" is dead. Welcome to "I am digital, therefore I exist." 

Listen now and decide: Will you be optimized, or will you be free? 

robotcrimeblog.com

SPEAKER_00

I want you to imagine um just for a second, a prison with absolutely no walls.

SPEAKER_01

Okay.

SPEAKER_00

You are entirely free to move. I mean you can go wherever you want, talk to whoever you want, make any choice you desire. It feels like total freedom. Right. But underneath the surface, every single choice you make, you know, every piece of information you see, the places you decide to go, the things you decide to buy, they are all being subtly guided, nudged, and ultimately decided by an algorithm.

SPEAKER_01

Wow.

SPEAKER_00

You are perfectly free right up until the exact moment you realize you aren't.

SPEAKER_01

Yeah, and that challenges the very foundation of how we define autonomy, really, because if your daily choices are being perfectly anticipated and, well, manipulated by this invisible architecture, we really have to ask if they are still your choices at all.

SPEAKER_00

Okay, let's unpack this. Because that invisible architecture is exactly what we are exploring today. We are doing a deep dive into a profoundly provocative essay from the Robot Crime Blog.

SPEAKER_01

Yeah, it's a fascinating piece.

SPEAKER_00

Right. The author is a lawyer and a best-selling novelist, and the piece is titled Simulacra and Subjugation. And the core mission for us today is to explore their incredibly urgent claim that we have uh unwittingly been locked into what they call a techno-feudalist control grid.

SPEAKER_01

Now, a quick heads up before we get into the weeds here, because this essay gets into some pretty intense territory regarding government pandemic policies and severe critiques of both corporate and state power. Yeah, definitely. So our goal today isn't to take a political side or tell you what to believe. We are just acting as your guides, laying out the author's arguments and the evidence they've compiled so you can decide for yourself what it actually means for your daily life.

SPEAKER_00

Exactly. We are just looking at the blueprints of this supposed invisible prison. And to understand the blueprint, we kind of have to look at when the author claims the door actually locked.

SPEAKER_01

Right, the timeline. Yeah.

SPEAKER_00

They argue this wasn't some dramatic overnight coup. It was a transition. And they use this term, uh, channel. Wait, channel? Isn't that a military term? Yeah. Like funneling enemy troops into a narrow canyon where they can't escape. How does a military battlefield tactic apply to us?

SPEAKER_01

You have the definition exactly right. In warfare, if you can't defeat an enemy in an open field, you use terrain or obstacles to limit their maneuverability. Okay. You force them into a choke point. Once they are in that narrow channel, their options are reduced to almost zero, making them highly vulnerable. Makes sense. And the author takes that exact concept and applies it to human society, arguing that the COVID pandemic acted as the ultimate global channel event.

SPEAKER_00

Because suddenly the open field of the physical world was closed off.

SPEAKER_01

Exactly.

SPEAKER_00

No real-world interaction meant no real-world options. But as long as you had an internet connection, you had a lifeline, work went completely remote, you had to order your groceries online, you socialized exclusively through a screen. To survive, to keep your job, to feed your family, you had to move your entire life inside the machine.

SPEAKER_01

And the crucial element here is the permanence of that maneuver. We didn't just visit the digital world to wait out the storm, you know. We set up a permanent residence. Right. The author brings in a highly detailed Nature study that confirms this. The study tracked behavioral data and found that COVID permanently altered society's reliance on digital infrastructure. We reached a tipping point of mass adoption. Wow. And once human society enters a digital-first existence at that scale, the infrastructure essentially hardens around us.

SPEAKER_00

Which fundamentally altered the business model of the internet itself. I mean, we stopped being just consumers and became tenants. Yes. The author uses a term coined by the economist Yanis Varoufakis for this: technofeudalism.

SPEAKER_01

That's right.

SPEAKER_00

The idea is that big tech, so Google, Apple, Amazon, Meta, Microsoft, they aren't just operating a business sector anymore. They have evolved into a new ruling class. Traditional capitalism, where you have open markets and competition, has been functionally replaced by digital platforms that control access to modern life.

SPEAKER_01

It is a radical shift in how we interact with property. Under technofeudalism, the concept of individual ownership is largely replaced by indefinite digital leasing. Okay. Think about your data, your digital identity, even your perception of reality. You don't hold the deed to any of that. You are renting it from these digital landlords.

SPEAKER_00

It's like we all moved into a sprawling, ultra modern company town. The rent is technically free. I mean, you don't swipe a credit card to set up a social media account or use a search engine, but you can't ever leave. You can't leave because those same digital landlords own the grocery delivery infrastructure, the banking apps, the digital town square, and the email servers. If you cross them or violate their terms of service, they don't even have to send the police to evict you. They just flip a switch and your access to modern life vanishes. You are erased from the grid.

SPEAKER_01

And that raises the most critical question of this new economic system. If the rent is supposedly free, what is actually funding this sprawling company town? Right. The author leans on Shoshana Zuboff's foundational work, The Age of Surveillance Capitalism, to map this out. The currency of this realm isn't money, it is your behavior.

SPEAKER_00

Jeez.

SPEAKER_01

Yeah, in a technofeudalist system, you are no longer viewed as a citizen with inherent rights. You are a raw data set waiting to be extracted.

SPEAKER_00

It's a very different flavor of dystopia than we are used to hearing about. People always point to George Orwell's 1984, where the state violently controls what people think and forces compliance through pain.

SPEAKER_01

Yeah, the classic Big Brother.

SPEAKER_00

But this essay argues that today's system is far more elegant. They don't need to force you to think a certain way if they can just perfectly curate everything you see. Your deeply held fears, your private desires, how long your finger hovers over a specific image on your screen: it's all just structured data points.

SPEAKER_01

And the extraction of that data has moved far beyond serving you targeted ads for sneakers. It is actively weaponized to assess your value and your threat level.

SPEAKER_00

Weaponized how?

SPEAKER_01

Well, the author highlights a staggering detail reported by the Electronic Frontier Foundation, the EFF. Back in 2021, reports surfaced showing how U.S. military intelligence agencies were bypassing constitutional legal restrictions entirely.

SPEAKER_00

Wait, really?

SPEAKER_01

Yeah. Normally, if the government wants to track your location, the Fourth Amendment requires them to go to a judge, show probable cause, and get a warrant.

SPEAKER_00

But the EFF found they were just pulling an end run around the Constitution. Because your location data is being constantly harvested by random apps on your phone and sold to data brokers, the military realized they didn't need a warrant. They could just open their wallets and buy your commercial location tracking data on the open market, just like an advertising firm would.

SPEAKER_01

It's wild. The legal loophole essentially allows the state to outsource its surveillance to the private sector. The data brokers collect it legally through terms of service agreements nobody ever reads, and the government simply purchases the final product.

SPEAKER_00

Wait, I need to jump in and push back on the scale of this. When we talk about behavior tracking and algorithmic control, people instantly point to China's social credit system as the ultimate terrifying example.

SPEAKER_01

Right, of course.

SPEAKER_00

The government officially scoring citizens based on their behavior, public shaming, blacklisting people from buying train tickets or getting loans. The author claims the West's system is just as powerful, just less visible. But how does that actually work in practice if the government here isn't officially handing out a literal numerical score to every citizen?

SPEAKER_01

Well, if we connect this to the bigger picture, the history of how this technology developed is the missing piece of that puzzle. The author points out something that rarely makes the headlines. Western big tech companies were instrumental in helping China build and perfect that very surveillance infrastructure back in the 2010s.

SPEAKER_00

Really? I had no idea.

SPEAKER_01

Yeah. Silicon Valley provided the raw architecture, the AI-driven surveillance tools, the facial recognition algorithms, the cloud-based tracking systems. China essentially served as a massive nation-sized beta test for digital behavioral control.

SPEAKER_00

And once the beta test was a success and the algorithms were polished and perfected, those tech companies brought the architecture back home to the West.

SPEAKER_01

They brought it back, but they adapted it for a Western environment. Instead of a top-down, state-mandated social credit score, which would cause an immediate public revolt, obviously, the tracking was integrated horizontally into Western institutions. It was disguised as corporate policy, risk management, and user safety. The author uses evidence validated by the Heritage Foundation to show that a decentralized corporate social credit system is absolutely active today. Employers use software to scrape your social media history and private communications to assess your cultural fit. It is a social credit system. It is just governed by a patchwork of corporate algorithms rather than a central committee.

SPEAKER_00

But if corporations are quietly building this massive invisible cage around us, why aren't people realizing it? Why aren't we pushing back?

SPEAKER_01

Language.

SPEAKER_00

The author argues that the resistance is neutralized because our very perception of reality and the language we use to describe it has been fundamentally hacked. They call it doublespeak.

SPEAKER_01

Right. The concept of doublespeak comes from William Lutz. It is language deliberately engineered to obscure, mislead, and manipulate how we perceive an action. Power always distorts language to mask control behind terms that sound reassuring. Lutz gives the classic examples we see in government: calling a civilian casualty a "collateral damage event," or calling torture "enhanced interrogation." The language softens the reality of the violence.

SPEAKER_00

And the essay applies that exact linguistic trick to big tech. When a social media platform rolls out massive facial recognition databases, they don't call it mass surveillance, they call it a security enhancement.

SPEAKER_01

Exactly.

SPEAKER_00

When platforms deploy algorithms to quietly throttle certain viewpoints and control the narrative, it's framed as fighting misinformation or community safety.

SPEAKER_01

The language creates a psychological fog. The user feels nurtured and protected by the platform, completely unaware that the platform is systematically stripping them of their privacy and their ability to see an unfiltered world.

SPEAKER_00

Hold on, what about a term like personalization? The author lists personalization as pure doublespeak for aggressive behavioral tracking. Honestly, on a Friday night after a long week, I absolutely love it when my streaming app knows exactly what 90s action movie I want to watch.

SPEAKER_01

Sure, yeah.

SPEAKER_00

Or when my music app builds a playlist that hits the exact right mood. It saves me time and effort. Is that really part of a sinister cage? Or is it just genuinely great, hyper-efficient customer service?

SPEAKER_01

That is the exact tension the author is highlighting, and it requires moving from sociology into deep philosophy. To explain why your Friday night movie recommendation is the foundation of a cage, the author relies on Jean Baudrillard's theory of simulacra and simulation. Okay. Baudrillard warned of a future where symbols and simulations no longer reflect physical reality, but actually replace it, creating a closed loop entirely detached from objective truth.

SPEAKER_00

So how does the algorithm recommending Die Hard fit into Baudrillard's closed loop?

SPEAKER_01

Well, a movie recommendation in a vacuum is harmless. The danger is the underlying mechanism. The algorithm learns your preferences through A/B testing, measuring your dopamine responses to specific stimuli. But it doesn't stop at movies. It scales that exact same personalization mechanism up until it dictates your entire informational diet. It feeds you news stories, social interactions, and political discourse uniquely tailored to trigger your specific emotional feedback loops, usually outrage or validation.

SPEAKER_00

So it's not showing me the world as it is, it's showing me a funhouse mirror version of the world designed to keep my eyes glued to the screen.

SPEAKER_01

Exactly. And over time, according to the author, that curated feed begins to define your reality. If a skewed perspective or total falsehood is repeated and reinforced enough within your highly personalized echo chamber, it becomes your functional truth. Objective reality is replaced by algorithmic truth engineering.

SPEAKER_00

The essay calls this the hyper-real cage. And the brilliance of it is that the most effective prison isn't one that locks up your physical body, it's one that rewires your perception of reality. Once digital reality completely overrides physical reality, your ability to dissent is neutralized. You can't fight the system because the system itself is defining the parameters of what is real and what is fake. It's deep fakes blurring fact and fiction, and social media dictating how we perceive history in real time.

SPEAKER_01

And while that sounds highly theoretical, the author makes it very clear that this hyper-real cage has sudden, devastating real-world consequences for you, the listener, because inside this algorithmic landscape lives something they call your digital twin.

SPEAKER_00

The digital twin. This concept genuinely gave me a knot in my stomach. It is the reality that there is a corporate-owned, AI-generated version of you living on servers right now. Yeah. It is a shadow self compiled from every late-night search, every location ping, every impulsive purchase, and every micro-hesitation you ever had while scrolling.

SPEAKER_01

And the essay outlines four absolute facts about your digital twin that underscore your total lack of agency. First, it actively determines your real-world opportunities: your creditworthiness, your job eligibility, your legal risk. Second, it is constantly monitoring you. The data collection never sleeps. Third, you cannot delete it. Even if you completely erase your social media presence and throw your smartphone in a river, your data shadow remains firmly in the hands of brokers. And fourth, you do not own it. Big tech and financial institutions own your twin.

SPEAKER_00

Here's where it gets really interesting, because it's less like a digital clone and more like a financial and legal voodoo doll made entirely out of your data. If an AI decides to poke that voodoo doll in a server farm out in California, you feel the real-world pain of a denied mortgage or lost job right where you live.

SPEAKER_01

It's terrifying.

SPEAKER_00

And the author didn't just theorize about this. They brought massive receipts. They detail four major journalistic and academic investigations proving exactly how lives are destroyed when the AI's perception of your digital twin overrides the truth.

SPEAKER_01

Yeah, the mechanisms behind these failures are what make them so alarming. First, the author cites the Guardian reporting on an AI-driven welfare fraud detection system deployed in the UK. The algorithm was designed to flag suspicious behavior, but its parameters were so opaque that it falsely accused thousands of innocent people. Because the system's output was trusted implicitly, these people had their vital benefits suddenly revoked with virtually no human oversight to catch the error.

SPEAKER_00

Then there is a terrifying report from Forbes detailing how JPMorgan Chase closed customer bank accounts based purely on AI-flagged risk. The algorithm decided the customers' digital twins were liabilities.

SPEAKER_01

Yeah.

SPEAKER_00

The real, physical customers were given no proof, no explanation of the flagged behavior, and no recourse to appeal. Their financial lifeline was just severed by a machine operating in a black box.

SPEAKER_01

The essay also points to a Harvard Business Review piece exposing Amazon's AI hiring system. The mechanism here was historical bias. The AI was trained on a decade of previous hiring data, which was overwhelmingly male.

SPEAKER_00

Oh, I see where this is going.

SPEAKER_01

Yeah, so the algorithm effectively taught itself that male candidates were preferable, and it actively began downgrading resumes submitted by female candidates regardless of their actual qualifications. The digital twin's gender became a fatal flaw.

SPEAKER_00

And finally, a study from the Brookings Institution examining predictive policing AI. The algorithms used proxy variables for crime that ended up disproportionately flagging black and Latino communities as high-risk threats. Right. This directed intense police presence into those neighborhoods, escalating tensions and creating a self-fulfilling prophecy, all based on a mathematical simulation rather than actual human behavior.

SPEAKER_01

In every single one of those examples, the massive institution, the government, the bank, the tech giant, the police force trusted the digital twin over the physical human being standing right in front of them. Yeah. Your ability to function in modern society hinged entirely on how an algorithm calculated your data points, completely ignoring your actual character or your real-world actions.

SPEAKER_00

You show up in person, flesh and blood, to say, look, I didn't commit fraud or I am highly qualified for this job, and the institution looks right past you and says, sorry, the computer says otherwise, we only negotiate with the voodoo doll.

SPEAKER_01

It's exactly that. The author makes it incredibly stark. If an AI classifies your shadow profile as a financial risk or a social disruptor, you can be entirely blacklisted from the modern economy and you will have no one to appeal to.

SPEAKER_00

So what does this all mean? We are staring down the barrel of this hyper-real, techno-feudalist cage where algorithms dictate our reality and our opportunities. Are we just doomed?

SPEAKER_01

No, actually.

SPEAKER_00

The author actually ends the essay with a powerful call to action. They argue we are at a definitive crossroads. We either quietly accept a system where AI dictates reality, or we stand up and legally reclaim ownership of our digital existence.

SPEAKER_01

And they provide a very specific, actionable, legal pathway for this fight. The author argues that we need a wave of class action lawsuits demanding full legal property rights over our personal data and our digital twins. They point to existing legal frameworks that could be expanded to cover this, specifically citing a foundational 2011 Supreme Court case, Sorrell v. IMS Health Inc.

SPEAKER_00

How does that case apply to my digital twin?

SPEAKER_01

While Sorrell dealt with pharmaceutical data mining, the court recognized that the creation and dissemination of data is a form of speech protected by the First Amendment.

SPEAKER_00

Oh, interesting.

SPEAKER_01

The author argues that if the courts can recognize the immense value and legal weight of data profiles for pharmaceutical companies, that same logic must be applied to the individual citizens generating the data.

SPEAKER_00

The essay also mentions the data privacy statutes under 47 USC section 551. That's the Cable Communications Policy Act, right?

SPEAKER_01

Yes. Under that statute, a cable provider cannot simply collect and sell your personal viewing habits without your explicit opt-in consent. There are strict privacy mechanisms in place. The author's argument is that we need to take that exact same legal mechanism and forcefully apply it to our digital twins. Your granular behavioral data, your location history, your psychological profile, it should be treated with the same, if not greater, legal privacy rights as your cable box.

SPEAKER_00

Makes total sense. Before wrapping up, the author shares a genuinely profound final thought. It's actually a personal regret. They previously wrote a best-selling novel called The Last Resistance, envisioning a dystopia where they blamed AI itself for the collapse of society. Right. But looking at the landscape today, they realized they got it wrong. AI is not the inherent threat. AI is just a tool, a revolutionary one that could elevate human capability.

SPEAKER_01

The real danger isn't that a super intelligent AI is going to rise up and replace us. The danger is that we are willingly reducing ourselves to make the algorithm's job easier. We have abandoned the core philosophical foundation of human agency, Descartes' "I think, therefore I am." We've traded it in and been conditioned to accept a terrifying new baseline: "I am digital, therefore I exist."

SPEAKER_00

We've willingly flattened our complex humanity into predictable data streams just so the apps load faster and the packages arrive next day.

SPEAKER_01

Yeah, and the unavoidable truth of this system is that if you do not own your digital self, someone else absolutely will. George Orwell wrote, Who controls the past controls the future. Who controls the present controls the past. Right. That wasn't just a warning about manipulating history books. It was a warning about the mechanics of power. If massive corporations and opaque AI systems are allowed to dictate the parameters of your digital existence, they are dictating your actual future. You will be shaped, corrected, and optimized. Not for your own flourishing, but for their profit margins. The ultimate question the author leaves us with isn't whether AI will take over the world, it's who or what will take over you.

SPEAKER_00

It is a massive shift in how we view our daily interactions with technology. And to bring it all back to where we started, I want to leave you, our listener, with a thought experiment to mull over long after you take your headphones off.

SPEAKER_01

It's a good one.

SPEAKER_00

Think about that digital twin we discussed. That shadow profile built by invisible algorithms, holding your entire search history, your most hidden fears, your microhabits, and every location you've ever set foot in. If that digital twin, that voodoo doll made of your data, were somehow physically printed out and it walked into the room and sat down right next to you right now, would you even recognize them?

SPEAKER_01

Seriously.

SPEAKER_00

Would you trust them to make life altering decisions for you? Or would you be absolutely terrified of what they know about you and what they might do next? Thank you so much for joining us on this deep dive. Stay curious, stay aware, and we'll see you next time.