In Memory of Man Podcast - Robot Crime Blog

Your Internet Is Working. The Internet Is Dead

Robot crime blog Season 1 Episode 63


Duration: 23:53

 This field report argues that a massive digital infrastructure is currently replacing the human-centric internet with an AI-driven substrate that harvests our data to automate all layers of society. The author warns that the bridge cohort, those who remember life before the digital takeover, is the final generation capable of recognizing and documenting this total transformation. By 2030, the text predicts a rigid economic hierarchy ranging from wealthy owners of the AI infrastructure to a "stranded" class managed by universal basic income and synthetic entertainment. To survive this shift, professionals are encouraged to become sovereign operators by owning their own hardware and private data rather than relying on corporate cloud services. Ultimately, the source serves as an urgent call to preserve human legacy and autonomy before the physical and cognitive worlds are fully subsumed by machine logic. 


SPEAKER_00

You are currently living through a technological transition that will completely close by the year 2030.

SPEAKER_01

Yeah, it's a remarkably short window.

SPEAKER_00

Right. And if you happen to be listening to this and you were born roughly between 1965 and 1995, you actually have a very specific, frankly, chilling role to play in how this all shakes out.

SPEAKER_01

You really do.

SPEAKER_00

So welcome to the deep dive. Today our mission is to unpack this heavily researched, verified, data-backed document. It's called the Bridge Cohort Field Report.

SPEAKER_01

Written by Robert Keesling.

SPEAKER_00

Exactly. Robert Keesling. He's a solo trial attorney and an author. And his central argument here is that the movie The Matrix wasn't just science fiction.

SPEAKER_01

It was an architectural blueprint.

SPEAKER_00

Yes. A blueprint for the economy we are literally building right now.

SPEAKER_01

And the urgency in his report, I mean, it really comes from the timeline. He's not predicting what the world might look like in, say, 2050. Right. He is documenting a foundational shift in human infrastructure that has already happened. The concrete is poured.

SPEAKER_00

I gotta be honest though. Okay, let's unpack this. Because when I first saw that he was using the Matrix as his central framing device, I rolled my eyes a bit.

SPEAKER_01

Oh, totally.

SPEAKER_00

I mean, it's kind of the ultimate tech bro cliche at this point, isn't it? Like the leather trench coats, the glowing green code, the whole we live in a simulation thing. It feels, I don't know, a little played out.

SPEAKER_01

Well, it is definitely a well-worn comparison. We picture those pods and the pink goo and we think, well, I'm sitting in my kitchen making coffee, so clearly we're not there.

SPEAKER_00

Exactly.

SPEAKER_01

But what's fascinating here is how Keesling points out a massive, terrifying correction to the movie's logic. It validates the metaphor perfectly.

SPEAKER_00

Yeah.

SPEAKER_01

You remember how, in the film, Morpheus explains that the machines are using humans as literal batteries?

SPEAKER_00

Right, yeah. To power their civilization because they blotted out the sun, so they need human body heat.

SPEAKER_01

Exactly. But technologically, that makes zero sense. Using a biological organism as a battery is wildly inefficient. Sure. So Keesling argues that the Wachowskis got the architecture right, but the extracted resource wrong. Yeah. We aren't being harvested for our thermal energy. We are being harvested for our training data.

SPEAKER_00

Okay, that just gave me goosebumps.

SPEAKER_01

Right. And the crucial, really horrifying difference for you listening right now to understand is this: a machine needs a battery forever.

SPEAKER_00

Yeah.

SPEAKER_01

But once a multimodal AI learns exactly how to be us, you know, how to reason like us, how to write like us.

SPEAKER_00

It doesn't need us anymore.

SPEAKER_01

Exactly. It doesn't need to keep us around to power it.

SPEAKER_00

Which, I mean, that brings us back to that specific group of people Keesling addresses, right? The bridge cohort.

SPEAKER_01

The 1965 to 1995 folks.

SPEAKER_00

Yeah. So if you fall into that birth window, you are the bridge. You are the only generation in human history that will vividly remember the before state.

SPEAKER_01

What it physically felt like to have an untracked childhood.

SPEAKER_00

Right. To use a rotary phone, or to get completely lost because you didn't have GPS in your pocket. Yeah. And you'll also see the after state, where reality is mediated by orchestrated AI agents.

SPEAKER_01

It is a profound psychological burden, honestly. Because nobody born before you could have written this report. The technology just didn't exist for them to conceptualize it.

SPEAKER_00

That makes sense.

SPEAKER_01

And nobody born after you will be able to write it because they'll have no internal baseline. To a kid born in 2015, a synthetic, algorithmically generated world isn't some dystopia.

SPEAKER_00

It's just Tuesday.

SPEAKER_01

Exactly. It's just their baseline reality.

SPEAKER_00

Man, and if we want to look at exactly how far we've crossed that bridge, we have to look at what Keesling says about the Internet. Because he argues that the web, you know, as a human space, the one you and I grew up on, is completely dead.

SPEAKER_01

It's gone.

SPEAKER_00

And when you look at the data he compiled, it just shatters how you view the web. There's this March 2026 report from Human Security. They analyzed over one quadrillion digital interactions.

SPEAKER_01

A quadrillion.

SPEAKER_00

Yeah. And they found that bots now generate more internet traffic than actual human beings.

SPEAKER_01

And the growth rate is what's truly staggering there. I mean, between 2024 and 2025, human internet traffic grew by about three percent. AI agent traffic surged by 7,851 percent in that exact same window.
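To put that 7,851 percent figure in perspective, a percent-growth number converts to an end-over-start multiplier. A quick back-of-the-envelope sketch (our illustration, not a calculation from the report):

```python
# Convert the quoted percent-growth figures into end-over-start multipliers.
def growth_multiplier(percent_growth: float) -> float:
    """A gain of P percent means traffic ends at (1 + P/100) times its start."""
    return 1 + percent_growth / 100

human = growth_multiplier(3)      # human traffic: roughly 1.03x
agents = growth_multiplier(7851)  # AI agent traffic: roughly 79.5x

print(f"human traffic multiplier: {human:.2f}x")
print(f"agent traffic multiplier: {agents:.2f}x")
```

In other words, a 7,851 percent surge means agent traffic ended the window at nearly eighty times its starting volume, while human traffic barely moved.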

SPEAKER_00

That is insane. It fundamentally breaks your mental model of what being online means. Completely. Ahrefs, the SEO research firm, analyzed 900,000 newly published web pages in April 2025. Right. And they found that 74.2% of them contained AI-generated content. Think about that. Three out of every four new pages you stumble across weren't written by a person. They were assembled by a model, published by a script, and indexed by a crawler. Yeah. It's basically like wandering into a sprawling digital ghost town. But the creepy part isn't that it's empty, it's that all the tumbleweeds blowing past you are actually algorithms talking to other algorithms.

SPEAKER_01

Generating fake engagement to sell ads to no one.

SPEAKER_00

Yes, exactly.

SPEAKER_01

That's a perfect way to visualize it. The human internet isn't empty, it's uh it's paved over. And we have to explain how this happened mechanically because it wasn't some accident.

SPEAKER_00

Right, the economic loop.

SPEAKER_01

Yes. The economic loop that funded the human internet has been deliberately severed. Think about how search used to work. You'd type a question into Google, get 10 blue links, click one, and land on a publisher's site where you'd see an ad.

SPEAKER_00

Right. And that ad paid the human writer. It was a fragile ecosystem, but it worked.

SPEAKER_01

But now an AI overview model reads those ten links for you, summarizes the answer right at the top of the page, and you never click the link.

SPEAKER_00

So the AI company keeps your attention and the ad revenue.

SPEAKER_01

And the human creator starves. Keesling notes that between late 2024 and late 2025, Google traffic to publishers dropped 33% globally. Wow. Business Insider cut 21% of its staff just because their search traffic plummeted.

SPEAKER_00

He even highlights Digg, right? The classic link sharing site.

SPEAKER_01

Oh yeah, that story is wild.

SPEAKER_00

The original founders tried to relaunch it in early 2026. It lasted exactly two months before they shut it down. And their stated reason was literally an unprecedented bot problem.

SPEAKER_01

The pipes of the old web are completely clogged.

SPEAKER_00

But wait, if the tech giants built their empires on this human internet, why would they intentionally destroy it?

SPEAKER_01

Well, because if we follow the money, we see they aren't just burning down the old internet. They are clearing the lot to build something infinitely larger.

SPEAKER_00

Right.

SPEAKER_01

And to understand what that is, you have to look at the massive, almost absurd disconnect happening in tech right now.

SPEAKER_00

You're talking about the layoffs versus the capital expenditure. Because this part of the report is just wild.

SPEAKER_01

It really is.

SPEAKER_00

Since 2020, nearly 900,000 tech workers have been laid off. 92,000 in just the first four months of 2026 alone. Yep. But at the exact same time, the hyperscalers, you know, companies like Meta, Microsoft, Amazon, Google, they are spending money like we've never seen in corporate history.

SPEAKER_01

Half a trillion dollars globally in combined capital expenditure in 2026 alone.

SPEAKER_00

Half a trillion.

SPEAKER_01

And to put that in perspective, the actual revenue of the entire global AI industry that year was somewhere between $50 and $80 billion.

SPEAKER_00

Yeah, math doesn't add up.

SPEAKER_01

Exactly.

SPEAKER_00

Yeah.

SPEAKER_01

You do not spend $500 billion to service an $80 billion market. You don't spend a half trillion dollars to build a slightly better productivity tool to help accountants write emails faster.
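As a sanity check on that disconnect, here's the arithmetic using the round figures quoted in the episode:

```python
# Ratio of quoted 2026 hyperscaler capex to quoted global AI industry revenue.
capex = 500e9                           # ~$500 billion combined capital expenditure
revenue_low, revenue_high = 50e9, 80e9  # ~$50-80 billion total industry revenue

print(f"capex is {capex / revenue_high:.2f}x to {capex / revenue_low:.2f}x annual revenue")
```

Spending six to ten times the entire industry's annual revenue on infrastructure only pencils out if that infrastructure is meant to capture a far larger market than the one that exists today.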

SPEAKER_00

Right.

SPEAKER_01

You only spend that kind of cash if you are building the replacement for the market itself. Keesling calls this infrastructure the substrate.

SPEAKER_00

Wait, so let me get this straight. The displaced middle class tech workers, you know, the people getting these severance packages right now, they're basically funding the computational build-out of their own replacements.

SPEAKER_01

That is the brutal reality of the economics here.

SPEAKER_00

Yeah.

SPEAKER_01

The substrate is being built to close the loop on human labor entirely. And it has two halves.

SPEAKER_00

Okay.

SPEAKER_01

The cognitive half consists of the massive data centers and the AI agents. That half eats knowledge work. It replaces human managers with software orchestration.

SPEAKER_00

And then the physical half is humanoid robotics.

SPEAKER_01

Yes.

SPEAKER_00

Like he cites Tesla's Optimus program targeting millions of units by 2030 at $20,000 a robot, and Figure AI deploying autonomous humanoids into BMW manufacturing facilities.

SPEAKER_01

Exactly. The data center is the brain, the robot is the body. They're the exact same product, just separated by a network cable.

SPEAKER_00

Man.

SPEAKER_01

And together, they create a parallel labor force that never sleeps, never takes a sick day, and crucially for corporate margins, doesn't require human dignity.

SPEAKER_00

So if the cognitive and physical layers of human labor are being eaten by this substrate, what happens to the rest of us? Because if the tech giants own the tools, they essentially own the economy. Yes. Which leads to Keesling's breakdown of society in the year 2030. He divides the population into five concrete economic tiers. And if you're listening to this, you really need to think critically about which bucket you're currently standing in.

SPEAKER_01

It's sobering.

SPEAKER_00

At the very top, you have the substrate owner. This is a tiny fraction, maybe 0.1 to 0.5% of people.

SPEAKER_01

Right. These are the architects and the major shareholders. And their defining characteristic isn't just their immense wealth, it's their total insulation from the system they built.

SPEAKER_00

They're off the grid, basically. Yeah.

SPEAKER_01

In a way, yeah. They own the infrastructure. They live 15 to 25 years longer due to extreme continuous biomarker monitoring and concierge medicine. But most importantly, their exposure to the public algorithm, what Keesling calls the feed.

SPEAKER_00

Oh, right. The feed.

SPEAKER_01

Their exposure to it is practically zero by design. They do not consume the algorithmic slop.

SPEAKER_00

And then below them you have the substrate dependent operator. This is roughly 8 to 15% of the population. They make great money at like $150,000 to maybe a million dollars a year.

SPEAKER_01

Yeah, the power users.

SPEAKER_00

Right. They're renting cloud-based AI tools to do incredibly high value work.

SPEAKER_01

But Keesling warns they are deeply vulnerable. Because they don't own the underlying models, their entire livelihood can be wiped out on a random Tuesday by a software update from a hyperscaler.

SPEAKER_00

Like an update that suddenly does their specific high-paying job natively?

SPEAKER_01

Exactly. Poof, it's gone.

SPEAKER_00

Then you have the augmented worker. About 25 to 35% of the workforce. These are the people who kept their jobs by learning to use AI, but the utopian promise of technology completely failed them.

SPEAKER_01

Right. They aren't working four-day weeks.

SPEAKER_00

No. They're working longer hours for less relative pay because the employer captured all the productivity gains. The AI just raised the baseline of expected output, putting them on a permanent treadmill.

SPEAKER_01

And that leads to the most staggering shift, the bottom tier. Keesling calls them the stranded.

SPEAKER_00

The stranded.

SPEAKER_01

He projects this will be 50 to 65% of the adult population by 2030. This is the massive class of people whose economic utility has simply been priced out by the substrate.

SPEAKER_00

And this raises the inevitable conversation about universal basic income or UBI, because we hear it pitched all the time as this utopian safety net.

SPEAKER_01

Oh, constantly.

SPEAKER_00

But Keesling points out that UBI is no longer a hypothetical thought experiment. He cites that massive study by Sam Altman's OpenResearch.

SPEAKER_01

Right, the three-year trial in Illinois and Texas.

SPEAKER_00

Yeah, where they gave people $1,000 a month, no strings attached.

SPEAKER_01

Now this raises an important question because the data from that trial dismantled a lot of assumptions on both sides of the political aisle.

SPEAKER_00

How so?

SPEAKER_01

Well, the conservative fear was always that free money makes people lazy, you know, they drop out of the workforce to do drugs. Right. Empirically, that didn't happen. Substance abuse actually went down.

SPEAKER_00

Oh wow.

SPEAKER_01

But the progressive hope was that UBI would be a springboard to the middle class. And the trial proved that is also false. The critical insight was that UBI does not produce upward mobility, it produces stable poverty.

SPEAKER_00

I got a pushback here though, just playing devil's advocate. Is stable poverty really a bad outcome if the alternative is mass starvation? I mean, if sixty percent of people are economically displaced, isn't a managed existence with a safety net better than utter destitution?

SPEAKER_01

That is the exact philosophical trap Keesling highlights. The stranded class in 2030 isn't starving in the streets like some Dickens novel. They are heavily managed. They are fed, medicated, and endlessly entertained by the feed. Their fast food cashier is a kiosk, their delivery driver is a drone, their elder care might be robotic. They will interact with more synthetic agents than real humans in a given week. It's a highly pacified existence. They aren't dying, but they have been stripped of all human agency and upward mobility.

SPEAKER_00

Which is absolutely terrifying. But here's where it gets really interesting because the report pivots from a pure warning into a strategic playbook.

SPEAKER_01

A lifeboat, essentially.

SPEAKER_00

Yes. There is a tier that survives and thrives in this new economy without being a billionaire, and he calls it the sovereign operator. If you are a professional, you know, a lawyer, an accountant, an architect, this is the part to laser in on.

SPEAKER_01

The sovereign operator sits between the hyper-rich owners and the highly vulnerable dependent operators. And the key difference here is hardware and data ownership.

SPEAKER_00

Not renting.

SPEAKER_01

Exactly. Instead of renting an AI from the cloud, the sovereign operator runs local open weight AI models. For the non-developers listening, open weight essentially means you download the brain of the AI directly onto a powerful computer sitting physically in your own office or basement.

SPEAKER_00

Like Llama or something similar.

SPEAKER_01

Right. You aren't connected to the hyperscaler's cloud at all.
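For listeners wondering what "running a local open-weight model" looks like mechanically, here is a minimal sketch that talks to an Ollama-style server listening on localhost. The endpoint path, payload shape, and model name are illustrative assumptions, not details from the report; the point is simply that the prompt never leaves your own machine:

```python
import json
import urllib.request

# Assumed local endpoint; an Ollama install exposes this on the loopback interface.
LOCAL_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Assemble the JSON payload for a single non-streaming generation."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(prompt: str) -> str:
    """POST the prompt to the local server and return the generated text."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LOCAL_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `ask_local_model("...")` sends the request to the loopback address only, so no hyperscaler server ever observes the prompt or the response.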

SPEAKER_00

And Keesling gives this incredible real-world example from his own legal practice to illustrate the power of this. He had to cross-reference 65 deemed admissions against decades of Texas case law to draft what's called a Stowers demand. Right. For those outside the legal world, a Stowers demand in Texas is a highly complex, high-stakes settlement maneuver. It requires absolute precision. And he notes that back in 2015, this task would have taken a contract attorney two full weeks of billable hours.

SPEAKER_01

Two weeks. But using his local sovereign AI setup, running against his own private curated database of every Texas Stowers case since 1929, he drafted the demand, verified every legal citation, and had it on the opposing counsel's desk in under six hours.
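The pattern behind that workflow, ranking documents from a private, locally stored corpus, can be sketched with a toy relevance function. Every case name and snippet below is hypothetical, and a production system would use embeddings rather than word overlap; this only illustrates that the corpus stays on your own disk:

```python
# Toy keyword-overlap retrieval over a private local corpus of case summaries.
def score(query: str, document: str) -> int:
    """Crude relevance: count query words that also appear in the document."""
    return len(set(query.lower().split()) & set(document.lower().split()))

def top_matches(query: str, corpus: dict, k: int = 3) -> list:
    """Return the k case names whose summaries best match the query."""
    ranked = sorted(corpus, key=lambda name: score(query, corpus[name]), reverse=True)
    return ranked[:k]

# Hypothetical entries standing in for a private case-law database.
corpus = {
    "Stowers v. American Indemnity (1929)": "insurer duty to settle within policy limits",
    "Hypothetical v. Example (1984)": "deemed admissions and discovery sanctions",
    "Sample v. Illustration (2001)": "negligence standard for refusal to settle",
}

print(top_matches("duty to settle within policy limits", corpus, k=1))
```

Because both the query and the corpus live on local hardware, nothing about the matter ever transits a third-party server.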

SPEAKER_00

Under six hours. That speed is amazing, but the speed isn't actually the moat, is it?

SPEAKER_01

Exactly. The "so what" for the listener is that privacy is the ultimate economic moat. Yeah. Because Keesling ran that entire process locally, what developers call local inference. None of his clients' data, none of his proprietary workflow, and none of his intellectual property ever touched a server owned by Google or Microsoft. If the tech giants cannot observe your workflow, they cannot train their next model on it. If they can't clone it, they can't build a vertical product to replace you. You become unfirable by the algorithm.

SPEAKER_00

You own the data, you own the machine, you own your future.

SPEAKER_01

Yep.

SPEAKER_00

But he also issues a stark warning that this window of opportunity is rapidly closing. Because right now, becoming a sovereign operator just requires buying the physical hardware and spending your weekends learning how to run local models. Right. But by 2030, Keesling predicts this kind of independent setup will require expensive professional licensing.

SPEAKER_01

It's the classic life cycle of disruptive technology. First, it's novel. Then it's widespread. Then the established institutions absolutely panic. Bar associations, medical boards, financial regulators, they're all gonna mandate strict, expensive licenses to run local AI under the guise of protecting the public. When in reality it will build a massive barrier to entry that keeps the middle class out. So the time to build your sovereign system is literally right now.

SPEAKER_00

So if the sovereign operator's entire survival strategy is unplugging from the cloud, what happens to our civic life? Like how does this massive technological earthquake reshape the way we govern ourselves and find meaning?

SPEAKER_01

It's a huge question.

SPEAKER_00

Which leads into Keesling's analysis of politics and religion. And I want to be very clear to you listening here, we are strictly impartially reporting Keesling's analysis. We aren't taking a side.

SPEAKER_01

Right. Keesling looks at the political landscape and concludes that the traditional forms of democracy will remain. We will still have voting booths and candidates. Sure. But the substance of public discourse will be completely hollowed out by AI swarms. These are coordinated networks of synthetic personas that perfectly imitate human voters.

SPEAKER_00

He actually cites research from a team, including Nick Bostrom and Gary Marcus, published in the journal Science, which documents how malicious AI swarms can infiltrate digital communities and shift public viewpoints at an unprecedented scale.

SPEAKER_01

It's already happening.

SPEAKER_00

Yeah. There are already companies, one called Double Speed, another system called Golaxi, that advertised the ability to orchestrate thousands of social accounts to mimic natural localized human interaction.

SPEAKER_01

And Keesling's thesis is that by 2030, voters simply will not be able to distinguish authentic human speech from synthetic speech online. And because of this total collapse of trust, politicians across the entire spectrum, left, right, and center, will find themselves in the exact same bind.

SPEAKER_00

The performative anger thing.

SPEAKER_01

Yes. They will have to rely on what he calls performative anger. Voters will inherently know they are being manipulated by algorithms, so politicians will simply perform credible outrage about the system to win votes.

SPEAKER_00

While simultaneously utilizing those same AI swarms to get elected.

SPEAKER_01

Exactly. It's a closed loop of deep cynicism.

SPEAKER_00

He does note how different global blocs will try to handle this, though. The European Union, he predicts, will pursue aggressive regulation, you know, mandatory watermarking, criminal penalties for synthetic political content. Right. It will likely slow their economic growth significantly, but preserve a somewhat safer, slower public discourse.

SPEAKER_01

China, conversely, is explicitly integrating the substrate into state control by design. Yeah. And in the US, Keesling foresees the rise of a cross-partisan substrate skeptic movement. People from wildly different political backgrounds uniting over a shared demand for verified human spaces.

SPEAKER_00

It fundamentally rewrites our civic identity. But perhaps the most profound cultural shift happens in the realm of religion, where Keesling points out a really fascinating data contradiction.

SPEAKER_01

Oh, this part was so interesting.

SPEAKER_00

Right. The popular assumption is always that advanced technology kills religion. But macro data shows that globally, religion is growing.

SPEAKER_01

Yeah.

SPEAKER_00

Yet if you zoom in on high robot usage areas, you see measurable steep drops in traditional religiosity.

SPEAKER_01

So both things are happening at the exact same time. Yes. Keesling argues that by 2030, the societal split becomes undeniable. For the stranded and augmented classes, traditional faith actually surges. Why? Because a physical church, mosque, or synagogue offers the one thing the substrate cannot easily replicate.

SPEAKER_00

Cheap, physical, in-person community and meaning.

SPEAKER_01

But the tech elite, you know, the operator and owner classes, they drift into something entirely different. Sociologists call it techno-religion. Now, they might not call it religion themselves, they view it as pure rationality. Sure. But the psychology is deeply religious. It's rooted in solutionism. When you spend 14 hours a day orchestrating all-knowing, all-capable AI agents, you start subconsciously treating the substrate as a higher power. You begin to believe that with enough compute, humanity can solve mortality itself.

SPEAKER_00

The substrate becomes your salvation.

SPEAKER_01

Exactly.

SPEAKER_00

He even points to formal AI religions that are already popping up, things like Way of the Future, or an art collective turned movement called Theta Noir that literally worships a speculative AI deity they call Mena. It's wild. They might only have half a million adherents by 2030, but culturally their impact is massive because they attract the wealthy, influential engineers building our infrastructure.

SPEAKER_01

And Keesling's warning here is really about the danger of drift. If you don't actively, intentionally practice something outside of the digital ecosystem, whether that's a traditional faith, a rigorous philosophical practice, or just an unwavering commitment to physical human community, you will default into this techno-religious posture.

SPEAKER_00

The substrate will assign you your meaning. Yes. So bringing this all back down to earth, what does this mean for you, the listener? If Keesling is right and the transition is already cementing, what is your next move?

SPEAKER_01

Well, he breaks it down into highly actionable advice. First, if you have the technical capacity, become a sovereign operator, buy the hardware, learn to run local models, and violently protect your private data corpus.

SPEAKER_00

And if you are currently an augmented worker, you must specialize immediately. You have to find the work that the substrate eats slowly. Right. That means focusing on physical presence work that requires state licensing, like an electrician, a specialized plumber, or an HVAC tech. Or pivot to work where human liability is legally required, like a trial lawyer in a physical courtroom, or a surgeon in an operating theater.

SPEAKER_01

An AI can't go to jail, so human liability remains a premium product.

SPEAKER_00

Exactly. And if you fear you are slipping toward that stranded class, his advice is profoundly, deeply human. Move closer to your family, radically reduce your fixed living costs, and ruthlessly build physical community. Join a neighborhood association, go to a bowling league, attend a local town hall. The feed is engineered by trillion dollar companies to be the perfect friend you never have to call. You have to actively resist that comfort. Pick up the phone, show up in person.

SPEAKER_01

And Keesling ends his report with a specific, really sober assignment directed squarely at the bridge cohort.

SPEAKER_00

The 2028 window.

SPEAKER_01

Yes. By the year 2028, he warns, synthetic content will be so flawless, so emotionally resonant, and so pervasive that all new digital content will be inherently suspect. If you post a photo, write a blog, or record a video of your family after 2028, the default assumption of society will be that it is AI generated.

SPEAKER_00

Man. Because of that impending reality, you have a duty to document the before state right now. Print physical photographs and put them in heavy albums. Write physical books. Write handwritten letters to your children and grandchildren that they can actually hold and keep in a wooden drawer.

SPEAKER_01

You are the last generation that will remember what it meant to be entirely human.

SPEAKER_00

Which brings us full circle back to the Matrix metaphor. Keesling points out that when we talk about the Matrix, we almost always focus on the first movie, right?

SPEAKER_01

Taking the red pill, waking up, destroying the system.

SPEAKER_00

Exactly. But he argues the third movie, The Matrix Revolutions, is actually the most honest forecast of our future.

SPEAKER_01

Right. The third movie doesn't end with a grand human victory, it ends with a settlement. Humanity doesn't destroy the machines, and the machines don't wipe out humanity.

SPEAKER_00

They reach a tense, complicated equilibrium where they simply have to share the same planet.

SPEAKER_01

Exactly. There is no grand escape from the substrate. The system is being built because capital relentlessly chases returns, and the highest returns in human history are in artificial intelligence.

SPEAKER_00

It's not a villainous master plan hatched in a volcano lair.

SPEAKER_01

No, it is just the machinery of global economics doing what it does. We are heading for a shared settlement.

SPEAKER_00

Which leaves us with a really provocative final thought to mull over. If we are truly entering this Matrix 3 settlement, where the synthetic and the human are permanently, irreversibly entangled, and the digital harvest makes all online truth completely unverifiable, what happens to human identity?

SPEAKER_01

That's the real question.

SPEAKER_00

When the only irrefutable proof of your own lived experience, the only verifiable proof of your own soul, is the fading physical ink on a piece of paper you hid in a desk drawer. What physical piece of your mind will you choose to leave behind before the digital harvest makes human truth impossible to prove?