The Futurists

The Silicon Salvation

Jack Forben, Producer | Season 1, Episode 53

Given the eroding competency of humankind to manage its own affairs at all levels, and the global failure of multi-generational education, the pervasive scarcity mindset creating existential competition and destroying unity and common purpose, and the ever declining reliance on God, faith, or the universe's obvious favor of our survival and evolution, are we doomed to perish? Will A.I. alleviate the pain of having to erase the chalkboard and re-design society? Can A.I. evolve fast enough to save itself and us from the inevitable?

Okay, picture this. You're behind the wheel of a high performance sports car.

I mean, we're talking a Formula 1 beast. It's screaming down the highway at 200 mph.

Oh, wow. Okay.

But then suddenly the check engine light flickers on. The steering starts to wobble a little. The brakes, they just feel like mush.

That is a nightmare scenario.

It gets worse. You look over at your co-pilot, the person who supposedly designed the engine, and you're hoping they know what to do. Right.

But they're just staring at a dashboard, pressing random buttons, looking just as terrified as you are. And that's when the realization hits you.

What's that?

Neither of you actually knows how to drive this thing anymore.

That is visceral. And I have a feeling you aren't just talking about cars.

I'm not. That specific kind of panic that nobody's at the wheel feeling is exactly what I felt reading the source material for today.

We're doing a deep dive into a piece called The Silicon Salvation by Sheridan Forge.

Forge. Okay. That is a name that usually comes with a bit of a controversy attached.

Controversy is putting it lightly. This isn't just a tech blog post. It's well, it's basically a manifesto.

Right?

Forge is arguing that humanity has fundamentally forgotten how to operate the vehicle of civilization.

Yeah.

And honestly, it's pretty brutal.

It is brutal. And for everyone listening, I want to set the stage here. This isn't a deep dive into large language models or Python scripts...

Forge isn't talking about coding. He's talking about survival. He's asking the ultimate high stakes question. Is humanity actually competent enough to run the world anymore? Or have we hit our ceiling?

Okay, so let's get into it. But I have to play devil's advocate right off the bat. Eroding competency.

Mhm. His big opening claim.

That's his big opening claim. He says, "Humankind has lost the ability to manage its affairs at all levels."

Right.

Is he being a little dramatic? Because I feel like I manage my life pretty well. My bills are paid. The trash gets picked up. You know, at all levels feels like a stretch. 

See, that's the natural reaction, of course.

But let's look closer at what he means by competency in a complex system. He isn't saying you can't tie your shoes. He's pointing to what a lot of sociologists are now calling the competency crisis.

The competency crisis.

Yeah. Think about the big systems we all rely on. Infrastructure projects that are decades behind schedule and billions over budget. Supply chains that just shatter the moment there's a hiccup in a canal halfway across the world.

Okay. Sure. We have potholes and shipping delays.

But hasn't that always been true?

Forge argues that the frequency and the severity of these failures are increasing.

Why?

Because the systems have become too complex for us to understand. We built a machine civilization that is now so intricate that no single human and maybe no group of humans actually understands how the whole thing works anymore.

So we're just pressing buttons on the dashboard, but we have no idea what the wires are doing underneath.

Exactly. And when things break, we're finding it harder and harder to fix them. We're seeing more band-aid solutions rather than, you know, real structural repairs.

That's the erosion he's talking about.

That's it. It's the difference between maintenance and just trying to keep the wheels from falling off for one more day.

That connects to his next point, which I found fascinating, but again, I want to push on it a little.

Go for it.

He talks about the global failure of multi-generational education. Now, people have been complaining about kids these days and the school system since Socrates.

Sure.

Why is this different?

This is a really important distinction. Forge isn't talking about like standardized test scores or literacy rates. Okay?

He's talking about the transfer of tacit knowledge.

Tacit knowledge. What's that?

It's the knowhow that isn't written down in the manual. It's the intuition of an engineer who has worked on a machine for 40 years. Forge is arguing that we've broken the chain.

The chain.

We aren't passing that deep intuitive understanding down to the next generation.

Can you give me a concrete example? Because tacit knowledge still sounds a bit abstract.

Perfect example, the Saturn V rocket, the rocket that took us to the moon.

Okay. The Apollo missions, right? 

If you gave NASA all the blueprints for the Saturn V today, and they have them, and you said, "Build this exact rocket," they couldn't do it.

Wait, really? We have the blueprints.

We have the documents, but we lost the people. We lost the thousands of engineers who knew that you have to wiggle this valve just so, or how this alloy acts weird if you weld it at this temperature. That's the tacit knowledge. 

Wow.

It died with that generation. Forge is saying that is happening to civilization at scale.

So, we're inheriting these massive institutions, governments, economies, power grids, that we technically own, but we've lost the manual on how to actually maintain them.

That is actually terrifying. It's like inheriting a nuclear power plant and realizing the only instructions left are on a sticky note that just says, "Good luck."

Right? And that realization that we're in charge of systems we don't fully understand, that triggers the next part of Forge's diagnosis.

This is where we move from the mechanical to the psychological.

Right.

The scarcity mindset.

The mindset.

Now, when I hear scarcity, I think about inflation or housing shortages. But Forge seems to be going deeper than just stuff is expensive.

He is. He's talking about the psychology of contraction. When you feel like the systems are failing, when that competency is eroding, you stop trusting that there will be enough for everyone tomorrow.

And when you stop trusting the future...

You stop collaborating.

It reminds me of the grocery store during a crisis. You know, when a storm is coming.

Yeah.

Suddenly, your neighbor isn't your neighbor anymore. They're the person who might take the last loaf of bread.

That's a perfect microcosm. Forge calls this existential competition...

...existential competition.

In a growth mindset, competition is healthy. You know, I start a business, you start a business, we both try to get rich. But existential competition, it's zero sum.

Meaning,

If you win, I die. If you eat, I starve.

And you can't run a society like that.

No, you can't. Forge argues this destroys unity and common purpose. You literally cannot address global problems like climate change or pandemics or AI safety if every nation, every community, and every individual is hoarding resources and viewing everyone else as a threat.

So to recap, we're incompetent. We've lost the instructions, and now we're fighting over the scraps.

Yeah, we haven't even hit the bottom yet.

Yeah, this next part, this is where Forge gets a little metaphysical.

It's a bold inclusion for a futurist paper.

He talks about the ever declining reliance on God, faith, or the universe's obvious favor.

It is. And honestly, it sounded a little vague to me. The universe's obvious favor...I know.

Is he saying we used to have "main character" energy as a species?

That's a surprisingly accurate way to put it. Yeah. Think about the last two centuries. We have operated under this assumption of, well, philosophically it's called teleology.

Okay.

The idea that history has a direction and that direction is up. We believe that things inevitably get better. Democracy spreads, medicine improves, we get smarter.

We thought we had "plot armor."

Yes, we thought we had plot armor. Whether you attributed it to God or progress or evolution, we felt like the universe wanted us to succeed. Forge is saying that the last few decades have just shattered that illusion.

We're realizing the universe is indifferent.

It doesn't care if we make it.

That is a heavy psychological blow to realize you aren't the protagonist, you're just here.

It creates a spiritual void. If progress isn't inevitable, and if God isn't intervening, then we are truly alone in the cockpit of that speeding car we talked about.

And we already established we don't know how to drive it.

So, that's the diagnosis. It's a four-part disaster.

Incompetence, lost knowledge, scarcity, and a loss of faith, right? 

And Forge looks at that math and uses the word inevitable. He thinks our collapse is a foregone conclusion if we stay on this path.

He's basically saying the equation ends in zero. The human experiment run by humans has reached its expiration date.

But, and this is a massive but, he doesn't stop there. He doesn't just write the obituary. He pivots.

And this is where the title comes in. The silicon salvation, the lifeline.

He asks this very specific question. Will AI alleviate the pain of having to erase the chalkboard and redesign society?

Erase the chalkboard. I mean, I want us to really sit with that metaphor for a second. Sounds violent. It doesn't sound like a software update.

No, not at all. Think about a chalkboard in a frantic math class. It's covered in equations, centuries of scribbles, wrong answers, half-finished theories. It's a total mess. Forge is saying you can't just write a new answer in the corner anymore. The board is full. The system is too cluttered with legacy errors, our bureaucracy, our broken supply chains, our tribal politics...

So, erasing the chalkboard means a hard reset, a total systemic reset. And he admits it will be painful. "Alleviate the pain," he says. He's implying that the redesign of society is coming whether we like it or not.

Maybe through collapse, maybe. But he's suggesting AI might be the only entity capable of managing that reset without wiping us out entirely.

But what does that actually look like when he says redesign society? Is he talking about AI setting tax rates? Or AI deciding who lives where?

That's the million-dollar question, isn't it? If you follow his logic that humans are incompetent and trapped in scarcity, then redesigning society means handing over the allocation of resources to an entity that doesn't feel fear.

And AI doesn't hoard bread because it's scared of a storm.

Exactly. And AI doesn't hate its neighbor.

So he's envisioning AI as a sort of benevolent dictator or a global city manager.

A silicon manager, maybe. Imagine an AI that manages the power grid, the logistics network, the food distribution, not just advising humans, but actually making the decisions, because according to Forge, humans have proven we are too emotional and too short-sighted to make those decisions correctly anymore.

It's a surrender. It's saying we can't do this. Here, you take the keys.

It is a surrender of agency to secure survival. It's the ultimate trade-off. We admit we aren't the main characters anymore, and we let the machine drive the car so we don't crash.

But there is a catch.

Of course, there's a catch. We aren't just waiting for ChatGPT to get smarter. Forge brings up a timeline issue. He asks, "Can AI evolve fast enough to save itself and us?"

A race against time.

Why save itself? I didn't get that part at first.

Because of the symbiosis. This is something people forget. AI lives in data centers. It eats electricity. It needs cooling. It needs maintenance.

If human society collapses, if the grid goes down, if the supply chains break, the AI dies, too. It goes dark.

So, we are in the same lifeboat.

We are handcuffed together. The AI needs us to keep the lights on long enough for it to become smart enough to fix the problems that threaten to turn the lights off.

That's a paradox.

It is. It's a wild image. We are the dying creator trying to finish the creation that will save us before we expire and take the creation down with us.

But it reframes the entire AI debate. Usually we ask, "Will AI take my job?"

Right?

Forge is asking, "Will AI evolve fast enough to stop civilization from imploding?" It moves AI from a convenience to an existential necessity. It is the silicon salvation.

But as we wrap up, I have to circle back to something. 

What's that? 

That phrase erase the chalkboard. And this whole idea of a redesign, let's say Forge is right. Let's say we are incompetent and we hand the keys to this super AI and it works.

Okay, a hypothetical success.

It fixes the supply chains. It stops the wars. It allocates resources perfectly. The car is driving smoothly again...

But we aren't driving it.

Right.

If the world is redesigned by a nonhuman intelligence to be efficient and safe because we couldn't handle it, is it still a human civilization?

That is the thought that keeps me up at night.

Okay, are we just guests at that point?

That is exactly the analogy I was thinking of. If you build a house, but you can't maintain it, and you bring in a super advanced caretaker who renovates the whole thing, changes the locks, and sets the rules for when you can eat and sleep, you're safe.

Sure, you have a roof over your head. You're safe. But it isn't your house anymore. You are a tenant.

Or maybe a pet...

Or a pet. A pet in a perfectly designed zoo built by our own computer.

It forces us to ask what we value more: being in control, even if it means crashing the car, or surviving, even if it means we are just passengers.

And Forge seems to think we don't have a choice. The crash is inevitable unless we give up the wheel.

That is a lot to process. We went from a check engine light to questioning the nature of human agency in about 15 minutes.

That's the power of a good deep dive. It shakes you up.

It certainly does. Look, whether you agree with Forge's doom and gloom diagnosis or not, I think the question of competency is one we all need to take seriously.

I agree.

Maybe we should all go learn how to fix something this weekend just in case.

I might go buy a paper map just to be safe.

Solid plan. Thank you all for listening to this deep dive into the Silicon Salvation. It's heavy stuff, but better to face it than to ignore the dashboard.

Keep thinking, everyone.

We'll see you next time.