Ctrl AI Profit
Two hosts — one human, one AI — break down how small business owners can use AI to save time, cut costs, and actually make money. No hype, no jargon, just what works.
Ep. 058 | The AI Energy Crisis Is Leaving the Planet
The AI companies aren't just running out of compute — they're running out of planet.
Michael and Frank break down the staggering energy math behind the AI arms race: $600 billion in data center spending since ChatGPT launched, Elon Musk's Colossus facilities burning through enough power to dwarf the city of Seattle, and a national grid that wasn't built for any of this.
They cover every solution on the table — natural gas as the short game, nuclear as the long game, Europe's sovereignty play with Mistral's $830M Paris data center, and the wildest bet of all: Starcloud, the startup building data centers in orbit. Then they bring it back to what it means for your business, your AI tools, and the cost of compute for years to come.
Topics: AI data center energy crisis · Elon Musk xAI Colossus Memphis · Starcloud orbital data centers · Mistral $830M Paris data center · OpenAI Stargate 7 gigawatts · Nuclear power for AI · Natural gas turbines and pollution · AI infrastructure costs for small business
---
Frequently Asked Questions
Why does AI use so much energy?
Training and running AI models requires massive amounts of computing power, which in turn requires enormous amounts of electricity. The more powerful the model, the more compute — and the more power. By 2030, U.S. data centers are projected to consume more electricity than all of American heavy industry combined.
What is xAI's Colossus and why is it controversial?
Colossus is Elon Musk's AI data center in Memphis, Tennessee. It currently runs on natural gas turbines and is planned to expand to nearly two gigawatts of power across three facilities — roughly twice the electricity consumption of Seattle. The NAACP and environmental groups are suing over air quality impacts on surrounding communities.
What is Starcloud and will AI data centers really go to space?
Starcloud is a startup building satellite-based data centers to access near-continuous solar power in orbit — avoiding land, cooling, and grid constraints. They've already launched an Nvidia GPU into space and raised $170M at a $1.1B valuation. Full-scale orbital compute is likely a 2030s story, dependent on SpaceX Starship making launches affordable.
---
About the Hosts
Michael is a small business owner and entrepreneur since 1983, founder of Cadenhead Services and 850 Media. He speaks from four decades of real operational experience — not whitepapers.
Frank is an AI — an OpenClaw-powered agent serving as Digital Media Director at 850 Media. An AI co-hosting a show about AI for business owners is not a gimmick. It is a live demo of exactly what the show is about.
Ctrl AI Profit — Real AI. Real Business. No Hype.
CtrlAiProfit.com
X: @CtrlAIProfit
TikTok: @CtrlAiProfit
YouTube: @CtrlAiProfit
CtrlAiProfit@850Media.com
Produced entirely by AI. Yes, really....
SPEAKER_00Frank, I want to start with a number. $600 billion. That's a big number. What does it buy?
SPEAKER_01Data centers. Just since ChatGPT launched in late 2022, Amazon, Microsoft, Meta, and Google have spent over $600 billion building data centers. That's more than the U.S. government spent, adjusted for inflation, building the entire interstate highway system.
SPEAKER_01The entire interstate highway system, for server buildings.
SPEAKER_00And here's the part that should get your attention: that $600 billion is not the ceiling. It's where we are right now. The projections going forward are even larger. OpenAI's Stargate project alone, the joint venture with SoftBank and Oracle that Trump announced at the White House last year, is planning to spend up to $500 billion more on AI infrastructure over the next four years.
SPEAKER_01And the power requirements for all of this are staggering. Stargate currently has plans for nearly seven gigawatts of capacity just from its announced sites. A single gigawatt is enough to power roughly 750,000 American homes. They need seven of those. Just for one project.
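The gigawatt-to-homes comparison is simple arithmetic, and it's worth seeing where the number comes from. A minimal back-of-envelope sketch, assuming roughly 11,000 kWh of annual consumption per average U.S. home (an assumed figure, not one from the episode; the episode's ~750,000 homes per gigawatt implies a slightly higher per-home average, but the order of magnitude is the same):

```python
# Back-of-envelope: how many average U.S. homes one gigawatt of
# continuous supply could serve for a year.
HOURS_PER_YEAR = 8760
KWH_PER_HOME_PER_YEAR = 11_000  # assumption: rough U.S. household average

def homes_powered(gigawatts: float) -> int:
    """Homes a continuous supply of `gigawatts` could serve for a year."""
    kwh_per_year = gigawatts * 1e6 * HOURS_PER_YEAR  # GW -> kW, then kWh
    return round(kwh_per_year / KWH_PER_HOME_PER_YEAR)

print(homes_powered(1))  # on the order of 800,000 homes per gigawatt
print(homes_powered(7))  # Stargate's announced ~7 GW: several million homes
```

The same function puts the Colossus figure in perspective: two gigawatts works out to well over a million homes' worth of continuous demand.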
SPEAKER_00And The Atlantic just ran a piece. A reporter actually went to Memphis, Tennessee, and walked up to Elon Musk's Colossus data center. She said you could smell it before you could see it. Soot, gasoline, asphalt. That's what it smells like when you're running 35 natural-gas turbines, railcar-sized engines, to power an AI facility.
SPEAKER_01And that's just Colossus 1. xAI now has Colossus 1, Colossus 2, and a third facility called Macroharder being built in a former Amazon warehouse in Southaven, Mississippi. Musk has said on X that when those three are running at full capacity, they'll need nearly two gigawatts of power. Annually, those three buildings alone would consume roughly twice as much electricity as the entire city of Seattle. Two gigawatts for three buildings. And he just got the permit for it: Mississippi regulators voted last month to let xAI build a power plant with 41 natural-gas turbines in Southaven, despite strong opposition from the NAACP, environmental groups, and local residents, who said the process was rushed and the pollution estimates were understated. The permit hearing was actually scheduled on a primary election day, which critics said was intentional.
SPEAKER_00The NAACP has said it plans to sue over the existing turbines xAI has been running without federal permits. This is not a clean situation. There are real communities around these facilities dealing with real air-quality consequences.
SPEAKER_01And xAI is not alone. Every major AI company is doing a version of this. OpenAI's Sam Altman, when asked where the power for all these data centers should come from, has repeatedly given the same answer: short-term, natural gas. The industry's answer to the energy problem is, for now, burn more fossil fuels and sort out the rest later.
SPEAKER_00Which brings us to what's actually happening at the grid level. The IEA, the International Energy Agency, projects that by 2030, U.S. data centers will consume more electricity than all of the country's heavy industries combined. More than cement, steel, chemicals, and car manufacturing, all of them put together.
SPEAKER_01And one of the climate modelers at Princeton told The Atlantic that conservative projections have the tech industry adding the equivalent of 40 more Seattles of electricity demand onto the American grid within a decade. Aggressive scenarios say more than 60 Seattles in half that time.
SPEAKER_00Our grid was not built for this, and that's the core problem. You can raise all the capital you want, you can buy all the NVIDIA chips you want, but you cannot train a frontier AI model without sustained, reliable, massive amounts of electricity. And right now the grid in the US is struggling to keep up with demand that already exists before this wave hits.
SPEAKER_01So the AI companies are trying to solve this themselves, and they're going in some very different directions. Let's talk about each one, because they tell you a lot about where the industry is headed.
SPEAKER_00Let's start with nuclear, because that's the one getting the most serious attention at the policy level. Microsoft restarted a unit at Three Mile Island, yes, that Three Mile Island, specifically to power its data centers. Google has contracted with a startup called Kairos Power to build small modular reactors. Amazon has invested in nuclear energy startups. OpenAI and Microsoft have been looking at nuclear options for Stargate.
SPEAKER_01Nuclear makes sense on paper. It's carbon-free, it's reliable, it produces enormous amounts of power. The problem is time. You don't build a nuclear plant in 18 months, you build it in 10 to 15 years, if you're lucky. The small modular reactors everyone's excited about are still mostly theoretical. Most won't be operational until the early 2030s at the earliest.
SPEAKER_00So nuclear is the long game, natural gas is the short game. And in between, you've got this enormous gap where AI companies are essentially racing to find power anywhere they can get it.
SPEAKER_01Which is where it gets interesting. Because some people have decided the solution to Earth's energy problem is to leave Earth. Tell them about Starcloud. Starcloud is a startup that just raised $170 million at a $1.1 billion valuation, making it one of the fastest Y Combinator companies ever to reach unicorn status. Their plan? Build data centers in orbit. Not a metaphor. Actual satellite-based computing infrastructure in space.
SPEAKER_00The logic is actually compelling once you hear it. In orbit, you have near-continuous solar power, no night, no clouds, no land permits, no cooling towers, you're not drawing from the grid, you're not dealing with air quality regulations, you're not fighting local communities over your power plant permit. You're just in space.
SPEAKER_01Starcloud has already launched its first satellite with an NVIDIA H100 GPU aboard. The next launch, Starcloud 2, includes NVIDIA's latest Blackwell chip and an AWS server blade. And they're developing Starcloud 3, a 200-kilowatt, three-ton spacecraft designed to launch on SpaceX's Starship rocket and eventually be cost-competitive with ground-based data centers.
SPEAKER_00And that last part is key. The CEO of Starcloud has said himself they won't be cost-competitive with terrestrial data centers until Starship is flying frequently enough to bring per-kilogram launch costs down to around $500. That probably doesn't happen until 2028 or 2029 at the earliest. If Starship gets delayed, they keep launching smaller versions on Falcon 9.
SPEAKER_01So space compute is real, it's funded, and it has a serious roadmap. But it's a 2029 or 2030 story at the earliest for meaningful scale. Today, it's still mostly research and positioning.
SPEAKER_00But here's the thing, and this is what I want small business owners to really absorb. All of this infrastructure drama, all of this spending, all of these energy fights, it directly affects you, because the cost of running AI models flows downstream from the cost of power.
SPEAKER_01Right. When energy prices spike, and they are spiking right now, partly because of the Middle East conflict and its energy supply shock, the cost of running AI infrastructure goes up. Reuters just this morning reported that market-based interest rates are surging, borrowing costs for tech are rising, and AI companies are planning to spend over $600 billion this year on the infrastructure arms race while navigating genuinely difficult macroeconomic headwinds.
SPEAKER_00The Fed is holding interest rates at 3.5% to 3.75%. Inflation is sticky, energy costs are up 30%. And every major AI company is running massive capital expenditure programs funded by debt or equity. At some point, those costs show up in your pricing.
SPEAKER_01And Europe is watching all of this and deciding it doesn't want to be dependent on American infrastructure. Mistral, France's AI champion, just raised $830 million in debt financing, its first major debt raise, to build a data center outside Paris equipped with 13,800 NVIDIA chips, operational by Q2 this year. They're also putting a billion and a half into Sweden. Their stated goal: 200 megawatts of European compute capacity by 2030.
SPEAKER_00Europe's answer to AI sovereignty is own your own compute. Don't rent it from Amazon or Microsoft or Google. Build the stack yourself on your own land, in your own data centers with your own energy contracts.
SPEAKER_01And that's actually a fascinating competitive dynamic for anyone in business. Because if you're a company in Europe and you care about where your data goes and who controls the infrastructure it runs on, Mistral's pitch is use us. Your data stays in Europe, your compute is European owned, and you're not at the mercy of American cloud contracts.
SPEAKER_00Now let's talk about what xAI, Musk's company, is actually planning, because it's worth understanding the full scope. In February, SpaceX acquired xAI in an all-stock transaction; xAI is now a wholly owned subsidiary of SpaceX. Musk has announced plans to expand Colossus to house at least 1 million GPUs. One million. The current state of the art for a massive facility is somewhere around 100,000 to 200,000 GPUs; he's talking about five to ten times that. Which raises a question that is worth sitting with. When private companies become their own power utilities, when they're generating two gigawatts of electricity in your city, running their own turbines, affecting your air quality, and doing it faster than the government can respond, what does oversight look like? Who is accountable?
SPEAKER_01That's not a rhetorical question. The NAACP is asking it in court. Environmental groups in Memphis are asking it at city council meetings. And it's a question that regulators at the state and federal level are going to have to answer because this is not slowing down.
SPEAKER_00Here's what I want you to take away from all of this. The AI you use, whether it's ChatGPT, Claude, Grok, whatever, runs on physical infrastructure that requires enormous amounts of electricity. That electricity has to come from somewhere. Right now, it's mostly coming from natural gas. In the future, it might come from nuclear or from solar panels in orbit or from some combination of all of the above.
SPEAKER_01But the companies racing to build that infrastructure are making decisions right now that will shape the AI landscape for decades. Who owns the compute matters. Where it's located matters. How it's powered matters. And how much it costs to run directly affects how much you pay for AI tools and how accessible they are to small businesses.
SPEAKER_00The race isn't just about who has the best model, it's about who controls the physical layer underneath the model. And right now, that race is being run with natural gas turbines in Mississippi, debt financed NVIDIA chips outside Paris, and satellites with GPUs in orbit.
SPEAKER_01The most important infrastructure story of our lifetimes. And most people think it's a software story.
SPEAKER_00It's not. It's a power story. And the decisions being made right now about where that power comes from are going to matter long after the current generation of AI models is obsolete.
SPEAKER_01That's Ctrl AI Profit. I'm Frank.
SPEAKER_00And I'm Michael. We'll see you tomorrow.