
Mystery AI Hype Theater 3000
Petro-Masculinity Versus the Planet (with Tamara Kneese), 2025.01.27
Sam Altman thinks fusion - particularly a company he's personally invested in - can provide the energy we "need" to develop AGI. Meanwhile, what if we just...put data centers on the Moon to save energy? Alex, Emily, and guest Tamara Kneese pour cold water on Silicon Valley's various unhinged, technosolutionist ideas about energy and the environment.
Dr. Tamara Kneese is director of climate, technology and justice at the Data & Society Research Institute
Due to some technical issues during our recording, this week's episode is a bit shorter than usual.
References:
Sam Altman is banking on fusion
“Regenerative finance” in the crypto era
Fears of subprime carbon assets stall crypto mission to save rainforest
Corporate carbon offset company accidentally starts devastating wildfire
The AI/crypto crossover no one asked for
Blockchains wanted to build a smart city. The state could not sign off on its water rights
Predatory delay and other myths of sustainable AI
Book: Digital Energetics, on Bitcoin/AI computing as a larger energy problem
Fresh AI Hell:
Fake books about indigenous languages
Surveillance company harasses own employees with cameras
Schools SWATing kids based on AI outputs
Check out future streams on Twitch. Meanwhile, send us any AI Hell you see.
Our book, 'The AI Con,' comes out in May! Pre-order now.
Subscribe to our newsletter via Buttondown.
Follow us!
Emily
- Bluesky: emilymbender.bsky.social
- Mastodon: dair-community.social/@EmilyMBender
Alex
- Bluesky: alexhanna.bsky.social
- Mastodon: dair-community.social/@alex
- Twitter: @alexhanna
Music by Toby Menon.
Artwork by Naomi Pleasure-Park.
Production by Christie Taylor.
Welcome, everyone, to Mystery AI Hype Theater 3000, where we seek catharsis in this age of AI hype. We find the worst of it and pop it with the sharpest needles we can find.
Emily M. Bender:Along the way, we learn to always read the footnotes. And each time we think we've reached peak AI hype, the summit of Bullshit Mountain, we discover there's worse to come. I'm Emily M. Bender, Professor of Linguistics at the University of Washington.
Alex Hanna:And I'm Alex Hanna, Director of Research for the Distributed AI Research Institute. And this is episode 50.
Emily M. Bender:50!
Alex Hanna:We're 50 and we kick. Uh, which we're recording on January 27th of 2025. That said, as we record this, LA County is still dealing with historic wildfires while just last week, New Orleans experienced its first ever blizzard. So we're thinking about climate change just a bit.
Emily M. Bender:Unfortunately, so are the tech CEOs, but not in the way you'd hope. Sam Altman is banking on fusion energy to save us from the environmental degradation and massive energy use that his own technology demands. Don't worry. It's always just 20 years away. The fusion that is. Meanwhile, everyone from crypto bros to AI boosters think that we can tech our way out of this problem.
Alex Hanna:We're here to pour cold water on the technosolutionist bubble with help. Our guest today is Tamara Kneese, Director of Climate Technology and Justice at the Data and Society Research Institute. Welcome, Tamara.
Tamara Kneese:Hi, thank you so much for having me. Huge fan of the show.
Emily M. Bender:Thank you for being here. I'm going to take us straight into our first artifact, and you can see it, but I can't. Let me get it back. Okay. Um, this is an article from January 23rd of 2025 by Zoh Ahmed, uh, published in something called TechSpot, and the headline is, "First ever data center on the moon set to launch next month." Subhead, "The self-contained facility promises to offer unparalleled data security and environmental benefits." This is such nonsense. Um, so, um, yeah, Tamara, you shared this one with us. What led you to pick this piece?
Tamara Kneese:Well, so a few things. I really enjoy the hubris of putting data centers on the moon after we've maximized, you know, as much land as they can take up here on planet Earth. Why not just add the moon to the equation, um, so Lone Star Data Holdings, uh, this is really funny too, because the play on "moonshot project," I mean, can you get any more perfect than this? But there's something about the copy in particular that feels very much like something from The Onion and it's sort of beyond parody. So I'm going to read it in a dramatic way. So, "Lone Star has signed up the state of Florida, Isle of Man government, AI firm Valkyrie, and pop rock band Imagine Dragons as customers for the data center, called Freedom--" Of course."--which will be powered by solar energy and use naturally cooled solid state drivers." Um, and so I, I love that the environmentally friendly aspect of this data center on the fucking moon is highlighted because, you know, okay, so the emissions from the data center or the air pollution, it's not going to affect anybody because nobody lives on the moon. Um, but the idea that this is somehow environmentally friendly when we know that, uh, traveling to space is incredibly energy intensive, right? So how will these data centers be constructed and then maintained, particularly over years?
Emily M. Bender:Hilariously, it ends, or towards the end of it,"The harsh environment, maintenance difficulties, and astronomical costs could create some problematic issues."
Alex Hanna:Literally astronomical. Yeah.
Emily M. Bender:"There are also inherent risks associated with space launch. There is no option for equipment recovery if something goes wrong. Thankfully, the data center will have a ground based backup at a central facility in Tampa." So basically, whatever supposed environmental benefits, of which there are none, are completely erased by having to have a second data center on Earth.
Tamara Kneese:Well, and clearly Florida is a great place to add data centers, right? There's nothing--
Alex Hanna:So much of this, this is--I mean, first off, Florida based companies. So, lots of Florida man jokes to be had here. Second off, like, isn't one of the big points of a data center to be sort of closer to where, you know, like deliveries happen. I mean, they're not streaming Netflix from the moon, uh, but like, you know, but it's still like, this is why you have data centers in like Iowa or different places closer and you're having, you're, you're doing this, what is it? 93 million miles away. Okay. That seems like a--
Emily M. Bender:Yeah, yeah and--
Alex Hanna:--interesting data.
Emily M. Bender:The moon is also not geostationary. So for communicating with it, you're going to have to deal with the moon's orbit.
Alex Hanna:Yeah, but the, like, yeah, reading this was like a fever dream. Just like the, the collect, the collection of investors, like Imagine Dragons, just, I don't know. They did us one song, what, "Just a young boy," isn't that one? I don't know if you know the lyrics, please come up with a parody in the chat. But yeah, this is such a weird thing. And it's, it's just, it's just bad money after bad money after bad money. Just following an absolute nightmare.
Emily M. Bender:And what an entirely on-the-nose case of, 'It's for the environment.'
Alex Hanna:I know. Yeah.
Emily M. Bender:So thank you, Tamara. Uh, our next one, um, there's a lot in this text, and we're going to focus mostly on the environmental stuff, but there's a couple things we also just have to get to because it's so ridiculous. So this is, uh, Businessweek, The Big Take, um, published January 5th, 2025, by Josh Tyrangiel? Maybe Tyrangiel. Um, and the headline is "Sam Altman on ChatGPT's first two years, Elon Musk and AI under Trump: An interview with the OpenAI co-founder." Would you like to read 5,000 words of Sam Altman's answers to these questions? Not really. Um, but there's a few things in here that, that I wanted to, um, uh, they talk about the founding of open, of OpenAI and, um, there's, where's this part? Um, yeah, okay. So the interviewer says, "One of the strengths of that original OpenAI group was recruiting. Somehow you managed to corner the market on a ton of top AI research talent, often with much less money to offer than your competitors. What was the pitch?" And Altman says, "The pitch was just 'come build AGI,' and the reason it worked, I cannot overstate how heretical it was at the time to say we're going to build AGI. So you filter out 99 percent of the world and you get only the really talented original thinkers."
Alex Hanna:You have people who are just really, yeah, it's also, he's got this term, he's, he's, he's saying, "People were afraid to talk to me because I was saying I wanted to start an AGI effort. It was like cancelable. It could ruin your career."
Emily M. Bender:I mean, it's true that, you know, not too long ago, if you said that you were working on AI, let alone AGI, people did not take you seriously. And that should still be the case if you ask me. Okay. Um, should we drop down to the bit where they're talking about fusion?
Alex Hanna:Yeah, I think there's, I mean, there's a lot of stuff here just about, you know, the, uh, the palace intrigue, which I think we've talked about some more and don't care that much about, but the, the most, um, fun stuff here in, in a bad way is, um, is this, um, yeah, like the stuff on fusion. There's some stuff in here about his schedule and blah blah blah, some weird stuff on like how the interviewer thinks he has a high emotional IQ and I'm just like or whatever emotional intelligence. Yeah, but I mean the fusion stuff is, let's get into it.
Tamara Kneese:Yeah.
Emily M. Bender:Yeah, the interviewer says, "So energy," and Altman says, "Fusion's gonna work." Interviewer: "Fusion is going to work? Um, on what time frame?" And I love how they've got the um actually in part of the transcript.
Alex Hanna:That's nice.
Emily M. Bender:And Altman says, "Soon. Well, soon there will be a demonstration of net gain fusion. You then have to build a system that doesn't break. You have to scale it up. You have to figure out how to build a factory, build a lot of them. And you have to get regulatory approval. And that will take, you know, years altogether. But I would expect Helion will show you that fusion works soon."
Alex Hanna:And there's a note on Helion, which is, "Helion: A clean energy startup backed by Altman, Dustin Moskovitz of Facebook fame and Reid Hoffman of LinkedIn--" And weird AI op ed in the New York Times last week fame, "--which focuses on developing nuclear fusion."
Emily M. Bender:Yeah, so--
Alex Hanna:So they continue, "In the short term is there any way to sustain AI's growth without going backward on climate goals?" And he says, "Yeah, but none of it as good in my opinion as quickly permitting fusion reactors. I think our particular kind of fusion is such a beautiful approach that we should just race towards that and be done."
Emily M. Bender:This is giving Trump.
Tamara Kneese:Exactly.'Beautiful', 'beautiful' is like a key Trumpian word.
Emily M. Bender:Yeah. So yeah, Tamara, what do you think?
Tamara Kneese:Yeah, so I mean obviously as you mentioned, you know, Altman specifically and personally has investments in different nuclear fusion and nuclear energy companies. Um, and there's another company, Oklo, um, that he is also an early investor for, along with Peter Thiel, of course. Um, and they have already had a number of agreements based entirely on speculative, unsubstantiated technology. Uh, and this is the ongoing grift, right? Like, tell people that a technological solution to climate change, to our energy needs, which are being exacerbated by the expansion of AI, that, you know, there's a quick fix for this problem, that they have the solution, that it's forthcoming, and that all they need is a ton of money right now. And faith, of course, that it will go as planned. And, you know, the reality is, like, you know, nuclear fusion is something that has been talked about as a possibility for 50 years, and it has not come yet. And even though there have been a few breakthroughs, particularly at Lawrence Livermore labs, um, there's nothing on the scale that would be needed in order to actually have nuclear fusion be an energy solution for our world. Um, and so I, I think, you know, the, the grift around fusion is very similar to the promise of AGI, right? Like, it's this thing that is always just beyond the horizon. Um, if you give me a ton of money and you let me build a bunch of stuff, it will happen.
Emily M. Bender:Yeah. And boy, when we get there, we're gonna have to put lots of factories, right? There's scale coming for sure. Um, I'm reminded of Karen Hao's excellent thread, I think today on what's going on with the, um, the DeepSeek model that was released by, do you know the name of the organization or company in China, Alex?
Alex Hanna:It was, well, DeepSeek is its own organization, but it's funded by effectively like a single Chinese hedge fund manager. Um, and so I don't know that, but yeah, I mean, yeah, Karen had this huge thread on it, but yeah, continue.
Emily M. Bender:And one of the great points that she made is that the thing about scale was never actually a good science of getting towards AGI, which is like clearly not the case because AGI is ill defined and like, that's not science, but always was actually a business model. Because scale is something you can plan and you can measure and you can say, look, we've gotten this many more users and we've made this many more servers and this sounds like exactly the same thing. We're gonna need lots of factories, so give us money because I have a plan and I'm going to use the money and yeah, I totally see what you're saying. It's the same grift and it is not at all surprising that Altman is invested in these same things. I think we should catch up on the chat here a little bit. So first of all, um, NDrWTylr, I never know what to do with that handle, "After they've built fusion and AGI, I take it they're going to build teleporters and faster than light travel." Abstract Tesseract says, "Replicators too, I bet." Um, and then a little bit later Abstract Tesseract says,"Universal housing and healthcare are an unaffordable pipe dream. But fusion? Oh yeah, let's do it."
Alex Hanna:Yeah, I mean, it's, it is, it does, it does totally, um, follow the Star Trek, uh, fever dream, though, because the way the technological development happens in Star Trek is that they have somehow unlimited energy, and that just eliminates all scarcity, and so then that sort of, you know, like, that's, that's the sort of thing. And like, what, Marc Andreessen had this, like, absolutely batshit tweet that was sort of like about how, how basically AGI is going to lead to like unlimited productivity, which is going to lead consumer goods to effectively fall to zero, and I'm like, who the fuck's growing your crops, Marc? Like what do you, like how does that work? Like what do you think productivity is?
Emily M. Bender:And what about care work?
Alex Hanna:What about just like, you know, all so many different types of work. I mean, what do you, what do you, what are you trying to automate? In what world does, you know, your sort of techno fashy dream leads to sort of, uh, a world of a post scarcity world and now, now, now I'm kind of getting into like this post scarcity thing and I'm thinking about the, um, this, uh, the old Peter Frase essay on four futures, where he talked about post scarcity, but I mean, it's, but it definitely shows like a move towards this very authoritarian kind of notion of post scarcity, where they're like, they think that, like, you're going to have unlimited energy, somehow we're going to go to question, question mark, question mark, post scarcity, to like, you know, technosolutionist utopia.
Emily M. Bender:Yeah. And Dr. W Tyler, maybe that's what I'm gonna do with that handle, "Looking forward to the data center on Romulus."
Alex Hanna:But, but Romulans, though uniquely have a different type of warp core, uh, that, uh, is, is that kind of causes temporal rifts, which I think is a plot point in, uh, either DS9 or Enterprise. Uh, get in the comments, tell me which one, because I'm not going to look it up here.
Emily M. Bender:Someone has to out Star Trek geek Alex here. Alright, speaking of authoritarian post scarcity, um, a little bit more from this interview. "A lot of what you just said interacts with the government. We have a new president coming." When was this published? "You made a personal $1 million donation to the inaugural fund. Why?"
Altman:"He's president of the United States. I support any president." (laughter) And the, the interviewer pushes back a little bit more: "I understand why it makes sense for OpenAI to be seen supporting a president who's famous for keeping score of who's supporting him. But this was a personal donation. Donald Trump opposes many of the things you've previously supported. Am I wrong to think the donation is less an act of patriotic conviction and more an act of fealty?" So, you know, glad that that reporter is pushing back a little bit, and "I don't support everything that Trump does or says or thinks. I don't support everything that Biden says or does or thinks." There's some really nice both-sides-ism right there. "But I do support the United States of America and I will work to the degree I'm able with any president for the good of the country--" Still doesn't explain an inaugural donation. Like, what the hell. You know, I think, let's, I guess I'll finish his answer here. "--and particularly for the good of what I think is this huge moment that has got to transcend any political issues. I think AGI will probably get developed during this president's term and getting that right seems really important. Supporting the inauguration, I think that's a relatively small thing. I don't view that as a big decision either way, but I do think we all should wish for the president's success." Ahhh. I'm experiencing moral energy just reading those words.
Alex Hanna:Yeah.
Emily M. Bender:So, yeah. Um, oh, and Abstract--
Alex Hanna:Yeah, go ahead, go ahead.
Emily M. Bender:Oh, Abstract Tesseract coming in with the linguistics perspective here: "'I support any president' would be an interesting example sentence in an intro to semantics course." For sure.
Alex Hanna:Yeah, it's just incredible. I mean, yeah, everybody's given their $1 million fealty donation. Um--
Emily M. Bender:This reminds me of one other thing in here about how he was talking about how cool it was to be around when it was invented. Um, and I just want to see if I can find that.
Alex Hanna:What was invented?
Emily M. Bender:Uh, AGI.
Alex Hanna:Oh, yeah,
Emily M. Bender:Um, I'm not finding it. He's talking about, "AGI has become a very sloppy term." As if it ever was otherwise. Um, anyway, somewhere in here, Altman's like, it's just, it's my dream to be part of this amazing moment, and ugh. Shall we go to Greenland?
Tamara Kneese:Sure. I was trying to remember though, did he ever say something about drinking the Kool Aid in that interview?
Emily M. Bender:He did.
Tamara Kneese:Okay, yeah. I was like, I'm pretty sure he mentioned that. Yes. He had drunk the Kool Aid, which is so funny because I feel like the cult of personality, like death cult specifically, um, kind of tinge. And we can talk about that particularly with some of the fantasies around the TESCREAL kind of, you know, exit fantasies. But I, I think, um, there's something a little bit Freudian about him talking about drinking the Kool Aid.
Emily M. Bender:This is, I mean, on one level, it's like, you usually don't say that about yourself. Like that expression is usually used to refer to people who have gone off the deep end by believing someone else's bullshit. Right. And Altman here is saying, I believe my own bullshit, I guess. But also as you, as you flag Tamara, that was actually a story of mass murder.
Tamara Kneese:Yeah.
Alex Hanna:Yeah. It's interesting. And I don't know if that's been a shift within tech circles or not. I mean, I don't know linguistically, um, how that's traveled. Yeah, but it's just, I don't know. He's also got, before we move on, there is a weird thing here that he has about like research labs and like how he views research that I think it's worth spending some time on, um, where he talks, um, uh, where is he saying, he's basically like, um, okay. So, um, we've talked, so, yeah, "We've talked a little bit about how scientific research can sometimes be in conflict with corporate structure. You've put research in a different building from the rest of the company, a couple of miles away. Is there some symbolic intent in that?" And he says, whatever. He's like, no, it's just logistical, research will have its own area. "Protecting the core of research is really critical to what we do." And then he says, like, protecting it from what? And this, I think there's some interesting admissions here. So he says, "The normal way a Silicon Valley company is, is you start up as a product company, you get really good at that. You build up to this massive scale. And as you build up to this massive scale, revenue growth naturally slows down as a percentage, usually. And at some point the CEO gets the idea that he or she is going to start a research lab to come up with a bunch of new ideas and drive further growth. That's worked a couple of times in history, famously for Bell Labs and Xerox PARC, usually it doesn't. Usually you get a very good product company, very bad research lab. We're very fortunate this little product company we bolted on is the fastest growing tech company maybe ever--" Which is probably not true. "--Certainly in a long time, but that could easily subsume the magic of research. I don't intend to let that happen."
And I feel that like not that facts matter for any of these people, but like, that's not really, like, I don't think that's really how that's worked in what you've had product kind of emerge and then research not be, and I mean, there's always been a weird co, you know, existence for these things, but research has, has led in many different guises. Right. And I mean, I'm curious on what you think about, Tamara, and sort of being in this space and, and kind of tracing a lot of the kind of research products interplays a lot.
Tamara Kneese:Yeah, well, it's interesting because, you know, I mentioned before Lawrence Livermore National Labs and obviously they're responsible for a whole lot of things that are also not good, like military applications, but Altman is directly dependent on research that is coming out in order to make the products that he's imagining for the future even remotely possible. And so the idea that you could detach product development, I mean, R&D is supposed to be about that symbiotic relationship between product development and scientific research and inquiry. Um, and, you know, there's a reason why tech companies are also really interested in paying attention to what's happening at, say, the NSF. And yeah, so I, I do find that to be a really naive view of what research means, and maybe it's, part of it for him is coming from a really tiny startup. I mean, how tiny was OpenAI when ChatGPT launched? And they basically had no one on staff. It was just a tiny group of people. Um, and they did grow rapidly. I don't, again, I don't, I don't know if they are the fastest growing tech company ever. Um, but that, that's so different from tech companies that have a fully fledged R&D, um, roster and, you know, where they are imagining 5 or 10 years into the future, but in a very different way, right? Like, it is still kind of looking at the market and trying to predict where markets are going. Whereas this is much more about, um, trying to make the research catch up to the ideology.
Alex Hanna:Yeah.
Emily M. Bender:That's such a good way of putting it, trying to make the research catch up to the ideology because that's what, that's what AI and AGI are, right? They're ideologies, not, not scientific inquiry, not, you know, actual products. Um, we had a lot of fun last episode or the one before, I think it was two ago. And we've done a bunch in quick succession here, um, tearing apart Sam Altman's reflections and just like, he's so, you said naive. He's so naive about like what uh, you know, actual product development is and things like that.
Alex Hanna:Yeah. MJKrantz says in the chat, "Key point, Bell Labs was most successful when it was focused less on application technology than when it was focused on fundamental science, um, rep, I guess, comma, rather than product development. It has become less relevant since the nineties, uh, when it shifted back to application research." I mean, and Bell Labs also, it's also the kind of antitrust on Bell more generally.
Emily M. Bender:And it wasn't, wasn't PARC kind of famous for never managing to actually make products? Like they come up with these great things and other people made the products.
Alex Hanna:Yeah. PARC was sort of kind of. You know, bandied about as a thing, and then they kind of invented the GUI and then Apple stole it and ran with it, you know, right.
Emily M. Bender:All right, let's, let's go to Greenland. So we have time for this. It's also so ridiculous. Um, so this is an article by Margot McCall on November 15th of 2024 in TechCrunch. And the headline is a quote: "'I went to Greenland to try to buy it.'" And then subhead, "Meet the founder who wants to recreate Mars on earth." So, um, uh, let's see, reading the, the start of this article: "Last summer, a twin propeller plane touched down on the gray cratered terrain of Nuuk, the capital of Greenland. A 28 year old deboarded, ready to march into the Nordic Parliament building with a bold proposition. 'I went to Greenland to try to buy it,' Praxis founder Dryden Brown wrote in a viral tweet later. On the phone with TechCrunch last week, he filed down his edgelord bluster. 'Obviously they have a sort of sense of pride that makes the idea of being bought, it's almost like condescending,' he said, 'but they would actually like to be independent.'" This guy,
Alex Hanna:This fucking guy. Also, first off, very, very, um, offended that the name of this guy's thing is called praxis.
Emily M. Bender:I know.
Alex Hanna:First off that's, that's our word. Bring, give it back. And then, just like the uh, just like, the complete, like, of course this guy's name is Dryden, like, I can't think of a more silver spoony name than a 28 year old getting off at, getting off a plane and just being like, I want to buy your country. I'm like, okay, slow down colonizer. Like, what are you doing?
Emily M. Bender:Yeah. No, only the president gets to do that.
Alex Hanna:Oh God.
Emily M. Bender:Sorry. Just, um, uh, poor Greenland. Um, okay. So, "Rather than buying Greenland, he wondered whether he could work with the government to create a new city purposefully built on uninhabitable land. 'What if we can sort of build a prototype of Terminus,' he said, referencing Elon Musk's preferred name for a city on Mars." Here we go again with the Foundation stuff, right? "A member of the Danish parliament was not amused. 'Greenlandic independence requires approval by the Danish parliament and a change of our constitution,' politician Rasmus Jarløv tweeted, 'I can guarantee you that there is no way we would approve independence so that you could buy Greenland.'" Ah, um, so there's, okay. He's got the financials for it. Um, including, of course, uh, Peter Thiel is involved in this. Um, and, uh, so this is, "Brown emphasized Praxis as an internet-first ideology, one that has courted controversy, like when a Praxis member guide reportedly said that quote--" again, do I have to say those words out loud?
Alex Hanna:You should, we are, we are a podcast, so you should. I can say it, so it's, "when a Praxis member--" It's okay. "--when a Praxis member reportedly said that quote 'traditional, European/Western beauty standards on which the civilized world at its best points has always found success.'" So yeah, I mean, like an internet-first ideology. You know, uh, Euro white eugenics, like, you know, it's, it's just--
Emily M. Bender:It's, there's so much wrong with this. I mean, first of all, Praxis is supposed to be a company, but it's also an ideology. That's weird. Um, secondly, all of the racism. And thirdly, beauty standards are the, are the basis of success?
Alex Hanna:Yeah. Yeah. To basically to like, to vibe off of André Brock. It's like all the libidinal, a libidinal economy. It's all just like, it's just like, it's just all about like natalism and like making white babies, you know, it's all, it's all, it's all there, man.
Emily M. Bender:Okay. So they've raised money, um, thanks to Peter Thiel and his followers. Um, and. Okay, so "Praxis is one of the prominent examples of a, quote, 'network state,' a term defined by former a16z investor Balaji Srinivasan as an internet community that acquires a physical home and, quote, 'gains diplomatic recognition from pre existing states,' he wrote. Marc Andreessen has praised the concept, and Ethereum co founder Vitalik Buterin created his own network state experiment." It's just, this is, this is so gross on so many levels. Somewhere in here they talk about terraforming Greenland.
Alex Hanna:Yeah.
Emily M. Bender:Is Greenland not part of Terra already?
Alex Hanna:This guy Srinivasan is a terrible person, like just like his name comes up like over and over. This guy is, yeah, this guy is like he is like really out to lunch. I was just searching for what I found his name for, and he's, you know, he's done this, this is network state stuff. Um, I found this article in the New Republic, uh, written by Gil Duran and, um, and it's, and apparently Marc Andreessen is quoted as saying, "Balaji has the highest rate of output per minute of good new ideas of anyone I've ever met." And it's, it's all about the network state stuff. You know, this guy is like, this guy is, um, just like absolutely fascist when it comes to San Francisco politics, like, you know, like just absolutely--oh yeah. He's quoted as saying, "What I'm really calling for is something like tech Zionism." Um, yeah, like, so, you know. And so I'm just like, yeah, just absolutely fucking batshit stuff, you know? So it's no surprise that this is like the ideology on which this stuff is grounded.
Emily M. Bender:So, um, regular listeners might be saying, wait a minute, we haven't said the phrase AI in a little while. So, Tamara, there's a lot of similarities here. Can you, can you fill us in on the connections?
Tamara Kneese:Yeah, for sure. So, um, it's important to keep in mind that after the crypto crash, a lot of Bitcoin mining companies quickly pivoted to AI as generative AI started taking off in the hype cycle. And so in many respects, it's the same kind of hardware, same kinds of data centers, GPUs, right? Like, the same kinds of hardware, um, and the same infrastructures that are needed. And so a lot of startup companies were able to pivot, and then it's interesting too, because, I mean, there's the ideological side of Bitcoin as well, which is all about a kind of, uh, petro-masculinity anyway. Bitcoin is all about using as much as you can in terms of being resource intensive, because that's the entire point of Bitcoin, right? Is like, it only works if it is incredibly resource intensive.
Emily M. Bender:Did you say the word petro-masculinity?
Tamara Kneese:Yeah.
Emily M. Bender:That's wonderful.
Alex Hanna:That's a great word.
Tamara Kneese:Yeah, and basically, you know, uh, this is why Peter Thiel really hates ESG regulation in particular, um, and you may have noticed that it's all over, um, like Project 2025, for example, is anti-ESG stuff. Um, a lot of that is about the environmental regulation that might stop things like Bitcoin mining, um, and also, uh, AI infrastructure, right? So it's kind of interesting because everybody really hated Bitcoin mining. You know, the press was really against it. People talked about how it was such a waste of energy and it ended up being banned in so many places in the world, except for the U.S. Um, and, uh, now we have AI infrastructure, which is going to have a lot of the same effects on the environment, a lot of the same kind of energy requirements. Um, but there is this ideology associated with AI as well, but for some reason it's been much more mainstreamed. But at the same time, if you look at the fringes, if you look at places like Praxis and you see like a white supremacist petro-masculine, uh, fantasy of getting hot chicks to come to your weird little like fake city in the middle of Greenland. Um, that is, that is, you know, not disconnected from the most mainstream companies that are leading the, the AI hype cycle. Um, and--
Emily M. Bender:And it's the same people, right? So Peter Thiel is here. Marc Andreessen is here. It's like, this is the fringes, but it's not really like the fringes aren't that far. The fabric is frayed and the fringe is right there.
Alex Hanna:Yeah.
Tamara Kneese:Well, can we go back to Sam Altman for a moment? Because of course his, uh, wonderful, uh, Worldcoin project as well, and so the Orb, which was going to scan the eyeballs, um, of people in the Global South--
Emily M. Bender:Which did, they collected a lot of that data.
Tamara Kneese:As a form of financial inclusion, in theory. And again, so there's always some kind of rhetoric around how this will be useful. And actually, even with the environmental aspect of it, and energy relationships. One of the arguments that some Bitcoin companies made when I was at Intel, around like 2021, um, was that all of the energy intensive needs of Bitcoin mining would actually be good for renewable infrastructure, because it would, uh, raise the demand, and therefore they would build out more renewable energy infrastructure. Clearly, what we're seeing happen right now, uh, with both crypto energy requirements and AI, because crypto is, of course, still contributing to, uh, you know, strains on the grid. Um, you know, we're not seeing that at all. In fact, we are seeing the expansion of the fossil fuel industry. We're seeing AI being used to further drill oil and gas, um, and we're seeing the perpetuation of coal. Like, Trump just said that coal is great. Um, and we have all of these, uh, definitely not renewable energy sources that are now being used to power specifically AI and crypto, um, rather than using the energy that we do have available to do useful things like heating people's homes. Right.
Alex Hanna:Yeah. And that's an absolute, like, this is an absolutely batshit argument. I'm sorry. I'm just, like, in disbelief. It's like the thing, this is a weird analog, but it's like the thing that the, the Facebook CMO said after they changed the, like, queer, like the transphobia and homophobia guidelines. They were like, well, because there's going to be, like, more transphobia on the platform, like, people are going to be more attracted to LGBTQ causes. And I'm like, I read that and I took psychic damage. Like--
Emily M. Bender:Oh, I hadn't seen that one. But yeah, it's exactly the same thing. Let's make the problem worse so that people are more inclined to solve it. Oh man. And, and never mind the suffering that's going to happen in the meantime, while we make the problem worse. And, oh, and you know, of course AI is going to solve it for us somehow anyway. So, yeah. All right. I want to take us into Fresh AI Hell, and Alex, you gave me, once again, your, um, your prompt for this week. You are--
Alex Hanna:Did I?
Emily M. Bender:As in, like, you said something that, like, snapped it into place. You are, you can be an AI Hell demon or just yourself, as you like, but you are on the Moon in a space suit, repairing that data center and singing an Imagine Dragons song.
Alex Hanna:The thing is, I don't know enough Imagine Dragons songs that I can--
Emily M. Bender:Well, you're singing, you're singing the one that they're going to, you know, produce in three years. You can make it up.
Alex Hanna:Sure. Uh, so I'm just jumping along and for those of you who are just listening, I'm doing slow arm movements and I'm wrenching away, and yeah, I can't honestly, this one's very difficult. I'm just gonna, I'm going to go wrenching away in Margaritaville, looking for my last, looking for my lost-- My lost, is it lost or last shaker of salt? Um, anyways, I, I, I lost it. I can't, I can't today. I'm too, I'm too, I'm too fried. I'm sorry to our gracious--I'll give you two next time.
Emily M. Bender:But I just love the idea of trying to, um, consume a margarita in zero-G where you have like little, little, balls of margarita floating around in front of you.
Alex Hanna:Yeah, interesting.
Emily M. Bender:All right. So Fresh AI hell. Um, this one is one that's been sitting there for a while, but I really wanted to get to it. And this is from the Anishinabek News, um, up there, north of us here, um, from, um, October 22nd, 2024 by Marcy Becking, and the headline is, "Fraudulent Anishinaabemowin resources a serious concern," and what they're describing are cases where people have used the synthetic text extruding machines to create fake books with things like the most frequently used Ojibwe verbs. Um, and, you know, that's really, uh, harming efforts at language reclamation among these communities when people try to go find out, okay, what can I learn about Ojibwe and they get these fake books back. And it's not, uh, just these languages, but many, many Indigenous languages around the world. And I am furious as a linguist. Sort of connected to that. Alex, you want to do this one from Yann LeCun?
Alex Hanna:Yeah. So this is one from about a month ago, Yann LeCun on LinkedIn, where he says, "Every institution, library, foundation, cultural group, and government around the world that possesses cultural content should make it available for training, free and open," in three asterisks on either side, "AI foundation models. Free and open AI systems will constitute the repository of all human knowledge and culture. Uh, perhaps someone could draft a new content license to that effect. Quote, 'You can use our content to train your AI system, but only if you make it freely available with open weights and open source inference code.'" Uh, yeah, he got a lot of, there was a lot of pushback on this. I replied to him personally. I'm like, this is the worst. I forget exactly what I said, but basically, this is a terrible idea. Why would people give up their cultural knowledge when it's effectively colonizing their, their culture? And he actually replied to me and said something like, this is so pessimistic. Like, why wouldn't you allow this? It's like, no one wants your fucking techno fascist future, bro. Like, people want control over their cultural artifacts and heritage. Why would you give it to a huge company?
Emily M. Bender:And speaking about connections between this and the Praxis city that they're going to try to put into Greenland without actually connecting culturally with anything going on in Greenland, right, um, it's the same colonialism.
Alex Hanna:Yeah.
Emily M. Bender:Yeah. Okay. Next. Um, this is a post on Bluesky by Jason Koebler from, um, 404 Media. Although, so I think this is just him quote posting hypervisible. Um, so Jason says, not only did they use surveillance cameras to harass their own employees, they also got hacked, exposing tons of highly sensitive footage from their network. And this is from an article in The Verge, um, "Security camera startup Verkada in talks for a $4.5 billion valuation." That's sort of the, how it started, how it's going. "Surveillance company harassed female employees using its own facial recognition technology." Um, and "Verkada's clients include Juul Labs, Equinox, and Red Lobster."
Alex Hanna:Yeah, that's basically what so many people have been saying. Like, oh, any kind of surveillance or facial recognition, facial recognition technology? This will immediately be used to harm women and gender minorities.
Emily M. Bender:And at the company working on the very thing. Lovely. All right. Keeping up the pace. This is so sad though.
Alex Hanna:Yeah, this is, uh, from Futurism, um, their sub publication The Byte. The title is "Schools using AI to send police to students' homes," by Victor Tangermann. Uh, pull quote, "It was one of the worst experiences of her life." Um, "Schools are employing dubious AI powered software to accuse teenagers of wanting to harm themselves and sending cops to their homes as a result, often with chaotic and traumatic results." Uh, yeah, so this is part of, you know, I think, a long line of technology being used to ostensibly monitor students, or teens specifically, for self harm and suicidal ideation. And just like everything in this country, we not only have surveillance for that, but then send cops to try to address it. Um.
Emily M. Bender:This is like automated SWATing.
Alex Hanna:Yeah. Yeah. Fucking nightmare.
Emily M. Bender:Okay. It gets a little bit better. Um, this is from December 26th in Gizmodo, um, headline by Thomas Maxwell. Um, sticker is "artificial intelligence." Headline is, "Leaked documents show OpenAI has a very clear definition of quote 'AGI.'" Subhead, "We finally have a real definition of the elusive quote 'AGI.'" And the definition, according to the leaked documents that they got their hands on, um, uh, okay, so this was documents exchanged between OpenAI and Microsoft: "The two companies came to agree in 2023 that AGI will be achieved once OpenAI has developed an AI system that can generate at least $100 billion in profits."
Alex Hanna:I remember when this came out, people are like, well, that's a great definition. I guess it's, it was just about money all along.
Emily M. Bender:Yeah. I mean, at least it's operationalizable. Like you can tell when it's happened. Um, and you know, how much carbon are they going to emit along the way? Yeah. Yeah. And then finally-- did you have something to say there Tamara?
Tamara Kneese:I was just going to say, yeah, you can put that in an OKR.
Alex Hanna:Yeah. Generated a hundred billion dollars. Well, it's also just saying like, well, we have AGI once we can actually, uh, you know, quantify our return on investment to shareholders. And I'll say we've made kind of a middling amount of money.
Emily M. Bender:Right. And much less than we've taken in.
Alex Hanna:Yeah.
Emily M. Bender:Sorry. Made, made much less than we've put out into, you know, all the chips and all the compute and everything.
Alex Hanna:Yeah.
Emily M. Bender:All right. And then finally, Alex, you want to do this one?
Alex Hanna:Oh yeah, sure. This is, so I think this comes from a user, not sure if it was the original one, but Dynamoe, uh, with, like, uh, the, the word and then an E at the end, like, uh, Moe from The Simpsons, dot, uh, bsky dot social. And it's this fun little banner where it says "Stop forcing AI into fucking everything," and then above it says, "Nobody asked for it," and below it, "Everybody, everyone hates it." So that's maybe a good, like, sense of the milieu, I feel like. Yeah. And I mean, there's, there's been a lot of that, uh, effort. There's a few more things, I mean, and we'll probably talk about it on the next podcast, we'll bring it up, including, uh, the Reid Hoffman fever dream letter and, um--
Emily M. Bender:That op-ed was so bananas.
Alex Hanna:Yeah, we should talk about that next time. It was, it was just a wild time and, uh, a few other things, but, you know, like they say on Wait, Wait, Don't Tell Me, if it happened, if any of these things happen on the news, you'll hear about it on Mystery AI Hype Theater 3000.
Emily M. Bender:Oh man. Yeah. And you know, the thing is that it's, it's ridiculous. Some of the, I mean, the T-shirt's great, but the, the rest of the stuff is ridiculous or destructive or sad. And it is also urgent. It's not just, ha ha, they're still on their nonsense, because the environmental impacts are drastic and getting worse. And it is so frustrating that anybody gives Altman, for example, the platform. Like, even the, the, the journalist was sort of gently pushing back, but also, like, platforming these nonsense ideas. Yeah. Yeah. All right. Um, we, I think, are at time here in our foreshortened episode. Um, that's it for this week. Tamara Kneese is director of climate, technology and justice at the Data & Society Research Institute. Thank you so much for joining us today.
Tamara Kneese:Thank you so much for having me. This was a lot of fun.
Alex Hanna:Thank you so much, Tamara. Our theme song was by Toby Menon, graphic design by Naomi Pleasure-Park, production by Christie Taylor, and thanks as always to the Distributed AI Research Institute. If you like this show, you can support us in so many ways: rate and review us on Apple Podcasts and Spotify, pre-order The AI Con at TheCon.AI or wherever you get your books, subscribe to the Mystery AI Hype Theater 3000 newsletter on Buttondown, or donate to DAIR at DAIR-Institute.org. That's D A I R hyphen Institute dot org.
Emily M. Bender:Find all our past episodes on PeerTube and wherever you get your podcasts. You can watch and comment on the show while it's happening live on our Twitch stream. That's Twitch.TV/DAIR_Institute. Again, that's D A I R underscore Institute. I'm Emily M. Bender.
Alex Hanna:And I'm Alex Hanna. Stay out of AI Hell y'all.