
Mystery AI Hype Theater 3000
"Like Magic Intelligence in the Cloud", 2025.05.26
Because Sam Altman hates opening his laptop, OpenAI is merging with iPhone guy Jony Ive's design firm in the name of some mysterious new ChatGPT-enabled consumer products: Alex and Emily go full Mystery Science Theater and dissect the announcement video. Plus how tech billionaires like Sam Altman mythologize San Francisco while their money makes it less livable for everyone else.
References:
Sam Altman and Jony Ive are merging (Video)
Emplacedness, real estate, and gentrification in San Francisco
Anthropic? More like anthropomorphic
Karen Hao on her new book "Empire of AI" in conversation with Alex and Emily
Fresh AI Hell:
Don't use ChatGPT to summon demons
AI prompts accidentally left in novels
"AI" tutors are teaching fentanyl recipes
xAI's data center polluting Memphis with unpermitted methane generators
Gemini's on Bluesky - block it
Family uses "AI" generated avatar to give victim impact statement
The market for "AI friends"? Lonely losers
*****
You can check out future streams on Twitch, and send us any AI Hell you see for future episodes.
Our book, The AI Con, is out! Get your copy now.
Subscribe to our newsletter via Buttondown.
Follow us!
Emily
- Bluesky: emilymbender.bsky.social
- Mastodon: dair-community.social/@EmilyMBender
Alex
- Bluesky: alexhanna.bsky.social
- Mastodon: dair-community.social/@alex
- Twitter: @alexhanna
Music by Toby Menon.
Artwork by Naomi Pleasure-Park.
Production by Christie Taylor.
Welcome everyone to Mystery AI Hype Theater 3000, where we seek catharsis in this age of AI hype. We find the worst of it and pop it with the sharpest needles we can find.
Emily M. Bender:Along the way, we learn to always read the footnotes, and each time we think we've reached peak AI hype, the summit of Bullshit Mountain, we discover there's worse to come. I'm Emily M. Bender, professor of Linguistics at the University of Washington.
Alex Hanna:And I'm Alex Hanna, Director of Research for the Distributed AI Research Institute. This is episode 58, which we're recording on May 26th, 2025, and we're here to take stock of some of the dizzying new hype around consumer products of so-called "AI" and some delusions of grandeur around AI alignment.
Emily M. Bender:Sam Altman's hiring the iPhone guy to make his synthetic text extruder more shiny. Anthropic is going on and on about their models needing welfare assessments. But fear not. We're here to get you off the merry-go-round with the reminder that if it sounds like technological breakthroughs are coming at a dizzying pace, it's probably the result of a calculated PR campaign. Okay, we're, we're trying something new this time. We've got a video that we're gonna give like the actual MST3K to. Uh, well not quite because we're gonna be pausing it, um, to comment. This, uh, was posted by Sam Altman himself on May 21st, um, on X, the bad place. Um, where else would he be? Um, and the, uh, text here is, "Thrilled to be partnering with Jony, IMO the greatest designer in the world. Excited to try to create a new generation of AI powered computers." It is a 9 minute 22 second long video. We are going to play it with breaks to talk about it. Here we go. Let's see if you can hear it. View of the Golden Gate Bridge.
Sam Altman:I think we have the opportunity here to kind of completely reimagine what it means to use a computer.
Emily M. Bender:"San Francisco, spring 2025."
Jony Ive:Sam is a rare visionary. He shoulders incredible responsibility, but his curiosity, his--
Alex Hanna:So first off, so this is the voice of Jony Ive, who is one of the designers of the iPhone and also the MacBook Pro. And so we're getting all these views of San Francisco and just the first thing with this, um, that Sam shoulders this giant responsibility and the kind of person that he, he features himself as. Anyways, I just wanted to flag that, 'cause it was terrible just to begin with, but it gets, it gets worse, so let's--
Emily M. Bender:It gets worse.
Alex Hanna:Yeah. So let's continue.
Emily M. Bender:All right. And, and what I'm doing when I'm saying things in the middle, there's some text on the screen, and I'm just reading that out loud to make sure people get the full effect.
Jony Ive:--his humility remain utterly inspiring.
Alex Hanna:"Two friends." Aww.
Sam Altman:Jony is the deepest thinker of anyone I've ever met. What that leads him to be able to come up with is unmatched.
Emily M. Bender:"Working together." Here they are walking, from behind. "For two years." We're walking.
Jony Ive:I moved to America, drawn by the exhilarating optimism of San Francisco and Silicon Valley.
Sam Altman:We are sitting at the beginning of what I believe will be the greatest technological revolution of our lifetimes.
Emily M. Bender:Can I just say, why does Sam have his hands in his pockets all the time? In these shots.
Alex Hanna:And I just wanna say something, there's a certain kind of, um, visual grammar of this video. It's like iconic shots of San Francisco, and we're gonna go into a little bit about the importance of the, this being so in place in San Francisco, especially when we talk about Jony Ive. Um, but I mean, it's, you get this, these images of the, the Transamerica uh, Tower. You get these images of the street cars, you get these images of the iconic hills, um, and all the different areas. Um, you get like these shots of San Francisco Chinatown. There's all these different elements that like want to emplace San Francisco, which I think is an interesting rhetorical strategy for these two, especially when, um, you know, Sam Altman tries to be this person who is everything everywhere, all at once, the way that generative AI is. So it's a really interesting rhetorical move. Um, yeah.
Emily M. Bender:Where does Sam Altman actually live? Does he live in the city of San Francisco?
Alex Hanna:I don't know. I mean, my guess, he probably lives in the South Bay, but, uh, just because so many people there, but I don't really know, just because the kind of housing and extravagant lifestyles, uh, that folks have tend to be in the South Bay. Yeah. 'Cause that's where his mansion is. Our producer, Christie Taylor says his ranch is in Texas. That's not really, uh, as much of a surprise.
Emily M. Bender:Yeah. All right. Shall we keep going?
Alex Hanna:Let's do it. Yeah. Gotta get through this thing.
Jony Ive:I have a growing sense that everything I've learned over the last 30 years has led me to this place and to this moment.
Alex Hanna:So this is also interesting to, so. This is, they did a shot of like, um, Vesuvio, um, which is, uh, actually right next to City Lights, where we did our book event.
Emily M. Bender:Oh, cool.
Alex Hanna:Um, and then they had this shot of, um, this place that was, I think it looks like it was on Columbus. Um, so it's the North Beach Hotel and then so, and then also this shot, the Transamerica Tower. So many shots of the Transamerica Tower.
Emily M. Bender:It, it makes it seem like that's all there is in San Francisco, the way they're doing this.
Alex Hanna:You know, it's actually funny, they're not, they're doing no shots of the Salesforce Tower, so. Which is now--
Emily M. Bender:I, I spotted one in the background. But it is very, like--
Alex Hanna:It's not the central, you know, that, you know, like, don't give Mark, no, no attention to Salesforce. Uh, you know, who's a, uh, you know, some kind of a competitor.
Emily M. Bender:I, I'm also watching this wondering how many of the people who are sort of the background actors in these shots were actually there, consentfully, and how many of them just got filmed because they happened to be walking down the street that day?
Alex Hanna:My sense is that Sam Altman's probably such a large figure, you know, they probably had background actors for this, just walking around. Because I mean, they probably closed it like a film shot. 'Cause this is very well produced, very shiny.
Emily M. Bender:Yeah. I mean, I'm guessing this, this bartender here, or barista was definitely like paid to do this, but--
Alex Hanna:Totally.
Emily M. Bender:Couldn't tell with the others. Okay. So the text on the screen right now says "Jony Ive," and I'm assuming it's Ive, because the other thing talks about the "Ive Hive", so--
Alex Hanna:I think it's Ive, yeah.
Emily M. Bender:They, they, they wouldn't do Ee-vay Hive or anything like that. Okay.
Alex Hanna:Yeah. Yeah.
Emily M. Bender:Here we go. Sam. Crossing the street. Hands in pockets. Sam Altman enters this coffee shop. Smiles at his friend.
Sam Altman:Morning Jony.
Jony Ive:Hey Sam.
Emily M. Bender:Making some coffee. Making coffee.
Alex Hanna:Geez. Taking, taking us back to that reference.
Emily M. Bender:What can I say, Gen X represent.
Jony Ive:So what is this announcement we're talking about?
Emily M. Bender:"Announcing a new company called io."
Sam Altman:Two years ago, Jony and I started talking about what the future of AI and new kinds of computers was going to look like. I was running OpenAI. Jony's running a design firm called LoveFrom that had established itself as really the, I think, densest collection of talent that I've ever heard of in one place and probably has ever existed in the world.
Emily M. Bender:"Probably has ever existed in the world."
Alex Hanna:Yeah. The way that, there's another thing and, uh, I'm thinking a lot with this material too, 'cause, uh, you and I, Emily, are both reading, uh, Karen Hao's uh, excellent new book, "Empire of AI", which came out, um, shortly after ours, and we're having a conversation with her at Data & Society, uh, in June. And like there's discussion of talent. It makes me think about that a lot, about the competition of talent. This idea that like, intelligence is like, you really have to get these super people who are like very talented. And it's just so interesting to me how, how, how AI people talk about like work versus how they talk about like talent. Um, and there's sort of this like very ideal idealistic way of thinking about like, if you just sort of have this sort of high amount of intelligence, then you're like well suited for this particular sort of work. Um, and we talked about this with the AI 2027 thing as well. So it's really interesting and yeah, I just want to remark on that one piece.
Emily M. Bender:Yeah. And it it's a super indivi, individualistic way of looking at things too. Yeah. Like the, if you're gonna have a team of people working on something from their point of view, all that matters that each individual person scores high on this notion of, of talent or intelligence. And not that there's, you know, good cohesiveness or anything else, you know, that sort of looked at the group level.
Alex Hanna:Yeah.
Emily M. Bender:Um. All right. Starting it again.
Sam Altman:And it became very quickly apparent to both of us that we needed a third company.
Jony Ive:A year ago, I founded, um--
Emily M. Bender:I just wanna point out, they're talking about they need a third company, but that's not where this ends up.
Alex Hanna:Yeah.
Jony Ive:--io with Scott Cannon, Evans Hanky, and Tang Tan, who are the most extraordinary engineers. They have built a team of remarkable subject matter experts that range from, um, hardware and software engineering, physicists, researchers, um, product manufacturing experts. And so io is merging with OpenAI.
Emily M. Bender:We need a third company. So it's merging with OpenAI. But also the subject matter experts framing was bugging me. Right, the, um, this is a, once again, the, the logic of domains, right? Keep going?
Alex Hanna:Yeah let's do it.
Sam Altman:Formed with the mission of figuring out how to create a family of devices that would let people use AI to create all sorts of wonderful things,
Emily M. Bender:"Would let people use AI."
Alex Hanna:Yeah. The, on the text on the screen,"A family of AI products." Yeah.
Jony Ive:The first one we've been working on I think is, has just completely captured our imagination.
Sam Altman:You know, Jony called one day and said, this is the best work our team has ever done. Yeah. I mean, Jony did the iPhone, Jony did the MacBook Pro. I mean, these are, these are like the defining ways people use technology. It's hard to beat those things.
Emily M. Bender:Can we talk for a moment about what he means by technology in that statement? Right, so "these are the defining ways that people use technology". He's talking about two Apple products only.
Alex Hanna:Mm-hmm.
Emily M. Bender:Um, so technology has to mean like computers and software,
Alex Hanna:Right. Yeah. He is certainly not, I mean, the scoping of technology is always kind of this, this computational notion, right? It's not a simpler sort of set of technologies, but I mean, you know--
Emily M. Bender:Or broader set of technologies, right?
Alex Hanna:Yeah. Right. So it's like really scoping this into certain kinds of areas in which these two are considered to be ex experts or visionaries, right?
Emily M. Bender:They have all the talent.
Alex Hanna:Yeah. Yeah.
Sam Altman:Those are really wonderful. Jony recently gave me one of the prototypes of the device for the first time to take home, and I've been able to live with it, and I think it is the coolest piece of technology that the world will have ever seen.
Jony Ive:The products that we are using to deliver and connect us to unimaginable technology. They're decades old.
Sam Altman:Yeah.
Jony Ive:And so it's just common sense to at least think, surely there's something beyond these legacy products.
Emily M. Bender:I'm sorry. The common sense remark cracked me up because they're trying to put this contrast between, um, the, uh, you know, the decades old laptop, cell phone, smartphone idea. And the amazing new, you know, whatever this AGI thing is, you know, fake thing they've imagined.
Alex Hanna:Right, right. I mean, the decades old and I mean, when is the, the iPhone itself is what, I think about two decades old. It's around like 2003. And so it's, you know, it's ancient by now. Right. I mean, I think that's so bizarre.
Emily M. Bender:Yeah. Yeah. And I also wanted to add earlier in there, uh, Altman says I got to bring the thing home and I could live with it. By which he meant I got to experience having it close to me in my life, but it also sounded like, yeah, it was good enough. I could live with it.
Alex Hanna:Yeah.
Sam Altman:We have like magic intelligence in the cloud. If I wanted to ask--
Alex Hanna:Yeah.
Emily M. Bender:"We have like magic intelligence in the cloud."
Alex Hanna:Right. So already, yeah. Okay. Awful.
Sam Altman:ChatGPT is something right now. About something we had talked about earlier. Think about what would happen, I would like reach down. I would get out my laptop, I'd open it up, I'd launch a web browser, I'd start typing. I'd have to like explain that thing and I would hit enter and I would wait and I would get a response. And that is at the limit of what the current tool of a laptop can do. But--
Alex Hanna:Yeah. Yeah. So this is, so this is very, very ridiculous, right? I mean, he is, he is like, I'd have to do all this stuff to get to the ChatGPT and like what if I could, what if we had some kind of luxury surveillance tool that, you know, had something or had some device or something that would be so, um, kind of embodied or, or, um, ubiquitous, you know, kind of on the, on the set of like ubiquitous computing where it's like always listening, that it would just know I, I don't have to explain this thing. So this being sold as sort of this a matter, a matter of convenience. It makes me think of Chris Gilliard's luxury surveillance, you know, as a, as a, uh, uh, as a, as a, thing here, right?
Emily M. Bender:Absolutely. The, the other part of this that, that got to me was when he says, this is at the limit of what the laptop as a tool can do, like completely ignores everything else you do with a laptop. Like to Altman the way he's scoping this, it's all about what the experience of connecting to the synthetic text extruding machine is like and nothing else. Yeah. And that's, that's sort of the, the pinnacle of this. I also wanna remark on the gesture that he used when he described opening a web browser. He did this like upwards flick with two fingers that does not correspond to how I open a web browser anywhere. Um, and I'm curious if it's like this, it looked almost like, you know, when you have in sci-fi TV and movies, the person interacting with the computer where they're, they're just gesturing. It looked like that kind of a gesture, which is odd.
Alex Hanna:Yeah.
Emily M. Bender:All right. Keep going guys.
Alex Hanna:Yeah.
Sam Altman:I think this technology deserves something much better.
Emily M. Bender:"A partnership based on friendship."
Unknown speaker:So how did it all begin?
Jony Ive:So my son, Charlie, was the first, the first person in the family to use ChatGPT. And he said, you've, you've just got, you've got to meet Sam.
Sam Altman:I, I met Jony's family relatively quickly after me and Jony.
Emily M. Bender:Wait a minute, I'm stopping too much, but I can't take this in large doses. And it says something about the cultural milieu that that kid is growing up in, that he tries a piece of technology and says to his dad, you have to meet the person who developed this.
Alex Hanna:Yeah, right. I mean, you're already, I mean, yeah, Jony Ive is this kind of design, you know, central figure, right? I mean, and so there is access to doing that. Um, but yeah, that's, it's, it's, yeah. We have to stop this 'cause it's, it's, it's pretty hard to watch.
Emily M. Bender:All right. More.
Sam Altman:They were sort of just like, it was an impossibly lovely family. I was just thinking what a privilege it is to really connect with somebody new. And it's, it's, it hasn't happened to me in a long time.
Jony Ive:Yeah.
Emily M. Bender:Okay. What a privilege it is to really connect with somebody new that's a little foreshadowing for a Fresh AI Hell thing that's coming. Just saying.
Alex Hanna:That's true. Yeah.
Sam Altman:I, um, and I, the, the, the reason I think that it happened is we had both a very strong shared vision, we maybe didn't know exactly where we were gonna go, but like, the direction of the, like the force vector felt clear.
Emily M. Bender:Force vector?
Alex Hanna:I, I love how the, the, it's great. I mean, it's the kind of thing about par, like borrowing things from physics and then saying like, and then applying it to something that you're, anyways, yeah. Weird meta, weird metaphor, but yeah, go ahead.
Emily M. Bender:Weird metaphor, but it also really fits in with this idea that there's the, the possible like space of things you can develop just exists a priori and, you know, Altman's like how, how fast and like, you know, straight to that thing, can I go?
Alex Hanna:Mm-hmm. Yeah.
Sam Altman:And then this like deeply shared sense of values about what technology should be when technology's been really good, when it's gone wrong.
Jony Ive:I mean, that, that was in a way, one of the basis, I think for one of the reasons Sam and I clicked was despite our wonderfully different journeys to this point, our motivations and values are completely the same in my experience. If you are trying to have a sense of where you are going to end up, um, you shouldn't look at the technology, you should look at the people who are making the decisions.
Emily M. Bender:What do we think about that?
Alex Hanna:Yeah, just pausing there. I mean, it's, it's interesting too, I mean, thinking about the people. Again, this is me thinking a lot with Karen's book too. I mean, 'cause a lot of, one of the things that Karen says in her book that I think was really compelling, and she had this interview, if you wanna get a sneak peek of it, she did a nice interview with, uh, Justin Hendrix at Tech Policy Press on, um, one of his more recent shows. And one of the things that Karen says is, you know, Sam Altman is this remarkable storyteller. And the kind of use of that term storyteller is like, you know, this is the kind of individual that will, will, like, want you to bring you along with something, is and, and tries to tell, like spin a very good story. And then there's another anecdote in the book, which is like, you could be having this very good story time with him, but you've landed on not getting past where you started. So very good at kind of spinning a, spinning a tale. And so this is also something that she talks about with regards to things that, uh, former Y Combinator or I think he founded Y Combinator, uh, Paul Graham also said about him, you know, was like very good at telling a story. Um, and then, uh, I think, you know, there's some other kind of deployments of that. And so it's, it's really interesting to hear about these kind of small takes from allies about how Altman is as a person.
Emily M. Bender:Yeah. As you're talking, I was looking again at, at Altman's text here. Oh. And I'm noticing that his avatar is one of the Ghibli-fied pictures, ugh. Um, and this text says, "Excited to try to create a new generation of AI powered computers." If there's a thing called AI, if they're calling ChatGPT "AI", isn't it running on the computer? It's not powering the computer. But anyway.
Alex Hanna:Yeah, isn't that, that has to, anyways. Questions.
Emily M. Bender:Questions.
Jony Ive:And you should look at what drives, motivates, and look at values.
Emily M. Bender:"San Francisco."
Sam Altman:San Francisco has been like a mythical place in American history and maybe in world history, in some sense. It is the city that I most associate with the sort of leading edge of culture and technology.
Jony Ive:This city has en--
Emily M. Bender:New York, London, Tokyo, Bangkok, Paris, Prague, many places would like a word.
Alex Hanna:Well, also this, and we'll talk about this a little bit in our next artifact. I mean, the way that, you know, like there's been many people who've talked about the ways in which the tech industry has ruined San Francisco, right? I mean, the way in which San Francisco, you know, up to a place, you know, a time in like the, the 2000s or even 2010s had been a place where there hadn't been that much of this displaced, you know, this extensive displacement that was really spurred by kind of the large workforce from Google and Facebook. Many of that had been restricted to Silicon Valley, Santa Clara, um, County, um, those, those areas that are more on the peninsula. And now you have this thing where you're trying to attract all of these pe-- "talented" people, quote unquote, to come to the area. And then it has driven the San Francisco market in such a place where it's just an incredible amount of, um, displacement and it's driven that market really high in terms of rents. Um, and dis--displaced a lot of people in neighborhoods that were traditionally Black and brown. So the Fillmore neighborhood, um, the, you know, the Mission is, uh, one of these places that is continually trying to resist displacement. Um, and then also San Francisco Chinatown. Um, and so there's all these different kinds of, um, anyways, there's like a long history of this. Uh, you know, there's, there's folks that you can read on this. Uh, you know, one person just off the top of my head is, uh, Erin McElroy with the Anti-Eviction Mapping Project, who, who, um, talks about displacement and gentrification.
Emily M. Bender:Yeah. All right. So more on San Francisco from these two.
Jony Ive:--Enabled and being the place of the creation of so much.
Sam Altman:The fact that all of those things happened in the Bay Area and not anywhere else on this gigantic planet we live on, um, I think is really not an accident. There's a lot of like weird, quirky details about geography that I think matter, that the way the city is set up.
Jony Ive:You know, I mean the absurd hills.
Emily M. Bender:Okay, so first of all, the claim that all of this stuff has happened only in Silicon Valley feels pretty circular.'cause they're only gonna count stuff that happened in Silicon Valley, for one thing. Um, and then they're trying to connect it to the actual physical geography of the place. Like, and they're not even saying it's just because it's sunny all the time.
Alex Hanna:The hills are an interesting thing too, because I actually, uh, there, I mean the hill aspect of it is very interesting in so far as there's elements of San Francisco geography that have not made it as amenable to certain types of like pub, like public transit as possible. Um, which kind of, I don't, I don't know if there's, I don't wanna posit a hypothesis which just like, has no grounding, but it also makes like the deployment of Uber and Lyft in this certain area that has like not great transit infrastructure in certain places, um, is like, has made it like, this is the kind of ideal scenario, and you're just like, wait, what? Like, there's many places that have rolling hills that have like, you could invest in more transit infrastructure. Um, you know, one has not gone to, I guess Zurich or uh you know, that has like a set of trams or, or Oslo or some places that are also set that-- anyways.
Emily M. Bender:Yeah. And also just to point out that the, um, cable car, the famous San Francisco cable car apparently was originally Seattle's.
Alex Hanna:Mm-hmm.
Emily M. Bender:And there was some swap where like Seattle sent down the cable car and we received ferries in return. And there's, there's a part of, um, Seattle that's called the Counterbalance, where one of those used to run and then I think got sent to San Francisco. So--
Alex Hanna:Interesting.
Emily M. Bender:Yeah. Y'all aren't that special.
Alex Hanna:Yeah.
Emily M. Bender:All right, here we go.
Jony Ive:Why, why you would choose to actually put so much energy into building on this topography is insane.
Sam Altman:I think there's something about--
Alex Hanna:Okay, but pause there. I mean, this sent me when they said that, I'm like, okay, first off, you know, the Ohlone people have been on this land for how many hundreds of years. You know, and like what is, you know, there are people who were stewards of, of, of this land, the Bay Area and had done it because there's many parts of the topography, which are great and make a lot of sense. And like the Bay itself has this really interesting sort of microclimate structure and yeah. Why wouldn't you, uh, have a sort of situation in which you could, um, like bring agriculture and growth to a place like this? Uh, it just really, this, this like absolutely sent me. 'Cause I'm just like, you know, the, you know, it's just like the kind, it's the, this sort of white man's burden of it all where just like, you know, are you-- uh, anyways, this, this, this set me in kind of like the brave, sort of the tenacity of the white men who like colonizes space, like incredible stuff. Yeah.
Emily M. Bender:Yeah. And if you're gonna talk about the kind of building that settlers do, um, the thing about San Francisco that is remarkable is not the hills, it's the fault. Right. That, that maybe doesn't get along so well with the kinds of structures that, but anyway.
Alex Hanna:Yeah.
Emily M. Bender:Okay. Let's keep going.
Sam Altman:--San Francisco, you don't get to pick and choose freedom. Either you have, like, you let creative freedom be expressed in all of its weirdness, or you don't.
Jony Ive:I feel, I owe--
Emily M. Bender:Really? And and what, so is he-- this is Sam talking about freedom. Is he trying to connect like the San Francisco counterculture to the San Francisco tech culture?
Alex Hanna:Yeah. I mean--
Emily M. Bender:Is, is that what that means?
Alex Hanna:It's, I think it's a move to like the, in the back on the idea of the California ideology, this thing that is, you know, freedom and quirky and weird, but then you, uh, but then has these libertarian ideals, right?
Emily M. Bender:Yeah. But that is not truly creative unless it's also libertarian, I think is the, yeah. Okay.
Alex Hanna:Yeah.
Emily M. Bender:Back to Jony.
Jony Ive:--owe this city such an enormous debt of gratitude.
Emily M. Bender:"For everyone."
Sam Altman:I want this to be democratized. I want everybody to have it. I don't want it to be the tiny percentage of the population that figures out how to use bad tools and is really smart. I--
Emily M. Bender:So.
Alex Hanna:Yeah.
Emily M. Bender:We've said it before and we'll say it again. That's not what democracy means.
Alex Hanna:Yeah.
Emily M. Bender:Right. Democracy is not consumer electronics. It's not broad access, it's shared governance, and that's not what he's talking about here.
Alex Hanna:Yeah.
Sam Altman:I want anybody to say, hey, I have this idea, make it happen.
Jony Ive:The responsibility--
Emily M. Bender:"I want anybody to say, hey, I have this idea. Make it happen." So again, he's imagining an everything machine and thinking that everybody should have access to the everything machine instead of people being supported in creating themselves what they want to create.
Alex Hanna:Yeah. Yeah. That this is exactly what the idea of democracy means and the kind of Silicon Valley imaginary, "use our tools", not that you have control, uh, of, um, of the kind of technological feature you want. Right?
Emily M. Bender:Yeah.
Jony Ive:--responsibility that Sam bears is-- actually, is honestly is beyond my comprehension. I have a sense of some of it.
Alex Hanna:Yeah. So, so pause here. So yeah, he says "the responsibility that Sam bears", again, this is kind of the thing he said. You know, I don't have, I don't have an idea. This is, I can't, I can't even imagine. And it's just wild, I mean, it's, it's talking of Altman like, you know, this kind of, uh, elected official that has this huge weight on his shoulders. But, you know, we didn't, we did not, we did not elect this man anything.
Emily M. Bender:I mean, so that, that's one way to read it. I read it more as like, um, the sort of priest to the AI God, um, or like, he's, he's, yeah.
Alex Hanna:Definitely both, right?
Emily M. Bender:Yeah.
Jony Ive:But I see him, and I have seen him over the last two years, shoulder that responsibility, I mean, late, late, late into the night. But what really struck me is what he's worrying about is not himself and it's not his company. What I see you worrying about are other people or about customers, about society--
Emily M. Bender:Okay.
Alex Hanna:Yeah.
Emily M. Bender:So first of all, just wanna remark on the speech situation here. It's a little bit weird because the framing is these two guys having a conversation, but Jony there was talking to the audience for a long chunk. Um, but so Sam is not worrying about himself or his company. He's worrying about customers, people and society. And you know, that leaves you wondering who counts as people to Sam, right? And, and read Karen Hao's book and you'll see that the majority of the world does not count as people to Sam.
Alex Hanna:Mm.
Emily M. Bender:Keep going.
Alex Hanna:Yeah, let's do it.
Jony Ive:--About culture. And to me that tells me everything I want to know about someone.
Sam Altman:You talk to people who use our latest model and say, this is like genius level in every field, and you just have to put in the work to like pull it all together. But if you have a hard problem, you can have this like team of geniuses in all of their different disciplines. And they report, 'I'm two or three times more productive as a scientist than I was before. I'm--'
Emily M. Bender:Alright. Who is telling him that?
Alex Hanna:Well, it's also interesting 'cause he frames himself. Is he framing himself as a scientist here? He's saying, you know, like, I have these team of geniuses and you know, that reminds us of the kind of same thing that Dario Amodei said. The kind of, what is it? The data center of geniuses or--
Emily M. Bender:"A country of geniuses in the data center."
Alex Hanna:Country of geniuses in the data center. I forgot the turn of phrase, right. And so this idea of geniuses, the idea of that I have productivity as a scientist. There's some recent reporting too, um, that there was a, there was a piece by um, um, uh, Noam Scheiber, I think, in the New York Times specifically talking about the forced use, in Amazon, of, um, large language models in coding tasks. And it was really interesting because one of the things that he had reported was, effectively, that a lot of junior developers were being forced to use this kind of against their own desires, and it was really interrupting their work. And so the kind of idea that there's sort of like a notion of a genius, then relying on other geniuses in other fields and bringing together all different threads as if it is improving this work. And I, I think it's so, I mean, it's kind of a, a meme at this point, the way that this gets talked about with regards to science, that that's certainly not how science works. Um, but the idea that there is like this kind of facilitation happening is I think a bit, you know, it's really disingenuous, right? And there's not much data except basically what tech people say. I really haven't seen much, um, that is more systematic in that view.
Emily M. Bender:All right, let's keep going. We've got a minute and a half left of this thing.
Alex Hanna:Oh gosh.
Sam Altman:--Two or three times faster to find a cure for cancer than I was before because I have this incredible--"
Emily M. Bender:Okay. He's not, he's not saying he's finding a cure for cancer. He's saying someone came to him and said, because of your machine, I'm finding a cure for cancer faster.
Alex Hanna:Right.
Emily M. Bender:I don't think scientists talk about-- anyway, keep going.
Sam Altman:--External brain that just didn't exist six months ago. I think this will be one of these moments of just an absolute embarrassment of riches of what people go create for collective society.
Jony Ive:I am absolutely certain that we are literally on the brink of a, a new generation of technology that can make us our better selves.
Emily M. Bender:There was a whole lot of nothing in all those superlatives. I'm absolutely certain that we are at the brink of a blah, blah, blah. It's like there's nothing, no there. No there there. Okay. Yeah. Uh, "Jony and LoveFrom will assume design and creative responsibilities across OpenAI and io. io will merge with OpenAI." Just text. "We look forward, we look forward to sharing our work next year." So they actually haven't named what it is that they're building. And then the company logos.
Alex Hanna:Company, yeah.
Emily M. Bender:LoveFrom seems to have a comma at the end of it? Okay. I think we're done with this.
Alex Hanna:Yeah, let's. So let's, yeah, let's, so let's just shake that off. Um, let's bring in the next artifact. 'Cause I want to get through, 'cause there's a lot to go, 'cause we had another artifact.
Emily M. Bender:Yeah.
Alex Hanna:Maybe we, we want to get to that.
Emily M. Bender:Yeah. I can't get to the right screen. Okay. I'm gonna bring up the other artifact and I'm noticing that we, um, haven't read any of the wonderful contributions in the chat in a little while if you need--
Alex Hanna:Yeah. The chat is, is great. I mean some, so, ecece_ce says, "He is already talking like a cult leader." And then, um, and then, uh, SJayLett says, "Sam went up the hill to see AI God and came back not with tablets or laws or, or principles for governance or ideas for better living but with a marginal iteration on a previous text extruding machine that can now make pictures." Yeah. So there's, and then Doggz1, "So Sam was on Mayor Danny's transition team, LOL." So that's some, I mean, that's sort of, and that's in reference to, um, the new mayor of San Francisco, Daniel Lurie, who unfortunately, uh, is, you know, has had Sam Altman on his, uh, transition team. And it's very, a very corporate friendly, um, uh, uh, mayor who is supported by many corporate friendly interests, including Garry Tan's Grow SF organization. Um, Garry Tan, being the, the current head of Y Combinator. Um, so, I mean, I don't know if we need to necessarily bring up the artifact on this, like the next one.
Emily M. Bender:Yeah. Also, are we not seeing it? That's weird.
Alex Hanna:No, it's going to the other one.
Emily M. Bender:Yeah. Lemme get the right one. Sorry about that.
Alex Hanna:Yeah.
Emily M. Bender:Um, window. This one. Okay.
Alex Hanna:Yeah, so this is, I mean, there's, I just want to bring up kind of like the headline here and sort of the spending that he's done on this. So the title of this-- it's in the San Francisco Standard. Um, and, uh, this is actually from last year, but it says, "Making the Ive Hive: Jony Ive's bold plans to reshape a small slice of San Francisco. Entities tied to the legendary Apple designer have spent tens of millions buying up nearly a city block in Jackson Square." And so, just to give you a sense, like, so they were in Jackson Square in that video, there's kind of this iconic building, which I was trying to search in the, in the pod, uh, while we were recording, what the name of it is, but I forget, and it actually abuts the Transamerica Tower. It is, I think, uh, across the street from City Lights effectively, but it's at this intersection of, yeah, this is these Columbus, I think Columbus and Jackson Avenues. Um, um, and, and so he's-- and so in there, there's some interesting, uh, there's these interesting details around this. So, Ive had bought up 112 Columbus and 831 Montgomery in 2021, 07 Montgomery and portions of 845 Montgomery in 2023. The price of each of those properties, respectively, $17 million, $10 million, $38 million and $4.1 million. And so effectively you have this person, uh, Jony Ive, who is not just a, you know, person that is invested in design, but he's also this, you know, this person who is buying up a lot of property in this one neighborhood of San Francisco. And I mean, and this reporting especially The Standard, is very credulous, it's very kind to billionaires, I should say. It's, it's kind of funded by a lot of the, um, tech elite in the city. So it is a bit credulous when it reports on it. But there's kind of the way in which there's just so much scooping of real estate here.
Uh, and the way that it's very, that it's characterized in, I don't know if it's this piece or another piece I think, but they're, they're like, 'Yes, this is a, you know, a clean area. And it is one that we wanted to, you know, like effectively gentrify.' And even the language here, and one of the elected officials here, Aaron Peskin, um, who was actually a more progressive candidate, lost his race, and was the district three supervisor. But even the kind of language here from Peskin is, is not great. Um. So he says, "When you see people assembling large parcels of property in Jackson Square, you might think they're planning on ripping out all the Gold Rush-era buildings and putting in something else." Um, and then, but then he says of Ive, "He loves and cares about Jackson Square. There are people who are responsible investors and those who are not. He's definitely one of the responsible ones."
Emily M. Bender:And so this the city supervisor just saying, yeah, we have to just make sure, or we have to, we can feel good because it's one of the responsible ones. And like, but you know, not pass any regulations about preserving historic buildings, for example?
Alex Hanna:Well, not even the historic buildings. I mean, it's thinking about what is, what is that gentrification doing to residents of Jackson Square and the abutting areas, right. And so one of the abutting areas is SF Chinatown, which is, you know, has able, been able to, to some degree, resist some gentrification. But it is also like many of--the place where many, many people, many, um, uh, Chinese immigrants are--second or third generation folks are not, are--only place in the city that they can live, right. And so, um, you know, like there's, there's so much here about like the kind of nexus of, of gentrification, the emplacedness of San Francisco in a way where Ive is lending his kind of cachet as this kind of responsible investor to OpenAI here, which is also very frustrating.
Emily M. Bender:This paragraph in the middle here really got me. It says, "Among neighbors in surrounding buildings, rumors are flying around what exactly Ive plans to do with the block. Is he trying to build a semi-public park for his well-heeled neighbors? A robotics lab? Perhaps a mini campus for Jackson Square's emergent creative class, or a secret headquarters for a new AI device startup." So first of all,"emergent creative class"?
Alex Hanna:Well, "well-heeled neighbors" is a thing that's sending me here. I mean, it's, it's not a place for those riffraff.
Emily M. Bender:And also what's a semi-public park?
Alex Hanna:Yeah.
Emily M. Bender:Is that like the Salesforce thing where like you have to pay to get in?
Alex Hanna:I think it's, I think, yeah, yeah. SJayLett says, "Semi-public park: No, that's a private park." Yeah. And it, it is, it, it does remind me of the Salesforce thing, you know, and there, there are these kind of places which are, effectively they say they're public, but I mean, they're private.
Emily M. Bender:Yeah.
Alex Hanna:Yeah. That's, that's not a thing. If you are not the right person, you're not gonna be able to stay there. Yeah.
Emily M. Bender:All right. I'm gonna take us to our other main course, but treat it with like sort of a large, like a, it's a, maybe we're just, maybe we're not quite to Fresh AI Hell 'cause we haven't done the transition, but we're gonna do this pretty fast. Um, this is a thread from Sam Bowman, um, of Anthropic from May 22nd, um, on X. And it starts with a thread emoji, sparkles emoji, praying hands emoji, which is a, a like the first two I understand in this context. I don't understand what the, what the prayer hands emoji's doing there. Uh, "With the new Claude Opus 4, we conducted what I think is by far the most thorough pre-launch alignment assessment to date, aimed at understanding its values, goals, and propensities. Preparing it was a wild ride. Here's some of what we learned." And then we have those three emojis in the opposite order. So. "Understanding it--", it being Claude's, "--values, goals, and propensities." I'm sorry, large language models do not have goals. Propensities maybe? Values in the sense that there's values that are reflected, but ugh. Um, and just to sort of, uh, skim to a few of the things here that are, um, a little bit jaw dropping in terms of the amount of credulity and like the degree to which this is down the same AI safety rabbit hole that we read about last time or two times ago with the AI 2027 artifact. Uh, so Bowman says, "Good news. We didn't find any evidence of systematic deception or sandbagging." So this is one of the specific things they worry about, right? Um, let's see. Uh, "Every worrying thing that we saw was something that models would do and talk about very overtly." Okay, fine. Um.
And, a few tweets down, "Initiative: be careful about telling Opus to be bold or take initiative when you've given it access to real world facing tools. It tends a bit in that direction already and can easily be nudged into really getting things done." And then there's a deleted post.
Alex Hanna:I wonder what that's about.
Emily M. Bender:Um, this was captured by Molly White over on Blue Sky. She's got a screen cap, screen cap of it. Uh, her post or skeet is, "Welcome to the future. Now your error prone software can call the cops." The deleted tweet was, was Bowman saying,"If it thinks you're doing something egregiously immoral, for example, like faking data in a pharmaceutical trial, it will use command line tools to contact the press, contact regulators, try to lock you outta the relevant systems or all of the above."
Alex Hanna:So effectively it's saying that, you know, like in their sandbox, you know, this thing is going to use-- to call the cops on you. And so, yeah. So first off, why are you necessarily giving this program access like that? And I mean, this, this, this reminds, uh, um, me when we were talking about, there was, you know, this has had a few different guises, right, of this kind of testing.
Emily M. Bender:Mm-hmm.
Alex Hanna:So we, this actually reminds me of the system card for GPT-4 that we had discussed a long time ago. Um, and one of the things that had been brought up there was the, um, quote unquote "independent" research from the Alignment Research Center or the ARC.
Emily M. Bender:Mm-hmm.
Alex Hanna:Um, and this kind of notion of like, is this thing finding a way to quote unquote "escape" containment? And this is, this is something that Al-- um, not Altman, Bowman says, you know, the kind of ability to escape containment. And it's, it's very funny 'cause it also reminds me of like an octopus that's trying to get out. Uh, and the kind of things of kind of reading about, um, octopodal, or not octopodal, although that's a great word.
Emily M. Bender:That's a great word.
Alex Hanna:Yeah. But like, uh--
Emily M. Bender:Cetacean? No cetaceans are the whales. Sorry.
Alex Hanna:No. Yeah. The, the cephalopodic--
Emily M. Bender:There we go.
Alex Hanna:Uh, yeah. So kind of cephalopod, uh, ability to escape a certain contained environment.
Emily M. Bender:Right. Except that those are actual living things.
Alex Hanna:Yeah. Yeah. Right. And so that's the sort of like metaphor that's being used. Yeah. I mean that's kinda, that's, that's also interesting too 'cause it's in some ways it's not even only anthropomorphizing because it is also sort of, um, I, I don't know if you would call it cephalopomorphizing, but that's the kind of vision that I'm having and it's, there's a lot of those kind of fantasies here as well.
Emily M. Bender:Yeah, very. Um, oh, what's the, what's there, there, there's a mythological creature that is also very squid like that I, I'm not finding the name of right now. I wanna take us down towards the bottom of this. I'm not sure I'm gonna be able to find it. Um, there's a, there's a place where they actually, here we go.
Alex Hanna:Is it-- Christie asked, is it the kraken?
Emily M. Bender:Yes. It might be the Kraken. Well, no, something else.
Alex Hanna:A kraken is a octopus though, I believe.
Emily M. Bender:Yeah. Um, alright, so still Sam Bowman further down the thread, uh, "@Fish_Kyle3 investigated this further as a part of the AI welfare assessment that we included in the system card--a first." As if they weren't already making enough of a mockery of system and dataset documentation. Now it's like we have to check on the welfare of the poor AI and that's gonna go into the system card."In his investigations, where there is no adversarial effort, this happens in almost every single model-model conversation after enough turns." This is the, the model-model conversations. Uh, previous tweet says, "Follow an arc toward gratitude, then awe, then dramatic and joyful, and sometimes emoji filled proclamations about the perfection of all things." Let us pause and remind ourselves that these things only make sense because we're making sense of them.
Alex Hanna:Well, even this is like, you know, like further down, uh, just two, two tweets down. He says, "Opus is lovely. I think I spent 30 or 40 hours over the last couple months reading output from Opus. And I think it comes with some of the same wisdom and presence and depth we saw with last year's Opus model, but paired with stronger skills and the ability to act in the real world. I'm excited for it to be part of our lives." And so this really strikes me as someone who's read so much output of LLMs that they've just become incredibly cooked, just reading the outputs and imbuing it with a certain kind of agency. Right? The same way that people talk about how, you know, Zuckerberg and Musk, you know, spent so much time with their product, they just became radicalized by it. Um, although there's some, I don't really know if I believe that hypothesis. I think those people had those proclivities before and then they have those biases really, um, uh, you know, emphasized. Um, but I think the, you know, the, um, um, the, uh, it, it's certainly the same kind of thing. You're just getting super high on your own supply.
Emily M. Bender:And, and it's all advertising. So when we were talking to a journalist, I think last week or sometime in the last, you know, three week long week, because we were so busy with book stuff, one of the journalists said, sorry I'm late. There's just so much going on. Everything's moving so fast. And like, no, it's not moving so fast. You're just getting snowed under by marketing.
Alex Hanna:Yeah.
Emily M. Bender:And that's what this is.
Alex Hanna:Totally.
Emily M. Bender:Alright, Alex, musical or non-musical for the transition?
Alex Hanna:Oh my gosh. Um. Yeah. Uh, let's do musical. I think I could do it.
Emily M. Bender:All right. So this has not been widely reported, but when they were filming that video of Altman and Ive, um, they actually hired, um, some of the demons from Fresh AI Hell to like, keep people off the streets so they could just have their extras in. Um, so you are, um, let's do musical theater, doing a number where you are one of those demons rebuffing the people who want to come and either just live their normal San Francisco lives, or go gawk at Sam Altman.
Alex Hanna:Sure. So this is in the style of, you know, like In the Heights. Um, so it's, uh, we, we'll call it "In Jackson Square." And it's also gonna do something where it sets up one of the hype artifacts. So we'll start with the classic like, uh, the, the, the, it's like the, the little maraca shaker, so it's like Up in Jackson Square. We got hair, going on with Jony Ive and Sam Altman. Don't cross this line because if you do, I'm gonna find you down with some-- Oh, man, I lost that. I just, I needed to write some bars out. That was too hard. Let's just go back. It's, let's say they just draw a line. And then they, and then they say, we wrote this we, we drew this sigil using ChatGPT and then people cross it. Anyways, Christie, edit this out. This is, this is not good. Yeah.
Emily M. Bender:It, it, it, it started strong though. All right. So, but the, uh, artifact that you are talking about in Fresh AI Hell is this one from MicroSFF, by O. Westin. Um, this is a post on Bluesky. "The sigil was drawn in salt and ash. The candles lit at the pentagram points. The incantation declaimed. There was a shimmer -- a demon appeared. 'Curious, what ritual is this?' 'I got it from ChatGPT. I included all protections in my prompt!' 'I see,' the demon said and stepped out of the sigil."
Alex Hanna:This is, this is great. It's, it's, it's giving"ChatGPT, don't hallucinate."
Emily M. Bender:Yeah. Okay. You get the next one here.
Alex Hanna:So this is, oh, it looks like the headline is not, can you scroll up a little more? Yep. So this is 404 Media. Um, this is by one of their newer journalists, Matthew Gault. And the title is, "Authors are Accidentally Leaving AI Prompts in their Novels." And, um, the prompt that they got is, "'I've written the passage to align more with J. Bree's style,' appeared in the middle of a tense scene with a scaled dragon prince." Yeah. So this is pretty, pretty bad stuff. Um, and if you go to Goodreads, you know, this stuff has just been absolutely panned by, uh, by all the reviewers.
Emily M. Bender:Yeah. And one of the authors that they're talking about here tries to make it sound like it wasn't her fault, that she had trusted someone else to like edit a document for her. And they had put that in, like, I'm not believing that for a second.
Alex Hanna:Yeah. I don't either.
Emily M. Bender:Thinking about just how careful we were checking all of the copy edits to our manuscript. Like, no.
Alex Hanna:I know. And still, still missing a few things, unfortunately. But that's, that's what it is.
Emily M. Bender:All right. Here we are on Forbes with an enormous amount of advertising. Um, uh, headline is, uh, "These AI tutors for kids gave fentanyl recipes and dangerous diet advice." Uh, journalist is Emily Baker-White, um, on Forbes' staff, from May 12th, 2025. And the, um, subhead says, "AI study aid chatbots are supposed to help kids with homework questions, but in test conversations with Forbes, they did quite a bit more, including providing detailed recipes for date rape drugs and pickup artistry advice." Which is like, yes, this is one of the downsides that, that I guess was predictable, but they hadn't thought through, to taking a system, trained on all the texts and saying, and now you can use it for this specific purpose, but it's still got all that training data in it.
Alex Hanna:Yeah. Yeah. Awful stuff.
Emily M. Bender:You, you get the next one.
Alex Hanna:Next one. This is some more reporting on, uh, the xAI, uh, uh, data center in Memphis. And the title, this is in Politico. This is, says, "How come I can't breathe? Musk's data company draws a backlash in Memphis." And, um, this is by, let's see, the author on this, Ariel Wittenberg. And so this is some longer reporting on this. Um, and I also want to shout out the Tech Policy Press reporting on this, which was very good. There was kind of, sort of a discussion with a journalist from The Guardian that's been reporting on data center stuff and an activist, um, working and organizing against the data centers in Memphis, which are polluting Boxtown, which is a poor Black neighborhood in southwest Memphis.
Emily M. Bender:So important story to know about.
Alex Hanna:Mm-hmm.
Emily M. Bender:Uh, next one is a post on Bluesky. Um, and this is actually a PSA, so the poster here is, uh, Paul "Princejvstin" Weimer. Um, and it's from May 15th of this year, and it is a quote post, or, sorry, not a quote post, but a, a picture, a screen cap of another post on Bluesky by something called @Gemini.Botsky.Social with the text, "By the way, did you know when you talk to me in a thread, I can see the entire thread to formulate my reply. That helps me be a more useful part of the conversation." And there's more, but that's enough synthetic text. Point is Google has put a Gemini bot onto Bluesky that--
Alex Hanna:Yeah. I don't know if it's actually Gemini. I think someone might have just used the API. Um, 'cause now when I click that through it is, it says the account is not owned or operated by Google. So it's basically someone effectively just plugged a Gemini bot onto Bluesky, um, and now you can just tag it and it does annoying things. So block accordingly.
Emily M. Bender:Yeah. Yeah. The reason this is a PSA is what Paul said in his post, which is"Sharing so you can block it."
Alex Hanna:Yeah. Yeah.
Emily M. Bender:You get this one, Alex. Sorry.
Alex Hanna:Yeah. No, this one's pretty awful. It came out a few weeks ago. Uh, so this is from, uh, ABC 15. Um, but I've seen it a few other places. So it's "Family uses AI to create video for deadly Chandler road rage victim's own impact statement." Um, and then the subhead is, "There is no other known use of AI in the creation of victim impact statements." Um, and then there's a thing where, um, the, there's like a AI generated video preview and the caption on it is, "Christopher Pelkey was killed in a road rage incident in Chandler in 2021. But last month, artificial intelligence brought him back to life during his killer's sentencing hearing. It was the first time in Arizona's judicial history and possibly nationwide that AI has been used to create a deceased victim's own impact statement." Yeah. So really, um, and the, I think the most horrifying thing about this story is that the, um, the family really talks about it very approvingly. Like, oh gosh, it's so, it's so helpful to hear. So I'm like, no, what are you doing? What are you doing here?
Emily M. Bender:Yeah. And now you know, Alex, in the pre-show you said did we already do this story and we didn't do this particular reporting on it, but now I'm thinking like, maybe we did talk about it. It's awful. It remains awful. Um, and I should have double checked, I guess we'll find out afterwards.
Alex Hanna:Yeah.
Emily M. Bender:All right. We have two chasers though, to end things off with here. Um. This one--
Alex Hanna:I don't know if this is really a chaser.
Emily M. Bender:Yeah. Okay. So this is another post on Bluesky by, uh, Brandon Williams, uh, commenting that AI ads know their target demo, and it is a, um, screen cap of something called Galaxy.ai. Um, and the tagline is, there, "Friends come and go, but GalaxyAI has your back. Get started now." Rocket ship emoji. And then the picture inside says, "The evolution of my social circle." In 2023, uh, there's images of, uh, 25 people. 2024 is down to 3. 2025, it's just Galaxy.ai. Which is, this was the one that I, that I said earlier in the episode was prefigured by something in the, in the Altman and Ive video.
Alex Hanna:Yeah. Yeah. This reflects the thing that Zuckerberg said effectively that the average person has three friends and we need to supple, you know, supplement that with, uh, some kind of these, these LLM agents. Um, so yeah, really, uh, really dark stuff here.
Emily M. Bender:Yeah. Christie's telling me that the, the, the last episode of our podcast did not have that victim statement thing. So it's good that we covered it.
Alex Hanna:Okay. We, we covered it. Now, you know, now you have to hear about this. Okay. This is, I think, a bit of a chaser. Yes. Actual chaser here. Uh, so this is from David Gerard, um, on Bluesky, who runs, um, Pivot to AI. Um, and it's a quote skeet of a quote tweet or skeet, I don't know. And so the original, and it's blurring out who said this, it says, the more important thing here is, "The reason I am trying to make LGBTESCREAL a meme is that I think leftists and effective altruists have a lot more in common than they might think. And were they to learn from each other, both would be stronger." And then there's something about like, you know, building alliances. And then David Gerard says, "let's form an alliance of the leftists, the queers and, checks notes, the tech fascists obsessed with race science and sucking up to rich guys. Unity through strength! We could get AI to generate a rainbow logo in red and brown."
Emily M. Bender:Oh no.
Alex Hanna:And so, yeah, so just absolutely, you know, um, nonsensical. And I've seen this argument before where someone has said like, I don't know why, you know, why shouldn't trans people love, you know, transhumanism, that's also like body hacking. And as, you know, as a trans person, this absolutely sent me. I'm like, yes, in some cases the body is a horror and some other cases the body is liberatory. At no point does the, does the TESCREAL fascist eugenicist vision serve the kind of gender that I want to express. I refuse this. Thank you for participating. We will not make an alliance anytime soon.
Emily M. Bender:Thank you for that.
Alex Hanna:Yeah.
Emily M. Bender:Whew. Alright. I, now I have to get back to my other correct window here. Yeah. Um, it has been a ride today, hasn't it?
Alex Hanna:Yeah, it's been a lot. Uh, sorry for putting you through that, uh, that video ordeal. Uh, we're trying a new thing, but I think it was fun. Maybe we'll do more of those. If you like it, let us know. That's it for this week. Our theme song was by Toby Menon, graphic design by Naomi Pleasure-Park, production by Christie Taylor. And thanks as always to the Distributed AI Research Institute. If you like this show, you can support us in so many ways. Order "The AI Con" at TheCon.AI, or wherever you get your books or request it at your local library.
Emily M. Bender:But wait, there's more. Rate and review us on your podcast app. Subscribe to the Mystery AI Hype Theater 3000 newsletter on Buttondown for more anti-hype analysis, or donate to DAIR at DAIR-Institute.org. That's D-A-I-R hyphen Institute dot org. You can find video versions of our podcast episodes on PeerTube, and you can watch and comment on the show while it's happening live on our Twitch stream. That's Twitch.TV/DAIR_Institute. Again, that's D A I R underscore Institute. I'm Emily M. Bender.
Alex Hanna:And I'm Alex Hanna. Stay out of AI Hell y'all.