
Mystery AI Hype Theater 3000
The "AI"-Enabled Immigration Panopticon (with Petra Molnar), 2025.05.05
This week, Alex and Emily talk with anthropologist and immigration lawyer Petra Molnar about the dehumanizing hype of border-enforcement tech. From hoovering up data to hunt anyone of ambiguous citizenship status, to running surveillance of physical borders themselves, "AI" tech is everywhere in the enforcement of national borders. And as companies ranging from Amazon, to NSO Group, to Palantir all profit, this widening of automation threatens a future of faceless human rights violations with no attempts at accountability of any kind.
Petra Molnar is associate director of York University's Refugee Law Lab, and a faculty associate for the Berkman Klein Center for Internet and Society at Harvard University. She's also the author of the book The Walls Have Eyes: Surviving Immigration in the Age of Artificial Intelligence.
References:
Department of Homeland Security: Robot Dogs Take Another Step Towards Deployment at the Border
Leaked: Palantir’s Plan to Help ICE Deport People
Athens prepares to host DEFEA 2025, a major hub for international defence cooperation
Fresh AI Hell:
Meta served teen girls beauty product ads whenever they deleted selfies
Dating app/luxury surveillance leaks personal info
"AI" for subway crime prediction
CA used "AI" to make bar exam questions
CA using "AI" tool to bypass building permit process
Wildly unethical "AI persuasion" research on Reddit users
AI makeup to retouch Holocaust images
Check out future streams on Twitch. Meanwhile, send us any AI Hell you see.
Our book, 'The AI Con,' comes out in May! Pre-order now.
Subscribe to our newsletter via Buttondown.
Follow us!
Emily
- Bluesky: emilymbender.bsky.social
- Mastodon: dair-community.social/@EmilyMBender
Alex
- Bluesky: alexhanna.bsky.social
- Mastodon: dair-community.social/@alex
- Twitter: @alexhanna
Music by Toby Menon.
Artwork by Naomi Pleasure-Park.
Production by Christie Taylor.
Alex Hanna:Welcome everyone to Mystery AI Hype Theater 3000, where we seek catharsis in this age of AI hype. We find the worst of it and pop it with the sharpest needles we can find.
Emily M. Bender:Along the way, we learn to always read the footnotes, and each time we think we've reached peak AI hype, the summit of Bullshit Mountain, we discover there's worse to come. I'm Emily M. Bender, Professor of Linguistics at the University of Washington.
Alex Hanna:And I'm Alex Hanna, Director of Research for the Distributed AI Research Institute. This is episode 57, which we're recording on May 5th, 2025. One of the Trump administration's promises post-election was to conduct the largest deportation project in US history. At the same time, Frontex, the EU's border security agency, wages a large-scale campaign to keep out people on the move. People who are often fleeing state violence. And as ICE and other agencies seem to be spinning up the very infrastructure to target immigrants and refugees at this unprecedented scale, it is clear that this is also a project reliant on, and benefiting, tech companies.
Emily M. Bender:From hoovering up our data to hunt out dissidents and anyone of ambiguous citizenship status, to running quote "AI"-assisted surveillance of physical borders themselves, tech is everywhere in the enforcement of national borders. And from Amazon to NSO Group to Palantir, these are companies whose collaboration with immigration enforcement is greatly profitable to them and threatens a future of faceless human rights violations with no attempts at accountability of any kind. And unlike some of what we talk about on the show, this is a kind of tech that's harmful regardless of its accuracy.
Alex Hanna:Here to help us examine the hype and harm of border tech is Petra Molnar. She's Associate Director of York University's Refugee Law Lab and a faculty associate for the Berkman Klein Center for Internet and Society at Harvard University. She's also the author of the book,"The Walls Have Eyes: Surviving Immigration in the Age of Artificial Intelligence." Welcome, Petra. It's so great to have you here.
Petra Molnar:Thank you so much for having me. It's a real pleasure to be here with you.
Emily M. Bender:Thank you. Thank you. This is gonna be a tough topic, but I'm really glad to be working on it with you.
Petra Molnar:Yeah, likewise.
Emily M. Bender:Okay. This is an article, um, or a, not exactly an article. Well, it's in the newsroom of the Science and Technology Directorate of, uh, US, what, Homeland Security? Is that where we are?
Alex Hanna:Yeah. This is Department of Homeland Security. The Science and Technology Directorate.
Emily M. Bender:And this, it says,"Feature article: Robot dogs take another step towards deployment at the border." And this is three years old. It's from February 1st, 2022. Um, but it feels of a piece with what's going on now. So let's, let's read this and comment. Uh, "The American Southwest is a region that blends a harsh landscape, temperature extremes, and various other non-environmental threats that can create dangerous obstacles for those who patrol the border. The territory is vast and monitoring it is critical to our nation's security. That's why the Science and Technology Directorate, S&T, is offering US Customs and Border Protection, CBP, a helping hand--or paw--with new technology that can assist with enhancing the capabilities of CBP personnel while simultaneously increasing their safety downrange." So where do we start with this?
Alex Hanna:Yeah, I mean, it's, it's, let's, let's get this next paragraph and let's go to the image here. So the quote here is from, um, Brenda Long who is, um, the, a program manager at S&T. And the quote is, "The Southern border can be an inhospitable place for man and beast. And that is exactly why a machine may excel there," says this person."This S&T led initiative focuses on automated ground surveillance vehicles, or what we call AGSVs. Essentially the AGSV program is all about dot dot dot robot dogs." And there's an image of one of these, um, Bost-- I don't know if this one's, um, created by Boston Dynamics, but you've surely seen these Boston Dynamics, uh, style dogs. Anyways. Yeah. Alright. Petra, do you want to, uh, get in here and, and remark on this?
Petra Molnar:Sure. I mean, it's, it's a bit hard to know where to start because this is probably one of the sharpest or, or most violent examples of how technologies are being introduced into virtually every single aspect of migration. Um, and if I can just share a personal anecdote perhaps, for me, this was also one of the more surreal moments of my career trying to understand this interplay of migration and technology. Because in February of 2022, I was literally on the sands of the Sonora Desert. I was really lucky because I got to work with some of the search and rescue groups that are going into the desert to meet with people on the move who are in distress, offer them water and assistance, and sometimes also deal with human remains. And literally when we were on the sands of the Sonora going to the final resting place of a young husband and father who passed away in the desert, the Department of Homeland Security put out this press release. Um, and there's just this sharp, um, so something was really sharply brought into view here, right? Like the fact that even in this artifact that you're showing us, it's all about the kind of assistance that these tools provide to border enforcement. Rather than thinking about what is this actually doing to the environment that people are crossing, people who are exercising their internationally protected right to asylum. And the other note that I wanna make is that, you know, throughout this, this artifact, you'll see this kind of framing that's maybe a bit subtle, but once you pick up on it, you'll see it everywhere. States like to say that we need technology, we need surveillance, we need AI, we need robo dogs lending "helping paws" to deter people from coming. That more surveillance, more technology is somehow going to strengthen the border and people are not gonna come anymore. But that doesn't work. Instead, what happens is people who are desperate to seek safety will take riskier routes. And actually there's been an exponential increase of deaths in the Sonora corridor since the introduction of this type of smart border technology. So again, you know, this is part of a bigger ecosystem, but there's something really viscerally disturbing about seeing robo dogs like this being introduced.
Emily M. Bender:Yeah, and I hear you about this point about whose safety is of concern here. And it's very much just CBP personnel. And if we really cared about the safety of the people who end up crossing the Sonora Desert, we would make it possible for them to enter in, you know, El Paso or wherever, where there's shade and water and they don't, you know, have to put themselves at risk. But that's clearly not the goal of the government that's behind these things.
Petra Molnar:Yeah exactly, and, and I think it reminds us, again, this foundational question that we have to always ask ourselves, like, who is this actually being developed for and whose priorities matter and indeed take precedence when we innovate? Um, you know, for me, I, I'm not a technologist. I'm a refugee lawyer by training and an anthropologist. So I came to this field like really quite accidentally, but I was always really concerned about the fact that tech is really about power and who gets to be around tables that determine what we innovate on and why. And again, if migration is a problem and something to solve, and you also have massive private sector involvement, which I know we'll talk about later, um, for creating this kind of solution to this so-called problem, that's why we, we end up with things like robo dogs or AI lie detectors or drones, instead of using technologies to empower people, to offer psychosocial support to make legal cases, uh, fair and more transparent. It's always about weaponizing technology against groups that have historically been marginalized already.
Alex Hanna:Yeah, absolutely. There's a, there's an element of this and I'm seeing that the, the manufacturer of this is Ghost Robotics, so it's not a Boston Dynamics. So, uh, not, you know, Ghost Robotics is another kind of haunting name. Um, and so there's, yeah, so the subhead here is, "Downrange, the danger to CBP agents and officers is very real." So primarily focusing on the harms to the enforcement agencies. And just the kind of framing here of, you know, of course, who this serves. I mean, this, this framing, uh, right here is, it makes it quite clear. So, uh, this is a quote here from, um, somebody from CBP."'Just like anywhere you have your standard criminal behavior, but along the border you have hu--you also have human smuggling, drug smuggling, as well as smuggling of other contraband, including firearms or even potentially WMD,' explained agent, um, Brett Becker of CBP Innovation Team, or INVNT.'These activities can be conducted by anyone from just a lone individual all the way up to transnational criminal organizations, terrorists or hostile governments and everything in between.'" So, you know, already talking about what this enforcement is, you know, you're weaving everything from individuals, um, and people on the move, to hostile governments trying to sneak WMDs into the southern border. Um, so, you know, very clear on, on the framing here.
Emily M. Bender:Yeah. And I wanna just flag exactly how dehumanizing this is. So just below the photo, it says, "The goal of the program is to leverage technology to force multiply the CBP presence, as well as reduce human exposure to life-threatening hazards." So when they say "human exposure" there, it is very clear that the only people that count as human are the CBP agents and not the people on the move.
Alex Hanna:Yeah.
Emily M. Bender:Uh, all right. There's some really, like laughably awful, uh, not anthropomorphization, but um, I guess we would say, um, canine-pomorphization? canine-morphization, um, where they, they make a bunch of jokes about these things like being shaped like dogs. Um, and we can go back up a little bit, but I think the one I'm remembering is at the very end, um, where, uh, here we go. Yes."In the future, could metallic beasts of burden shoulder some of the physically taxing and dangerous operational work to become a CBP agent or officer's best friend?"
Alex Hanna:Yeah.
Emily M. Bender:Yeah.
Alex Hanna:There's, there's a comment in the chat from Abstract Tesseract,"So many years of companies and cops marketing these weapons in cutesy ways makes me truly sick to my stomach." I remember I went to, um, the, the Barnum and Bailey Circus, which went, I think went bankrupt and then reopened because there was a lot of criticism of their animal cruelty practices. I mean, rightly so. But then they replaced the actual animals with, like, these robot, the robot copaganda. So they had one of these things. So I thought that was so, uh, so ironic.
Emily M. Bender:Yeah. All right. So is there anything else in this article that we wanna look at? Since I skipped us down to the bottom?
Petra Molnar:One other thing that comes to mind, if I may, is also just how these technologies, not only do they dehumanize people, right, that are at the, again, the sharp edge of, of where this is pointed towards. But what they also do to the kind of responsibility that lies in the way that, um, immigration officers have to behave, or make decisions. You know, this is one of the other concerns with tech in the migration space, because so many decisions that get made are already very opaque, very discretionary when a human officer makes them. And technology adds this kind of veneer that people who are powerful like to hide behind. Um, maybe robo dog's not the perfect example for this, but again, there's this kind of reliance on the fact that it's autonomous, that it's going to make decisions in complex regions and spaces, right? But what is that really doing to the fact that, you know, these are very complicated situations. Um, and I just, I just can't believe that this is, this is where we are at, you know?
Alex Hanna:Absolutely. And this is, I mean, I'm curious Petra, in, in your experience and in, in, in being quote unquote on the ground here, you know, like in terms of kind of the dragnet that these tools are being used for at the southern border. I mean, how do they fit in the broader kind of array of technologies that are used? Because it's not just--robo dogs are far from the only thing, right? There's so many other kinds of things. And we know from DHS' own, you know, AI inventory that they've published that there are so many different technologies in use. So how does this kind of fit within the larger, you know, range of technologies that they're using?
Petra Molnar:Yeah, I mean, robo dogs are, I guess one of the, the latest kind of incarnations, um, of this hellscape. But it also includes physical infrastructure, um, different types of trip wires, smart wall technology, drones. Um, you know, also massive data gathering, uh, practices that are happening also further and further inland. Always stretching this kind of dragnet right horizontally into the sky, um, vertically across the landscape, making it very difficult to, um, try and avoid this. Again, that's how we end up with, with people dying in really dire circumstances.
Emily M. Bender:And what exactly does a smart wall entail?
Petra Molnar:Well, that's, that's a good question. You know, not a lot of these parameters are super clear. Um, you know, for example, just last year, um, there was a, one of the, the border expos in, in Texas, uh, there was a brand new kind of addition that was announced that was going to be added to the, the wall across the, the US Mexico corridor. And I was there in May of last year. And, um, my friend and colleague, Todd Miller, who's an amazing journalist in Arizona who works on borders, I would highly encourage listeners and, and viewers to check out his work. We went down to the border to see kind of what's changed and to also try and see what this so-called like smart track actually even looked like. And you know, at first blush it just seemed like it was a bunch of wires that were there possibly, you know, to record environmental factors, maybe movement, maybe feeding information in some sort of autonomous way. But again, a lot of the parameters are not totally clear because a lot of the procurement, of course, around these technologies is actually not shared with researchers like myself or journalists as well. So we find, um, you know, information about this in a piecemeal way or when the powerful actors decide to share this in, in a press release like this, right?
Emily M. Bender:Yeah. Yeah. And just that phrase, "smart wall" sort of pulls on all of the AI hype, I think, right? This is, this is, uh, gonna figure things out for itself. It's somehow better. It's, you know, um, it's a, it's a good idea. Like smart is a very positive valence term. Um, and to sort of just like slap that label on it and not say what's actually going on, um, certainly makes it harder for the public to be informed. Um, and so really appreciate your work in, you know, talking about this and helping people see what's going on.
Petra Molnar:For sure. And I think it's also important to mention that this is very much a bipartisan issue too. I mean, I don't want to move away from this very critical moment that we are in right now with the Trump administration really expanding its surveillance capabilities. But much of the so-called smart border tech was actually introduced by the Democrats, um, in, in the guise of trying to like pretend that their ways of bordering were somehow softer or smarter compared to the previous Trump administration, when in fact it actually is, uh, a progression of thinking around carcerality, around control, around surveillance that actually needs to be queried in the same way as, as some of the violent practices that are happening right now as well.
Emily M. Bender:It's a very good point. And this, this article is from February 1st, 2022 smack in the middle of the Biden administration.
Alex Hanna:Yeah. And I think that's a really important point to highlight. I mean, the idea that e-carceration or any types of smart technologies are in any way less violent than having people or CBP agents or whomever actually patrolling. I mean, you're increasing the panopticon. You're not making it smaller. There's not a way that this is any, any softer or more precise, you know?
Emily M. Bender:Yeah. So shall we move to our next artifact with that?
Alex Hanna:Yeah, let's do it.
Emily M. Bender:Yeah. Okay. So from the middle of the Biden administration to, um, let's see, just a couple weeks ago. This is a piece, uh, by Joseph Cox at 404 Media from April 17th, 2025. Um, sticker is just "news" and the headline is, "Leaked: Palantir's Plan to Help ICE Deport People." And then there's a big long subhead over here,"Internal Palantir Slack chats and message boards obtained by 404 Media show the contracting giant is helping find the location of people flagged for deportation, that Palantir is now a, in quotes, 'more mature partner to ICE,' and how Palantir is addressing employee concerns with discussion groups on ethics."
Alex Hanna:So, yeah, so this article is kind of a long, kind of a long read on a bunch of reporting that 404 has been doing and others about, um, the kind of large data warehousing that Palantir and associated organizations have been really, you know, um, really accelerating under the Trump regime. So the, the, um, first three paragraphs,"Palantir, the surveillance giant, is taking on an increased role with Immigration and Customs Enforcement, or ICE, including finding the physical location of people who are marked for deportation, according to Palantir Slacks and other internal messages obtained by 404 Media. The leak shows that Palantir's work with ICE includes producing leads for law enforcement to find people to deport and keeping track of the logistics of Trump's mass deportation effort, and provides concrete insight into Trump's, the Trump administration's wish to leverage data to enforce its immigration agenda. The internal communications also show Palantir leadership preparing for potential backlash from employees or outsiders, with them writing FAQs that could be sent to friends or, or family that start to ask about Palantir's work with ICE." That's wild. I, I just saw that, I mean, it's like you have to explain to your family that what you're doing is not fundamentally amoral. Uh, and so then the, uh, quote, there's a quote here from, uh, Akash Jain, CTO at Palantir and President of Palantir USG, who says, "Hey, all wanted to provide a quick update with our work on ICE. Over the last few weeks, we prototyped a new set of data integrations and workflows with ICE. The new administration's focus on leveraging data to drive enforcement operations has accelerated those efforts." So Petra, I, I'm wondering if you can mention a little bit about kind of these efforts not only, you know, with ICE, but other types of enforcement agencies doing these large scale kind of data harmonization, uh, efforts and what they've been, you know, like what, what this has entailed.
Petra Molnar:Yeah. I'm so glad we're talking about this element of, of this whole story because I, I think sometimes it's really easy for us also to get caught up in the hype of something like a robo dog right? But it's a lot harder to try and understand the kind of infrastructure of data that underlies these kind of, um, decisions that get made when the administration decides that deportation is its number one goal. Private sector partners come in and they have to effectuate these goals somehow. And, and Palantir has been making bank, I mean, they've been involved in, um, these kind of public private partnerships for years. They are one of the primary actors in what some of us call the border industrial complex. Um, they've also expanded further, right? And, and a few years ago, there was a lot of criticism because even the World Food Program was partnering with Palantir. Why, nobody really knows. But again, they are able to kind of get their fingers into all of these very lucrative pies. Um, but it's also showing, I think this particular article shows that some of the cracks are perhaps beginning to appear with its own workforce, challenging, um, some of the ways that this is being presented. Right. Um, but it is, it is really depressing to see just how much of a hold actors like Palantir, but also Cellebrite, um, Israeli tech companies, other bigger actors like Clearview AI have in this space again, of, of kind of presenting their tools as a solution to the government's so-called problems. Um, that's what I think we're really seeing here very clearly.
Alex Hanna:Yeah. And Palantir, I mean, the kind of prior, prior incarnations of Palantir focused a lot on these data fusion centers, especially on kind of domestic surveillance. Um, I know, you know, that's-- folks like Sarah Brayne, you know, who have written about the way that this has been marketed to local police departments and ways of connecting, um, you know, really focusing on trying to target people who are suspected of criminal behavior. But this is, this is a real kind of scaling up of Palantir's machine here.
Emily M. Bender:I was really mad about this paragraph. So, uh, "The leaked material contains more specifics on, on that work." I skipped a couple paragraphs, but that work."Palantir's role also includes a quote'self deportation tracking project', which is designed to help ICE develop a more accurate understanding of people who voluntarily leave the United States, and another project concerning immigration lifecycle operations, which will support the logistics of deportation, such as overlaying information about detained or removed individuals, and the availability of transportation resources, according to the Wiki." And like the phrase "immigration lifecycle" ought to refer to, you know, welcoming people to the country, helping them get established, you know, making sure that they have a community that they can join, that they're housed, that they're fed, that they-- right. And, but that's not what they mean by that at all here, right?
Petra Molnar:No, not at all. And, and I think there's that bit also about, you know, voluntary deportations. I, I would also like to challenge that because there's been other programs in the past, including by the International Organization for Migration, IOM, that had a similar program like this that would basically give cash incentives for people to self deport. That is coercion. I mean, it's, it's the same kind of framing, right? That, that somehow you're able to compel somebody to, to remove themselves in the middle of their legal case, for example. Um, yeah, I mean, again, it, it, it all speaks to power imbalances that are so clearly coming to the fore now.
Emily M. Bender:Yeah.
Alex Hanna:Is Palantir, I know you mentioned Cellebrite, and which is a, which is an Israeli, uh, company. Can you talk a little bit about the kind of. I mean the network of different companies that are involved. I mean, Palantir probably gets the most press with regards to US surveillance, but I'm wondering, you know, there's a whole network of organizations that are involved in this infrastructure, right?
Petra Molnar:Yeah, totally. Alex, it's like an ecosystem that is so vast, and I think actors like Palantir catch our attention again because they are so huge. But there's also so many small and medium sized actors that are doing really evil things as well. Like one that comes to mind is Brinc, for example. Um, maybe you remember it from a few years ago. It was essentially some young tech bro, uh, who decided that it would be a good idea to put a taser on a drone and fly it along the US-Mexico corridor. And he got a bunch of VC money for it. Luckily The Intercept covered it. There was a big thing. Finally they pulled it or whatever, but still, like millions of dollars were raised for this so-called pilot project. So there's also these companies that like no one's really heard of, but I'm so glad that you brought up Cellebrite. Um, again, and Elbit Systems, Israeli companies that are also really implicated in this because this is also a global issue. Um, I've been trying to understand this from a global perspective because there's so many intricate relationships that are, um, being strengthened in this current moment. And much of this technology is first experimented on Palestinians, both during the Gaza genocide, but also in the occupied Palestinian, um, West Bank, uh, area, for example. Um, technologies are tested there and then exported out for border enforcement. Elbit Systems, for example, has, um, fixed AI towers along the US-Mexico corridor. I've seen them with my own eyes. They're there. Um, Hermes drones, for example, Israeli drones, fly along the Mediterranean for border enforcement. So it is this kind of global, um, industry that has grown up around the testing of tech in, in spaces of occupation and conflict, and then exporting that tech for border enforcement too.
Alex Hanna:Yeah, absolutely.
Emily M. Bender:All right. So, um, in contrast to that, Palantir writes, um, back to the article,"Palantir writes it remains committed to privacy and civil liberty protections and says it believes this work with ICE is, quote, 'intended to promote government efficiency, transparency, and accountability.'" So, uh, "'We believe these conditions are the necessary predicate to provide the tools to help ICE drive accurate enforcement actions and enable fair treatment and legal protections, including due process for citizens and non-citizens,' the Wiki says." Like they're just lying through their teeth here. Like, in what world? You know? Um, I'm gonna do the next one too, 'cause it gets even worse."'Palantir is cognizant of the risks to privacy and civil liberties involved in these mission sets and how they may be influenced by shifts in priorities,' another section reads.'Many risks will not be within our means to address. Some are structural and must be fully baked into the equation by virtue of a willingness to engage at all in these efforts. It's important to note that there will be failures in the removal operations process,' it adds." You know what, you can actually avoid the threats to privacy and civil liberties by not building this tech.
Alex Hanna:Yeah, yeah, absolutely. Go ahead. Go ahead Petra.
Petra Molnar:No, I was just gonna say that that just about breaks my heart because this is, I think, one of the foundational problems. In this whole thing, like the fact that private sector actors are literally able to wash their hands of responsibility, right? Well, not our problem if things go wrong, like we're just developing this, selling it off to ICE and you know, that's it. We see this manifesting in classrooms, right? Where people who are learning to be engineers and coders don't feel responsible for the tech that they're creating. We see this in Palantir and big, um, private sector uh, you know, actors, we see this with the government. Like there's so much responsibility laundering in this whole thing. Um, it really kind of turns the stomach, doesn't it?
Emily M. Bender:It's the cute little robot dog's fault, surely.
Petra Molnar:Exactly.
Alex Hanna:Yeah. I mean, I think there's, I mean, it's a really important point too, because I think the way that they are, I mean, they are abdicating responsibility here. And so, you know, here, you know, they, they continue to say, "Many risks will not be within our means to address. Some are structural and must be fully baked into the equation by virtue of a willingness to engage at all in these efforts. It's important to note that there will be failures in the removal operations process." I mean, and it's such an anodyne kind of, uh, like framing of that. I'm curious on, on, on, on your thoughts, um, Petra, just thinking about the different discourses that are being mobilized. I think with Palantir and, you know, and I mean by extension, Trump and, and I mean in, in DHS kind of around criminality, around culpability and how the tech really plays into it or doesn't really play into it. I mean, I'm curious on your thoughts on that and what you've observed.
Petra Molnar:Yeah, absolutely. That's one of the major kind of animating factors behind, like normalizing a lot of this tech. Um, some scholars like César García Hernández have been calling it "crimmigration", or the criminalization of migration. Um, you know, this, this kind of idea that people on the move are somehow criminal unless proven otherwise. And that's even, not to get too legal-y, and I'm a reluctant lawyer anyways, but if I may, it's already baked into the way that the law around refugee status functions, right, right. Like you're not a refugee until proven otherwise. So there is this whole reverse onus principle that lies at the heart of how we even think about people on the move who are claiming asylum. And then you have this other layer, right?"Well, and they're probably criminals anyways, or they're terrorists", or these days, "or they're pro-Palestinian" or whatever other thing is being applied to people. Right. Um, and if that's the foundational idea, and then all of a sudden you have these tools that are being presented to you by the private sector to help you with this goal of rooting out the criminals, the, the terrorists, the threats, the ultimate other. Um, you know, it just, it maps onto logics that have always been there with bordering, they're just now made sharper and more, um, I think obvious in a way actually, uh, with this kind of messaging.
Alex Hanna:Yeah, absolutely. I, uh, that term is really useful. Crimmigration. Yeah. Thank you. This kind of idea that the act itself, of being a person on the move, should be somehow criminal behavior, or you're fleeing something, or you're trying to sneak in some kind of a, a, um, some kind of criminality or something to, to--
Petra Molnar:And if I can just add to that, it, it also only applies to specific groups of people, right? People who are racialized, people who are othered. It doesn't apply to investor immigrants who are able to move freely, right? Or to goods. They're able to travel around the world. But certain groups of people that are historically marginalized, um, or those who are facing systemic racism, they, they are the ones who are the ultimate other and therefore have to be controlled somehow. Through technology, increasingly.
Emily M. Bender:Yeah. I wanna catch up a bit on the chat here. Then there's, there's three quick things I wanna hit in this article before we move on. Um, so on the point of, uh, historically and actually also currently in the US, offering payments to people who quote "self deport", Magidin says,"And they're not telling those people that if they quote 'self deport', they will be barred from returning legally for perhaps up to 10 years and are instead telling them that they can come back legally after they return to their home countries," which is rough. Um, Magidin again, "They are committed to their privacy and their civil protection. Those folks have sued elsewhere to prevent their contributions to unpopular causes being made public." Abstract Tesseract:"The absolute gall of talking about transparency and due process while people are literally getting disappeared." Um, and, uh, Magidin again, "'Hey, don't blame us. We didn't create the risks. We're just greatly exacerbating them.'"
Alex Hanna:Yeah. Yeah, absolutely. There's a few other things. And do you want to, do you wanna read them? I mean, they're, they're very bad at the end of this article.
Emily M. Bender:They're bad, but they're kind of hilariously bad.
Alex Hanna:Yeah. So, so yeah, go ahead. I'll, I'll do, I'll do one.
Emily M. Bender:You do the first one. Yeah.
Alex Hanna:So John, so this quote from the article, so "John Grant, Palantir's ethics education program lead--" Which, it's just really incredible that Palantir even has such a thing, um,"--posted links to some internal pages that quote 'might help people think through some of the questions this work might raise for you.' End quote." And these are in bullet points: "Ethics FAQ," quote, "Can it be right to support a customer who you think is wrong?" end quote, and "Ethics discussion: the ethics of immigration." And I'm just really dying to see what kind of draws they have on those pages internally.
Emily M. Bender:Yeah. And Petra, what does ethics of immigration mean to you?
Petra Molnar:Well, you know, when I read that, I think like, yeah, I mean is it ethics of who gets to come, right? Like is it ethics of, um, who's kind of in power making those kind of determinations? And also it's, it's ethics, right? It's not human rights. It's not law. It's not something that is enforceable. Um. It's something for us to think about and wring our hands over, maybe over the water cooler. Um, and that's about it.
Emily M. Bender:Yeah. Make up trolley problems with, you know, people on the move in them. I have to say-- cat! So Alex, I thought you were gonna go for the one that, Jain's quote here. So, "'I recognize this is a topic of interest for a lot of hobbits and we're working to integrate these updates into the PCLFAQ,' Jain added, with hobbit a likely reference to--" Obviously."--the Lord of the Rings, um, which Palantir gets its name from." So they refer to their employees as hobbits.
Alex Hanna:That's just really, it's just really, I mean, I don't know what Tolkien's ethics were. If you, if you think he would be rolling in his grave, get in the chat. Um, and then the last--
Emily M. Bender:These last two here. Yeah.
Alex Hanna:Yeah. These last two. So the, this one that I think circulated around was this image where Palantir, this ad. So Palantir is currently running adverts at US colleges, which say "A moment of reckoning has arrived for the West--" And West is capitalized here."Our culture has fallen into shallow consumerism while abandoning national purpose. Too few in Silicon Valley have asked what ought to be built and why. We did." So we're really just going, you know, full force, you know, return to like Samuel Huntington, you know, the, the, the Muslims and our, and the barbarians are at the gates. You know, it's just, we're back to it. You know, it's, it's, it's a tale as old as, I don't know, organized religion.
Emily M. Bender:I mean, I agree that too few in Silicon Valley have asked what ought to be built and why. I think that Palantir got the answer wrong.
Alex Hanna:Yeah.
Emily M. Bender:So, okay. Then one last thing, and this is just absolutely horrific."Acting ICE director Todd Lyons said at the recent Border Security Expo that his intention for the agency is 'squads of trucks, detaining immigrants in a similar way to how Amazon trucks are around the country delivering packages.'" This was reported in the Arizona Mirror. I mean, the dehumanization is never far away, but it is like, so on the nose there, it's unbelievable.
Petra Molnar:Yeah. I mean it's, it's, it's become so obvious, right? Um, it's, it's shocking to see this, but, but also, you know, I mean, it's been kind of ticking along with, with Palantir for, for many, many years. I mean, and, and it also makes me think of the, the kind of political aspirations of the company, or at least its founder Peter Thiel, the fact that, you know, he essentially bankrolled, um, Vice President JD Vance's political career. I mean, yeah, they're, they're really enmeshed in there, in these nation building projects. Um, and unfortunately right now part of the nation building project is to deport the so-called unwanted people.
Alex Hanna:Yeah, absolutely. And when you see this ideological project. You know, there's a, there's a sort of highbrow version, which is this, you know, which is a barely disguised, uh, uh, barely disguised racism. Um, that, and, you know, Alex Karp just released this book, um, that was effectively an extended version of this sentence, which is "What if Silicon Valley was enmeshed in the military industrial complex?" As if it wasn't, you know, born of such a thing, uh, in, you know, in its earliest incarnations. And then there's the kind of lowbrow populist, uh, Trumpian message.
Emily M. Bender:Yeah. So we, we have Magidin in the chat here, who's a deep well of knowledge about all things, Lord Of The Rings. Um, so comment is, "Well, the thing is that the people who had palantirs--" And sorry for pronunciation, "--the Dunedain, Sauron, Saruman and Denethor, all either completely ignored or definitely looked down on hobbits. So."
Alex Hanna:You also have Were Piligrim who says, "Bilbo Baggins type hobbits or Gollum type hobbits?" So.
Emily M. Bender:Yeah, I mean, so who are the people, like one would hope to see like mass worker action at Palantir, but I'm afraid that they probably have collected the people who think that, oh no, don't put politics in my computer science classes.
Alex Hanna:Yeah, I think you, you've got a, you've got a certain kind of selection bias there.
Emily M. Bender:Yeah. Alright, so one last artifact, Alex, I'm gonna let you lead on this one.
Alex Hanna:Sure. So this is from a publication and I believe this is just a press release, um, from, um, but the publication is called Defense Industry Europe, and this is from May 1 of this year. And the title is "Athens Prepares to host DEFEA--" uh, which is D-E-F-E-A, um, which is the acronym for Defense Exhibition Athens? Or-- yeah.
Emily M. Bender:Yeah.
Alex Hanna:Um, and then, "--a major hub for international defense cooperation." And then there's an image of people walking through, uh, this kind of conference center. And then the, uh, the bold is about the conference. So the, "The countdown to DEFEA, Defense Exhibition Athens 2025, has officially begun, with the international event set to take place from 6 to 8 May, 2025, at the Metropolitan Expo in Athens. Held under the patronage of the Hellenic Ministry of National Defense, the exhibition will welcome 436, uh, exhibitors from 37 countries, including 18 national pavilions and 98 official delegations from 45 nations." So Petra, you're, you're heading to this thing, right?
Petra Molnar:I am. I am. And you know, I said to myself that I would never set foot in a setting like this, because I have gone to DEFEA previously, I've gone to the World Border Security Congress while writing my book and, you know, it's, it's, it's hard to even reflect on what these spaces are like because they are just so overtly violent and, and difficult to be in. Um, and, you know, doing this work now since 2017, I've seen some difficult stuff, you know, like whether in the occupied West Bank, whether in refugee camps, you know, just really seeing conditions that challenge your, your common kind of understanding of humanity. That stuff stays with you. But I have to say, actually it's the stuff from spaces like this that really makes you feel so gross. Like the hottest shower in the world won't wash off what you feel when you are in these kind of big spaces. Um, but for a variety of reasons, uh, to collect some more data, I am going, um, again, but it will be, I think it'll be interesting to see also what has changed in the last couple years and how much more normalized, um, again, this kind of conflation of military and defense technology is with migration and border enforcement. Um, because before, for example, DEFEA was also sponsored by the Hellenic, the Greek migration ministry. Um, so there was a really, really clear conflation there. Um, and they made it very obvious that they were really interested in defense technology for the purposes of border control.
Alex Hanna:Yeah, and you talk a lot in your book about how Greece is just the site of just immense human rights violations at the border of people on the move trying to enter the EU, and these kind of co-- concerted strategies of things like pushbacks and, and, and blocking any type of people trying to make asylum claims. Can you talk a little bit about kind of like Greece's implication in this, as well as kind of like how they're, they're using this space as a way to communicate that with partners? And in, in, in this text they mention officials from NATO and the EU coming, and then other people from a lot of institutions and, and acronyms that I'm not familiar with.
Petra Molnar:Yeah, so Greece is a really fascinating player in, in this conversation. Um, they have been explicitly referred to in European Union policy documents as "Europe's shield" because they're one of the frontier countries, um, along with Spain and, and Italy as well. And they're also a country that's been hosting a large number of people on the move and refugees. There were thousands of Syrians, for example, arriving in 2015, 2016. And, and those numbers have steadily continued since then. Um, but what makes Greece interesting also is that it is this kind of laboratory for experimentation. Um, Greece got a bunch of money from the EU to build high tech refugee camps on the five main islands that were warehousing refugees in essentially open air prison refugee camps. But now those camps are full of biometric tech, drones. At one point there was even an announcement made, talk about like AI hype, um, of virtual reality glasses being given to the border guards to wear, collecting some sort of information, beaming it to a control center. So Greece is a very important kind of bellwether to, to pay attention to because there's so much normalization of tech there that then gets exported to other parts of the EU. And, and also countries like the US and Canada learn about it, right? And, and then they wanna replicate it. Um, I found myself in Greece in the summer of 2020 in the middle of the Covid pandemic because Covid was also being weaponized as a way to not allow people and refugees to, to move freely. And like a good little ethnographer, I thought I would stay for a few months and I stayed for two and a half years. And I've been going back since. And my most recent visit to one of the camps was last summer, um, to the island of Samos, to see what's kind of been happening there. And so much of the tech that's going to be shown at, um, DEFEA, this, this conference that's coming up tomorrow, is actually now being implemented in these spaces. Spaces that are supposed to be for protection, but actually they are spaces of carcerality now.
Alex Hanna:Yeah. There's one piece in here. Oh, sorry, Emily.
Emily M. Bender:Oh, I was just gonna lift up something from the chat. Abstract Tesseract says, "The image in the article is so disturbing. Like a fan convention for state violence."
Alex Hanna:Yeah, a hundred percent. If you go to the DEFEA website, it's really actually unsettling. You click on it and it's, all of it is like this, um, background carousel of just people in military garb, just like walking around looking very important. Um. I do, one thing I wanna raise up, uh, here is-- Um, oh yeah, the image in the article. Thank you, uh, uh, Christie, reminding us to describe it. So there's lots of people in suits and some of them are holding up cameras and looking around. Um, and then there's a few different, um, signs for technologies. So one's called KeraMetal, one's called Matador Industries, one's called MSM Land Systems. And then it's just kind of packed to the gills with, um, exhibitors. Um, and you can't really see a lot of the, uh, paraphernalia, although there's kind of one that you can see to the right, in which there's like a parachuted or paratrooper like mannequin and then people kind of diving in a circle. It's sort of like, almost like inspiration core, except violent because everybody's military. It's very bizarre. I'm like zooming in and trying to see on this. It's a really, it's a really bizarre kind of view.
Emily M. Bender:And the other thing that's striking me about this picture is that this must be from like 2021 or 2022, because I think everybody's wearing a mask.
Alex Hanna:Yeah.
Emily M. Bender:And I, I doubt that's the vibe that you're gonna find there tomorrow.
Alex Hanna:Totally.
Petra Molnar:No, probably not. But I think, we'll, we'll see some of the similar stuff that we're looking at in, in the photo there. I remember, you know, the last DEFEA I went to, yeah, would've been in 2022. And yeah, it was, it was shocking to just even be in such close proximity to a lot of the tech, the robo dogs, the tanks, the kind of paratroopers, uh, paratrooper stuff, the biometric tech. Um, and it's just all there along with free pens and tote bags that you can pick up and things to test out, and then people who kind of will, well at first willingly talk to you and then they see "researcher" on my badge and all of a sudden they're like, wait, not her. We're not gonna talk to her.
Emily M. Bender:Alright. So I think there's one thing, Alex, did you wanna get to the shield thing? Where did that go?
Alex Hanna:Oh, no. Well, I did want to mention two things. So first off, there's a very funny thing in here where they say, "One of the highlights of DEFEA 2025 will be the unveiling of Greece's new multi-layered air defense framework, the quote, 'Shield of Achilles'," which is very funny. Achilles, a, a man who is very well known for his defenses, you know, um, so just a very funny name. Um, and then, you know, the jokes write themselves. I did wanna also lift up, uh, uh, or, or not lift up. Uh, but like mention these, these two quotes, especially on the, uh, the autonomous version of the weapons. So, "The exhibition will also feature advanced missile systems in a full spectrum of precision guided munitions vital to modern warfare. Attendees will see a comprehensive range of autonomous systems, UAVs, USVs, and UUVs--" Um, what is a-- I know a UAV, that's a drone, but, well, USV is an unmanned surface vehicle, and UUV, un, maybe underwater?"--and cutting edge counter drone solutions designed to neutralize both singular and swarm threats." And then, "In addition the display will also include main battle tanks, armored vehicles, helicopters, artillery, multiple launch rocket systems, communication command systems, small arms, guided missile, strategic weaponry." So really, yeah, I mean, like Abstract Tesseract's comment, this is definitely like Comic Con for, um, you know, uh, uh, for, uh, state violence, uh, uh, um, executors and commanders.
Emily M. Bender:Petra, I hope that you can, uh, get what you need for your research without taking on too much psychic damage from just being in that space. It does sound awful.
Petra Molnar:Yeah, thanks. I mean, I've, I've taken a bit of a step back from doing a lot of the on the ground work, so a little bit here and there is okay. But yeah, it, it does take its toll for sure.
Emily M. Bender:Yeah. And to be in a space where the people are excited about these things instead of terrified is, yeah. Okay. I think that brings us to our transition to Fresh AI Hell, Alex. Musical or non-musical?
Alex Hanna:I can't get amped after that. I think non-musical.
Emily M. Bender:Non-musical. Okay. So, uh, let's say that you are one of the demons of Fresh AI Hell, and you are tasked with taking one of the robo dogs out for a walk. But it is being recalcitrant.
Alex Hanna:Oh my gosh. How do I do this? Well, my, my AI Hell accent has been sort of vaguely New Yorkian, although if there's any New Yorkers listening, don't, don't hold it against me. Um, like, Come on. Come on, Sparky. Come on, come on. Oh, I gotta get on your little boots. I gotta get on your boots to put 'em on. Okay. Sit, sit, sit. I command you to sit. You gotta listen. Don't you got ChatGPT in you or something. Listen. All right, sit. I gotta get your little boots on. The boots are actually like spikes so it doesn't fall down or something. Anyways, that's my impression. Sorry to all New Yorkers out there.
Emily M. Bender:Well, thank you. And, and um, yeah, as I was trying to figure out what prompt I could possibly use I decided I was gonna go for the only cute, though evilly cute, thing that we talked about. Um, okay. So Fresh AI Hell, um. Alex, you wanna go first?
Alex Hanna:Yeah. So this one is from Futurism and the sticker here is "The time of monsters", which is absolutely perfect. So this is from May 3. Um, and the title is, "Facebook allegedly detected when teen girls deleted selfies so it could serve them beauty ads." And the quote here, I don't know who it's attributed to, but it, it is,"This is what puts money in all our pockets," and it looks like, thank you. Searching this. This is, oh, okay. So this is, um, this is a quote that was told to Sarah Wynn-Williams, the author of "Careless People," who, uh, who used to be the director of public policy at Facebook, uh, or rather Meta, and, um, and, and became a New York Times bestseller 'cause they Bar, uh, Bar, they Barbara Streisand. Yes. I said, I said the full Christian name, Streisand effected it. Um, okay. And this attributed to "one of the top advertising executives," so oof.
Emily M. Bender:Oof. Yeah. Yeah. Um, and I was just backing up to our transition, um, Abstract Tesseract said about the dog, "Sudo sit."
Alex Hanna:Yes.
Emily M. Bender:I don't know where this went.
Alex Hanna:Super user privileges to those dogs. If only it was that easy.
Emily M. Bender:Okay. So this next one comes from TechCrunch. The sticker is "security". The journalist is Zack Whittaker. It is from May 2nd, 2025. And the headline is "Dating app Raw exposed users' location data and personal information." So that's already bad enough. But the thing that I wanna focus on here is this bit of hardware, these rings, um, that, um, supposedly, and this is just so awful. Um, so, uh."News of the security lapse comes in the same week that the startup announced a hardware extension of its dating app, the Raw Ring, an unreleased wearable device that it claims will allow app users to track their partners' heart rate and other sensor data to receive AI generated insights, ostensibly to detect infidelity."
Alex Hanna:Yeah. Gosh, I cannot think of something that is just. I mean, this is the, this is the most insecure piece of technology I think I've ever heard of.
Emily M. Bender:Exactly. And also like "AI generated insights". Are they going to be reading in the sensor data and doing like an encoder decoder thing where the decoder just like randomly makes shit up about--
Alex Hanna:Yeah, it's gonna make some determination. Like it's gonna think that, you know, you're having sex when you just like went on a run or something.
Emily M. Bender:Yeah. And also to be a little bit on topic for the episode, does this look like the sort of repurposing of some border surveillance tech or something that would likely be turned into that?
Petra Molnar:Yeah, very much so. That's kind of where my mind went as well. You know, the kind of emotion recognition that is being played around with when we're introducing like AI lie detectors at the border. You know, these kind of like quasi or fully snake oil ways that somehow we can decode what's happening physiologically. Yeah. I mean all of a sudden you find it in a ring that is making predictions about your behavior and maybe you just have asthma.
Emily M. Bender:Yeah. And and the weird thing about this is it's, you know, it's the luxury surveillance thing, like people would be ostensibly in most cases wearing this ring voluntarily, although you could also imagine coercive situations depending on the relationship.
Alex Hanna:Yeah.
Emily M. Bender:Alright. This is supposed to be fun folks.
Alex Hanna:Sorry. All right, so this one is from, well, this isn't fun. So this is--
Emily M. Bender:This isn't fun, no.
Alex Hanna:This is from the Gothamist. Uh, the title is "MTA wants AI to flag quote, 'problematic behavior' in NYC subways." And, um, let's see, let's get some more detail. So, scrolling down. So this is by Stephen Nessen, uh, published on April 28th, um, and it says, "MTA Chief Security Officer Michael Kemper said the agency plans to use artificial intelligence technology to detect potential trouble or problematic behavior on our subway platforms." Uh, and it looks like this is, oh Lord. And so they say that "it can analyze real-time footage from subway security cameras and issue automated alerts to NYPD if 'someone is acting out irrationally.'" He called the technology, quote, "predictive prevention," unquote, that they say can essentially identify subway criminals before they commit crimes. Absolute nightmare stuff.
Emily M. Bender:Ugh, yes. Um, and Magidin says, uh, "I see a problematic behavior: trying to use AI to police folk."
Alex Hanna:Yeah.
Emily M. Bender:Yes.
Alex Hanna:I mean, and this, and then of course it's gonna be, I mean, this is, this is so easily going to be used to police people who, um, you know, may be having psychotic breaks or mental health issues. And, you know, this is gonna be deployed disproportionately on Black and Brown people in the city.
Emily M. Bender:Yeah. Okay. Definitely not fun. Next, this comes from the Los Angeles Times. Uh, sticker is "California", journalist is Jenny Jarvie, uh, national correspondent, from April 23rd, updated April 24th. And the headline is, "State Bar of California admits it used AI to develop exam questions, triggering new furor." And I just wanna say, I cannot believe that we keep getting instances of lawyers getting in trouble for using ChatGPT or similar and then like submitting the results. It's like, don't lawyers gossip? Like how are people continuing to make this mistake? And then we've got, okay, great. So the bar, which is in charge of, you know, licensure of lawyers, is also doing the thing. Just, ugh.
Petra Molnar:Yeah, and it's, it's hard also, you know, from like the pedagogical side too. Um, you know, when you have students who are learning in your law classes, right? And, and all of a sudden you get a very strangely written, almost too perfect pleading, you know, from a student and you're like, hold on. You know, it's, it's hard to know, yeah, like how to police even the use of AI in, in your own classroom.
Emily M. Bender:Yeah. I mean, I think the trick is to not police it, but, and it's hard, right? But to, to work with students to sort of make clear why it is that it is not to their benefit to be doing this. Um, and once you make the decision not to police anything in the classroom, you're basically saying, yeah, I'm, I'm not going to be able to perfectly avoid all of this, but you never can. Right? Um, but yeah, it is super frustrating to look at something that's like, I know that this is not the student's writing, um, and yet I am, and I'm forced to assign grades. So, what do I do in this situation? Right?
Alex Hanna:Yeah. Yeah.
Emily M. Bender:Okay.
Alex Hanna:Staying in California. This is some awfulness that is, uh, actually from a press release from the State of California: "Governor Newsom announces launch of new AI tool to supercharge the approval of building permits and speed recovery from Los Angeles fires." Uh, and so this is part of Newsom's plan, um, to integrate generative AI into everything from traffic to homelessness, um, which makes no goddamn sense, um, but the lead here is, "Leveraging the power of private sector innovation, Governor Gavin Newsom today announced the launch of a new artificial intelligence-driven software to aid LA city and county in accelerating the approval process for rebuilding permits to help communities recover more quickly from the Eaton and Palisades fires." And this is really wild. So if you scroll down, like, I don't wanna read what Newsom said. "The software, created by Archistar, will be provided free of charge to the local governments and to users through a partnership between the state and philanthropic partners, including LA Rises and Steadfast LA--" These are billionaire-funded organizations, by the way. LA Rises I think is funded by, um, Magic Johnson, and Steadfast LA is funded by, uh, Rick Caruso, who ran for mayor of LA and lost. Another kind of heir to a billionaire fortune. Um, and then "--with contributions from Autodesk and Amazon." And then there's a quote here from Caruso where he says, blah, blah, blah, "Bringing AI into permitting allows us to rebuild faster and safer, reducing costs and turning a process that can take weeks and months into one that can happen in a matter of hours or days." And this is so ridiculous, because you're issuing permits, but don't urban planners still have to verify all this?
Emily M. Bender:I mean, you hope so, right?
Alex Hanna:Yeah. So it's this idea of, you're reducing bureaucracy, but you're also taking people that work for city and county government, um, out of the equation. Right?
Emily M. Bender:So this last paragraph that's on the screen, "The software uses computer vision, machine learning, and automated rule sets to instantly check designs against local zoning and building codes in the assessment process for building permits." And it's like, the reason that California earthquakes are not super deadly is that California's actually got pretty good, uh, building codes.
Alex Hanna:Yeah. Yeah.
Emily M. Bender:I do not want non-determinism in the application of the building codes. It's terrifying. Okay. Uh, gosh, I really did not pick anything uplifting this time.
Alex Hanna:No, you didn't. Usually we have something nice to end on and it's, yeah.
Emily M. Bender:And I didn't do it this time. I'm sorry.
Alex Hanna:Really awful.
Emily M. Bender:Okay. So this is, um, an article from a Swedish, um, publication, um, about a museum in Malmö, which is across the bridge from Copenhagen there. Um, and it's this really awful picture of a cheerful-looking Holocaust victim, 'cause it's a fake picture. And, um, it's in Swedish, which I don't read, but the quote tweet by, uh, Vivo Kale says, "AI makeup retouch on Holocaust images. I'm shocked. Shocked." And then, "Well, thanks to listening to @Emily and @Alex on Bluesky, not that shocked." So yes, some museum decided it would glow up these images of Holocaust victims. Like, I can't even. Okay, one last one. Also not uplifting. Go for it, Alex.
Alex Hanna:Yeah. So this one is really fucked up. So this is, uh, this is from 404 Media, and the title is "Researchers secretly ran a massive unauthorized AI persuasion experiment on Reddit users." It's by Jason Koebler, from April 28th. Um, and what they effectively did here is they targeted this subreddit called, I think it was r slash change my mind, where people--
Emily M. Bender:Change, change my view.
Alex Hanna:Change my view, thank you. Uh, and they had this, um, basically this setup where people proffer arguments, you know, against a view, and then you can kind of arrange to give people points based on whether you find something convincing or not. And effectively what happened is that there are these researchers who are unnamed, uh, which is really weird to begin with. Um, and they haven't been named yet, at least at the time of recording. And, um, they developed all these bots to effectively claim that they had been people from particular identities or had certain experiences, to, like, try to change people's minds about things. So including, like, a sexual assault survivor, a trauma counselor, and a Black man specifically opposed to Black Lives Matter. So like, just really, I mean, I don't think we've seen anything like this since, I mean, the Facebook emotional contagion study. And they made a couple thousand posts to this end, and then they had argued that there was some kind of IRB approval on this, but the identity of the researchers is anonymous. And I think they also didn't release, I don't know, did they put out a data set or a paper with no names on it? I was listening to the 404 podcast on it. And just the details of this, from a research ethics perspective, are just absolutely shocking.
Emily M. Bender:Yeah, absolutely no informed consent. And you say, we haven't seen this since the Facebook emotion contagion experiment. That doesn't mean it hasn't been happening.
Alex Hanna:Yeah. Yeah. This is the one where, you know, I think, did they admit it or did they get caught?
Emily M. Bender:They contacted the moderators asking for feedback on the article. And the moderators were like, hell no.
Alex Hanna:Yeah. And so now Reddit is taking action against these researchers.
Emily M. Bender:Yeah. Whew. All right. Well, um, I'm sorry I didn't put anything uplifting in there. Let's say book. Book, that's uplifting. The book is coming out. Okay. That's it for this week. Petra Molnar is Associate Director of York University's Refugee Law Lab and a faculty associate for the Berkman Klein Center for Internet and Society at Harvard University. Petra, thank you so much for joining us and talking through these deep and important issues today.
Petra Molnar:Thank you so much for having me, and thanks for the work you do. Everybody buy their book.
Alex Hanna:Buy Petra's book, "The Walls Have Eyes." Um, really, really fantastic book. I recommend everyone read it. Um, our theme song is by Toby Menon, graphic design by Naomi Pleasure-Park, production by Christie Taylor. And thanks as always to the Distributed AI Research Institute. If you like this show, you can support us in so many ways. Pre-order "The AI Con" at TheCon.AI or wherever you get your books, and join our virtual book tour kickoff event on Tuesday, May 8th. Or find us on the road; a full list of events is at TheCon.AI.
Emily M. Bender:But wait, there's more. Rate and review us on your podcast app. Subscribe to the Mystery AI Hype Theater 3000 newsletter on Buttondown for more anti-hype analysis, or donate to DAIR at DAIR-Institute.org. That's D A I R hyphen institute dot org. You can find video versions of our podcast episodes on PeerTube, and you can watch and comment on the show while it's happening, live on our Twitch stream. That's Twitch.TV/DAIR_Institute. Again, that's D A I R underscore Institute. I'm Emily M. Bender.
Alex Hanna:And I'm Alex Hanna. Stay out of AI Hell y'all. Sparky, get your grippy shoes on.
Emily M. Bender:Oh, what a cute little Sparky. Ouch.