Chaos and Conviviality
Dip and Scooter discuss current events and culture, with an eye towards strategies for liberation.
Google and Elon Musk Want to Put Data Centers in Orbit. Will that Solve AI’s Contradictions?
On this episode of Chaos and Conviviality, Dip and Scooter discuss current proposals to put data centers in space, an idea which would seem to be a win-win for communities and data center developers. They analyze the push to adopt "artificial development," what it means for our communities, and how to fight back.
See time stamps below:
- 16:17: Are there any benefits of AI?
- 24:20: Indigeneity and space
- 27:38: Data centers in space: good for communities?
- 34:00: What might organizing in space look like?
- 55:35: AI bubble?
Highlighted show notes (for full show notes, see the webpage):
- Meet Project Suncatcher, Google’s plan to put AI data centers in space
- In Kenya’s slums, they’re doing our digital dirty work
- How Elon Musk’s Sci-Fi Hyperloop Failed
- Everything for Everyone: An Oral History of the New York Commune, 2052-2072, by Eman Abdelhadi and M. E. O'Brien
- Circular Vendor Financing in the AI Sector (includes graphic mentioned in conversation)
- Building the Structure of the New Society Within the Shell of the Old
Google is looking to build data centers for AI on orbiting satellites as part of Project Suncatcher. So these satellites would basically be solar powered and positioned so as to get maximal sunlight. Some engineering sticking points are the lack of speed for wireless communication between satellites, and the fact that the satellites would need to be really close together, basically closer than any other satellites in orbit. And this whole project really hinges on Google's TPUs, which are kind of an alternative to CPUs and GPUs, right, processing units that they're working on. These would need to be able to withstand five years of space radiation. And they're also kind of waiting for launch prices to drop, for putting satellites or data centers in space to become cost-comparable to data centers on Earth by the mid-2030s. So, in a bit of related news, Elon Musk has recently merged SpaceX and xAI, his space company and his AI company, pointing to the same kind of thing, the idea of putting these data centers in space, as part of the rationale. So as people are organizing against data centers for AI being built near their communities, outer space is kind of being presented as this good alternative for everyone involved. There's a lot of reasons people are organizing, but concretely, the environment is getting harmed by AI. And as it continues to be built out through data centers and things like that, that's just going to be exacerbated more and more. And like, yes, that's true for a lot of technology and different things, and no, that doesn't make any of it okay, right? Just because other things do it too doesn't mean we should add fuel to the planetary fire.
So as AI continues to be adopted, it will really end up overriding any sustainability gains that have been made in the energy sector, because that increased demand will lead to the use of dirty, non-renewable energy to power these data centers, right? And the people who will be most impacted are the communities whose grids the centers are connected to. As has already been shown, the power companies will pass the cost of these data centers on to those communities. And it will also increase the risk of blackouts during periods where there's a lot of demand for that power, right? Alongside that, the climate impacts of dirty energy are well known. Like, it's not good. And that's not even really getting into things like the poisoning of the air, and of the water that isn't used up, as shown by xAI, who we just mentioned, in Memphis. But the thing about this AI is it doesn't just fuck up black communities in the US, because some people don't seem to care about that. It's not just us, right? It's also built off of the unequal labor of the black diaspora and the exploitation of other communities in other quote unquote global south countries, right? Behind the black box of big tech platforms like AI, there's an army of people, like workers in Kenya who have to moderate the content there. They're called data labelers, right? They have to kind of brute-force moderate the content, doing it all by hand, basically, manually trawling through it in a space that really is basically a digital sweatshop. And I don't say that lightly, right? Like all of the implied precarity, all of the implied unhealthy conditions that that framing gives us, it accurately describes the conditions these folks are doing this data labeling work in.
But thankfully, people are fighting back, using the tools at their disposal to assert themselves and make their stands, right? Like these data labelers aren't just laying down and allowing these things to happen. And I really do hope that continues, and that this resistance can really widen and take all this stuff with it. You know what I mean? One of the things that can make this fight really difficult, beyond the kind of anti-AI common sense that can come from being, you know, directly exploited, poisoned, or taxed, is really simple: AI has had a lot of years of useful PR. I avoid saying good, because that has a kind of moral value to it that people associate with positive, and that's not really what I'm trying to say. Useful here just points to that PR being consistent and unavoidable. Think of popular science fiction stories, whether it's Blade Runner, whether it's Murderbot, whether it's Cyberpunk 2077 or other cyberpunk genre stories. All of them, not all science fiction stories, obviously, but these stories I listed as examples, showcase a future where a kind of general artificial intelligence is a foregone conclusion. It's gonna happen, right? No matter what. And they depict it in different ways. Some will have it be unfettered, going wild; some will have it regulated; some will be like, oh, we had to put that to the side. Sometimes you'll have robots or androids that are indistinguishable from people, or you'll have some kind of hive mind in the cloud kind of being that's doing AI stuff, right?
And the issue with real life is that we've kind of opted for the saddest possible version of a world with AI, where you have people doing things like talking to ChatGPT in order to work through heartbreak or trauma or things of that nature, right? And that's just not good, that's not healthy, I don't think. Not blaming those people per se, but more like, you shouldn't be talking to a machine that is basically meant to feed your ego in order to deal with the complex issues in your life, is what I'm getting at. That, I think, is why AI has been a bit stickier than the kind of fool's gold rush of NFTs that happened in the early 2020s. And I think it's worthwhile mentioning the difference between generative AI and general AI. Generative AI is what we've been talking about; when Elon Musk says he's gonna put data centers in space, that's what's backing those things, right? General AI is what people usually imagine when they say AI, and that's a different thing. But I think that lack of clarity has been purposefully blurred, right? Like, that lack of clarity has been heightened, which really serves generative AI, which is based on large language models. So again, all the stuff we've been talking about, ChatGPT, Grok, these models are really synthesis machines, which just means that they remix massive repositories of information in order to spit something back out at you, rather than thinking in the way that we imagine thinking to happen. And so those kinds of things can be conflated with the general AI of sci-fi, right? And I think this sort of conflation makes it really hard to have a clear conversation, because there are these flashy showcases of what gen AI can do.
There's a recently viral video of Tom Cruise and Brad Pitt fighting; it's really ridiculous. But the basic idea is that, somehow, that is supposed to lend credence to the claim that this generative AI is actually intelligent. How it does that, I don't really know, but it works for people to buy into this mythology around what this technology will bring us in the future, right? And I find it really ironic that these tech oligarchs who are ushering in this AI era seem to miss the point of these sci-fi stories. So many of these stories are navigating the difficulties, to put it very euphemistically and broadly, of creating sentience out of technology, right? Like, it's never treated as, oh wow, this is great and good. Usually it's very much a cautionary tale. And similar to those cautionary tales, especially things in the cyberpunk genre, there's something to be said for the ways that technological development, quote unquote, is used to enrich people who were already rich at the expense of people who aren't, right? So, in thinking through this, tying it back to these viral AI videos, or if you see AI memes and all these ridiculous AI things, I tie it to Guy Debord's Society of the Spectacle. I won't go too theory-hard for y'all, but the useful idea I want to highlight there is that our society is consumer-oriented rather than being focused on making things to make sure we survive at a societal level. It's focused on creating things that feel and look nice for people, for them to be entertained, whatever. And those things are placating, right? Like, we can ignore the things we're actually going through materially because we're able to see something that's funny, or something that makes us feel good in that moment, right?
And that's kind of all AI really holds as a promise for us: that we can see memes that suck, and see videos of things that could have existed better if a person made them. And so I guess you can make it your own, kind of, but then it's just gonna copy something else; it's a mess, right? But yeah, those things are the best chance we have to get a return on investment, quote unquote, on this technology, if you're a random person, is what I'm kind of saying. If you're not one of these tech oligarchs, you're not somebody in the professional managerial class, you're not a middle manager or whatever, if you're just some person, you're not gonna get benefit from it, unless you find being plagiarized beneficial, unless you find being exploited beneficial, unless you find being made less capable, having skills not be cultivated, beneficial, right? Like, if those things are beneficial, then yeah, it's beneficial, but I don't think that's how anybody uses that word. So I just have a hard time understanding the utility of this technology in that way for people, right?
Scooter: Can you explain the incapable piece you're talking about? What do you mean by that?
Dip: Yeah, so I think the big thing is that there's a lot of conversation around stuff like the literacy crisis, anti-intellectualism, blah, blah, blah. And I don't personally like the way a lot of that conversation is framed. I don't like getting into the weeds on certain aspects of it, because I feel like people start sounding like, for lack of a better way to put it, eugenicists, right? It becomes an IQ thing, it becomes a, oh, you didn't go to this school, so you didn't get this, this, and this. And that starts to get real messy. So I'm not saying that. The point I'm trying to make is, if you're in school and you have an assignment to write an essay, and you say, all right, I'm gonna have my AI write this essay, then yes, your essay might be more consistent or whatever. It might be crafted in a way that's more uniform, or it might highlight sources or ideas that you maybe didn't have in your repertoire, right? But that's at the expense of learning how to do that specific task. That's kind of what I mean by this idea of being enfeebled. It's a shortcut that isn't good, right? We could talk about how it doesn't really produce the kinds of things we want in the way we want them, but it's also, especially when you're trying to learn or do these different things, doing that work for you in a way that is not conducive to understanding how to evaluate the work it produces, or how to do that work yourself. It's kind of just a way to game whatever system you're participating in, because usually those systems, school as an example: writing essays is probably one of the best assignments you can get in school.
And even that's very gameable. So imagine if you're taking a quiz on your computer or whatever, right? That was already an issue, even when it was just, oh, I'm gonna look up the answers on the internet. And it's really, I think, a question of where skills are held, who has access to those skills, and how that information is spread around, right? Because if AI is writing all of our essays, just to continue the analogy, then people don't know how to write essays, and they have to use the AI to put their thoughts together, basically. And that creates a situation where, rather than individuals having that ability and being able to bring a multitude of perspectives to a topic, it becomes robot versus robot. Those individuals maybe are shaping it in a small sense, but they're not necessarily having as much input as the entire cacophony of the language model the AI is working off of, right? And so it becomes a kind of homogenizing thing too, which I find concerning at the very least. So that's what I'm thinking about when I use that word, enfeebled. I'm not trying to talk about someone's innate capacity; I'm talking about things that are being done to people. Like, oh, they're taking things away from you, they're taking capacity away from you. And that, I think, is a dangerous thing about this particular quote unquote technology, this particular quote unquote tool: it's alienating people even further from the ability to do stuff, in a lot of cases.
Scooter: It's really giving WALL-E. I don't know if you've seen WALL-E, actually. It's multiple aspects, yeah. Obviously, the environmental one too. I understand, in my work and with some of my colleagues, there's a lot of energy around it, you know. I know coders love it. Some people like the AI search. Yeah. And I see a lot of shitty presentations that are clearly AI generated. What are we doing? But I understand people like these things. It's just, the benefits appear to be so minimal compared to the costs. And I don't think most people are understanding the costs, so part of what needs to be done is figuring out how we talk more about the costs. And I think I'm maybe wearing, I don't know what to call it other than the opposite of rose-tinted glasses, when it comes to thinking about AI. Like, we've read a lot about it for the past, whatever, year, two, three, and obviously I don't view it well, right? So: are there any benefits from generative AI that don't have to do with doing shitty work faster?
Dip: Yeah, I mean, maybe, right? Like, maybe we can start parsing that out potentially. I think it's very dangerous to be like, no, and yes, rah-rah, right? But I'm like, eh, I don't know. Because really, is it faster? People tell us, they tell themselves, that it's faster, but I don't know if that's necessarily true. I mentioned that Brad Pitt video earlier. Again, if you haven't seen it, we'll link it, but it's so ridiculous that I don't know if you want to see it, if I'm being honest. I'm sorry, it just makes me so mad. There was also a Mario one that was very similar, Mario and Luigi from the Nintendo games fighting. And it's so funny, because it was framed, at least the way I saw the videos, in this kind of, oh my god, AI's getting so realistic, it's getting wild, right? That happens like every week. And I usually disagree, because, again, maybe it's a taste thing, maybe it's an eye thing, as in what you're paying attention to. If I see AI animation, I'm like, why does it look so bad? While a lot of people are like, wow, you know. But anyway, I'll put that to the side, because whenever you talk about quality, people get real crunchy, because they're like, it's getting better. Which I guess is again true. Like, AI now is less noticeable, I suppose, than AI from three years ago, right? That's true. But even if it quote unquote looks good, which again, debatable, there isn't a good way to make changes to it using AI, right?
Like, if you're in, I don't know, Chatbot X3000 or whatever, and you say, make me a picture, and then you want it to change, what very likely happens is: it puts something out, you say, okay, do this, and when you try to change it, it's very likely it'll introduce extra stuff into the thing. And then you have to keep fiddling with it in order to get it to look how you want it to look. And, especially in this kind of context with creative things like art or animation, you don't get the raw files, right? You can't go, oh, I made an AI thing and then I put it in Photoshop and I have all the layers and can change the picture, or I put it into Blender, which is a 3D animation tool, and I have all the models so I can move things. It's not like that. It just puts out the end result, right? So you can't really even change it that well. And I think that ends up producing stuff where, like what you were saying earlier, Scooter, there's a way in which it's like, wow, look, I made this presentation with AI, oh, I wrote this paper with AI. That moment of elation, of the robot did a cool thing, kind of overrides the fact that what it's producing is actively not good. And when you get into the depths of producing things where there's a certain level of quality expected, that all falls by the wayside, right? And this really makes me think, taking it back to how communities are impacted, that these issues are able to obfuscate the material impact. And I think that's another issue I have with it: the fact that you can make a Mario thing that looks like the Mario movie just by typing in a couple of words.
And maybe this is a more philosophical argument or whatever, but I feel like that does something that's not beneficial to you as a person, because there's no real labor that you can see going into it, versus if you had to craft the model by hand, or you knew someone on your team did, and then you have to manually animate it. Maybe there are some aspects you automate or whatever, but that's a different thing, I think, right? Versus the whole thing just being able to be spun up with a couple of words. And I think it's really critical to again ground us in the material impacts of this technology. Because when we think about communities in the US that are dealing with these things, fighting against the data centers that are fueling these AI systems, and, going back to how we started the conversation, these AI-in-space initiatives, these data-centers-in-space things: it's framed, like we said earlier, as a way to resolve that contradiction, right? It's framed as a way to solve the harm that AI could bring. And if that works, which is a big if, I think the danger is that, because it's not an immediate issue anymore, and the AI infrastructure has been outsourced to a different place, those communities might not see it as an issue. They're kind of like, okay, it's not in our backyard anymore, it's not that big of a deal, right? And I think that is really critical to think about, because if you remove a data center from the terrestrial and put it into orbit, you're moving it out of a movement's community, and you're moving it out of every movement's community. Because as far as I know, we don't really have social movements in space yet.
And that probably sounds like a good idea to these people. Maybe it's a bit utopian, but like, I don't think that's the core issue I have with it.
Scooter: This all sounds great in theory, but it hinges on a massive if, you know: if these projects, Project Suncatcher, can actually work. Speaking to people, knowing people who kind of touch this space, it's pretty well understood that this is a moonshot type of project, very unlikely to actually come to fruition. Which makes me feel like it's kind of one of these strategies that these generative AI companies will use to sell what they're doing. They're gonna say, you know, we're gonna do these things, we're figuring it out with on-Earth data centers, but eventually it's not gonna be an issue, we're gonna go to space. And I mean, especially with Elon Musk, you can see all kinds of high-flying, highfalutin, I don't know if that's the term I would use, strategies that he will use to garner support, you know. You can talk about the Boring Company and everything he did there, right? This feels to me like that again, but it also feels a little different, because it seems like a lot of money is being put behind this. But as kind of a side note, in thinking about this, I just really hate how it feels like we have no opportunities for input into the governance of space, for anyone other than, you know, the super elite, right? Even if you consider the US a democracy, space is never on the ballot. There's no way to opine on what happens in space. And I don't know about you, but I don't really love the idea of all this space shit just flying around. Like, I know it's very low odds that anything ever comes down and hits someone, but why are we doing that? And we have no input on that, you know? I hate that.
And it reminds me, I recently read this book called Everything for Everyone: An Oral History of the New York Commune, and it had a really good piece on governing what happens in space: what this ecosystem of companies and worker cooperatives doing these space projects looked like, what they were doing it for, and how they worked with the different community assemblies. It was just really cool.
Dip: Yeah, no, I think that makes a lot of sense. And I have similar questions. However, the one thing I would amend, or the thing I would raise that maybe bumps up against that a little bit, is that we have to make sure these questions are framed within the purview of, or otherwise encompassed by, Indigenous-led stewardship. And I bring that up here because, in that idea of commenting, in that idea of doing assemblies, in this idea of democracy, right, which we could probably do a whole series on the word democracy: those pieces, maybe you call them nuances, but I think they're foundational, aren't necessarily put next to each other. Like, Indigenous stewardship and these ideas of collective control or administration or whatever. It's just kind of, oh, well, they'll be included too, obviously. And it's like, well, no, you have to start from Indigenous people, is the way I'm thinking about it. And to say more on that, because, like I said, people in these conversations often don't come from that angle unless they are Indigenous: for me, it's not necessarily about black folks or native folks ruling or running things in a way that looks like how it looks now, but with black and brown faces in those leadership positions. It's really about our societies doing what's necessary to put themselves in the best position to really allow our communities, you know, black communities, indigenous communities, indigenous communities that are black communities, black communities that are indigenous communities, right, to allow those beating hearts of movements to exist and to do the work that needs to be done, in a way that isn't impeded by interests that maybe see that as secondary or tertiary, right?
And I think the simple reason I feel that way, the simple reason I've come to that understanding, is that we are groups that sit at the bottom of many of these social systems, especially in a place like the US. I think it's true worldwide, but I haven't surveyed literally every place, so I won't make that claim too forcefully. And it's reasonable to push on this, because you're like, well, there aren't Indigenous people in space, so why are we talking about it here? But in order to get to space, we have to start from the land, right? So if we are to do stuff up there, we really have to grapple with how things are down here. And again, we talked about what's going on in Memphis. We talked about the ways these AI advancements are encroaching on communities. And this even ties into native communities: for example, how this administration in the US, and how corporations, are trying to establish ties with various native nations to have these data centers either go through their land or be on their land. I see it as kind of like a bribe, right? Like, we'll bring you into the fold a little bit so we can do this stuff that we're trying to do.
Scooter: Absolutely. No, I mean, I think that makes a lot of sense. And then, getting back to this data-centers-in-space concept: it sounds great in theory, maybe, but I'm picking up that you have a critical view of this as being a good deal for communities.
Dip: Yeah, I definitely do. I definitely do. Because, again, I'm the kind of person who, for better or for worse, tries to see what connections exist between different things. And what I mean by that is, I'm not just thinking about this as, oh, this is just about data centers, it's just about, you know, technology I don't want being in communities that don't want it, right? It's kind of a question to me about utopia, and about what we, we as in people in a general sense, we as in specific communities, whatever we you can imagine, what we imagine the future can look like. That says a lot about what we care about, who we are, that kind of stuff. And so the problem comes in, in this AI context and this data center context and this space economy context: okay, we have this idea of how we want things to be, we being, in this case, these capitalist oligarchs. And I say we instead of them because they're people too, for one. And there's a lot of people in our communities, people who aren't oligarchs, who align with them, who see themselves in those people. That's a whole other thing, so I won't go too far into it. But the point is: okay, I have this utopic idea, and now I have to try to bring it to life, right? I think another way to think about it is, rather than, oh, I'm birthing this or bringing it to life, it's more like I'm trying to bring life down to this thing. I have this idea and I'm trying to make life fit into this box, right? And I think that's really important to think about, because we've been, not dancing around it, but kind of hinting at this as we've been having this conversation: the political economy, right?
Like, the way that political and economic practices are structured and set up, as it relates to the space industry, is really stacked against anything resembling egalitarianism, the equitable redistribution of resources, community, sociality, ecological harmony, blah, blah, blah. All these things that we might label as things we'd want to strive for: the economy is not set up that way. And I think this is kind of a twofold thing. On the one hand, spaceflight is becoming more accessible, in a relative sense, because the costs of launching things are dropping. And that's the hardest part of getting into space, the process of launching things into it. This is something SpaceX has been leading the charge on, trying to figure out reusable rockets, right? And if that becomes a little more consistent, that would really lower the costs, which means industry would probably lean even more into space and expand the space industry. And with that comes a deeper expropriation: that industry will expropriate life and labor, from different people in your communities, so that it can grow. So that's on one hand. On the other side of this, just to make it very clear: these interventions aren't happening by benevolent groups; they're happening through capitalists who have an interest in accumulating more capital, because that's what they do. And those people, if you've paid any attention to human history ever, do not have our best interests in mind, right? And I think that's a really critical thing to think about.
SPEAKER_00For sure. Elon Musk in particular, you know, his stated goal for a lot of this, his work around SpaceX, is quote unquote ensuring the long-term survival of humanity and all life as we know it. Ugh. Which I think is just laughable on its face. I mean, it's clear that he does not give a shit about humanity. Um, and only uses this as cover for, you know, power, ego, greed. I don't know what his deal is, like why he does the things he does, but you can see, just look at the Boring Company, right? And we'll link more on these things, I don't think we're gonna go into these in detail, but DOGE. And then there's X, formerly Twitter, and just the rampant disinformation he lets proliferate, right? And of course Elon is just one person, but I think you could say very similar things about the other space billionaires, including Jeff Bezos, Richard Branson, and also, I mean, billionaires in general.
SPEAKER_02So yeah, I think that's really critical, because again, we're talking about space, right? And talking about what that would mean, trying to concretize this conversation, because it very much gets people's utopian thoughts going, and it's very hard to understand and think about what it actually means. And so, were these hurdles around not even space travel, but just moving stuff into and out of space, right? Were that to become a little bit easier, there would be people that are getting moved around, right? Like one of the resources that would get moved around are people, right? Because they'd have to work on these things, they'd have to, right, do these different activities to manage and administer these systems, right? And that really implies both exploitation, as is probably clear, right? Like, you know, people aren't gonna be getting rich off of this stuff, besides the people who are already rich, right? And resistance, right? So like people don't just take things sitting down. That resistance may not always be flashy or whatever, but that is something to keep in mind. And that really makes me wonder what organizing would look like in that context. Um, and I really think that the questions for that context will have more in common with the kind of questions that would be posed by oil rigs or freighters rather than terrestrial work. You know what I mean? And I think that that is mostly because of the way that, you know, if you work at, I don't know, whether it's a retail outlet or an office or something, right? Or like a construction site, I don't know, whatever. You have your workspace and you have your home space, right? 
And it's like boom, you know, or even if you work from home, maybe work remotely or whatever, it's like you can leave your home and it's kind of a different dynamic. But if you're out in space or you're in low orbit or you're somewhere kind of remote or whatever, in the context of these oil rigs and freighters and stuff, right? It's like you can't really leave. You can't leave the office, quote unquote. You can't leave the workplace, quote unquote. I think that creates a very different dynamic also as to what resistance or organizing looks like in that context. So that would be what's happening in orbit, in space, whatever, right? But for the terrestrial aspects of the space industry, of the space economy, think of manufacturing, think of command centers, right? I think there really needs to be thought put into how to try and prevent the worst excesses that this emerging industry can bring as it grows, which to me really sounds reminiscent of company towns, right? Or freedom towns, which are another kind of instantiation of that that has been floated around recently, right? Like these spaces where, again, we're pretty anti-state, anti-government here, and so I'm not trying to give them any points, right? But there is something to be said for, like, if your government was fully, directly ruled by a company that is very autocratic in a very straightforward way, that becomes a very different context than, I live in a city and I have to deal with my representative democratic body, which is not good, but is a different kind of thing than a company that owns everything, right? The money you use, the house you live in, the food you produce, all that, right? That's something that we really have to think about in this space context, I think.
SPEAKER_00Yeah, that sounds not ideal, to say the least. And so I think the question is, what can we do about these things, and how do we prevent the worst excesses of the supercharged company town this could become, right? And I think that's kind of the goal of this podcast, this media project: really honing in on solutions more so than just saying organize. Um, and so we're gonna hope to do that. But unfortunately, I do think there are a lot of difficulties and challenges with organizing in this space, whether you call it space, data, AI, whatever the nexus of these things is. I think there's a lot of challenges associated with that. Um, I think from the go, you think unions, right? And I think that could be a potential tool, but as in the case of Silicon Valley more broadly, the companies in this sector are rooted in a culture of individualism and very tight management control. And the employees are paid well, right? And so there's not much incentive to unionize or to threaten your livelihood in that way. And the companies take pretty aggressive measures, generally speaking. Just recently, earlier in February, the National Labor Relations Board, the NLRB, dismissed a long-running complaint against SpaceX, ruling that it does not have jurisdiction over the company because the company is more like the railroad and airline companies, which are covered by a separate act, the Railway Labor Act. Um, this seems, you know, maybe not that big of a deal, but it makes organizing significantly more difficult at SpaceX. And so unions, it's a strategy, but that's a big hurdle. You just don't see a lot of unions in these types of companies. 
The other strategy, I mean, we talked about this in a prior podcast, is, you know, how do we build the new within the shell of the old, and how do we build federations of cooperative businesses to, instead of trying to unionize and otherwise obstruct these companies, just build it anew? If this is something we want, right? Space and space exploration, space travel, space data, whatever. I don't think the federation of cooperative businesses solution, if you will, is really that possible here, in part because the capital requirements are just so high. These companies require a lot of capital, and that level of capital only comes from a few places. Well, I mean, it's possible, right? Like you could, you know, get enough capital to operate in this way in this space, but personally I would argue that the money you would need to do something of this nature could be used much better elsewhere, in other ways.
SPEAKER_02I definitely think that makes a lot of sense. And I really see how, yeah, focusing on trying to beat them at their own game, in a sense, is a losing proposition in this terrain especially, because again, while the prices are going down, they aren't low, right? Like it's a relative thing, you know what I'm saying? And I think it's worth highlighting the way in which the NLRB, which has been a pretty counter-revolutionary force since it came about, has kind of shifted labor organizing into being more contract-based, more negotiation-based, rather than unions orienting around the balance of forces, right? Like, how much can we exert energy upon our employers in order to really get what we need ourselves, rather than, again, coming to the table, getting a seat at the table, and writing out contracts that usually are really nice for the union leaders and the top brass of the, you know, workers or whatever, right? The labor aristocracy, rather than rank and file people from below leading that charge. Not to say it doesn't happen, but it's become very difficult for that type of work. And I think it's also important to think about how this particular ruling said, oh, they're more like railways than these other kinds of companies. Because the reason that things like railways are treated differently, or, you can think about how some government employees have to continue working when the government shuts down, is because, oh, they're providing quote unquote critical services for the functioning of community, society, the public, whatever. 
And so it's like, well, if you do a work stoppage for one of those critical infrastructural things, then that's a different level of risk incurred than something that isn't framed as critical to the economy, for example. Which I think is important to try to figure out how to do, but again, you have to do a different kind of risk assessment for that, because it sits in a different place within the structure of the system.
SPEAKER_00No, absolutely. Thank you for adding that. And to these two strategies that we bring up, the unions and this federation of cooperatives: I don't want to dismiss the potential of these strategies, but it just feels to me in this moment that the most effective organizing tactics are probably going to be around either, one, place-based activism against actual data centers on the ground, and then, two, how do we build this broader movement and general anti-AI sentiment in the general population, right? Like, how do you make it not cool to use AI? And like, well, I was gonna say I don't want to shit on people, but also we kind of do need to shit on people, right?
SPEAKER_02I'm down to shit on people, but yeah, go ahead.
SPEAKER_00And so, yeah, I think it's one of those things I feel like it's fine to shit on people for. Um, and I think if that is done well, right, that puts up a major hurdle. I mean, the way these companies work, right, they have to bring in dollars, right, and money, right? And so if people aren't paying for AI, if they're actively opposing the companies they use when those companies integrate AI, like when Bank of America integrates AI, and you say, well, hopefully you're not using Bank of America. Credit unions are great. But say you use some company and they integrate AI, and a lot of people just throw a fit, right? And maybe throwing a fit isn't the right word, but like protesting that piece of things. If we can build the anti-AI movement, if you will, then I think that is a meaningful hurdle for these companies in progressing.
SPEAKER_02For sure. No, I definitely agree with that, and I think it is really critical to lean into or highlight the fact that this is a messy context to organize in. Construction, or the build part of build and fight, it's like, what does that mean, right? Especially if we think of build in the more literal sense, where it's like we are building infrastructure, right? It doesn't feel like it fits as well. But we also don't want to, like, leave folks in the cold necessarily. And what I mean by that is be like, well, y'all's sector's too hard to organize, so good luck. You know, I don't think we should do that. Um, especially because, again, if we start to actually get up into space in any meaningful capacity, we're gonna see some, you know, I'm saying, some exploitation. And those people who are hyper-exploited, right? Like space miners or whatever, I don't know, right? Like whatever that ends up looking like. People who maybe make decent money but are doing really dangerous jobs. If I can refer to seafaring again, refer to oil rigs again, right? It's like they make decent money, I think usually middle class money from what I've seen, but, you know, it's not like hundreds and hundreds of thousands of dollars or whatever, right? And usually that work is very taxing, right? So I think it's critical to not leave those folks behind. And so I think that the general orientation I envision for what is to be done in that space is orienting around agitation and education from communities themselves. And I say that in a broad way because it is broad. So like having those communities agitate, educate, and then trying to spill that over into the workplace, right? 
So like having workers be exposed to the harms of the industry they're in while people on the outside are actively working against those industries, those companies, those organizations, and having those people on the outside ideally be supported by people who are sympathetic on the inside, who understand that their role is more one of sabotage, basically, and accepting that, which is a tall order, and the likelihood is high that this will become a kind of antagonistic dynamic. And I think, again, I don't know if that should be shied away from necessarily, but I do think that is a non-optimal situation to be in. But I really think that should be thought about: how can communities put pressure on workers? Because especially if those workers are, again, as Scooter kind of mentioned, maybe they have a little bit of class privilege, maybe they have a little bit of comfortability, maybe they have some ideological things that lead them to be like, I deserve to be here, I'm doing good stuff, blah blah blah, we're advancing humanity or whatever, right? It's like, how can people be like, uh-uh, no, you're not? That ain't cool, you know what I'm saying? Like, really lean into that, and then how can that be leveraged into workers being like, yo, people do not like this? And that might be kind of an avenue for change and that kind of thing, right? And so, I think another thing for us to keep in mind, and I feel kind of weird highlighting this because it's like, well, duh, when I say it, but it is something that I feel gets lost in the shuffle sometimes: we live in a world that is physical, that is tangible. Like, I know that atoms never touch, but that's not super important for this conversation, right? Like, we are able to grab the world in a certain sense. 
Um, it's not as easy as that, but I think it's easier than people sometimes let on. We have agency, is what I'm trying to say, basically. And so, like I said, this world is tangible and physical. So I think there's room for figuring out the choke points of these systems in the tangible sense, right? In the logistics of people, of things, right? Materials being moved around, where people are working physically, right? Where certain places are that are sending things out into space physically, right? And figuring out how to attack those things, right? And I don't necessarily mean attack in a literal sense, I mean it in a kind of metaphorical way, right? Like thinking about how we can really relate to those in a way that is, again, moving toward both lessening the harm and preventing the harm, right? Like having both of those happening at the same time. And, you know, again, the space industry isn't quite as developed to where this is a massive, massive concern quite yet. But to really use the seafaring example from earlier to help map out and imagine what I'm getting at, I think of things like the blockading strategies that we saw in earlier phases of the solidarity movement for Palestine in the West, right? So, you know, a ship that was alleged to be transporting weapons for Israel would stop at a port, and then people would blockade the port, people would have boats out in the water, you know what I'm saying, to stop that boat in its tracks, right? And then there were even times when those workers would strike or threaten to strike, right? That kind of stuff. Um, so there is a possibility for people to come around, even in those positions where they're maybe incentivized to do otherwise, you know? And so I think that could be something that happens, right? 
Like analyzing the ways that money, goods, all the things needed to run this industry are actually moving, right? Like, where are they going? Where are they coming from? How do concrete individuals or organizations or groups relate to that flow of money and resources? And then think about what the strategy is from both ends to impede all of that, right? And, I kind of said this before, but just to really hit the point home: how do we start working on getting people to impede themselves in that sense, right? Like, if I'm a worker in this space, what can I do to help? And then trying to escalate or ratchet up that support or that movement around this work from sectors that are adjacent to these things, right? So that more force can be concentrated. So I'm thinking about things like, what is it called, the aeronautics sector, aerospace; like you think about the connections to weapons and the military and all these other pieces of this kind of economy that's relatively integrated. Thinking about what struggles are being waged or fought in those other spaces, and how it can be thought of as something where, if people come together, they'll be stronger. So that means that's a good reason and opportunity to collaborate, right?
SPEAKER_00Absolutely. No, and I love your focus on the expansive: there are so many people, and who knows who's actually gonna listen to this podcast at any point, but there are so many different ways to fight these things, and it kind of depends on someone's context. Um, and so I don't mean to say don't unionize. I mean, if you're in one of these things already, yeah, for sure, give it a go if you can, you know. But I just think it's probably something that's hard to do well, with the management strategies at these companies. But yeah, that's it. I think that's how we should strive to be: open, expansive. There are so many different ways, and we shouldn't be restricting ourselves unnecessarily.
SPEAKER_02Yeah, I think a critical piece of that is the way that, like, you know, this can sound kind of nebulous, or like, oh, I want something concrete, what do I do? Right. And it's like, well, I think it's more useful personally, even if it's more frustrating for a listener, potentially, for sure. Um, it depends on the personality, I think. But I've gotten a lot of feedback in that vein of, just tell me what to do. It's like, I would, but we'd have to really sit down and you'd have to explain your context and blah blah blah blah blah. It's a lot more. And what I'd rather do, even if it takes longer, even if it's more frustrating, I think it's more useful to be like, here's how we start thinking about things. And my favorite analogy for this is thinking about it in an ecological sense, as in, these things happen in an ecology. And so rather than, for example, trying to find the objectively or absolutely most revolutionary thing you could possibly do in any given situation, because it's like, what does that mean? Right? You know, it's not math or physics, you can't really get that granular, and even in those spaces it's kind of questionable, you know, people like to portray things as a little bit easier to understand than they actually are. The point I'm trying to make, though, is really leaning into relationality, or the fact that things exist in context with other things. And again, I get that this is kind of abstract, potentially, but all to say, there's not a lot of things that you can do that inherently will or won't be useful, impactful, revolutionary, radical, progressive, whatever, right? Like, the stuff you do is the stuff you do. 
I think where the rubber really hits the road, um, and ideally it'll hit the road less because we're trying to get rid of cars and stuff, but where the rubber really hits the road is in that space where you're like, I'm doing this stuff, I'm really trying to check in to make sure it's not harmful or whatever. I've really been intentional about how I relate to the world within my capacity as a person. How can I build connections with other people who are doing those same things, wherever they may be, right? With, again, having boundaries, like, I'm not necessarily gonna run to bigots to convince them, like, oh, actually, here's a way that your bigotry can be useful. That's not what I'm saying. What I'm saying is, okay, I work in tenant organizing, cool. How can I connect with people who are doing solidarity unionism or, you know, business unionism or whatever, acknowledging all the contradictions in each of our spaces, where some spaces may be more contradictory than others or whatever, and saying, how can we start linking those together in ways that make sense and in ways that acknowledge that our problems are connected, right? Because the problems know that they're connected. That's the thing I'm trying to get at. The problems know they're connected. But I think it's more difficult for people working in their specific issues or specific areas to see that. And again, the reason I think of it that way is because those connections mean that cutting one thing off isn't enough, right? Like even if you can theoretically get rid of one industry, say we stop space from happening, that doesn't mean that exploitation also leaves with it, right? And if we don't orient around not only the specific thing we're facing, but the more general things animating it, then we're not gonna really be able to stop our thing. 
And even if we somehow manage to do it, it'll be incomplete and people will be harmed by something else, right? So I think that's kind of how I think about it when it comes to this sort of stuff.
SPEAKER_00Yeah, no, and I like the point that you're never gonna get a recommendation, I mean, maybe that's not fully true, but no one's ever gonna just tell you what to do, you know. Everything is so context dependent, and the most effective thing you can probably do is take the time to understand. And I'm to some degree just repeating what you're saying, but: how do you relate to all these different things, and how do you develop the skill set to be able to identify, understand, and then make moves on those things? And I say that because that's why we're doing this podcast. Just to help people think more deeply and critically and explore these topics, because there's not a lot of media here to do such things. Um, and also note that none of this was planned. So Dip and I have known each other for a little while now, and one of the things, I'm like, we should do a podcast, because you can just riff without anything. So that was beautiful. But anyways, getting back to the topic at hand, we talk about how to contest the sector, what does that look like? That said, I'm kind of curious to see how much of this supposed AI bubble will actually cause the industry to self-implode. I mean, OpenAI recently suggested, you know, federal guarantees for AI infrastructure, and then proceeded to quickly walk that back. But I kind of see that as evidence of fundamental economic questions facing OpenAI and the other behemoths in the space. You can see this: OpenAI in particular has made $1.4 trillion in commitments over the next eight years. And, you know, thinking from a capitalist startup economic perspective, to make it a going concern, they have to make multiples of that $1.4 trillion over that same time period. And the company is just nowhere near that, right? 
And in 2025, the company generated $20 billion in revenue. And so, I mean, I think this is what they're thinking a lot about right now: where does this other revenue come from? Is it ads in ChatGPT? They've said that, and I think they got a lot of backlash. I didn't actually research if that's staying or what's happening. How else are they, you know, getting revenue, right? They're gonna ramp up the pricing for people to use their LLMs. Do people pull back their subscriptions? Do enterprise units stop paying, right? So I just personally don't see the path to this revenue filling out. And so I'm curious to see, when this bubble pops, what happens, right? And what is the effect on this AI money machine? I'm gonna link in the show notes a pretty good graphic that shows the relationship between the different company actors in this space: Nvidia, OpenAI, AMD, these other LLMs. Um, and then beyond that, what are the implications for space of this whole conversation? Because I think a lot of this space exploration topic kind of relies on the money coming in from some of this data center stuff. That being said, I don't think it's wise to rely on this bubble to pop and the industry to self-implode. I'm crossing my fingers, obviously, but I think if it happens, it's gonna take years to really slow down in any meaningful way, and we're gonna see a lot of damage to communities and to the environment in the process. And this is assuming that the federal government doesn't step in to backstop. Um, as we previously mentioned, OpenAI suggested it, took it back. But the major proponents of these companies are actively currying goodwill with the administration. You know, you see all kinds of meetings with these executives and Trump. 
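As an editor's back-of-envelope sketch of the revenue math above, using the figures cited in the conversation (roughly $20 billion of revenue in 2025 and $1.4 trillion in commitments over eight years; the growth rates are illustrative assumptions, not OpenAI projections):

```python
def cumulative_revenue(start_bn: float, annual_growth: float, years: int) -> float:
    """Total revenue over `years`, starting at start_bn (in billions of dollars)
    and compounding by `annual_growth` each year."""
    total, rev = 0.0, float(start_bn)
    for _ in range(years):
        total += rev
        rev *= 1 + annual_growth
    return total

# Figures from the conversation: ~$20B revenue in 2025, $1.4T in commitments over 8 years.
print(cumulative_revenue(20, 0.50, 8))  # ~985: even 50%/yr growth totals under $1T
print(cumulative_revenue(20, 0.60, 8))  # ~1398: ~60%/yr growth just matches the $1.4T commitments
```

On these assumptions, revenue would have to compound at roughly 60% a year for eight full years just to equal the commitments, before any other costs, let alone reach the multiples of that figure the speakers note a going concern would need.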
And also, I think, we haven't touched on this much, but a lot of the other negative things, you know, Palantir, the surveillance capitalism network. Like, yeah, I get a sense that even if the AI bubble pops, that stuff's not popping, you know. That's gonna stay around.
SPEAKER_02And so, yeah, we spent a lot of time talking about organizing within this context, right? You might have picked up on an ambivalence, or maybe even a critical ambivalence, towards the abolish direction rather than the bolster direction of the space industry, of this AI industry. I think AI especially is like, get that shit out of here. But as far as space, it's like, okay, organize, you know? Like, boom, a lot of organizing, especially if you come from a communist or anarchist background or perspective, it's like, okay, worker self-management is the intention, right? Like you're working towards owning your workplace, basically. But in this context, I'm not really advocating for a, quote unquote, red or communist space program, nor am I just hoping that, you know, the workers on, whatever, I don't know, the moon who have to do stupid moon stuff, I'm not hoping for them to be treated better. I'm not hoping for benign space capitalism. My intention really is to hold two truths at the same time while acknowledging the tensions that that brings, right? So I think there's a pretty high likelihood that if the space industry is able to figure itself out, that industry, especially low orbit, will become a critical site for value extraction, even if it doesn't end up being through data centers, though that is a very clear vector at this moment, right? Um, it'll become a critical site of value extraction in the near future, relatively speaking. And I think that we should also, while having that in mind and while preparing for that and while organizing against it and around it, think about it as being space colonization, which is colonization, which isn't a neutral term or, you know, a kind of thing that just happens, right? It should be treated, in my mind, with a similar quality, if not quantity, right? 
Like a similar kind of disdain, skepticism, criticism, and attack that is given to things like nuclear technology, right, and non-space forms of colonization, when we use the word colonization in a social sense, right? Like thinking about space as analogous to these other kinds of technologies that some people think are really cool and good and useful and important, but that a lot of us, maybe not a lot, I don't know, I'd hope a lot, but a lot of people I talk to, I'll put it like that, think are harmful and dangerous, right, that kind of thing. And to just draw out those comparisons a little bit more: the reason I see a similarity between nuclear energy and space, for example, because, again, in this data center conversation, nuclear energy might be one of those sources of energy that's used for data centers, right? I think the similarities are the scale of investment needed, right? Like it takes a lot of money to get that stuff off the ground, the engineering challenges, the feasibility of it in comparison to how catastrophically wrong it can go. We have millions of examples, not literally millions, but you get what I'm saying. Like, there's so many examples of space things going bad, nuclear things going bad, and that risk is really serious in a way that, again, if we think about it in a cost-benefit analysis sense, it's like, I don't know how worth it it is, right? And with colonization, I feel like colonization as a history, even if we just focus on the last 500 and some change years, right? I say just kind of, you know, in a blase way, because it's not just that, but you get what I'm saying. Like, even if we just focus on colonization in that kind of decolonial theory sense, I think that it speaks for itself. 
But just to say a little bit on that: I'm not trying to say that the dispossession of indigenous and foreign lands and people is, quote unquote, the same as mining rocks, right? Like, I'm not trying to draw a direct equivalency between those two things. But because of the logics and the current world we exist in and how that world functions, I think it's more similar than people would like to admit, right? And therefore, we should be really critical of, and problematize, the desires and the motivations that are going into this interest in extracting resources from space, asking questions like, wait a minute, why do we need those minerals up there? Like, what are we doing with that stuff, you know? Or what is the human cost of doing that? What is the biological or biotic cost of doing that, of this constant need to grow and expand and grow and expand and grow and expand? What do we lose in that? What do we gain in that, and who gains from that? Right? Uh, I think that's a really critical thing to think about. And, you know, I admit, right, these are pretty big questions, dealing with colonization, right? Dealing with space travel, dealing with AI, right? These are things that people have put a lot of energy, time, and thought into. And it's not a simple, like, boom, just don't do it; we can't just flip a switch, and it's not easy to respond to. But at the same time, I don't think that's an excuse to let those things develop in this negative direction without any resistance, right? Unimpeded, right? 
And I have difficulty talking about this sometimes, because whenever you talk critically about new technologies or advancements in technology, it often reads a certain way to certain people, especially people who work on the technology, especially people who have an affinity for those technologies, right, or who like newness for the sake of newness in general. It's easy for them to be like, oh, you don't like that new thing because it's new, you don't like this change because it's changing, right? And the negative externalities, the negative aspects, the downsides, are treated as, well, that's the cost of progress, right? Like if harm does happen, it's treated as exceptional, not as something built into the thing, that kind of stuff, right? But I really want us to keep in mind that nothing really comes from a vacuum. The impacts of a technology will have to be in conversation with the world, right? So whether it's impacts from other technologies, or all of the social structures that those technologies came into being within or under, right? So again, we kind of touched on this earlier, but we're not really pointing out quote unquote new issues that are appearing as a result of the potential of AI or the potential of space colonization and the space industry. Even if there are aspects that are unique, we're not saying, oh my god, this is unprecedented, right? Which, I will admit, can be a demobilizing thing, because if people are used to their lives as they are, if something's already been happening, it's like, okay, whatever, you know. And I think people are really good at acclimating to some types of misery sometimes. So it's easy to be like, well, this sucks, but things suck already. So what's a little more sucking, you know?
And I think that even when people do question the exploitation that goes into something like smartphones, which are much more integrated into our daily lives, right, in a way that these other two technologies we're talking about haven't reached, that acknowledgement kind of just exists, right? It's like, ah, those kids in Congo, oh, that's so sad, right? Rather than it being a call or an inspiration to organize around that, to organize around the end of exploitation in a general sense, right? Like even in other places, because again, it's all connected. Or to just divest from those technologies, or from the over-reliance on those technologies, in collective ways, so that it actually has a meaningful impact, rather than just being a personal consumer choice, which is not nothing, but isn't necessarily going to shift the tide of the system, right? And so one way I really want to frame this tying together of AI data centers and space colonization is as an example of shifting the burden, right? In other words, it's an appearance that the issue is being resolved. Oh, you don't like AI data centers in your community? Okay, we'll take them out. We'll take them out of everybody's community. Oh my gosh, great, right? But the content of the resolution, the actual things that would happen as a result of that move, pretending or imagining that it's possible, doesn't really address why that kind of thing is an issue in the first place, right? Like, how did we get to where they can say, you know what? Yeah, let's put these in this place, right? Let's do this here, right? That doesn't really resolve that. And I think that if there's just a focus on tweaking the system of exploitation, where it ends up just being a different group of people experiencing the harms of this data technology, that's not the same thing as the harm not happening, right?
Like if someone's getting harmed, then someone's getting harmed, even if it's not you or your people, you know? And so I really can't help but question Google and SpaceX storming the heavens, right? Especially for the sake of AI. Are you kidding me? Like, are you serious? And so I think that responding to this project, responding to these movements, and by movements I mean movements of capital, right, and the twin encroachments of AI and space colonization, it's really critical to have some ideas around those things and how to respond to them. Again, we're in a stage of transition, but lives are already being put at risk, as we see, right? And we should care now. We shouldn't wait until it's like, oh my gosh, so many people were impacted by this, before we start responding, right? I think that the demands these technologies make mean that the returns will also have to be very high, as Scooter was saying. And so the potential for harm is so big. It's hard to imagine, but the potential is like, man, we're putting our historical settler and capitalist forebears to shame kind of energy, right? Like the heights that could be reached with this kind of stuff, I can't even begin to imagine it, right? And I think all of that means that we can't really tinker around the edges with this, right? Like the embeddedness of the systems animating AI, animating space colonization, really calls for digging in the dirt, being radical, and pulling those roots out, right?
And again, we kind of talked about some of the ways this could look, and we talked about how it's not about one specific thing you do, not a five-steps-to-revolution kind of thing, but that it could take many forms: seeing some of the flashes of resistance from people at ground zero doing data labeling, doing these specific quote unquote unskilled, low-paid tasks, right? And we've seen communities railing against these data centers: they don't want them in the community, they're polluting, and so on, right? Like hating on these things. We also touched on the difficulties of doing that, right? The difficulties of expanding that struggle and of having that struggle in general. But the mission stays the same, right? We as people who care about each other, as people who care about the world, as people who care about issues both in the specific sense, like I see this specific person going through this experience, and in the sense of the things animating them, like, oh, they're going through that experience because of racism and so on. If we care about both of those things, I think we should orient around bolstering organizing efforts that are already happening, and creating new efforts ourselves, but really bolstering them by linking them to other struggles, not just haphazardly, but because they're animated by the same root causes, right? So again, racism, ecocide, capitalism, patriarchy, neocolonialism, right? Colonialism in the classical sense too, right? And I think that's critical, because addressing these issues in a general way that foregrounds autonomy, solidarity, and direct stewardship, or quote unquote democratic, if you want to call it that, direct access to land, to work or labor, and to the ability to live well, like that's critical, right?
And again, when I talk about general versus specific, I think we should respond to things as they concretely present themselves to us, and we should orient around seeing them as connected, always doing the work to say, hey, this is a manifestation of a wider system. And so focusing on who's elected or who's in charge or that kind of thing doesn't necessarily mean we'll even be able to resolve that particular issue. And it definitely doesn't mean we'll be able to stop that kind of person from appearing and causing the harm that they do, right? And so if I had to encapsulate it, right, to make a long story short, I really have come around, honestly. I think the Luddites were cooking, low-key. Like, I think they were onto something. And again, if you don't know what the Luddites did, it's definitely worth looking into, because the way people use the word colloquially isn't necessarily a perfect encapsulation of what they were actually trying to do. I think a more accurate way to think about it is that technology that harms people's well-being should be railed against, especially if it further entrenches elite power.
Poetic.