
Ctrl-Alt-Speech
Ctrl-Alt-Speech is a weekly news podcast co-created by Techdirt’s Mike Masnick and Everything in Moderation’s Ben Whitelaw. Each episode looks at the latest news in online speech, covering issues regarding trust & safety, content moderation, regulation, court rulings, new services & technology, and more.
The podcast regularly features expert guests with experience in the trust & safety/online speech worlds, discussing the ins and outs of the news that week and what it may mean for the industry. Each episode takes a deep dive into one or two key stories, and includes a quicker roundup of other important news. It's a must-listen for trust & safety professionals, and anyone interested in issues surrounding online speech.
If your company or organization is interested in sponsoring Ctrl-Alt-Speech and joining us for a sponsored interview, visit ctrlaltspeech.com for more information.
Ctrl-Alt-Speech is produced with financial support from the Future of Online Trust & Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive Trust and Safety ecosystem and field.
Ctrl-Alt-Speech
From 'Free Speech' To 'Flag This'
In this week's round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:
- How platforms are responding to the Charlie Kirk shooting (The Verge)
- Bluesky Issues Warning to Any Users Celebrating Charlie Kirk Assassination (Newsweek)
- Right-Wing Activists Are Targeting People for Allegedly Celebrating Charlie Kirk’s Death (Wired)
- Charlie Kirk Was Shot and Killed in a Post-Content-Moderation World (Wired)
- Has Britain Gone Too Far With Its Digital Controls? (New York Times)
- The Censorship Alarm Is Ringing in the Wrong Direction (Public Knowledge)
- We now know who the new owners of TikTok will be - if Trump gets his deal done with Xi (CNN)
- Nepal’s Social Media Ban Backfires as Politics Moves to a Chat Room (New York Times)
Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.
Ben Whitelaw: So on an episode a few weeks ago, Mike, you mentioned that you'd gone and had a health check, and it is my duty as your co-host — you know, I take a pastoral role as well as debating the merits of trust and safety and content moderation each week — and I just wanted to make sure that you're okay.
Mike Masnick: Oh my gosh.
Ben Whitelaw: Partly that's in response to the stories we'll talk about today as well. But I've taken inspiration from the health wearable device Whoop, right?
Mike Masnick: Oh yeah.
Ben Whitelaw: It's quite popular among celebrities. It does a lot of sleep tracking, checks your bodily activity, makes sure that you're doing okay. And on their marketing page, they ask you to quantify how your body is feeling. So in the spirit of making sure you're okay, can you do that for us?
Mike Masnick: Yes. On a scale from one to 10, I'm an eight.
Ben Whitelaw: That's pretty good.
Mike Masnick: I have no idea.
Ben Whitelaw: It's more than I thought.
Mike Masnick: Yeah, how do you quantify these things? No, I'm doing all right. You make it sound like this is a HIPAA violation, Ben. I know that's an American thing, but I feel like my privacy has been invaded here. No, this has been a really stressful week. I think a lot of people are pretty stressed out from a bunch of things, some of which we'll talk about today. But can you quantify how your body is feeling, Ben?
Ben Whitelaw: Well, I'm quantifying how I feel on what I call the Donald Trump UK visit health spectrum. So...
Mike Masnick: We got rid of him for a few days and you have him.
Ben Whitelaw: He's in our fair country. I've been monitoring his general aesthetic, and based upon that, I would say that, compared to Donald, I'm doing very much okay. I think my hands are in a much better state than his hands, I'll say that. On that note, I think we should get going on today's podcast. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's September the 18th, 2025, and this week we're talking about the platforms' reaction to Charlie Kirk's death, why The New York Times hates Britain, and running a national government from a Discord server. I'm Ben Whitelaw, the founder and editor of Everything in Moderation, and I'm with a very fit and very well Mike Masnick. Did you do your morning exercise? That is what keeps you...
Mike Masnick: Every morning I go and exercise, so it's what keeps me in shape. And the reason we used Whoop as the opening today was because I had been looking at it, 'cause I heard an interview with the founder. I have other fitness trackers, but not the Whoop just yet.
Ben Whitelaw: Yeah, I think you're actually probably in much better fettle than me. I can't remember the last time I lifted anything heavier than a bottle of breast milk.
Mike Masnick: Well, I'm sure you lift your child every so often, and that is good exercise. And I will note, because I am somewhat older than you are, Ben, that it does matter. At your age, you can get away with less exercise; at my age, I would be in serious trouble.
Ben Whitelaw: Well, I think it's working for you, I'll say that, as long as you turn up for the recording slot every week.
Mike Masnick: There we go.
Ben Whitelaw: Talking of turning up, we are actually not gonna be recording an episode of Ctrl-Alt-Speech next week. We are having a week off. Both Mike and I are at events related to trust and safety and internet regulation.
Mike Masnick: Cannot avoid it.
Ben Whitelaw: We cannot avoid it. I mean, it's a good excuse, right? We're not at somebody's birthday; we're actually working. But it means we're not gonna be recording an episode. I'm hosting an event in London for Everything in Moderation's 300th edition of the newsletter.
Mike Masnick: Hey, congratulations. That's a...
Ben Whitelaw:you. It's pretty good. It's pretty good. So started in 2018, done pretty much, 40 additions a year since then. So not had a huge amount of weeks off. And doing that in conjunction with a couple other newsletter writers who talk about write about tech policy, under the brand of Marked as Urgent, which is I think is a, I think a clever newsletter inbox name, but prob probably is lost on a lot of people.
Mike Masnick: I like it. I like it.
Ben Whitelaw: And then you are going to be at the Stanford research conference, right? For trust and safety?
Mike Masnick: Yeah, the Stanford Trust and Safety Research Conference, which is in, I think, its fourth year now, and it's always a fantastic event. As I've said before, it sort of bookends the summer: TrustCon earlier, and then the research conference at the end. And it's a wonderful time. And as I've hinted, we have an announcement ourselves, and I'll be even a little more explicit: you can sort of find it out if you look at the agenda. But we, who have done a bunch of game-related stuff, will have something new to announce in that realm at the Trust and Safety Research Conference. So stay tuned for that. If you're gonna be there, you can come to my session, and if not, then just pay attention, 'cause we're gonna announce something cool.
Ben Whitelaw: Nice. And the session isn't gonna be you playing 1 Billion Users by yourself?
Mike Masnick: No, no.
Ben Whitelaw: This is a new game, a new digital game on top of the card game that we talked about last week?
Mike Masnick: Yes, yes. So there'll be something exciting for people to learn about there. But feel free to bring copies of 1 Billion Users, and maybe we'll have a game going at lunch.
Ben Whitelaw: Nice. Games galore everywhere you look. Great. So we'll dive into this week's stories. There's a whole load to get into. We are gonna start with a story that we actually didn't mention last week, but which took place just before we recorded, and which many of our listeners will have tracked, as it very much dominated the week's news. And that is Charlie Kirk's fatal shooting during a speech at Utah Valley University. This has lots of parts to it, lots of strands to it. We are obviously not gonna talk about the gun-related element of it, and we're not really gonna talk about him very much; we're gonna presume that a lot of people understand who he is and what he does. But there are a couple of really important parts of this story that we think are important to recap and to discuss the implications of, and also to reflect on the media coverage of it, which we often do on Ctrl-Alt-Speech. So there's really a platform reaction piece, and there is also a political element to it, in the way that certain groups on the US political spectrum have reacted, which I think are important. I'm gonna try and unpack the first one, Mike, and you're gonna talk a bit about the second. The thing about this, like any major news story, particularly a shooting, given the nature and proliferation of smartphones nowadays, is that horrendous videos of the death of Charlie Kirk — his shooting up on a stage at the university — were widely available on social media almost instantly. And a lot of the coverage this week has talked about the proliferation and spread of those videos from very early on. I obviously went straight to social media as I heard about this, and I was confronted by one of the videos of his shooting as well, albeit from quite far back in the crowd. But there are lots of videos from people in the second and third rows who shared the video instantly with hashtags on video sites, and those were served, by virtue of their engagement and their virality, to other users, really without, I guess, those people being aware that that had happened. So instantly we saw the spread of these videos. And when something like that happens, something as graphic as the shooting of somebody, there is a natural reaction for people to ask: what were social media platforms doing to prevent the spread of the coverage of the shooting, and the video itself? The Verge have done a very good roundup of how the platforms reacted, and other outlets have commented on that and tried to characterize it. So if you have missed it, if you maybe have taken a break from social media this week because of the nature of this story, I think it's just worth recapping a few of the platforms and how they've reacted to it. So Meta, with Instagram, Facebook, Threads, and WhatsApp, have done a number of different things across their platforms. They told The Verge they marked a number of pieces of content as sensitive, so users had to click on them to be able to see them. They removed the most graphic material, and they have also prevented the glorification of Charlie Kirk's death; we can talk a bit about how that's possible, Mike, and how there are difficulties in doing so. YouTube said that they were elevating respected news coverage from outlets on its homepage.
They put a restriction on the video for people who were under 18, and they also put some restrictions in place for people who weren't logged in, so you basically had to be logged in, a regular user of YouTube, to be able to see that video. Reddit similarly had site-wide rules preventing the glorification of his death, and that's a common theme we saw amongst some of the platforms: again, removing some of the most graphic footage. TikTok gave a statement saying that they were enforcing community guidelines and safeguards to prevent people from unexpectedly viewing the video, again removing the most graphic elements, which we presume are the close-up videos that are obviously most affecting for people. They also said that they removed videos related to the killing from the For You page, although it's obviously very difficult to know how successful that was. So in many cases this is a blueprint that platforms have rolled out for live news events that are unexpected and which generate a lot of attention. Is there anything about the way the platforms reacted that surprised you, or that differed at all from what you expected?
Mike Masnick: Not really, honestly. I mean, everybody's got some experience with some level of this kind of thing. This famously goes back to the Christchurch attack many years ago, where there was suddenly this realization that violent crime was being put online for a reason, and that people were sort of promoting it. And so there's been plenty of discussion in the years since about how various platforms should handle these kinds of events where you are seeing violence. In the Christchurch example, it was the shooter themselves who streamed a violent attack. So there are things that people are doing, but it's very, very difficult. And I actually thought one of the best, most thoughtful pieces I saw on this came from Alice Hunsberger, no stranger to the podcast — she has been on many times and obviously writes for you at Everything in Moderation. She wrote a thing on LinkedIn that was basically saying: hey, look, here's what's going on; everybody is working hard to deal with this. People jump very quickly to the conclusion that, oh, the platforms don't care, the platforms do nothing, they're doing things in bad faith — and that is not true. There are people at those platforms, whether or not they were as effective as they might otherwise have been, or whether or not the teams are as well staffed as they need to be, or whether or not they have all the tools they need to deal with this. I think every platform, for the most part, was trying to do something to deal with this situation. It may have spiraled out of control in terms of later responses and everything along those lines, but I think the initial response from most platforms was pretty typical. It takes a little bit of time to get geared up, and then there's an effort where you're being flooded with more people trying to upload. And in this case, rather than a single video — where you could create a hash of it and make sure that the same video didn't get uploaded — there were many different videos from many different angles, from different people in the audience. I mean, the craziest one was the one guy who jumped up, who was near the front row, and started streaming on TikTok right away, and did the usual "hey, it's your boy so-and-so on TikTok, Charlie Kirk just got shot", blah, blah, blah. And it was just like, what is... but then, how do you deal with that? That is someone, you could say, reporting on the situation. It didn't have the video of the actual shooting, which is what most people were initially concerned with. But we live in a world where everyone is a broadcaster, and that presents some interesting challenges for the various platforms, where you can go in different directions based on that.
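[For anyone curious what the "create a hash of it" step Mike mentions looks like in practice, here is a minimal sketch in Python. It is illustrative only: the function names are hypothetical, and real platforms use perceptual hashes, which survive re-encoding and cropping, rather than the exact file hash shown here.]

```python
import hashlib

# Hypothetical sketch of exact-match video blocking, the simplest form of
# the hash-based re-upload prevention Mike describes. Real systems use
# perceptual hashing (frame-level similarity) so that re-encoded or cropped
# copies still match; this version only catches byte-identical re-uploads.

blocked_hashes: set[str] = set()

def fingerprint(video_bytes: bytes) -> str:
    """Return a SHA-256 digest of the raw video file."""
    return hashlib.sha256(video_bytes).hexdigest()

def register_blocked(video_bytes: bytes) -> None:
    """Add a known violating video to the block list."""
    blocked_hashes.add(fingerprint(video_bytes))

def should_block(upload_bytes: bytes) -> bool:
    """True if this upload is byte-identical to a known blocked video."""
    return fingerprint(upload_bytes) in blocked_hashes
```

[The limitation Mike points to is exactly why this breaks down here: dozens of distinct recordings from different angles produce dozens of distinct fingerprints, so each new video has to be assessed on its own.]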
Ben Whitelaw: Yeah, how do you write a policy for that guy, right? The guy who instantly livestreams when somebody's been shot. Impossible. The other thing I was interested in is the way that this story, and some of the content around it, seemed to sit between a few potential policy or guideline areas. In some instances it probably wasn't graphic enough to count as graphic content; in other instances, it violated the glorifying-violence policies that some platforms have. It sat in between, and there is obviously a natural news element to it as well, which you have to make a judgment on, which some platforms do. Did you also find that platforms did a fairly good job of that?
Mike Masnick: What is good, right? I mean, that's the problem. That's a totally subjective standard, and different platforms have different policies around this stuff, and there is no right answer. But I do think, for the most part, it seemed that platforms did try to deal with it. There were concerns, especially on X, for example, which we already know has a very reduced trust and safety staff and is very focused on autoplaying video. So there were a number of situations where people apparently came across autoplaying videos of Kirk being shot, which is horrifying, right? It's bad enough for people to see it in any situation, and then to see it when you're totally not expecting to is certainly potentially traumatizing. So there were concerns about that. But you hesitate to say that anyone did a bad job, because, as Alice wrote, there are people at all of these platforms who were desperately trying to triangulate and figure out: how do we respond to this? How do we deal with this? How do we do the best within our policies as we have them? So I don't want to criticize any particular platform, just to note that, especially in the immediate aftermath of these things, every platform, I'm sure, was all hands on deck to do something. And that also means that a lot of stuff was probably blocked and taken down mistakenly, because you're in a rush; you'd rather take down too much than too little in these cases. And then you have all of these other questions, like: what is newsworthy? How do you balance something that is newsworthy against something that is violent and problematic in other ways? How do you determine whether it is celebratory or not? These are all impossible questions, and there's no "this content obviously should have stayed up and this content obviously should have gone down." So I'm hesitant to say that these platforms failed and these did better.
Ben Whitelaw: Yeah. I mean, we talk every week about how difficult these decisions are, and in many ways the fact that platforms were criticized on both sides was almost an indication that they were in the right ballpark, right? There was criticism about them being too quick, and potentially censorial, from people on the right of the political spectrum. And there were people who were critical of them being slow, saying, well, these videos have got millions and millions of views — I think one on Instagram got 15.3 million views — and that's an indication of some sort of gap or failure. I think about the old British broadcasting adage that if you're being criticized by the left and criticized by the right, you're probably somewhere around where you want to be.
Mike Masnick: I hate that. But I get what you're saying. I always hate that, though, 'cause there's also a chance you're being criticized by both sides because you did something really bad.
Ben Whitelaw: Yeah, that's true as well. But it's the classic rock and a hard place, and I think that's something that a lot of our listeners will know and understand, but others won't. I was also thinking about how some outlets covered the reaction of the platforms, and one particular piece by Wired stood out, whose framing I challenged, in all honesty. The headline was "Charlie Kirk Was Shot and Killed in a Post-Content-Moderation World", and the piece lays out that this is the first major news story — which, again, you can critique — in a world where platforms have stepped back from moderating content: since January, since the Mark Zuckerberg hostage-style video to camera, since the inauguration, since the pressure that's been put by Trump on the big tech CEOs, and the anti-regulation rhetoric that's come from the administration as well. I would say that the stuff the platforms did is very similar, if not exactly the same as, the playbook they would've run prior to all of those events. And so I challenge the idea that we are living in a "post-content-moderation world", inverted commas. This feels very similar. It's the fact that moderation has become a toxic concept in the last six to eight months that has really changed, and I wonder how trust and safety professionals feel about that, 'cause they're probably running the same playbook but within a different context.
Mike Masnick: Yeah, and I should say, I would put more blame on the headline writer of the Wired article than on the article itself, which was written by Lauren Goode. She's a very, very good reporter in the space. But in the last year, certainly more so than before, there's been this real direct attack on content moderation, and companies have dialed it back. Meta very famously has dialed it back; we already know that X has dialed it back. So there is a reasonable question for a journalist to ask: well, how did these platforms handle it, if they were saying they were stepping back on content moderation? But I think the real story is exactly as you said: this is the kind of thing that trust and safety teams are geared up for. And to me it's also kind of a lesson for the people out there who spent years claiming, oh, trust and safety is just censorship. Right? This is what trust and safety is supposed to be there for. It's supposed to be able to respond to things to keep people safe on the platform, and that includes not having you suddenly, randomly, unexpectedly exposed to horrific violence that you're not ready for and not prepared for. So yes, I do think that most of the companies ran the playbook that, as I said, they've been working on since the Christchurch incident, and some other incidents in between. Even though there were some differences and some complications and some things that made it difficult, this was not an entirely unique or new kind of situation that they had to respond to, and they have people and tools and setups in place to deal with this kind of thing.
Ben Whitelaw: Yeah. It's interesting to me; I would've thought by now that we'd have a way of judging, in a slightly more quantitative way, how a platform does in a situation like this. It would be helpful to have some sort of mechanism to say, okay, that platform performed well in these senses, based upon some data that we have available. So much of it is: I saw this video at the top of my feed, therefore this is a failure; or, I didn't see any videos, so therefore I've somehow been prevented from seeing this important news.
Mike Masnick: Right, but there is the problem, because it's all subjective. This is the impossibility theorem at work, where for every bit of content that is taken down, you're gonna have some people who say, yes, that should have been taken down, and some people who say that shouldn't have been taken down. So which one is the failure case and which one is the success? We can judge, but it's our subjective judgment as well. So it's not as easy as "this was the right answer and this was the wrong answer." There's just a whole spectrum of tough choices that some people are gonna be upset about and some people are gonna think don't go far enough. And again, I feel kind of hesitant to say that any platform did particularly poorly on this side of it, on the immediate response to the videos of him being shot. It's a very difficult situation. All the companies were scrambling. Some videos, you could say, were left up too long; some were maybe more aggressively taken down than they should have been. But you can't say, oh, this company failed, or this company did particularly well.
Ben Whitelaw: No. And before we move on to the political reaction, and I guess the reaction to the way the platforms dealt with the situation, it was just noteworthy to me that the platforms didn't just remove content. Renée DiResta talks about how platforms normally do moderation in three ways: removing content, reducing its reach, and then informing people. And by informing, she means putting labels on things, having screens that show where content is graphic. Even though we have had this pushback against content moderation and trust and safety over the last twelve months, let's say, platforms were still doing a bit of each of them, which I think is interesting and shows how that playbook is holding to a degree. Let's go on to the wider political conversation. It wasn't just the videos that were prominently shown on the platforms. There were a lot of people reacting to the story, quote-tweeting, arguably celebrating his death — people who had different political views from him. As a prominent conservative voice, Kirk was close to Donald Trump, and Trump came out and said he was a wonderful American, and said this kind of rhetoric is responsible for the terrorism that we're seeing in our country. So there was a reaction to his death that we then saw become a story in its own right, via the widespread doxing campaign that some of the conservative right decided to wage. Is that right?
Mike Masnick: Yeah, I mean, there were so many different elements to this, and obviously we're trying to stay away from the purely political aspect of this discussion. But there was then the discussion of how platforms handle the following discussion. Some of that was: how should platforms be handling people who are arguably celebrating or condoning, in some sense, the violence, and what do you do about that? And that raises all sorts of other questions. And then you had people who were mad at people who were seen — often in a not-so-accurate way — as condoning and supporting the shooting, and who responded to that by doxing them and harassing them. There was an effort to create a database of everybody who was claimed to have celebrated the shooting, potentially to get them fired or whatever. So platforms had this secondary struggle over how you handle the moderation of that. Are you allowing people to say things which are certainly First Amendment-protected speech? People who did not like Kirk — could they express their opinion on this? Did that violate certain rules? And then the response from others, trying to harass and dox people who responded "incorrectly", I guess would be the more diplomatic way of putting it, to his death. And I think that created a much bigger struggle for the various platforms, because unlike the shooting video itself, I think that has always been a more challenging
Ben Whitelaw: Hmm.
Mike Masnick: to respond to, because again, so much of that is in the eye of the beholder and so much of that is purely subjective. And then you have to look at what falls under what policies, right? Lots of platforms have policies about encouraging violence, or doing things that will encourage more hatred and violence. So where do certain posts or content fall on that spectrum? That becomes a big judgment call, and a lot of people are going to disagree. And again, you have platforms that are all rushing to make very quick decisions at scale, which is also difficult — just the sheer scale of it, which a lot of people don't have a sense of. And at the same time, people are going around criticizing all of the platforms. People were very quick to say, oh, X is doing a terrible job at this, or Instagram is doing a terrible job at that, or TikTok is doing a terrible job — and in both directions. In some cases it was, oh, there's a huge community of people who are celebrating this, or there's a huge community of people who are doxing over this, and why don't the platforms stop it? So again, it's this rush to assume that the technology companies have to be immediately perfect, when they have two opposed groups effectively going at each other and have to figure out how to step in and moderate that impossible fight.
Ben Whitelaw: Yeah. I mean, I'm asking you this because I myself may be doxed if I don't mention... Bluesky.
Mike Masnick: Yes, we have a bell.
Ben Whitelaw: The bell. Mike is a board member of Bluesky.
Mike Masnick: I am on the board of Bluesky, so take everything I say as completely biased. I will also note that I have not discussed this with people at Bluesky. I do not work there on a daily basis, and I have no say or control over how trust and safety works at the company.
Ben Whitelaw: You've only got the angry responses of people on Bluesky to what I'm about to say.
Mike Masnick: People yelling at me.
Ben Whitelaw: But Bluesky did a couple of things. They put out a kind of warning to users, for people who were celebrating the death of Charlie Kirk, and they said that glorifying violence or harm violates our community guidelines, and we will take action on those who celebrate his death. And then — I wasn't super privy to it, but there were a lot of people who were doing that in subversive ways, including saying "rest in piss", playing on RIP, and those people received some bans. And again, I know you're not in the weeds on this, but that's a good example, I think, of where platforms are trying to proactively — through a statement on the platform and then in the actions of their enforcement — address this issue, but then get blowback from people who say, why can't I say "rest in piss" or rest in whatever. You don't have to say anything about that. I just wanted to use the bell, 'cause we haven't used the bell for a while.
Mike Masnick: Yeah. I mean, it is a very, very difficult situation where there is pressure on all sides. And I will note that at the same time all of this was happening, there was also a widespread campaign, mostly on X, to argue that everyone on Bluesky was celebrating Charlie Kirk's death, which was never true. There were a few people who pushed back on it and pointed out that yes, there were some people, but there were also some people on Twitter and on Instagram and on TikTok who were doing the same. It was not a universal thing. But that did lead to — and again, I have no idea what the decisions were internally at the company; I haven't spoken to anyone at the company about what the decisions were in the last week — I'm sure they were under tremendous pressure from all sides about how to handle this, and whether the narrative was being turned into this idea that Bluesky was celebrating. The company had come out very quickly with the statement saying that that was not acceptable under their policy. And then the question is: how do you enforce it? As I said, everyone's dealing with these things at massive scale, at scales beyond what most people who haven't done it can comprehend. And when that happens, certain things may get taken down too aggressively, some things not aggressively enough, and then you have to correct over time. And it sucks for people who are caught in that. But as always, even when I disagree with larger issues within all of these companies, I think that they genuinely were trying to do the best that they could to deal with these things. And I say that of all of them, including X, a company that I have had plenty of things to disagree with over the last few years. It seemed to me that they were all trying to find the right line in handling the moderation of this event. And everyone is gonna make mistakes, and everyone is going to make decisions that other people disagree with — which is why I really liked Alice's piece, where she basically said: have some sympathy for the teams of people who are actually trying to make these decisions on the fly, with very little time, and not able to deal with the full context of these situations. So yeah, I think it was probably a really tough situation for Bluesky. It'll be interesting to see if they do some sort of postmortem on this at some point in the future. I know that there was criticism from both sides, and I think that was true of lots of different platforms. It wasn't unique to Bluesky.
Ben Whitelaw: Yeah, okay. So a difficult week for platforms. I don't want to legitimize Pam Bondi, but can we just talk a little bit about her comments about hate speech towards Charlie Kirk in the wake of his killing? And again, we don't need to get too political here, but I think there is something
Mike Masnick: I, I...
Ben Whitelaw: to point out around the nature of the ticket that Trump and the Republican party ran on, its views on free speech, and the sudden shift in the way it's characterizing speech that it doesn't agree with. Is that fair?
Mike Masnick: Yeah. I think there is this bigger story here, and this is always true, and this is the point that we always make: everyone assumes that content moderation issues are easy, and that it's basically "take down the content I don't like and leave up the content I do like." The reality is that everyone disagrees over which content you like and which content you don't like. And we're seeing that now, where you have a whole bunch of people — and the Pam Bondi example is a very clear one — where she's suddenly saying, we're going to prosecute people for hate speech on social media, when she in theory should know that hate speech is protected under the First Amendment. It's not true in other countries, certainly, but in the US, hate speech is still protected. And this, to me, is one of the reasons why hate speech is protected under the First Amendment: because if you allow someone like Pam Bondi to define what is hate speech — and that included, in some cases it looked like, people quoting Charlie Kirk in a way that some people were saying was sort of trying to justify him being shot, which, I didn't see very many people actually trying to justify the violence, but people were interpreting it that way — and then if you turn that into something that can be prosecuted over, you create an enormous chilling effect on speech. But the interesting element of that was that you had a group of people who for a long time were insistent that any sort of content moderation effort was censorship and shouldn't be allowed, and yet many of those same people are suddenly saying, well, no, these platforms can't allow people to say anything hateful towards Charlie Kirk or his supporters. And there's this idea that basically everybody has a line. That's where the difficulty is in trust and safety: figuring out where you draw that line. So it would be nice if the people who spent years falsely claiming that platforms were censoring people just by doing basic trust and safety work, and who are now suddenly calling for not just trust and safety content moderation but the prosecution of speech on social media, were to put two and two together and say, oh wait, this is why there is trust and safety. I don't think they're actually going to get there and make that connection and say, maybe I was too hasty last year when I accused all these platforms of being censors. But it would be nice if there was a little more self-reflection on how their view that no speech should ever be taken down on social media contrasts with how they're reacting to this situation now.
Ben Whitelaw: Yeah, it does have a wider effect. Jimmy Kimmel was taken off air this week, in part because of comments that he made about Charlie Kirk, right?
Mike Masnick: That's very pretextual. I've been a little frustrated with the reporting on that, because everyone is presenting it as because of his comments about Charlie Kirk, but it wasn't, because his comments were totally benign. If you actually look at what he said, it was really that he was making fun of Donald Trump, which led to the FCC then threatening Disney, which is just such an obvious First Amendment violation. It's really, really kind of scary. But what it all boils down to is an effort — and this is why we're always nervous about where the regulatory whims fall — when you have government officials telling people, this speech is allowed, this speech is not allowed, it really depends on who's in the government, and whether you trust them, and they may have different incentives or different motives for what speech they want allowed and what speech they don't. The real reason Jimmy Kimmel was attacked was because he was making fun of the president, and that has to be admitted. And when you leave open the ability for government officials to force platforms to remove speech because they claim it's harmful, you have to be aware that malicious actors will then use that to censor criticism and pull down criticism, and, as history has shown, almost always use those powers to attack marginalized voices over the powerful voices.
Ben Whitelaw: Yeah. We won't go into it now, but this war against certain media is something that the Republican party and Trump are clearly ramping up. We've seen this instance of ABC being forced into this kind of takedown of Kimmel's show, and we've seen the defamation case being brought against the New York Times, which its CEO has said is baseless and which they will fight. But there is this chilling effect, potentially, that comes as a result.
Mike Masnick: Yeah, there's a massive chilling effect across media, and that includes social media. You can't tell this story without telling the wider story of the current Republican party really trying to control the media in all sorts of ways — which actually we'll talk a little bit more about in a few minutes, I think, with one of the other stories that we have.
Ben Whitelaw: Yeah, indeed. I mean, the New York Times wrote a story this week which I found quite funny in the context of all that we've discussed: a piece entitled "Has Britain Gone Too Far With Its Digital Controls?"
Mike Masnick: Oh my goodness.
Ben Whitelaw: And I expected to see your byline on it, Mike.
Mike Masnick: Oh, come on.
Ben Whitelaw: You didn't write it. You didn't write it.
Mike Masnick: I did not. And wait, wait, wait — just for the record, I want to point out for listeners: I had seen this story and chose not to put it on our rundown list today, because I thought, no, I've said these things enough; Ben doesn't need to hear me say it. And then I show up today, and what has Ben put at the top of his list? This story.
Ben Whitelaw: It's true, it's true. I'll tell you why I included it in a second. But this is, I guess, an interesting story in the context of what we've heard, pointing the finger at Britain and its supposedly draconian digital laws. There is a lot to criticize Britain for, and this piece goes into some of the surveillance and facial recognition issues, as well as the recent attempt by the government to break encryption for Apple users, which we have criticized and commented on. Also rolled into that are the Online Safety Act and some of the broader diplomatic discussions that Trump and Starmer have been having this week whilst the president of the United States has been in London. So this piece is, I guess, summing up all of those developments. I always think, Mike, that for a story that has a question in the headline, the answer is almost always no.
Mike Masnick: Yeah, that's... oh God, I'm blanking on it, and I feel really bad. It's a law. It's someone's law. Oh God.
Ben Whitelaw: Whitelaw's law? No.
Mike Masnick: No, no, no. It's a famous law, and I know the guy, and I feel bad, and I'm apologizing right here on the podcast that I am forgetting your name.
Ben Whitelaw: I didn't even know...
Mike Masnick: I'm going to look it up.
Ben Whitelaw: Okay. In the meantime, while you do that, I wanted to bring this up, 'cause...
Mike Masnick: Betteridge's Law.
Ben Whitelaw: Betteridge's law.
Mike Masnick: Betteridge's Law of Headlines is an adage that states: any headline that ends in a question mark can always be answered with the word no.
Ben Whitelaw: I like that. Named after?
Mike Masnick: Ian Betteridge.
Ben Whitelaw: Ian Betteridge, whose work I've read before. Very good. I mean, I wanted to bring this up partly to talk about the New York Times' absolute hatred and mischaracterization of Britain over the years. There is a long-standing series of stories in which the New York Times mischaracterizes our food, our arts, our view on sports, where we go on holiday, all this stuff. My favorite, which I wanted to share, was in 2018 — you may remember — when it actually put out a Twitter call for people to submit their petty crime incidents in London. And that was a kind of red rag to a bull for the British public. There were a few angry responses, but mostly people replied with funny, dry, sardonic responses, you know, saying things like: there was one time this guy on the tube escalator stood on the left, and I was furious.
Mike Masnick: Well, I think it's reasonable to be furious at that. I cannot stand people who stand on the left on escalators.
Ben Whitelaw: Exactly.
Mike Masnick: That is a crime against humanity.
Ben Whitelaw: And putting the milk in before the hot water when making a tea.
Mike Masnick: Well, see, now there is some debate on that. I am in agreement with the British way, but I understand that elsewhere they don't feel as strongly about it.
Ben Whitelaw: No, there's one way of doing it in Britain, and I won't be swayed on that. So I think there's something funny here, which is the way that the New York Times, as a very well-read media source in the US, potentially shapes how the US thinks about the UK and its approach to digital and tech policy right now. But also there's a kind of serious point there.
Mike Masnick: Yeah, I mean, I think there is a serious point, and obviously I've talked about why I think the UK has gone too far with some of its rules. But I do feel like it's easy — and I think this piece does a little bit of it — to conflate a bunch of different issues, some of which are more serious than others. The attack on encryption is, I think, very, very problematic, and there are other laws that Britain has that are more problematic; lumping them all together as if they're all the same doesn't allow you to separate out and prioritize which things are really bad. I have my complaints about the Online Safety Act, but I actually do think there are elements of it that were done in good faith, and that Ofcom is trying to do things in good faith, and I appreciate that aspect of it. I think there are a bunch of elements in there that they sort of ignored — people warned them those would backfire, and we're seeing that now. But things like the attack on encryption, I think, are just entirely in bad faith. So being able to separate that out, rather than just lumping it all in as "oh gosh, those Brits, look at what they're doing" — especially at a time when the US government is suddenly attacking speech in all sorts of ways as well — there's an element of hypocrisy that doesn't get covered when we're just like, oh, the crazy Brits. Everybody is trying to figure out how you handle these things and how you regulate these things — not just the UK: Australia, Europe, everywhere. And nobody, including the US, has the right answer yet. So it's easy to criticize; it's harder to come up with better approaches overall.
Ben Whitelaw: Yeah. And there's actually a good piece by Public Knowledge, the think tank, which we won't dive into, but which makes that point about how there is an element of hypocrisy here, particularly in relation to the Digital Services Act, which has come under fire from Congress in the past. A few other stories we wanna touch on, Mike: one that relates to Donald Trump and TikTok, which again people might have heard about, and then we will finish up on a story we touched on last week. The long-standing TikTok story is starting to resolve, and we're starting to get through the very long, dark tunnel to find out what's actually gonna happen with TikTok in the US.
Mike Masnick: I mean, it's so crazy, right? Because you go back to the very beginning — people sort of forget how this story started — which was that in 2020, a bunch of TikTok teens made Donald Trump look foolish by reserving a bunch of tickets to a rally, so that the campaign thought they were gonna have this huge turnout, and then nobody showed up. It was a big viral TikTok moment.
Ben Whitelaw: Yeah.
Mike Masnick: And almost immediately afterwards, Donald Trump was like, oh, TikTok is run by the CCP and we have to shut it down, and then took some very ham-fisted steps to try and shut down TikTok, which failed in court, were a mess, and were just done in the most incompetent way possible. Then that sort of disappeared with the election of Biden, but there was still continued activism by people — some of which, it later came out, was driven by Meta hiring PR people to push this big story about how TikTok was a national security threat. That got driven up and became this story of, oh, TikTok's connection to China makes it dangerous. Which it could; it might. But it was very much done in a moral-panic manner, which then went into overdrive with certain politicians also complaining about TikTok users supporting Palestine, which is another sort of — you know, there's a whole bunch of...
Ben Whitelaw: Wedge issues.
Mike Masnick: Wedge issues, right. Which then led to Congress passing a law that said TikTok had to be sold and controlled by an American company — which, timing-wise, stupidly, was supposed to go into effect the day before Trump's inauguration. And then he claimed to have saved it. TikTok went down for a day, but his saving it was just telling everyone to ignore the law, which is what he's done. He would give like a 75-day delay before he would enforce the law, which the law did not allow him to do; it gave him one chance at a one-time delay if they showed progress towards a sale. He just makes it up entirely. And the attorney general sent these letters, 'cause the law is really enforced not against ByteDance or TikTok itself, but against the likes of Google and Apple and hosting companies if they allow the app to exist. And they sent this really weird, very questionable letter saying, no, DOJ will never enforce this law against you — which is not how the law is supposed to work; it's hard to explain how. But all this time there's been this effort to actually try to sell TikTok off. Now, that history is important, 'cause going all the way back to 2020, Donald Trump's original plan was: there were companies interested in buying TikTok, Donald Trump had to give the approval, and he rejected any company that wanted to buy TikTok that wasn't a loyal MAGA Trump supporter. So he originally wanted Larry Ellison, who's a big Trump supporter and the founder and now CTO — but still hugely influential — at Oracle, plus Walmart, another Trump-supporting company. It was originally gonna be Oracle and Walmart, and that deal fell through for a variety of reasons. And so now, five years later, we're back in the same position, where it is sort of being sold: 80% is being sold, and ByteDance will still retain 20%. Importantly, ByteDance still retains control of the algorithm, which they will then license to this new entity. Donald Trump had been very explicit that no deal would be acceptable where China controls the algorithm, and yet that is the way this is ending up. So they're going to create a new company, effectively, that will be 80% owned by Larry Ellison, Andreessen Horowitz, the big venture capital firm, and Silver Lake, which is a big private equity firm. And Oracle, which had been the plan all along, will effectively do the hosting. They have been hosting TikTok, so this isn't a big difference, and they will do some security review, which, everybody forgets, was part of an original plan that came out in 2021.
Ben Whitelaw: Right.
Mike Masnick: There was an announcement, which I wrote about in, I forget, 2022 or 2023, that all TikTok code was now being audited by Oracle. Everybody forgets that, but now everyone's acting as if it's a brand-new thing. But this app will be effectively under the control of Donald Trump's friends, and there will be a board member appointed by the White House, which feels very sketchy in a country where you're supposed to have separation of the government and corporations. We've already seen other aspects of this: having a golden share in US Steel, taking 10% of Intel. There are a bunch of things where this government has gotten much more deeply involved in companies. So this will be a new company. It will also be a new app. That, I think, is the biggest element of this: people using TikTok will be forced to switch to a new app if they want to continue using TikTok. And I am wondering if this is a chance for somebody else to step in. It could be Instagram — I'm sure Instagram will gleefully try and take that market — but it could be an opportunity for somebody new. There are a number of smaller companies out there trying to build TikTok competitors who could step in and say: do you really wanna switch to the app that is controlled by Donald Trump and his friends, or do you want to go to a different app? So we'll see where that goes, but it's kind of big news.
Ben Whitelaw: It would take a brave competitor to frame their app as an alternative to Donald Trump, right? You know, you don't wanna...
Mike Masnick: If...
Ben Whitelaw: It's the reason you would...
Mike Masnick: There are risks associated with that, but also, if you look at the approval ratings, especially among young people, I think that there is a marketing campaign for somebody out there to present themselves as an alternative on the political front.
Ben Whitelaw: Yeah, I think that would be really interesting to see, as would the implications of that algorithm deal, that licensing deal. We will see. One more story to touch on, Mike, which we covered in depth last week: the social media ban in Nepal, which led to widespread unrest in the capital and a change of government, and the army has been in charge pretty much since then. This is another New York Times story, which I've allowed, even though they're so anti-Britain, because they make a really interesting point about how one particular platform has been at the heart of some of those governmental changes since we talked last week.
Mike Masnick: Yeah. It's basically arguing that the government is being run in a Discord server, which is absolutely incredible. I assume most people are familiar with Discord, but it is a chat app — if you're familiar with Slack or any kind of group chat effort, that's what Discord is. Basically what happened was that a lot of the organizing for what we talked about last week, the Gen Z protest in Nepal, was in various Discord groups. And as the government collapsed and the prime minister resigned suddenly, the Discord chats became more and more important. Yes, the military is in control, but they're sort of taking cues from the discussion in the Discord chat, including a discussion over basically who the interim leader should be. And it's just kind of mind-blowing when you take a step back: this is a platform that was designed for gamers to talk to each other as they're playing Call of Duty or whatever, yet here they are rapidly planning a government, voting on things, having discussions over who should do this and who should do that, and how to plan for an actual election. I think it's a really important reminder that these platforms are communication tools, and they're tools that can be used in all sorts of ways, for good and for bad. And we're seeing this: when you suddenly need organization out of chaos, a whole bunch of people came together in a Discord group to try and figure out how to run Nepal. It's absolutely fascinating to me, and I think it's another reminder of the power of these tools, when used well, to organize and get people together.
Ben Whitelaw: Yeah, it's funny that you took that from the piece, 'cause what I took was what a chaotic mess it all sounds. Apparently 1% of Nepal's population is in one Discord server
Mike Masnick: Yeah.
Ben Whitelaw: voting on the next leader of the country, and anybody is allowed to jump in and talk. There's a quote in the piece that says it's very disorganized and sounds like a random social media call. Can you imagine? Just like...
Mike Masnick: Yeah.
Ben Whitelaw: people talking over each other and shouting, and then there's a bunch of mods who are trying to bring, I dunno, some semblance of normality to proceedings.
Mike Masnick: But to me it is this example of where you see order come out of chaos, right? There is this chaotic situation, there's always this sort of mad rush, and then people try and put order into it. We're seeing that, and to me it's just a fascinating example of that in practice.
Ben Whitelaw: No, it is an amazing kind of follow-up to last week's story.
Mike Masnick: And just a related story to that, too: there was a Forbes story about how a bunch of people in Nepal were also using this app that Jack Dorsey had vibe-coded over a weekend back in July, called Bit Chat. And I will say that very carefully, 'cause you can say it in a way that it sounds like something else, which Jack Dorsey knows and did as a joke. But it is a mesh app, because, as we talked about last week, they were banning all of these social media servers and apps, and the whole point of Bit Chat is that it's a mesh thing: it just goes phone to phone, or device to device, doesn't need any server, and you can't block it. So it's really interesting how people went and found an alternative that couldn't be blocked, and they were using that for discussions as well. Again, it's this interesting way in which the technology finds a way, and people figure out ways to use the technology to organize and do stuff even in the face of the government trying to shut them down.
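[To make the "mesh" idea concrete: here is a minimal, hypothetical sketch of serverless message flooding, the basic pattern behind apps like the one Mike describes. This is not Bit Chat's actual code — real mesh apps relay over Bluetooth radio links — and all the names here are illustrative.]

```python
# Hypothetical sketch of mesh-style flooding: each phone relays a message it
# hasn't seen before to its direct neighbors, so a message can cross the
# whole network with no central server for a government to block.

from collections import defaultdict

links = defaultdict(set)   # phone id -> neighbors currently in radio range
seen = defaultdict(set)    # phone id -> message ids already relayed
inbox = defaultdict(list)  # phone id -> delivered message texts

def connect(a: str, b: str) -> None:
    """Record that two phones are in radio range of each other."""
    links[a].add(b)
    links[b].add(a)

def broadcast(phone: str, msg_id: str, text: str, ttl: int = 8) -> None:
    """Deliver a message to this phone, then re-relay it to each neighbor."""
    if ttl == 0 or msg_id in seen[phone]:
        return
    seen[phone].add(msg_id)
    inbox[phone].append(text)
    for neighbor in links[phone]:
        broadcast(neighbor, msg_id, text, ttl - 1)

# A chain of phones with no server anywhere: A - B - C - D.
connect("A", "B"); connect("B", "C"); connect("C", "D")
broadcast("A", "m1", "meet at the square")
print(inbox["D"])  # ['meet at the square'] -- relayed via B and C
```

[The point of the sketch is the absence of any central hop: blocking a server does nothing, because every participant is the infrastructure.]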
Ben Whitelaw: Yeah, I think that is almost the link between today's stories, isn't it? Whether you are in a protest situation in Nepal or you are responding to the death of a conservative activist in the US, the users always find a way to push the boundaries of what they're allowed to say, and platforms are only ever catching up. So I think that is my summary of today. We've run out of time, Mike, but we've covered a hell of a lot of ground. We were lucky enough to talk about stories from The Verge, from Wired, from CNN, from the New York Times — for all of its Britain-hating nonsense — from Vox, from Newsweek, and we'll include all of those in today's show notes as well. Thank you for joining, listeners. We won't be here next week, but we'll be back the following week as usual. Thanks for listening.
Announcer: Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L, alt, speech dot com.