Allan Boyd Talks to Experts About Things
West Australian Independent Journalist Allan Boyd talks to actual experts about interesting things for about 15 minutes. Also on RTRFM!
I source a lot of content via the excellent academic-journalism website, The Conversation.
The things I cover are wide-ranging...
Including Cyber-security, Surveillance-capitalism, Media, Arts, Politics, Internet Stuff, AI, Big Tech, Science, Ecology, Social Justice, Human Rights, Activism...
I have a solid background in web development, arts, comms, politics and media. In particular, independent community arts. Been broadcasting on and off since 1996 with RTRFM. Serial student at ECU in Media and Cybersecurity. Lapsed sessional academic of Experimental, Performance Poetry and Creative Writing at Curtin. Ex-tree-planting contractor. Was a Perth Indymedia OG at the birth of Open Publishing. I'm a rogue web developer by trade. Muso. Building the internet with my bare hands since 1998! Aka the antipoet. Perth Slam co-host.
Rob Nicholls – Landmark Social Media Addiction Case
Dr Rob Nicholls - expert in media and communications at the University of Sydney, talking with RTRFM’s Allan Boyd about the recent landmark social media case against Google and Meta in the United States.
==
INTRO: A recent US court case saw big tech giants Meta and Google pay out six million dollars to a single plaintiff for mental health damage, citing dangerous and addictive design features in their social media platforms. The case is seen as a precedent for many more to come. On The Record’s Allan Boyd caught up with an expert in tech law to find out what’s going on…
LENGTH: 15:37
--
Source: The Conversation - Meta and Google just lost a landmark social media addiction case. A tech law expert explains the fallout - March 26, 2026
Last month, Meta and Google lost a landmark case in the United States, one that could reshape how we think about social media. For the first time, a US court has found these platforms legally responsible not just for what's on them, but for how they're designed, including the built-in features many of us use every day, like infinite scroll. The plaintiff, a 20-year-old woman, alleged the design of the platforms' algorithms left her addicted to social media and negatively affected her mental health. TikTok and Snapchat, who were also named in the case, settled before the trial began. So, what does all this mean for tech companies, for regulation, and for the way we all use social media? To help unpack this, I'm joined by tech law expert Dr Rob Nicholls, senior researcher in media and communications at the University of Sydney. Welcome, Rob. Hello, Allan. Okay, so let's start with the basics here. Can you explain what's going on? Why is this case so significant compared to, say, previous attempts to hold these big media companies to account?
SPEAKER_01One of the big things is that in the US the platforms aren't responsible for third-party content. So if I said I'm addicted to cat videos on YouTube, there would be no consequences, because the cat videos on YouTube were put up by other people. In this case, though, what the plaintiff did was very carefully say, well, I don't care about the content, it's actually the platform itself. And a jury found that the platforms themselves, both Instagram and YouTube, to different levels, were addictive. They had a design fault that made them addictive, and because of that the young woman in question became addicted, had mental health issues, and the court awarded her 3 million US dollars in damages, plus a further 3 million dollars in what are called punitive damages, because it said that neither Meta nor Google actually did anything about it when they knew it was potentially a problem. So not billions of dollars, but enough to get each of Meta and Google quite concerned.
SPEAKER_00Yeah, it's a significant amount of money to receive for that as a person. And I guess it sends a good message to those big companies.
SPEAKER_01I think it does, but it also says, well, here's a precedent. There have been lots of attempts to take action in the US against the platforms for this type of addiction, or potentially addictive causes, and they've all fallen over. But now there's a formula, so those other cases can all go ahead and basically use the same arguments that were used in this one.
SPEAKER_00It's set a precedent, I assume.
SPEAKER_01That's right. It set a precedent in the US, not in Australia. But one of the big things was that the decision the jury came to wasn't based on lots of legal arguments; it was based on looking at the internal emails, internal Slack, and all of the communications within Meta and Google that indicated this was a design feature and not a bug.
SPEAKER_00Well, that's the thing. These companies, as you say in your article in The Conversation, borrow heavily from the behavioural and neurobiological techniques used by poker machines and exploited by the cigarette industry. Is this a big tobacco moment here?
SPEAKER_01I think it is, and I think the reason is we've now got a court finding that there is the potential for addiction, and the young woman in question was essentially harmed while she was a child. She was using Instagram 16 hours a day, and as her lawyer said, her mum didn't get a look in once she got that addicted. I think what it does is say there will need to be warnings on platforms for people who are over 18: there is a risk of addiction, very much like the early days of warnings on cigarette packs. And there'll be a prohibition, or a modification, that says minors can't use the platforms. Now, we have that in Australia; we're the first in the world. And the real problem here is that getting addicted starts when you're young, because you're much more likely to become addicted to something before your brain has fully formed. So the issue is, well, perhaps for adults there'll be a health warning, but for children, either a product redesign or, like here, a prohibition for younger kids.
SPEAKER_00Yeah, well, it's like all these design features are built in, and that's what changes this, rather than the content that's on these platforms.
SPEAKER_01Absolutely. The person who designed infinite scroll was also the person who designed the CAPTCHA. So if you've ever been really annoyed by trying to count motorcycles in a picture that's too blurry, the CAPTCHA, the same person did both. And he's said he doesn't regret designing the CAPTCHA, but he certainly regrets designing infinite scroll.
SPEAKER_00Wow, that's interesting. So we also hear from the US about Section 230, which protects tech companies. Can you explain what this is and how this case got around it?
SPEAKER_01Section 230 says that platforms aren't responsible for the third-party content that they host.
SPEAKER_00Well, like you said earlier, like the cat videos on YouTube or something.
SPEAKER_01So it's called a safe harbour provision. If they're given notice that there's a problem with that third-party content, then they take it down, and there's a procedure for that. So essentially it says you can go off and do business as a social media platform; you have to worry about some things, but the one thing you don't have to worry about is the content you host.
SPEAKER_00Right, okay. So the reason Section 230 is not effective here is because it's about design and not about content. And it's a US law.
SPEAKER_01So we don't have that here, but we do have some interesting laws which might apply specifically, under our consumer protection law, the Australian Consumer Law. Basically, as we all know, if you've got a product that's got a defect, the manufacturer of that product is liable for that defect. Now, under Australian law, software is a good, it's a product, and the same goes for European Union law. So potentially somebody who ended up addicted here might actually say, you've got a defective product, and for the damage I've been caused, I'm going to go after you through the Australian Consumer Law. So there are different ways of doing it in different countries.
SPEAKER_00So what do you reckon this means now for everyday users? Are we likely to see changes to how these platforms actually work? Is this the death of infinite scroll, maybe?
SPEAKER_01No, I think it's not the death of infinite scroll, but it might be the end of infinite scroll if you're under 18, and a health warning if you're over 18: this product features infinite scroll, and that can cause harm if you've got the potential for being addicted.
SPEAKER_00Well, it just sounds like tobacco or alcohol or drugs.
SPEAKER_01Basically anything that could be addictive. But perhaps it's less like big tobacco and more like TV ads for gambling, where you've got that "you win some, you lose more" warning at the end. It's a bit more like that, where some people have a flutter and never have an issue with it, whereas it can be quite harmful for other people.
SPEAKER_00It's the whole process here. These health and mental health flaws, and suicidal ideation flaws, are designed into these systems to make you addicted, and it's quite unsettling.
SPEAKER_01It is unsettling. But if you were on the other side, you've got a platform where essentially what you're doing is competing with other platforms and other things for attention, and then you're selling that attention to advertisers. So something that improves engagement sounds like it's really good for your product, but if improving engagement means risking addiction for people who are susceptible to addiction, that's a big problem.
SPEAKER_00Yeah, it sounds like a heroin dealer has the same issues, I suppose.
SPEAKER_01Well, that's it, yes, because it's free. Your first one's free.
SPEAKER_00Well, that's right. And Instagram's free, isn't it? And Snapchat's free.
SPEAKER_01Instagram's free, yeah. So essentially you get to that thing where, with any product that's free, you're actually the thing being sold.
SPEAKER_00That's right, yeah. And the concept of the user, you know, the user in heroin dealing and the user in web development, both of which perhaps have the same meaning.
SPEAKER_01But heroin, I don't think, comes with an end-user licensing agreement.
SPEAKER_00No, or warnings. That was a bad analogy, but anyway, I'd always thought that. So, zooming out a bit, do you think this case will influence policy or regulation here in Australia or globally?
SPEAKER_01Well, I think here what it gets to is saying, well, that under-16 social media ban, we didn't do it because we thought, or knew, that the platforms were potentially addictive, but this actually says we were probably on the right track. It also means that the eSafety Commissioner, who's just in the last little while produced a report on how things are going, will probably have a bit more of a focus on helping the social media companies comply with their obligations. It's a little easier to say, you can comply with this, and part of what you're doing is protecting yourself; look at the court case you just lost in California. And I think it's also encouraging for the countries that were watching to see how things are going in Australia, in particular France, Finland and the UK. Some of the objections to having a social media ban for younger people were based on it being a rights issue. Well, if there's potential for mental health harm, those arguments look very different.
SPEAKER_00What's next? Meta and Google have both said separately that they plan to appeal the verdict. So what happens as they appeal? Watch this space, I suppose.
SPEAKER_01Well, yeah, I think it is watch this space. It's simply inconceivable that they wouldn't appeal. Part of what Snapchat and TikTok did by settling early was avoid the punitive damages, the second phase of the damages. But at the same time, my expectation is that we'll see each of Meta and Google do what they've started to do and say, well, here's a version of YouTube for people under 18 which has fewer addictive features. Same with Instagram. I'd say the same with Facebook, but nobody under 18 uses Facebook anyway these days. And potentially TikTok and Snap will also have a look at whether they should make a change. And it's simple stuff, like requiring people to take a five-minute break after 60 or 90 minutes, breaking it all up.
SPEAKER_00So breaks could be part of the design process there.
SPEAKER_01Yeah, it's like poker machines having a mandatory break after a while. If you have those sorts of mandatory breaks, people are actually less likely to go back, because there are other things in the world, even if it's only for five minutes. So the distraction of those other things actually reduces the potential for addictive behaviour.
SPEAKER_00So if they build those sorts of things into it, that could help, I suppose.
SPEAKER_01It could, and what's more, it makes it a lot easier to say, well, we've understood this and we've deliberately done things to help make sure it's not addictive. So if people get addicted, it's not our fault; we've done the best that we can.
SPEAKER_00So we will see more of these cases, do you think? This is a bellwether; there's more coming, isn't there?
SPEAKER_01Yeah. Basically that was the bellwether one, the first one, in California. There's a bellwether one in almost every other state in the US, and then hundreds more in the pipeline to follow those. So in the US we'll see a lot of that, but I think we'll also see some people getting a bit more creative here, potentially, by bringing a class action under the Australian Consumer Law, as I mentioned. So I think we'll see it in lots of different jurisdictions. The one thing here, though, is that it's going to be a lot harder if somebody who's 14 has addictive behaviour here when they're not supposed to be on the platform. Now, the law itself says it's the responsibility of the platform, but it does get a bit harder to argue, well, I got addicted anyway doing something I wasn't supposed to be doing.