Preparing for AI: The AI Podcast for Everybody
Welcome to Preparing for AI, the AI podcast for everybody. We explore the human and social impacts of AI, diving deep into how AI now intersects with everything from Politics to Religion and Economics to Health.
In series 1 we looked at the impact of AI on specific industries, sustainability, and the latest developments in Large Language Models.
In series 2 we delved more into the importance of AI safety and the potentially catastrophic future we are headed towards. We explored AI in China, the latest news and developments, and our predictions for the future.
In series 3 we are diving deep into wider society: themes like economics, religion and healthcare. How do these intersect with AI, and how are they going to shape our future? We also do a monthly news update looking at the AI stories we've been interested in that might not have been picked up in mainstream media.
THE GREAT SOCIAL MEDIA RECKONING: AI is further weaponising social media, Australia is the first to fight back
The feed is not neutral. It’s a machine built to maximise engagement, now supercharged by AI that can spin up infinite content, orchestrate synthetic crowds, and pull users deeper into loops they never meant to enter. We recorded as Australia introduced a minimum age of 16 for major platforms, which sparked a bigger question for us: can a system that monetises attention be squared with public health, especially for teens?
We compare quick fixes with structural change. Yes, bans are leaky, but they create friction, signal norms, and force platforms to verify ages. The beating heart remains the recommendation engine. We lay out how the shift to phone‑first, algorithmic feeds around 2012–2015 tracks with rising anxiety, self‑harm, and ER visits among adolescents, particularly girls navigating relentless social comparison. Sleep takes a big hit too. Blue light, FOMO, and endless scroll wreck circadian rhythms, immune function, and mood. We share small wins that helped us: banishing phones from the bedroom and retraining feeds to starve outrage.
AI raises the ceiling on both harm and possibility. We dig into AI‑assisted posting, bot swarms, and deepfake scams that target older users with cloned voices and faces. Then we contrast governance models: US platforms driven by market incentives optimise for engagement, while Chinese platforms tune algorithms for social stability and dial back domestic addictiveness. Neither model is perfect, but one lesson is clear—algorithms are steerable.
Our middle path: protect lawful speech but regulate amplification. Shift product design toward wellbeing by default—sleep‑friendly settings for minors, friction on late‑night use, measurable reductions in harmful spiral recommendations, and transparent user controls for calmer feeds. Back it with fines big enough to change incentives and investments. If we can tune the feed toward doom, we can tune it toward health.
If this conversation sparked ideas—or pushed your buttons—follow, share with a friend, and leave a review with your take on how you’d redesign the feed.
Welcome to Preparing for AI, the AI podcast for everybody. The podcast that explores the human and social impact of AI. Exploring where AI intersects with economics, healthcare, religion, politics, and everything in between. I'm no good next to diamonds when I'm too close to start to fade. Are you angry with me now? Are you angry because I'm to blame? Welcome to Preparing for AI with me, Pony Ma. And I'm Ross Geller. Oh, very good. From Friends. Yeah. Why? I don't know. So you may have guessed. I've been watching it actually. Okay. You may have guessed, listeners, or you may not have guessed, because it's nothing to do with Ross Geller. And you probably don't know who Pony Ma is. But today's special deep dive episode is about social media. And we record this today on the day that Australia have brought in the world's first regulatory ban of... or, so it's not a ban, is it? It's a minimum age for social media, but essentially it's being seen as like a ban on social media for people under the age of 16. So we've been planning this episode anyway, but this felt like kind of the right time to do it. Um, go on.
Jimmy Rhodes:I'm gonna say right off the bat, and I don't think this will be that controversial: has this got a cat in hell's chance of working in any reasonable way whatsoever?
Matt Cartwright:No.
Jimmy Rhodes:But anyway, let's talk about it anyway.
Matt Cartwright:So we should say the episode is not about Australia's social media ban, it's just that this is a kind of perfect segue into the subject. Um, I mean, you could talk about this at any time, because it's one of the most important things out there, and as we've said quite a few times on the podcast, we're not learning the lesson, or we haven't learned the lessons from social media; we're going to repeat them with AI, and in fact AI is also going to kind of turbocharge a lot of the things from social media. Um, it already is, yeah.
Jimmy Rhodes:Yeah, but I mean, we're heading to a world, right, where rather than having an algorithm that finds genuinely human-produced content that will stimulate you, so to speak, you're just gonna move to a place where you can just create as much nonsense with AI as you want, to the point where probably in the future you'll have an algorithm that's just feeding you more AI generated stuff on the spot. I mean, it's mad. Anyway, I hate social media.
Matt Cartwright:Good subject for us then. So, shall we very briefly... I'm sure everyone listening to this will know about this law. It's called the Online Safety Amendment (Social Media Minimum Age) Bill 2024. It stipulates that children under the age of 16 are prohibited from holding accounts on specified social media platforms. It affects TikTok, Instagram, Facebook, Snapchat, X, YouTube, etc. And platforms, and this is the key thing, because it's about platforms, are required to take reasonable steps to verify age and deactivate accounts. Meta have reported deactivating thousands of accounts ahead of the deadline. There are penalties in place that target the platforms, not children. For failure to comply, fines up to 33 million Australian dollars, which is about 21.5 million US dollars. And the goal of this is basically to protect developing minds from the pressures and risks of social media, citing concerns about cyberbullying, grooming, and the addictive algorithms that fuel mental health issues. All good. Yeah, yeah, yeah. I don't disagree with any of this stuff. Let's get straight into it, but just as an example: what's the easiest way to get around that?
Jimmy Rhodes:Our sponsor, NordVPN. No, they're not our sponsor, but, I mean, literally, you just go and pretend to be somewhere else, right? It's already happening with all the porn stuff. Like, they've said in some countries that you have to have an ID to watch porn, haven't they?
Matt Cartwright:Isn't it the UK where you have to register, or you have to do something? I wouldn't know, my VPN says I'm in South Africa or something. Yeah, when I accidentally put mine onto the UK... I'm pretty sure there is a law in the UK. I don't know if it's that you have to register, or maybe it's some kind of age proof, but I'm sure the UK has a law. Yeah, digital ID stuff, yeah. But specifically for pornographic material. Yeah, yeah, yeah. Um, but I guess the only thing is, if most countries, I won't say all countries, but if most countries do it, then it becomes more difficult. But I mean, yeah, you're right, it is a barrier. I kind of feel like in itself it's quite performative, right? I feel like it's been done to make a thing of it. Make a point, yeah. Yeah, and I think there are ways that you can have more of an effect than this, but this has to be kind of part of it. So I don't think on its own it stops it, but it makes it more difficult, and then you put in place other things, like, you know, the UK has talked about banning smartphones from schools, etc. You can limit the amount of time; you probably can't stop it completely, but you can do something to make it more difficult, you can kind of limit it. And I think the other thing is, it's raising awareness, yeah. And also, if the platforms themselves face these fines, then maybe they start to comply everywhere, in the way that, you know, the EU had the most stringent laws, and therefore, if you were a company that was going to operate in the EU, you just put in place the most stringent things, because then everything else would follow. It was easier to operate everywhere else.
I wonder if, in a way, if there is enough of a backlash against this, that is enough to create a world in which the sort of tech giants or social media giants put something in place that reduces it, even if it doesn't stop it.
Jimmy Rhodes:Well, it won't stop it. There are some people I feel a little bit bad for. So all the kids who are like 14 or 15 years old, they've been smashing out Facebook. Or the Australian TikTok influencers who are basically like, yeah, now they're like, how bad is this? If you're about a year away from turning 16 and you just get kicked off the platform.
Matt Cartwright:I mean, I saw some stuff today. There was a group of 15-year-olds that they were interviewing, and some of them were like, actually, it's really difficult for me, but I get it. And there's a lot of people who say... I saw this in a survey recently, where people were asked, what if your social media accounts got blocked? And they're like, it's terrible. And then it was, what if all your friends got blocked as well? And they were like, yeah, I think it's a good thing. So, from my understanding, a lot of teenagers just totally get how bad this is. Yeah, they totally get it, but it's like, all my friends have got it, how can I contact them? If none of you have got it, then it takes that away.
Jimmy Rhodes:So yeah, yeah. I mean, I've recently taken to not taking my phone into bed with me, because I was faffing around on it. I don't really use the kind of social media like Facebook and Twitter where it's more interactive, but I watch YouTube, and that's social media at the end of the day, and, yeah, I watched far too much of it. I stopped having it in the bedroom and I'm sleeping a lot better and feeling a lot better, frankly. I'm probably doom scrolling a lot less. Sure.
Matt Cartwright:And I've said as well, like, we recorded an episode on information last year, and I took steps to limit my own social media access. Like, I don't not look at social media, but I don't look at certain things. Um, I trained my YouTube algorithm to stop feeding me the stuff that it used to feed me, and it's made such a difference to my health, because what I realised is the issue for me was just seeing the same thing thrown at you all the time. You can't not think it's true. And I think what I've understood in the last year or so is a lot more about how the brain works: that you can train the brain in both directions, and so you can train yourself off it, but it takes a lot of will. It takes a lot of willpower. Like, I would say I'm someone who has, I think, pretty incredible willpower. If I want to do something, I will do it. But even for me, it feels like an effort sometimes; it really feels like I'm having to constantly work to stop myself from doing it. And frankly, it's not that most people can't do it, most people also don't care enough. So I'm not saying this is a sustainable way for most people, I'm just saying it is possible. Um, both of us have seen the dangers of this, and that's as a guy in his forties who has a lot more life experience and ability to understand.
Jimmy Rhodes:And it fundamentally is not a good thing, is it? If you said to somebody, there is something so addictive that to get yourself off it you have to expend pretty heroic amounts of willpower, then, without even knowing what that thing is, do you think that's a good thing? You'd say no. It's not a good thing.
Matt Cartwright:And should you expose your children to something like that? I remember growing up in the 90s and hearing about porn addiction and sex addiction and things, and being like, oh, this is just an excuse. You get addicted to substances, right? You get addicted to heroin or pharmaceuticals, that's what you get addicted to. And gambling is not a real addiction, it's just that you can't stop. And as you get older, you realise that all of these things are truly addictive, and that, I think, is the key thing with social media, like you mentioned addiction there: it is an addiction, because the dopamine hit that you get is a chemical reaction, and once you've created that, your body craves it in the same way as it craves junk food or cigarettes or alcohol or, you know, whatever. Yep, that's the thing, and so it's proven now.
Jimmy Rhodes:Yeah, yeah, exactly. It's proven. And on the flip side of it, your gacha games, the things that get people to keep incrementally paying for stuff on mobile apps, all that stuff's designed around this. Yeah, and the people who are recruited to work for those companies are the sort of dark side of psychologists; they're the people who understand psychology, who are building games around this exact thing, addiction cycles and dopamine hits.
Matt Cartwright:Let's bring AI in, because of the sort of theme of the podcast. Quite often with these deep dive ones, we don't necessarily bring AI in until later on, but let's just bring it in early on this time, because when we talk about social media, I kind of feel like, at what point are social media and AI just the same thing? Not to say that social media is just AI, because of course it isn't, but at what point is AI so embedded in social media... has this already happened? To the point where, and I think this is a good example, I sometimes question, when I look at comments on YouTube, are any of these even people? Like, do any of these people exist?
Jimmy Rhodes:In terms of that kind of engagement, I think comments on YouTube are a good example, and Twitter, uh, X, is another good example, right? Like, what proportion... I don't know what the answer is, and I was about to ask the question to an AI, weirdly enough, but what proportion of tweets are AI generated? And actually, if you extend that beyond pure AI generated, how many are AI assisted? How many tweets are written by a person, but they used AI to write the tweet?
Matt Cartwright:But we know that since like the 70s, or the 60s maybe, Russia, or the Soviet Union as it was then, has been engaged in a campaign to basically tear apart the fabric of Western society, without, at that point, even the internet, by finding ways to infiltrate and sow unrest and, you know, misinformation, but also this sort of fracturing of society where anybody with a different view... no one can agree on anything, and people become more and more extreme to one side. There are lots of people who defected from the Soviet Union who talked about how that was a well-noted intention since the 60s or 70s, and that has obviously carried on. There are other hostile states, and there are hostile states to those states. You know, I'm not saying this all comes from one place; there's not a good and a bad side, like both sides think they're doing the right thing and think the other side is evil, but there is this very, very sinister intent behind a lot of this, and social media has also kind of allowed it to happen. You know, there were the bot farms that existed as well, with hundreds of people sat with multiple mobile phones, fake commenting. And in China you had the Wu Mao, where you had people being paid five mao, which is like five cents, for each comment that they made. This was a big thing, sort of 10 years ago. That stuff existed, and the thing with AI is you now no longer need to employ any people to do this, yeah.
You just need to set up a bot to sow this disinformation, and I think that's really important for people to think about as well: whenever you see any comment online, you have no idea if that is a person, and, not just whether it is a real person or not, but what is the intent behind that comment? It's so dangerous, because AI just allows this to be done on a scale that was unimaginable even three, four years ago.
Jimmy Rhodes:I asked GPT-5.1, and it's done a bit of research. I mean, you'd have to check these numbers, but just as an example: there's been a study in 2024, and they reckon 37 to 39% of posts on Medium and Quora, if you use those. It's much, much lower on Reddit, apparently. There was actually a study of Facebook long form posts showing that it rose from 5% pre-2023 to 41% in November 2024. Um, and we're obviously a year down the road from that. There hasn't been any proper study on Twitter, but apparently it's plausible that 15 to 30 percent of tweets have some kind of AI involvement now. So, what I was talking about before.
Matt Cartwright:So when you talked about Medium and Quora and stuff, on Medium, are you talking about actually the content or are you talking about comments? Posts, it says posts. Wow. Because, I don't know if people use Medium, but it's kind of like Substack. Yeah, fairly long form content.
SPEAKER_02:Yeah.
Jimmy Rhodes:Um, it's saying a lot of long form content is AI generated. Now, I don't know, it doesn't distinguish between AI assisted, so somebody used AI to write something, and pure AI generated. So I'd probably caveat it with that, because it doesn't specifically say that in there. But the 15 to 30 percent on Twitter is almost more alarming to me. Like, if it's 30%, then one in three tweets you're reading is likely to have been generated by AI. It doesn't entirely surprise me, but it's quite mad.
Matt Cartwright:On that thing of long content, because I hate to keep going back to it, but, you know, I write long form content in China, which I write in English and then translate into Chinese. And I was saying to you today, obviously I use AI for a lot of it. So I'll have a subject that I'm interested in, I'll have certain things like a paper that I've read or a podcast I've listened to, and I'll reference that and a few other bits I want, and then tell it to go away: I'll give it a prompt with, you know, five or six sections, and I'll ask it to go and do deep research, then I'll drop that into a sort of AI project that has all of my previous stuff, and I'll get it to kind of rewrite that a bit. And then on any article I'll still spend at least a couple of hours rewriting and putting in personalised bits. The introduction is always personal. I'll put in my examples, I'll link it back to certain articles I wrote in the past, and I do all that work, and it takes me a couple of hours for one. I mean, the thing I wrote recently on HLA's, which is a three-part thing, that took me about six weeks. I was working on it a few hours here, a few hours there. I'd say I worked 20, 30 hours. But my point here is that the extra time I'm spending is to make it personal, and I sometimes question that.
Like, partly it's for me, because I want to write stuff, because the writing process is good and it helps me to learn; partly it's because I think it makes me stand out. But those extra hours add maybe five percent to what I do, and the problem is, if your audience is a really high-quality audience that wants that high quality of content, it's worth doing; but if people are just gonna read the first 10% of it anyway, because their attention span is not long enough, what is the point? So when we're saying two-thirds, or a third, of this content is just AI generated, possibly, people will be like, oh, that's outrageous. But actually, if you're reading it and you're happy with it, then you don't really care, I would say. A lot of people, I'm not saying most people, but a lot of people don't really care. It's kind of sad, and I've said for a long time, I think there will remain a market, which might be quite a niche market, for things which are not necessarily handwritten but, like we say, AI augmented rather than written by AI. But for the majority of stuff, if people can't tell the difference, it will very quickly kind of not matter.
Jimmy Rhodes:Yeah, and I think, I mean, it's an anecdote, but, you know, I say I don't watch social media, but I do watch some video stuff, and I was flicking through the Chinese Channels thing, which is basically like TikTok; it has a lot of TikTok stuff literally copied straight onto it. So on WeChat? Yeah, it's on WeChat, yeah. I was on there, and for whatever reason I've obviously come across funny cat videos and watched a few too many, and now I get them in my feed. But yesterday, I think it was, or a couple of days ago, I was watching a cat video and the setup was just too good. Like, it was cats messing around with dogs and animals messing around with other animals doing stupid things, and then I realised this is probably all AI generated stuff, and the moment I realised that, I was like, shit, this is what I said was gonna happen. Does it matter? And it's happened to me, for that kind of thing; for cat videos it probably doesn't matter, it was still funny, and you're right. But then the moment you realise, it takes something away, actually, even for that. Because the difference between watching a real interaction between some funny animals and just watching a bit of AI generated content, it does make a difference, even for something as banal as cat videos. It's a ridiculous example, I know.
Matt Cartwright:Yeah, because why is it funny? It's funny because it happened.
Jimmy Rhodes:I think so, yeah. Yeah, yeah, yeah.
Matt Cartwright:I think also, coming back to my example again, one of the things I'll always try and do is weave in personal examples, because the whole point of writing certain things is that I have a personal interest and I want to say something about it. But actually, I can just ask it. And I tried this: I just said, make up personal examples. And when I look at it, I'm like, well, that didn't happen. But for someone reading it, you could say it doesn't matter, because they don't know if it happened or not. The point for me is that with a lot of this stuff, the thing that gives it credibility, that makes it worth people noting, is the fact that you've got personal experience. And if it's just making something up, then that thing goes into the training data, and it feeds itself, and it's accepted as fact. The fact that, you know, I was able to float in the air, which didn't happen, just becomes fact, because it's picked up by another thing and another thing, and that goes into training data, and then when someone asks AI a question, it says, yeah, it is possible to just float in the air, because that's what the training data says.
Jimmy Rhodes:Yeah, and I think what you're getting at is basically what makes art art, in my opinion.
Matt Cartwright:Sort of, but it's also facts. That is one thing, but it's also factual correctness, because in a lot of examples small mistakes, which AI will still make because it is only referencing what it's read, could be vitally important. Could be really important.
Jimmy Rhodes:Maybe, yeah. I mean, it's my opinion, but I feel like a creative outlet is your creativity, and therefore it's your art, in a way. Like, I'm not saying your blog posts are ever gonna be worth what a Picasso is, but for sure an AI generated Picasso is never gonna be worth anything, because it wasn't made by a human. NFTs, mate. Well, yeah, they went well. Okay, I think we're getting off track a little bit. It's a really interesting debate, but just to bring it back to social media, um, the regulation of social media. So where are we? So Australia is obviously the first example.
Matt Cartwright:Yeah, Australia's the one that's done it, right? Um, but there are things that I think are kind of in train at the moment. So Denmark is apparently planning a ban for users under 15, but with exceptions for parental consent starting at age 13. Like, why make it more complicated? But okay. The UK is closely monitoring Australia's model and has signalled a willingness to advance age-restricted bans or time restrictions. Although, you know, when Trump says we're not allowed to do it, I'm not sure whether we'll be able to follow through on that. But let's see. We'll get tariffed. Malaysia and Indonesia have announced plans to impose age restrictions on social media in 2026. I'm not sure exactly what those restrictions are. Um, and then US states, and the obvious two, California and New York, are focusing on... Lefty ones, you mean? Yeah, lefty nonsense, these woke states. Oh dear. Um, I'm surprised Oregon aren't in there as well. I think I came out a long time ago with my political leanings. Um, so those states are focusing on regulating addictive feeds and strengthening platform accountability rather than outright bans, so basically doing nothing. You don't even know what that means. Yeah, the other thing here we should note at this point is strengthening privacy settings. So you can do it, you can do it in secret. No, I think this is like where parents have the parental lock that means you can't view certain things, right? So you can set things for your kids. My daughter has an Amazon Fire, whatever it's called, and you can put time limits on it, you can set it so she needs to do 15 minutes of education stuff before she can do other things. I would imagine that's what it is.
Jimmy Rhodes:The other thing at this point to talk about better incognito mode. Sorry, anyway. Sorry, carry on.
Matt Cartwright:Um, the other thing to talk about at this point is big tech's response, which, as you'd imagine, is just to lobby the shit out of it. Well, and/or threaten. Yeah, so you've already got this movement of, like, 15-year-olds in Australia who are, you know, campaigning, and they're all fully backed and funded by Meta, basically. Like, it's already been uncovered.
Jimmy Rhodes:So Elon Musk's now threatening entire continents, isn't he?
Matt Cartwright:Yeah, exactly. I mean, they're saying the blanket prohibitions are ineffective and harmful. I'm sure in some places they'll be claiming they're illegal. Meta argue that without an account, children will lose access to important parental controls and safety filters. Well, they won't need them, because they can't look at anything on them.
Jimmy Rhodes:Um that's mad. Yeah, what sort of argument is that? I don't know.
Matt Cartwright:But I mean it's this the main thing is just gonna be lobbying, right?
Jimmy Rhodes:This is the Zuck talking to Congress again, isn't it? Where they don't understand any of the words that he's using.
Matt Cartwright:I really like this. I mean, this is not a quote from them, this is a quote from Gemini actually, which is quite ironic, because Gemini is owned by the company that owns YouTube. But anyway: the underlying concern for big tech is protecting the pipeline of future users, as losing the under-16 demographic limits their long-term growth. I think that kind of sums it up perfectly, right? Like, well, yeah, the cynic in me is like, why would you not push back against this? I'd love to think that these companies had sort of a moral backbone, but why would they? They are what they are, right?
Jimmy Rhodes:I mean, sorry to go back to the smoking analogy, but literally this has happened before. Yeah, yeah.
Matt Cartwright:In the film, Thank You for Smoking.
Jimmy Rhodes:Well, exactly, like Thank You for Smoking. Coca-Cola, famously, in the 80s, basically being like, sugar's fine, it's fats. Well, you're still seeing it with big food.
Matt Cartwright:You're still seeing it with big food and big agriculture, and, you know, don't get me started on Big Pharma. So it's not a surprise, but yeah.
Jimmy Rhodes:No, but, like, this is something... I guess in a way, because it's kids, it feels worse. You'd think that if you've got even the remotest moral compass, you'd want to protect them.
Matt Cartwright:Yeah, to some degree, I feel that is kind of the case. Like, you know my views on Big Pharma, but you don't really see Big Pharma pushing things on kids in quite the same way. Okay, you could say vaccines, but you don't see it with pharmaceuticals. And I do... What about vapes?
Jimmy Rhodes:They're not pushed on children, are they? Bollocks, they're not directly pushed on children, they're just vapes. They're just not the pharmaceutical industry, are they? Well, no, but they're an industry that's subtly marketing flavoured smoking products to children.
Matt Cartwright:Yeah, fair enough. I just think the industry I hate more than anything, Big Pharma, even they, it feels to me, don't explicitly target kids. And the pushback, you know, when they've tried to ban junk food, or advertising around junk food, and reduce the portion sizes of chocolates and stuff like that, when it's been targeted at kids, I feel like even those companies have known that fighting it would just look bad, whereas big tech are just ploughing on with this. But I shouldn't be surprised. I shouldn't be surprised.
Jimmy Rhodes:I think you do actually get in trouble for advertising stuff directly at kids, though, don't you? If you started advertising Maltesers as one of your five a day, you'd genuinely get in trouble for that. But I don't know, does Facebook do that sort of thing? I think it's just infectious, isn't it? Kids are just on it. But also, I think with this stuff, it's where smoking was, you know, maybe 80 years ago, whenever it was that people were saying smoking was good for you, that it made you healthier and stuff like that. I mean, probably kids were smoking. Kids were up chimneys as well.
Matt Cartwright:When you say this, we talked in the last episode about how driverless vehicles will at some point make the idea of driving a vehicle seem crazy. Do we get to a day like that with social media? Because we all know social media is so terrible. Do we get to a day where we're all like, how did we ever think that was okay? Yes, I think so.
Jimmy Rhodes:If we're like in civilised society, yes, probably.
Matt Cartwright:I wasn't giving the caveat of civilised society. I mean in the world that we actually live in.
Jimmy Rhodes:Well, I don't know. Maybe not. I feel like things are heading in that direction, but then at the same time, at the moment you've got the whole US-versus-everybody attitude, and all these companies are in the US. But yeah, I think probably we do end up there at some point. It's so dangerous, certainly with respect to kids.
Matt Cartwright:I think a lot of this stuff is going to come out, and say we do get to that world with kids. But you said at the beginning of this episode you don't see how a ban can work. So how do we get there?
Jimmy Rhodes:Oh no, I mean Australia's ban. I think if it becomes, maybe not entirely global, but an almost global ban. I think this is one of those things where, over time, things like the Australia ban will happen in more and more places, and then it will just become recognised by more and more people: how did we ever allow kids to do that? And so, like you say, kids themselves will be like, oh yeah, you just don't do that, at some point in the future.
Matt Cartwright:And parents. Kids know you're not allowed to smoke or drink alcohol. You try it occasionally, but you don't openly sit at home smoking in front of your parents.
Jimmy Rhodes:Yeah, maybe some kids will still sneak off round the back of the bike shed and go on social media, but it'll be much, much reduced. And why would you do it anyway if your mates aren't all on it?
Matt Cartwright:I'm in danger of taking us off track a little bit, but I do want to say something here, because going back to something we talked about last year, I said one of the good things I could see with AI is that it moves us away from screens, because we get to a point where we have a more natural interface, right? I'll come back to things like the Rabbit AI device and those gadgets where you basically talk to them. So I do wonder if that more natural interface becomes a way to reduce our dependence on social media, or do we just end up with some medium of social media that isn't visual that we haven't yet thought of?
Jimmy Rhodes:I think probably the latter. I I don't know, like if anyone could directly inject ads into your brain, I'm pretty sure they'd do it. Exactly.
Matt Cartwright:Um, I don't know. Maybe we won't need a screen because we'll constantly be interacting with other people on social media through a chip implanted in our brains. Yeah, exactly.
Jimmy Rhodes:Maybe Neuralink will get there first: you can go to the bathroom, but only if you listen to an ad first.
Matt Cartwright:Right. I wanted to talk about a famous book. Anyone listening from the UK, I think, will have heard of it: The Anxious Generation, a kind of viral book by Jonathan Haidt, which basically argues that the great rewiring of childhood is causing an epidemic of mental illness. So this is about the replacement of a play-based childhood with a phone-based childhood, and the claim that this is the primary cause of a catastrophic rise in things like anxiety, depression and self-harm among adolescents in the US and other English-speaking countries since the early 2010s. He's very specific here: two technological shifts happened between 2010 and 2015, or at least their mass adoption happened then. One is the rise of the smartphone, so the introduction of affordable smartphones, and two is the dominance of social media feeds. The smartphone came first, 2010 to 2012, and then the domination of feeds, when we moved away from desktop social media like Facebook to things like Instagram and TikTok and this highly algorithmic feed that just serves you the same thing over and over again. This phone-based childhood is characterised by what he calls social deprivation, structural sleep deprivation, fragmented attention and social comparison. Let me keep reading this bit, because I think it sets the scene quite well. The onset and magnitude of the mental health decline, and this is well recorded and documented, began sharply in 2012 for girls and about a year or so later for boys. There's a big gender gap: the crisis is significantly more severe for girls. I think a lot of that is probably related to self-esteem issues and issues around appearance.
Between 2010 and 2019, rates of depression, anxiety and self-harm in girls rose across all demographics. In the US, I think the rate of self-harm rose by 100% over that nine-year period, and emergency room visits for self-harm and suicide attempts increased significantly from 2012. And then the other point, the one my colleague was really talking to me about the other week, is that the rise in screen time correlates with a decline in real-world social interaction. The percentage of young people hanging out with friends in person every day dropped by over 40% between 2007 and 2021, and that decrease in unsupervised, face-to-face social time takes away children's ability to practise things like social intelligence, conflict resolution and emotional resilience. The really important thing here, I think, is that by not learning those skills at that age, when you become an adult you don't have the skills to deal with conflict, because you haven't had the experience of resolving things. It's all been done online, where you don't have to resolve it; you can just turn it off and walk away. The example my colleague gave me, and I remember this from school, is that you used to "break friends" with people all the time, right? You'd have an argument and be like, I'm not your mate anymore, and then you'd have to learn to come back the next day and be like, yeah, I'm your mate again now, sorry. That sounds like a really small thing, but if your conversation is just online, when it finishes you just turn it off and walk away from it. You never resolve the conflict. This, apparently, is one of the key reasons mental health is being affected.
Jimmy Rhodes:Maybe that's why Teams chat's so rubbish at work.
Matt Cartwright:Maybe, but it's why it's being affected so much by this kind of interaction: your brain's operating system is not developing at the point when it's supposed to. And then when you're supposed to be ready for the real world, you don't have these skills, you don't have the resilience, you're not able to resolve conflict, you don't have social intelligence, because you haven't had that engagement.
Jimmy Rhodes:Yeah, I feel like there's something to this. I hadn't heard this before, but, and I can't remember the exact percentages, when you do this stuff around communication, they say body language is a large percentage of communication, and the next largest is speech and tone of voice and things like that. And the lowest one is text: if you're getting an email or something, you're getting maybe five percent of the communication modes that you rely on, whereas when you're talking to someone face to face, you're at a hundred percent.
Matt Cartwright:How often do you read an email and you're just like, fucking hell, why are you taking that attitude? And it wasn't that at all. It's just the style of the writing, or you haven't understood that they're trying to have a dig. You just don't get it, because you don't see the person; we're not designed for that. It happened to me this week: I went to reply to an email and thought, I'll just go and speak to that person, because if I send this email back it's probably going to affect our relationship. But if you haven't got the skill to do that, you don't go and fix the relationship, you just send the email. Or the message, whatever, not just email.
Jimmy Rhodes:Email's a bad example, but anything that's text. And to be honest, the stuff you see online is horrendous. People say things to each other online that they would never say to anyone face to face, because you'd probably get physical violence off the back of it. I still play video games online, I play Rocket League, and people are vile to each other in these online environments because there are no consequences. And again, it's the same thing, right? If you're physically face to face with somebody, you just wouldn't say those things or behave that way, because there would be consequences. It's just really vile behaviour. Awful.
Matt Cartwright:I think one of the things that really surprised me about this, because I listed those four points: social deprivation, structural sleep deprivation, fragmented attention and social comparison. There's the thing where girls in particular, because girls in their teenage years are constantly in this state of social comparison about their appearance and their popularity, and body image issues are so significant at that point. The algorithm thing we've talked about to death; I don't think we need to talk about it any more, but it's sending you down this rabbit hole of content about self-harm and eating disorders, and for young men it's political ideologies and misogynistic tendencies. But the one that really surprised me is the structural sleep deprivation, because I talked to you about this recently, saying how I think the number one thing affecting your health, if you're otherwise in reasonable health, is probably sleep. I think it's that important. And I don't remember, when I was younger, knowing any teenager who couldn't sleep. You'd just lie in bed until 11 o'clock on a Saturday if you didn't get woken up. But now, apparently, you've got this epidemic of young people who can't sleep. And they have the phone. You said you took the phone out of the bedroom, which is quite interesting. I remember with the TV, I could hear my mum coming upstairs and I'd quickly turn the TV off at 10 o'clock at night. Now people just lie in bed scrolling through their phone.
And then they've got FOMO, the fear of missing out on everything, so they can't sleep because of that, and they can't sleep because they've been looking at a blue light the whole time. They're deprived of sleep, and that affects their immune health, their emotional regulation, their memory. It increases the risk of depression and anxiety. This is physical as well as mental.
Jimmy Rhodes:Yeah, yeah. I don't know who's letting kids take phones to bed with them at night. They shouldn't be doing that at all, as far as I can tell. But, um, you've got your funky orange glasses on, actually. Care to explain yourself?
Matt Cartwright:My orange glasses are blue light blocking glasses, um, but they also dim the light, so they're supposed to help my circadian rhythm.
Jimmy Rhodes:Yeah, and help you sleep. I think they also make me look good. They do look pretty cool, actually, I'll give you that. But yeah, the sleep thing makes total sense. I only started doing the phone thing recently, and honestly, even in a week it's been a game changer. That and not drinking beer; drinking beer is pretty bad for your sleep, but just not having my phone in my room has made almost as much of a difference. I wouldn't say I was terrible before, I'd probably still fall asleep within an hour of going to bed, but now it's back to: head hits the pillow and you're out pretty much straight away. Because you've got nothing else to do.
Matt Cartwright:Isn't this the problem with this stuff? You've started doing this now, but I don't think there's any person we know who doesn't know this, right? That it's bad for you. I'm not saying there aren't people out there who don't know, but of the people we know and hang around with, I think everybody knows that looking at your phone before bed, and social media, are bad for you. But not many of those people have the motivation or the willpower to do anything about it.
Jimmy Rhodes:Oh, I've only done it recently. And it's interesting: me and my wife, we've been sticking to it. But I've started staying in the living room a bit longer and being on my phone there instead. So it's not that I have this amazing amount of willpower, it's just that I'm determined not to take my phone into the bedroom, and it is better. I realise I need to go to bed, so I'm not spending as long, but I have started going on the couch for a bit and going on my phone there, because I know it's not coming into the bedroom.
Matt Cartwright:So I feel like we live in an age where everybody thinks they've found the one thing, right? We talked about this: everything is insulin resistance, everything is caused by COVID, everything is because of social media, everything is because of mitochondria. I'm actually listing all the things I think are really important, but my point is it's not one single thing. With social media, though, the more you see this stuff, the more you realise it's not just a case of, oh, it's really bad for us because of the algorithm. I think that's the one thing everyone accepts: the algorithm drives us to stuff we shouldn't see, puts us in an echo chamber, makes us doomscroll and feel depressed. But then you look at this other stuff and find out it's destroying people's sense of self-worth and physically stopping them from being able to sleep, which, by the way, when I say that's really bad for you: one of the ways it's bad is it massively depletes your immune system, possibly by more than 50%, from being sleep deprived. And when I say sleep deprived, I mean not sleeping well for even one night. Severe sleep deprivation literally makes you insane, but even sleeping only two or three hours for one night, you might feel fine, but it has this huge effect. I was listening to a sleep expert talking the other day, and they were listing famous figures, because there's a suggested link between sleep deprivation and Alzheimer's disease, right?
Margaret Thatcher was one, Ronald Reagan was another, people who were famous for only sleeping four or five hours a night, and everyone on this list had dementia or Alzheimer's. Wow. And I'm not saying, you know, this isn't someone trying to make a point, but you can see how many things are being linked to this that we're only starting to find out about. Because if it's affecting your sleep, it's affecting your self-confidence, and all of these additional things. It probably affects how you eat, and it affects your eyesight, that's for sure. Yeah. Jimmy's just put on a pair of my, are these not mine? TrueDark glasses. Are they not mine? I don't know, are they yours?
Jimmy Rhodes:They look like mine, yeah.
Matt Cartwright:Maybe they are yours. Oh, I recognise them.
Jimmy Rhodes:Anyway, yeah, I'll pop them on for the rest of the day.
Matt Cartwright:As we don't do video on our podcast, no one knows what the hell we're talking about.
Jimmy Rhodes:No. We're both wearing shades in Matt's bedroom, I think. It's not my bedroom, this is the office. Studio, yeah. Um, right, sorry. So yeah, this is the thing, isn't it? With social media you can't put the genie back in the bottle, or whatever the phrase is, but I think it's something we have to deal with. It says here: epidemic, mental illness. Some of the words on the screen: sleep deprivation, fragmented attention, social deprivation. All the words on the screen here are awful, and I think we all know these are real things, and particularly that you've got to protect kids from this stuff. As you said earlier on, I don't think we should expect any company to do anything that's not in the interest of their shareholders, basically. So it's obviously going to have to come from governments and social institutions putting these protections in place, right? But all the stuff you're looking at there just sounds awful. I'm not saying that people should start smoking, but it feels like it might be worse than that. Maybe that's a hot take.
Matt Cartwright:No, I don't know if it is. There was something else I saw recently that was compared to smoking, and initially it sounds like nonsense, but then you look at it and realise: okay, smoking does have a few effects, but smoking is largely a physical harm, right? It affects your heart, your lungs, your arteries, etc. But it doesn't affect your mental health in the same way, and it doesn't stop you from sleeping. Think of the number of ways social media is affecting you. There's no single pathway that's as bad as smoking, but where smoking affects your lungs, your heart and your arteries, social media is coming at you from about 50 different angles. It's affecting family life, your relationship with your partner, your relationship with your kids, their ability to study, your ability to sleep, your eyesight, your diet, your attention span. It's affecting so many things. Because of the social media ban we've focused a lot on kids here, which I think is right, but the episode was never supposed to be just about kids, right? And I've talked on this episode, and previously as well, about the effect it's had on me and on people all around us.
This is not a thing that is just about children. And one of the things that actually concerns me the most is older people. One thing that shocks me now, and it's something I see here because I live in China, I don't know if it happens as much in other countries, is seeing people in their 70s and older just sitting for hours and hours swiping through TikTok videos. And the problem for a lot of those people, and this is where AI comes into it, is they're not able to understand what's real and what's not. None of us are completely able to, but they don't understand this technology, so they're very easily influenced into thinking something has happened, and then that rubs off on their relationship with their children and grandchildren, and what they say, and how they perceive the world. Yeah, it's affecting every age group. Exactly. In many ways they're as vulnerable as the young people, for different reasons, but you're right.
Jimmy Rhodes:It is typically more elderly people who are targets of scams and things like that, and that's a perfect example. It's partly because they haven't necessarily kept up with the rate of change of technology as they've got older, and some of these things are just hard to keep up with. I can see AI being abused for that quite a lot, especially within social media: getting used in scams, like we talked about before. It's either possible now, or we're very close to it being possible, for an avatar of me to call my mother and scam her out of money or something like that. And social media is the platform through which that's going to happen, yeah.
Matt Cartwright:Yeah, maybe this is something people know and understand more about than we think, but again, giving the view from China, it's useful to give a bit of context. There are obviously other social media models in the world that are not Chinese or American, but we had a look at the valuations. I think you've got the list, and then the biggest non-US, non-China one?
Jimmy Rhodes:Yeah, so the reason we're focusing on those models is that we did do a bit of research, and they effectively dominate. Looking at the numbers, if you take a pure social media company, the best example was Meta, I think, which is worth 1.4 trillion dollars, obviously one of the big social media companies. There were also examples like Microsoft and Google.
Matt Cartwright:The bottom of your list of really big ones was Tencent, which owns WeChat, at 0.78 trillion. So about half of that value, but still massive. And then the next one on the list that wasn't US or China, the biggest, was Korean.
Jimmy Rhodes:Yeah, the bottom one, the one that rounded out the top 10, was worth 39 billion, which sounds like a lot of money, but this was a top-10 social media conglomerate from South Korea, and it's about one thirtieth the size of the US giants.
Matt Cartwright:Yeah, and remember the next one up from it on the list was 0.78 trillion, so it's a massive gap between them. That's why we're only talking about the US and China. So, the US governance model is market-driven, right? Rapid, largely unregulated innovation, a focus on engagement and revenue, so basically a capitalist approach. China's model is state-driven: it prioritises national strategic goals and social stability, which is something you see in the content. Even with the algorithm, if I post something promoting the Healthy China 2030 campaign, it will definitely raise my engagement through the algorithm, because I'm promoting the message the state wants, whereas in the US model it's purely about what people want to buy and what people want to look at. But I think China's algorithm is ahead of the US one, as a pure algorithm. That's a bit of a take, innit, really? Well, it is ahead, based on the fact that they own TikTok, which has the best algorithm.
Jimmy Rhodes:What do you mean, the best algorithm? You mean the best algorithm for protecting people? No, I mean the opposite.
Matt Cartwright:I mean the best algorithm for engagement is TikTok's. Douyin is the version of TikTok inside China, TikTok is the version outside China. Oh sorry, so you mean the best for engagement? Yes. Which is the reason the US wants to buy it, because it's just so good. Right, okay. And it's the reason China has a different version inside China, because it wouldn't allow its own people to be subject to the TikTok algorithm.
Jimmy Rhodes:Well, this is where I thought we were mixing up two things. One part of this was about algorithms being controlled by the government, because they want a level of control over them. Which China is definitely ahead on. Yeah, well ahead. But I think a lot of our listeners, especially in the West, would say, well, yeah, but that's just controlling what you get to see and what you think, right? Which is the argument against all this. But in terms of protecting children, and protecting people in general, from some of the harms of social media, possibly this is a benefit in that sense, right?
Matt Cartwright:There's definitely a counter-argument to that. But my point was, you're right: China's ahead in algorithmic regulation, so China has more control over it, and the US doesn't really have any control, so it's pretty much a free-for-all.
Jimmy Rhodes:And my point was that people would argue that's free speech, which people would defend to the hilt.
Matt Cartwright:Where China is ahead: you were talking about the biggest social media companies, and I said, well, what about ByteDance, which is the parent company of TikTok. Douyin is the Chinese version of TikTok, which actually came first, but anyway. The algorithm used on Douyin inside China, which is basically TikTok in China, is not as addictive as the algorithm used on TikTok outside China. And the understanding is that that's because of algorithm regulation: China doesn't allow it to be so addictive, because China wants to use it more to ensure the right patriotic themes, "positive energy" and so on are promoted. I'm sure the algorithm itself is just as powerful, but they don't employ it in such a powerful way; it's been turned down. And that is the reason Trump wants a US buyout: whatever they say the reason is, fundamentally it's that the algorithm is so powerful that they want to, one, understand how it works, and two, not have China controlling an algorithm that's able to digitally enslave the rest of the world.
Jimmy Rhodes:I don't know about this, so I'm not suggesting it is, but do you think the algorithm is actually a better algorithm, or is it that short-form video paired with an algorithm is just really good at figuring out what people like?
Matt Cartwright:It's both. Short-form video is obviously the most addictive form of social media. But no one has been able to replicate the algorithm, which is why things like Reels and Shorts, all the stuff on Instagram and Facebook, aren't as good. They've all tried, but they can't get anywhere near it. I listened to an interview with an expert on this, and he said the reason it's so good is that it so quickly picks up on, and is so ruthless in only promoting, the content that's going to be popular. On TikTok, and this is TikTok outside China, not Douyin, it doesn't care what the message is, it doesn't care whether it's going to promote a useful account, it doesn't care whether it's going to sell anything. It literally pushes any content it thinks people will want to see and engage with; that is the one thing that decides whether it's pushed. And the content people most get addicted to is shocking, frightening, fear-driven. It's not the stuff that rewards you with nice cat videos, it's the stuff that makes you go, oh my god, the world's going to end, and then I can't stop flicking through it.
Jimmy Rhodes:Well, that's the stuff, isn't it? But yeah, yeah.
Matt Cartwright:Well, no, but I think it's more than pranks, because you see that side of it, like the latest TikTok dance, but that's just going around because it's viral, right? I think that almost hides the sinister side underneath. Let's take the pandemic. If I sound like a Philistine on this, it's because I've literally never downloaded TikTok and I'm probably never going to. But in the middle of the pandemic, what would sell the most? What would people look at the most? Was it another video of how to make sourdough bread? Was it something about how the world's going to get better? Or was it look how many people are dying on the streets? It's the third one, so that's the thing it promotes, because it has one criterion, which is what will people watch. And what people will watch is fear, doom, scares, anxiety. That's why it's so powerful. Yeah, and Douyin inside China isn't quite the same, because it's trying to promote values, and that's why the US wants to buy it and control it. Yeah, anyway. Um, so in terms of integration, in the US the famous ones are Meta, TikTok globally, and then YouTube. I think AI is heavily integrated into recommendation systems, and it's used for content moderation, which a lot of people find very inconsistent. I know a lot of the bans that have come in for really weird reasons in the last few years are because AI has basically picked things up and got it wrong.
Jimmy Rhodes:Yeah.
Matt Cartwright:Um there's a ton of that, yeah.
Jimmy Rhodes:I mean, there's also people abusing the system and stuff, but yeah.
Matt Cartwright:Yeah, there are, but I mean the thing with AI coming in now is that it's banning weird stuff, and that kind of drives conspiracy as well. I think it's just inconsistent. Yeah. Chinese platforms are Douyin, which is, like we said, Chinese TikTok, and WeChat, and Xiaohongshu, which is kind of like Chinese Instagram. Little Red Book, I think it's called in English.
Jimmy Rhodes:It was the one that became really popular for a while in the West, didn't it, when TikTok was about to be banned. Yeah, so that's Xiaohongshu.
Matt Cartwright:And people were actually going on Xiaohongshu, and I see quite a lot of videos of Americans going, hey, life's so terrible here in the US, you've got it so good in China, and those are getting promoted the shit out of, of course. Yeah, yeah, yeah.
Jimmy Rhodes:But everyone was quite impressed by how wholesome it was, wasn't it? Compared to TikTok, anyway.
Matt Cartwright:But it's interesting, because on Chinese platforms the use of AI is slightly different, isn't it? It recommends content, but like I say, here it focuses on what the country needs it to focus on. Coming back to the pandemic again, I remember how on social media, when China opened up, the narrative literally changed overnight. Yeah, yeah. From how terrible the West was and how they had failed to control this, to actually COVID's just a cold and there's no reason to worry about it. And it just changed. By changing the algorithm, it was then able to represent the patriotic theme that was needed at the time. So the AI systems are integrated into it, but they're integrated to achieve different aims. They're not there to achieve commercial aims and just drive content; they're there to drive content towards the things the party wants. Yeah. Which is not me judging which is right or wrong. Actually, I'm quite confused about what I think. My view has changed over the years, especially since I've had kids, but I'm really torn. Which probably leads into our next section, free speech.
Jimmy Rhodes:It's the same old argument with this stuff, isn't it? It's all good as long as it's actually good for you and it's benign, basically. All this wholesome stuff, all this good stuff that comes out of the system in China is great, so long as it's to your benefit. It's not censoring you. Exactly.
Matt Cartwright:It's all right, yeah.
Jimmy Rhodes:Exactly, exactly.
Matt Cartwright:So I sort of segued into it. Free speech. We could probably do an entire episode on free speech, so we're going to have to limit ourselves. But did you want to kick off on this bit then? Social media and free speech.
Jimmy Rhodes:So the line that all the big tech companies are pushing, that Elon Musk is pushing after he bought Twitter, and that's basically in the corporate interest of all these big social media companies, is basically absolute free speech, isn't it? And actually it wasn't like this before Trump got in. It almost switched overnight when Trump got in power. Obviously, leading up to the election, there was Elon Musk buying Twitter, which probably influenced the election, to be honest. I'm not going to get too far down that road, but let's put it this way, there was a lot more cautious moderation going on. Whether you agreed with it or not, and whether you thought it was left-wing moderation or not, there was a lot more of a sense of moral responsibility, it feels like, whether it was rightly directed or wrongly directed, in these social media companies. And that's gone now. They're all big tech companies in America, and it's a free-for-all, and it's absolutist free speech, is what it feels like. That's my opening statement.
Matt Cartwright:Yeah, and the thing we talked about earlier was how the narrative has always been owned by media, right? So this is not a new thing. You've now got these tech oligarchs in charge, sort of tech feudalism, as we talked about in a previous episode, but previously you had newspaper barons and media barons and so on who were in charge of it. I think the difference with printed media, and you still have this to this day, is that if you said something that was not true, then there was the whole idea of libel. You could be sued and you'd have to publish an apology, and so you were careful about what you did. That still exists: people will not publish things that are going to get them in trouble, to some degree, or you have to be really sure it's going to make enough money to be willing to risk it. Yeah, but on social media platforms that doesn't exist. And the thing I find really interesting is that as we've moved to this world in which it's a free-for-all, the wild west online, where you can say whatever you want with no consequences, we haven't then seen an attitude of, oh well, there's nothing we can do about it anyway, so censorship doesn't exist and it's a free-for-all everywhere. You still can't say what you want in printed media. I still can't write a book just making things up and lying about stuff. I can't make a video and put it on TV that says it, but somehow I can put that on social media. And now the viewership and readership of social media is in many cases bigger than traditional media, so it's not like it's a minimal thing, just some little dark corner where weirdos hang out.
So we have this world with two systems. We say, well, there's no way we can make it work because they're platforms, but then we have conventional media, where Netflix is a platform, a newspaper is a platform, a book is a platform, and they are regulated, while at the same time social media is not regulated. You've got two systems running in parallel.
Jimmy Rhodes:Yeah, I mean the distinction is that one is a platform, and news companies are governed by completely different rules because they're classed as, I can't remember what it's called. Yeah, but only because of a law.
Matt Cartwright:But then that's only because you don't have a law saying a platform is one too. That's the only reason why.
Jimmy Rhodes:But you can't do that to platforms, it's impossible. There's no way YouTube could moderate all the content that's put on YouTube, or Twitter could moderate all the content on Twitter, and fact-check it. That's literally not possible. They wouldn't be able to exist. That being said, you actually can't just say whatever you want on social media. You can still get in trouble for defaming people, insulting people, doxxing people, and doing other things. You can get in trouble for things you do online, but it's a bit more of a wild west in the sense that people can have any opinion they want and stand on a global soapbox and say it online.
Matt Cartwright:So I agree with you, but you're sort of making their argument for them. When you say you can't have that online, why can't you have it online? One thing with AI, for example, just as an example: if you write content, it is feasible to say every bit of content has to be reviewed by someone before it goes on there. I'm not saying every comment, because you'd have to differentiate different things. People commenting is essentially just people talking at each other.
Jimmy Rhodes:But that's the entirety of Twitter. Yeah, so maybe you can't. And what's the difference between that and a video of you saying something? What's the difference between a tweet and a video of you saying something? You can't make that distinction if you're going to say everything online needs reviewing.
Matt Cartwright:No, but what I'm saying is, compare it to a newspaper story, right? That has to follow certain rules, but me writing an article on Substack doesn't have to follow those rules. To me, those are more comparable than a tweet on Twitter and a book; those are very different. Okay, in an ideal world you'd have some way to triage all those messages, but then you get down the whole fact-checker road: who is the one who decides what can and can't be said? And you're making out like the stuff that's put in newspapers is fact.
Jimmy Rhodes:The only thing that newspapers can't really do is say something about somebody that's incorrect. So they just have to be careful with their words. No, no, no.
Matt Cartwright:Everything in newspapers is true. I saw the Daily Star today. Yeah. Freddie Starr ate my hamster.
Jimmy Rhodes:They literally have opinion pieces by Jeremy Clarkson in newspapers.
Matt Cartwright:That is fact.
Jimmy Rhodes:Yeah.
Matt Cartwright:Whatever Clarkson says. Yeah, so, you know. No, but they can be held to account if they say something that's not true. But theoretically.
Jimmy Rhodes:Yeah, in what situation? I mean, if you say something in a newspaper that's not true about somebody else and you print it, then yes, you can get done for defamation. But you can get done for that online as well.
Matt Cartwright:I almost feel like we're getting to a point here where the answer is you either just don't have social media, or you opt out of it, or you somehow have to completely self-manage, just don't engage with it if you know it's bad for you. Otherwise we're kind of offering no solutions here.
Jimmy Rhodes:How would you moderate social media if you were to moderate it? Forget about the newspapers, because they are held to different standards, but also they're not a platform where anyone can just go on and say something. A newspaper has reputable journalists; if nothing else, they've got their reputation to uphold. If you're just someone who wants to spout off on Twitter, that doesn't exist. So how would you moderate it?
Matt Cartwright:So I think you can't stop it completely, but what you can do is up the stakes. I don't have a full answer, and I don't know how long the window would be, but I think with AI there will be a way in which the algorithm, as much as it can steer stuff in a certain direction, can also detect stuff. And therefore, if the platform doesn't shut stuff down within a certain point, however long that is, 24 hours, 48 hours, 72 hours, and minimise the damage done, then it faces legal repercussions, which is basically fines, and it has to be enough of a fine that social media companies put more and more money into controlling it. The fear I have with this, hold on a second, the thing it always comes back to, and I guess this is an argument for free speech, is that whenever you suppress something, who decides what you can and can't suppress? That's what I'm really uncomfortable with: who decides that this thing's not okay to say but that thing is? It just becomes someone else having that power.
Jimmy Rhodes:This is where I think there's an argument to be had. The reason I said you've already given the answer earlier in this episode is because you talked about algorithms in China and algorithms in the US. This is all being driven through algorithms, right? And to be fair, I guess the free speech version, the free algorithm version, is what we have right now: you just let it loose and it finds what actually gets the most engagement out of people, which is the worst stuff for people. Exactly. It has been demonstrated, there are now studies showing that this results in doomscrolling and negative effects on psychology and mental health and all the rest of it. So if that's the problem, then for me the regulation is not a matter of this is banned, this isn't banned. The regulation is you tweak the algorithm: you say, right, you just have to put a bit more positivity in your algorithm. And what a fantastic solution that would be.
Matt Cartwright:Isn't that going down the Chinese model, though? You're promoting social stability and national harmony and communist values. I'm not saying that's a good or bad thing, but it's the opposite of freedom of speech, then.
Jimmy Rhodes:So, yes and no. I mean, you're deciding for people what they should be interested in, basically, is the argument. But arguably you're already doing that.
Matt Cartwright:You're already doing that by deciding that what's in their best interest, or the best thing for them to view, is things that will engage them but ruin their mental health. So you've already made that decision.
Jimmy Rhodes:But we collectively get to decide this, right? At some point in the future, if there's a democratic party that includes as part of their manifesto that they're going to bring in Jimmy's concept of a happier algorithm, then you can all vote for that. Jimmy's law, it'll be called. Yeah, you all vote for it, and eventually you get the democratic version of some sort of, like, this could happen, right? It's the utopia, Jimmy. We're back to the utopia. It doesn't have to be communism, that's an extreme. We can all collectively decide that having algorithms that make us more and more self-destructive is not a good thing. And so, okay, when you're doing your algorithm in the UK, Facebook, Twitter, whatever, it just has to be a bit more positive.
Matt Cartwright:Do you know what? The more I think about this, let's throw out the freedom of speech argument for a second, and again, we're not here to promote China and how great it is, but the one thing you can say about the way China manages the algorithm is that the reason they're doing it is for their social harmony and for what they think is good for their country. So take the UK as an example. If you think there is a net negative on society from this, and that by tweaking the algorithm, and like I say, this is not to say you censor everything like China would, but you tweak the algorithm in such a way that it plays in a more positive direction, because that is good for your national society and harmony, which I think we'd all agree is important, then I think that's not a bad thing. It doesn't mean you have to go as far as China in regulating and censoring things. It means, like you say, you use the algorithm and tweak it towards the positive. You just turn it a little bit more to the right.
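Sticking with the earlier toy recommender, the "tweak the knob" idea being proposed here can be sketched as one extra term in the ranking score: a regulator-set positivity weight blended against raw engagement. Everything below, the field names, the sentiment scores, and the blend factor, is invented for illustration; no real platform is being described.

```python
# Hypothetical "happier algorithm" knob: blend a positivity signal into
# an otherwise engagement-only score. sentiment is assumed to be in
# [-1, 1], where -1 is doom/outrage and +1 is uplifting.

def blended_score(item, positivity_weight):
    """Turn positivity_weight up and the feed shifts; at 0.0 it's
    pure engagement ranking, exactly the status quo."""
    return ((1 - positivity_weight) * item["engagement"]
            + positivity_weight * item["sentiment"])

candidates = [
    {"id": "doom_clip", "engagement": 0.95, "sentiment": -0.9},
    {"id": "wholesome", "engagement": 0.70, "sentiment": 0.8},
]

# With the knob at 0.0 the doom clip tops the feed; at 0.3 the
# wholesome clip overtakes it, without banning anything outright.
for w in (0.0, 0.3):
    top = max(candidates, key=lambda item: blended_score(item, w))
    print(w, top["id"])  # → 0.0 doom_clip, then 0.3 wholesome
```

The design point the discussion lands on is visible in the maths: nothing is censored or removed, the regulation only changes the relative ordering, which sidesteps the "who decides what's banned" objection while still facing the "who sets the knob" one.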
Jimmy Rhodes:And that's if you imagine, in your wildest dreams, that algorithms haven't been manipulated already. At the moment you've got Elon Musk in charge of Twitter. How many times has he fiddled with the algorithm a bit to try and direct people to think the way he wants? That's happening already.
Matt Cartwright:It's madness to think we live in a world where the algorithms are all just free. And bringing it back to AI, it's what happened with, was it 4.1 or whatever, the version of ChatGPT where it became too sycophantic, right? They just turned the algorithm too far the other way and it started sucking up. That proves you can tweak the algorithm in a direction. You tweak the algorithm so that instead of giving people things that make them scared, you give them more of the things that make them feel happy and content. And that might mean they're not always seeing the truth, but you're not seeing the truth anyway.
Jimmy Rhodes:Because the other thing with the algorithm, right, is that the algorithm drives the content. So if everyone's doomscrolling and that's what people are interested in, you get more content creators creating that crap, or more versions of that crap, for people to watch, and it doesn't matter whether it's true or fact or reality or whatever. That's already happening. Loads of content creators have complained about having to put thumbnails they don't really want to put on their videos, because that's what wins with the algorithm: it gets more views, it gets more engagement.
Matt Cartwright:Even with your titles, what I have to do is put panic in the first bit, exclamation mark. Yeah, we've done many. Don't panic, yeah.
Jimmy Rhodes:And you don't want to do it, but do you want more people to listen to you or not? So yeah, I think it's all fairly self-destructive. I think we've come to an agreement. We've solved it. We're not saying we want to turn everywhere into a communist country or communist state or a communist version of itself, but there's probably a lesson in here somewhere. Yeah, exactly. Cool.
Matt Cartwright:So we've solved social media. I don't think that was the intention of the episode.
Jimmy Rhodes:But yeah, now we just need to become CEOs of these social media companies.
Matt Cartwright:I say we just need control of, you know, this knob here. My knob. The knob on my microphone. The microphone thing. What's it called? It's not called a microphone thing, is it?
Jimmy Rhodes:Um probably not.
Matt Cartwright:No. It's the little box that our microphones go into. Yeah. Is it just a thing you tweak left for more evil and doomy, and right for more good and happy? Is that all it is? That's how mine works. Elon Musk just needs to tweak it and everything will be better. Um, yeah. So if you're listening, Elon, and Pony Ma, well, not Pony Ma, because he's already turned his in the social harmony direction. But who else have we got?
Jimmy Rhodes:The Zuck. The Zuck, yeah. I don't know who's in charge of, what's it called, Google, who's in charge of the social media lot at Google, YouTube.
Matt Cartwright:Well, anyway, whoever you are, we don't know who you are, and we'd probably butcher your name if we did. The Zuck, Elon Musk, Sachin Nutella. Sachin Nutella, yeah. Twist your knobs a little bit further to the right, and then the world will be a happier place. And we'll all be good. And on that note.
SPEAKER_01:We're drowning in the dopamine glow, trading presence for the endless scroll, disconnection, human.
SPEAKER_00:To write our own lives on our own page. Even censorship's honest tyranny beats the illusion we're still free.