Kids Law
This is a podcast about children and the laws that affect them as they grow up. Alma-Constance, our host, decided to start the podcast in 2021, when she turned 10 years old. Living in England, UK, she discovered that at 10 she had reached the age of criminal responsibility, one of the youngest such ages in the developed world. That was a pretty shocking discovery for her, especially as she realised that she and her peers knew nothing about what this meant in practice and how it can affect children and their families. With the help of Lucinda Acland, a lawyer, and supported by Next 100 Years, they set out to put questions to leading experts to help children make sense of it all. There are a lot of laws that affect #children as they grow up; they are confusing and complicated, and can affect all aspects of their day-to-day life, from #education to online protections, or at home if families break up. It is difficult to keep track of the laws and to understand how they impact a child's life. Alma-Constance is determined to help #TeachKidsLaw at a much younger age, so that children grow up into adults confident in their legal knowledge. Understanding how the law works and being able to grasp complex concepts of #justice and #ruleoflaw will help anyone as they navigate their lives. You can email us at kidslaw@spark21.org or reach us on our social media channels and our www.kidslaw.info website.
Online Safety Act - what is being done to keep children safe online
Adults might want to listen to this episode before listening with young children, as we mention the words self-harm, suicide, and eating disorders in the context of examples of harmful online content.
The new Act aims to protect young people from harmful content online and places new obligations on online platforms, making them more responsible for their users' safety.
Alma-Constance and Lucinda speak to Jess Smith, Ofcom's online safety director, who explains:
· How the Act will stop children being harmed by online activity.
· What tech companies will have to do to protect vulnerable people.
· How age checks will be enforced and inappropriate algorithms changed.
· How children have been involved in discussing their online experiences.
· What Ofcom can do to ensure that online providers comply with the new laws.
When Jess was 10 years old, she says she was very chatty, into climbing trees and cartwheels and really liked reading. Her ambition was to be a professional horse rider competing in the Olympics.
Written, edited and produced by Lucinda Acland.
Resources and References
What the Online Safety Act does
Links for places that offer children support if they see harmful content online:
· NSPCC
· Childnet
· SWGfL
If you've got any questions, ideas about a topic, or someone to interview, get in touch, we'd love to hear from you! You can email us at kidslaw@spark21.org, contact us through the website: www.kidslaw.info, or through social media: Facebook, X and Instagram @KidsLawInfo
Please subscribe, rate, and share the podcast with your friends.
See you soon in the next episode!
Automatically generated transcript
Hello, Lucinda here. I just want to say to any adults that you might want to listen to this episode yourself before listening with young children, as we mention the words self-harm, suicide, and eating disorders in the context of examples of harmful online content.
Hi, I'm Alma-Constance. And I'm Lucinda. And together in our Kids Law podcast, we're going to look at how laws affect children as we grow up.
So what are we going to look at in this episode, Alma-Constance?
Well, keeping safe online is a topic we have discussed before in previous episodes, where we talked about the ways that new online safety laws could be used. I'd like to know what stage we are at and how these laws work.
We know that although there are huge benefits to being online, there are also many risks to children and young people from harmful online content. The Online Safety Act was passed in October 2023 and is a new set of laws to protect children and adults online. There are now new duties on social media companies and search services, which make them more responsible for their users' safety on their platforms. To ensure that these companies abide by the laws, an organisation called the Office of Communications, commonly known as Ofcom, is now the independent regulator of online safety. Let's speak to Ofcom's online safety director, Jess Smith, who can tell us how they will make organisations comply with the new law and make the internet safer.
Hello Jess, thank you so much for joining us on our Kids Law Podcast. We are so glad to have you here. Could you please start off by bringing us up to date about the new Online Safety Act and the way it is designed to work?
Hi Alma, thanks very much for having me on. It's a pleasure to talk to you today. As Lucinda said in the introduction, the online safety laws passed late last year. As part of those laws, Ofcom, where I work, needs to set out the detailed rules that services have to follow in order to put the laws into practice. The laws won't be fully enforced until we've done that. As soon as the laws passed, we came out with our first consultation on what platforms have to do to protect users from illegal content online. Now, this relates to a wide range of offences, but includes things like terrorism and predatory behaviour by adults towards children. So we've set some very clear and very strong rules about how platforms must address those issues. Then we came out with new rules for adult services about how they need to use age checks to make sure that they can't be accessed by users who are under the age of 18, and we did that in December last year. And then just recently, we've done another consultation on further steps that platforms need to take to protect children from content that isn't illegal but is harmful to children. This includes things like bullying and violent content online, and very sadly it includes content that relates to suicide and self-harm, so really very harmful things for children to be engaging with. We've set out our proposals and we've now done a consultation, so platforms know what to expect, and lots of people have given us feedback on how we can change our measures and strengthen them and make sure that they really work in practice. There's a deadline in the law that means we have to finalise these rules by the end of April next year, and we are on track to do that. I understand that people want to see change as soon as possible, and we know that platforms are already listening to what we're saying and starting to make changes. We've already seen Instagram, for example, make new rules about how easy it is for adults to find children they don't know and to contact them, and that's linked to one of the proposals that we've made in our consultations. So we will start to see change, I think, quite soon. But the law requires that we consult and that we make these rules as robust as possible, and that will mean that when the time comes to implement and enforce them, we will be on much firmer ground to do that.
So what role does Ofcom have in implementing the act and how far have we got?
So Ofcom's role as the independent regulator is to take a non-political view about whether or not platforms are doing enough to keep their users safe from online harms. One of the things that we will do, for example, as well as setting out the detailed rules, is work really closely with a subset of the companies that need to implement them, to look at what they're doing and to help them do it effectively. It's called supervision: you work really closely with a company to make sure that it's doing everything properly, and to answer their questions if they have any. We're also responsible for enforcing. So if companies don't comply with the rules, we can do things like levy fines, to incentivise them and encourage them, and effectively to punish them if they don't follow the rules.
So what will Ofcom do to monitor and ensure that the online providers will comply with the new laws?
So this is a really good question, and it's a really hard job actually. There are a lot of companies in scope of these rules: as well as search and social media, they also apply to certain types of gaming, to messaging services, and to adult services. So we think there are thousands and thousands of companies in scope. How we monitor it is really, really important, to check whether or not we're doing our job properly, but also whether or not the laws are working in practice or if any changes need to be made.
So, as well as the supervision work, we're also going to do a lot of research to monitor what's changing on the internet and what the experience of users is, particularly the experience of children. Ofcom is used to doing research, and we do lots of it. In preparation for our Protection of Children consultation, we spoke to over 15,000 children, and we will continue to engage with children to find out what's changing for them, how it feels to be a child online, and whether or not they feel safer than they did before these rules were enforced. Now, within the law, the government has to review whether or not it's working in a couple of years, so all of our work will feed into that process to help them assess whether or not the laws are actually working in practice.
I have a question. It sounds fascinating that you've been able to involve so many children, and I'm wondering, is this the first time, to your knowledge, that so many children have been asked to contribute about their experience online? So we're not the only organisation that does that. I know the Children's Commissioner, for example, often talks to large numbers of children and asks them about their online experiences as well.
In fact, the Children's Commissioner for Wales has just done a big survey with children and young people. It's the first time that Ofcom's done it, so we're still on the learning curve about how best to engage young people and get them to tell us what they think. But it's a really, really important part of what we're going to do, and ultimately this is all about protecting children and changing the frankly unacceptable experiences that too many of them have online.
So can you tell us more about how the Act will stop children being harmed by online activity? And what are some examples of these cases?
Yes, certainly. So, just to start with what's happening at the moment: we know that children are particularly vulnerable online users. We know, for example, that many children under the minimum ages have social media accounts. We know that many children have adult ages attached to their accounts, which means that they effectively have an adult experience. We know that children are more likely than adults to encounter suicide, self-harm and eating disorder content. We know that children have told us that violent content online is unavoidable. So we know that there is a big problem here and that there has been a lot of inaction for a long time. Here is what we've proposed to try and resolve some of these problems. First of all, we've told social media services that children shouldn't appear in friend recommendations or friend requests for adults that they don't know; we don't think it's appropriate that children online should be easily contactable by adults that they don't know. The second thing is that all platforms moderate the content that is on their sites, but that moderation needs to be age appropriate: it needs to take account of the user, and above all it needs to ensure that harmful content isn't served to children. Thirdly, we're requiring much wider use of age checks, particularly where platforms allow content that's harmful to children. We think that if you do allow content harmful to children, it's really important to know which of your users are children, in order that you can protect them from that harmful content.
I know one of the main problems is that young children can access the internet and sign up to social media platforms, even if there are age limits. What is the Act doing about that? And how will it change the way children can put themselves at risk with these decisions?
Given the current state of age assurance technology, we're requiring that services that have a minimum age do what they say they're going to do and apply it consistently. Their choice is that they can use age assurance to do that, or they can make their service safe for younger children, so that if children do go on the service, they have a safe experience.
For the age checks, how do you check the age? Because I've been on games where you literally just put your age in, so how do you make sure that that age is correct?
Yeah, it's a great question, and there are a few different methods. Firstly, we've set out a standard of highly effective age assurance, which covers technologies that services can use to assess people's ages. But it's not just the technology; it's also about how it's applied and how the service puts controls around that technology to make sure that people can't accidentally get through. Technologies that we think can work are facial age estimation, which is where an app looks at your face using your camera; it doesn't necessarily take a photo, but it looks at your face and then estimates your age or age range, and that's actually quite accurate. There are some other methods, including mobile network operator checks, which is where your phone company can tell the provider how old you are, and people can also use banking to show how old they are. And of course, there are hard identifiers such as passports and driving licences. Now, because there's not much access to hard identifiers amongst children, we've required age checks to make sure that services know who a child is, that is, who is under 18. And we think that that will drive the market so that new solutions and new technologies emerge.
We know that the way that search engines operate is to show you more and more of the same content, which is one of the ways that children can be exposed to harmful or explicit content. What does the Act say about that?
Yes, so this is a key vector of harm for children; it's a really important way in which children stumble across harmful content. They're not searching for it. The algorithm behind their social media feed is saying, "your friends liked this", or "this type of content seems to keep people similar to you online for a long time", and that tends to drive more and more extreme content. Now, if you talk to any child who's using online services, they often tell you that they wouldn't let somebody else use their account because it might mess up the algorithm, or it might mean that they get served content that they don't want to see. So they tailor their behaviour in order to try and manage what the algorithm might then feed them. And this is where our proposals with respect to recommender algorithms come in. We think that the way the algorithms work at the moment is too dangerous for children. We think that services need to actively remove content that is likely to be harmful from those algorithms and ensure that it is not served to children. We also think children need to be allowed to give a negative reaction to the content that they're seeing. You can imagine that if a violent video comes up in your feed, it might be shocking; it might cause you to look at it again, just because you can't quite believe what you're seeing. We think that's not okay. It shouldn't just be on the child to make sure that they remember to scroll past really quickly; it should be on the service to ensure that content being algorithmically recommended to a child is not there in the first place. Any content that's likely to be harmful should be removed from the recommender algorithm, and we think that will have a big impact on the way that these algorithms work. And the child should be given an opportunity to say that they don't want to see certain types of content, so that the algorithm makes sure it doesn't serve that kind of content to that child.
The law also makes it clear that harm can arise from the way content is promoted, for example in large volumes over a short space of time. And there are the annual transparency reports; I thought that was particularly interesting.
So we know that another way that harm arises for children is not just from seeing one video or one image that's shocking or harmful; it's how this content accumulates over time, so that a child can see harmful content again and again and again. There's a concept in the legislation called cumulative harm. Now, we think this is really important, and it really links to the way that these algorithms work. We know that children who engage with content related to eating disorders also engage with content related to fitness, for example. So it's not that fitness-related content is harmful on its own; it's about what it is being served in conjunction with. What does the whole package of content look like for a child, and how does that affect them? So, as well as the measures to make algorithms safer, we're requiring services to risk assess for all of the risks that arise on their platform to children, to risk assess for children in different age groups, and to ensure that they've got appropriate safety measures related to the level of risk on their platform. Now, the risk assessment duty is a very important stand-alone part of the Act. It's something that we can enforce against in its own right, and we will be looking very closely at those risk assessments. We'll also require services to update those risk assessments every time they make a significant change to the service: if they roll out a new feature or functionality, that will need to be risk assessed and the risk assessment updated; or, if they don't make any changes in a particular year, they must update it every year.
Are there any particular groups of people who are at greater risk of online harm?
Yes, so sadly, vulnerabilities that affect people offline also affect them online as well. So we know that children are a particularly vulnerable group.
We know that women, for example, tend to experience more online harassment, and they're more likely to be victims of domestic violence, which is also now perpetrated online to a large extent. We know that Black and minority ethnic people experience more online hate.
We know that that's also true of LGBTQ+ people as well. So people's sexuality, their religion, their ethnic background, age and gender all affect how they experience harm online, and disability is an important aspect of that as well. There are lots of different sides to vulnerability, and one of the things that we found is that quite often boys are fed misogynistic influencer content, from the likes of Andrew Tate and others. While boys might not seem such a vulnerable group on their own, this is effectively the radicalisation of young boys. It's a complex picture: certain groups may be more on the receiving end of really harmful activity online, like bullying and abuse and so forth, and platforms need to do more to address that; but the other side of it is what drives that behaviour amongst people who might then go on to perpetrate it online.
Do you know what the tech companies are going to do about that?
Well, they've got flexibility in how they do that, obviously. Ofcom has a duty to be proportionate in the measures that we recommend, and that's very clear from the legislation. So all of the measures that we have proposed, we've costed; effectively, we've made an estimate of how much we think they will cost the tech firms. But that's a function of the legislation: they have duties now, and I think they will have to think very carefully about how they meet them. They do mostly already have trust and safety teams, but those teams have always been competing for resources and haven't necessarily been able to demonstrate that much impact on the bottom line. Well, now we've got the power to levy fines if companies don't comply, so I think the impact on the bottom line of those teams being able to do their jobs properly is very, very clear.
How are you going to reach out to every one of them?
It's a really good question. For certain services, we've already established a relationship; we're supervising them and working really, really closely with them. For others, we are reaching out through trade bodies; lots of them organise together, and we're using those groups in order to reach their members. But there's a lot more to do, so we're doing a lot of business communications using the trade press, and trying to reach out to services in different jurisdictions or particular kinds of hubs. The other thing that we're doing is working through business intermediaries, for example professional services companies like law firms, to make sure that they are aware of what we're doing in case they're asked for advice. Ultimately, we set out our rules and services must comply with them. For most services, we expect that they will want to take steps to comply with the rules so that they can continue to do business with UK users. But if, as we suspect, some of them choose not to comply, then we have to make sure that we deploy our enforcement resources in the most effective way.
Please can you tell us about your role at Ofcom and what you do on a daily basis?
So I work in the Protection of Children team, and I also lead our work on particular measures to protect women and girls. On a daily basis, I work with my team on the proposals that we're consulting on; we listen to the feedback that stakeholders give us; I get to talk to lots of very interesting and knowledgeable people; and I also get to work with a really brilliant team who are really passionate and committed and have all joined Ofcom because they really care about creating a safer life online.
I have a question I ask all of our guests. What were you like at 10 and what did you imagine you'd go on to be as an adult?
At ten, I was very chatty; I was really into climbing trees and doing cartwheels and things like that, and I used to really like reading, which I still do, although I don't get to do it as often as I would like because I have three young children of my own now. But my ambition for myself when I was 10 was that I would be a professional horse rider competing in the Olympics, which, because I lived in East London, wasn't actually that achievable, but it didn't stop me dreaming.
It might be interesting to hear what you studied that led you into this work.
Yeah, so I don't come from a tech background particularly; I come from a policy background. At university, I studied European politics, and I went on to do a master's in science and security, which is all about how technology shapes the world and changes how countries interact with each other. I worked in government for a long time on technology policy, and then I moved to work on data and data rules and privacy, and how we can use data to support technological development. It takes a lot of different perspectives to make sure that that happens in a way that is safe. And while I was doing that, the Online Safety Act was published as a white paper, and I knew that Ofcom was going to be appointed as the regulator. So when I saw an opening, I jumped at the chance.
Well thank you so much Jess for telling us about the Online Safety Act, how it works and the work of Ofcom. Do you have any final advice for children and those looking after them to get help and advice about protecting themselves online?
Yes, there are lots of great resources out there that can help parents and children lead a safer life online. But I think the most important thing is that children know that online they are supposed to be safe, and if they see content that doesn't feel right, they should report it and they should tell an adult. That, I think, is a really good principle for starting to live a safer life online, and it helps platforms as well to know what is and isn't okay for children to encounter.
Thank you so much.
Not at all, it was really nice to chat to you both.
Well, Alma, what do you think about what Jess told us?
Well, Ofcom's role is to create detailed guidelines to put the Act into practice, and they have already started the process of eliminating harmful online content to protect children and young people. Over 15,000 children have been asked to express their own views about their online experiences, to ensure that the Act's guidelines will be effective. There is a focus on age limits to prevent children's access to age-inappropriate content, using technology to identify the age of a child to help stop sites or apps from showing harmful content to young people. And at the age of 10, Jess wanted to be a horse rider.
Jess told us that there are a number of ways to get further information for children and young people, and for those who care for them, and we'll put the links in the show notes. In our podcast, we've been exploring how laws work and affect young people. All of these things help children understand their rights and responsibilities so that they can make informed decisions, not only about their lives, but also about voting for MPs who make the laws and understanding how the legal justice system works.
It's also important, as Jess has told us, that children should be kept safe and that adults must care for them. Remember, if you have any worries, talk to an adult you trust and tell them how you feel, particularly if you see something online that you don't like. This includes your teachers at school, who are there to look after you too. So tell them that you need to talk to them.
You can find more information on the KidsLawInfo website. Keep your questions coming in. Please subscribe, rate, and share the podcast with your friends. See you soon in the next episode.
Bye