
The Security Circle
An IFPOD production for IFPO, the very first security podcast, called the Security Circle. IFPO is the International Foundation for Protection Officers, an international security membership body that supports frontline security professionals with learning and development, and mental health and wellbeing initiatives.
EP 078 Catherine Knibbs, Human Behaviour Technologist, 'Parenting in the Digital Age: Protecting Your Kids Online'
Why are children not safe online?
What is Cybertrauma?
Any trauma that is a result of self- or other-directed interaction with, mediated through, or from any electronic internet/cyberspace-ready device, or machine learning or artificial intelligence algorithm, that results in impact now or in the future.
This event/interaction can be multi-modal, multi-platform and multi-interval, delayed or immediate, legal or not, singular or plural, and may include images, sound and/or text, and may or may not be vitriolic in nature.
Events may include covert and overt typology, and may be virtual and corporeal, and/or both at the same time.
(Knibbs, 2016-2021)
What can we, as teachers, therapists and other professionals, do?
Cybertrauma is trauma that happens when using technology, and much of it will be connected to the internet in one way or another. Cyberbullying can happen on social media, in games, or even in classroom Zoom sessions.
As the adults in or near the situation, what can we do?
I can help. I am an international educator and author, online harms consultant, researcher and clinician, helping professionals protect children from online harms in the metaverse.
Children are often much more tech-savvy than the parents and other adults in their lives, and will use games, social media and the internet in general from early on in life. With my help, teachers, therapists and other professionals in child-facing roles can get clued up on what to look out for and how to help when cybertrauma does happen. I can also help with safeguarding against these issues in the first place.
Catherine Knibbs (Cybertrauma and Online Harm Specialist)
Cybertrauma: Clinician & Researcher, Intl Educator, Consultant, TEDx Speaker, Author about Online Harm/Behaviour (PhD’er in the background)
I am a Researcher, Psychotherapy Clinician, Author, Speaker, and Doctoral candidate looking at the real harm children suffer in a world of technology, which is advancing quicker than many adults can keep up with.
I have a background in engineering from the Army, over 25 years in IT and computer tech, and over a decade of working directly with children and adults on issues relating to the internet, from bullying to porn viewing, from cybercrime to cybersecurity and more.
I write about issues such as the impact of tech on the developing child, the impact of cybertrauma, and the issues of immersive technology on eyes, brains, and bodies.
I run a company educating professionals about Child protection around tech and digital spaces and teach therapists how to be 'safe AND secure' when using tech to ensure they protect their clients.
You can visit my website to learn more about me and my work: www.catherineknibbs.co.uk
https://www.linkedin.com/in/catherine-knibbs-2ba67b52/
Security Circle ⭕️ is an IFPOD production for IFPO, the International Foundation for Protection Officers.
If you enjoy the Security Circle podcast, please like, share and comment, or even better, leave us a fab review. We can be found on all podcast platforms. Be sure to subscribe: the Security Circle drops every Thursday. We love Thursdays.
Yoyo:Hi, this is Yolanda. Welcome, welcome to the Security Circle podcast. IFPO is the International Foundation for Protection Officers, and I'd like to say thank you to all of our listeners around the world. We are dedicated to providing meaningful education and certification for all levels of security personnel and making a positive difference to our members' mental health and wellbeing. Today's special guest, I've been rather looking forward to speaking to her, actually. I figure if she piques my interest, she'll pique your interest as well. Her name is Catherine Nibbs, or Knibbs, actually, if you want to spell her name properly. She is an author of a book around online harm and online behaviours. She's also doing a PhD in the background. Pretty smart lady. Talks about hashtag children, hashtag technology, hashtag cybertrauma, hashtag online harms, and hashtag online safeguarding. www.childrenandtech.co.uk. Welcome to the Security Circle podcast, Catherine. How are you doing?
Catherine:Ah, I'm, I'm brilliant, and thank you very much for a lovely, lovely introduction. And then I'm gonna come in and go, am I gonna be totally narcissistic if I say, actually, it's five books? It's not one, it's five books.
Yoyo:Catherine Knibbs, author, author of five books. We'll talk about those. Catherine, look, question number one: do tech and children go together?
Catherine:Uh, no. No, no, no, not really, no. But, I mean, yeah, let's start where we are, right? When you think of the words technology and children, a lot of people immediately go to social media and what's currently happening in terms of Congress and the way that we think about where children go. However, I have raised children myself from the get-go, in terms of the beginning of the internet. I introduced my children to technology and computers around the age of about three, so that they had what we now call media literacy, but also because I wanted them to understand the way the world is changing. And technology is something that gets muddled up a lot of the time: do we mean social media? Do we mean gaming? Do we mean Excel spreadsheets? Do we mean cars, et cetera, et cetera? When it comes to social media, which is entirely its own thing, which we'll probably get on to, then perhaps not. And the reasons being are the spaces that children can go and what they're exposed to. However, and there's going to be a lot of this that sounds like a contradiction, it is another environment in which children are going to need to be educated and skilled in order to survive in tomorrow's world. And a lot of the adults at the moment are like, well, let's take this stuff away from them. And that's fine, but what happens when the parents are dead and the children don't have the skills to survive in the world that's currently in existence? Long-winded answer, but there you go.
Yoyo:No, I think we have to be long-winded, and I think we have to be real. I put a poll on LinkedIn and basically asked the question, you know, have you checked your child's phone? And there were three options, and I should have added four, actually. It was pointed out to me, quite rightly so, that I should have had a fourth option. The first one was: yes, and there's nothing too scary. The second one was: no, I don't check it. And the third one was: yes, I have checked it, and add in the comments below, and some of the comments are quite scary. But the fourth one should have been: my child doesn't have a phone, or I've chosen not to give my child a phone, because obviously it would be the parent's decision, and I should have added that as an option. But ladies and gentle bums, when you do put a poll on LinkedIn, you cannot edit it afterwards. There's a little tip for you if you haven't done it. I was quite surprised, actually, that a third of the respondents had never checked their child's phone. I mean, I kind of want to scare them into checking their kids' phone as soon as they get home from school. This is important, isn't it?
Catherine:Um, yes, let's go with the first one: yes, I've checked it and there's nothing on there. Holy moly, how do you know? Because actually, when you check your child's phone, let me introduce you to children for just one moment. If they think like I do, I would have hidden the things I didn't want my parents to see. I would have utilised apps that are on there to specifically hide things. I would have deleted things. I would be using somebody else's phone. And one of the things I've said recently, in conversations here, when we talk about getting rid of phones: that is not the only place that the internet exists. So we need to think about, does that mean you'll get rid of the smart TV, and maybe the Alexa and the Fire Stick, and then are you going to get rid of computers and handheld consoles and DSs and Switches, and yada, yada, yada? Because actually, if you don't want your child to access the internet, then take away their library pass, because children can go to a library and access the internet in a public space without your oversight. Phones are not the only way that children can get into these spaces. And certainly, if you're going into some of the apps that are on there, you need to know how those apps work. And that's one of the difficulties that I find. It's not a case of, well, the word that was introduced to me the other day was policing children. We don't need to police them. We need to take an active interest in what they're doing, the same as we would if they were going to an after-school club. We would go, more than likely, as parents, to see where that was, who was running the club, what kind of people went there. We would do that if we let our children go around to other people's houses, et cetera. But we just don't seem to have the same understanding of the digital environment.
Yoyo:Now, I like your idea about the library, but I'm going to challenge that to a degree, because my research indicates that directors of social media companies insist that their children use mobile phone devices in shared spaces within the home, i.e. not in their bedrooms. They are never allowed to use their devices in their bedrooms. And I guess the library is a safe shared space. I mean, there's not a lot of sinister stuff that you can get up to in shared spaces. Isn't that the point?
Catherine:Uh, well, it depends on what you mean by that, Yoyo. I mean, it always depends on where a child is going, and what a shared space actually means. So I think the difficulty with the environment that we're talking about at the moment is that we don't have a standard lexicon; we don't have a language that people can understand to know what it is that we're talking about. And we'll probably get to that in terms of the online harms. But certainly, it depends on where a child is going, and on the skills of the people around that technology, and around that social media, and around the gaming environment, who can actually put in preventative measures.
Yoyo:Okay, so I listen to LBC, actually. Please don't judge me, I am a passionate liberal. I love James O'Brien. I've only discovered him recently, to my detriment; I wished I'd discovered him a long time ago. He featured this subject on one of his shows recently, so it's on all major podcast platforms to listen to retrospectively. And he asked parents to call in with what they found on their children's phones. And one guy, a grown man, said that just by chance his daughter left her phone on the sofa when she went to the toilet or something, and he took a look at her phone and discovered, horrifically, that she had been sending sexual images of herself to a recipient. She was about, I think, nine or ten years old. He then said, and of course James O'Brien is kind of horrified, as we all are listening, but then I've been involved in cases like this as a police detective, a long time ago now. I dread to think what's happening now, in the 16 years I've been out of policing; I should imagine it's a lot worse. But he said: I then pretended I was my daughter with this person for the next three days, and he was very keen to set up a meet with me. And I told him how old I was, and he was still keen to set up the meet with me. And he discovered where this man worked, went to his workplace, this was a big media story at the time, pounded him a few times, got arrested for GBH, and got let off in court because the judge sympathised with his passionate response to this predator. And he said, you know, what horrifies me even to this day is that on that one occasion, it just literally happened. It wasn't planned. I didn't plan to go and look at her phone. It wasn't a routine that we had. It just happened to be coincidental, and I stumbled across that. And he also said he has accepted, as a parent, that it's his responsibility that that happened. I know this isn't an isolated story.
I know that there are hundreds of stories like this. How are we going to tackle this? And in your experience, you're one of the best people to speak to about what parents can look out for on their children's phones.
Catherine:Um, yeah. So this is the area of, I mean, this is where we start with the language. So this is the area of grooming, exploitation, child sexual abuse material. There are lots of different names; currently it's technology-facilitated child sexual abuse, or technology-assisted child sexual abuse. And in terms of policing around the world, so this is not just in the UK, what police around the world are dealing with is something that is unmanageable. This is now one of the highest priorities in terms of society, so I would certainly say that this is the point in the podcast that people should skip to and pay attention to. The number of predators, so perpetrators of crimes against children of many descriptions, but for this point here let's talk about the sexual abuse of children, is increasing. And just to give you some stats off the top of my head: as we went into lockdown, there were about 800,000 known child sexual abusers known to the police. And those were ones that already had records, et cetera, et cetera. And what we now live in is a society where live-streaming events can take place. So we've moved on from what that gentleman you were giving the example of found, we've moved from just the stuff that you can find on a phone that's permanent, that might have been shared previously, to stuff that can be shared in the moment. And the hardest job for the police, and for all of the services that I speak with in this space, is being able to manage how incredibly fast and numerous these perpetrators are. And no, they do not all exist on the darknet. So when you're looking at your child's phone, and let's start with children as young as seven and maybe even younger, and I'm going to talk about not just phones but any kind of device, there are many services that remove this material that's created...
So we keep giving it different names, and at the moment I think we're changing from 'self-generated content' to 'first-person produced'. So hopefully we will move to a less victim-blaming, but also less confusing, space, because 'self-generated' suggests that there's consent, and 'first-person produced' allows us to think about the coercive nature of these perpetrators. And I would say you are looking for people that your children are talking to, and certainly the numbers are increasing with boys, so please do not think that this is just a female-only issue that we need to think about. You would be looking for friends or contacts or people that they are associated with on any of the platforms that they use. And those people, if you were in, I don't know, a village and you could name everybody who lived on your street, you would at least stand a chance of knowing who the people were that were talking to your children. And I would certainly say, take the approach that your children might be talking to people who say that they are the child's friends. And actually, there's a very different undertone from a perpetrator as to what they mean by 'friend' and what we, and children in today's society, understand by the word 'friend'. So you would certainly be looking for any kind of conversation, contact, chit-chatting, message sharing, videos, photographs. And certainly keep up to date with online safety by going to the myriad of organisations that exist and learning what apps sit on a phone that you may need access to. Does that sound enough to start with, Yoyo?
Yoyo:You're blowing my mind. I know that there are some amazing websites out there. What we'll do is we'll link to a few of them on your bio for the podcast. And I know that you've only got to Google 'online harms', or 'children and online harms', and there are an amazing number of really informative sites geared to very basic understanding, so not too technical. Look, this radio show also talked about the fact that, you know, one guy said, my daughter says she's being followed by Harry Styles. We know that Harry Styles does not follow 13-year-olds. How do we tell young, impressionable children that there are imposters out there who will con them? And then I know we're going more into a very different type of romance fraud area, but if we keep it to the children's space, this has got to be a very easy way for predators, hasn't it, to get into children's bedrooms: by posing as a famous idol.
Catherine:Yeah, so I wrote a blog a while ago called Ogres in the Screen. And the reason I called it that is because these people, and by the statistics that exist they are often men, in fact, if you go and look at the research, it is primarily men that are conducting this kind of grooming and extortion and sexploitation, they do not care about your child. Okay, so they don't care about their mental health. They don't care about their vulnerabilities or special educational needs. They don't care if they're a child who's living in residential care. They are only interested in the number of images and videos that they can coerce from a child. So in order to do that, they often pretend to be a friend: somebody who has a concern, somebody who is like the child, somebody who can have the same kind of characteristics as the peer group. So one of the things that I would say, and this comes before any kind of internet safety advice, is conversations, conversations, conversations. When you are out and about in the corporeal world with your children, ask them how they know somebody is trustworthy. So, one of my favourite examples of this is actually to talk about the film Kindergarten Cop.
Yoyo:Oh yeah, I love it, yeah.
Catherine:There's the moment where the police officer is trying to talk to the children and one of the children says, Dogs? Is dogs okay? Actually, that's the kind of conversation we need. Now, I'm going to take you to a little bit of neurobiology to help you do this. So, most of us have what we call an uh-oh feeling. It is that gut instinct about: is this person safe? Are they trustworthy? Do I even want to continue in a conversation? Now, we've all had this experience where somebody's been a little bit too close in the supermarket or the post office, and we just want to move away from them. So getting children to dial into that feeling is really important.
And the thing about these ogres in the screen is you don't necessarily get those biological cues. So children have to learn to decipher language, text and conversation through these spaces in order to be able to discern what a friend is and what a not-friend is, what a trustworthy person is and what a not-trustworthy person might be like. And those conversations can only take place if a child can do that in a critical way in the real world.
Yoyo:I'm just jesting here, but even adults can't determine whether someone's trustworthy or not. That's a tough question. Sometimes I question my own judgment.
Catherine:Um, yeah. And to be perfectly honest, this is because, by the time we've got to adulthood, and I write about this, there's an entire chapter that I've got in my sexual harms book, an entire chapter on consent. And I'm not talking about that in the quote-unquote 'enthusiastic consent' sense that you have in the related videos about sexual contact. I'm talking about consent about what is my space, and what I can do with my body and what I can't do with my body. And what we tend to do as adults is we impose on children rules that are confusing. So: it's your body, but go and give your grandma a kiss. It's your body, but put your shoes on, and if you don't put your shoes on, I'm going to force them on. And we do that because we get stressed and we are under time constraints. And then we say things like, you can say no, but you must eat your broccoli, don't say no. And children are growing up with a lot of confusing statements. And then we say to them, but when somebody approaches you, just say no. And, you know, it reminds me of the Grange Hill thing in the 90s. That is really showing my age there.
Yoyo:Yeah, that was so... that was really powerful for us.
Catherine:'Just say no' was so significant.
Yoyo:It was empowering. I mean, don't get me wrong, I could never say no to my mom and dad. I was like, oh, I don't want to do that. You know, like, oh, I don't want to eat that tonight. Well, then you don't eat. I mean, we did come from a different generation. My parents certainly weren't the 'what would you like for dinner tonight?' type.
Catherine:Well, this is the point. You're right. If you weren't able to say no because of fear, or because of a respect thing, or because it was a social norm, how on earth can you say no to somebody that's offering you, I don't know, for teenagers it might be Bitcoin, for younger children it might be t-shirts, it might be merch, it might be something that they perceive as giving them a feeling around a need. Now most of the groomers are exploiting a need in a child that isn't being met in the real world, because that's the only way that they can groom them. And those needs mean that as parents we need to be around our children in terms of presence, and that's not presents of the under-the-tree type, but presence: available to be spoken with, to have conversations with, to hear what's going on for them. And that's really difficult, because as parents and adults and people working in these spaces, we're really busy. And I totally get it. I totally get it, because the cost of living means everybody's got to work more, and that means that you're coming home and you're slightly stressed, and children pretty much, in my therapy office, go: well, there's no point, she's too busy, she shouts at me, he's never there, et cetera, et cetera.
Yoyo:You made a really good point there, and this has really struck a chord. You talked about exploiting a need that isn't met in the real world. Let's just, you know, we're all adults. We know that this is why affairs happen in marriages. We know that this is why people are attracted to cults. We know that this is why we do things that we wouldn't normally do, that are out of character, because there is a need that isn't being met. And we have to recognise, I think, don't we, as grown-ups, that this is going to be very significant in how a young person can be manipulated. And certainly, you know, you might think they're safe in their bedroom, but whenever I hear someone say, oh, you know, they're always in their bedroom, I'm like, ooh.
Catherine:For me, that's red flags now, you know?
Yoyo:When we look at behaviours, let's just say the bedroom, what are we looking out for? What kind of action do we need parents to start thinking about? And I get it, we're going to go into the space around having uncomfortable conversations, we should get comfortable having uncomfortable conversations. Yeah, yeah, yeah, yeah. So let's look at the bedroom, let's look at where parents can start.
Catherine:So in terms of what: devices, or the behaviours of the child, or both?
Yoyo:Yeah, because, you know, a young boy who's in his bedroom all the time obviously is out of the parents' hair. Parents have tough jobs, they have working lives, they have conflicts in relationships, all sorts of things, and sometimes the bedroom is a sanctuary. But it's not a sanctuary when you know about the online harms. So let's just take a traditional, averagely safe home, but where we recognise that there is too much time being spent on devices in the bedroom. How do we start to change habits if we think that that's an easily exploitable space?
Catherine:Yeah, so I tend to use a couple of stand-in names when I'm talking about children, so I might have a Timmy or a Billy or a Bobby or something like that. So if, you know, Timmy is seven, and Timmy comes home and straight after his tea he's off up the stairs to go and play his game, there is something about that. I would certainly say, as parents, you have to take an active interest in what your child is doing. So before about secondary school age, I would suggest that these consoles, these systems that children can access the internet on, are not in their bedrooms; that there is a rule within houses that what you're doing is done in front of us all, so that we can see what you're doing. Now, yes, that does mean that you're going to be listening to the Minecraft music repeatedly. You're going to be hearing about obsidian. There are headphones, absolutely, but headphones can actually block out what's being said to your child. So there is this thing about it being uncomfortable as parents. I'll tell you what I did with my children: the rule was, before secondary school, this wasn't out of sight. So they played games in front of me, which is where I learned how children can communicate with each other without giving directives like twelve o'clock, one o'clock, three o'clock to know where somebody was on a map; they just said 'over here' a lot, which I couldn't understand, but there is this kind of telepathy that they all engage in, and I don't know how it works, but it just does. Literally, in their primary school years, you have this thing where they're playing the computer game, and I would get on with the dinner. And whilst I was getting on with the dinner, I'm listening for conversations that are taking place, conversations about who people are. And you have some beautiful teaching moments.
As that happens: so, who was that that came on then? Who was that that swore at you? Why did they swear at you for that? How do you know that these people are people you can trust? And what you can do is help them start to think about who they're engaging with. So: why is that person asking you to a private party? Why is that person sending you private messages? How do you know that person really, truly is who they say they are? How about we get our device and we go and Google that person's gamertag, or let's have a look at how many points they have? There is this thing about prestige, not the Call of Duty version, but this prestige that people have on the internet and around gaming: if you engage in a game for a long time, your avatar is able to show that off to other players. So there are ways that you can see whether somebody is a regular player, whether they have gained increments in terms of the game, whether they have an established set of friends and friend groups. You can actually do a lot of research based on these gamertags, and gamertags also give away quite a lot of information, because children tend to pick things that relate to their address, their house, their dog, their age, the year they were born. Sadly, that's how the internet began, and if you look at most adolescents now, their early email addresses have 1998 or whatever in them, from that early internet space. So what you can do as a parent is ask questions: why does the game work that way? And you learn the difference between natural stopping points, the types of games, and who gets involved. That can then transfer to the conversations you can have when they get to their adolescent phase and they're upstairs, because by that point you've already had the conversations about trustworthy people online, and: if somebody is saying something, let me know, because we can block them, we can mute them, we can report them.
There are tools for keeping your children safe, but you have to engage with your children in the first instance to do that.
Yoyo:Whereas at the moment, they're kind of doing all the heavy lifting, aren't they, in the main?
Catherine:Absolutely. Yeah.
Yoyo:I was going to say, with the exception of those parents who are going above and beyond and getting really involved. Look, you've only got to refer to, and we all saw, Happy Valley. It's one of the best television dramas ever. And we recognise that this is how the estranged father found a way to contact the son after he came out of prison and when he was on the run, because he knew what game the boy played, and he used the gaming console, didn't he, in the game, to connect with him. I think that's a very valuable lesson. Certainly it brings a kind of reality to what you're saying, for some people who saw that programme. But look, one of the things to look out for is gamers that have got multiple accounts, right? Why is this something to look out for?
Catherine:Well, again, people who have lots of different gaming accounts. And again, this is why sometimes the headset not being in is helpful, because parents are pretty astute at recognising the same voice time and time again. But certainly the way that groomers operate is, if you think about it, similar to those phone calls we used to get in the 90s and 2000s, where your phone would ring, you'd pick it up, and if you heard a click, that meant that the advertising agency, cold calling you, would be connected. So if you think about a groomer having multiple accounts, it means that they can play as multiple different people at getting a relationship with you, to see which is the one that can actually encourage you to do things. And those accounts can then be used to encourage a child to add other children. Because of course, if I'm, I don't know, pretending to be Lucy, but really I'm Frank, and I've got a Lucy account and I've got a Lily account, I might say things about Lily to make you follow Lucy and to engage with Lucy, therefore creating a much more complex system. And groomers do not just come across one set of children. They are skilled, technical entrepreneurs in this space. They know how to have multiple accounts, multiple conversations, multiple ways to connect with children. And often they are quite happy to engage with a child, and if they get told, excuse my French, to fuck off, which a lot of children do tell them, they will just move on, because there are so many children they can access that it's easy pickings. And that's actually what you have to bear in mind: the moment at which a groomer chooses your child is because they are hoping, in that moment, that they are easy to manipulate.
Yoyo:And if you don't have great relationships in the home, and you don't have an open way of communicating with each other, and on top of that you have other aggravating factors, maybe domestic violence, or even alcohol or addiction, you know, you've got some very simple green lights for predators that would indicate somebody is very vulnerable indeed, even without mental health vulnerabilities.
Catherine:Yes, so I'm massively nodding, not that anybody can hear that, but there is, there is something about, um, my, my, so I've been doing this in this space, working with children who, um, have been groomed or exploited online, and more often than not, Um, the children that were first grou and it might be places like Facebook because, um, well, Meta as it is now. So many years ago, um, a lot of the social media accounts that children had were open and they'd say things like, Oh my God, shit day at work, shit day at school, shit day at this. Well, that is the neon sign to the, to the groomers. Okay, what I can now do is lie with that child about, Oh, I've had a day like that, or sounds like you need somebody to speak to. Groomers are so kind and they're so nice because they have to be. And it really is quite disturbing because there's another set at the opposite end of the spectrum that don't stick with the niceties that come straight in. And, there is a video that I share in training showing how quickly, perpetrators will come in and they can be in their fifties and, that kind of age. And they might say to a child, do you want to have some fun? Do you want to talk dirty? And that can happen in less than 10 seconds. So this is not just on gaming platforms. This is across the board. Let's
Yoyo: Let's share that link for that video as well. We'll try and share as much impactful information as possible. Okay, look, I get it. I get that the perpetrator does not care about the child. In fact, you've only got to look at these recent cases of the parents who were bereft after their children fell to self-harming, having watched a flood of self-harm videos served up through algorithms at times when they were very vulnerable. You can look at that as an example. We'll talk about big tech in a minute, but what is the impact on a child who has been groomed? Let's take a child that's very young, between 7 and 10 years old; you've dealt with this personally. What signs can we, as parents, spot in the child's behaviour?
Catherine: During the actual grooming process, you might not notice anything to begin with, because a child who has a new friend has a new friend. Now, they might tell you about the new friend. And again, this is why conversations are so important, because they might just be saying, and then today I was da, da, da, da, da, and they list off a whole set of names, and you think, yeah, yeah, that's the people they're in school with, or whatever. So it's about having that ear that's tuned in for new names, and I always say new names on the block are things that we should pay attention to. And certainly you might find that your child is more keen to go and do their homework: okay, I need to be on my device. Or, actually, I'm just going to have a party with my friends, and they invite their school friends in, and then what they actually do is leave that party and go into something else. So certainly as a parent, we want to try and get the balance right between intrusive parenting into the space, looking at every single game that they're on and going, why aren't you with that set of friends, and why have you moved to this game, and so on and so forth, and watching over time, because what we are looking for are patterns of behaviour. And certainly, groomers need to know that you are not watching your child. If they think that you are a parent who is taking an active role and interest in your child's online life, they're not going to hang around. They don't need people spotting them, because they might get reported. So, in terms of the child's behaviour, you might be looking for secrecy. But that's quite normative for children who don't want you to come in and see that they're not very good at a game, or that actually they're talking to their friends about, maybe, their sexual identity. There's going to be a very fine balance. And a lot of the time, when we look at grooming in the real world, we tend to look for gifts going missing.
Well, that's not applicable to the online space. So again, you're looking for changes in behaviour, and those changes in behaviour can be as low-level as fidgeting, and I mean excessive fidgeting compared to a child who didn't. You might be looking for sleep disturbances, toilet disturbances, what parents tend to name as whiny or clingy children. Because the outset of this grooming process is nice; once the child has been coerced into a behaviour, then it's not nice. And this is why I talk about that ogre in the screen. If you know that there's an ogre under the bridge, he's going to let you walk across the bridge, but then he isn't going to let you walk back. And children don't necessarily understand that it's the coming back that causes the problem, in terms of, it's after you have shared something or done something. And most children are tricked, and I'm going to use that word with quite an emphasis here: they are tricked. It might be a groomer who says, oh my goodness, I've noticed you're in your gym kit, can you do a high kick? I bet you could do a handstand. And of course, children want to show off in terms of their skills and what they can do. And most of the time, the groomers are looking for low-level things for the child to engage in, so that they can push and ask for more. Because if a child does a handstand and shows gym knickers, the perpetrator can then say, you shouldn't have done that. And shame is the reason why they can be exploited further. Because if they think they're going to get into trouble, and they don't have that open relationship, and going back to what you talked about, Yoyo, with the families, the domestic violence, the non-openness, then I think the phrase is pretty much that the child is screwed in the whole process, because they might not have a trusted adult.
So sometimes even if we are not a parent, we can also be a trusted adult to a child because sometimes those children need to reach out to other people. Families are not always the safest spaces on the planet either. Yeah.
Yoyo: And that's true, actually, 'cause that was another podcast I worked on: you're more likely to be sexually abused intrafamilially, by a member of the family in the home, than you are to ever be groomed online. Honestly, it's a treacherous world out there. There's no real safe haven. This troubles me a lot when I look globally at how, as a human race, we let our children down on so many levels. Sometimes I sit here thinking, who's running this show, and do we have a load of pedophiles running the show? Is that why all the odds are stacked against us? Why we have to keep trying so hard to climb up so many hills to get all the right checks and balances in place? It feels like we're running through quicksand.
Catherine: Yeah, and we're not going QAnon here, for a start. It's just that there are so many of these men. And certainly, in some of the research that's come out recently, conducted on the dark net with men who were in environments where they were looking at child sexual abuse material, one of the questions asked was: if you could, and you could get away with it, would you? And there was a large number of yeses. Which is so concerning.
Yoyo: The ethics are out the window, massively. And this resonates with a guy who was threatening Olivia Attwood. He was an online abuser, and when he went to prison for three months and came out, she had an interview with him on TV, and she said, I've just got to ask you why you thought it was appropriate to say online that you thought I should be murdered and raped. And he just said, I was in a really bad place, I think I had really bad mental health. And I'm thinking, whoa, hang on a minute, there are a lot of people who have genuinely got really bad mental health challenges; to sweepingly turn around and say that is concerning. It's not going to get you off the hook. God, you know, this isn't mental health, this is an inclination. You don't suddenly have bad mental health and say, oh, do you know what, I'm going to turn into a deviant and have deviant behaviour. Oh my God, I feel like James O'Brien now. I'm really getting on my soapbox.
Catherine: I'm just going to say, the thing that shocks a lot of people is that there are a lot of men interested in child sexual abuse material who are not interested in sexually abusing children. They're just interested in the material, and what they do with that material is up to them. So not everybody who has an interest in children is interested in them sexually; there are lots of other reasons. But there is such a volume of this material available, and now it can be created in, well, we're not even going to get into the AI stuff for a minute, because that's just terrible. There is such a volume of this material on the internet that young men who might not have had an interest in this might well have been exposed to it, because of the way that back-end conversations and community spaces work, the sharing. That then becomes something of an interest, where they may not have the mental health diagnoses, or fit the diagnostic profile of the ages they might be particularly interested in. Because, and it might be the most shocking stat that we talk about today, there is a huge number of people who are interested in children under the age of two, and that one is horrific. What's the number? Well, this is the difficulty: how do we measure what's being traded on the internet when you can't get into some of those spaces? How do you measure what's being traded in the moment on services that are live streaming? How do you actually capture the realistic numbers in this space? When you look to organisations like the IWF, that's the Internet Watch Foundation, they have figures where they have been tracking the numbers of infants and toddlers. And when I read their yearly report, I physically cried. It's so hard to read around this stuff.
Yoyo: I covered it on a podcast a while back with a journalist who broke a story on the BBC about how AI is being used to generate images of child sexual abuse material. It's still available for anyone who wants to listen to it; it's actually either in the top three or top five most downloaded podcasts of the whole of 2023. And, you know, there were limitations on what she could talk about on the BBC, and she sold a lot of her story to the Times, but they couldn't talk about the real gritty stuff, and so she gave me her whole story, which was phenomenal, just to highlight, really, the dangers. And this is a nice segue into where I really want to go with Mark Zuckerberg, because he blurs out the faces of his children on his own platform, and you have to ask yourself why that is. It was quite obvious. It's the same reason why many directors of social media companies won't allow their children to use their tech devices in their bedrooms; they have to use them in the family space. But look, we have covered, well, I'll probably add in about that. Let's talk about big tech now. A lot of people know I used to work for Facebook at one time, and I was desperate to get Mark Zuckerberg into a lift and say to him, look...
Catherine: For many different reasons!
Yoyo: He's really not my type. And the more money a man has, the more unattractive he is to me.
Catherine: Um, yeah, I was thinking, I was thinking more to do with, you know, a swift... a swift...
Yoyo: No, no, no, I'm not a violent person, but I really wanted to get him into a lift so I could say to him, dude, you've created a fantastic platform here with so much potential, but there are no guardrails and people are being harmed. And you need to take some responsibility and culpability for that, and stop it before it gets too bad. And that was in 2015. And I had to step away, and this was after being involved with setting up one of the content moderation projects as well. And I'm going to say this because it happened years ago: I was just helping, fundamentally, to make sure that there were enough security people in this space. It wasn't about the people they were recruiting to moderate the content. There was a big drive in 2014, 2015, because children were dying. And I remember saying to the project manager, do we have any safeguarding in place for these young people? Because they're going to be hired to moderate all of this content, whether it's by country or by type of content, sexual violence, whatever. I said, do we have any safeguarding in place? Because they should be having one-to-ones daily. They should be having team meetings. They should be having lots of breakout sessions to do lots of things, so that their mind isn't centrally focused on this content. And they looked at me and said, it's a bit dramatic, Yolanda, don't you think?
Yoyo: And I was like, whoa. It's like gaslighting. It's like telling you that what you're instinctively thinking and believing is utter bullshit. And I said, okay, so one day, one of these young people in this very metropolitan area of the world is going to walk up to the roof and walk off the edge of the building, and you will have been warned and done nothing to stop that happening. Obviously, I'm just being dramatic.
Catherine:Well, I mean, talk about the Cassandra effect in terms of, actually. I mean, we're now, we're now in a space where, so I'd certainly say for the past five years, because, certainly because of the stuff around cyber trauma, I have worked in, therapeutic ways with,, people who are content analysts, uh, in, in varying spaces. Um, And what I do know is that there's an organization, um, at Middlesex University that did a piece of research on a number of content moderators. They found that, you know, no surprise, uh, it was a No Shit Sherlock piece of research, that they are affected by watching this kind of stuff. And certainly the space of content moderator well being is only just being talked about. But if If I kind of interloop in terms of what you were talking about earlier with the, uh, journalists, I, I use Joe Tidy's, appearance at the IWF to show people just how quickly it happens. And this kind of trauma is the cyber trauma that I've been writing about for 14 years. It's the stuff that I'm talking about. This has a psychological impact like no other. Because we don't have the literature and the research to back it up in terms of, um, other types of trauma. This is, you know, this is where I could go on a rant now in terms of this is my, uh, one of my passions. There is something about repetitive exposure to this material. We know affects. people in, um, war situations, and we have services for those people. And we know that people go, so I'm ex military myself as well. We know that when you go into those spaces, you have to have, um, psychological assessments. There's, there's things that happen before you go to war. And there are things that happen when you come back, albeit they're not fantastic, but we do understand that because we've got a long, long history of research. Coming out of the VA spaces in the U S we've got big, big, theorists around this space. One of the biggest ones being, somebody called Bessel van der Kolk. 
And we know a lot about trauma. What we don't know a lot about, and this is Kat's temper tantrum, is cyber trauma. And I don't understand why, because I was working with children in 2010 who were viewing illegal movies of murders, not adult sexual material, because that was all the rage in the early days, and those websites are still in existence. That kind of material is being shared on Snapchat, on TikTok, even though the community standards say it doesn't happen. And when we talk about Facebook, you know, recently in Congress he was roasted a little bit in front of everybody, and yet we don't protect and we don't support those people you were talking about, who have to do so much of that groundwork to try and remove that material. And sadly, there is no way for a human to interact with that and not be affected. It's totally impossible.
Yoyo: I'm not surprised. I mean, you've also got to consider that they might not be affected in the short term, but are they affected in the mid to long term? The variants are just too broad. And AI does need to be better. I've been a huge fan of Facebook; I still have a very open Facebook account that I use for all sorts of things, which I can talk about, and then I have another Facebook account, which is my private one. And what I've noticed, as the years have gone by, is I'm like, do I know that person? No. Do they need to have access to my data? No. I've started to delete more photographs. I've started to remove myself from so many groups, because I'm finding it all so available and, not even exciting, but, oh, that looks interesting, that looks interesting. And all of a sudden, as an adult, I feel like I'm getting too much unwanted contact. It's not sexual, it's not predatory, it's just unwanted contact. There's so much of it, and I don't want to know that 50 people in my local area are all meeting up for coffee. I was asked once to moderate a genuine love-of-food Facebook page. And then, I don't know what happened, but I started getting a lot of notifications that illicit content was being posted on that page. And I was like, oh my God, I've got to delete this, because there were literally lots of explicit images of men's anatomy, and in fact it was worse than that. And I was deleting, and then I realised I was spending half an hour deleting this content. And I'm thinking, why am I spending half an hour of my time deleting this content? Where is the moderation in the application? I went into the system settings, and it was a freaking nightmare trying to navigate my way around. Facebook has now become an enterprise of multiple dynamic things to do, and I think, I just need to flip in, look at this, and bam, these people. And it was the same people, and then more people, and more people, all posting illicit content. And it said something like, Facebook has advised you that there are 30 breaches of... and I'm like, and? What are you going to do about it? I can't spend half an hour deleting everything. So do you know what I did? I just left that group. Because my capacity to help, get involved and minimise the impact had completely diminished. And even down to, do I want to spend a whole day reporting each person to Facebook, only to find out a week later that they're still putting illicit content on this? And it was literally a food appreciation page.
Catherine: That's the reason children don't report: because it's incessant, there is no point, they don't get feedback, and people make multiple accounts even if they delete and report one person. And this is, again, everything that we're talking about; I can do a complete circle back to it. This is what happens in cyberbullying. This is what happens with harassment and stalking. This is also what happens in exploitation, and children get to the point where it's much easier to just up and leave. And we haven't even started to talk about the VR space, the metaverse. This is why they just up and leave. And you know what, I will tell you, Yolanda, I'm already dealing with lots of safeguarding issues there, and I've said this in a number of roundtable discussions and spaces: we don't even have the safeguards for Web 2.0, God knows what we're going to do for Web 3.0, because the young children I'm working with have already been exploited in that place and exposed to material that services will tell you they can't be exposed to.
Yoyo: Wow. Wow. And I guess really this is just enablement. That 800,000 known sexual abusers, I take it that was a UK figure and not a global figure?
Catherine: Yeah, yeah, that's just the UK. And actually, I'm just thinking I might have overestimated; it might be 300,000. Either way, it's too many. But certainly, if they were on the sex offenders register and known to have harmed children previously, quite often I see that they were told they were not allowed to use the internet, or were restricted from using it. And I think, how the bloody hell is that going to work? Who's going to monitor that?
Yoyo: You're asking a sexual offender to have self-regulation.
Catherine: Yeah, and not use the internet. Okay, but how would you like them to check in with the job centre? Well, you can't have it both ways.
Yoyo: Okay, so: one, we're all in agreement that big tech's not doing enough. Two, we're in agreement that the number of predators is increasing. Three, we recognize that the environments are becoming more treacherous and there are more risks. And four, we know that what the policing agencies and units are dealing with is unmanageable. Wow, that's catastrophic.
Catherine: It is. And it sounds all doom and gloom up to this point, I know. I'm just thinking people will be like, no, there's no hope, there's no point. So actually, there are things that we can do.
Yoyo: We know that children are massively harmed. We know the predator doesn't care about the children or the impact on them. We know now that parents have to get more involved and take a more active interest, and we can recognize straight away that we shouldn't put tech in the bedroom. Right, next steps then.
Catherine: Okay, so, first of all, I do not want to parent-blame. The reason we are in this difficult space is, and this is one of the sentences I had at the beginning of my book, that parents make assumptions about certain things, because we have been raised to trust services. When we take our children to school, we make an assumption that the school will be a safe place for them. And certainly, I'm sure in your past you've dealt with many cases where those places weren't very safe and children were harmed in those environments. But as a parent, you tend to go to the school, meet the teachers, have a look around the grounds, and you put your trust in the school. And sadly, when the internet started, well, my children are adults now, in their late twenties, they have grown up with this space, and I was so surprised at how many parents said, yeah, but it's Facebook, Facebook is safe. Because parents, naively, not through anything else, put their trust in these organisations, expecting them to be caring and to take care of their children in the same way that we think hospitals do and playgrounds do. We thought, going all the way back to the late 90s, that these organisations would have had regulation before coming into the space our children go visiting. And sadly, that wasn't the way. Because if you go to the cinema, it's regulated. If, and this will take us back a bit, you bought or borrowed a VHS or a DVD, it was regulated. If you watched television, it was gated, et cetera, et cetera. So the assumption that these spaces have the same level of care and due diligence wasn't made through any kind of stupidity; it was made because that is how we've been raised. And sadly, what we now know through this complete saturation is that parents didn't get the right kind of education. Parents are having fingers pointed at them.
Parents are trying to point the fingers back at technology companies. The government is pointing the finger at services, tech organisations and parents, and the children are in the middle of that gap.
Yoyo: When I was in the police, I had to deal with a very young, 12-year-old boy who used to climb out of the window of his bedroom and go and help a little local gang of ne'er-do-wells burgle houses. He was quite small, he could fit through windows, so he was the grease boy, I think they call it, so to speak. And the parents came to me and said, how do I stop him leaving the house? If it isn't the parents that need to take more accountability, who is it?
Catherine: It's a societal problem, and I think, and earlier on we were talking about letting children down, I say this especially when I'm doing keynotes and things like that: we've failed children for 27 years. All of us. And certainly what we are doing at the moment is burying our heads in the sand, because it's too overwhelming. The word technophobe is still being used ubiquitously, and people will say, oh, but the children know more than me. And I used to say, okay, so now is the time to get tech savvy. Technology is not going away. And I found myself thinking, that's quite... it's beyond assertive at times, because that's my frustration: okay, we can't say this anymore. We have to stop saying, I don't like it, I don't want to learn about it. No longer can we say that if we are going to raise children in the 2020s and beyond. And that's difficult, because, if I get into my data protection head for a moment, when you go to read the privacy policies, it's all legalese. And not only that, as you're reading down, there's another terms and conditions page that you need to open, and if you opened everything to do with terms and conditions, you'd have a hundred tabs open. It's the way that things are written. It's not written for an average parent. And I'm trying to say here that the average parent needs ABC, one, two, three, and that's not through stupidity. It's because big tech created big tech, and created the big tech people who come in and write the policies.
Yoyo: Okay, so I've got some stats here. Let me run these by you. Accountable Tech produced a national poll of a thousand parents of school-aged children, and it looks like it's based in the UK. Below is a sampling of highlights from the poll. Three in four parents, 74%, believe Facebook cares more about corporate profits than their children's safety, with half of all respondents saying they feel this way strongly. A majority of parents, 52%, well, hardly a majority really; I would say it's not a majority, it's half. Half of them say social media is just as dangerous for my child's wellbeing as products like e-cigarettes, for example. And if it's not e-cigarettes, it'll be something else. 90 percent of parents say Facebook should publicize the research they've done on social media's harmful effects on children's mental health. They are not going to do that; that's like saying, if you drive our car, you've got a chance of crashing and dying. More than 8 in 10 parents support a host of different measures to expand protections for children and teens on tech platforms, including prohibiting companies from collecting personal data without teens' consent, requiring autoplay and algorithmic recommendation tools to be turned off by default, and limiting push alerts. And that last one is something we can do, because I'm stopping them now, and it's so much more peaceful. The only way I can describe it is during COVID, when everyone was watching the news all the time, everyone was getting really stressed until they turned the news off. And as soon as they turned the news off, all of a sudden the calm came back, the peace came back. And I think if you turn off push alerts for the things that you really don't care about seeing day to day...
Catherine: The peace comes back.
Yoyo: 93 percent of parents want the federal government, so this has got to be American, I've changed my mind, they're using Zs instead of Ss. 93 percent of parents want the federal government to pass new legislation updating children's privacy protections online, and 81 percent of parents say big tech companies like Facebook and Google need to do more to protect children, compared to just 19 percent who say they've done enough. Now, maybe those people all work in Silicon Valley. The Online Safety Bill came out last year, didn't it, and it got sanctioned, but in my opinion, and I'm going to ask you to share yours, I think this is just the beginning. It doesn't have enough teeth. It's trying to keep everybody happy, but I think maybe we can help it get more teeth. And let's be frank, for those people who use Facebook, Messenger and WhatsApp: when you're using an encrypted messaging service, you're protecting the right to privacy. But in protecting privacy, you restrict the availability of any oversight of how children can be harmed and abused using those services. And I know many women who would say, I would rather give up my privacy and protect children. A lot there for you to unpack.
Catherine: Yes. So, lots of people consider the Online Safety Act a panacea. It is not; it is far from it. It is the start, and certainly something that I was writing about in the book is that Ofcom, who are now going to, if you like, enforce the Act in terms of regulation, have an impossible job. The Online Safety Act needed to be future-proof, and sadly we don't know what the future problems are going to be just yet, so we had to speculate in creating an act. And again, proaction in this space is difficult, because we don't know what's coming. Certainly I would say some of the haptic feedback technologies and the immersive technologies are going to bring about problems that we hadn't even conceived of, along with AI. I think we are in a very difficult time in terms of trying to regulate, look after people and, as you said there, Yolanda, keep everybody happy. The difficulty is, the Online Safety Act is a UK regulation, a UK law, and the internet is a borderless community around the world. So what that means is that, realistically, Ofcom are going to be asking people for certain types of actions to be taken, and really, other countries could stick two fingers up at it. And certainly I know that services are now saying, right, well, we won't come to the UK then. Which is going to have an impact on some people's working environments, their ability to do their jobs, or even to communicate with families. And it's not doing enough to protect children, because sadly we've labelled things priority and non-priority content, and the way that children have been spoken about in the Online Safety Act is the thing that I'm writing about in my PhD. I think it's going to need more of an approach about how we look after children.
And the difficulty is, when we start getting into this group of mental health impacts from technology, we're talking about concepts that we can't even define. So often, and this is my little bugbear about the research that's out at the moment, and certainly what happened at Congress, you hear that services are carrying out research about the impact on mental health. And, this is in a book that's currently in production with the publisher, you have to ask the question: what do you mean by that definition of mental health, and how does it affect mental health? Because actually there are positives for mental health found in some of those spaces that are going to be regulated by Ofcom. For example, self-harm. Self-harm is a complicated subject, and it's a symptom of a system: whatever's going on for that child within that house, for the adult within that environment. And if we say, just get rid of it, we're doing exactly what we did with alcohol many years ago. Abstinence doesn't work. Regulation doesn't necessarily work. And we're not learning, as human beings, how to protect each other going forward, and sadly that's the difficulty we're in at the moment. So I saw something today that said two hours. This is the kind of shit that's out there at the minute. Two hours! If you have two hours of screen time, that's enough to protect your mental health. And I'm like, uh, what? What does that mean? What does that mean? It's ridiculous.
Yoyo: It feels like someone's just being forced to give a number without any context, which is irrational.
Catherine: The number of, and I'm doing air quotes for the listeners here, gurus that have come into the space: oh, well, we're recommending that, for example, infants should not have more than X amount of time. And I'm like, what does that mean? Because now we have to think about the definition of screen time, and this has been a nonsense term for a number of years. But it is a term with which we have a shared language, so it's helpful, even though it's unhelpful. Certainly, my favourite thing to do when I'm presenting is to say to people, so what do we mean by screen time? And people will go, it's using a laptop, and I go, hmm, well, bearing in mind I've had a PowerPoint on for the last two hours, you know, we're at the limit, so I'm going to have to pack up and leave now, and I'll just take the money for what I've been paid to do for a full day's training. Because by that logic, you could have screen addiction, screen overdose. It's silly, it's silly. But what we have done is created an environment of scaremongering and clickbait, and that's why the job of regulating this space is now so difficult. It should have been done at the outset, it wasn't, and we're now backpedalling, trying to resolve issues that have been going on for a long time.
Yoyo: So, to wrap up then, and this is very poignant, I have to echo again what James O'Brien said. He referred to Carole Cadwalladr, one of the most amazing journalists. She reported in The Guardian (other newspapers are available) on the Zuckerberg story, when he testified during the US Senate Judiciary hearing. And there's a particularly poignant moment here. She writes: "It was a genuinely standout moment of awkwardness in which he was forced to face victims for the first time ever and apologize. Stricken parents holding the photographs of their dead children, lost to cyberbullying and sexual exploitation on his platform. Less than six hours later, his company delivered its quarterly results. Meta's stock price surged by 20.3 percent, delivering a $200 billion bump to the company's market capitalization. And, if you're counting, which as a CEO he presumably does, a $700 million sweetener for Zuckerberg himself. Those who listened to the earnings call tell me there was no mention of dead children." That is incredibly ill timed. I don't know whether it was intentional to do it at that time or not,
Catherine: But it doesn't paint a very good picture, does it? No, and his apology was far from being an apology. It was what we call "doing a sorry": he stood up and he said "I'm sorry." The actual apology was incredibly difficult to listen to.
Yoyo: Yeah, it was. It was awkward, wasn't it? And it's a bit like when Mark Rowley spoke for the Met about trying to stamp out racism, harassment, misogyny and the rest. Why don't you just say, do you know what, we need to be better? We're going to continue listening; we're going to go through some pain; you're going through pain; we need to go through pain; we need to be better. For goodness' sake, maybe his stocks and shares would have gone up by 30 percent, I don't know. I just feel like there's a huge amount of enablement of the wrong messaging here.
Catherine: So for me, that would have taken absolute vulnerability, courage and bravery, being able to look at oneself and say, "I am so sorry." There wasn't an "I" in it, and that's the difference: what Zuckerberg did wasn't connective, and it didn't resonate with people, because there was no "I" in what he did. And how do we model a better experience in a digital space for these children if the people running the platforms can't say, "Hey, I've screwed up. I need to do things differently and I need to make a difference"? That is what we can do as parents. That's what this podcast episode can do. And this is why, even though much earlier it was all doom and gloom and no hope, actually we do have hope and we can do better. The tech giants are not going to change who they are, but we as a society can do better.
Yoyo: And the Wall Street Journal has actually been able to show that Meta's algorithms enable paedophiles to find each other. I joined the police because I wanted to be on the right side. I've always been a Jedi rather than a Vader, and it's starting to feel very tilted. Even I'm struggling now with trust, from a personal perspective. Do you feel like we're in an alternate universe where everything's gone wrong?
Catherine: Absolutely. Sometimes, and this might be something I regularly do, I apologize to my children: "I am really sorry I brought you into this world. I did not know this was how it was going to go." But then again, when I look back at history, we're not in Nazi Germany. We're not in the times when cholera, or the Black Death, was killing people in the United Kingdom. So there is something about context. I agree that we are in a very different time and that the ills we face are different. The future is going to look back and laugh at us, of course it is, because they're going to go, why didn't they put these things in place? Why didn't they do things differently? Hindsight is wonderful. And this is certainly why I love talking to people like you, Yolanda, in terms of what you're trying to do, because this is so difficult. It's hard. Bringing up children of my own, and now working with all of the children I've seen in therapy, parenting is the hardest job in the world. And it's made more difficult by the access to other spaces, spaces that are quite often out of sight and misunderstood. It's going to take a village to raise the child. We all need to do this together.
Yoyo: And let's leave with one final question to you: your worst day and your best day. I think we should leave people on a higher, more positive note. We don't want to leave them on the negative stuff.
Catherine: Oh, my worst day. Wow. They happen often. So, I've been through divorce, I've been through deaths. I don't know how to categorize the worst, and I'll tell you why, in terms of where I see this going with the technology.
Yoyo: Let me rephrase it. Let me rephrase it. My worst day was when I had to go to the home of an 11 year old girl who had mental health challenges, and I had to investigate a case where she'd been groomed online. The shame she felt left her choosing between letting her parents see the images she'd had to send to this perpetrator, or facing suicide. Those were her choices, right? That's no choice you want to put to an 11 year old. That, I think, was my lowest day. Her parents were actually incredibly calm, but I think they'd had a long time before my getting there to process what had happened. I often think about that young girl, and I wonder what the impact has been on her as she's grown up.
Catherine: So that's what I mean about the worst days. What I'd faced myself was nothing. Absolutely nothing. I've been through a divorce, yes, and deaths and bits and pieces like that, and there were many times I thought I was robust, in terms of "I can get through this." And then, about 14 or 15 years ago, I started in this space of online harms, and I worked with a young girl, around about age three, who had been sexually abused, and her dad had taken the pictures. I had nowhere to go. And even today, the thing I'll say about the worst days happening so often is: my profession, psychotherapy, is not skilled in this; social workers are not skilled in this; police are not skilled in this, up and down the country. That's why my best day is when I know somebody's read the book, or listened to something like this, because every kind of harm I have come across is going into a format I can share with people, saying, think about this, think about when this happens. Because I've worked with a three year old. The first case that ever came through to me was from West Yorkshire Police. They had a young primary school age boy, and this is going back to 2013, it might even have been 2012, and the police were like, oh my goodness, not only has he been groomed, he is now grooming other people for the group, and we don't know what to do. I took that to a large conference where I knew the Home Office would be, and I said to them, this cybertrauma is as bad, if not worse in many different ways, than real-world trauma. And my worst day happens again and again and again, because people are not listening. They're not taking it seriously just yet.
Yoyo: Yeah, is it that whole kind of "it'll never happen to me" thing, you know?
Catherine: I think it's because we don't understand it. But the best days I have are when I'm able to resolve things: maybe I have a team around the child, and I'm able to explain to the school why the child behaves like they do, and then suddenly people are like, "Oh, you mean that incident with, as they used to call it, sexting? That's why they're doing what they're doing." Yes, and I get to explain trauma and I get to make a difference for the child. The reason I've written the books and done the TEDx and the videos and the podcasts is because, if I'm limited to the clients I work with, I only help that cohort. And actually I have a lot of knowledge to share with people. It's not about bragging; it's about saying, actually, we all need to think like this.
Catherine: And the only way I can get people to be able to help children after an event, sadly, is by doing what I'm doing. That's why I'm a pain in the arse to people, turning up and constantly, constantly giving my professional opinion, and hopefully we will start to change what we do for children.
Yoyo: Keep doing it. Be brave. Catherine Knibbs, thank you so much for joining us on the Security Circle. Bravo. Thank you.