Radio Kempe

The Challenge of 21st Century Child Abuse: A Conversation with Ernie Allen

The Kempe Center


Ernie Allen was the founder and longtime leader of both the National Center for Missing and Exploited Children and its international counterpart, and he now leads the WeProtect Global Alliance.  Recently, he has shaped child protection legislation in both the United Kingdom and United States.  What is 21st Century Child Abuse, and what can we do about it?  Is it possible to protect free speech and the open exchange of information on the internet, while still protecting children and youth from sexual exploitation and even trafficking?  Join Radio Kempe for an interview with Ernie Allen as we open our eyes both to new, devastating harms to children and to practical, achievable solutions.

Read more about Ernie Allen at https://www.weprotect.org/bio/ernie-allen/.

0:09 
Welcome and welcome back. 

 
0:12 
This is Radio Kempe. 

 
0:15 
I'm Kendall Marlow with the Kempe Center for the Prevention and Treatment of Child Abuse and Neglect. 

 
0:19 
Thank you for spending some of your time with us today. At last year's Call to Action to Change Child Welfare Virtual and International Conference, 

 
0:30 
We began to address what might be called 21st century child abuse. 

 
0:36 
Our field of child protection has a long history, yet it operates within a changing world. 

 
0:43 
What's different now and how must we respond differently? 

 
0:49 
We have a special guest today that I'd like you to meet. 

 
0:51 
Ernie Allen, thanks for making the time to join us today. 

 
0:54 
It's great to be with you, Kendall. 

 
0:57 
Ernie, you've devoted your life to fighting child abduction, child sexual exploitation, human trafficking, and even what we'd call modern forms of slavery. 

 
1:09 
You are a founder and leader of both the National Center for Missing and Exploited Children and the International Center for Missing and Exploited Children. 

 
1:18 
A Prime Minister appointed you to chair the United Kingdom's initiative on these issues. 

 
1:24 
You've been recognized by four US presidents, and not everybody can say they were awarded the title of Officer of the Order of the British Empire by Queen Elizabeth. 

 
1:38 
That's a lot. 

 
1:40 
And those are only the highlights. For the rest of us who may be just starting to travel this path, 

 
1:48 
Where did this start for you? 

 
1:49 
What was the spark that lit the flame of child protection for you? 

 
1:56 
I met victim parents. 

 
1:59 
I was working in a government agency, a local agency, and one of the things we were focused on was crime. 

 
2:11 
And I met parents and I learned that the parents of a missing child in those days were basically on their own. 

 
2:20 
Police departments had mandatory waiting periods. They couldn't even take a report for 24, 48, 72 hours. 

 
2:28 
You couldn't enter missing child information into the FBI's national crime computer. 

 
2:34 
You could enter information about stolen cars, stolen guns, but not stolen children. 

 
2:41 
And the result was that kids were forgotten and undervalued. 

 
2:48 
So I became involved in that effort, hosted a conference that brought together victim parents, leaders of Congress, and law enforcement officials, and as a result of that conference, we adopted a 23-point action agenda. 

 
3:06 
The first item was to ask Congress to pass a Missing Children's Act, to simply make it possible to enter missing child information into the FBI's databases. 

 
3:19 
And secondly, the goal was to create a kind of national clearinghouse, a National Center, because the challenge at the time, and even today, is that America is a nation of 18,000 different police departments that basically don't talk to each other. 

 
3:39 
And so the goal was to try to raise awareness, create a national response to the problem. 

 
3:45 
And the results have been dramatic. 

 
3:49 
More missing children come home safely today than ever before, but through that I discovered the range of vulnerability that modern kids have and the range of victimization that they have to deal with. 

 
4:03 
And even today, most of those kids are what we call hidden victims. 

 
4:10 
Speaking of that range of harms and threats of harm to kids, if you'll come with me for a second, Ernie, I want to take you back a few years, when I was working as an administrator for a large state's child welfare agency. 

 
4:25 
And Ernie, it was just a matter of time. 

 
4:28 
It was just a matter of time. If someone opened any kind of group home or residential treatment facility for kids in the foster care system, it was just a matter of time before I would start to receive reports of men outside that location, of men halfway down the block walking the sidewalk who would approach kids, and you can imagine what might happen from there. 

 
5:03 
It was just a matter of time, and while it was a really tough situation, Ernie, the problem was localized. 

 
5:11 
It was visible. 

 
5:13 
If we were willing to open our eyes, we could see it. And a reasonable person could at least imagine what we could do about it. 

 
5:20 
What has changed since then in terms of threats of harms to kids? 

 
5:25 
What has changed? 

 
5:26 
And what's happening now? 

 
5:29 
Well, what's happening now is that more than 5 billion human beings are on the Internet, and one in three of those is a child. 

 
5:42 
Today kids are carrying the Internet around in their hands, so they are vulnerable, they are reachable, they can be contacted. 

 
5:53 
It is easy to approach them. The story you give, I remember having my own version of that. 

 
6:00 
One day the head of the county's group homes came to see me and said kids are disappearing from our group homes and two weeks, 3 weeks later are showing up on the streets being prostituted for sex and for money. 

 
6:17 
And what he said was, you know, our agencies treat them like runaways. 

 
6:25 
They look like anybody else's kids. 

 
6:27 
So law enforcement doesn't get involved and the two parts of the system don't connect. 

 
6:33 
So we did something that didn't seem all that radical at the time. 

 
6:37 
We created a police social work team, a kind of predecessor of today's special victims units, and we made the police officer and the social worker ride together and share information and work these cases. 

 
6:52 
That's continuing to happen today, except kids are on the Internet, and they are engaging in these kinds of activities mostly involuntarily. 

 
7:05 
They're being persuaded, they're being groomed, they're being brought into these situations and they're not telling anybody. 

 
7:13 
So it is more complex today, because we used to tell parents: you put the PC in the living room, in a public area. 

 
7:23 
So you can monitor. 

 
7:25 
Sure, monitor how it gets used. 

 
7:27 
That's responsible parenting. 

 
7:28 
That should solve it, right, Ernie? 

 
7:30 
Exactly. 

 
7:31 
That will do the trick, but it doesn't do the trick in an era in which kids carry the Internet around in their hands via mobile devices. 

 
7:42 
And so what happens is parents don't learn about what's going on in their kids' lives. 

 
7:49 
They don't learn that people they don't know are interacting with their children in real time. 

 
7:57 
We just did a study that found that when kids are on online gaming platforms, they're in public, and an individual can contact them, to groom them for all kinds of purposes, within 19 seconds of the time they join that platform. 

 
8:20 
So the challenges are greater than ever, and what we're trying to do is awaken the public, awaken policy makers, to the fact that there are real risks to kids online, that as a result of technology, kids are more vulnerable than ever before, and that we don't find out about their vulnerability and their victimization as readily as we used to. 

 
8:47 
This comes in different forms, doesn't it? 

 
8:50 
Including, you mentioned kids being lured in. 

 
8:54 
Sometimes they're generating the content that's later used against them, aren't they? 

 
8:59 
No, no question. 

 
9:01 
I mean, the whole concept of sexting, of self-generated images. 

 
9:08 
One of the big challenges we face today, which the FBI Director just talked about to Congress, is a problem called sextortion, in which criminals in many parts of the world target kids. Much of this is coming out of Africa. 

 
9:23 
This is not happening in Kansas City. Young boys, particularly teenage boys, are being approached online by people who describe themselves as interested and attractive young girls. 

 
9:41 
And they're asking those boys for sexually explicit images. 

 
9:46 
And the boys are then generating those images, sending them online. 

 
9:52 
And those images are then being used to extort the boys for money by organized crime groups. 

 
9:59 
And the results? I mean, one, these kids don't tell. 

 
10:04 
They try to pay, but the criminals don't go away. 

 
10:09 
And there are other serious ramifications, including mental health issues. 

 
10:14 
The CDC tells us that the second leading cause of death today in the United States for 9 to 14 year olds is suicide. 

 
10:26 
And and we've heard just a few days ago that it's the number one leading cause of death for young teens in Colorado. 

 
10:35 
So the mental health implications are huge. 

 
10:38 
The whole problem of sextortion has grown 7,200% in the past year, and it's affecting younger kids, and it's having lifelong impacts on those kids. 

 
10:53 
And by and large, America has not yet awakened to the problem. 

 
10:57 
A second element of this that we talk about is what we used to call child pornography. 

 
11:03 
We now call child sexual abuse material. 

 
11:06 
Why? 

 
11:06 
Why the change in language there? 

 
11:08 
Yeah, well, simply because it's not really pornography. 

 
11:13 
I mean, this is not pornography meant to entice or to allure. 

 
11:20 
These are crime scene photos. 

 
11:23 
These are images of the sexual abuse of a child. 

 
11:26 
Because they could not by law be consensual. 

 
11:30 
Absolutely, because these are children. 

 
11:33 
And so more than 30 years ago, we created at the National Center for Missing and Exploited Children a CyberTipline where people could report suspected child pornography, what is now called child sexual abuse material. 

 
11:50 
So that information could be provided to one of America's 18,000 police departments, to the appropriate investigative entity who could follow up, because these are crimes. 

 
12:01 
This is child abuse, but it's also a crime on the part of the person who's doing it, who's distributing the images, who's creating the images. 

 
12:13 
It took 20 years before that CyberTipline handled as many as a million reports in a given year. 

 
12:21 
Last year they handled 36 million. 

 
12:24 
Wow. Hold on, Ernie, let's just stop and think about that for a second. 

 
12:30 
Repeat that number and what that meant. 

 
12:32 
Say that again. 

 
12:34 
36 million reports of child sexual exploitation, of child sexual abuse material, on the Internet were provided to the National Center for Missing and Exploited Children, which is now disseminating them not only to law enforcement in America, but around the world. 

 
12:53 
Because this is a global phenomenon. 

 
12:57 
AI now waltzes into the picture, doesn't it? 

 
13:00 
These materials can be created in some form. 

 
13:04 
What's a deepfake? 

 
13:05 
What does that mean? 

 
13:06 
Well, a deepfake is an image, typically of a real person, that is created through artificial intelligence. 

 
13:15 
There was just a recent flurry of activity around deepfakes of the singer Taylor Swift. It wasn't her. 

 
13:27 
But they were phony images. 

 
13:29 
That's going to complicate this problem enormously, because each of these images, again, is a crime scene photo. 

 
13:43 
There's a real victim in that photo. 

 
13:45 
So the goal is to try to place that child somewhere on planet Earth, identify the appropriate law enforcement agency who can identify the child, rescue the child and identify whoever created that image, whoever abused the child to begin with. 

 
14:03 
So the purposes are twofold. 

 
14:05 
One, it's to apprehend the criminal, the exploiter. 

 
14:10 
But most importantly, it's to rescue the victim. 

 
14:13 
Because fundamentally this is child abuse. 

 
14:17 
It's a child. 

 
14:18 
That's why we call it 21st century child abuse. 

 
14:21 
When one thinks of child abuse, they think of harm done to a child physically or emotionally or in whatever way. 

 
14:29 
But what's changed today is the mechanism through which the child is abused and how that abuse is memorialized. 

 
14:40 
One of the challenges we face today as we deal with victims is that you may identify the abuser, you may bring them to justice, but the abuse lingers on the Internet forever. That child has to live with the fact that the abuse has been captured and will be a part of their life forever. 

 
15:06 
So one of the things that we're trying to do now is identify those images and take them down. 

 
15:14 
You mentioned gaming platforms. 

 
15:17 
Not all of us are gamers. 

 
15:20 
I remember Pong, I remember that one. 

 
15:23 
But when I think of a video game and I see kids playing a video game, there's something that they're manipulating and it's shooting arrows or whatever it's doing. 

 
15:31 
How does that have anything to do with kids being sucked into this? 

 
15:36 
How does a gaming platform make that happen? 

 
15:39 
Well, when you're on a gaming platform, you're in public, you're playing games with other people, OK? 

 
15:48 
And those other people are not just other nine-year-olds or 13-year-olds or 16-year-olds. 

 
15:55 
All kinds of people are on that platform. 

 
15:58 
So one of the challenges is those who prey upon children use those platforms to gain legitimate access to a child. 

 
16:09 
If the person contacting you, introducing themselves to you, developing a relationship with you, is a fellow gamer, they are inherently less threatening to a child. 

 
16:24 
It's the old stranger danger situation, and one of the things we've tried to do for a generation is debunk the myth of the stranger. 

 
16:33 
Because most of those who prey upon children are not strangers in the eye or mind of the child. 

 
16:41 
They're either someone known to the child, at least casually. 

 
16:45 
Or they're someone with whom the child has a relationship. 

 
16:48 
So through a gaming platform, a person who wants to gain access to a child, who wants to groom the child for other purposes has an ideal opportunity to meet them, to engage with them, to communicate with them, to take that communication offline into another setting. 

 
17:09 
So to their credit, some of the leading gaming companies have made a serious effort to address that through content moderation, observing the interaction of those on the gaming platforms so they can intervene earlier and stop these things from happening. 

 
17:31 
The problem is that not everybody is doing it, and it's not mandatory. 

 
17:38 
It's voluntary. 

 
17:39 
There's no law. 

 
17:40 
There's a federal law that requires technology companies, social media companies, to report suspected child sexual abuse material on their platform to the National Center. 

 
17:57 
But they are only required to report it if they learn about it. 

 
18:01 
That is, if somebody calls them and tells them this is going on. There are a number of companies that are using tools to proactively look for it. 

 
18:11 
But there's not the kind of universal regulatory approach that ensures that this kind of abuse is identified and reported. 

 
18:20 
You use the term modern slavery sometimes, and I would imagine that many Americans are vaguely aware of the idea that a kid can be trafficked, lured in, and then trafficked sexually in some form. 

 
18:37 
Maybe we've heard those stories, seen something in a television episode that evokes that. 

 
18:47 
But I think many of us might think this is an isolated situation with a small number of just, you know, seriously dangerous creeps who might prey on individual children. 

 
19:01 
Is it a localized and isolated thing? 

 
19:05 
Or is this idea of modern slavery something larger? 

 
19:09 
It is. 

 
19:09 
It is far larger. 

 
19:11 
The latest estimates are that it is $150 billion industry worldwide, and it has increasingly migrated from the streets to the Internet. 

 
19:25 
Traffickers are using social media platforms and other technology platforms to gain access to kids, to vulnerable kids. 

 
19:35 
They prey on that vulnerability. 

 
19:37 
They win the confidence of the child, or use some lure that brings them in, and then they sell them for sex or for other purposes. 

 
19:47 
So it is. 

 
19:47 
It is a massive problem, and it's another problem in which there needs to be far greater attention to the technology aspect of it. 

 
19:56 
For example, the UN Office on Drugs and Crime has said that this is a problem that has migrated to social media for very basic reasons. 

 
20:08 
It's easier, it's less risky, it's much more profitable. 

 
20:14 
So yes, modern slavery is alive and well. 

 
20:20 
There was just a survey done by the International Labour Organization and the International Organization for Migration that estimated the total number of victims worldwide at 50 million, of which one out of three is a child. 

 
20:37 
And that's an increase of about 20% over the last 5-6 years. 

 
20:43 
So it became a larger problem during COVID, and it's become a larger problem with the emergence of social media and the power and the reach of the Internet. 

 
20:54 
And with social media, that's where the kids are. 

 
20:57 
I'm reminded of John Dillinger, the bank robber who was asked why he robbed banks and he said that's where they keep the money. 

 
21:08 
Social media and online now are where kids are. 

 
21:11 
So how do we respond? 

 
21:13 
How do we deal with this? 

 
21:14 
You're leading something now called the WeProtect Global Alliance. 

 
21:19 
I'm aware that you've been involved in legislative efforts not just across the US but in the UK as well. 

 
21:26 
What do we do about this, Ernie? 

 
21:28 
Well, I think there's a balanced, reasonable approach, and we believe that approach is that there needs to be some form of regulation, some standard for how you operate. 

 
21:46 
This is not designed to harm the business model of these companies. 

 
21:50 
A good friend of mine who's a leader in this space says that this is our seat belt moment. 

 
22:00 
What does that mean? 

 
22:01 
And her argument was that a generation ago we were concerned about individual safety and therefore government mandated seatbelts in automobiles. 

 
22:15 
The automobile industry opposed it. 

 
22:19 
They thought this was an unreasonable burden. 

 
22:22 
But it hasn't harmed their business model. 

 
22:24 
It's saved millions of lives, and it has also spawned a range of other things, like bicycle helmets and safety seats. 

 
22:36 
The concept that's talked about is safety by design. 

 
22:43 
How do we minimize the negative aspects of a problem through design, through engaging in a reasonable way with the companies that are delivering this? 

 
22:56 
So, for example, there are tools now to find, identify and remove child sexual abuse material from Internet sites, from Internet servers. 

 
23:10 
A tool called PhotoDNA was developed by Microsoft almost 15 years ago. It matches the digital fingerprint of an image, so that if an identified image is found on a server, it can be removed. 

 
23:26 
Hundreds of companies are using that tool voluntarily and are reporting. 

 
23:31 
That's one of the reasons there are 36 million reports now instead of a smaller number. But not all of them are doing it. 

 
23:40 
Europe just proposed to mandate scanning, monitoring, and reporting; it wasn't passed by the European Parliament or the European Commission. 

 
23:51 
But hopefully we're moving in that direction and there will be a way in which we can maximize the positives, the strengths, the importance of social media and the Internet, while at the same time protecting children. 

 
24:06 
One of the arguments against it is we need to protect user privacy. 

 
24:11 
Well, my question to that is, whose privacy? 

 
24:16 
What about the privacy of the child who's being sexually exploited and whose images are going to be on the Internet forever if we don't do something about it? 

 
24:27 
I'm pro privacy. 

 
24:29 
I believe in maximizing privacy rights, but I think there are limits and I think there's an appropriate balance we need to find and achieve. 

 
24:38 
There are also free speech concerns, aren't there, any time we get the law or government involved in regulating human activity, including the expressions of people, something that they might post and whatnot. 

 
24:55 
A lot of what gets posted on the Internet is healthy and is making a better society. 

 
25:03 
Is this an issue of free speech? 

 
25:06 
I I don't think so. 

 
25:09 
I'm a free speech zealot. 

 
25:12 
You know, I believe in maximizing the ability of every person, of every entity, to exercise that free speech right. 

 
25:21 
But I go back to an old Supreme Court decision, back in 1918 I think it was, in which they talked about shouting fire in a crowded theater. 

 
25:33 
Speech is not absolute. 

 
25:36 
There are limits based upon the logical implications of the exercise of that speech. 

 
25:43 
So I think that's where we are. 

 
25:45 
I think we need to maximize free speech. 

 
25:47 
But I think there are, there will be appropriate limits. 

 
25:51 
And I think one of those appropriate limits needs to relate to the protection of children who are hidden victims. 

 
25:59 
In these cases, the child rarely tells anybody. 

 
26:03 
He doesn't tell mom, doesn't tell dad. 

 
26:06 
If he tells anybody, he's going to tell a friend. 

 
26:08 
So overwhelmingly, these are cases we don't know about, we don't learn about. And as a result, a generation of children is being harmed. 

 
26:19 
And those harms are being manifested in lots of ways, including damage to their mental health and not only physical harm, but other kinds of harm. 

 
26:29 
So I think there is a balance. 

 
26:31 
I think we can find the balance that maximizes free speech but protects the nation's children. 

 
26:38 
And not to get too lawyerly about it, but child sex abuse materials are not protected speech under the US Constitution, are they? 

 
26:48 
Not at all. 

 
26:49 
Absolutely not. 

 
26:51 
So you have worked on legislation. 

 
26:54 
You had more than a little bit to do with legislation in the United Kingdom. 

 
26:59 
I know that many of us have heard at least something about legislation in California. 

 
27:05 
I know that you're very aware, because you've been a part of this effort, of an effort in the State of Colorado, Senate Bill 158, which is working its way through the legislature as we speak and tries to address some of these issues. 

 
27:23 
Can you give us an overview? This is a deeply complex topic, with legislation in so many jurisdictions, but what's happening on the legislative front in the US and around the world to address this? 

 
27:39 
Well, it is complex, and it really comes from a couple of places. 

 
27:43 
One is protecting children's privacy. 

 
27:48 
The reality is Harvard University did a study recently. 

 
27:52 
They found that the five or six leading social media companies generate something like $11 to $12 billion a year off the information from children for advertising purposes. 

 
28:07 
So, you know, there are multiple elements of this. 

 
28:10 
So one of the efforts, and that really was the genesis of California's Age-Appropriate Design Code law, is based on the efforts of a UK-based organization. 

 
28:22 
A federal court in California ruled that it had First Amendment flaws, and so that's in the appeals process. 

 
28:31 
There have been other efforts to implement what's called age verification, and that's an issue in which I've had some involvement as well. The premise is very simple. 

 
28:46 
For generations there have been certain products, certain services, certain locations that children can't access, so-called adult content. 

 
28:57 
As the Prime Minister of the UK said, you can't walk into a video store in downtown London if you're 10 years old and purchase an X-rated DVD. 

 
29:08 
But the same child can access the same content online. 

 
29:13 
So there have been efforts, and there are a number of places where age verification laws have been enacted: the UK, France, Germany, and individual states in the US. 

 
29:25 
Can that actually work? 

 
29:28 
Because many of us have clicked on something at some point. 

 
29:31 
Yes, I'm 18. 

 
29:32 
Yes, I'm not a robot. 

 
29:34 
And then off we go. But that's not age verification. 

 
29:37 
That's something any nine-year-old could figure out: if you click the button that says I'm 18, you get in. 

 
29:44 
But the concept is real. There is now an Age Verification Providers Association, and commercial enterprises have developed around the world to do this, including some very prominent companies in the United States. 

 
29:57 
We already do it in the United States for online gambling and for the purchase of alcohol and for other purposes. 

 
30:06 
So the question is how you do it. 

 
30:09 
But there are ways to do it reasonably, whether it's through an ID, a state-issued ID, or through another kind of age verification. 

 
30:21 
So that's one element of this. 

 
30:23 
In Colorado, the focus is not just on child exploitation, but also on drug sales online. 

 
30:32 
The number of fentanyl deaths in Colorado is off the charts. 

 
30:37 
Gun sales online. 

 
30:40 
During a recent hearing in the Colorado State Senate, a senator said we have among the toughest gun laws in the United States. 

 
30:49 
It's hard to buy a gun without going through the steps, he said. 

 
30:54 
What? 

 
30:54 
Why do you need this? 

 
30:56 
Well, a District Attorney in Colorado answered that. 

 
31:00 
These aren't legal sales. 

 
31:03 
Kids are buying guns on social media sites. 

 
31:06 
I heard a community worker talk about how kids can also readily purchase online the components you need to make your own gun. 

 
31:17 
A so-called ghost gun, which is not registered, with no serial number or anything like that. 

 
31:22 
You know, they buy their own little Erector set, and I'm dating myself with that one, but they can even create their own gun. 

 
31:30 
Well, that's the world we live in. 

 
31:34 
And part of the effort here, in a reasonable way, is to put guardrails in place. 

 
31:42 
And the Colorado law simply requires social media companies to report to the Colorado Attorney General, who already has authority under the Colorado consumer protection law to take reasonable steps. 

 
32:00 
So we'll see how this shakes out, how individual states do it. 

 
32:04 
There are at least a dozen other states. 

 
32:06 
They're looking at similar kinds of laws. 

 
32:08 
Maryland just passed one, and Minnesota is looking at passing one. 

 
32:13 
So I think there is recognition that we need to do something more and that the goal is not to harm these companies. 

 
32:25 
They do enormous social good. 

 
32:27 
If you try to take that resource away from kids, you're going to have a revolt on your hands. 

 
32:33 
But how do we do it in a way that maximizes the protection of the child and minimizes the exploitation of the child? 

 
32:42 
That's the challenge. 

 
32:45 
And the automakers who didn't want the seatbelts are now making plenty of money building much safer vehicles. 

 
32:53 
It's one of the best things that ever happened to them. 

 
32:56 
It didn't hurt their business at all. 

 
32:58 
It helped their business. 

 
33:01 
It expanded their market. 

 
33:04 
And I think there are lots of examples like that that come into play. 

 
33:10 
Ernie, you're speaking to an international and multidisciplinary audience. 

 
33:17 
What's your call to action for each of us? 

 
33:22 
If we're a caseworker in New Zealand, or an advocate in the UK, or a judge in the state of Ohio in the US, what can we do to help? 

 
33:34 
Well, I think the first thing you can do is make yourself more aware of what 21st century child abuse really is and the range of harms that are affecting kids and how many of those kids are hidden victims. 

 
33:51 
One of the most important things we can do is do a better job of identifying victims and getting them help. 

 
33:58 
So wherever you are. And let me say, you mentioned the WeProtect Global Alliance: we do a global threat assessment. 

 
34:06 
And one of the things we did, in partnership with the Economist magazine, was a global survey, 58 countries by region, to determine how many kids are really being harmed. 

 
34:21 
And we gave a range of sexual harms that can happen to a child online. 

 
34:27 
We interviewed 18 and 19 year olds. 

 
34:30 
So we didn't try to interview 10- and 12-year-olds; we asked 18- and 19-year-olds what had happened to them as children. 

 
34:37 
And the numbers were stunning. 

 
34:40 
54% of those interviewed said they had experienced at least one of those sexual harms online as a child. 

 
34:50 
And the interesting thing is, it was not just the United States. The United States and Canada had the highest incidence, 71%; 67% in Australia and New Zealand; 65% in Western Europe. 

 
35:10 
But the global spread of victimization was huge: 54% in Central America, in South America, in South Africa. 

 
35:26 
The lowest on the planet was 44%, in the Middle East and North Africa and in Eastern Europe. 

 
35:33 
So the message is wherever you are, this is a problem you need to wake up to and you need to motivate and mobilize your public officials to do more about it. 

 
35:46 
And there are simple, basic things you can do. 

 
35:48 
This is something that's not going to cost billions of dollars or pounds or euros or whatever. But the single most important thing we can do is put this on the agenda of policy makers. 

 
36:02 
You know, there are a lot of things that our elected officials and policy makers have to worry about today, from climate change to immigration to infrastructure to the economy. 

 
36:13 
But this is one of them. 

 
36:16 
Child abuse and the risks associated with 21st century child abuse and the role of technology really need to be on that priority agenda. 

 
36:28 
Thank you Ernie Allen for that and for all you've done and will do. 

 
36:36 
Speaking for everyone listening to this now, we're all happy to join you in this work. 

 
36:41 
Thank you, Ernie. 

 
36:42 
Thank you, Kendall. 

 
36:43 
Good to be with you and thank you to our listeners. 

 
36:47 
That conference is again on the horizon. 

 
36:50 
The Call to Action to Change Child Welfare Conference is virtual and international and takes place this year, 2024, October 7 to 10. 

 
36:59 
Check it out and register at www.ctaconference.org. 

 
37:08 
And do join us again for this podcast soon, and often. This has been Radio Kempe.