Podcast on Crimes Against Women

Reforming Laws to Protect Against Digital Violence & Image-Based Sexual Abuse

Conference on Crimes Against Women

Brace yourself for a candid conversation about the urgent and often hidden issues of violence and digital abuse against women. Dani Pinter, Senior Legal Counsel for the National Center on Sexual Exploitation, joins us to pull back the curtain on the alarming reality of non-consensual sexual exploitation online. We confront shocking statistics and explore the staggering impact of these crimes on victims, as Dani shares insights into the history and mission of her organization to combat these deeply entrenched societal problems.

Our discussion takes a hard look at image-based sexual abuse and the complex landscape of legal accountability surrounding platforms like Backpage and Pornhub. We draw parallels between past legal actions against industries like big tobacco and current efforts to hold websites responsible for enabling exploitation. The conversation highlights the alarming rise of voyeuristic content and fake sexual images created by AI, as we unravel the challenges of curbing these offenses in the digital age and the significant implications for privacy and consent.

Navigating the murky waters of online accountability, we tackle the barriers victims face as they seek justice and content removal. The conversation zeroes in on Section 230 of the Communications Decency Act, emphasizing the urgent need for legal reform to empower victims and hold tech companies accountable. We also spotlight the progress made in addressing image-based sexual abuse, including amendments to the Violence Against Women Act, marking key steps toward more robust legal protections. Join us as we chart a path forward in advocating for victims and challenging societal norms and institutional practices.

Speaker 1:

The subject matter of this podcast will address difficult topics: multiple forms of violence, and identity-based discrimination and harassment. We acknowledge that this content may be difficult, and we have listed specific content warnings in each episode description to help create a positive, safe experience for all listeners.

Speaker 2:

In this country, 31 million crimes, 31 million crimes, are reported every year. That is one every second. Out of that, every 24 minutes there is a murder. Every five minutes there is a rape. Every two to five minutes there is a sexual assault. Every nine seconds in this country, a woman is assaulted by someone who told her that he loved her, by someone who told her it was her fault, by someone who tries to tell the rest of us it's none of our business, and I am proud to stand here today with each of you to call that perpetrator a liar.

Speaker 1:

Welcome to the Podcast on Crimes Against Women. I'm Maria McMullin. The National Center on Sexual Exploitation is the leading organization preventing sexual abuse and exploitation at mass scale by eliminating institutional practices and societal norms that perpetuate these harms. According to a 2017 US study, one in eight participants had been targets of the distribution, or threat of distribution, of sexually graphic images without their consent, with women significantly more likely than men to have been targeted by this abuse. Moreover, studies reveal that approximately 51% of perpetrators demand sexual photographs, 42% demand that the victim return to the relationship, and 26% demand that the victim meet them in person, usually for sex. Additionally, the National Center on Sexual Exploitation learned that 73% of victims of IBSA, image-based sexual abuse, did not turn to anyone for help when they discovered that sexual images of themselves had been shared without their consent. In another study, Northwestern University analyzed 462 accounts from teenage girls pertaining to their exchange of nude or semi-nude photographs; two-thirds reported struggling to decide whether, when, and to whom they should send photographs. This episode will take a deep dive into sexual exploitation through digital means and discuss what can be done to minimize and ultimately eradicate this rapidly expanding form of abuse.

Speaker 1:

Our guest for this discussion is Dani Pinter, who serves as Senior Legal Counsel for the National Center on Sexual Exploitation and its Law Center. In this role, Dani serves as a voice for human dignity in precedent-setting legal cases on behalf of victims of sexual abuse and exploitation. She also drafts and consults on state and federal legislation to support victims of sexual exploitation and hold exploiters accountable. Dani speaks regularly on a variety of exploitation topics, with a special focus on protecting youth in a digital age and legal remedies for survivors of exploitation and abuse. Dani Pinter originally joined the National Center on Sexual Exploitation Law Center at its inception in August of 2015. She was instrumental in reinvigorating the Law Center and traveled the country building relationships and raising awareness. Notably, she drafted the first piece of legislation recognizing the public health impacts of pornography.

Speaker 1:

Dani, welcome to the podcast. Hi, thanks so much for having me. It's good to be with you. You've been working with the National Center on Sexual Exploitation Law Center since it began in 2015, is that right? That's right. Can you tell us a little bit about the National Center on Sexual Exploitation, its history, and the founding of the Law Center?

Speaker 3:

Sure, so our organization is actually over 60 years old. It started, like I said, 60 years ago as Morality in Media and was focused on the harms of pornography. It did that work for a while, then went kind of dormant for several years, and under new leadership around 10 years ago it was picked up, kind of dusted off, and changed a little bit. The name was changed to the National Center on Sexual Exploitation, and the focus shifted from pornography to the full spectrum of sexual exploitation, to include sex trafficking and all of those harms. And the Law Center was founded in 2015 because the nonprofit was largely a public advocacy and education organization, and we wanted to add that legal aspect to the advocacy.

Speaker 3:

At the NCOSE Law Center specifically, where I'm senior legal counsel, we represent victims and survivors of sexual exploitation in civil cases to hold accountable those people, or those corporations, that facilitated their abuse. Our mission and what we believe is that there are major corporate offenders exacerbating harm, and when you can hold them accountable, that's how you get top-down societal change. So if you think about big tobacco, the way we really helped people, even helped them learn the truth that it was unhealthy, was by going after big tobacco, because big tobacco was wreaking havoc. Similarly, we think of major websites like Backpage, which was facilitating sex trafficking. We have a case against Pornhub, which was massively distributing all kinds of sexual abuse, including child sexual abuse, child pornography. We sue them.

Speaker 3:

So if you're a survivor who's experienced image-based sexual abuse and you want to hold someone accountable, maybe there's a website that you really feel played an unbelievable role and really exacerbated the harm, where without that website facilitating it you wouldn't have been harmed the way that you were: let us know, because we want to help you. We want to help you hold those corporations accountable and make that change so this doesn't happen to anyone else. But the Law Center also works on policy; we advise on and draft public policy. So those are our main focuses.

Speaker 1:

That's outstanding. Now, this is in DC, correct? Correct. Okay. And a special focus of your work is image-based sexual abuse, which is sometimes called IBSA, and we may refer to it as IBSA in this show. Help us define that.

Speaker 3:

So image-based sexual abuse is exploitation involving images, pictures, or videos, and fundamentally, like all sexual abuse, it's a violation of a person's privacy and personal autonomy.

Speaker 1:

So can you give us some examples of what abuses are image-based?

Speaker 3:

Yeah, sure. So that could be anything from a sexual assault that was filmed, to a secret recording, which could be one person recording the other during a sex act when that person doesn't know they're being recorded, or both people being recorded without realizing there's a hidden camera. Or it could be a hidden camera placed in a private place like a bathroom. Or it could be an image that someone consensually shares with another person but did not consent to have distributed to third parties or on the internet. All of those things can be a part of IBSA.

Speaker 1:

How common is this type of offense?

Speaker 3:

Well, it's extremely common and has exploded massively, even since the pandemic, when our lives became even more online. But it's been around ever since the internet; I mean, ever since you could take a picture of someone, there have been non-consensual images taken and shared. Of course, the internet has exacerbated that, particularly because it's very difficult to hold online websites accountable the way you could maybe hold distribution companies or places in the real, physical world accountable. The internet is harder. It's harder to get things taken down, it's harder to hold perpetrators accountable. So it's really exploded, and the more our lives become virtual, the more this abuse grows.

Speaker 1:

So are there any types of image-based abuses that are more common? I mean, you mentioned videos; there are photographs that can be put on websites; it can be used to blackmail people. What are the most common tactics?

Speaker 3:

You know, I want to make a distinction between image-based sexual abuse that involves children and image-based sexual abuse that involves adults, because of course it affects both. When I talk about IBSA, I am usually talking about adults, because, in my opinion and based on our laws, image-based sexual abuse that involves a child is child pornography. That's a much more severe crime, it's contraband, and so I want to keep that as the most severe form, and I separate it. So, setting aside the ways that children are exploited this way, which is growing and exploding as children are completely online these days, the common scenarios are people consensually sharing images that someone else, you know, their partner, then distributes to other people, or those images being taken because there was a hack or some other kind of privacy violation. Somehow a consensual image gets distributed. That's really, really common.

Speaker 3:

But the use of hidden cameras and secret recording is also unbelievably on the rise, especially because these hidden cameras can be almost undetectable now. I think we've all seen the news stories about hotel rooms, Airbnbs, all this kind of stuff, but there are also just bad actors filming women without their knowledge. And the truth is, unfortunately, there's a high demand for consuming this kind of voyeuristic pornography online, this whole concept of a woman or a man being filmed without knowing about it. Because of that, there's an incentive for people to do it.

Speaker 1:

Unfortunately. How is AI, artificial intelligence, really compounding this issue?

Speaker 3:

Well, because most of these images and their distribution involve the internet, it's going to touch AI, right? AI is going to be involved with any content that's on the internet today. And the truth is, the way that AI is progressing so rapidly, anyone with a photo on the internet, so you have social media, or even if you don't have social media but there's a picture of you on someone else's, like your mom has a picture of you and puts it on social media, somebody could take that image and create a sexual photo of you, a nude photo, even a video. They could take your face and put it into a pornographic video that would look so real that no one would be able to detect that it's fake or that it's AI.

Speaker 1:

That is so unbelievable to me. I mean, we hear about it a lot. It's happened to many celebrities; those are usually the ones we hear about on the news. And that's a deepfake, right? Right. There are deepfake images, and it's everywhere, and a lot of times I can't even trust what I'm seeing on the internet, you know, even just looking at houses or interior design or something like that. Nothing looks real anymore. Everything looks too perfect, or it can't possibly be a real house or a real room; it's all so perfectly laid out, or even architecturally impossible sometimes. Some of the things I see, I'm like, that's just not possible. So it's making everything much more challenging because of AI. AI can do, I think, many wonderful and amazing things for us, but it can do so much harm as well. What are the implications associated with the criminalization of IBSA where AI is concerned?

Speaker 3:

Well, I think we need to address it. With AI, like with the internet writ large, there are no current regulations, no strong regulations or even remedies for victims. There's no accountability and there's no responsibility. So someone designing an AI model is not making it safe; they're putting it out as open-source code that any bad actor could use to create bad material. I think that is the wrong attitude. I think it's really great that we have all the technology we have today, that we have the internet we have today. Part of that is because of this sort of hands-off regulatory strategy we've had. But I think it's time to re-examine that, because the real-world consequences and harms of basically having a lawless internet are real and they're severe. So I think it's time to put some balance in there. We can still have privacy, we can have innovation; we may have to give some of it up or slow it down so that we can keep people safe. So I think we need to do that.

Speaker 3:

Our laws are woefully behind when it comes to the non-consensual sharing of images, even without AI, and AI really isn't accounted for. But the good thing is there are actually three bills before Congress right now that would address AI and non-consensual images. Oh, can you tell us about those? Yes, so the SHIELD Act, the DEFIANCE Act, and the Take It Down Act all address non-consensual images, and I think two of the three specifically address AI. The Take It Down Act is my favorite one because, as a lawyer, what I have always found frustrating about these IBSA issues is, you know, copyright is handled. There is no copyright issue on the internet. LimeWire is gone.

Speaker 3:

We don't have a problem with videos or images that are copyrighted; they come down immediately, right? Like, if you go on YouTube, you are not watching a Disney movie. So that proves to me that, while it may be difficult, it is possible to control for some of these things. Yet when we talk about someone's own intimate images, there's no process like that. With copyright, there's a laid-out, already-built-out process: if there's a copyrighted image, you send a DMCA takedown, they have a certain amount of time to take that down, and if they don't, there are going to be big penalties. If they think it's allowed to be there and not a copyright violation, they can file a counter-notice, and then you can litigate that.

Speaker 3:

So the Take It Down Act would provide a process for victims to get their content taken down, hopefully quickly, because the penalties would kick in if the websites don't comply quickly. And it would also address AI non-consensual images. So as long as the image depicts an identifiable person, so it's AI, but you can tell it's me, you can tell it's you, you would be able to take action. You'd be able to say, hey, this is a non-consensual AI image of me, take this down. It would provide penalties against the websites, but also the perpetrators. So I really like that bill. But the SHIELD Act matters too: we don't even have a federal law that makes non-consensual sharing of images illegal, so the SHIELD Act would make it illegal. That's a big fundamental that we just need, for sure.

Speaker 1:

So how would a federal law like that work on the state level?

Speaker 3:

Well, it's about prosecution. Right now, not all states even have a law that makes non-consensual sharing of images illegal. Most states have something, but some don't, and it's a patchwork of standards. Some of them cover something, some are really, really narrow and not very workable. So this would put prosecutorial authority in the feds' hands. If you had an image that was non-consensually shared, you could report that to the FBI, and an assistant United States attorney would prosecute that case. Which is, honestly, I mean, in some ways you'd want this handled locally, and for some things locals handling it would be better; it's usually much faster. But the problem with the internet is it usually involves a lot of jurisdictions. So I think having federal prosecutorial authority actually makes sense, because federal prosecutors can cover all the jurisdictions where an image appears, whereas otherwise you'd have to go state by state, wherever the image is being uploaded from or the perpetrators are residing, that kind of thing.

Speaker 1:

So let's talk about some of those battles that you've had to fight with companies and others over taking down images. Give us an idea of how the center has approached companies, businesses, or entities that perpetuate or exacerbate IBSA, and some of the outcomes.

Speaker 3:

So we often help victims of sex trafficking or sexual exploitation try to get some of their images taken down. We just do that because it's the right thing to do; I wouldn't even say we're particularly adept at it, but we're at least trying. And our experience is, even when I send a legal letter to a website, most of the time it's ignored, even if it's a big website. We've sent hundreds of letters and requests to Reddit, for example, for child exploitation material, and they just ignored us and never took any of it down. And a lot of times, for adults, we may get content off some of the bigger sites, but it's been shared on tons of smaller foreign sites that won't respond, where it's hard to even figure out who really owns them. It's very difficult to get content taken down. The truth is, if you have abuse images online right now, hopefully things will change, but with the current legal landscape it's almost impossible to get them all removed. It's like a game of whack-a-mole.

Speaker 1:

That just seems absurd because, to the point you made earlier, if it was a copyrighted Disney movie and somebody put it up on YouTube, you're going to get sued. You can't do that; it's an infringement of the copyright. So I'm not sure how that's really any different from someone putting up a photo of you or of me against our will, where we can't get it taken down. It just seems ludicrous.

Speaker 3:

It is, because the violation of rights is the same. A violation of our legal rights has occurred, at least as much as a copyright infringement is a violation of legal rights. But the problem is there's a federal law called Section 230 of the Communications Decency Act. It was passed in 1996, at the beginning of the internet. It was actually a bill that was supposed to help protect kids online, but it was poorly written, a lot of it got struck down, and all that is left of the bill is what was supposed to be the concession to tech companies so they would agree to it, which said a website can't be held liable for what third parties put on their site. Now, that sounds kind of common sense, but it's been interpreted by courts to be almost blanket immunity. So anytime you try to sue a website like you would for copyright infringement, because copyright infringement is an exception, the court will just toss it out and say they're completely immune. It does not matter if they knew about it, it doesn't matter if they profited from it. They are not responsible for third-party content.

Speaker 3:

So, for example, we had two 13-year-old boys who were extorted on Snapchat into providing sexually explicit images of themselves. Those images later ended up on Twitter and were massively distributed. The whole high school knew. One of the boys was suicidal, and he had reached out to the platform and pleaded for them to take it down. They actually finally responded to him and asked for his ID, meanwhile keeping the content live. So they checked his ID, which showed he was a child, reviewed the material, which depicted a 13-year-old engaged in a sexual act, and said, we're not taking it down.

Speaker 3:

And so far the court has said, sorry, they're immune. They're immune for that because it was a third party that uploaded it, even though he's a child, even though it depicted a child, and even though it's contraband. So if the child possessed that, if he had it on his phone, he could go to jail. But Twitter, which acknowledged possession of this child pornography, claims they are immune. We're appealing that decision; we'll be having oral argument in February at the Ninth Circuit. But the current state of the law is that these websites are immune for the most heinous activity. That law is what's responsible for the landscape we have today, this sense that we can't do anything about things online, because of that law and how it's been interpreted. So that's what we view as our goal at the NCOSE Law Center. That's our job: to get some legal reform in this area, because otherwise victims are powerless.

Speaker 1:

For sure. So let's go back just a little bit, because I want to try to understand from a legal perspective: when did IBSA become defined and codified to where it became a pursuable offense in the courts?

Speaker 3:

So it isn't, really. There was a movement about 10 or 15 years ago by the Cyber Civil Rights Initiative to pass what they then called revenge porn laws, because they were lawyers whose women clients were having this issue, and there was nothing on the books they could do about it. It wasn't even a crime. So these lawyers got together pro bono with their clients and lobbied to have these state laws passed. That's why we have some laws.

Speaker 3:

It's usually called different things: revenge porn, or non-consensual sharing of images. There are those state laws, but, like I said, it's really not a clear federal offense. And as for civil remedies, meaning you could sue someone for this, there aren't clear civil remedies either. I think it was 2019 or 2020 when the Violence Against Women Act was amended to provide some civil remedy, so that was the first time. We're talking 2020-ish was the first time there was any kind of federal civil remedy for women or men or anybody who had this kind of abuse happen to them, but it's very, very limited in who you can bring it against.

Speaker 3:

And it excludes websites; you can't hold a website accountable. So this is really an open legal landscape. This is an emerging legal issue, and that's why we have a suite of bills before Congress right now to address and fill these gaps. And the Take It Down Act just passed unanimously through the Senate. So, for all the listeners here, let your congresspeople know that this is something you care about. Let them know this is a priority, that you're dumbfounded and upset that you would have no remedy, that your children would have no remedy, your loved ones would have no remedy, if this kind of abuse happened to you. And the truth is, it could happen to anyone, because we're all online.

Speaker 1:

Yeah, absolutely. And now let's back up just one more time, because you used the term revenge porn, and I want to get a definition for that: what it means and what it does not mean.

Speaker 3:

Sure. It's just a colloquial term that arose because there was a trend of boyfriends or husbands or male partners, if they were angry with their spouse or girlfriend or had broken up with them, taking explicit images and sharing them online as a way to get revenge, or using them to blackmail. There was a rise of actual revenge porn websites, where they called it that, they called it revenge porn, and so there was an entire community of men sharing and consuming this content. There were men who liked the fact that this was, you know, the girl next door whose husband is mad and is sharing this image against her will, and then there were men doing it, right? So that's why that colloquial term rose to prominence, and in our sort of collective conscience we got to know what it is.

Speaker 3:

If I say revenge porn, most people can kind of guess what I mean. But the limitation of that term is that people then don't understand image-based sexual abuse outside of that, and the truth is, image-based sexual abuse can happen for a lot of reasons. Male or female, but mostly male, partners can share private photos for motivations that are not revenge-based, right? They could do it not because they want revenge, but because they have a fetish where they like sharing images of their sexual partner, or they're just a nasty person, or they want to make money. There are lots of motivations outside revenge. So that's why we would advise not using that term and moving to image-based sexual abuse, because it covers more; it's more accurate.

Speaker 1:

Yeah, that makes sense. Thank you for giving us the background and context on that. Now, the center recently released a thorough guide on IBSA, for practitioners and service providers in particular, to use to identify, classify, and define IBSA so that they can adequately address it in their respective disciplines. Tell us about that guide and where it can be found.

Speaker 3:

Sure, so you can find that on our website, which is endsexualexploitation.org, and the specific web address would be endsexualexploitation.org/issues/image-based-sexual-abuse, with a hyphen between each word of image-based sexual abuse. But if you go to our website, image-based sexual abuse is a main topic, so you can navigate yourself there. What we do there is really try to compile the research and the testimony of individuals who've experienced this, to define terms, to discuss the context in which this happens, and to help people understand.

Speaker 3:

You know, because a lot of people will experience this and be confused as to whether this is even wrong. Is this even illegal? Is this abusive? They don't know how to feel. So it helps everyone understand what is right and what is wrong in this context, that it's okay to draw a line and say, yeah, okay, I did share that image with you, but that doesn't mean you should post it online. Because the problem is, I think a lot of people start blaming the victim and say, well, you shouldn't have shared that image with your partner, and so you can't really complain now that it's on Pornhub, right?

Speaker 2:

And that's just wrong.

Speaker 3:

You should be able to have a relationship and share images, and that doesn't mean blanket consent for that image to now be put on the internet and even be monetized, making money on a pornography website.

Speaker 1:

Yeah, absolutely. Now we talked a lot about the perpetrators, the people who are posting these images. Right, let's talk about the victims a little bit. As you work with clients and practitioners, what have you found to be some of the myths and misconceptions that these victims encounter?

Speaker 3:

I think the biggest one is that this doesn't really cause harm, when in fact it causes grievous, grievous harm. There are a couple of examples I can give. One is, you know, I am honored to serve survivors of sex trafficking or other abuse who are brave enough to want to hold others accountable and prevent this from happening to others. And they will say, as traumatizing and as horrifying as the original abuse was, the images being circulated is worse, because I feel I can't move on from that. I can't escape that. That's out there for everyone to see, and it won't end. It's an abuse that just keeps going.

Speaker 3:

And another example I want to give, because I think this was really educational for me, is I was speaking with a woman who is being targeted by an AI IBSA campaign. So somebody is taking images of her and creating pornographic content against her will. She doesn't even know who these people are, but she was not sexually assaulted. These aren't real images of her in the sense that they're not images of her actually engaged in sexual activity; they're contrived, they're totally AI-generated. And she said, in tears, it feels like I've been raped. It feels like I've been violated, because these are everywhere and people are consuming them. So that's what I think people don't realize: even if you didn't experience a sexual assault, having your images consumed by people getting sexual gratification out of them against your will is an unbelievable violation. It's humiliating, and it's just very, very harmful to the individual.

Speaker 1:

Oh yeah. It's so disgusting, and I'm just trying to get my head around what's happening to this woman. So this group of people who operate this AI are using her image and likeness and just putting it into pornographic images and videos, is that right?

Speaker 3:

Yeah, it's like a stalking, harassment campaign. They're harassing her, and this is their method. You know, stalkers and harassers stalk and harass women in an obsessive manner to destroy their lives, right? And now they're using AI to do it. They're just bombarding her with these images; she keeps getting some taken down, they keep appearing, and it's just this constant humiliation she can't escape. We don't exactly know what their motivations are, but yeah, they're stalking and harassing her via AI image-based sexual abuse.

Speaker 1:

So what's the objective here with these perpetrators? Is it to humiliate, diminish, destroy, just have fun, all of that?

Speaker 3:

All of those things. It can be purely because they want to make money. It could be because they get gratification from the humiliation of the person. It could be because they get a boost in their community; they're part of a community, and sharing these things gives them clout, gives them credibility. It could be to hurt the person. I have seen each one of those things at play. These things are happening for all of those reasons.

Speaker 1:

So, people who commit these types of offenses: how common is it for them to also engage in physical violence or other gender-based crimes?

Speaker 3:

So there's strong research showing that sort of low-level sex crimes have a connection to eventual, or undiscovered, very serious sex crimes, and a really current example of that, I think, is the Gisèle Pelicot case in France. You probably know, and your viewers probably know, because it has rocked the world. Gisèle Pelicot was a 65-year-old woman who recently discovered that her husband had been drugging her and then filming strangers coming to the house and raping her, and this included over 70 men in her community. 55 of them are being prosecuted, and her husband fully admitted it. But her husband was actually caught, and I think this is a detail most people miss, because he was doing upskirting videos in a grocery store. He was in a grocery store trying to snap pictures of women underneath their clothing.

Speaker 3:

So that's socially deviant behavior, right? That's antisocial behavior, and so that's a huge red flag. There is usually a connection: if somebody is engaging in that kind of dehumanizing, antisocial behavior, it's probably not an isolated incident, especially the more brazen it is. That's usually a sign that it has escalated to quite a degree, because they're now not even afraid of getting caught. Doing it so brazenly in public is usually a good indication that they've done it before and they're going to do it again. And doing it and getting away with it is a motivation and a reward for that behavior, which will often cause men to continue doing it and escalating.

Speaker 1:

So the men who've been raping her are being prosecuted. Is he being prosecuted as well?

Speaker 3:

He is. He's being prosecuted, and he fully admitted it, and that's now wrapped up and going, I think, to sentencing. The sad thing is, in France the most he can even get is 20 years. This went on for 10 years.

Speaker 3:

She was raped by over 70 men and contracted multiple STDs, and the most time he could possibly get is 20 years, which is just disturbing. But he admitted it. And what's crazy is all the other men, the 55 other men being prosecuted, were all trying to make excuses: I didn't know she was drugged, I thought this was a sex game, I thought she liked it, all this stuff. Even though, ironically, the main perpetrator, the husband, was saying, oh no, they knew. I fully told them the whole story. They knew she was drugged and she was unaware. And the other scary thing is, these were not a list of 55 sex predators in the neighborhood. They were all very normal men. There was a fireman, a local baker who actually knew her in her neighborhood, a nurse. These were your average, everyday men engaging in this wildly antisocial behavior. So that, to me, is such an indication of a bigger societal problem.

Speaker 1:

Absolutely. I mean, she's so courageous to even come out and talk about this and take it as far as she has, and it's something I think could possibly be the tipping point for the rest of the world in really understanding that this stuff happens and the scale is massive. Right, that's exactly right. Now, the center has been at the forefront of policy and legislation when it comes to combating IBSA, and particularly in the fight for corporate accountability. Have you been able to make any strides in this space?

Speaker 3:

Yes, we have. So we have a corporate advocacy department where we specifically reach out to corporations that we think, either knowingly or unknowingly, are facilitating sexual exploitation. You know, we've sat down with or sent letters to Snapchat, Google, Apple and identified the issues. And sometimes they really don't know what's going on, but sometimes they do, and I think, for the reasons I explained earlier, they know they can't be sued, or they feel they can't be sued. So at some point there's an equation that happens where they say, well, we could fix this, but it will cost us money and lose us engagement, whereas if we don't fix it, nothing will happen. And every time, the ultimate decision is to not do the right thing. We do end up having success, but it just takes an enormous amount of pressure.

Speaker 3:

So, for example, when Google was providing tons of laptops and things to schools for free, which is great, we were like, can you please turn on the safety controls for these kids by default? Because these schools, often underprivileged schools that really need them, don't necessarily have a full-time IT staff. They don't know how to implement that stuff to protect these kids. You're basically putting a really dangerous device in their hands, a device adults could use to harm them, where they could be exposed to harmful material. And Google wouldn't voluntarily do it. It's just crazy, you know. But ultimately, if we do a big enough campaign and get traction and there's a public outcry, they will. Or if they feel there's going to be legislative change they don't like, then they'll voluntarily make a change. So we helped, with many other advocates, arrange the January hearings earlier this year where the big CEOs of the tech companies came to Congress, and we provided a lot of material to Congress to ask them these questions.

Speaker 3:

For example, I remember Josh Hawley, or maybe, I can't remember who it was, I think it was Ted Cruz, actually, asking Facebook: you have this warning, what you're looking for may depict child sexual abuse, do you want to see the image anyway? And then you allow them to see it. It's like, why? You have an algorithm picking up that this might be child sexual abuse. Why are you not just deleting that image and not allowing the person to view it? Why are you allowing them to take that step of viewing it anyway? So, asking these tough questions. And they have made some changes. Finally, Snapchat's turned some default protections on automatically.

Speaker 3:

Instagram has made some changes too. For example, simple things: we said, Instagram, please don't let strange adults direct message minors they don't have a connection with. You know, they could tell it's not a parent, right? A parent, same last name, or they're already following each other. If this is an adult the minor isn't following, why should they be able to DM a minor? Just disable that. And it took forever, you know, but now they do that by default. So we do get these incremental improvements, but that's where the advocacy comes in. In our view, you have to have the public advocacy, you have to engage the grassroots, you need to get parents saying what they care about, and then you also need to push for legislative change or legal recourse, to sue, because without pressure, and without a feeling that this will affect their bottom line, they won't do it. And truly, reputational harm isn't enough. They have proven that reputational harm is not enough to motivate them to do the right thing.

Speaker 1:

Yeah, I mean, that's pretty obvious, right, because there's so much of the wrong thing that's actually still occurring. I'm guessing that people listening have a lot more questions, and I may not know what all of their questions are, but where could we send them for advice, guidance, resources, and additional information about this type of abuse and what they can do about it?

Speaker 3:

So come to our website, endsexualexploitation.org, and you can find our legal page too, where you can send a direct message. If you want to speak with a lawyer, we have a form there you can fill out, and we'll reach out to you. There's also our public inquiries email, public@ncose.com. We always get back to you. It might not be right away, but we'll always get back to you.

Speaker 1:

So, in addition to that, what recommendations would you give to practitioners, service providers, or just people who are concerned about this issue, not victims, but concerned people, to help them get involved as well as advocate for or protect victims?

Speaker 3:

So let your congresspeople know you care about this. I really believe at this point we need the laws to change; I don't think public pressure is enough, although we won't stop doing that. So let your congresspeople know that you care about legal remedies for image-based sexual abuse, that you care that prosecutors could actually do something about this. If you specifically want to support these bills, the Take It Down Act, the DEFIANCE Act, the SHIELD Act, please do. We have information about all of those on our website. We also provide actions on our website: we have a whole page where we can help you quickly send a message to the CEO of a corporation, or sign a petition that we're going to send to a congressperson, those kinds of things. So we do have actions you can quickly take. But the priority right now, I think, is letting Congress know this is important.

Speaker 1:

OK, before I let you go, give us your website just one more time so people can take that down and make note of it.

Speaker 3:

Sure, it's endsexualexploitation.org. Dani, thank you so much for talking with me today. Thank you so much for having me, and thank you for caring about this important issue and talking about it.

Speaker 1:

Absolutely. Thanks so much for listening. Until next time, stay safe.