Behind the Paddle

E52: Protecting the Digital World: Breaking Down the Online Safety Act Part 1

Episode 52




In this episode of the Behind the Paddle Podcast, we dive deep into the Online Safety Act: what it is, why it matters, and how it's changing the way we interact online. From protecting young users and tackling harmful content to holding tech giants accountable, we unpack the major provisions, the controversies, and the real-world impact this landmark legislation could have on everyday internet users.


Check out our socials!

Thank you so much for listening 💖

Speaker 1

Hello and welcome to Behind the Paddle Podcast with me, Porcelain Victoria. Today we're going to talk about something rather important, especially for any sex workers, and hopefully the clients, well, anybody, because a lot of people watch porn. A good majority of people watch porn. And I feel like this episode is going to be a deep dive into why you see the porn you see now, why you can't see certain porn, and why some companies did move out of the UK and are now in Spain and other places because of the Online Safety Act. Regulation, rights, and risks. So we're going to speak about the Online Safety Act today. It is one of the most significant internet regulations of our time. There was uproar when this actually passed. This episode's going to break down the origins, the implementation, the controversies, and the potential long-term consequences of the Act. The Online Safety Act was developed and passed in the United Kingdom. The drafting process began with a white paper published in April 2019, and after several years of consultations, debates and amendments, the final version of the law was passed by the UK Parliament in September 2023. Now I want to make it very clear: if you are a sex worker and you had knowledge of us trying to get this revoked, and you didn't sign the government bill, I think it was, I don't know how to describe it, where you sign online to say we don't want this to pass, if you didn't sign that, I would genuinely love to know the reason. And I'm quite disappointed. Because this is our world that they're trying to change. This is our job, our money. I would love to know why you didn't sign. If you had no knowledge of it, absolutely fair play. But I personally do know doms and sex workers out there who did not sign this bill, and I was very shocked, and it made me really think about who you should follow and see as a role model. Because I also recently shared, what was it exactly? Oh, that's it. The bill for the decriminalization of sex work. And I did not see certain very high-powered, popular, well-known doms share this frickin' bill to help us decriminalize sex work. And it's disgusting. There is no reason not to sign it in my eyes. Because we all want a safer and better world where we can freely do our jobs without worry. So yeah, that was rather disappointing to me. Me sending it along to well-known, established doms and them not sharing it. I was very, very sad and upset. And yeah, it shatters the "oh, I saw you as a role model". But I tend to clash heads a wee bit with other sex workers, because I say it how it is. I say that some sex workers can't take deposits, and I was one of those sex workers, and that caused a lot of uproar in a sex worker group that I was in, and from that point on I was basically banished and muted, and they didn't care about my opinion on anything, because they saw it as "oh, don't you care about your safety?" And that's literally what one of them said, and I'm like, wow, this is like five big doms in the industry being extremely rude, being extremely close-minded, and not realizing the realities of the world, that not everybody is as privileged as them to be able to screen clients. And that killed some role models for me, absolutely.
And I know a hell of a lot of people would be shocked if I said their names, because they didn't see them in that light either, they didn't realise they held those opinions. And I feel like a lot of sex workers do need to open up their minds and realize there is a reality, and we're getting pushed more and more underground, whether that's online or in person. And maybe there's a bit of shame that you're a sex worker, but own it. Genuinely own that shit. I understand there's a lot of sex workers who can't show their face in the industry, which is absolutely fair, but sharing a bill is no different from sharing a picture of your feet in a sub's mouth or you at a cash meet. Just share it. Actually be there for the sex workers that need you. Be in that community. Okay, my law rant's over now. I'm a little bit distressed. The Online Safety Act was officially known as the Online Safety Bill in its draft form before receiving royal assent, which was granted on 26 October 2023. The law aims to regulate online content and enforce stricter requirements for platforms to manage harmful or illegal content, particularly focusing on protecting children and vulnerable groups online. I mean, there is a lot to unpack here, so I might even do a part two, honestly. Because, protecting children online and vulnerable groups: did anybody hear about Pornhub having child pornography on their website? And then that also affects MasterCard and things that go into it, because you can't use MasterCard on some platforms. The emergence of the internet as a central feature of daily life brought unprecedented access to information and social connection. But it also gave rise to new dangers. The widespread availability of digital platforms, social media networks, and communication tools created fertile ground for harmful content to flourish. As incidents of online child exploitation, cyberbullying, extremism, and the spread of misinformation increased, it became evident that existing regulations were inadequate for addressing the scope of these issues. I feel like at the end of the day you will always get cyberbullying. You will always see extremism, whether that comes down to assault, murder, gore, and the spread of misinformation. We recently did an episode about the abortion protests in Glasgow. That is the spread of misinformation right there, and that is not online, that is straight in person. You can listen to more about that in the previous episode, but that was just a little example of how misinformation can be either online or in person. I feel like online it's a lot more easily accessible, because you can go from link to link, rather than if you go up to somebody in the street talking about the Bible or something like that. That's the only thing I can think of. And when they talk about something, you will go to the internet, hopefully, to look it up and see if they're correct or not, and word of mouth only gets you so far. And yeah, again, there's loads of social networks out there. Some social networks you're meant to be older than 13, I think that's TikTok, and on others you're meant to be 16 plus and things like that. I mean, I was a little shit as a kid. No, I wasn't a little shit. I was horrendously bullied, so I liked the online world better, let's put it that way. I wasn't a little shit. I wanted to explore, get rid of real life and get out of the light of what was happening in my daily life.
And I feel like a lot of kids will explore that, will explore the online world, especially because it is so much more accessible. But at the same time, you need to be aware of what your kids are watching. You need to watch them, you need to regulate, as a parent, if possible, what they're watching. It does influence them, absolutely. Cyberbullying, again, that can come from watching what other kids are doing. I mean, you might be getting cyberbullied by an adult, which is horrific. But I think, yeah, watch what your kid's doing, talk to your kid, be more in tune with your kid. I do gentle parenting. I don't do the ridiculous letting-my-kid-get-away-with-everything thing, I think that's called permissive parenting, the other gentle parenting bit, but I do tell them what's right and wrong, and I do tell them off and things like that. And we have a really, really good relationship when it comes to communication, and I would absolutely speak more on that in another episode, because I do believe it is really essential to have a good relationship with your kid and to recognise if you're being an ass. Because I grew up with asshole parents and they didn't realise when they were being an ass. And understand whenever you're being unreasonable, or if you're having a power trip, or if you're just too hot-headed and you're only doing it because you're all ramped up. That takes a lot of knowledge about yourself, to recognize that. Again, a lot of things loop into each other, and you can talk about other things which come from the Online Safety Act, because it includes so much. So, online child exploitation. The internet allowed child sexual abuse material to be shared, traded, and accessed by predatory individuals. In recent news there have been Telegram groups of people sharing videos and pictures of children, everything like that, as well as other Telegram groups. It's absolutely disgusting. And yeah, as I said, Pornhub, massive thing that happened with that. I'm sure it's still going on. The UK, like many other countries, was faced with rising numbers of online sexual exploitation cases, both in terms of access to harmful content and instances of grooming and exploitation. Now again, when it comes to grooming, why did we not put a bill through for people who groom children? Why didn't we deal with that? I feel like this is similar to, now this isn't a stretch, but it might sound like a stretch, when a woman says no and somebody says, oh, you were dressed like that, you deserved it, or some shit. Like, why are we blaming the victims here? Why aren't we going after the actual predators, the groomers? Like, yes, we absolutely need to get rid of online child exploitation, absolutely. But can we also have something for the predators? Because, as we're gonna find out, is this really helping? Is it really affecting the internet? Is it affecting the people who are gonna do the bad thing in the first place when they know it's bad? So yeah, as I said, with the rise in numbers of online child sexual exploitation cases, both in terms of access to harmful content and instances of grooming and exploitation, cyberbullying became a serious concern, with children, teenagers, and even adults being harassed and coerced online.
In particular, social media platforms provided spaces where users could target other users anonymously, which intensified the psychological impact of bullying and harassment. Again, if somebody's gonna bully someone, they're gonna do it. They're not gonna think about the consequences. The consequences come after it. The internet also became a platform for the dissemination of extremist ideologies, including hate speech, radicalization material, and the recruitment of vulnerable individuals to extremist causes. The ease of spreading this content without accountability further escalated concerns. Now, this again is a very bold statement with extremism. Are we gonna talk about the army? Like, every time I go to the cinema there is an army or air force advert that plays. If you go around the street and ask people would they join the army these days, especially in the UK? No. Why? Like, going to war, I would say, is something very extreme. And you take risks doing that. So why isn't that advert seen as extreme when it can give extreme outcomes? Just saying, just putting it out there. If anybody disagrees with me, I would love to hear it. I am absolutely open to conversation and I might even change my mind. I'm not one of these people who is so stuck on something; please open my mind. The only way people will learn is by gaining more knowledge and talking to people. Obviously extremism is everywhere, but online there is a fuck ton. Absolutely. There is no stopping it at all. So again, misinformation: fake news and disinformation campaigns on digital platforms, particularly during significant events like elections or the COVID-19 pandemic, exposed the dangers of unregulated information. Misinformation spread rapidly through social media, affecting public opinion and even influencing political outcomes. Now, Trump's election back in the day, not the most recent one, and with COVID-19, Jesus Christ. I literally had clients telling me that it was the government, that they didn't believe in vaccinations, that the government just wanted to track us, and all sorts, and I was like, what the fuck? Just get the vaccination. And of course, people who thought it didn't exist, and I was like, are you fucking serious? Like, I had a family member die of COVID. Right, okay. As these issues grew in prominence, there was mounting pressure on the UK government to take action. Advocacy groups, child protection organizations, and digital rights organizations began calling for stronger regulation of online platforms, arguing that tech companies had a moral and legal obligation to protect users from harmful content. See, in my opinion, my lovely opinion which everybody wants to hear, you will always see harmful content. You will always see cyberbullying. You might even be bullied. There will always be this; there is no way to get rid of it all. Now, we will see further in this episode if there has been a decrease in cyberbullying and extremism and all that stuff. But the internet is ever growing. In response to mounting advocacy from organizations such as the National Society for the Prevention of Cruelty to Children, the Internet Watch Foundation, and other child protection bodies, the UK government began considering legislative action. These groups highlighted the need for stronger oversight of tech companies, particularly social media platforms, and demanded that online spaces be safer for children and vulnerable individuals.
The government faced significant pressure to take more proactive steps, especially after high-profile cases of child exploitation and deaths linked to cyberbullying. Public outcry further fueled the urgency for reform, with calls to hold tech giants accountable for the content they hosted and the potential harm that could be done to users. Now, my problem with that is: have the tech companies, have the social media platforms, actually been held accountable? Has anything changed? So there were other regulations. There was the Digital Economy Act of 2017. The UK government attempted to regulate aspects of online content through the Digital Economy Act 2017, which sought to implement age verification for adult content. However, the legislation was widely regarded as a failure, because it was technically flawed and failed to effectively address the broader issue of harmful online content. The Act was intended to limit children's access to explicit material, but was met with significant pushback from civil liberties groups, who raised concerns about data privacy and the feasibility of its implementation. And then we had the EU Digital Services Act, which came into effect in 2022 and shares many similarities with the Online Safety Act. Both laws seek to hold tech platforms accountable for user-generated content and require platforms to moderate content, with a particular focus on protecting vulnerable users. However, the DSA is more focused on ensuring digital services operate transparently and fairly, while the UK's act places greater emphasis on ensuring the safety of minors and combating harm in a more comprehensive manner. The DSA also contains provisions for establishing a regulatory framework within the European Union, while the Online Safety Act is more UK-centric in its enforcement mechanisms. We also have the US: Section 230 of the Communications Decency Act 1996 has been a landmark regulation that shields online platforms from liability for the content posted by users. In stark contrast to the UK's more interventionist stance, Section 230 has largely protected platforms from being held accountable for user-generated content. While the UK's Online Safety Act shifts much of the responsibility for content moderation onto the platforms themselves, Section 230 has allowed platforms in the US to operate with less oversight and legal consequence. This difference highlights the UK's more aggressive approach to regulating online spaces in a way that holds platforms accountable for harmful content. Yeah, the UK just likes to come in with force, it really does. Like, proper balls to the wall, they don't care about anything else, this is their plan, this is going to happen. So the Online Safety Act applies to a wide range of online platforms that are widely used in the UK, like Facebook, Twitter, Instagram and newer platforms like TikTok. These platforms are responsible for regulating and moderating user content. It also includes search engines: Google, Bing and other search engines must ensure that harmful or illegal content, such as child sexual abuse material or terrorist content, is not accessible via search results or easily shared. Now, the terrorist content, I mean, that really depends on, not how the government stands, but, like... that is interesting, actually. I wonder, with terrorist content, because, like, are we all in multiple wars right now? Is everybody fighting? We helped out with Ukraine.
The UK did help Gaza by providing humanitarian aid and funding, including financial support to international organisations and direct delivery of supplies. Okay, so I feel like when it comes to terrorist content, it depends on what side, and that sounds really gross. But yeah, absolutely, child sexual abuse material should not be anywhere, should not even exist. Also, messaging services: the act also applies to private messaging services, including popular encrypted platforms like WhatsApp, Telegram and Signal. That's interesting, because I know that some clients do like to message me through Telegram, Signal and of course WhatsApp, because they're all encrypted and they think nothing's gonna get through. While these platforms are used for private communication, the government argues that the scale of their use means they should have mechanisms in place to prevent illegal content from being shared. However, this is highly controversial, obviously, as the act's potential impact on encrypted communication may lead to privacy and surveillance concerns. Any online platform with a significant UK user base is covered by the law, so porn sites like LoyalFans, OnlyFans, um, DarkFans, Scatbook, anything like that, which is why a lot of them jumped ship and moved out of the freaking UK. But this could also include online platforms, forums, gaming platforms and other digital services that allow user-generated content or interaction. A key area of debate revolves around private communications, particularly end-to-end encrypted messaging services. While the law aims to regulate public-facing platforms, encrypted services that offer private messaging are exempt from certain provisions. However, this exemption has faced scrutiny, with some advocating for the ability to scan encrypted messages for harmful content such as child abuse material or terrorist propaganda. The issue remains contentious, as privacy advocates argue that scanning encrypted communications undermines individual privacy rights and the security of personal data. And this does cover WhatsApp, Signal and parts of Facebook Messenger. So obviously now the tech companies have a duty: they're required to actively remove illegal content within a defined timeframe, including terrorist material, child sexual abuse material, hate speech and other harmful content. You've got duty of care: a central feature of the act is the duty of care imposed on online platforms. This requires companies to take proactive steps to protect users from harm, including bullying, harassment and psychological harm. Risk assessments, which is, well, weird, because companies must perform regular risk assessments to identify potential risks to their users, particularly vulnerable groups, and outline strategies to address these risks. These assessments will need to be submitted to Ofcom to ensure compliance. I want to see these assessments, that so intrigues me. And of course, age verification for adult content. To protect children from accessing explicit material, the Act imposes age verification requirements on adult content platforms. This includes ensuring that users accessing adult content are over the age of 18. The implementation of such measures has sparked debates about privacy and the feasibility of widespread age verification systems, as it may involve collecting personal data such as IDs or payment information to verify age.
And I find it so interesting, because the platforms also have to submit risk assessments that outline how they are preventing harmful content from being posted and shared. And they have to publish annual transparency reports detailing how they handle content moderation, including the removal of harmful content, account suspensions and user complaints. They of course have to be more transparent about their moderation practices, including the use of automated tools like AI, to increase accountability to users and regulators. So, concerns from free speech activists: a major concern raised by free speech activists is the vagueness of the term "harmful content". The Online Safety Act requires platforms to remove content that could be harmful, but the act does not provide a clear, universally agreed-upon definition of what constitutes harmful material. This ambiguity raises significant concerns about political overreach and unintended consequences. See, this is what I'm on about, where it's like, you're so vague about certain things, it's gonna be hard to tackle. The vagueness of "harmful content" could easily be used to censor controversial political discussions, satirical content, or expressions of dissent. For example, content that critiques government policies, mocks political figures or challenges widely accepted norms could be flagged as harmful, even when it is not illegal. I know recently the UK government has said that only cis females can use, like, toilets, and it's horrible. It's horrible, and I'm going to make a podcast episode about it soon, because it needs to be talked about and it needs to be said. Yeah, because the UK Supreme Court decided the terms "woman" and "sex" in the Equality Act 2010 referred to a biological woman and biological sex, which opens up a whole frickin' podcast episode. Because, if you follow us on Instagram or any of the other platforms, we a hundred percent support trans rights. I've got no words, because it's just shocking that, at the end of the day, there are people out there, and it's old men and old women, who want this and have made this pass. But my point was that online you have seen people talk about trans people in a good light, and that has been flagged as hate speech. Yeah, it's sad, it's sad. So basically you can't have free speech, because it's not defined well enough what hate speech is. Like, again, if you have a different opinion from somebody, then it can be flagged. Yeah, there's a real risk of overblocking, where legitimate speech, including politically sensitive or controversial topics, is unjustifiably removed to avoid penalties. And we'll get into the penalties and fines further on, be patient, there's a little bit more. We've got to talk about the overblocking of political and social dissent: the requirement to remove harmful content could lead to the unintended suppression of activism and social movements, particularly those that challenge the status quo. Groups advocating for human rights, environmental justice or anti-government causes may find their content disproportionately removed, because it falls into vague categories like hate speech or misinformation. This can have a chilling effect on activists who rely on digital platforms to organize, mobilize and disseminate information. I mean, a lot of people get shadow banned for speaking how they feel, especially the LGBTQ+ community, the sex work community, anybody who doesn't like the current government. We're usually silenced.
And that goes for all over the world, really. The Online Safety Act introduces a controversial provision that may compel platforms to remove content that is legal but deemed, quote, harmful. This raises several serious concerns. Who decides what is harmful? The crux of the issue lies in the definition of harm, and who gets to decide it. A platform could be asked to remove content that does not break any laws but is still considered harmful by some. This could include content related to eating disorders, self-harm, mental health issues or even political disinformation. The power to define harmful content lies with government bodies like Ofcom, but this introduces the possibility of subjective decisions that could be influenced by political agendas, leading to the censorship of legitimate speech. The risk of overreach is particularly high when harmful content could be interpreted as any speech that makes someone uncomfortable or challenges their world view. As I said earlier on, it comes down to other people's views. Oh, it's so fucking frustrating, it really is, living in this world where it's like, who do you support? It's like, goddamn. The government's role in defining harmful content: at the heart of the issue is whether the government should be allowed to regulate content that is not illegal but has the potential to cause harm. The Online Safety Act empowers Ofcom, the UK's communications regulator, to make subjective determinations about what constitutes harmful content. Critics argue that government agencies are not equipped to make these kinds of subjective judgments. When an online platform has to decide whether content is harmful, it often relies on algorithms or automated systems that cannot fully grasp the context or the nuance of certain discussions, making them prone to misclassification. Again, these are all really good points. Okay, so I can't choose: do we want to know more about the other acts, the EU's Digital Services Act? I feel like I've covered them a good amount. I'm asking you this question like you're in front of me, when I'm actually looking at a camera and speaking in front of the mic. We can absolutely go more into it, but I think we are going to stick with just the UK right now, with a little bit of the US and Europe in this discussion. So how does this affect sex workers that are online? The Online Safety Act has significant implications for sex workers and adult content creators who rely on digital platforms to conduct their work. While the act is primarily designed to target illegal content and protect users from harm, its broad and vague definitions can lead to unintended consequences for those communities. So we have increased deplatforming of sex workers. One of the most immediate effects of the Act on sex work online is the heightened risk of deplatforming. The act places a heavy emphasis on removing illegal content, including child sexual abuse material, terrorist propaganda, and harmful content. While sex work itself is not illegal, platforms often err on the side of caution to avoid fines or reputational damage under the new regulations. As a result, sex workers, adult content creators, and platforms offering adult services, e.g. OnlyFans, Pornhub, etc., may face increased content removal or even account deactivation due to perceived violations of platform safety norms or vague content moderation policies.
Similar to FOSTA and SESTA in the US, legislation passed in 2018 that pushed platforms to remove content related to sex work for fear of legal consequences, the Online Safety Act in the UK mirrors these concerns, as it compels platforms to take aggressive action against any content that could be perceived as harmful. The fear of liability under the UK's new laws may incentivize platforms to implement stricter content moderation policies, resulting in censoring and deplatforming sex workers, even when their content is entirely legal. Platforms may overcorrect by preemptively removing adult content, creating an environment where sex workers are forced to find increasingly precarious and unreliable avenues to continue their work. So many platforms are becoming way more restrictive. You can barely have anything on them these days, to the point where, like, me being a dominatrix, I can't put up CBT, I can't put up anything with blood, I can't even do fisting. Um, what else can't I do? There's a good few things. I mean, on certain other platforms you can't do piss, you can't do scat, I mean, not everybody wants to see shit, and that's what my Scatbook is for. You can't even do period stuff, like, you literally can't work on your period unless you stuff something up there, but that's a shh, that's a secret. But like, wow. So I can absolutely see that there are a lot of online sex workers that are just kind of done with online work, because there's so little they can produce. And it comes with the fact that people who watch porn generally want more and more extreme stuff, which obviously can be seen as harmful. It's like, the other day I saw some BDSM porn where, consensually, legally, somebody was getting strangled and waterboarded, and that's quite extreme. But you can't put that on most platforms, and so you have to make your own. But then you have to also regulate it, and it's a lot. It's a lot. Yeah, and you have to remember many sex workers moved online because it's safer than street-based work or meeting people in person. With fewer online safe spaces, some are forced back into offline work, so whether that be on the street or in person, or having to get a quote-unquote normal job, which could absolutely pay less. And they could be faced with more violence, less control and a higher risk of exploitation. Some parts of the act allow or encourage more reporting mechanisms for quote harmful content, which can be used to target and harass sex workers even when they're following the law. This adds stress, anxiety and real-world consequences like the loss of accounts and shadow banning. Online resources like harm reduction advice, mental health support groups and community organizing for sex workers can also get flagged as harmful under the Act's vague categories. This isolates sex workers even more. The Online Safety Act unintentionally pushes sex workers into more dangerous conditions, reduces their income, undermines their safety strategies and limits their freedom of speech, all while doing very little to actually protect them. So yeah, that's just a little snippet of how it affects sex workers. Like, I feel like we're always forgotten about. Like, yes, the children are absolutely more fucking important, absolutely, um, but, oh god, that sounds gross when I say "but", we also have lives, we also are trying to make a living, we're trying to survive. But we're gonna get into this even more, and this is gonna be a two-parter.
The Online Safety Act could disproportionately affect marginalized communities, particularly those whose content may be more prone to being flagged as harmful or inappropriate. Again, as I've said previously, the LGBTQ+ community: the Act's provisions for removing harmful content could unintentionally lead to discrimination against LGBTQ+ individuals, especially when content related to gender identity, sexual orientation and non-binary experiences is flagged by content moderation algorithms. Given that platforms have limited clarity on what constitutes harmful content, there is a concern that LGBTQ+ users may face increased censorship. For example, posts discussing gender dysphoria, transgender experiences or even queer sexual expression could be misclassified as harmful, leading to the removal of supportive spaces and community-building content. This could stifle freedom of expression and prevent marginalized individuals from finding solidarity and support in digital spaces. One of the primary concerns surrounding content moderation under the Online Safety Act is the risk of algorithmic bias. Algorithms used to detect harmful content are often trained on large data sets that may not account for the nuances and diversity of marginalized groups. As a result, these algorithms might disproportionately flag and remove content from communities such as sex workers, LGBTQ+ individuals, racial minorities and other marginalized groups. This could create a feedback loop in which these communities are continuously censored, because platforms fail to distinguish between harmful content and legitimate forms of expression. The result is an unequal application of the law, where minority voices are disproportionately silenced while mainstream content or dominant social narratives go unchallenged. The impact on racial and ethnic minorities: beyond LGBTQ+ issues, other marginalized groups, such as Black and Indigenous people of colour, may also face heightened scrutiny under the Safety Act. Content related to activism, anti-racism movements and cultural expression could be flagged as harmful due to its racial nature or perceived offence to dominant societal views. This effect would mirror previous concerns about algorithmic bias, where content critical of systemic inequality or advocating for racial justice might be censored because it challenges societal norms or is seen as offensive by automated moderation systems. Self-censorship and increased vulnerability: as a result of the fear of content removal, many in marginalized communities may self-censor their posts, avoiding sensitive topics or identity-related discussions out of concern for being flagged or banned. This self-censorship could have damaging effects on mental health, as individuals may feel they cannot fully express their identities or experiences in online spaces. It also weakens the community solidarity that is crucial for marginalized groups, who often rely on online platforms to share resources, find support and advocate for social justice. Again, this is what I'm talking about, where a lot of things come down to the views of the people who are in charge of that platform. Like, right now on X, Twitter, whatever you want to call it, I'm seeing so much far-right stuff, and Trump, and horrible, horrible stuff. It makes me not want to be on that platform.
And I'm shadowbanned as hell. My dominatrix profile and Behind the Paddle, we are shadowbanned on Instagram, Twitter, wherever we go, we are shadowbanned, and it sucks, because what Behind the Paddle represents is a frickin' voice for everybody. It's horrible, the world we live in, and I don't know if it's gonna get any better. Under the Online Safety Act, platforms are expected to have a duty of care towards users, which includes taking proactive steps to protect them from harm. However, this duty is not always aligned with the needs of marginalized communities. In practice, platforms may choose to err on the side of caution, censoring more content than necessary to avoid penalties. This could result in a blanket approach that targets all forms of adult content, including those created by sex workers or LGBTQ+ individuals, under the umbrella of harmful content. The result is a disproportionate impact on vulnerable communities, including the increased risk of digital erasure for marginalized groups who depend on digital platforms for their livelihoods, identity expression and support networks. I mean, everything is censored these days. This podcast is censored, we are so shadowbanned, and all we talk about is freedom, LGBTQ+, women's rights, sex workers, the right to do what we want to do with our bodies. And it's disgusting how censored groups are when all we want to do is look after each other and make sure the world is going to be safe and people have knowledge and education on things as small but as big as periods. Again, we've covered periods twice now, in multiple different areas, and it's something that always keeps cropping up. Like, recently in the UK, on the news, we were talking about whether boys should be taught about periods, and it's like, yes, everybody should be taught about periods, why should you not be? I mean, in the US currently, it makes me so irate that one state recently wanted to remove talking about consent from the education board. Why would you want to get rid of talking about consent? That is so disgusting and dangerous. I'm getting so irate about this, but we need to be irate, we need to talk about this, we need to get mad, we need to get loud about our opinions on whether something is right or wrong, or what should be taught, and it should be normalized, but it's not. So I think this is where we're gonna end part one of the Online Safety Act, because there is just so much more to this. And yeah, it's not a good episode in terms of, well, people controlling us, basically. That is what this Online Safety Act is: people controlling other people's views, people controlling other people's wages. And if you have any feedback, or if you have any views of your own on this Online Safety Act, again, this is part one. We are going to go more into the tech industry and the compliance challenges for those tech industries, and then we're gonna talk about the legal challenges and future amendments, and then, yeah, we're gonna give our final thoughts. So yeah, we've still got a few things to discuss, but if anybody would like to chime in or anything like that, I would love to hear it.
I know from my point of view that sex work online has definitely got more regulated, and it's a shame. On one of the phone chat sites I use, amazing, lovely site, I believe we can't even show dildos now, and like strap-ons, and it's like, what? That's crazy. So it stops a lot, and that might not seem like a lot, but for somebody who is in the industry, it stops us advertising in multiple different ways. Thank you for listening to Behind the Paddle Podcast. This has been the episode on the Online Safety Act. This is just part one, part two will be next week. And yeah, if you want to see the spicier version, because I can do this, I mean, well, reading a podcast and talking about it, then you can go on our DarkFans, you can go on ManyVids, and I might be releasing LoyalFans as well. And of course you can find my own personal professional links through Behind the Paddle as well. And yeah, if anybody has any feedback or any other topics they would like to discuss, please gimme, gimme. And it would be oh so nice if you leave us a five-star review, if you did enjoy this podcast. And if you would like to come on the show, you are more than welcome to give me an email and we can discuss a topic of your choice and, yeah, have a lovely chat. Right, this has been Behind the Paddle Podcast with me, Porcelain Victoria. Thank you very much for listening, and goodbye.