21st Century Child Abuse: Ensuring Child Safety Online in the 21st Century & Policy Implications. A Conversation with Maura Gissen, MA

Radio Kempe
Jul 31, 2025
The Kempe Center

Summary of Key Findings  

Despite the current state of technology, we have failed to create any meaningful legal framework to ensure our children are safe in the digital world. Children online lack the legal protections they are afforded in the physical world. As a result, children online are at risk of experiencing poor mental health outcomes, becoming victims of predatory behavior, being exposed to harmful content, and being illegally sold firearms and/or drugs. Without age assurance or verification in the digital world, adults have unfettered access to children. Currently, most child sex trafficking victims report being contacted via text and internet platforms such as social media and gaming. Additionally, the number of images of child sex abuse material (CSAM; formerly referred to as child pornography) reported to authorities has increased exponentially in the 21st century, and the vast majority of CSAM images come from social media. Children are also at greater risk of being exposed to adult content, including pornography, whether that exposure is sought out or unwanted.

Protecting children online would require platforms to verify the ages of their users and to put restrictions in place so children can navigate the digital world safely. Various advanced age verification processes have been developed and are widely available. Despite this, many platforms have failed to implement age protections even when they know children frequent their sites. Thus, lawmakers are compelled to make clear that the same protections children have in the physical world are also required in the digital world. Many policy changes are needed to further ensure technology safety; a starting point is to: a) require pornography websites to verify user age so that only adults can access the sites; b) require social media companies and gaming platforms to verify a new account holder’s age; and c) require social media platforms to enable maximum default privacy settings for users who are children.

 

Maura Gissen Bio

Maura Gissen is a fifth-year Clinical Psychology doctoral student at the University of Colorado Denver and holds a master’s degree in counseling psychology. Maura currently works with the Farley Health Policy Center (FHPC) at CU Anschutz, engaging in research and program implementation. More specifically, she has focused on youth mental health as it relates to diversifying the workforce pipeline, and on child health and safety in digital spaces. Maura has worked in the mental health field for ten years and focuses on the intersection of trauma and systemic disparities for individuals across the lifespan. She is passionate about engaging in clinical therapeutic practice, along with research focused on policy, advocacy, and systems-level change.