Partnered with a Survivor: David Mandel and Ruth Reymundo Mandel
Partnered with a Survivor is a professional-focused podcast created and produced by Ruth Reymundo and presented by the Safe & Together Institute. What began as intimate conversations between Ruth and David Mandel—founder of the Institute and creator of the Safe & Together Model—about violence, relationships, abuse, and the systems that respond to them has grown into a global conversation about systems and culture change.
Hosted by Ruth and co-hosted by David, the podcast features in-depth, professionally grounded discussions about how institutions respond to domestic abuse, gender-based violence, and child maltreatment. Many episodes also feature global leaders working across fields such as child safety, men and masculinity, perpetrator accountability, fatherhood, and partnering with survivors.
Together, these conversations examine how systems often fail adult and child survivors, how societal narratives about masculinity and violence shape professional practice, and how intersectional realities—including cultural and religious beliefs, racialised identities, LGBTQ+ experiences, immigration status, disability, and other structural vulnerabilities—shape responses to abuse and violence.
The podcast offers an insider lens into how professionals navigate systems not only as practitioners, but also as parents and partners. Through candid dialogue and critical reflection, Ruth and David challenge the assumptions and structures that limit meaningful accountability, safety, and healing. The goal is collective movement across systems, cultures, and families toward greater safety, nurturance, and sustained change.
Disclaimer: Episodes contain sensitive topics and occasional mature language that may be difficult for some listeners. The views and opinions expressed by podcast guests are their own and do not necessarily reflect those of the Safe & Together Institute or its staff.
Season 7 Episode 5: AI in Child Protection: Can Technology Make Social Work Safer?
Artificial intelligence is already in social work and child protection, and its use is deepening. The question is: How safe, effective, and equitable is it?
In this episode, David and Ruth talk with Dr. LaSharia Turner and Dr. Helen Fischle from Alabama A&M University about what ethical, human-centered, AI-driven tech should look like in social work education and frontline practice.
As agencies face workforce shortages, austerity, high caseloads, and increasing complexity, technology is being introduced as a solution. But can AI actually support domestic violence–informed practice when child safety is on the line? Or does it risk automating bias, victim-blaming, erasing survivor context, and shifting responsibility away from systems and perpetrators as parents?
We explore:
- What “human-centered” AI really means in child welfare
- The risks of predictive tools and automation
- Why social workers must have a seat at the technology table
- How to prevent tech from increasing survivor and worker burden
- The future of ethical innovation in high-stakes systems
If you work in child protection, domestic violence services, family courts, behavioral health, or policy, this conversation is for you.
Technology should enhance professional judgment—not replace it.
Now available! Mapping the Perpetrator’s Pattern: A Practitioner’s Tool for Improving Assessment, Intervention, and Outcomes. The web-based Perpetrator Pattern Mapping Tool is a virtual practice tool for improving assessment, intervention, and outcomes through a perpetrator pattern-based approach. The tool allows practitioners to apply the Model’s critical concepts and principles to their current caseload in real time.
Check out David Mandel's new book Stop Blaming Mothers and Ignoring Fathers: How to Transform the Way We Keep Children Safe from Domestic Violence.
Visit the Safe & Together Institute website.
Start taking Safe & Together Institute courses.
Check out Safe & Together Institute upcoming events.
Setting The Stage: Tech Meets Care
Ruth Reymundo: And we're back.
David Mandel: We are back. It's like magic. Jinx.
Ruth Reymundo: Hi. How are you?
David Mandel: Today's been a day, and we're happy to be doing this podcast. We've got a great interview coming.
Ruth Reymundo: Yeah.
David Mandel: I'm David Mandel, CEO and founder of the Safe & Together Institute. Who are you?
Ruth Reymundo: I am Ruth Reymundo Mandel, and I am the Safe & Together co-owner and business development officer. And we are Partnered with a Survivor.
Speaker 3: Yes.
Introducing Guests And Their Work
Ruth Reymundo: And we are coming to you on a very pre-storm day, with a huge storm working its way through the country. We are on Tunxsis Masaco land, whose traditional custodians are the greater Algonquin nation. And it is frigid and snowy. We just want to send our regards to any traditional custodians, past, present, and emerging, and we are sending our solidarity and support to lots of people.
David Mandel: So this is an intersection-of-tech-and-social-work conversation we're going to have. I just want to set the stage for folks. At the Safe & Together Institute, we are becoming more and more involved with tech. We've always been tech forward, with e-learning and our mapping tool eventually being web-based. And in the last year we spun off Safety Nexus, where you and I are co-founders, equal partners. Yes, so wonderful. And we are taking the Safe & Together Model and putting it into an AI-powered tool. And actually, beginning at the end of January, we're starting our internal testing, and then going external to test.
Speaker 4: Right.
David Mandel: And so this is feeling more and more real. And you and I are spending more and more time at the intersection of social work, systems change, the Safe & Together Model, domestic abuse, systems, and technology.
Ruth Reymundo: Yeah.
David Mandel: So I just wanted to frame that for folks. It's such a critical area. We're about to do a presentation in Australia to New South Wales child and family agencies, and everybody's asking, what does AI mean for us? What does technology mean for us? We're going to talk about Safety Nexus, we're going to talk about ethical technology. And we have deep, deep concerns about how technology can help and how technology can hurt.
Ruth Reymundo: Yes, absolutely. And not just concerns, but also a level of curiosity.
David Mandel: Yeah.
Ruth Reymundo: A level of: what is all of our responsibility as academics and practitioners and policymakers and leaders in the field to really harness tech in a way that provides the care, the social care, that helps families truly be safe, sustainable, well-nurtured, and self-determined? So there's a lot of hope in that too, because anybody who has labored in a situation with a high caseload, a lot of complex needs, and a lot of complex potential harms knows that technology, when used well, can be a relief from some of those pressures. But technology can also be highly destructive, erasing the fact that there are real harms. So we have hope and we have concerns. And that's also why we brought into the picture Dr. LaSharia Turner and Dr. Helen Fischle from Alabama A&M University. We were down there with them this summer. Was it this summer?
Speaker 4: Yeah, August.
Ruth Reymundo: In August, right? It feels like a million years ago. We were really just grateful to be invited to come down, to have the conversation, and to listen to all of the people convened in that space by the School of Social Work: to name the concerns, to name the potential for helping people in these industries and care systems, but also to really think through the problem of how we provide the right technology, the right technology for the right care. So we just want to thank you both for joining us again for this conversation. And we would love for you to introduce yourselves, please.
Dr. Helen Fischle, DSW, MSW: Thank you so much, David and Ruth, for inviting myself and Dr. Turner to speak with you today and just to share. I'm Helen Fischle. I've been at Alabama A&M since 2017. I'm an assistant professor there and also a program coordinator, and I work primarily with our BSW students. I've been in social work education for many, many years. So it's just a privilege to be here.
Dr. LaSharia Turner, DSW, MSW, CSW: Hello. I am so excited to be here. It's an absolute honor to be in this space and share it with you all. I serve as an assistant professor at Alabama A&M as well as a field coordinator. Some of my background comes in the arena of development and technology, in addition to working with children, families, and survivors of intimate partner violence and domestic violence, which is what my research is centered around as well. And so I'm so excited to be in this space with you all.
Simulation Labs And Online Training
Ruth Reymundo: Yeah. And so I would love to hear a synopsis of why you all at the School of Social Work wanted to have that discussion. What were the driving factors? What were the pressures and the hopes for collaboration that you were looking for?
Dr. Helen Fischle, DSW, MSW: Well, you know, the impetus for the conference was, number one, we have a child welfare program, and we have funding through Title IV to focus on child welfare. We had developed a simulation lab where we did in-person simulations, and then, when COVID came in and we had to go online, we started using a program with simulations that students could run through their desktops. That got us interested and really helped us understand the value of using technology to help our students practice the skills we were trying to teach them, such as assessment, engaging with clients, and so forth. The whole process of having to go online really emphasized the importance of technology in social work education. And with both the in-person simulations and the online simulations, the students reported that they were really helpful in terms of learning how to practice those skills we'd been teaching them.
Ruth Reymundo: Now, LaSharia, you are definitely a tech person on a lot of levels. So I'm sure that you were back there championing this. Do you want to talk about that?
Dr. LaSharia Turner, DSW, MSW, CSW: Yeah. So, you know, I actually joined the university a little later, in early 2023, and what Dr. Fischle was doing was already rolling. But with my experience of working in case management, looking at the lack of technology in child welfare, and working with students as they're preparing to enter their practicum, we've obviously seen a need. We've talked to individuals, social workers, case managers who practice every day, and there was a major gap in social work. It wasn't actually until last year that our accrediting body put this as part of their forward plan, because it hasn't been something that's been focused on in the field. And so we were excited. Dr. Fischle and I were like, yeah, it's finally happening. It's going to be something we can look forward to, because as social work technologists, we were looking like, hey, what about us, when other fields already had this grand implementation plan, this path-forward plan, and we were not there as a profession.
Defining The Social Work Technologist
Dr. Helen Fischle, DSW, MSW: Yeah. And I think the other thing for us was, you know, I've always been into sci-fi and thinking about the future. I'm a Trekkie, you know, a Star Trek girl. So I've always wondered: what's going to happen in the future? How is AI going to be integrated into our daily lives? And also, just thinking about the digital divide, working at an HBCU, I wanted to make sure that our students were going to be able to be a part of this new technology, and to think about how it would impact them as well. So as a department, we were just like, let's go for it. And our leadership was like, yes, let's do it. Let's use this technology to our advantage. And so the conference was born out of that desire to move with the times and that desire to give our students access to this technology. Because, as Dr. Turner was saying, we didn't want to miss out on that. The conference was really about bringing people together, other social workers, people who were forerunners in the community using technology, and saying: Hey, what are you doing? How are we using this technology? What technology is out there? What are some of the ethical considerations we need to keep in mind as we use it? And we need to think seriously about social workers having a seat at the table when this technology is being developed. When you all presented, you really talked about the idea of this technology being human-centered, right? Keeping humans at the center of it. And I know, for example, the Council on Social Work Education and the Grand Challenges of Social Work all talk about how we are going to leverage technology to help humanity.
So, you know, we wanted to be a part of the change and have a seat at the table.
Ruth Reymundo: Right, right.
David Mandel: You know, you mentioned Star Trek and my ears perked up immediately. But people forget, I think, if I can take a serious turn on it, that Star Trek was really groundbreaking about envisioning a future that was multiracial. In the middle of the Cold War, when humanity was worried about destruction from nuclear war between the Soviet Union and the United States, it imagined a future where you could have Chekov, who is Russian, alongside people who represented the United States. You had the first interracial kiss, I think, on Star Trek, and a Black woman in a leadership position. So I love this reference, because it really is about envisioning a future where we all can see ourselves, where we all feel like we have a voice and everybody's creating it together, versus a future created by tech moguls who look a certain way and represent a certain class. I think this is so important; there's so much here. So that's one thing I want to say. The other thing is, you used the term "social work technologist," and I'm not going to let that one go by, because that may be the first time members of our audience have heard it. Can you expound on what you mean by "social work technologist"?
Dr. LaSharia Turner, DSW, MSW, CSW: Yeah. So essentially, well, first of all, let me go back. I was so excited when we actually got a way to identify ourselves, because in the past we were just social workers who loved to dabble in technology. It wasn't until more recently that a title was coined for us: social work technologist. Essentially, we're just social workers who try to solve problems and use innovative ways, through the use of technology, to work toward solving those problems. Dr. Fischle and myself are definitely some of the ones within our university who use that term, and both of our research was centered around it when we were in school: how can we solve this? I worked toward developing an app for survivors after incidents of violence. So it's really a newer term, but I was so excited once I had a way to identify who I was.
Access, Equity, And The Digital Divide
Ruth Reymundo: Right, right. Yeah, that's great. You know, I feel like, obviously, when there are new innovations, when there are emerging ways of delivering care that have never existed before, one of the things we're doing very poorly is making sure that we have the right inputs from the beginning. What we're seeing globally is development projects, having to do with court systems or social care or child protection, being predominantly developed by tech companies, with maybe one SME who comments every once in a while. And we maintain that while that may be efficient for the person who's developing it, it is not only an inefficient way to create technology for care situations, it's also an irresponsible and dangerous way to do it. That is because, from the beginning, you need that architectural input from practitioners about what it is that they need. And we've already been through this in the medical field: over 20 years of technology applied to care provision, where we've seen how development has needed to shift, and how medical professionals became more informed about technology so they could make better tech for delivering care, for managing patients, for case notes, for integration of care, especially in hospital settings, where you have multiple streams of information and you have to keep them all managed. So I'm just going to put a call out to people who are developing technology for social care situations: you should indeed begin your plans with social work technologists, and with people who are very informed about the delivery of that care, before you hire 30 engineers and spend 5 or 8 or 10 or 20 million dollars, because otherwise it's a waste of everybody's time, it's dangerous, and it's a waste of money.
So I'm just going to say that like a survivor says it: right out. I don't know if you guys agree, but that's how I'm seeing the situation. So how do you feel about how the technology world has interfaced with social work, and how do you see the future forming to make better, more applicable, and safer technology?
Why Practitioners Must Shape Tools
Dr. Helen Fischle, DSW, MSW: I think, you know, we were talking the other day about trying to get students more involved with technology, training them, collaborating with the STEM programs on campus. So, for example, I was teaching a course, and that particular day I had an assignment where I asked the students: if you could create an app to help children, what would it be? We did the assignment, and one of the students actually took it further and has created an app. So what I see needing to happen in the future is helping our social work students embrace technology and look at the possibilities. If you were to create an app that could help children, what would it be? What would it look like? And then maybe working with a STEM student so they could create it together. I think it's really important that we work in a multidisciplinary way. You know, I don't know how to write a program for a computer, but I can envision what I would want the program to do. And if I could partner with somebody who could do that, then there's so much that can be accomplished. So that's what I would like to see in the future: our students getting involved; if we have a STEM day, having social work students present on that day with some type of technology they've created. Bridging those two areas, social work and computer science, would be a big achievement.
Ruth Reymundo: Yeah.
Dr. LaSharia Turner, DSW, MSW, CSW: Absolutely. Oh, I'm so sorry. But yeah, just starting that conversation early, as soon as possible: teaching students how to use technology ethically, and how to be innovative with the strategies that they use. I always tell my students: in an ideal world, if you were planning to do something, what would you create a device for, or how would you use this technology? I start the conversations with "what would you do, and how would you do it ethically?" Because we try to keep ethics at the forefront for our students and guide them, but it's kind of like with a child: you start early, so that it's not an unknown phenomenon once they hear about it. And even with our non-traditional students, we've gotten buy-in; they have been able to participate, and they're excited when they do. That's one of the things that I love: even the non-traditional students have been a part of this conversation and these experiences. That's great.
Dr. Helen Fischle, DSW, MSW: And I think the other thing is that a lot of times we just assume that young people are, how can I put it, experts with technology, when in fact they're experts maybe with their cell phone. We have to be more intentional in introducing them to other types of technology and training them to use technology for educational purposes, work purposes, those types of things. We just can't assume that, oh, they'll get this, or that they can just hop on a computer and do this and that.
David Mandel: Yeah, that was exactly what I was thinking about, Helen: there's a myth of young people's affinity for technology, and that older people are slower to adopt. What I find around AI particularly is that older people are maybe embracing it differently, or have a different view of it, and that the younger generation, students in their twenties, teens, have real ethical and environmental concerns about AI. Our kids, I'll say, that's my sample, view it negatively. So I don't know what you're seeing in the social work student population, particularly around AI. What's the reaction, and what's the use of it? Because it's a big ethical issue, about writing and papers and research and the environment. There are a lot of layers of social work and environment there. So talk to us about AI specifically.
Teaching Tech Ethics In Classrooms
Dr. Helen Fischle, DSW, MSW: Well, you know, when I was working on my doctorate, one of the things I found in the literature was that in various studies that looked at, for example, virtual reality in social work, students didn't automatically know how to use it, and there needed to be training on how to use it. So I think in general, for social work educators, having a lab, for example a computer lab with a person who is able to walk students through how to use it and get them set up, is really important. Because as a professor, I can say, oh, we're going to do this, go to this link, and so on. But it's really important to have a dedicated space, and somebody who can assist the students if they're having technical problems. Also integrating how to use AI into the course curriculum, and having a policy statement in your syllabus: we do not want you to generate responses for an essay, here is the ethical use of AI. So there is definitely a call for us to have more courses where we integrate AI and teach how to use it in an ethical way. In my classes we talk about it with students, so they know what the expectation is. I know a lot of schools now have really good policies about the use of AI, but I think as a professor you can model it for them. For example, you may stop at some point in your class and say, okay, let's use Gemini to see what it says about this answer. But then you teach them to check behind the AI, right? Because sometimes it hallucinates; it might give you inaccurate information, maybe because you didn't prompt it right.
So teaching them that it's important to check behind it, to check the facts. It can give you a good starting place, but ultimately we want you to be able to think for yourself, to critically think and sort through information. I think that's really important to teach them, because AI is here to stay, right? So we need to teach our students how to use it in an ethical way, so that they can help others and work quickly.
Dr. LaSharia Turner, DSW, MSW, CSW: Yeah, and teaching them about biases when it comes to data and data production. Because if I put something in slightly wrong, like changing one word, I can get a totally different result. So it's about helping them understand that it may not be dependable in certain situations or cases. That's one of the things I do in the classroom: I give them a case study and have them run it, and then they realize everybody got a slightly different case study, and we got all different results. Simply changing a word can completely change the outcomes.
Bias, Data Quality, And Harm
Ruth Reymundo: Yeah, that's really important, LaSharia. So important, because I know the argument is that as AI grows and its training gets more inputs, it's going to be more contextually accurate, contextually appropriate. But I think the fact that we were having a conversation where we were saying that social workers should be at the table already defines part of the problem. These platforms are being trained on information that is fundamentally focused in very narrow ways, which means bias, because it's lacking a lot of context, and they are also being trained on bias. What we've seen is that the predictive analytics that have been overlaid on child protection models and case files are extremely biased. They have the old biases in them; they have information in them which is decontextualized; and that is leading to a lot of removal of liberties and removal of children when it is not necessary. And that's not benign. That has long-term societal health-cost impacts for everyone, not just that child and not just that family. This is a collective problem. So I want to go back to the training of AI.
David Mandel: Yeah, can I give an example from the real world that carries over to AI? I'd love to hear both of you comment on this. As we work globally at the Institute around multi-agency teams and information sharing, there's a lot of movement toward: you need to be more coordinated, you need to be co-located, you need to share information to create efficiencies and better assessments. All of it makes sense. But what I found in many of those cases, when it comes to the issue of domestic abuse and kids, is that more information didn't mean better information, didn't mean better decision making. If we're going to ask our agency partners for information on a family, a lot of that's going to be deficit focused. A lot of it's going to include decontextualized failing out of programs, moms being held responsible, more information about moms who are victims than about perpetrators who might be parents. There were all these biases built into the data. So what you did was accelerate; you weren't shifting the paradigm to make it fairer or more just, you were just supercharging the coordination of degraded information, let's say. And my concern would be that you move that same set of gaps, that same set of problems, into AI. It gives the veneer of technology, of objectivity, of having been reviewed, and people even trust it more, and it comes together more quickly. We know that the underbelly, I'm going to say, of social work and human services is that the professional narrative eclipses the family's own story of themselves and becomes more powerful and determinative. If that happens without AI, my concern is that untrained, poorly trained AI, I'm going to make that distinction, will do more of that. So I'm just wondering what both of you think about that.
Dr. LaSharia Turner, DSW, MSW, CSW: Okay, I'm sorry, Doctor. No, go ahead. No, I was just going to say, first of all, you are exactly correct. Someone has to start the input into the system. It goes right back to what Ruth was saying earlier: we need to be a part of the software development, the ongoing development of these systems. This is a great opportunity to hand off a lot of the biases that right now we're seeing in predictive analytics and in the technologies Ruth was speaking of earlier, like in healthcare. But if we are not providing insight and input, it's never going to be corrected. Like Helen said earlier, AI is here to stay; it's not going anywhere. We are pretty much in this place. And I'm going to bring up something, I'm going to age myself, so I apologize. It's kind of like the chip, made in, I think, 1962: the person who made it was a futurist, thinking a hundred years out. And the way that you all think, you would definitely qualify within that. You are thinking ahead; you have to think ahead in order to make things happen. I remember watching The Jetsons as a kid, and I'm like, oh my gosh, they can talk to people on their watch. I thought that was just so cool. They were thinking ahead, and that's what we have to do as we're thinking about AI. And I know I just aged myself, I am so sorry, but that's what we have to do as professionals. I know David is a counselor, and that's what he's doing, and that's what we have to do in all of these professions, especially the ones that serve others, to help make their lives better with the resources that we have.
Guardrails That Interrupt Bad Practice
Dr. Helen Fischle, DSW, MSW: Yeah. Can I also just add, in reference to what you said, David, about people using predictive analytics and so forth: I think the danger lies in the fact that AI gives you fast answers, right? You put something in and boom, the answer pops up. And because we're human, and because we've got so many things on our plate, we might just go with that answer and not really even think about what it just said. Is this answer biased? Because, you know, I want to finish work and go out to have tea with my friend, go have lunch, right? So I'm just going to go with whatever it turns out and not really stop to, like I said, check behind what it puts out, the way I teach my students to. And so that's where the danger lies: the human part of us wants to just get it done, wrap it up, finish that case, close it out.
Ruth Reymundo: Yeah.
Dr. Helen Fischle, DSW, MSW: And therefore there is a temptation to not really critically think about what the AI told you, or analyze it, or ask how it will impact my clients. And what does my gut say? We talk about gut feelings in social work, right? What does the evidence say? Based on my interview with that family, or looking at the family within a cultural context, where they're from, that type of thing, can AI do that, you know?
Speaker 4: Yeah.
Dr. Helen Fischle, DSW, MSW: And so that's where we have to be careful, because it's quick and easy. And social work, working with people, is never quick and easy.
Ruth Reymundo: Yeah, you know, AI is so fast to the solution. It's like, boom, this is the answer. And it takes a lot of work, actually. We know this firsthand. It takes a lot of work to make AI curious: to default to curiosity, to default to exploring the context, to default to really listening to the information from a place of knowledge of patterns, of violence, of how harm usually occurs, of what the socioeconomic, cultural, and familial situations are. So it's really real that you truly cannot safely take any form of untrained AI without any guardrails and just use it for decision making in social care situations. I want to say that again, people. It is not okay and it is not safe. You're gonna get something that sounds like a really good answer, or it's going to affirm your own bias, because it's very people-pleasing and it really wants you to keep engaging with it. So we really do have to talk about human-centered design: not replacing the human and their trained judgment and their institutional responsibility and authority, but using AI to be the leading edge of that curiosity, asking what the next thing is, what we should consider, and guiding that human towards better information gathering, better documentation, and better decision making.
David Mandel: So for me, one of the biggest fears, and there's a whole thing about training social work students, if I'm thinking about them, to be critical and think ethically about the systems they're going to step into and work in, where there's more likely to be an AI note-taking system, or AI analysis of families' performance or compliance, and surveillance being AI driven, and what AI is letting us do in terms of watching people. As a whole, we say AI like it's one thing, and it's not. So one of my concerns is this seduction of efficiency that really reinforces disempowerment. And as an antidote, just to pivot us towards hope and possibility, a lot of our work asks, and I'm saying this to everybody, to our audience: how does AI get used to interrupt bad practice?
Ruth Reymundo: Yeah.
Supervision Over Automation
David Mandel: You know, with the people pleasing, what we're doing with the Safety Nexus tool, because we've got a point of view with the Safe & Together Model, and we're bringing that IP into AI-powered tools, is you have the capacity to build rules in, guardrails. You have the ability to flag and say, wait a second, that may be destructive domestic violence language you're using. Can we dig behind that? Because you're saying the mom is in denial. Can we explore what you really mean? So what we're literally doing is testing and building something that supports good practice, that we do believe will be more efficient, more effective, more supportive, that will be person-centered, but has the capacity to say, wait a second, you may not have written that the best way. Did you mean to write that in a way that blames the victim for what the perpetrator did? So I think this idea of how do you build systems that use AI to interrupt?
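[Editor's note: The kind of guardrail David describes, flagging victim-blaming or perpetrator-minimizing phrasing in a draft case note and prompting the worker to reconsider, could in principle be sketched as a simple rule-based check. This is an illustrative sketch only; the phrases, prompts, and function names below are hypothetical and are not the Safe & Together Institute's actual Safety Nexus implementation.]

```python
# Illustrative sketch: a minimal rule-based guardrail that flags phrasing in
# a draft case note and returns reflective prompts for a human to consider.
# The phrase list and prompt wording are hypothetical examples.

FLAGGED_PHRASES = {
    "mom is in denial": "Consider documenting the perpetrator's behavior rather than the survivor's response.",
    "failure to protect": "Does this language hold the perpetrator, not the survivor, accountable?",
    "mutual conflict": "Is there evidence of a pattern of coercive control by one party?",
}

def review_note(note: str) -> list:
    """Return reflective prompts for any flagged phrases found in a draft note."""
    lowered = note.lower()
    return [prompt for phrase, prompt in FLAGGED_PHRASES.items() if phrase in lowered]

# The prompts interrupt the worker's draft rather than rewriting it for them:
for prompt in review_note("Worker observed that mom is in denial about the risk."):
    print(prompt)
```

The key design point, consistent with what is said in this conversation, is that the tool raises questions for the human to answer rather than silently correcting or deciding.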
Ruth Reymundo: Yeah.
David Mandel: Bad practice, dangerous practice, or bias, yep. Versus focusing only on efficiency. I'm really worried that the focus is gonna all be on efficiency, because time is precious. It's one of the most valuable resources. We talk about social workers in agencies being time poor and caseloads being too large. So there's a legitimate pull there, and I'm gonna keep saying it's seductive to sell efficiency. So I don't know, I'm just curious if either one of you has thoughts about that.
Dr. Helen Fischle, DSW, MSW: I think, like you said, say I'm creating some type of system to help social workers. When you presented at our conference, you talked about this: does it ask questions? I think you had mentioned, did you ask the father about his involvement? Where the AI might say, okay, this could possibly be, but where is the evidence that this is so? Where it will ask you those questions, or it might say this might be, or maybe, or could be an indication of, but you need to investigate this further. I think having those types of responses built into the system is very important. But again, will the worker read it?
Ruth Reymundo: Yeah, right.
David Mandel: You know, will they use that? No, you're right. And that still doesn't solve the problem of who has the authority to approve good practice or bad practice, and it can't be the machine. It can't be.
Dr. Helen Fischle, DSW, MSW: It can't be us.
David Mandel: Yeah, that's right.
Dr. Helen Fischle, DSW, MSW: It has to be, okay, say I'm running an agency and my workers are using AI, this wonderful AI system that we've invested thousands of dollars into. Ultimately, I believe supervision, human supervision, is very important. Okay, I prompted the system, and this is what it said. And having that supervision: let's look over that, or have a team look over it, some kind of case management team. This is what it said, but as a team, what do we think? Do we agree? What are some of the other factors that we need to consider? And this is why I feel, again, I love technology and what it can do, but ultimately we as humans need to, how can I put it, we need to be in charge. We need to be leveraging the technology, not the technology leveraging us. So yeah, that's what I think.
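[Editor's note: The supervision pattern Dr. Fischle describes, where an AI suggestion stays in a pending state until a named human reviews it, is a standard human-in-the-loop design. Below is a minimal illustrative sketch under that assumption; the class and field names are hypothetical and do not describe any real agency system.]

```python
# Illustrative sketch: AI-generated case suggestions are held as drafts and
# cannot move forward until a named human supervisor approves or rejects them.
from dataclasses import dataclass

@dataclass
class AISuggestion:
    case_id: str
    text: str
    status: str = "pending_human_review"  # default state: no effect until reviewed
    reviewer: str = ""

    def approve(self, reviewer: str) -> None:
        """Only a named human reviewer can move a suggestion forward."""
        self.reviewer = reviewer
        self.status = "approved"

    def reject(self, reviewer: str, reason: str) -> None:
        """Rejection records who declined the suggestion and why."""
        self.reviewer = reviewer
        self.status = "rejected: " + reason

suggestion = AISuggestion("case-042", "Recommend referral to parenting program.")
print(suggestion.status)  # stays pending until a person acts
suggestion.approve("supervisor_jane")
print(suggestion.status)
```

The design choice mirrors the conversation: the machine can draft, but authority and accountability stay with an identifiable human.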
Efficiency Without Losing Judgment
Dr. LaSharia Turner, DSW, MSW, CSW: Like using it as a guide, not a replacement. I don't know, someone said this earlier on here, but using it as a guide and not a replacement. And to piggyback, we have to encourage the why, and I love how your technology is doing that; it's providing the why behind it. Because what's happening is that we are seeing a reduction in critical thinking skills amongst individuals who are using this technology if they're not using it properly, right? I love technology, I embrace technology at all times, but we are seeing weakened critical thinking as a result, yeah.
Ruth Reymundo: Technology should support curiosity and human creativity; it should not replace it. That is my firm stance. And just so you know, it was not lost on me, LaSharia, that you called us the Jetsons, and I do want to do well with creating the future, but I don't want to be Jane. Maybe a very intelligent Astro, but, for anybody who doesn't know the Jetsons, Astro was the dog. Okay, so this is where that leading edge starts again, where we go back to the reality that you mentioned, LaSharia, and the term having a place at the table. I'm sorry, but the table should be made partially by the people who are the ones delivering care. The table should not be someplace we have to petition for a place at, because it is vital that that creativity, that curiosity, and that human-centered decision making is the center. And I feel like one of the things that's really happened around algorithmically based technology, meaning technology that has to work on zeros and ones: who is it that defines the parameters of those zeros and ones? A lot of times it's a very narrow set of people. And once you have a very narrow set of people create a technological tool that is said to support human beings, what starts to happen is that the tool starts to control human behavior, rather than human behavior being liberated and creativity being freed for the human to operate on a deeper level and get more information and context. It can sometimes narrow the conversation dangerously, so that the actions can be faster, instead of those actions being well contextualized, well considered, and looked at from multiple perspectives, so that we can really make sure we are not impinging on people's liberties, their self-determination, and their contact with and custody of their children.
Because when technology starts to define things in ways that are very narrow and not actually tied to that family's reality, that's when it becomes very dangerous.
Dr. LaSharia Turner, DSW, MSW, CSW: Absolutely.
David Mandel: Yeah, I want to move us along: I want to share something we're doing and then move us to some comments on where we think the field is going and what you're excited about, with your futurist brains. I love that. I've never been described as a futurist. Thanks, LaSharia. That's something I'm gonna take away from this.
Ruth Reymundo: Every time we meet LaSharia, we feel really good about ourselves.
David Mandel: Yeah, and now I'm, like, a futurist.
Ruth Reymundo: She was in the middle of an airport, in the middle of nowhere, and she goes, are you David Mandel and Ruth Reymundo? And we're like, we're famous?
Co‑Design With Lived Experience
David Mandel: That's great. And you are famous, in a very teeny tiny way. But one of the things I'm most excited about with the Safety Nexus tool and our use of AI is our ability to take the proven model, the Safe & Together Model, which has been built off of listening to survivors, listening to social workers, and listening to practitioners. A lot of listening has gone into building the actual model, a lot of testing, a lot of research. So we know it works in the real world. It makes differences; it leads to different outcomes. So we have a proven model, we're not starting from scratch, and we're moving it into technology. And even that is exciting. But what excites me even more is the idea that we can then open up the conversation, through AI, through technology, to different voices to comment on the model and how the model applies or adapts to their situation. Whether it's people with lived experience, survivors: what does partnering look like to you if you're an African-American woman in the South with a particular set of lived experience concerns? How does partnering, which is one of the principles of the model, look to you? What does the professional on the other side need to do so you, uniquely in your position, feel partnered with? Then take that to an Aboriginal survivor in a rural area of Australia, or to Scotland. We're really looking at bringing together people who are experts in neurodivergence and saying, we know neurodivergence and domestic abuse intersect, so talk to us about how the model can be shaped or tweaked so it works better for you in your circumstance.
And we're actively working on this: we're actively creating functionality in the tool so we can have those conversations and gather the information in a non-extractive, collaborative way. I'm really deeply excited about this. It's a vision that you and I have had, and I share it partly to describe what we're doing, but also to say there are ways technology opens things up. Ruth, you've always said social media was really important to you in terms of learning things about the world that were withheld from you under regimes of control. I guess I'm sharing that as an example, but I'm curious: looking forward, what most excites you about technology and social work, technology and child protection?
Dr. LaSharia Turner, DSW, MSW, CSW: Yeah, I think what it is doing is allowing us to train more social workers at a faster pace. What I mean by that is, for instance, one of the things Dr. Fischle and I are working on is training social workers in Alabama through the use of AI simulation, to be able to give them similar trainings using the technology that we currently have. We are obviously continuing to advance, and I can't wait to provide you all with some of our new partnerships. But what we're excited about is the training, and leveraging the AI, to go back to what you said earlier, David, to reduce burnout for social workers. If we can keep social workers in the field, then we have less turnover in cases, we have less lag in getting cases resolved, and we have better possibilities of returning children home faster, or whatever that situation is; it's about our ability to support the families that we have. I know we still have a ways to go, but I'm gonna be honest: if you had asked me as a teenager whether we'd be here, I probably would have said no, just because of how scary that is to think about and process. So I am honestly excited about the possibilities of embracing the technology and training aids. Even documentation tools have been out for a while. I remember using Dragon when I first started. If I'm continuing to age myself, I apologize.
David Mandel: Oh my god, I used Dragon. I'm sorry, I used Dragon, and it's so funny that we were all so excited at that point about using voice-to-text, taking notes in the car, getting it like 50% wrong, and going in and correcting it. Sorry to interrupt, but you brought me back.
Dr. LaSharia Turner, DSW, MSW, CSW: No, I used to be in my car with my little recorder. Yeah, Helen Fischle knows, all with a little voice recorder.
David Mandel: You thought you were on Star Trek then. You were like, look at me, I've got a little handheld voice recorder. Sorry, yeah.
Dr. LaSharia Turner, DSW, MSW, CSW: And I used to brag to my friends who didn't have one. I was like, well, my company bought me this. I'm a dinosaur for saying that.
Ruth Reymundo: That's so funny.
David Mandel: I forgot about that.
Ruth Reymundo: Did you keep your pager too?
Speaker 4: Yeah.
Dr. LaSharia Turner, DSW, MSW, CSW: We won't talk about it.
David Mandel: That's the thing. No, you may have been going someplace before we interrupted, sorry, LaSharia.
Dr. LaSharia Turner, DSW, MSW, CSW: Listen, I have a tendency to do that. No, we just have so much to look forward to with using this technology, and I'm excited about its possibilities. I love what you all are doing with your program; it's very impressive. For those who did not have an opportunity to join us this summer, we got some insight into that technology, and we're so excited. And I do know it's continuing to change every day. I got a demo of a software, and Dr. Fischle and I, I'll use the word psyched, we were so psyched about it, because it goes back to what David was saying earlier about actually learning. If I'm an African-American woman in Alabama, that's gonna look different from an African-American woman in California, right? So we wanted to test one of their cases that was with an African-American woman, and she was, I would say, more middle-aged, and they had not just the accent but the mannerisms, like, perfect in this technology. And we were just impressed, because normally it's not at that level. As you go through the different cases, they have everything down to the mannerisms of the person, which I thought was extremely important. You were actually listening to the person, paying attention, watching their movements. And we're starting to see this type of technology roll out. I can't be more thrilled about that. I'm extremely thrilled and excited to tap into some of this new technology.
Dr. Helen Fischle, DSW, MSW: Yeah, I think what I'm looking forward to is, as Dr. Turner was saying, an increased child welfare workforce, and a well-trained child welfare workforce, right? Where they can use technology to make their work more efficient and faster, but also where they have been trained using this technology. That means we can create simulations that speak to us, to our personal experiences: creating those simulations, telling those stories. One of the things we've been doing is working with our theater department, so the theater students are getting involved with social work. With our in-person simulations, we've been using theater students; sometimes we use each other, our faculty in our department, to play roles. For example, we had a mock court case, a child protection case, and that was amazing. Our theater students played the clients, and we actually had a real judge and real attorneys. So I'm excited about working, as I said earlier, with other disciplines, and introducing them to the field of social work and how they can play a role in helping children and their communities, because a lot of times they're like, I didn't know I could be involved with this. I think technology is a way of bringing us all together and saying, how can we improve the world that we live in? How can we make women and children safer? We can all do something, because sometimes people think, oh, that's what social workers do. No, you can help a family, you can help decrease child abuse, even if your degree is in computer science.
How can we all come together, right, and work to address these issues that our society is dealing with?
Realistic Simulations And Cultural Nuance
Ruth Reymundo: Yeah. I mean, I am really hopeful and grateful, because you all are part of the conversation, because you all are out there leading the way in the conversation. It makes me feel like we're in the right conversations, we're pushing the right topics. The people who know what it means to train the new social workers and child protection workers of the future know that we need to do better at training this technology so it's contextual, so it's accurate, so it's equitable, so it is non-biased, and so it leads to better decision making. It doesn't just lead to faster, bad decision making. So that's my hope.
David Mandel: Yeah. I love this being a way to wrap up. For people who are inside these fields, there are distinctions between counseling, psychology, social work, psychiatry, mental health, and addiction, and they all overlap; for lay people, they can kind of blur together. But social work particularly is about justice. It's about changing systems, it's about lifting up not just individuals and families but communities. And I can't think of a better place for leadership to come from around shaping technology, and around advocating not only for professionals but for impacted families and individuals to be at the table. Like you, Ruth, I love being in this conversation. We really appreciated the invitation to come to Alabama A&M, to the School of Social Work, to talk about technology and child protection. And we're looking forward to more conversations like this with both of you.
Ruth Reymundo: Yes.
David Mandel: So I want to thank you for being on the show.
Ruth Reymundo: Yeah. And you know, we are firm believers that tools have to serve people; people can't serve tools. So we're all gonna be focused on that as an end objective.
David Mandel: I'm sorry, I just had another science fiction moment, but not a good one. This is like the Twilight Zone episode where the book was called To Serve Man, and it wasn't about serving humanity in a positive way; it turned out to be a cookbook. Now we're in the Terminator. Sorry, sorry.
Ruth Reymundo: Sorry, sorry, and of course I'm the one who brought it up. Yes, we're thankful for both of you. Very grateful.
David Mandel: Both of you.
Ruth Reymundo: So thank you so much, both of you, for joining us. Thank you. Really, every time we spend time with you, we just want more.
David Mandel: So thank you both. And you've spent another hour with us and our guests on Partnered with a Survivor. If you like this podcast, share it, like it, follow it on your platform, get it out there. We're nearing 200,000 episode downloads.
Ruth Reymundo: Yes, we are.
David Mandel: We'll hit that this year, in the next few months, which is just amazing. It's been an amazing run. So I am David Mandel.
Ruth Reymundo: And I am Ruth Reymundo Mandel.
David Mandel: And we are out.