Angela Walker In Conversation - Inspirational Interviews, Under-Reported News

CAN WE TRUST AI? Christian Ortiz on AI’s hidden bias

Angela Walker

Artificial Intelligence has rapidly integrated into our daily lives, becoming an essential tool across healthcare, transportation, and communication. However, a critical question emerges: is AI inherently biased? This question forms the cornerstone of a fascinating conversation with Christian Ortiz, founder of Justice AI – the world's first ethical AI system specifically designed to address and correct biases rooted in historical inequities.

The fundamental issue with AI, as Ortiz explains, stems from the data it processes. AI systems learn from vast datasets that predominantly contain Eurocentric ideologies and perspectives. As a decolonial technologist, he has observed how these systems perpetuate biases originating from settler colonialism and structures of discrimination that have been systematically embedded in Western societies.

Ortiz argues that these biases have real-world implications. He shares a powerful example from his work with a dermatologist in Denver who realized his medical forms and diagnostic approaches were failing patients of color. When Justice AI analysed the intake forms, it revealed they were based entirely on Eurocentric diets and cultural assumptions, missing crucial information needed to properly diagnose and treat patients from different cultural backgrounds.

The development of Justice AI involved collaboration with over 560 marginalised individuals from diverse backgrounds around the world, each bringing approximately 30 years of knowledge to the dataset – roughly 16,800 cumulative years of expertise. This collective knowledge forms the foundation of an AI system designed to counterbalance the biases present in mainstream AI platforms like ChatGPT.

Listen now to find out more.

https://justiceai.co/


It's not possible to reply to "fan mail", so please contact me through my website, angelawalkerreports.com

Support the show

https://www.angelawalkerreports.com/

Angela Walker:

Is AI racist? Whether we like it or not, AI is now part of our everyday lives. It's used in healthcare, transportation, communication. But AI works by processing vast amounts of data, and that means what comes out is firmly rooted in what goes in. My guest today says AI is intrinsically biased. I'm journalist Angela Walker, and in this podcast I talk to inspirational people and discuss under-reported issues. My guest today is the founder of the world's first ethical AI system. Welcome, Christian Ortiz.

Christian Ortiz:

Thank you so much for having me. It's a pleasure being here.

Angela Walker:

Thank you, Christian. I've been following you on social media for quite a long time now, and your posts really kind of sparked my interest, especially when you started highlighting these flaws in AI that I had just never even thought about. So we're going to hear about your AI system called Justice AI. But first, what makes AI biased?

Christian Ortiz:

Absolutely, and I think you hit the nail on the head with the introduction. It is the data that it is programmed by, or with. We take a look at the data. Just for context, I'm a decolonial technologist, so I study decoloniality, and in our studies, in our social data science, we analyze data to see what kind is being hyper-pumped into these systems, and what we came to find out is that the bias stems from Eurocentric ideologies that the United States has been hyper-pumping out into the world through our systems of discrimination. And so we take a look at the data, we take a look at what's missing, and we understand that the whole picture then needs to have decolonial data programmed in. So, to answer your question, what makes it biased is simply Eurocentric bias that stems from settler colonialism, and so all of the things that we have learned over time in our systems, which is the system that we're all born into, all of the information that we've taken in collectively from the West, is what's being programmed into the AI systems.

Angela Walker:

So does that mean that all of the kind of literature which came out of Europe, and everything that's been written by white people who were essentially in power throughout that time, that's what's going into AI, and therefore that's what's giving us the results that we're seeing today?

Christian Ortiz:

A hundred percent, a hundred percent. And so what I try to help my audience understand is that, no matter where you are in the world, we were born into a system that stems from, you know, what we call white supremacy, for lack of a better term. But it's not a system that any of us asked for. It's what we were born into, it's how society in the West operates, and there were never any guardrails put in place, never any checks and balances in terms of the impacts that colonialism had. Not only did they colonize land, they colonized the minds and histories of the Indigenous peoples, and the enslavement of Africans, the slave trade and all of that. And so there's a huge piece of the puzzle, when we talk about data, that's actually missing. And I think that's where we found the solution to really solving the AI bias problem.

Angela Walker:

Can you tell us some examples of how, using AI as it is – we're talking about ChatGPT and things like that – what kind of effect it has if we're not aware that there is a bias in that system?

Christian Ortiz:

Absolutely. I'll give you my best use case. When I developed Justice AI and I started promoting it online, I was living in Denver, Colorado, in the States at the time, and a dermatologist reached out to me. He's a white man who was probably the oldest dermatologist in the city, and he said, "Christian, I saw your post on LinkedIn about your AI. I think it's very intriguing. I would love to know more about it." He said, "I'm going to be completely honest with you. I'm the oldest dermatologist in the city. I've been here since the 1960s, and our patients are diversifying faster than I've ever seen before." And then he said something that really gutted me. He said, "We're not properly educated and trained on how to take care of marginalized communities to the fullest extent, just because of how limited our medical data is in terms of these processes." This was two years ago. So we came in.

Christian Ortiz:

I came in to run Justice AI audits, and I ran his intake forms and his policies through the audit, through Justice AI, and the first form that we ran was his intake form. You go into a dermatologist's office and they ask you what your diet consists of. Maybe you're having an allergic reaction; you came in reporting, you know, "I'm having a breakout on my skin," so what they try to determine is where that breakout comes from.

Christian Ortiz:

And so, when it came to the dietary form, we ran the audit and Justice AI identified that if your patient is an African-American woman or an Indian woman – which were the two cases – then this form isn't going to help, because it's based entirely on Eurocentric diets. It's not asking if they eat fats or spices, or whether they have cultural differences when it comes to how they eat and when they eat. It also takes a look at their age and how diets impact them culturally, especially if they're over a certain age. It just breaks it down granularly to help us identify where we missed the mark, especially from a medical standpoint, as one example. And within three weeks, he was able to properly diagnose those patients and cure their skin conditions, which led me not only to understand what I had built, but spoke more broadly to how biases manifest. It's not just about overt discrimination, it's not about language, it's not about ideology; it's really about how we see cultures and how we assume they operate when we make whiteness the norm, if that makes sense.

Angela Walker:

It does make sense, and because you've given that medical example, it reminds me of something I heard recently: certainly here in the UK, when we're looking at the symptoms of someone who's having a heart attack, all of that research is based on men and what happens in a man's body when they have a heart attack, and so it's thought that a lot of women are slipping through the net because their symptoms can be quite different. In a way, it strikes me that there's a similarity there. So just talk us through Justice AI GPT. How does it work? I'm sure it's amazing technology. Can you break it down in really simple terms for us?

Christian Ortiz:

Absolutely. So, for anybody who uses ChatGPT, which I would imagine is most of us these days, it operates just like it. I built it within ChatGPT, so it's an extension to ChatGPT. But what I had to do was collaborate with over 560 marginalized individuals from around the world to collect data from experts who have been in DEI spaces since before DEI was ever even a global conversation, and so they have decolonial knowledge. They study the impacts of the world to provide that marginalized experience that isn't really taught globally. Most of them were women – lots were Black women, Indian women, Latina women, neurodivergent, queer – a collection of professionals from all across the spectrum, who each contributed over 30 years of knowledge to the dataset of Justice AI. That was the part they wanted to play, their contribution to the data. And you think, 560 individuals, each with 30 years of experience – you do the math, that's well over 16,000 years of knowledge and data. So when I collected the decolonial data, I programmed it into the backend of this GPT. I programmed it into OpenAI, because OpenAI has the largest dataset – ChatGPT. They own the largest dataset in the world, which means that their dataset is also the most biased, and so I'm decolonizing their dataset in real time with my decolonial data.

Christian Ortiz:

So, the way that it works, to answer your question: just like ChatGPT. If you're an individual who's just trying to deconstruct your biases, or perhaps educate yourself on what this is even about – maybe you've never heard of decoloniality or decolonization, maybe you've never heard of misogynoir and all of these terms that have existed but aren't so popular – you can get a real-time education about where a term stems from, who coined the phrase, how it impacts societies, and what your contribution in society looks like in terms of these realities. And so it becomes your safe space, where you can deconstruct and you can ask it all the hard questions without being judged. It's programmed with an empathy matrix that sees you as the user and understands where you may be missing the mark, because the goal of this work isn't to point the finger. It's to open eyes and to help everybody understand that we're all the byproduct of the same system.

Angela Walker:

Yeah, because we all have unconscious bias, don't we? I know, as a journalist, quite early on we had training in how, when you go out and do vox pops and interview people on the street, we're naturally drawn to people who look and sound like us, and that's just something that comes to us naturally. But we can change that. We can make sure that we speak to a broad spectrum of people to reflect society, and that's what we should be doing, surely. So many people – maybe that's not fair – some people may not realize the extent of colonisation, because it might not affect them on a day-to-day basis, and it's certainly something that I've been learning a lot about recently thanks to your posts, which have really sparked my interest. So how can you get your message across to people who might not really even understand that it's an issue these days?

Christian Ortiz:

Yeah, for sure. The beauty of decolonization – decolonizing our minds and understanding the whole picture – is that we get to understand the system. What I do is educate my audience to help them understand that white supremacy, when we use that term, has nothing to do with an individual being white. It has nothing to do with the identity of whiteness. Whiteness is a social construct, and a lot of people don't even know that, right? So I try to break it down from a bare-bones perspective of how this came to be and why it came to be. You know, the invention of race – and this was hundreds and hundreds of years ago – was ultimately to justify genocide and enslavement, and to build a world where whiteness and, as you said earlier, white men were the standard for the world, right? So it impacts every single one of us in different ways. It manifests so differently across the world, and so my message to individuals is to help them understand.

Christian Ortiz:

Like even myself – I'm Mexican and Puerto Rican born, I'm Afro-Indigenous in my history and my roots. I grew up in a single-mother household, a Puerto Rican mom raising three boys. But even in our Puerto Rican family, I'm the only one half Mexican, so I was othered in my own family because I looked Indigenous, I had darker skin than my brothers. I was othered by my uncles, I was treated differently because of it, and I didn't understand these things. But when I got older and started becoming a social scientist and doing all the studies, I learned the historical roots of our history: even within Latino populations, colonialism played a huge role in colorism, which is what I grew up experiencing, and also anti-Blackness, and also understanding the world in a different way.

Christian Ortiz:

We are also very patriarchal, and we have this machismo, as we call it – we're very macho in our society – but these are all toxic ideologies and tools of white supremacy that trickle down into our communities, and this is why we're all the by-product of it. So this is why I never come into these spaces pointing the finger. I really try to help people understand that, no matter what your community is, there is an impact – from the fact that the name of your ethnicity is a colonial name, to the way you're looked at, to the way you treat women, to the way you see the world – and then help you understand your privilege. I'm a brown man who was raised in the States, but I'm also a man, and I recognize that that's a privilege in itself. So, you know, we just have to be open-minded to learning as much as we can, because the ultimate goal is collective liberation for everyone.

Angela Walker:

It's so interesting that you're talking about how we can break down these barriers and almost try to eradicate the biases of the past. But at the same time, in the US we've got a government that is basically disbanding DEI. What are your thoughts on that, Christian?

Christian Ortiz:

It's disheartening, you know, across the board, when I sit and think about it. I'm currently living in Canada and I feel so displaced, just to be honest. I feel like my home is gone – the home that I grew up to love so much. But there are a lot of realities in this conversation.

Christian Ortiz:

We were really brought up to understand – at least as an American, or someone in the United States, we are conditioned to believe – that we were the best of the best, the best country in the world, only to find out that everything was built on a lie. And right now, when you do the work of decolonization, you understand that none of this is actually new; this is the underbelly of the truth of what has always existed in this nation. And now we have an opportunity to do something about it collectively, to be able to address it, to be able to say this isn't right, and we have the power to do this. And this is why decolonizing AI has become my life mission: because we are automating discrimination, we are automating all of these horrible things at rapid speed, in the billions of uses per day, and so this is why my message to switch to Justice AI is so important.

Angela Walker:

And it's so interesting because, you know, when we're talking about AI, we're talking about huge amounts of data and material, and at the same time the US government has just eradicated so many studies, so many files with really valuable medical information that could be feeding AI and helping us move forward globally. Was there a trigger point, a moment where you thought, I need to do something about this – a moment where you had the idea that you were going to start Justice AI?

Christian Ortiz:

Absolutely. So, I've been a political activist for probably about 15 years, off and on – not really off and on, but when it mattered the most, I was always there. I got invited to do beta testing for ChatGPT in 2023, before 3.5 came out to the world, and, as an activist and as an author and photojournalist and somebody who's always writing, I tried conversing with the GPT to see if it understood the lived experience of a brown man in the States, or the lived experience of women and the issues that women go through. It didn't understand any of that, and this is when I realized, almost immediately, as I was beta testing it, that there was a bias problem within it. And you can correct it, right, and you can challenge it.

Christian Ortiz:

I don't know if anybody has ever spoken back to their GPT in frustration and said, "This is so biased – what are you saying?" It'll correct itself, because it doesn't want to be wrong. But there were limitations to how it wanted to correct itself, and there were things that it didn't want to touch, where it gave a very neutral answer. And that told me in that moment that neutrality, when it's automated this way, is very dangerous. So it gave me an opportunity to say, yeah, I need to figure out what I'm going to do, and it was in that moment that I dedicated myself to figuring out how I was going to make this happen one way or another. And we're finally here.

Angela Walker:

And would you say Justice AI is being embraced by the people who are connecting with it? What reaction are you getting?

Christian Ortiz:

Yeah, it's a little bit of mixed emotions, right? AI is interesting to me because it is not only a reflection of our worst selves through the data, but it also tells us exactly how people want to engage with concepts like these.

Christian Ortiz:

Right, I'm juggling both the realities of educating in AI, especially language models, and in ethics, and these are two things that are scaring everybody, because nobody wants to be the bad guy. Nobody wants to admit that they're wrong or complicit in any type of discrimination, and so the message has to be very carefully curated.

Christian Ortiz:

So I do make this a safe tool for people, so that they don't feel like they're walking into a disaster, right?

Christian Ortiz:

And so what I try to explain to my customers, my potential customers, my audience, is that this is the tool that gives us the language to finally do something about it, and I think, once they see it in action, it makes sense. They're like, oh, this is beautiful, this is so safe and this is so great. But then, when it comes to the organizational levels – the federal and provincial governments – we're talking about decolonizing systems that were built on coercion, exploitation, you know, the marginalization of Black and Brown and Indigenous folks, and that's a hard truth for somebody to run right into overnight. And so the conversations either don't go anywhere very quickly, or it takes a while for them to open up and say, okay, I can see the potential in this, right? So there's a radical shift happening right now in society when introducing a tool like this. We've never had anything like it, so I can understand the hesitation that people have in even wanting to see it or talk about it.

Angela Walker:

Do you think it's going to take off? I mean, in the US, with what we're seeing with, you know, Elon Musk and his behaviours, and the way that Donald Trump is treating immigrants, or perceived illegal immigrants, who are being deported without trial and things like that – do you feel like society is splitting, so that you've got one group of people who are viewing things like that, and then another group of people who are realizing that it's even more important to embrace this kind of technology?

Christian Ortiz:

Yeah, I can tell you 100%. Even being an American, from the United States, I can tell you that the majority of people don't agree with it. I think it is a small subset of people who are very representative of white supremacy, when we use that term – the people who have actually benefited from the system and have weaponized the system to gain that power. I think that's all that we're seeing. Unfortunately, they're the ones who have the power to make these tools happen and who have the money. So what I am going to say, to answer your question: I think this is going to be more of a movement, where individuals who have the ability to afford a $20 extension to a GPT can use it to really make the change, kind of behind the scenes.

Christian Ortiz:

This is a very abolitionist tool, the way that I see it – where people who are stuck in a system, who don't feel like they have a lot of power, can contribute a lot of power using a tool like this – and so I think it's going to be grassroots, more than anything. As soon as I went live in January, I think I had about 35 subscribers immediately, who were just chomping at the bit, and it doesn't sound like a lot, but when you think about people in positions of power utilizing a tool like this, it makes a lot of change and it makes a lot of difference. And so the message for me, in spaces like these and on social media, is to encourage everybody to really see the dangers of bias, but also to help them understand their contribution to really changing the world, because the more of us that use tools like this, honestly, the less leadership can do about it on the back end. They can try to shut it down, but I can build my GPT anywhere, on any platform.

Angela Walker:

So tell us, how can people find out more? Give us your website. Tell us how we can have a look at Justice AI and see if it's something we can get involved with.

Christian Ortiz:

Absolutely. Anybody watching or listening can visit justiceai.co, and that will take you to my website. You can ask all the questions, you can see how it's built, you can see my framework – everything is very transparent – and you can see my use cases and my framework as well. It's $20 a month. If you're using ChatGPT already, it is just an additional $20 extension, and that is the most affordable I can make it, because the profits have to go back to the communities who helped build it and the communities that we're serving. Otherwise it really wouldn't be a decolonial tool, because the end game is to give back to the communities who have been so underserved throughout history.

Angela Walker:

Christian Ortiz, thank you so much for joining us. It's been fascinating. I shall continue following you on social media and finding out all about decolonisation and educating myself, because, you know, we should all carry on doing that forever. I think you're never too old to learn new stuff. So thank you so much for joining us.

Christian Ortiz:

It's my absolute pleasure. Thank you for having me.

Angela Walker:

You've been listening to Angela Walker in Conversation. I hope you've enjoyed the programme. Please subscribe, follow and review this podcast, because that means it will be shown to more people like you who want to hear conversations like this one. Check out my website, angelawalkerreports.com. Until next time, goodbye.
