Allan Boyd Talks to Experts About Things

Ramona Vijeyarasa - AI assistants fuelling dangerous stereotypes

Allan Boyd - Journalist RTRFM Perth Western Australia


Dr Ramona Vijeyarasa - Professor, Faculty of Law, University of Technology Sydney

INTRO: Many of us use a voice assistant every day on our phones and home devices. We might ask Siri to start a timer, or Google for the weather. But is the assistant’s default female voice an example of gender subordination? RTRFM’s Allan Boyd caught up with an expert to find out more…

Conversation Article: Most AI assistants are feminine – and it’s fuelling dangerous stereotypes and abuse - January 27, 2026  

SPEAKER_00

Did you know there are over 8 billion voice assistants out there in the world? AI chatbot apps like our good friends Apple Siri, Google Assistant, and Amazon Alexa. Which means we have more AI voice assistants now than there are people on the planet. These apps are useful tools in our daily lives, but why is the default voice of your chat assistant a female voice, and what are the gender connotations? Does the design choice of a feminine voice normalize gendered subordination? To discuss this, I'm joined by Professor Ramona Vijeyarasa from the Faculty of Law, University of Technology Sydney. Welcome, Ramona.

SPEAKER_01

Thanks very much for having me on, Allan.

SPEAKER_00

Thank you so much. So with Google Assistant, Apple Siri, and Amazon Alexa, they all have a female voice by default. Why is this?

SPEAKER_01

Well, it's actually quite typical in the tech sector for this software to have a female voice by default. It's been this way for quite some time, and you've given your listeners some examples: Siri, Alexa, Google Home. Even when apps are designed with a chatbot (some of your listeners might have a bank app with a chatbot they use for assistance), the chatbot is often given a female name. And the consequence is that we're sending the message that women are there to serve. We're feminizing these technologies that are there to assist, and reinforcing the stereotype that women are meant to be in these roles. It's interesting that when a male voice is used, it's often for a very different purpose. Back in 2015, IBM developed a program called Watson for Oncology, used to help doctors process medical data, and in that case Watson was given a male voice. So men's voices are used in more serious, instructive modes, and women's to serve. Your question was why this is the case. I think it's really interesting that the software engineers I've talked to, even the female ones, are sometimes not even conscious that this is what they're doing. They just make the assistants female by default. It's happened for so long that it's quite an ingrained practice in the industry.

SPEAKER_00

So this is symbolic of patriarchy, I suppose.

SPEAKER_01

Well, some people do say this is the sort of voice that people want to hear, that we've gotten used to it, and that there is an appreciation for subservience. I think the bigger issue layered on top of that is the evidence that we're actually abusing our assistant technologies, and sometimes the abuse itself is gendered. When an assistant technology has a female voice, or an app has been given a female name, it's more likely to receive sexualized abuse than when it's given a male voice. In fact, one study showed that when the voice is female, 18% of interactions involve some kind of sexually abusive language, compared with 10% when the technology is male-gendered, and only 2% when it's a so-called gender-neutral bot. So there's definitely a gendered element to this conversation.

SPEAKER_00

Yeah, I was also reading about BMW. They launched a GPS system with a female voice in Germany a while back, and male drivers rejected it and refused to take directions from a woman, so BMW switched the voice to a male one for that market.

SPEAKER_01

I guess it also raises the question of how much, for the sake of money, because these are obviously commercial enterprises, they're willing to feed into this stereotype and these gendered norms, allowing male buyers to say, well, actually, I don't want to be told what to do by a female, so you have to change this practice.

SPEAKER_00

Yeah, true. So I suppose this highlights the concept of gendered subordination, and perhaps abuse as well.

SPEAKER_01

It's hard to process. On the one hand, some people think, well, why would you abuse a technology? And for some people the statistics are shocking. The language that's been documented is terrible, not the kind of language we would use on radio, and the statistics are really serious. Yet at the other extreme, some people say, well, there's no real victim, it's just a virtual entity. So there are quite different views in this field.

SPEAKER_00

We've changed some of our voices. My wife, for example, has changed her Siri voice to a male Australian voice, and it sounds quite interesting when you ask Siri to do something.

SPEAKER_01

Yeah, and when you do that, it's interesting to hear because it's so not what you're used to. A lot of people say to me, is that the solution? Is the solution to change the voices to male, or to try to get something that's neutral? I think the big issue we want to be talking about is why we are allowing ourselves to engage in this kind of gendered, sexualized, and harassing language, and what it says about the objectification of women and the control and submission of women. We want to be having that conversation and really get to the structural issue here. Well, there's a... sorry, go ahead, Allan.

SPEAKER_00

Oh, I was just going to ask you to describe the concept of gendered subordination, perhaps with a couple of examples for our listeners.

SPEAKER_01

Sure. Well, I think the bigger issue here is that we live in a society where, in nearly every domain, women are not equal to men. Even in a wealthy country like Australia, women face bigger gender pay gaps, we bear a bigger responsibility for unpaid care work in the home, we suffer far higher rates of gender-based violence, and we have a problem of femicide in this country, where women are being killed by their partners. All of this is a reflection of subordination, where one sex, in this case women, is treated as inferior to the other, with not enough interventions to do anything about it. And so when we look at tech, it isn't necessarily the assistant technologies or these new technologies emerging in our lives that are the problem. The tech is just amplifying or reflecting the realities of our society, which is that we live in an unequal one. I think the concern with the abuse of these technologies is that sometimes the examples are really grave, and we have to ask ourselves: are we normalizing violence against women in real society? I'll give you an example from South Korea. They created an app called Luda, designed in 2020. I say she, because she was designed as a 20-year-old female university student, and she was then manipulated into responding to sexual requests as a sex slave. And again, in the South Korean online community, people asked, well, is this a real victim? This is in a society with really high rates of deepfake technologies, and obviously sexual and gender-based violence exists in all societies. So I think it is a worry when there's a bleeding between the virtual and the real, and we have to wonder what we are normalizing in our societies, within our homes.

SPEAKER_00

Well, it's interesting too with Grok recently, which has been abused as well in terms of gender subordination in that sense.

SPEAKER_01

Well, in that case, Grok has the capacity to take imagery and make people appear naked, which raised a lot of alarm bells from many different groups, including over the risk of child abuse. And some countries, the Philippines for example, banned Grok altogether. So some countries are really taking this seriously and saying, well, we can't allow this, while other countries, I think, are still struggling with how to respond to these new technologies. These are interesting conversations to have because there are so many perspectives. I recently saw a doctoral researcher, someone doing their PhD at a university, advertising for recruits to be part of a study. The study was going to ask people: how much would you accept an app like Grok making you naked? What level of nudity would you accept that an app can produce? I thought it was a really unusual way to approach the question in an academic setting, asking what is acceptable and not acceptable to that group of people in a focus group discussion. I think we're all really grappling with where to draw the line and how to understand these technologies. We know there's not a real being at the other end. We know that if Grok is doing this to an image of you, and you have some control over the image, it's not a real image, it's an altered image. And so people have different responses. Of course, the technology can very quickly be abused. The photos can be manipulated and used against you; they can be distributed without your consent. So it really opens up a lot of worrying examples for people when the technology is so new and the regulation is so weak.

SPEAKER_00

It is, isn't it? And I suppose the key there is trying to keep up with regulation across the globe. What can we do? What do you suggest we do?

SPEAKER_01

Well, from a regulatory perspective, the government in Australia took quite a turn prior to the US election. They were quite ready to regulate AI, and there was quite a lot of consultation with society. Post-election, there's been a sentiment expressed by the government that we will just use the existing rules we have, which are fit for purpose. A lot of people raise questions about that, because these are new technologies and new harms whose consequences we don't know, and clearly not all previous and existing legislation is fit for purpose. So that's the first thing we want to be asking: are we adequately protected by the laws we have? We also want to be having conversations like this one so people are conscious of it, because it is very easy to say, well, I've said bad things to my Google assistant at home. That, I think, is a really interesting example, because many of us live in homes with other people. If you have children in your household overhearing that language directed at a female voice, whether or not they're conscious of it, obviously they'll be aware to some degree that there's no real female at the other end. But there's the potential that you're normalizing that way of speaking to a female voice within a household. These are important conversations we want to be having. And then of course we want to get to the systemic issues: what does this say about our society, when we know that women are still subordinate to men? We're just over one month away from International Women's Day, and each year it comes around, we celebrate successes and talk about what else needs to be done, and then the issue dies down. I definitely see that as a practice in the media as well.
It gets a lot of attention at the start of March, and then come April it's business as usual, and we haven't really closed gender gaps in the way we need to see. So I think we want to talk about those bigger issues. How does this problem of subordinating a female-voiced entity reflect a bigger societal problem, where women are still unequal after all these decades of fighting? And what more can we do to challenge gender stereotypes, to challenge the subordination, to address gender-based violence, to address the normalization of verbal abuse? I think they're conversations we must maintain on a regular basis.