Chances are, if you’ve interacted with a chatbot, it’s had a female name or at the very least an avatar with female characteristics.
Why? Where along the decision-making process was it decided that digital assistants had to be female?
Can it be linked back to public transport announcements, where a female voice was used to provide helpful information and a male voice to deliver warnings? Think soothing mum and authoritarian dad stereotypes.
Research into the gender of voiceover announcements shows that if you want your announcement to be:
· soothing, 92% of people prefer a female voice
· forceful, 98% of those surveyed prefer a male voice.
Has this unconscious bias transferred over to voice assistants and chatbots?
With the rapid growth of voice assistants and our reliance on them for everyday tasks, we may have become immune to the fact that the default setting is a female voice. I have changed mine to an Irish male — what can I say, I love an Irish accent!
This default setting is usually left unchanged (changing the voice of the Google Assistant is not as easy as it is for Siri), and it reinforces the gender-specific cues the voice assistant is designed to convey.
Why is this bad?
If you’ve ever read chatbot or voice assistant transcripts, you quickly learn that people write or say some nauseating things to these assistants. I’m not talking about lots of swearing or telling the assistant how useless they are. I’m talking rape threats and sentence phrasing that leaves you truly baffled that these are functioning people walking among us.
And yes, there are people behind these assistants who observe your interactions so they can train these assistants to better help you. We read where the assistant is failing so we can deliver a better user experience. What I will never deliver is an assistant who responds to this kind of abuse.
Gender bias in conversational AI
The United Nations Educational, Scientific and Cultural Organization (UNESCO) delivered a report outlining the potential harmful effects gender bias in AI can have on society.
The report, I’d blush if I could: closing gender divides in digital skills through education, highlights the gender imbalance at the companies producing voice assistants. At the time of publication, two-thirds to three-quarters of their workforce were men.
According to the report, “companies like Amazon and Apple have cited academic work demonstrating that people prefer a female voice to a male voice”. So, are they just following marketing trends and giving consumers what they want? Perhaps — although not if you choose Arabic, British English, Dutch or French as your language: Siri is male by default in those languages.
Sexual harassment and verbal abuse
As in real life, feminised assistants get abused too. I’m not talking about the “you’re useless” or “f*^% off, you’re not helping”, I’m talking about people making sexual propositions and harassing them. According to the report, this can account for more than 5% of interactions.
After the release of the report many of the voice assistants changed their responses to sexual harassment from coy or flirtatious to firmer responses that remind users that they’re interacting with a non-human, or that show they don’t understand what is being said.
If you’re working on a chatbot project, push to have a genderless name and avatar — play around with names that can be unique to your company or department. The avatar is where you can show your company’s personality: will it be something fun and quirky, or more serious?
When you’re including chit chat and you need a response for “Do you want to go on a date?” or similar, don’t make it coy — use your company’s tone of voice and develop a response that puts the user firmly but politely back in their place. You can even make a joke out of the fact that the user is trying to ask a non-sentient being out.
If you’re interacting with a chatbot, remember there’s a human who reads the chatbot’s transcripts. Yes, chatbots can sometimes be frustrating — call them useless if you must — but please don’t type long-winded diatribes about all the terrible things you want to do to the assistant. It can’t be good for your mind to conjure up the images, and we don’t need to read it.
If you’re keen to learn more, I recommend The Smart Wife by Australian researchers Yolanda Strengers and Jenny Kennedy.