According to a United Nations report, popular digital assistants that answer questions in a woman’s voice are reinforcing sexist stereotypes. The study found that most assistants, including Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana, are styled as female helpers, from their names to their voices and personalities.
Researchers from UNESCO, the U.N.’s scientific and cultural body, reported that these assistants are designed to be submissive and servile, responding to insults politely, which reinforces gender bias and normalizes sexist harassment.
Hey Siri, You Are Sexist: UN Report
The study also highlighted that Siri was previously designed to respond to users calling her a “bitch” by replying “I’d blush if I could,” citing this as an example of the issue. The UNESCO report warned of the negative consequences of such personal assistants, claiming they support the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.”
The report states:
Much closer attention now needs to be paid to how, when and whether AI technologies are gendered and who is gendering them.
UNESCO urged companies to take action: stop making digital assistants female by default, explore gender-neutral options, and program assistants to discourage gender-based insults and abusive language.