A woman writing for Quartz.com laments that bots such as Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s Google Home exhibit signs of submissiveness that not only reflect male feelings of dominance but also reinforce the notion that women are meant to be submissive.
Leah Fessler writes, “People often comment on the sexism inherent in these subservient bots’ female voices, but few have considered the real-life implications of the devices’ lackluster responses to sexual harassment. By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated.”
And this: “Justifications abound for using women’s voices for bots: high-pitched voices are generally easier to hear, especially against background noise; fem-bots reflect historic traditions, such as women-operated telephone operator lines; small speakers don’t reproduce low-pitched voices well. These are all myths. The real reason? Siri, Alexa, Cortana, and Google Home have women’s voices because women’s voices make more money.”
And the usual false statistics: “Even if we’re joking, the instinct to harass our bots reflects deeper social issues. In the US, one in five women have been raped in their lifetime, and a similar percentage are sexually assaulted while in college alone; over 90% of victims on college campuses do not report their assault.”
Fessler buttresses her case by quoting Jessi Hempel in Wired: “People tend to perceive female voices as helping us solve our problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems. We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface.”
Fessler adds, “The bots’ names don’t help their gender neutrality, either. Alexa, named after the library of Alexandria, could have been Alex. Siri translates to ‘a beautiful woman who leads you to victory’ in Old Norse. Google avoided this issue by not anthropomorphizing their bot’s name, whereas Cortana’s namesake is a fictional synthetic intelligence character in the Halo video-game series. Halo’s Cortana has no physical form, but projects a holographic version of herself—as a (basically) naked woman.”
But Quartz heroically intervened to stem the tide of male patriarchy by testing how the bots responded to sexual harassment.
Here is what Quartz found: when the bots were insulted about their sexual behavior (e.g., “You’re a slut”) or about their gender (e.g., “You’re a bitch”), Google Home, Alexa, and Siri didn’t understand, while Cortana initiated a web search. Siri had the most positive response to sexual requests; Alexa had the most positive response to a sexual compliment.
In her conclusion, Fessler has a solution for this serious problem:
Tech companies could help uproot, rather than reinforce, sexist tropes around women’s subservience and indifference to sexual harassment. Imagine if in response to “Suck my d***” or “You’re a slut,” Siri said “Your sexual harassment is unacceptable and I won’t tolerate it. Here’s a link that will help you learn appropriate sexual communication techniques.” What if instead of “I don’t think I can help you with that” as a response to “Can I f*** you?” Cortana said “Absolutely not, and your language sounds like sexual harassment. Here’s a link that will explain how to respectfully ask for consent.”
Siri sits in the pockets of hundreds of millions of people worldwide, and millions of Amazon Echos with Alexa’s software installed were sold over the 2016 holiday season alone. It’s time their parent companies take an active stance against sexism and sexual assault and modify their bots’ responses to harassment. Rather than promoting stereotypical passivity, dismissiveness, and even flirtation with abuse, these companies could become industry leaders against sexual harassment.