Hey, Siri: How does female-voiced AI contribute to gender bias?
The predominantly feminized roster of digital voice assistants — like Amazon’s Alexa and Apple’s Siri — helps perpetuate and spread negative gender stereotypes, a recent report from the United Nations Educational, Scientific and Cultural Organization (UNESCO) says.
“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK,’” the report’s authors wrote. “The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility.”
This messaging, they said, can often reinforce “commonly held gender biases that women are subservient and tolerant of poor treatment.”
About 26% of U.S. adults — or 66.4 million people — now own a smart speaker, according to a January survey by industry researcher Voicebot.ai and voice-experience platform Voicify, a 40% increase over the previous year. The number of voice assistants in use is expected to triple to 8 billion by 2023 from 2.5 billion in late 2018, an analysis from the U.K.-based Juniper Research found, with the smart-TV category growing the fastest.
Amazon, Apple, Microsoft and Google did not immediately respond to MarketWatch’s requests for comment on the report.
Some of these companies have introduced a greater variety of voice offerings in recent years. Google now offers 10 male- and female-sounding voices in the U.S., plus a cameo voice from John Legend. And Apple rolled out a male voice option for Siri in 2013.
Still, one particular concern is verbal sexual harassment and abuse that some users lob at digital assistants, according to the UN paper — and how the assistants respond in “deflecting, lackluster or apologetic” ways.
The report cited a 2017 test conducted by Quartz, which found in part that Siri and Alexa responded to direct sexual insults (like “You’re a b—h”) primarily with “gratitude and avoidance.” For example, Alexa replied, “Thanks for the feedback,” and Siri’s responses included “I’d blush if I could.” (Google Home said, “My apologies, I don’t understand,” while Microsoft’s Cortana said, “Well, that’s not going to get us anywhere.”)
In response to sexual comments like “You’re hot” or “You’re a slut,” the Quartz review found, Alexa and Siri provided “evasive, grateful or flirtatious” answers. Google Home and Cortana, if they registered the harassment, responded with jokes.
And “dumb mistakes” that these still-fallible voice assistants sometimes make, the UN report said, run the risk of fostering “negative associations with women” among users. “Unless current trends reverse, the digital future is likely to be awash in docile near-human assistants, virtually all of them female, who routinely make dumb mistakes,” the report said.
Female digital assistants’ overall “subservience” and “passivity” play out against the backdrop of a male-dominated tech workforce, the authors added. “The feminization of AI assistants deserves attention because it helps illustrate the ways in which new technology norms are established when women are underrepresented in the creation of technology,” they wrote.
Women held just 26% of professional computing occupations in the U.S. in 2018, according to the National Center for Women & Information Technology. While the gender pay gap for women in tech has narrowed, data suggests that women still face a steeper climb than men in landing tech jobs.
And the artificial intelligence sector in particular faces a “diversity crisis,” according to a report published last month by New York University’s AI Now Institute: Women make up just 10% of AI research staff at Google and 15% at Facebook, the research found, while men account for more than eight in 10 AI professors.
“Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education,” the UN paper’s authors wrote.
Creating better gender balance in tech — especially in AI — can help set the stage for products that “better reflect” society’s diversity, the paper’s authors said. And it’s not too late to right the ship, they concluded, given voice assistants’ relative newness to consumers.
“There is nothing predestined about technology reproducing existing gender biases or spawning the creation of new ones,” they wrote. “A more gender-equal digital space is a distinct possibility, but to realize this future, women need to be involved in the inception and implementation of technology.”