Why Alexa, and not Alex? Digital assistants enforce gender bias, says UN

New study finds that AI-powered assistants are usually female, normalising the idea that women should be 'subservient'

A woman displays Siri, Apple's voice-activated assistant, on an iPhone 4S in Taipei. AFP

It's always "hi Alexa", not "hello Alexander", and "hey Siri", not "hi Stuart".

This is the issue a UN agency has homed in on in a new study, which found that assigning female genders to virtual assistants such as Amazon's Alexa and Apple's Siri reinforces negative gender biases.

The report, produced by the United Nations Educational, Scientific and Cultural Organisation (Unesco), found that the use of female voices for AI-powered assistants perpetuates gender stereotypes.

“It sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the study says.

"The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”"

The report also noted the coquettish tone of some programmed responses, such as Siri's "I'd blush if I could", uttered in reply to a specific explicit statement.

"Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report says.

While Apple and Amazon allow users to opt for a male voice, the default voice is that of a woman.

Amazon's Alexa has a female voice as its default setting. AP

The report calls on technology companies to stop making voice assistants female by default, and urges them to develop a gender-neutral option.

Unesco also encouraged companies to discourage gender-based insults and abuse by programming assistants to respond appropriately, rather than flirtatiously.

Earlier this year, a group of linguists, technologists and sound designers revealed Q, a genderless digital voice.