Why Do Virtual Assistants Like Siri and Alexa Traditionally Have Female Voices?

Virtual assistants often have female voices. Why? | grinvalds/iStock via Getty Images

With the notable exception of HAL 9000 in Stanley Kubrick’s seminal 2001: A Space Odyssey (1968), most of the voices we associate with faceless artificial intelligence are female. Products like Amazon’s Alexa and Apple’s Siri relate information and field requests using a woman’s voice. Automated telephone directories don’t typically guide us along in deep baritones. It’s clearly not by chance. So why have companies chosen to treat AI as feminine?

In an opinion piece for PCMag, author Chandra Steele examined the idea that gender stereotypes play a role. An Amazon spokesperson told Steele that a female voice was what Alexa's test users responded to most strongly. For Cortana, Microsoft said it found a female voice best embodied the qualities expected of a digital assistant: helpful, supportive, and trustworthy. It's been theorized that both men and women generally warm to a female voice, while reactions to a male voice are more divided.

But the real reason may not be a product of contemporary biases or perceptions of women in administrative roles. Instead, it could stem from an older, more practical constraint. When designing Google Assistant, Google wanted to offer both male and female voice options, then realized how technologically difficult that would be. Why? Text-to-speech systems had been trained mostly on female voices.

There are other possibilities. Some researchers believe women tend to articulate vowel sounds more clearly, or that the pitch of a woman's voice is simply easier to hear. Though these arguments are tenuous, they may have influenced the decision to use women's voices in early development efforts.

There are, of course, exceptions. Many devices, including Siri, offer the option to switch between male and female voices. But it may take more than a toggle in the settings to overcome gender biases.