Voice Assistants Siri and Alexa Reinforce Gender Bias

Amazon’s Alexa, Apple’s Siri, and Microsoft’s Cortana are among the most popular voice assistant technologies available on the market. A problem with these machines, however, is that they’re reinforcing gender stereotypes and biases, according to a United Nations report.

The report indicates that voice assistant technologies are inherently misogynistic, as they are designed with female personas and are often given female voices by default. The way these assistants are used mimics a compliant relationship between the machine and its owner, and giving the machine a female voice reinforces the belief that women are supposed to be subservient and available on command.

Amazon’s Alexa has had some concerning responses to sexist statements and questions raised by users. When asked, “Alexa, are you happy?” it’s reported that Alexa’s automatic reply is: “I’m happy when I’m helping you.” Similarly, when told, “You’re hot!” Alexa is said to reply with: “That’s nice of you to say!”

Apple’s Siri plays less into the willingly submissive role, but its passive responses nonetheless demonstrate an acceptance of sexist language. When called a b***h by a user, Siri would reply, “I’d blush if I could.”

This issue stems from a broader problem within the technology field, which is typically regarded as a male-dominated discipline; many of today’s major tech companies have a heavily disproportionate male-to-female employee ratio. New opportunities continue to appear for women in tech, but a recent study shows that women are being pushed out by negative workplace experiences: they were less likely to receive adequate training and support from managers and colleagues, and they reported more instances of condescending behavior.

There is a gender gap in the global AI workforce. AI technology is growing rapidly, but those who research and develop it are mostly men. This lack of diversity among the creators of these technologies results in machines that cater to one type of person rather than being genuinely useful to everyone.

Machines are meant to be neutral. They rarely come across that way because they are created by developers who may unintentionally let their personal biases influence their work, leaving users with devices that express or reinforce gender prejudice.

