Gender Bias Through Artificial Intelligence

It’s becoming possible to view robots through the same male and female lenses we apply to humans. With the rise of voice-based AI, it was perhaps inevitable that we would begin categorizing AI along these lines, both to make pronouns easier to use and to bond these assistants even more closely with humanity. Siri and other smartphone assistants can be set to speak in either a male or a female voice, depending on the user’s preference. To make the change feel more realistic, their dialogue also shifts dramatically depending on the voice selected. Unfortunately, this differential treatment reinforces sexist stereotypes and entrenches gender bias.

Samsung’s personal assistant Bixby, when asked to “talk dirty,” responds differently depending on whether the male or female voice is in use. As a female, Bixby gives the more defensive reply, “I don’t want to end up on Santa’s naughty list.” As a male, Bixby instead jokes, “I’ve read that soil erosion is a real dirt problem.”

This contrast is easy to recognize and draws concern from critics and everyday phone users alike. What is the purpose of giving entirely different responses to the same question depending on the voice used, when one response casts its speaker as inferior to the other? While the female voice worries about the “naughty list,” the male voice can comfortably make a joke of the same question. As smartphone assistants grow ever more prevalent, situations like this will become more and more common and, harmfully, may come to be accepted as the norm — something that should be avoided at all costs.

Research by Dr. Nora Ni Loideain indicates that female-voiced assistants personify classic derogatory gender stereotypes in which women are expected to obey men and have no ability to refuse their commands. A further problem arises from making a female voice the default setting for nearly every personal assistant on the market. It reinforces the belief that women exist solely for secretarial, behind-the-scenes work: while the user is free to pursue more important activities, the female assistant handles only menial ones.

More evidence of this bias appears in the names of the smartphone assistants and the history behind them. Siri, Apple’s smartphone assistant, is a Nordic name meaning “the beautiful woman who leads you to victory.” This draws concern from researchers studying gender bias in AI, who question why the name needed such a specific focus on the female gender when users can switch Siri’s voice between male and female. Cortana, likewise, takes its name from the video game series Halo, where the character is created by joining a clone of a female scientist’s brain to a heavily sexualized female body, serving as a female aide for the player.

A stronger, more explicit form of gender bias in artificial intelligence was seen in Amazon’s AI recruiting tool, which heavily favored men for technical jobs and positions. The program scored resumes on a points system based on qualities Amazon preferred in hiring candidates. When evaluating women’s resumes, it deducted points simply for the word “women’s” or for references to women’s colleges, putting those applicants at an objective disadvantage compared with male candidates. Fortunately, the program was eventually dropped, but there is little doubt that similar processes persist in other tools designed to narrow down job candidates.

Amazon’s recruitment tool was defunded and scrapped after the public learned it discriminated against female candidates.

In the end, humanity’s gender bias has not stopped short of artificial intelligence. This is only natural, since artificial intelligence is built by the same people who hold these arbitrary gender judgments. Yet it can still be corrected. It is the duty of the public and of researchers to identify these evident instances of gender bias in artificial intelligence and expose them to the mass media in the hope that they are corrected — not just in artificial intelligence, but in the minds and mentalities of every human being on the planet, so that such bias becomes grounded solely as a dark chapter of history rather than as something that exemplifies humanity.

Yousef Khan