Ethical Considerations
Let’s dive into some of the biggest ethical concerns surrounding VA technology.
According to a 2018 study, users can be aware that AI assistants are not sentient and still implicitly respond to them as though they were human: people showed signs of stress when asked to switch off a robot with a computerized voice that was begging them not to. The researchers wrote that, “due to their social nature, people will rather make the mistake of treating something falsely as human than treating something falsely as non-human.” These effects are amplified when gender comes into play. Women are more often socialized to be loving, nurturing, and kind, while men are raised to be forceful, assertive, and competent. A 2021 meta-analysis of four studies found that female chatbots and robots are perceived as more sociable, less competent, and more human than their male counterparts. Unfortunately, tech companies have a vested interest in exploiting this phenomenon. Experiments have found that the personification of Alexa was associated with increased user satisfaction, even when the more human-like Alexa exhibited more technical defects. A sentiment analysis classified 91.1% of reviews that personified Alexa as “joyful.” The humanization of machines can keep users loyal to substandard products. It should come as no surprise that Amazon is currently working on building a more nuanced personality for Alexa and improving her conversational skills.
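As a rough illustration of the kind of automated labeling behind such figures (this is not the researchers’ actual pipeline, and the example reviews are invented), an off-the-shelf sentiment classifier can tag each review with a label and a confidence score:

```python
# Illustrative sketch only: a generic sentiment classifier from the
# Hugging Face transformers library, standing in for whatever emotion
# taxonomy the cited study actually used. The reviews are made up.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")

reviews = [
    "Alexa feels like part of the family - she always cheers me up.",
    "She misheard me twice today, but I still love talking to her.",
]

# Each result is a dict with a predicted label and a confidence score.
for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:>8}  {result['score']:.2f}  {review}")
```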
It might seem dystopian to imagine a world in which robots meet all our social needs, but virtual assistants might reach this point sooner than we expect. Sites like Invisible Girlfriend and games like LovePlus already simulate relationships online. Are virtual partners healthy, and could they impede our ability to form genuine human connections? Johanna Blakley of the Norman Lear Center points out that people have always used games and storytelling to simulate real life and “vicariously experience new and strange things in a relatively safe environment.” But what if these “new and strange things” are offensive, immoral, or illegal?

This is where the feminization of voice assistants becomes especially relevant. Over 5% of interactions in one GPS system for truckers are sexual in nature. A Cortana developer revealed that a sizeable portion of the first questions asked of Cortana concerned the VA’s sex life. And a Quartz experiment found that when reporters verbally harassed voice assistants, the bots most often either dismissed the comments or responded with flirtation, humor, or gratitude; they rarely expressed discomfort or attempted to educate the user. As a United Nations report explains, VAs have no power or purpose beyond serving their users, which means that they are designed to respond to requests regardless of tone and language (West, 2019). The report points out that “in many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.” Implicit sexism toward voice assistants may not stay confined to a safe environment; it may instead encourage sexist attitudes toward real women. There are already recorded differences in how women and men treat smart assistants: according to a Pew study, 62% of women versus 45% of men use terms like “please” and “thank you” when speaking to VAs. Writer and technologist Gideon Rosenblatt suggests that the lack of consequences for abusive conduct toward digital assistants could influence human relationships by normalizing toxic behavior. And it’s not just adults who use these devices; products like Echo and Google Home are intended to provide services for the entire household. This means that, by design, children may observe or participate in the mistreatment of female AI assistants, which might impair their social and cognitive development.
Each of these issues can and should be analyzed through an intersectional lens. Ph.D. student Taylor C. Moran notes that VAs “reflect characteristics of white femininity in voice and cultural configuration for the purposes of white supremacy and capitalistic gain.” This is evident when we consider major VAs like Siri, whose original voice was perceived as white by 75% of participants in one study. It is not an accident that popular culture casts white women like Alicia Vikander (Ex Machina) and Scarlett Johansson (Her) to portray VAs on screen. And even when voice assistants offer users a choice between male and female voices, these options reinforce the gender binary.