Sources

I used the following sources to build this website. If you’re interested in this topic, I’d recommend all of them for further reading.

Abercrombie, G., Curry, A., Pandya, M., & Rieser, V. (2021). Alexa, Google, Siri: What are Your Pronouns? Gender and Anthropomorphism in the Design and Perception of Conversational Assistants. Retrieved 10 October 2022, from https://arxiv.org/abs/2106.02578

Borau, S., Otterbring, T., Laporte, S., & Fosso Wamba, S. (2021). The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI. Psychology & Marketing, 38, 1052–1068. https://doi.org/10.1002/mar.21480

Christian, B. (2013). The Samantha Test. Retrieved 18 October 2022, from https://www.newyorker.com/culture/culture-desk/the-samantha-test

Clark, L. (2013). Falling in love with AI virtual assistants: a creepy love affair nearer than you think. Retrieved 10 October 2022, from https://www.wired.co.uk/article/virtual-assistant-ai-love

Fessler, L. (2017). We tested bots like Siri and Alexa to see who would stand up to sexual harassment. Retrieved 18 October 2022, from https://qz.com/911681/we-tested-apples-siri-amazon-echos-alexa-microsofts-cortana-and-googles-google-home-to-see-which-personal-assistant-bots-stand-up-for-themselves-in-the-face-of-sexual-harassment/

Fisher, E. (2021). Gender Bias in AI: Why Voice Assistants Are Female. Retrieved 10 October 2022, from https://www.adaptworldwide.com/insights/2021/gender-bias-in-ai-why-voice-assistants-are-female

Gao, Y., Pan, Z., Wang, H., & Chen, G. (2018). Alexa, My Love: Analyzing Reviews of Amazon Echo. 372–380. https://doi.org/10.1109/SmartWorld.2018.00094

Griggs, B. (2011). Why computer voices are mostly female. CNN Business. Retrieved 18 October 2022, from https://www.cnn.com/2011/10/21/tech/innovation/female-computer-voices

Heisler, Y. (2022). Steve Jobs wasn’t a fan of the Siri name. Retrieved 18 October 2022, from https://www.networkworld.com/article/2221246/steve-jobs-wasn-t-a-fan-of-the-siri-name.html

Hempel, J. (2015). Siri and Cortana Sound Like Ladies Because of Sexism. Retrieved 10 October 2022, from https://www.wired.com/2015/10/why-siri-cortana-voice-interfaces-sound-female-sexism/

Hill, A. (2022). Voice assistants could ‘hinder children’s social and cognitive development’. Retrieved 18 October 2022, from https://www.theguardian.com/technology/2022/sep/28/voice-assistants-could-hinder-childrens-social-and-cognitive-development

Horstmann, A. C., Bock, N., Linhuber, E., Szczuka, J. M., Straßmann, C., & Krämer, N. C. (2018). Do a robot’s social skills and its objection discourage interactants from switching the robot off? PLoS ONE, 13(7), e0201581. https://doi.org/10.1371/journal.pone.0201581

Moran, T. (2021). Racial technological bias and the white, feminine voice of AI VAs. Communication and Critical/Cultural Studies, 18(1), 19–36. https://doi.org/10.1080/14791420.2020.1820059

Nickelsburg, M. (2016). Why is AI female? How our ideas about sex and service influence the personalities we give machines. Retrieved 10 October 2022, from https://www.geekwire.com/2016/why-is-ai-female-how-our-ideas-about-sex-and-service-influence-the-personalities-we-give-machines/

Pawlowski, A. (2019). Women are more polite to Alexa and Siri than men — is that a good thing? Retrieved 18 October 2022, from https://www.today.com/health/should-you-be-polite-your-smart-speakers-why-many-people-t169365

Peckham, M. (2014). Microsoft’s Cortana Raises Important Questions About Sexism and Gender Stereotyping. Retrieved 18 October 2022, from https://time.com/48123/microsofts-cortana-raises-important-questions-about-sexism-and-gender-stereotyping/

Robison, M. (2020). Voice assistants have a gender bias problem. What can we do about it? Retrieved 10 October 2022, from https://www.brookings.edu/blog/techtank/2020/12/09/voice-assistants-have-a-gender-bias-problem-what-can-we-do-about-it/

Robison, M., & Chin, C. (2020). How AI bots and voice assistants reinforce gender bias. Retrieved 10 October 2022, from https://www.brookings.edu/research/how-ai-bots-and-voice-assistants-reinforce-gender-bias/

Rosenblatt, G. (2019). Why Not to Abuse Digital Assistants. Retrieved 10 October 2022, from https://www.the-vital-edge.com/digital-assistants-abuse/

Samuel, S. (2019). Alexa, are you making me sexist? Retrieved 18 October 2022, from https://www.vox.com/future-perfect/2019/6/12/18660353/siri-alexa-sexism-voice-assistants-un-study

University of Sheffield. (2005). Male and female voices affect brain differently. Retrieved 10 October 2022, from https://www.sheffield.ac.uk/news/nr/422-1.174743

Soper, T. (2015). Why people in China love Microsoft’s Xiaoice virtual companion, and what it says about artificial intelligence. Retrieved 10 October 2022, from https://www.geekwire.com/2015/people-china-love-microsofts-xiaoice-virtual-companion-says-artificial-intelligence/

Spencer, G. (2018). Much more than a chatbot: China’s Xiaoice mixes AI with emotions and wins over millions of fans. Microsoft Stories Asia. Retrieved 10 October 2022, from https://news.microsoft.com/apac/features/much-more-than-a-chatbot-chinas-xiaoice-mixes-ai-with-emotions-and-wins-over-millions-of-fans/

Suarez, B. (2021). The Ethics of Digital Voice Assistants. Viterbi Conversations in Ethics. Retrieved 10 October 2022, from https://vce.usc.edu/volume-5-issue-1/the-ethics-of-digital-voice-assistants/

Zhang, W. (2020). The AI Girlfriend Seducing China’s Lonely Men. Retrieved 18 October 2022, from https://www.sixthtone.com/news/1006531/the-ai-girlfriend-seducing-chinas-lonely-men

Watkins, H. (2021). User Perceptions and Stereotypic Responses to Gender and Age of Voice Assistants. All Theses, 3652. https://tigerprints.clemson.edu/all_theses/3652

West, M., et al. (2019). I’d Blush If I Could: Closing Gender Divides in Digital Skills through Education. UNESCO.

West, S. M., Whittaker, M., & Crawford, K. (2019). Discriminating Systems: Gender, Race and Power in AI. AI Now Institute. Retrieved from https://ainowinstitute.org/discriminatingsystems.html

Xu, S. (2021). Microsoft Chatbot Spinoff Xiaoice Reaches $1 Billion Valuation. Bloomberg. Retrieved 18 October 2022, from https://www.bloomberg.com/news/articles/2021-07-14/microsoft-chatbot-spinoff-xiaoice-reaches-1-billion-valuation

Zhou, L., et al. (2020). The Design and Implementation of XiaoIce, an Empathetic Social Chatbot. Computational Linguistics, 46(1), 53–93. https://doi.org/10.1162/coli_a_00368