Long-Term Effects of Using Intelligent Personal Assistants on Gender-Based Prejudices

Valentin Schwind, Niels Henze
Paul Ballack
Keywords: IPA, speech, Alexa, gender


Current intelligent personal assistants (IPAs), including Alexa and Siri, not only carry female names but are likely also designed around female personas. Interacting with a female-voiced IPA could reinforce users' gender prejudices. While a growing body of work has investigated how users interact with IPAs and how to improve that interaction, the effect of long-term IPA use remains unclear.

Aim of the Thesis

The aim of this thesis is to investigate whether interacting with IPAs affects users' gender prejudices. To reveal such an effect, we will conduct a between-subjects study using Amazon's Alexa, the most widely used commercial IPA: a number of Echo Dots will be deployed in participants' homes, and these participants will be compared with a control group without an Echo Dot.
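The planned analysis boils down to comparing a prejudice measure between the two groups. As a minimal sketch, assuming each participant yields a single implicit-bias score (e.g., an IAT D-score, cf. Greenwald et al., 1998), the group comparison could use Welch's t statistic; all scores below are made-up placeholder values, not study data:

```python
# Sketch of the between-subjects comparison: Echo Dot group vs. control.
# Assumes one bias score per participant (e.g., an IAT D-score).
from statistics import mean, stdev
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    (does not assume equal group variances)."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2  # sample variances
    return (mean(a) - mean(b)) / sqrt(va / na + vb / nb)

# Hypothetical scores for illustration only.
echo_group = [0.42, 0.55, 0.38, 0.61, 0.47]
control    = [0.31, 0.29, 0.44, 0.35, 0.40]

print(welch_t(echo_group, control))
```

In practice the analysis would use a statistics package (e.g., `scipy.stats.ttest_ind` with `equal_var=False`) to also obtain degrees of freedom and a p-value; the sketch only illustrates the group-comparison logic.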

Specific Tasks

  • Further developing the study design
  • Implementing and conducting an experiment
  • Analysis of the results

Expected Prior Knowledge


Further Reading

  • Greenwald, A. G., McGhee, D. E., & Schwartz, J. L. (1998). Measuring individual differences in implicit cognition: The Implicit Association Test. Journal of Personality and Social Psychology, 74(6), 1464-1480.
  • Fiske, S. T., Cuddy, A. J., Glick, P., & Xu, J. (2002). A model of (often mixed) stereotype content: Competence and warmth respectively follow from perceived status and competition. Journal of Personality and Social Psychology, 82(6), 878-902.
  • Swim, J. K., Aikin, K. J., Hall, W. S., & Hunter, B. A. (1995). Sexism and racism: Old-fashioned and modern prejudices. Journal of Personality and Social Psychology, 68, 199-214.
  • Salmanowitz, N. (2018). The impact of virtual reality on implicit racial bias and mock legal decisions. Journal of Law and the Biosciences, 5(1), 174-203.
  • Lopez, S., Yang, Y., Beltran, K., Kim, S. J., Cruz Hernandez, J., Simran, C., … & Yuksel, B. F. (2019, April). Investigating implicit gender bias and embodiment of white males in virtual reality with full body visuomotor synchrony. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (p. 557). ACM.
  • Oswald, F. L., Mitchell, G., Blanton, H., Jaccard, J., & Tetlock, P. E. (2013). Predicting ethnic and racial discrimination: A meta-analysis of IAT criterion studies. Journal of Personality and Social Psychology, 105(2), 171-192.
  • Schwind, V., Deierlein, N., Poguntke, R., & Henze, N. (2019). Understanding the social acceptability of mobile devices using the Stereotype Content Model. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19). ACM.
  • Danescu-Niculescu-Mizil, C., Sudhof, M., Jurafsky, D., Leskovec, J., & Potts, C. (2013). A computational approach to politeness with application to social factors. In Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (pp. 250-259).