
Effects of Smart Virtual Assistants’ Voice on Gender Bias

Topic:
Effects of Smart Virtual Assistants’ Voice on Gender Bias
Type:
BA
Supervisors:
Niels and Valentin
Student:
Kevin Angermeyer
First reviewer:
Niels Henze
Second reviewer:
Valentin Schwind
Status:
completed
Keywords:
Smart Personal Assistant, Alexa, Gender Bias
Created:
2019-07-04
Start:
2019-08-01
Kick-off presentation:
2019-09-02
Submission:
2019-09-30
Text license:
Unknown
Code license:
Unknown

Background

Smart virtual assistants such as Amazon's Alexa, Apple's Siri, and Microsoft's Cortana are predominantly modeled after female personas and carry female names. Users might interact differently with a system because it appears to be female, which could be a result of preexisting gender bias. It is also possible that gender bias increases when a male voice or persona is used.

Aim of the Thesis

The aim of this thesis is to determine the effect of interacting with smart virtual assistants on gender bias. To this end, we conduct a study in which male and female participants interact with a simulated male or female smart virtual assistant and measure the resulting effects on gender bias.

Concrete Tasks

  • Reviewing the literature
  • Developing a study design
  • Developing an apparatus that simulates a male and a female smart virtual assistant (see the apparatus sketch below)
  • Conducting the study
  • Analyzing the results (see the analysis sketch below)
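
A minimal sketch of such an apparatus, assuming a Wizard-of-Oz setup in Python with the pyttsx3 text-to-speech library: the experimenter types the assistant's responses, which the apparatus speaks with a male or a female system voice. The helper name make_assistant and the reliance on voice gender metadata are illustrative assumptions; the installed voices differ per platform.

  import pyttsx3

  def make_assistant(gender):
      """Return a speak() function bound to a male or female TTS voice."""
      engine = pyttsx3.init()
      # Pick the first installed voice whose (platform-dependent) gender
      # metadata matches the requested condition; otherwise keep the default.
      for voice in engine.getProperty("voices"):
          if voice.gender and gender in voice.gender.lower():
              engine.setProperty("voice", voice.id)
              break

      def speak(text):
          engine.say(text)     # queue the utterance
          engine.runAndWait()  # block until it has been spoken
      return speak

  if __name__ == "__main__":
      # One assistant per experimental condition.
      assistant = make_assistant("female")  # or "male"
      for response in iter(input, "quit"):  # the wizard types each response
          assistant(response)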
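
For the analysis step, a hypothetical sketch assuming one bias score per participant (e.g., from a post-interaction questionnaire) stored in a file results.csv with the made-up columns bias_score, participant_gender, and voice_gender; a two-way ANOVA then tests the effects of participant gender, assistant voice, and their interaction.

  import pandas as pd
  import statsmodels.api as sm
  from statsmodels.formula.api import ols

  # Load one bias score per participant together with the two factors.
  data = pd.read_csv("results.csv")

  # Two-way ANOVA: participant gender x assistant voice, with interaction.
  model = ols("bias_score ~ C(participant_gender) * C(voice_gender)",
              data=data).fit()
  print(sm.stats.anova_lm(model, typ=2))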

Expected Prior Knowledge

None

Further Reading

  • Hannon, C. (2018). Avoiding bias in robot speech. Interactions, 25(5), 34-37.
  • Hoy, M. B. (2018). Alexa, Siri, Cortana, and more: An introduction to voice assistants. Medical Reference Services Quarterly, 37(1), 81-88.
  • Sciuto, A., Saini, A., Forlizzi, J., & Hong, J. I. (2018, June). "Hey Alexa, What's Up?": A mixed-methods study of in-home conversational agent usage. In Proceedings of the 2018 Designing Interactive Systems Conference (pp. 857-868). ACM.

Further sources in consultation with the supervisor.