(WIP) With or Without: Comparing User-Elicited Mid-Air Gestures with and without Objects in Hand

Topic:
Comparing User-Elicited Mid-Air Gestures with and without Objects in Hand
Type:
MA
Supervisor:
David Halbhuber
Student:
Yu Liu
First Examiner:
Niels Henze
Status:
in progress
Keywords:
Hand gestures, AR, elicitation study
Created:
2024-01-30
Initial presentation:
2024-05-06

Background

Hand gestures are widely regarded as an intuitive means of interaction and are increasingly being researched for a wide range of applications (Vuletic et al., 2019; Gentile et al., 2019; Modanwal and Sarawadekar, 2018; Tscharn et al., 2016). Numerous studies have explored interaction through free-hand gestures (Buchmann et al., 2004; Rehman et al., 2020; Le and Kim, 2017; Bai et al., 2014; Shi et al., 2023; Boudjelthia et al., 2018; Yu et al., 2022). However, the gestures users propose spontaneously are not necessarily the ones they prefer (Morris et al., 2014). Hoffmann et al. (2019) conducted an elicitation study in which participants proposed commands for controlling a smart home via voice, mid-air gestures, and a touch display. When participants rated the three modalities, mid-air gestures received mostly negative comments: participants found them counterintuitive and complex to use, and criticized the memorization effort and the long periods of operation they require. The tendency of users' gesture proposals to be shaped by their experience with earlier interfaces and technologies is known as legacy bias (Morris et al., 2014). However, legacy bias is not inherently negative (Köpsel & Bubalo, 2015), and appropriate feedback can improve the user experience (Schönauer et al., 2015).

Lindeman et al. (1999) compared four interaction techniques in immersive virtual environments, two with passive haptic feedback and two without. In the passive-haptic conditions, subjects held a physical paddle in their non-dominant hand, while in the other conditions they held only the paddle's handle. With their dominant hand, subjects performed docking and selection tasks. The results indicate that holding a physical paddle in virtual reality helps subjects perform faster and more precisely.

As technology advances, gestures, as a natural way of interacting, are increasingly well supported (Tscharn et al., 2016). Gestures are also used in Mixed Reality to interact freely with virtual elements (Buchmann et al., 2004; Rehman et al., 2020; Le and Kim, 2017; Bai et al., 2014; Shi et al., 2023; Yu et al., 2022). AR can be used to train new employees in industry (Eder et al., 2020; Barsom et al., 2016), and it is not far-fetched to imagine its integration into everyday life, for example through devices such as the Vision Pro. Regardless of which system is used for gesture interaction, however, users' hands may not always be empty in actual use. Our research question is whether holding an object in the hand affects the gestures that users propose.

Objective of the Thesis

The focus of this work is on hand gestures performed while holding objects: it examines which gestures users suggest for interacting with a system while a tool is in their hand. In addition, the work investigates how holding a tool affects users' behaviour.

Concrete Tasks

  • literature review
  • study design
  • implementation of the test scenario
  • user study
  • analysis of the study results (e.g., via agreement scores, as sketched below)
  • paper writing
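One common way to carry out the analysis step is the agreement score used in gesture elicitation studies in the style of Wobbrock et al.; whether this thesis adopts it is an open design decision, so the following is only a sketch. For a referent r (a command to be invoked), let P be the set of all gestures proposed for r and let the P_i be the groups of identical proposals within P. The agreement score is then

    A_r = \sum_{P_i \subseteq P} \left( \frac{|P_i|}{|P|} \right)^2

A_r ranges from 1/|P| (all proposals differ) to 1 (all participants propose the same gesture). Comparing A_r per referent between the empty-hand and object-in-hand conditions would directly address the research question.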

Expected Prior Knowledge

Basic knowledge of Unity development
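As an illustration of the Unity skills involved, the following is a minimal, hypothetical C# sketch of one building block of the test scenario: a script that presents the elicitation referents (the commands participants propose gestures for) one after another and logs a timestamp for each, so that proposals can later be matched to video recordings. The class name, referent list, and key binding are assumptions for illustration, not part of the planned implementation.

    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical sketch: steps through elicitation referents and logs
    // a timestamp per referent for later synchronization with recordings.
    public class ReferentPresenter : MonoBehaviour
    {
        // Example referents; the real set depends on the study design.
        private readonly List<string> referents = new List<string>
        {
            "Select object", "Rotate object", "Zoom in", "Zoom out", "Confirm"
        };

        private int index = -1;

        void Update()
        {
            // The experimenter advances to the next referent with the space key.
            if (Input.GetKeyDown(KeyCode.Space) && index < referents.Count - 1)
            {
                index++;
                Debug.Log($"{Time.time:F2}s referent shown: {referents[index]}");
                // A UI text element would display referents[index] to the participant.
            }
        }
    }

Attached to any GameObject in the scene, such a script already yields a log of when each referent was shown, which is enough to segment the recorded gesture proposals.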

Further Reading

  • Schönauer, C., Mossel, A., Zaiți, I. A., & Vatavu, R. D. (2015). Touch, movement and vibration: user perception of vibrotactile feedback for touch and mid-air gestures. In Human-Computer Interaction–INTERACT 2015: 15th IFIP TC 13 International Conference, Bamberg, Germany, September 14-18, 2015, Proceedings, Part IV 15 (pp. 165-172). Springer International Publishing.
  • Vuletic, T., Duffy, A., Hay, L., McTeague, C., Campbell, G., & Grealy, M. (2019). Systematic literature review of hand gestures used in human computer interaction interfaces. International Journal of Human-Computer Studies, 129, 74-94.
  • Gentile, V., Fundarò, D., & Sorce, S. (2019, June). Elicitation and evaluation of zoom gestures for touchless interaction with desktop displays. In Proceedings of the 8th ACM International Symposium on Pervasive Displays (pp. 1-7).
  • Hoffmann, F., Tyroller, M. I., Wende, F., & Henze, N. (2019, November). User-defined interaction for smart homes: voice, touch, or mid-air gestures? In Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia (pp. 1-7).
  • Kopinski, T., Eberwein, J., Geisler, S., & Handmann, U. (2016, November). Touch versus mid-air gesture interfaces in road scenarios: measuring driver performance degradation. In 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC) (pp. 661-666). IEEE.
  • Lindeman, R. W., Sibert, J. L., & Hahn, J. K. (1999, May). Towards usable VR: an empirical study of user interfaces for immersive virtual environments. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 64-71).
  • Köpsel, A., & Bubalo, N. (2015). Benefiting from legacy bias. interactions, 22(5), 44-47.
  • Morris, M. R., Danielescu, A., Drucker, S., Fisher, D., Lee, B., Schraefel, M. C., & Wobbrock, J. O. (2014). Reducing legacy bias in gesture elicitation studies. interactions, 21(3), 40-45.
  • Buchmann, V., Violich, S., Billinghurst, M., & Cockburn, A. (2004, June). FingARtips: gesture based direct manipulation in Augmented Reality. In Proceedings of the 2nd international conference on Computer graphics and interactive techniques in Australasia and South East Asia (pp. 212-221).
  • Rehman, I. U., Ullah, S., Khan, D., Khalid, S., Alam, A., Jabeen, G., … & Khan, S. (2020). Fingertip gestures recognition using leap motion and camera for interaction with virtual environment. Electronics, 9(12), 1986.
  • Le, H. Q., & Kim, J. I. (2017, February). An augmented reality application with hand gestures for learning 3D geometry. In 2017 IEEE International Conference on Big Data and Smart Computing (BigComp) (pp. 34-41). IEEE.
  • Bai, H., Lee, G. A., Ramakrishnan, M., & Billinghurst, M. (2014). 3D gesture interaction for handheld augmented reality. In SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications (pp. 1-6).
  • Shi, R., Zhang, J., Yue, Y., Yu, L., & Liang, H. N. (2023, April). Exploration of Bare-Hand Mid-Air Pointing Selection Techniques for Dense Virtual Reality Environments. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-7).
  • Boudjelthia, A., Nasim, S., Eskola, J., Adeegbe, J. M., Hourula, O., Klakegg, S., & Ferreira, D. (2018, October). Enabling mid-air browser interaction with leap motion. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers (pp. 335-338).
  • Yu, D., Zhou, Q., Dingler, T., Velloso, E., & Goncalves, J. (2022, October). Blending On-Body and Mid-Air Interaction in Virtual Reality. In 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 637-646). IEEE.
  • Modanwal, G., & Sarawadekar, K. (2018). A gesture elicitation study with visually impaired users. In HCI International 2018–Posters' Extended Abstracts: 20th International Conference, HCI International 2018, Las Vegas, NV, USA, July 15-20, 2018, Proceedings, Part II 20 (pp. 54-61). Springer International Publishing.
  • Tscharn, R., Schaper, P., Sauerstein, J., Steinke, S., Stiersdorfer, S., Scheller, C., & Huynh, H. T. (2016). User Experience of 3D Map Navigation: Bare-Hand Interaction or Touchable Device? Mensch und Computer 2016 - Tagungsband.
  • Barsom, E. Z., Graafland, M., & Schijven, M. P. (2016). Systematic review on the effectiveness of augmented reality applications in medical training. Surgical endoscopy, 30, 4174-4183.
  • Eder, M., Hulla, M., Mast, F., & Ramsauer, C. (2020). On the application of augmented reality in a learning factory working environment. Procedia Manufacturing, 45, 7-12.