Understanding intuitive gestures in wearable mixed reality environments

Date
2020-01-01
Authors
Doty, Karen
Major Professor
James Lathrop
Department
Computer Science
Abstract

Advances in technology have made augmented and mixed reality experiences increasingly accessible in both professional and everyday settings. These technologies continue to take multiple forms, including tablet-based augmented reality (AR) and mixed reality (MR) delivered through wearable heads-up displays (HUDs). Standards for usability best practices are still evolving for MR HUD two-dimensional user interfaces (2D UI) and three-dimensional user interfaces (3D UI), so research on these evolving practices can guide future development of MR HUD applications.

The objective of this dissertation is to understand which gestures users intuitively make to respond to an MR environment while wearing a HUD. The Microsoft HoloLens, a wearable HUD used for MR, supports two core gestures developed for interacting with holographic interfaces. Although these gestures can be learned and used successfully, this dissertation provides a better understanding of which gestures are intuitive to new users of an MR environment.
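
As an illustration, and not part of the dissertation's study materials, the C++/WinRT sketch below shows how a HoloLens application typically registers a handler for one of these core gestures, the air tap, through the Windows Mixed Reality spatial input API; the second core gesture, bloom, is reserved by the system shell rather than exposed to applications. The GestureInput class name and the handler body are illustrative assumptions.

    // Illustrative sketch only: registering a handler for the HoloLens
    // "air tap" gesture with the Windows Mixed Reality spatial input API.
    // Assumes it runs inside a holographic app on the thread that owns the
    // app's CoreWindow, which GetForCurrentView() requires.
    #include <winrt/Windows.UI.Input.Spatial.h>

    using namespace winrt::Windows::UI::Input::Spatial;

    struct GestureInput   // hypothetical helper type
    {
        SpatialInteractionManager m_interactionManager{ nullptr };
        SpatialGestureRecognizer  m_recognizer{ SpatialGestureSettings::Tap };

        void Initialize()
        {
            // The interaction manager raises low-level hand and controller events.
            m_interactionManager = SpatialInteractionManager::GetForCurrentView();

            // Route each detected interaction to the recognizer, which decides
            // whether it completes a gesture it was configured for (here, Tap).
            m_interactionManager.InteractionDetected(
                [this](SpatialInteractionManager const&,
                       SpatialInteractionDetectedEventArgs const& args)
                {
                    m_recognizer.CaptureInteraction(args.Interaction());
                });

            // Fires once an air tap is recognized on the captured interaction.
            m_recognizer.Tapped(
                [](SpatialGestureRecognizer const&, SpatialTappedEventArgs const&)
                {
                    // Application-specific response to the air tap, e.g. selecting
                    // the hologram currently under the gaze cursor.
                });
        }
    };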

To identify which gestures are intuitive, 74 participants with no prior MR experience attempted gestures within a wearable MR HUD environment. The results show that previous technology experience can influence gesture choice, but gesture choice also depends on the goal of the interaction scenario. The results further suggest that a greater number of programmed gestures is needed to make full use of the tools available in wearable MR HUDs: five new gestures should be created, three reflecting a connection between MR interaction and current gesture-based technology, and two reflecting a connection between MR gestures and everyday movements in the physical world.

Copyright
May 1, 2020