Campus Units

Mechanical Engineering, Industrial and Manufacturing Systems Engineering, Electrical and Computer Engineering, Materials Science and Engineering, Human Computer Interaction, Virtual Reality Applications Center

Document Type

Presentation

Conference

Interservice/Industry Training, Simulation and Education Conference (I/ITSEC) 2017

Publication Version

Published Version

Publication Date

2017

Journal or Book Title

Proceedings of the 2017 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC)

Volume

2017

First Page

17208

Conference Title

Interservice/Industry Training, Simulation and Education Conference (I/ITSEC) 2017

Conference Date

November 27-December 1, 2017

City

Orlando, FL

Abstract

Increased product complexity and the focus on zero defects, especially when manufacturing complex engineered products, mean new tools are required to help workers conduct challenging assembly tasks. Augmented reality (AR) has shown considerable promise over traditional methods for delivering work instructions. Many proof-of-concept systems have demonstrated the feasibility of AR, but little work has been devoted to understanding how users perceive different AR work instruction interface elements. This paper presents a between-subjects study examining how interface elements for object depth placement in a scene impact a user's ability to quickly and accurately assemble a mock aircraft wing in a standard work cell. For object depth placement, modes with varying degrees of 3D modeled occlusion were tested, including a control group with no occlusion, virtual occlusion, and occlusion by contours. Results for total assembly time and total errors indicated no statistically significant difference between interfaces, leading the authors to conclude that a performance floor has been reached for optimizing the current assembly when using AR for work instruction delivery. However, examining a handful of highly error-prone steps revealed the impact that different types of occlusion have on helping users correctly complete an assembly task. The results of the study provide insight into how to construct an interface for delivering AR work instructions using occlusion. Based on these results, the authors recommend customizing the occlusion method based on the features of the required assembly task. The authors also identified a floor effect for the steps of the assembly process that involved picking the necessary parts from tables and bins. The authors recommend using vibrant outlines and large textual cues (e.g., numbers on parts bins) as interface elements to guide users during these types of "picking" steps.

Comments

This proceeding was published as MacAllister, Anastacia, Melynda Hoover, Stephen Gilbert, James Oliver, Rafael Radkowski, Timothy Garrett, Joseph Holub, Eliot Winer, Scott Terry, and Paul Davies. (2017). "Comparing Visual Assembly Aids for Augmented Reality Work Instructions." In Volume 2017, Proceedings of the 2017 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC). Paper no. 17208. Arlington, VA: National Training and Simulation Association. Posted with permission.

Copyright Owner

National Training and Simulation Association

Language

en

File Format

application/pdf
