Campus Units

Virtual Reality Applications Center, Mechanical Engineering, Psychology

Document Type

Conference Proceeding

Conference

2014 IEEE Virtual Reality

Publication Version

Accepted Manuscript

Link to Published Version

http://dx.doi.org/10.1109/VR.2014.6802050

Publication Date

2014

Conference Title

2014 IEEE Virtual Reality

Conference Date

March 29-April 2, 2014

City

Minneapolis, MN

Abstract

Selection of moving targets is a common yet complex task in human-computer interaction (HCI) and virtual reality (VR). Predicting user intention may help address the challenges inherent in interaction techniques for moving-target selection. This article extends previous models by integrating relative head-target and hand-target features to predict intended moving targets. The features are calculated in a time window ending at roughly two-thirds of the total target selection time and are evaluated using decision trees. With two targets, this model predicts user choice with up to ~72% accuracy on general moving-target selection tasks, and up to ~78% when task-related target properties are also included.
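
The sketch below is a minimal, illustrative example of the kind of decision-tree prediction the abstract describes, using scikit-learn. The feature names, synthetic data, and tree settings are hypothetical placeholders and do not reproduce the paper's actual feature set, data, or results.

```python
# Illustrative sketch only: binary prediction of which of two moving targets a
# user intends to select, from relative head-target and hand-target features.
# Features, data, and parameters below are hypothetical placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical per-trial features, aggregated over a time window ending at
# ~2/3 of the total selection time (as described in the abstract):
#   columns: [head-target angle, hand-target distance,
#             head-target angular velocity, hand-target closing speed]
n_trials = 500
X = rng.normal(size=(n_trials, 4))

# Hypothetical label: which of the two candidate targets was selected (0 or 1).
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n_trials) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Decision-tree classifier, as in the abstract's evaluation; the depth limit
# here is arbitrary.
clf = DecisionTreeClassifier(max_depth=4, random_state=0)
clf.fit(X_train, y_train)

print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```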

Comments

This is a manuscript of a conference proceeding published as Casallas, Juan Sebastian, James H. Oliver, Jonathan W. Kelly, Frédéric Merienne, and Samir Garbaya. "Using relative head and hand-target features to predict intention in 3D moving-target selection." In Virtual Reality (VR), 2014 IEEE, pp. 51-56. IEEE, 2014. Posted with permission.

Rights

© 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Copyright Owner

IEEE

Language

en

File Format

application/pdf
