Campus Units

Virtual Reality Applications Center, Mechanical Engineering, Psychology

Document Type

Conference Proceeding


Publication Version

Accepted Manuscript

Conference Title

2014 IEEE Virtual Reality

Conference Date

March 29-April 2, 2014


Minneapolis, MN


Abstract

Selection of moving targets is a common yet complex task in human-computer interaction (HCI) and virtual reality (VR). Predicting user intention may help address the challenges inherent in interaction techniques for moving-target selection. This article extends previous models by integrating relative head-target and hand-target features to predict intended moving targets. The features are calculated in a time window ending at roughly two-thirds of the total target selection time and are evaluated using decision trees. With two targets, this model predicts user choice with up to ~72% accuracy on general moving-target selection tasks and up to ~78% accuracy when task-related target properties are also included.
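The approach the abstract describes — classifying the intended target from relative head-target and hand-target features with a decision tree — can be illustrated with a minimal sketch. This is not the paper's implementation: the feature names (`head_target_angle`, `hand_target_distance`), the synthetic data, and the one-level decision-stump learner are all illustrative assumptions.

```python
# Hedged sketch (NOT the paper's implementation): a one-level decision
# tree ("stump") predicting which of two moving targets the user intends,
# from hypothetical relative head-target and hand-target features.
# Feature names, distributions, and thresholds are assumptions.
import random

random.seed(0)

FEATURES = ["head_target_angle", "hand_target_distance"]

def synthetic_trial(intended):
    """Assumption: at ~2/3 of the selection time, the intended target
    shows a smaller head-target angle and hand-target distance."""
    near = {"head_target_angle": random.gauss(10, 3),
            "hand_target_distance": random.gauss(0.3, 0.1)}
    far = {"head_target_angle": random.gauss(25, 5),
           "hand_target_distance": random.gauss(0.7, 0.15)}
    # Express features relative to target 0: negative differences mean
    # target 0 is "closer" to the head/hand than target 1.
    a, b = (near, far) if intended == 0 else (far, near)
    return [a[f] - b[f] for f in FEATURES], intended

def train_stump(data):
    """Pick the single feature/threshold split minimizing training error."""
    best = None
    for i, _ in enumerate(FEATURES):
        for x, _y in data:
            t = x[i]
            err = sum((0 if x2[i] <= t else 1) != y2 for x2, y2 in data)
            if best is None or err < best[2]:
                best = (i, t, err)
    return best[:2]  # (feature index, threshold)

def predict(stump, x):
    i, t = stump
    return 0 if x[i] <= t else 1

train = [synthetic_trial(random.choice([0, 1])) for _ in range(150)]
test = [synthetic_trial(random.choice([0, 1])) for _ in range(50)]
stump = train_stump(train)
accuracy = sum(predict(stump, x) == y for x, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

Because the synthetic clusters are well separated, the stump classifies held-out trials almost perfectly; the paper's ~72-78% figures reflect real user data, which is far noisier than this toy setup.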


This is a manuscript of a conference proceeding published as Casallas, Juan Sebastian, James H. Oliver, Jonathan W. Kelly, Frédéric Merienne, and Samir Garbaya. "Using relative head and hand-target features to predict intention in 3D moving-target selection." In Virtual Reality (VR), 2014 IEEE, pp. 51-56. IEEE, 2014. Posted with permission.


© 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
