ISMAR 2018


Brian Williamson, Andrés N. Vargas González, Patrick Garrity, Robert Sottilare, and Joseph LaViola. AgileSLAM: a localization approach for agile head movements in augmented reality. In Adjunct Proceedings of the IEEE and ACM International Symposium on Mixed and Augmented Reality 2018 (to appear). 2018.

Abstract

Realistic augmented reality systems require both accurate localization of the user and a mapping of the environment. In a markerless environment this is often done with SLAM algorithms, which localize by picking out features in the environment and comparing how they have changed between a keyframe and the current frame. However, human head agility, such as that seen in video gaming tasks or training exercises, poses a problem: a fast rotation can move every previously tracked feature out of the field of view, and the system then struggles to localize accurately. In this paper we present an approach capable of tracking agile human head movements by using an array of RGB-D sensors and reconstructing their data into a 360-degree field of features that is fed into our SLAM algorithm. We run an experiment with pre-recorded agile-movement scenarios that demonstrates the accuracy of our system. We also compare our approach against single-sensor algorithms and show a significant improvement in localization accuracy (up to 15 to 20 times better). The development of our sensor array and SLAM algorithm creates a novel approach to accurately localizing extremely agile human head movements.
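The keyframe-to-current-frame comparison the abstract describes can be illustrated with a generic sketch (this is not the paper's AgileSLAM algorithm): given matched 3D feature points observed by an RGB-D sensor in a keyframe and in the current frame, the rigid transform between the two poses can be recovered in closed form with the Kabsch/SVD method. All function and variable names below are hypothetical.

```python
# Illustrative sketch, not the paper's method: least-squares rigid alignment
# of matched 3D feature points between a keyframe and the current frame,
# as an RGB-D SLAM front end might perform for localization.
import numpy as np


def estimate_pose(keyframe_pts, current_pts):
    """Return R, t minimizing ||R @ k_i + t - c_i||^2 over matched points.

    Both inputs are (N, 3) arrays of corresponding 3D feature positions.
    """
    kc = keyframe_pts.mean(axis=0)                  # centroids
    cc = current_pts.mean(axis=0)
    H = (keyframe_pts - kc).T @ (current_pts - cc)  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cc - R @ kc
    return R, t


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(-1.0, 1.0, size=(50, 3))      # simulated keyframe features
    a = np.deg2rad(30)                              # simulated 30-degree head yaw
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
    t_true = np.array([0.1, 0.0, 0.05])
    moved = pts @ R_true.T + t_true                 # same features, current frame
    R_est, t_est = estimate_pose(pts, moved)
    print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```

A fast head rotation breaks exactly the assumption this sketch relies on: if no keyframe features remain visible in the current frame, there are no correspondences to align, which motivates the paper's 360-degree sensor array.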