ISMAR 2018


Ann McNamara, Somyung Oh, Sarah Suther, Katherine Boyd, and Ryan Sharpe. Using eye tracking to improve information retrieval in virtual reality. In Adjunct Proceedings of the IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2018), 2018. To appear.

Abstract

Virtual Reality has the potential to transform the way we work, rest, and play. This promise comes with new challenges. One challenge stems from the interactive nature of immersive Virtual Environments (VEs): the placement of contextual information in a VE can be critical to the user experience. This poster describes the use of eye tracking to alleviate usability issues surrounding information presentation in immersive VEs. Results from our experiments show that integrating eye tracking into a VE to dictate where and when textual information is presented can improve performance when searching for contextual information. In summary, the results show improved task performance when the new direct method is employed to reveal information in target regions based on gaze. This improvement appears to hold independent of the VE or the type of information queried.
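The gaze-contingent reveal the abstract describes can be sketched as a dwell-time trigger: a label is shown only after the user's gaze has rested in a target region for long enough. The sketch below is a minimal illustration under assumed details, not the paper's implementation; the `Region` class, the 2D bounding-box hit test, and the 0.5 s dwell threshold are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """An axis-aligned target region in normalized gaze coordinates (assumed)."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

class GazeReveal:
    """Reveal a region's textual label once gaze dwells in it for dwell_s seconds."""

    def __init__(self, regions, dwell_s=0.5):  # 0.5 s threshold is an assumption
        self.regions = regions
        self.dwell_s = dwell_s
        self._current = None   # region name currently under gaze
        self._enter_t = 0.0    # time the gaze entered that region
        self.visible = set()   # labels that have been revealed

    def update(self, t: float, x: float, y: float) -> set:
        """Feed one timestamped gaze sample; return the set of revealed labels."""
        hit = next((r.name for r in self.regions if r.contains(x, y)), None)
        if hit != self._current:
            # Gaze moved to a new region (or off all regions): restart the dwell timer.
            self._current, self._enter_t = hit, t
        if hit is not None and t - self._enter_t >= self.dwell_s:
            self.visible.add(hit)
        return self.visible
```

In a real VE the gaze samples would come from the headset's eye tracker each frame, and the regions would be 3D volumes around the objects being annotated; this 2D version only shows the dwell-trigger logic.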