ISMAR 2018
IEEE, IEEE Computer Society, IEEE VGTC

Sponsors

Platinum: Apple
Silver: Mozilla, Intel, Daqri, PTC, Amazon
Bronze: Facebook, Umajin, Disney Research
SME: EnvisageAR
Academic: TUM, ETHZ

Benzun Pious Wisely Babu, Zhixin Yan, Mao Ye, and Liu Ren. On exploiting per-pixel motion conflicts to extract secondary motions. In Proceedings of the IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR), 2018. To appear.

Abstract

Ubiquitous Augmented Reality requires robust localization in complex daily environments. The combination of a camera and an Inertial Measurement Unit (IMU) has shown promising results for robust localization due to the complementary characteristics of the visual and inertial modalities. However, there exist many cases in which the visual and inertial measurements do not yield a single consistent motion estimate, causing disagreement on the estimated motion. Little of the literature on sensor fusion for localization has addressed this problem. Since the disagreement is not a result of measurement noise, existing outlier rejection techniques are not suitable for addressing it. In this paper, we propose a novel approach that handles the disagreement as motion conflict, with two key components. The first is a generalized Hidden Markov Model (HMM) that formulates the tracking and management of the primary motion and the secondary motion as a single estimation problem. The second is an epipolar-constrained Deep Neural Network that generates a per-pixel motion conflict probability map. Experimental evaluations demonstrate a significant improvement in tracking accuracy in cases of strong motion conflict compared to previous state-of-the-art localization algorithms. Moreover, as a consequence of motion tracking on the secondary maps, our solution enables augmentation of virtual content attached to secondary motions, which brings us one step closer to Ubiquitous Augmented Reality.
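To give a flavor of the HMM component, the sketch below runs a two-state forward filter that labels each frame's visual measurement as consistent with the inertial (primary) motion or as belonging to a secondary motion. This is an illustrative toy only: the states, transition matrix, prior, and the residual-based emission model are assumptions for the example, not the paper's actual formulation, which jointly tracks and manages both motions.

```python
import numpy as np

# Hypothetical transition probabilities: motion membership tends to persist
# between frames (rows: current state; columns: next state).
T = np.array([[0.95, 0.05],   # primary  -> primary/secondary
              [0.10, 0.90]])  # secondary -> primary/secondary

def emission(residual, state):
    """Assumed emission likelihood from a visual-inertial residual:
    small residuals support the primary motion, large ones a secondary one."""
    if state == 0:                      # primary
        return np.exp(-residual)
    return 1.0 - np.exp(-residual)      # secondary

def forward_filter(residuals):
    """Return per-frame posteriors P(state | residuals so far)."""
    belief = np.array([0.9, 0.1])       # assumed prior: mostly primary motion
    posteriors = []
    for r in residuals:
        belief = belief @ T             # predict step
        belief = belief * np.array([emission(r, s) for s in range(2)])
        belief = belief / belief.sum()  # normalized update step
        posteriors.append(belief.copy())
    return np.array(posteriors)

# Small residuals early (motions agree), large ones later (motion conflict):
post = forward_filter([0.1, 0.1, 0.2, 2.5, 3.0, 2.8])
```

Running the filter on this residual sequence, the posterior starts heavily on the primary state and switches to the secondary state once the residuals grow, which is the qualitative behavior the motion-conflict formulation relies on.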