Real-Time RGB-D Camera Relocalization
- Ben Glocker,
- Shahram Izadi,
- Jamie Shotton,
- Antonio Criminisi
International Symposium on Mixed and Augmented Reality (ISMAR) |
Published by IEEE
We introduce an efficient camera relocalization approach that can be easily integrated into dense 3D reconstruction pipelines such as KinectFusion. Our method is based on keyframes and makes use of randomized ferns, which provide both compact encoding of frames and fast retrieval of pose proposals in case of tracking failure. During successful tracking, each frame/pose pair is considered a potential keyframe. Only frames that are sufficiently dissimilar in appearance space are added to the set of keyframes. This keeps the scene representation compact while keeping coverage sufficiently dense. Frame dissimilarity is defined via the block-wise Hamming distance (BlockHD) between the codes generated by the ferns. Distances between incoming frames and keyframes are evaluated efficiently and simultaneously by traversing the nodes of the ferns and counting co-occurrences with keyframes that have equal sub-codes. For tracking recovery, camera pose proposals are retrieved from the keyframes with the smallest BlockHDs and are then used to reinitialize the tracking algorithm.
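To make the fern encoding, BlockHD, and keyframe harvesting concrete, here is a minimal sketch. All parameters (number of ferns, tests per fern, frame size, dissimilarity threshold `tau`) are illustrative assumptions, not the paper's values, and this naive pairwise distance computation stands in for the paper's more efficient simultaneous evaluation via co-occurrence counting at the fern nodes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters: 32 ferns with 4 binary pixel tests each,
# applied to a downsampled 60x80 frame (values chosen for illustration).
N_FERNS, TESTS_PER_FERN = 32, 4
H, W = 60, 80

# Each binary test compares one randomly chosen pixel against a random threshold.
pix = rng.integers(0, H * W, size=(N_FERNS, TESTS_PER_FERN))
thr = rng.uniform(0.2, 0.8, size=(N_FERNS, TESTS_PER_FERN))

def encode(frame):
    """Encode a frame as one small integer sub-code per fern."""
    bits = (frame.ravel()[pix] > thr).astype(np.uint8)    # (N_FERNS, TESTS_PER_FERN)
    weights = 1 << np.arange(TESTS_PER_FERN)              # [1, 2, 4, 8]
    return bits @ weights                                 # (N_FERNS,) sub-codes

def block_hd(code_a, code_b):
    """Block-wise Hamming distance: fraction of ferns whose sub-codes differ."""
    return np.mean(code_a != code_b)

keyframes = []  # list of (code, pose) pairs

def maybe_add_keyframe(frame, pose, tau=0.2):
    """Harvest a keyframe only if it is dissimilar enough to all stored ones."""
    code = encode(frame)
    if all(block_hd(code, c) > tau for c, _ in keyframes):
        keyframes.append((code, pose))

def relocalize(frame, k=1):
    """Return pose proposals from the k keyframes with smallest BlockHD."""
    code = encode(frame)
    ranked = sorted(keyframes, key=lambda kf: block_hd(code, kf[0]))
    return [pose for _, pose in ranked[:k]]
```

In the paper's implementation, a single traversal of the fern nodes updates the distances to all keyframes at once, so harvesting and retrieval share one pass per incoming frame; the sketch above computes each pairwise distance separately for clarity.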
Both the online determination of keyframes and camera pose recovery are computationally efficient and have minimal impact on the run-time of the 3D reconstruction. Incorporating our method allows seamless continuation of reconstructions even when tracking is frequently lost. Additionally, we demonstrate how marker-free augmented reality can be realized by mesh-to-volume registration between an offline model and the online reconstruction. Our relocalization method is particularly appealing in such AR applications, where the experience of steady pose tracking is essential.