A Wearable System for Articulated Human Pose Tracking under Uncertainty of Sensor Placement

  • Xuesu Xiao,
  • Shuayb Zarar

IEEE RAS/EMBS Int. Conf. on Biomedical Robotics and Biomechatronics (BioRob)

To precisely track human motion, today’s state-of-the-art systems employ either well-calibrated sensors tightly strapped to the body or high-speed cameras confined to a finite capture volume. These restrictions make such systems less mobile. In this paper, we aim to break this usability barrier around motion-capture technology through a wearable system that has sensors integrated directly into garments. We develop a pose-estimation approach based on classic kinematics and show that it is insufficient to analyze motion in such a system, leading to mean Euler-angle errors of up to ±60 degrees and standard deviations of 120 degrees. We thus motivate the need for data-driven algorithms in this domain. Through a quantitative study, we attribute the motion-estimation errors to the high degree of sensor displacement with respect to the body segments (up to 118 degrees standard deviation from the nominal value) that occurs as human poses change. Based on controlled experiments, we develop a new dataset for such systems comprising over 3 hours of biomechanical motion recordings from 215 trials on 12 test subjects.
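To make the classic-kinematics baseline concrete, the sketch below estimates joint Euler angles from the world-frame orientation quaternions of two garment-mounted IMUs. It is a minimal illustration, not the paper’s implementation: the function names, frame conventions, and calibration offsets are assumptions, and it is precisely the sensor-to-segment offset terms that loose garments invalidate as poses change.

```python
# Minimal sketch of a classic-kinematics joint-angle estimate, assuming
# each garment-mounted IMU reports its orientation as a unit quaternion
# [x, y, z, w] in a shared world frame. All names are illustrative.
import numpy as np
from scipy.spatial.transform import Rotation as R

def joint_euler_angles(q_parent, q_child,
                       q_offset_parent=None, q_offset_child=None):
    """Return joint Euler angles (degrees) between two body segments.

    q_parent, q_child: world-frame IMU orientations on the parent and
    child segments. The optional offsets map each sensor frame to its
    nominal segment frame -- the calibration that sensor displacement
    in loose garments violates.
    """
    r_parent = R.from_quat(q_parent)
    r_child = R.from_quat(q_child)
    if q_offset_parent is not None:
        r_parent = r_parent * R.from_quat(q_offset_parent)
    if q_offset_child is not None:
        r_child = r_child * R.from_quat(q_offset_child)
    # Relative segment rotation: child expressed in the parent frame.
    r_joint = r_parent.inv() * r_child
    return r_joint.as_euler("xyz", degrees=True)

# Example: a 45-degree elbow flexion about the x-axis.
q_upper_arm = R.identity().as_quat()
q_forearm = R.from_euler("x", 45, degrees=True).as_quat()
print(joint_euler_angles(q_upper_arm, q_forearm))  # ~[45, 0, 0]
```

With rigidly strapped, well-calibrated sensors the fixed offsets above hold and this computation is accurate; when sensors drift on fabric, the offsets become pose-dependent, which is consistent with the error magnitudes reported above.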