Probabilistic Articulated Real-Time Tracking for Robot Manipulation
We propose a probabilistic filtering method that fuses joint measurements with depth images to correct biases in the joint measurements, as well as inaccuracies in the robot model such as poor extrinsic camera calibration. The proposed method yields an accurate, real-time estimate of the end-effector pose in the camera frame, which avoids the need for frame transformations when combined with visual object tracking methods. We quantitatively evaluate our approach on a dataset recorded from a real robotic system and annotated with ground truth from a motion-capture system. We show that our approach is robust and accurate even under challenging conditions such as fast motion, significant and long-term occlusion, time-varying biases, and the robot arm moving in and out of view. We release the dataset along with open-source code of our approach to allow for quantitative comparison with alternative approaches.
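To give a rough feel for the fusion idea, the sketch below is a minimal per-joint Kalman filter, not the filter used in the paper: the state holds the joint angle and a slowly drifting encoder bias, encoder readings observe angle plus bias, and a hypothetical depth-derived angle estimate observes the angle directly, which is what makes the bias identifiable. All class and parameter names, and the noise values, are illustrative assumptions.

```python
import numpy as np

class JointBiasFilter:
    """Minimal per-joint Kalman filter sketch: state x = [angle, bias]."""

    def __init__(self, q_angle=1e-3, q_bias=1e-5, r_enc=1e-4, r_cam=1e-2):
        self.x = np.zeros(2)                  # [joint angle, encoder bias]
        self.P = np.eye(2)                    # state covariance
        self.Q = np.diag([q_angle, q_bias])   # random-walk process noise
        self.r_enc = r_enc                    # encoder noise variance
        self.r_cam = r_cam                    # depth/visual noise variance

    def predict(self):
        # Random-walk process model: both angle and bias drift slowly.
        self.P = self.P + self.Q

    def _update(self, z, H, r):
        # Scalar Kalman update for z = H @ x + v, with v ~ N(0, r).
        S = H @ self.P @ H + r                # innovation variance
        K = self.P @ H / S                    # Kalman gain
        self.x = self.x + K * (z - H @ self.x)
        self.P = (np.eye(2) - np.outer(K, H)) @ self.P

    def update_encoder(self, z_enc):
        # Joint encoder observes angle + bias.
        self._update(z_enc, np.array([1.0, 1.0]), self.r_enc)

    def update_camera(self, z_cam):
        # Depth-derived angle estimate observes the angle without the bias.
        self._update(z_cam, np.array([1.0, 0.0]), self.r_cam)


# Example usage: fuse one encoder reading with one depth-based estimate.
f = JointBiasFilter()
f.predict()
f.update_encoder(0.52)   # encoder reports 0.52 rad (contains unknown bias)
f.update_camera(0.50)    # depth-based estimate of the same joint: 0.50 rad
angle, bias = f.x
```

In the actual method the depth image would not directly measure a joint angle; the sketch only illustrates how a bias-free visual measurement lets a filter separate the true joint state from a time-varying encoder bias.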