
Probabilistic Articulated Real-Time Tracking for Robot Manipulation


We propose a probabilistic filtering method which fuses joint measurements with depth images to yield a precise, real-time estimate of the end-effector pose in the camera frame. This avoids the need for frame transformations when using it in combination with visual object tracking methods. Precision is achieved by modeling and correcting biases in the joint measurements as well as inaccuracies in the robot model, such as poor extrinsic camera calibration. We make our method computationally efficient through a principled combination of Kalman filtering of the joint measurements and asynchronous depth-image updates based on the Coordinate Particle Filter. We quantitatively evaluate our approach on a dataset recorded from a real robotic platform, annotated with ground truth from a motion capture system. We show that our approach is robust and accurate even under challenging conditions such as fast motion, significant and long-term occlusions, and time-varying biases. We release the dataset along with open-source code of our approach to allow for quantitative comparison with alternative approaches.
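To make the fusion idea concrete, below is a minimal sketch (not the authors' released implementation) of the core mechanism described in the abstract: a linear Kalman filter whose state augments a joint angle with a slowly drifting encoder bias. Encoder updates alone cannot separate angle from bias; an occasional, asynchronous camera-derived measurement of the true angle (standing in here for the paper's depth-image update via the Coordinate Particle Filter) makes the bias observable. All variable names and noise magnitudes are illustrative assumptions.

    import numpy as np

    # Sketch only: a 1-DoF joint with a drifting encoder bias.
    # State x = [angle, bias]. The encoder reads angle + bias; an
    # idealized camera measurement observes the true angle, which
    # renders the bias observable. Noise levels are assumptions.

    F = np.eye(2)                   # random-walk dynamics for angle and bias
    Q = np.diag([1e-3, 1e-5])       # process noise (bias drifts slowly)
    H_enc = np.array([[1.0, 1.0]])  # encoder: angle + bias
    R_enc = np.array([[1e-4]])
    H_cam = np.array([[1.0, 0.0]])  # camera update: true angle (simplified)
    R_cam = np.array([[1e-2]])

    x = np.zeros(2)                 # initial state estimate
    P = np.eye(2)                   # initial covariance

    def kf_step(x, P, z, H, R):
        """One predict + update cycle of a linear Kalman filter."""
        x, P = F @ x, F @ P @ F.T + Q          # predict
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ (z - H @ x)                # correct
        P = (np.eye(2) - K @ H) @ P
        return x, P

    # Fast encoder updates, with occasional asynchronous camera updates:
    rng = np.random.default_rng(0)
    true_angle, true_bias = 0.5, 0.1
    for t in range(200):
        z_enc = true_angle + true_bias + rng.normal(0.0, 1e-2)
        x, P = kf_step(x, P, np.array([z_enc]), H_enc, R_enc)
        if t % 30 == 0:                        # camera runs at a lower rate
            z_cam = true_angle + rng.normal(0.0, 1e-1)
            x, P = kf_step(x, P, np.array([z_cam]), H_cam, R_cam)

    print(f"estimated angle={x[0]:.3f}, bias={x[1]:.3f}")

Running the loop, the bias estimate converges toward the simulated offset; separating pose from measurement bias in this way is what lets the paper report an unbiased end-effector pose directly in the camera frame.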

Award: Best Paper of RA-L 2017, Finalist of Best Robotic Vision Paper Award of ICRA 2017
Author(s): Cristina Garcia Cifuentes and Jan Issac and Manuel Wüthrich and Stefan Schaal and Jeannette Bohg
Journal: IEEE Robotics and Automation Letters (RA-L)
Volume: 2
Number (issue): 2
Pages: 577-584
Year: 2017
Month: April
Bibtex Type: Article (article)
DOI: https://doi.org/10.1109/LRA.2016.2645124
State: Published
Award Paper: Best Paper of RA-L 2017, Finalist of Best Robotic Vision Paper Award of ICRA 2017
ISSN: 2377-3766

BibTeX

@article{GarciaCifuentes.RAL,
  title = {Probabilistic Articulated Real-Time Tracking for Robot Manipulation},
  award_paper = {Best Paper of RA-L 2017, Finalist of Best Robotic Vision Paper Award of ICRA 2017},
  journal = {IEEE Robotics and Automation Letters (RA-L)},
  abstract = {We propose a probabilistic filtering method which fuses joint measurements with depth images to yield a precise, real-time estimate of the end-effector pose in the camera frame. This avoids the need for frame transformations when using it in combination with visual object tracking methods.
  
  Precision is achieved by modeling and correcting biases in the joint measurements as well as inaccuracies in the robot model, such as poor extrinsic camera calibration. We make our method computationally efficient through a principled combination of Kalman filtering of the joint measurements and asynchronous depth-image updates based on the Coordinate Particle Filter.
  
  We quantitatively evaluate our approach on a dataset recorded from a real robotic platform, annotated with ground truth from a motion capture system. We show that our approach is robust and accurate even under challenging conditions such as fast motion, significant and long-term occlusions, and time-varying biases. We release the dataset along with open-source code of our approach to allow for quantitative comparison with alternative approaches.},
  volume = {2},
  number = {2},
  pages = {577--584},
  month = apr,
  year = {2017},
  slug = {garciacifuentes-ral},
  author = {Garcia Cifuentes, Cristina and Issac, Jan and W{\"u}thrich, Manuel and Schaal, Stefan and Bohg, Jeannette},
  month_numeric = {4}
}