Perceiving Systems Conference Paper 2021

Active Visual SLAM with Independently Rotating Camera

In active Visual-SLAM (V-SLAM), a robot relies on the information retrieved by its cameras to control its own movements for autonomous mapping of the environment. Cameras are usually statically linked to the robot's body, limiting the extra degrees of freedom for visual information acquisition. In this work, we overcome the aforementioned problem by introducing and leveraging an independently rotating camera on the robot base. This enables us to continuously control the heading of the camera, obtaining the desired optimal orientation for active V-SLAM, without rotating the robot itself. However, this additional degree of freedom introduces additional estimation uncertainties, which need to be accounted for. We do this by extending our robot's state estimate to include the camera state and jointly estimate the uncertainties. We develop our method based on a state-of-the-art active V-SLAM approach for omnidirectional robots and evaluate it through rigorous simulation and real robot experiments. We obtain more accurate maps, with lower energy consumption, while maintaining the benefits of the active approach with respect to the baseline. We also demonstrate how our method easily generalizes to other non-omnidirectional robotic platforms, which was a limitation of the previous approach. Code and implementation details are provided as open-source.
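
The joint estimation idea described in the abstract, augmenting the robot's state with the camera's heading so that both uncertainties are propagated together, can be illustrated with a short sketch. The Python/NumPy snippet below is not the authors' implementation (see the released open-source code for that); it only assumes, for illustration, a planar base pose (x, y, theta) augmented with an independently commanded camera yaw phi, and shows an EKF-style prediction step over the joint covariance.

  import numpy as np

  def predict(state, cov, u, Q, dt):
      """EKF-style prediction for an omnidirectional base pose (x, y, theta)
      augmented with an independently rotating camera yaw phi.
      u = (vx, vy, omega_base, omega_cam): body-frame velocities and yaw rates."""
      x, y, th, phi = state
      vx, vy, w, wc = u
      c, s = np.cos(th), np.sin(th)
      new_state = np.array([
          x + (c * vx - s * vy) * dt,   # body-frame velocity rotated into the world frame
          y + (s * vx + c * vy) * dt,
          th + w * dt,                  # base heading
          phi + wc * dt,                # camera yaw, controlled independently of the base
      ])
      # Jacobian of the motion model w.r.t. the augmented state [x, y, theta, phi].
      F = np.eye(4)
      F[0, 2] = (-s * vx - c * vy) * dt
      F[1, 2] = ( c * vx - s * vy) * dt
      new_cov = F @ cov @ F.T + Q       # joint covariance over base and camera states
      return new_state, new_cov

  # Example usage (hypothetical numbers): drive forward while counter-rotating the camera.
  state = np.zeros(4)
  cov = np.eye(4) * 1e-3
  Q = np.diag([1e-4, 1e-4, 1e-5, 1e-5])
  state, cov = predict(state, cov, u=(0.2, 0.0, 0.05, -0.3), Q=Q, dt=0.1)

Because the camera yaw enters the state vector alongside the base pose, its uncertainty is estimated jointly rather than assumed known, which is the point the abstract makes about accounting for the extra degree of freedom.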

Author(s): Bonetto, Elia and Goldschmid, Pascal and Black, Michael J. and Ahmad, Aamir
Book Title: 2021 European Conference on Mobile Robots (ECMR 2021)
Pages: 266--273
Year: 2021
Publisher: IEEE
BibTeX Type: Conference Paper (inproceedings)
Address: Piscataway, NJ
DOI: 10.1109/ECMR50962.2021.9568791
Event Name: European Conference on Mobile Robots (ECMR 2021)
Event Place: Virtual
State: Published
URL: https://ieeexplore.ieee.org/document/9568791
Electronic Archiving: grant_archive
ISBN: 978-1-6654-1213-1

BibTeX

@inproceedings{independentCamBonetto,
  title = {Active Visual {SLAM} with Independently Rotating Camera},
  booktitle = {2021 European Conference on Mobile Robots (ECMR 2021)},
  abstract = {In active Visual-SLAM (V-SLAM), a robot relies on the information retrieved by its cameras to control its own movements for autonomous mapping of the environment. Cameras are usually statically linked to the robot's body, limiting the extra degrees of freedom for visual information acquisition. In this work, we overcome the aforementioned problem by introducing and leveraging an independently rotating camera on the robot base. This enables us to continuously control the heading of the camera, obtaining the desired optimal orientation for active V-SLAM, without rotating the robot itself. However, this additional degree of freedom introduces additional estimation uncertainties, which need to be accounted for. We do this by extending our robot's state estimate to include the camera state and jointly estimate the uncertainties. We develop our method based on a state-of-the-art active V-SLAM approach for omnidirectional robots and evaluate it through rigorous simulation and real robot experiments. We obtain more accurate maps, with lower energy consumption, while maintaining the benefits of the active approach with respect to the baseline. We also demonstrate how our method easily generalizes to other non-omnidirectional robotic platforms, which was a limitation of the previous approach. Code and implementation details are provided as open-source.},
  pages = {266--273},
  publisher = {IEEE},
  address = {Piscataway, NJ},
  year = {2021},
  slug = {independentcambonetto},
  author = {Bonetto, Elia and Goldschmid, Pascal and Black, Michael J. and Ahmad, Aamir},
  url = {https://ieeexplore.ieee.org/document/9568791},
  doi = {10.1109/ECMR50962.2021.9568791},
  isbn = {978-1-6654-1213-1}
}