Autonomous Motion Conference Paper 2001

Overt visual attention for a humanoid robot

The goal of our research is to investigate the interplay between oculomotor control, visual processing, and limb control in humans and primates by exploring the computational issues of these processes with a biologically inspired artificial oculomotor system on an anthropomorphic robot. In this paper, we investigate the computational mechanisms of visual attention in such a system. Stimuli in the environment excite a dynamical neural network that implements a saliency map, i.e., a winner-take-all competition between stimuli, while simultaneously smoothing out noise and suppressing irrelevant inputs. In real time, this system computes new targets for gaze shifts, which are executed by the head-eye system of the robot. The redundant degrees of freedom of the head-eye system are resolved through a learned inverse kinematics with an optimization criterion. We also address the important issue of how to ensure that the coordinate system of the saliency map remains correct after movement of the robot. The presented attention system is built from principled modules and is generally applicable to any sensory modality.
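The winner-take-all competition on the saliency map described above can be illustrated with a minimal dynamical-field sketch. This is not the paper's implementation: the field size, rectified-linear output, Euler integration, and the use of global inhibition (rather than a specific lateral interaction kernel) are all assumptions made for brevity.

```python
import numpy as np

def wta_saliency(inputs, steps=200, dt=0.1, tau=1.0, w_exc=2.0, w_inh=1.5):
    """Relax a simple competitive field over saliency inputs.

    Local self-excitation plus global inhibition implements a
    winner-take-all: the unit with the strongest input suppresses
    the others and remains the only active one at convergence.
    """
    u = np.zeros_like(inputs, dtype=float)  # field activation
    for _ in range(steps):
        r = np.maximum(u, 0.0)              # rectified output rate
        # leaky integration: decay, external input, self-excitation,
        # and global inhibition from the summed activity of the field
        du = -u + inputs + w_exc * r - w_inh * r.sum()
        u += (dt / tau) * du
    return int(np.argmax(u))                # index of the winning stimulus

# the strongest stimulus wins the competition
winner = wta_saliency(np.array([0.2, 1.0, 0.5]))
```

In a gaze-control loop, the winning index would be mapped back to retinal coordinates and handed to the head-eye controller as the next saccade target.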

Author(s): Vijayakumar, S. and Conradt, J. and Shibata, T. and Schaal, S.
Book Title: IEEE International Conference on Intelligent Robots and Systems (IROS 2001)
Year: 2001
BibTeX Type: Conference Paper (inproceedings)
URL: http://www-clmc.usc.edu/publications/V/vijayakumar-IROS2001.pdf
Cross Ref: p1459
Electronic Archiving: grant_archive
Note: clmc

BibTeX

@inproceedings{Vijayakumar_IICIRS_2001,
  title = {Overt visual attention for a humanoid robot},
  booktitle = {IEEE International Conference on Intelligent Robots and Systems (IROS 2001)},
  abstract = {The goal of our research is to investigate the interplay between oculomotor control, visual processing, and limb control in humans and primates by exploring the computational issues of these processes with a biologically inspired artificial oculomotor system on an anthropomorphic robot. In this paper, we investigate the computational mechanisms of visual attention in such a system. Stimuli in the environment excite a dynamical neural network that implements a saliency map, i.e., a winner-take-all competition between stimuli, while simultaneously smoothing out noise and suppressing irrelevant inputs. In real time, this system computes new targets for gaze shifts, which are executed by the head-eye system of the robot. The redundant degrees of freedom of the head-eye system are resolved through a learned inverse kinematics with an optimization criterion. We also address the important issue of how to ensure that the coordinate system of the saliency map remains correct after movement of the robot. The presented attention system is built from principled modules and is generally applicable to any sensory modality.},
  year = {2001},
  note = {clmc},
  slug = {vijayakumar_iicirs_2001},
  author = {Vijayakumar, S. and Conradt, J. and Shibata, T. and Schaal, S.},
  crossref = {p1459},
  url = {http://www-clmc.usc.edu/publications/V/vijayakumar-IROS2001.pdf}
}