Perceiving Systems Talk Biography
18 April 2012

A Modular, Multimodal Arm Model: Multisensory Integration and Flexible Motion Control


Humans interact with their environment in a highly flexible manner. One important component for the successful control of such flexible interactions is an internal body model. To maintain a consistent internal body model, the brain appears to continuously and probabilistically integrate multiple sources of information, including the various sensory modalities as well as anticipatory, re-afferent information about current body motion. This talk presents a modular, multimodal arm model (MMM).
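
As a rough illustration of the kind of probabilistic integration involved, the Python sketch below fuses two independent Gaussian cues (say, a visual and a proprioceptive estimate of hand position) by precision weighting. The function name and the numerical values are illustrative assumptions, not part of the MMM itself.

    def fuse_gaussian_cues(means, variances):
        # Maximum-likelihood fusion of independent Gaussian cues:
        # each cue is weighted by its precision (inverse variance).
        precisions = [1.0 / v for v in variances]
        fused_var = 1.0 / sum(precisions)
        fused_mean = fused_var * sum(m * p for m, p in zip(means, precisions))
        return fused_mean, fused_var

    # Example: a visual and a proprioceptive estimate of hand position
    # along one axis; the fused estimate sits closer to the reliable cue.
    mu, var = fuse_gaussian_cues(means=[0.10, 0.18], variances=[0.01, 0.04])
    print(mu, var)   # ~0.116, 0.008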

The model represents a seven-degrees-of-freedom arm in several interacting modality frames. The modality frames comprise proprioceptive, limb-relative orientation, head-relative orientation, and head-relative location representations. Each limb of the arm is represented separately, but highly interactively, in each of these modality frames. Incoming sensory information and motor feedback are continuously exchanged in a rigorous, probabilistic fashion, and the local interactions maintain a consistent overall arm model.
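
The following Python sketch caricatures this modular structure under strong simplifying assumptions: each limb holds a one-dimensional Gaussian estimate per modality frame, the mapping between frames is taken to be the identity, and consistency is maintained by nudging every frame toward the precision-weighted consensus. The class and method names are hypothetical, not the MMM's actual interface.

    FRAMES = ("proprioceptive", "limb_orientation",
              "head_orientation", "head_location")

    class LimbModel:
        # One arm limb, represented redundantly in several modality frames.
        def __init__(self):
            # (mean, variance) of a scalar state estimate per frame
            self.state = {f: (0.0, 1.0) for f in FRAMES}

        def observe(self, frame, value, noise_var):
            # Kalman-style update of a single frame with a noisy reading
            mean, var = self.state[frame]
            k = var / (var + noise_var)
            self.state[frame] = (mean + k * (value - mean), (1.0 - k) * var)

        def exchange(self):
            # Local interaction: pull each frame toward the precision-
            # weighted consensus of all frames, keeping the limb's
            # representations mutually consistent.
            prec = {f: 1.0 / v for f, (_, v) in self.state.items()}
            total = sum(prec.values())
            consensus = sum(m * prec[f]
                            for f, (m, _) in self.state.items()) / total
            self.state = {f: (0.5 * m + 0.5 * consensus, v)
                          for f, (m, v) in self.state.items()}

    arm = [LimbModel() for _ in range(3)]   # e.g. upper arm, forearm, hand
    arm[0].observe("proprioceptive", 0.4, noise_var=0.01)
    arm[0].exchange()
    print(arm[0].state["head_location"])    # pulled toward the new evidence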

The model automatically identifies sensory failures and sensory noise. Moreover, it can mimic the rubber hand illusion. We are currently endowing the model with neural representations for each modality frame in order to exploit its full potential for planning and goal-directed control.
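
A simple stand-in for such failure detection is an outlier test on the standardized residual between a sensory reading and the model's prediction; the 3-sigma threshold in the sketch below is an assumption, not the criterion used in the MMM.

    def sensor_suspect(reading, predicted_mean, predicted_var, threshold=3.0):
        # Flag a reading whose standardized residual exceeds the threshold;
        # such readings can be down-weighted instead of being integrated.
        z = abs(reading - predicted_mean) / predicted_var ** 0.5
        return z > threshold

    print(sensor_suspect(2.0, 0.5, 0.04))   # True: 7.5 sigma from prediction
    print(sensor_suspect(0.6, 0.5, 0.04))   # False: within normal noise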

Speaker Biography

Martin Butz (University of Tübingen, Computer Science, Cognitive Modeling)