Human body movements are highly complex spatio-temporal patterns, and their control and recognition pose challenging problems for technical as well as neural systems. The talk will present an overview of recent work by our group that exploits biologically inspired, learning-based representations for the recognition and synthesis of body motion.
The first part of the talk will present a neural theory for the visual processing of goal-directed actions, which reproduces, and in part correctly predicts, electrophysiological results from action-selective neurons in monkey cortex. In particular, we show that the same neural circuits might account for the recognition of both natural and abstract action stimuli.
The second part of the talk discusses different techniques for learning structured, online-capable synthesis models for complex body movements. One approach is based on learning kinematic primitives by anechoic demixing and on generating these primitives with networks of canonical dynamical systems; the talk also discusses how a stable overall dynamics can be designed for such nonlinear networks. A minimal sketch of these ingredients is given below.
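The abstract does not fix a concrete formulation, so the following toy sketch (NumPy; all signals, weights, and delay values are made up for illustration) only indicates the two ingredients: an anechoic mixture of kinematic primitives with channel-specific delays, and an Andronov-Hopf oscillator standing in for a canonical dynamical system with a stable limit cycle.

```python
import numpy as np

# Anechoic mixture: each joint-angle trajectory x_i(t) is a weighted sum of
# source primitives s_j shifted by channel-specific delays tau_ij,
# i.e. x_i(t) = sum_j a_ij * s_j(t - tau_ij).  All values below are toy data.
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
sources = np.stack([np.sin(2.0 * t), np.sin(3.0 * t + 0.5)])   # two primitives
weights = np.array([[1.0, 0.3],
                    [0.4, 0.8]])                                # mixing weights a_ij
delays = np.array([[0, 10],
                   [5, 0]])                                     # delays tau_ij (samples)
trajectories = np.array([
    sum(weights[i, j] * np.roll(sources[j], delays[i, j]) for j in range(2))
    for i in range(2)
])                                                              # synthetic joint angles

# Canonical dynamical system: an Andronov-Hopf oscillator whose stable limit
# cycle can regenerate a periodic primitive online; its phase and amplitude
# would drive the learned primitive during synthesis.
def hopf_step(z, dt=0.01, omega=2.0, mu=1.0):
    """One Euler step of the Hopf normal form dz/dt = z*(mu - |z|^2) + i*omega*z."""
    dz = z * (mu - abs(z) ** 2) + 1j * omega * z
    return z + dt * dz

z = 0.1 + 0.0j                        # small initial perturbation
phase = []
for _ in range(5000):
    z = hopf_step(z)
    phase.append(np.angle(z))         # settles onto a unit-amplitude limit cycle
```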
The second approach learns hierarchical models for interactive movements, combining Gaussian Process Latent Variable Models and Gaussian Process Dynamical Models, and results in animations that pass the Turing test of computer graphics; a small sketch of the underlying latent dynamics follows below. The presented work was funded by the DFG and the EC FP7 projects SEARISE, TANGO, and AMARSI.
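A full hierarchical GPLVM/GPDM model is beyond the scope of a short sketch, but the core step of a Gaussian Process Dynamical Model, a GP regression from the current latent state to the next, can be illustrated as follows. The latent trajectory here is toy data and assumed given, whereas a real GPDM infers the latent coordinates and kernel hyperparameters jointly from motion-capture data.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two sets of latent points."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Toy 2-D latent trajectory (a circle); a real GPDM would infer these
# coordinates from motion data rather than assume them.
t = np.linspace(0.0, 4.0 * np.pi, 200)
X = np.stack([np.cos(t), np.sin(t)], axis=1)

X_in, X_out = X[:-1], X[1:]                        # one-step-ahead training pairs
K = rbf_kernel(X_in, X_in) + 1e-4 * np.eye(len(X_in))
alpha = np.linalg.solve(K, X_out)                  # GP posterior mean weights

def predict_next(x):
    """GP mean prediction of the next latent state given the current one."""
    k_star = rbf_kernel(x[None, :], X_in)
    return (k_star @ alpha)[0]

# Roll the learned latent dynamics forward; a separate GP-LVM mapping would
# then decode each latent state into a full-body pose.
x = X[-1].copy()
rollout = [x]
for _ in range(100):
    x = predict_next(x)
    rollout.append(x)
rollout = np.array(rollout)
```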
Martin Giese (Section for Computational Sensomotorics, Dept. for Cognitive Neurology, HIH and CIN, University Clinic Tuebingen, Germany)