Global Object Shape Reconstruction by Fusing Visual and Tactile Data

While many solutions and criteria exist for selecting the best grasp for an object of known shape, grasping an object whose shape is uncertain and observed under noise remains a challenge. In this project, we consider the problem of estimating an object's shape when it is only partially observable. Once a complete shape prediction is available, we can apply grasp-synthesis criteria that require knowledge of the full object shape.
The proposed approach to object shape prediction aims at closing the gaps in the robot's knowledge of the world. Psychological studies suggest that humans are able to predict the portions of a scene that are not visible to them through controlled scene continuation. The expected structure of unobserved object parts is governed by two kinds of knowledge: (i) visual evidence and (ii) completion rules gained through prior visual experience. A particularly strong prior, especially for man-made objects, is symmetry. In [], we showed that by exploiting visibility constraints, we can estimate the pose of the symmetry axis from a single view of the object. However, the quality of this estimate depends on having a sufficiently good viewpoint of the object; otherwise the object's extent in the viewing direction is overestimated.
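The core of symmetry-based shape completion is reflecting the observed points across a hypothesized symmetry plane. The following NumPy sketch (names and plane parameterization are our own, not taken from the publications) illustrates this operation:

```python
import numpy as np

def mirror_points(points, n, d):
    """Reflect 3D points across the plane n . x = d (n a unit normal).

    Hypothetical helper: the mirrored points are the prediction for the
    unobserved back side of a symmetric object.
    """
    n = n / np.linalg.norm(n)
    dist = points @ n - d              # signed distance of each point to the plane
    return points - 2.0 * dist[:, None] * n

# Toy example: partial view on one side of the y-z plane (x = 0).
pts = np.array([[1.0, 0.0, 0.0],
                [2.0, 1.0, -1.0]])
mirrored = mirror_points(pts, np.array([1.0, 0.0, 0.0]), 0.0)
# The union of pts and mirrored forms the completed shape hypothesis.
```

An error in the estimated plane parameters (n, d) directly distorts the predicted back side, which is why the estimate must be refined when further evidence such as contact becomes available.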
In [], we propose to additionally include tactile measurements for estimating the complete object shape under the assumption of symmetry. Specifically, we treat the locations of the contacts between hand and object as additional constraints in the estimation process. The problem is formulated as state estimation, where the state contains all observed object points, the parameters of the symmetry plane, and the bias error between camera and arm. Given the contact points as measurements, we can correct an initial guess of the symmetry plane such that the original points and the points mirrored across the symmetry plane comply with the contact points. This optimization is solved with an Iterated Extended Kalman Filter.
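To make the estimation step concrete, the following is a minimal iterated-EKF measurement update, applied to a deliberately simplified toy version of the problem: the state is reduced to a single symmetry-plane offset d along a known normal, and the measurement is one contact point that should coincide with a mirrored visible point. All names, dimensions, and noise values are illustrative assumptions, not the formulation used in the publication (whose state also contains the observed points and the camera-arm bias).

```python
import numpy as np

def iekf_update(x0, P, z, h, H_jac, R, iters=5):
    """One iterated-EKF measurement update (generic sketch).

    x0: prior state mean, P: prior covariance, z: measurement,
    h: measurement function, H_jac: its Jacobian, R: measurement noise.
    """
    x = x0.copy()
    for _ in range(iters):
        H = H_jac(x)                                   # relinearize at current iterate
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x0 + K @ (z - h(x) - H @ (x0 - x))
    return x, (np.eye(len(x0)) - K @ H) @ P

# Toy instance: refine the plane offset d so that a visible surface point p,
# mirrored across the plane n . x = d, matches the measured contact point z.
n = np.array([1.0, 0.0, 0.0])                          # assumed known normal
p = np.array([1.0, 0.0, 0.0])                          # one visible point
h = lambda x: p - 2.0 * (n @ p - x[0]) * n             # mirrored point as fn of d
H_jac = lambda x: (2.0 * n).reshape(3, 1)              # dh/dd
z = np.array([0.0, 0.0, 0.0])                          # finger touched the origin
d_est, P_post = iekf_update(np.zeros(1), np.eye(1), z, h, H_jac,
                            0.01 * np.eye(3))
# d_est converges near 0.5, the offset that places the mirrored point at z.
```

Because this toy measurement model is linear in d, the iteration converges essentially in one step; the iterated relinearization matters in the full problem, where the plane orientation and camera-arm bias make the measurement function nonlinear.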