In recent years, vision-based high-resolution tactile sensors such as GelSight have been widely used because their rich signal provides useful information about the state of the robot and the environment. However, a major barrier to applying tactile sensors like GelSight is the cost. To make such tactile sensors accessible to a broader community, we propose to build a simulation model for vision-based tactile sensors like GelSight. We explore two modeling methods: a physically-based method that uses rendering technologies to simulate the sensor's optical design, and a data-driven method that uses a small amount of example data to model the sensor quickly. With the help of our simulation model, a robot can collect realistic tactile data in a simulation environment. This will help roboticists test system designs and sensor choices before setting up real robots, and collect large-scale tactile datasets at a much lower cost. The simulation model will also deepen our understanding of the sensor design and help us improve both the sensor and the overall system.
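To give a flavor of the physically-based approach mentioned above, the sketch below renders a synthetic tactile image from a contact height map using simple Lambertian shading under multi-colored directional lights, which loosely imitates GelSight's colored side illumination. This is an illustrative simplification, not the authors' actual rendering pipeline; the function name, light directions, and colors are assumptions for the example.

```python
import numpy as np

def simulate_tactile_image(height_map, light_dirs, light_colors):
    """Render an RGB tactile image from a gel-contact height map via
    Lambertian shading (an illustrative stand-in for full physically-based
    rendering of the sensor's optics)."""
    # Surface gradients of the deformed gel give per-pixel normals.
    gy, gx = np.gradient(height_map)
    normals = np.dstack([-gx, -gy, np.ones_like(height_map)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)

    # Sum contributions from each colored directional light,
    # mimicking the sensor's illumination from different angles.
    image = np.zeros(height_map.shape + (3,))
    for d, c in zip(light_dirs, light_colors):
        d = np.asarray(d, dtype=float)
        d /= np.linalg.norm(d)
        intensity = np.clip(normals @ d, 0.0, None)  # N·L, clamped at 0
        image += intensity[..., None] * np.asarray(c, dtype=float)
    return np.clip(image, 0.0, 1.0)

# Example: a hemispherical indentation lit from three sides with
# red, green, and blue lights (values chosen for illustration only).
y, x = np.mgrid[-1:1:64j, -1:1:64j]
r2 = x**2 + y**2
height = np.where(r2 < 0.5, np.sqrt(np.clip(0.5 - r2, 0.0, None)), 0.0)

lights = [(1, 0, 0.5), (-0.5, 0.87, 0.5), (-0.5, -0.87, 0.5)]
colors = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
tactile_rgb = simulate_tactile_image(height, lights, colors)  # (64, 64, 3)
```

A data-driven alternative, as the abstract notes, would instead fit this height-to-image mapping from a small set of real sensor examples rather than modeling the optics explicitly.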
Wenzhen Yuan (Carnegie Mellon University)
Assistant Professor
Wenzhen Yuan is an assistant professor in the Robotics Institute at Carnegie Mellon University and the director of the CMU RoboTouch Lab. She is a pioneer in high-resolution tactile sensing for robots, and she also works on multi-modal robot perception, soft robots, robot manipulation, and haptics. Yuan received her Master of Science and PhD degrees from MIT and her Bachelor of Engineering degree from Tsinghua University. She also worked as a postdoctoral researcher at Stanford University.