Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-Limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: A Haptic Sensor Powered by Vision and Machine Learning
Human-Robot Interaction

While autonomous robots excel at repetitive tasks in controlled environments, the world in which most people live is messy, constantly changing, and filled with other people. Important opportunities exist for helping people in these unstructured environments, particularly as the population ages, but new approaches are needed for robots to be as successful in homes, clinics, and hospitals as they already are in factories. We are thus working to discover whether and how human-robot interaction can benefit people by designing, building, and evaluating new systems targeted at particular user populations.
To facilitate study of a variety of forms of human-robot interaction, we constructed what we call the Robot Interaction Studio in one of our laboratories. It uses a commercial markerless motion-capture system to estimate the pose of the human user in the room in real time, so that the robot being tested can act on this perceptual data. We were particularly curious about the potential benefits of allowing a robot to respond to the user's movements continually, rather than only at the end of an interaction. Though it is more complex to implement, this kind of continual responsiveness greatly benefits the interaction and deserves further exploration.
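To make the continual-response idea concrete, here is a minimal sketch of a perception-action loop in which the robot re-plans at a fixed rate from the latest pose estimate instead of waiting for the interaction to end. All names (read_user_pose, command_robot) and values are hypothetical placeholders, not the actual Robot Interaction Studio software.

```python
import time
from dataclasses import dataclass


@dataclass
class Pose:
    """Simplified user pose: 3D position of one tracked joint (meters)."""
    x: float
    y: float
    z: float


def read_user_pose() -> Pose:
    """Placeholder for the markerless motion-capture stream.

    A real system would query the capture SDK for the newest estimate.
    """
    return Pose(1.0, 0.5, 1.2)  # static dummy value for this sketch


def command_robot(target: Pose) -> None:
    """Placeholder for sending a motion command to the robot."""
    print(f"tracking user at ({target.x:.2f}, {target.y:.2f}, {target.z:.2f})")


def interaction_loop(rate_hz: float = 30.0, duration_s: float = 1.0) -> None:
    """Continual feedback: act on each new pose estimate at a fixed rate,
    rather than responding only once the interaction is over."""
    period = 1.0 / rate_hz
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        pose = read_user_pose()  # latest estimate from the capture system
        command_robot(pose)      # robot acts on it immediately
        time.sleep(period)


if __name__ == "__main__":
    interaction_loop()
```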
Given the importance of the sense of touch in everyday life, we are particularly interested in robots that physically interact with both objects and people to accomplish useful tasks, taking advantage of novel strategies to detect and understand contact. Sometimes the accelerometers built into a robot's grippers are sufficient for detecting impacts, but many commercial robots lack these sensors and have no sensitive skin on their rigid links. Our work in this domain thus often combines the invention of practical new tactile sensors, which enable a robot to feel contact from the humans with whom it interacts, with extensive human-subject studies that investigate how touch-enabled robot reactions affect the interaction.
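As an illustration of accelerometer-based impact detection, the following is a minimal sketch assuming a single-axis accelerometer sampled at a fixed rate; the function name, filter parameters, and threshold are illustrative choices, not the method used on any particular robot. It high-pass filters the signal to remove gravity and slow arm motion, then flags samples whose filtered magnitude exceeds a threshold.

```python
import numpy as np


def detect_impacts(accel: np.ndarray, fs: float,
                   cutoff_hz: float = 50.0, threshold: float = 2.0) -> np.ndarray:
    """Return sample indices of likely contact transients.

    accel: raw single-axis accelerometer signal in m/s^2
    fs: sampling rate in Hz
    """
    # First-order high-pass filter removes the gravity offset and slow arm
    # motion, keeping the sharp vibrations characteristic of impacts.
    dt = 1.0 / fs
    rc = 1.0 / (2.0 * np.pi * cutoff_hz)
    alpha = rc / (rc + dt)
    hp = np.zeros_like(accel, dtype=float)
    for n in range(1, len(accel)):
        hp[n] = alpha * (hp[n - 1] + accel[n] - accel[n - 1])
    # Flag samples whose filtered magnitude exceeds the threshold (m/s^2).
    return np.flatnonzero(np.abs(hp) > threshold)


if __name__ == "__main__":
    # Example: gravity plus gentle sway, with one sharp transient at 0.5 s.
    fs = 1000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    signal = 9.81 + 0.05 * np.sin(2 * np.pi * 2 * t)
    signal[500:505] += np.array([8.0, -6.0, 4.0, -2.0, 1.0])  # impact burst
    print("impact near samples:", detect_impacts(signal, fs)[:5])
```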
Many physical interactions that transpire between humans, such as object handovers and hugs, have strong social dynamics that have been carefully studied. Robots that skillfully take part in such interactions may be able to work more effectively with and around humans. We are interested in how such robots should behave and how people react to them in different scenarios. To this end, we are continuing one final investigation with HuggieBot, our previously developed hugging robot.