Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-Limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
How Should Robots Hug?
Hugs are one of the first forms of contact and affection that humans experience. Not only are hugs a common way to provide comfort, support, or affection, but they have also been shown to have significant health benefits. Hugs can lower blood pressure, increase levels of oxytocin (a hormone associated with bonding and well-being), lower levels of cortisol (the body's primary stress hormone), and strengthen the immune system.
As roboticists who study human interaction, we are interested in creating robots that can hug humans as seamlessly as people hug each other. The purpose of this first HuggieBot project was to evaluate human responses to different physical characteristics and hugging behaviors exhibited by a hugging robot. Specifically, we aimed to test the hypothesis that a soft, warm, touch-sensitive PR2 humanoid robot can provide humans with satisfying hugs by matching both their hugging pressure and their hugging duration [].
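In control terms, "matching" a hug can be read as closing the arms until the sensed contact pressure reaches the desired level and opening them again as soon as the user signals that the hug should end. The Python sketch below is a minimal illustration of that idea; the sensor and actuator functions (read_contact_pressure, command_arm_closure, user_pulling_away) are hypothetical placeholders, not the PR2's actual interface or the controller used in this study.

import time

# Hypothetical hug-control sketch: close the arms until the sensed contact
# pressure reaches a target level, hold, and release as soon as the user
# starts to pull away or a maximum duration passes. All hardware functions
# here are illustrative placeholders, not the robot's real interface.
def perform_hug(read_contact_pressure, command_arm_closure, user_pulling_away,
                target_pressure=0.5, max_duration_s=15.0, period_s=0.02):
    closure = 0.0  # 0 = arms open, 1 = arms fully closed
    start = time.monotonic()
    while time.monotonic() - start < max_duration_s:
        if user_pulling_away():  # end the hug when the user is ready
            break
        error = target_pressure - read_contact_pressure()
        closure = min(1.0, max(0.0, closure + 0.05 * error))  # small proportional step
        command_arm_closure(closure)
        time.sleep(period_s)
    command_arm_closure(0.0)  # release immediately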
After a brief introduction and opening survey, thirty relatively young participants with mostly technical backgrounds experienced and evaluated twelve hugs with the robot. Each person started the study with three randomly ordered trials that focused on physical robot characteristics, wherein the robot was Hard-Cold, Soft-Cold, or Soft-Warm. The study then proceeded to nine randomly ordered behavioral trials with all combinations of low, medium, and high hug pressure and low, medium, and high hug duration.
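For concreteness, the sketch below shows one way to generate a participant's randomized trial order for this design: a shuffled block of the three physical-characteristic conditions followed by a shuffled block of all nine pressure-duration combinations. The condition names and the make_trial_order function are illustrative assumptions, not the study's actual code.

import itertools
import random

# Illustrative trial-order generator for the study design described above;
# condition names and structure are assumptions, not the study's actual code.
PHYSICAL_CONDITIONS = ["Hard-Cold", "Soft-Cold", "Soft-Warm"]
PRESSURE_LEVELS = ["low", "medium", "high"]
DURATION_LEVELS = ["low", "medium", "high"]

def make_trial_order(seed=None):
    """Return one participant's 12 trials: a shuffled block of the three
    physical conditions, then a shuffled block of all nine
    pressure-duration combinations."""
    rng = random.Random(seed)
    physical_trials = list(PHYSICAL_CONDITIONS)
    rng.shuffle(physical_trials)
    behavioral_trials = [
        {"pressure": p, "duration": d}
        for p, d in itertools.product(PRESSURE_LEVELS, DURATION_LEVELS)
    ]
    rng.shuffle(behavioral_trials)
    return physical_trials + behavioral_trials

for i, trial in enumerate(make_trial_order(seed=1), start=1):
    print(f"Trial {i}: {trial}")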
Analysis of the results showed that participants significantly preferred soft, warm hugs over hard, cold hugs []. Furthermore, users preferred hugs that physically squeeze them and then release as soon as they are ready for the hug to end. Taking part in the experiment also significantly improved participants' opinions of robots and robot use.
Members
Publications