Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
HuggieBot: Evolution of an Interactive Hugging Robot with Visual and Haptic Perception

Hugs are complex interactions that must adapt to the height, body shape, actions, and preferences of the hugging partner. Because hugs are known to greatly benefit humans, we created a series of hugging robots that use visual and haptic perception to provide enjoyable interactive hugs. Each improved version of HuggieBot was evaluated by measuring how users emotionally and behaviorally responded to hugging it.
Building on research both within and outside of human-robot interaction, this project proposed eleven guidelines for natural and enjoyable robotic hugging; following them is essential to delivering high-quality robot hugs []. We present these guidelines for designers to follow when creating new hugging robots, with the goal of enhancing user acceptance.
In our initial work with HuggieBot 1.0 [], we evaluated user responses to different robot physical characteristics and hugging behaviors. We then iteratively created three versions of an entirely new robotic platform, referred to as HuggieBot 2.0 [], 3.0 [], and 4.0 []. To enable perceptive and pleasing autonomous robot behavior, we investigated robot responses to four human intra-hug gestures: holding, rubbing, patting, and squeezing. We developed a real-time perceptual algorithm that detects and classifies these user actions with 88% accuracy, using microphone and pressure-sensor data collected from the robot's inflatable sensing torso, which we named HuggieChest []. We also created a probabilistic behavior algorithm that chooses natural robot responses in real time; a rough sketch of both components appears below.
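The paragraph above describes two algorithmic components: a gesture classifier driven by the HuggieChest's microphone and pressure signals, and a probabilistic policy for selecting robot responses. The following minimal Python sketch illustrates how such a pipeline could fit together; the features, classifier interface, and response probabilities are all illustrative assumptions, not the published HuggieBot implementation.

import numpy as np

GESTURES = ["hold", "rub", "pat", "squeeze"]  # the four intra-hug gestures

def extract_features(pressure: np.ndarray, audio: np.ndarray) -> np.ndarray:
    """Summarize one short sensing window from the inflatable chest.
    pressure: air-pressure samples over the window
    audio:    microphone samples over the same window
    (Hypothetical features, chosen only for illustration.)
    """
    return np.array([
        pressure.mean(),                  # sustained contact level (hold/squeeze)
        pressure.std(),                   # slow modulation (rubbing)
        np.abs(np.diff(pressure)).max(),  # sharp transients (patting)
        np.sqrt(np.mean(audio ** 2)),     # audio RMS energy
    ])

class GestureClassifier:
    """Wraps a classifier trained offline on labeled hug data
    (the real system reports 88% classification accuracy)."""

    def __init__(self, model):
        self.model = model  # e.g., any estimator with a predict() method

    def predict(self, pressure: np.ndarray, audio: np.ndarray) -> str:
        feats = extract_features(pressure, audio).reshape(1, -1)
        return GESTURES[int(self.model.predict(feats)[0])]

# Illustrative response policy, P(robot response | detected user gesture).
# These probabilities are invented for the sketch.
RESPONSE_POLICY = {
    "hold":    {"hold": 0.7, "rub": 0.2, "pat": 0.1},
    "rub":     {"rub": 0.6, "hold": 0.3, "squeeze": 0.1},
    "pat":     {"pat": 0.6, "hold": 0.3, "rub": 0.1},
    "squeeze": {"squeeze": 0.7, "hold": 0.3},
}

def choose_response(gesture: str, rng: np.random.Generator) -> str:
    """Sample a robot reply so repeated hugs do not feel scripted."""
    actions, probs = zip(*RESPONSE_POLICY[gesture].items())
    return str(rng.choice(list(actions), p=list(probs)))

Sampling the reply from a conditional distribution, rather than always executing the single most likely response, is one simple way to keep the robot's behavior varied and natural-feeling, in the spirit of the probabilistic behavior algorithm described above.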
The results of our five user studies validated our eleven hugging guidelines and informed the iterative design of HuggieBot []. Users enjoy robot softness, robot warmth, and being physically squeezed by the robot. They dislike being released too soon from a hug and equally dislike being held by the robot for too long. Adding haptic reactivity definitively improves user perception of a hugging robot; the robot's responses and proactive intra-hug gestures were greatly enjoyed. In our final study, we learned that HuggieBot can positively affect users on a physiological level and that hugging it can be comparable to hugging a person. In all of our studies, participants consistently reported more favorable opinions of hugging robots after prolonged interaction with HuggieBot.
Members
Publications