Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
Haptic Actuation

Haptic interfaces are mechatronic systems that modulate the physical interaction between a human and their tangible surroundings so that the human can act on and feel a virtual and/or remote environment. How can such systems vividly reproduce the perceptual experience of touching real objects? And how can they provide feedback that helps the user improve their motor skills? We seek to answer these questions by carefully studying existing technologies and inventing new haptic interfaces.
Since the start of the field of haptics in the early 1990s, three distinct archetypal categories of haptic interface have emerged: grounded kinesthetic haptic interfaces, ungrounded haptic interfaces, and surface haptic interfaces. Although they differ in key ways, all three function in the same overall manner: the interface's mechanical, electrical, and computational elements work together to monitor and modify the user's physical interaction with their tangible surroundings.
We do research on all three categories of haptic interfaces. In the relatively well-established area of grounded force-feedback devices, we mainly seek to understand, share, and benchmark the diversity of past designs to accelerate haptic device innovation and enable standardized performance comparisons. We have also invented a high-performance motor design for grounded devices.
In the newer area of ungrounded haptic interfaces, we aim to expand what is possible by inventing, refining, and carefully evaluating new devices. Such work often includes hardware design, actuator and sensor selection, calibration, control optimization, application design, system integration, and human studies. In recent years, we have explored the unusual approach of shape-changing haptic feedback, finding that it may be particularly well suited to spatial guidance tasks.
Most recently, we have started working closely with Christoph Keplinger and members of his Robotic Materials Department to adapt their HASEL artificial muscles for use as haptic actuators, i.e., to deliver broad-bandwidth haptic cues to human skin. As shown on a subsequent page, we have created HASEL-based wearable fingertip devices and shown how to evaluate their haptic outputs with a commercial fingertip sensor. Furthermore, we collaborated to create the new category of cutaneous electrohydraulic (CUTE) devices, which achieve high stroke, high force, and high bandwidth on hairy skin. We anticipate many exciting developments in this domain in the coming years!