Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Understanding the Perception of Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
Halbach-Ring Motor Design
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-Limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: A Haptic Sensor Powered by Vision and Machine Learning
Research Overview

Touch is far less understood than vision or hearing, partly because what you feel depends so strongly on how you move; moreover, engineered haptic sensors, actuators, and algorithms typically struggle to match human capabilities. Consequently, today's computers can show beautiful images and play clear sounds, but they don't let you physically touch digital items. Similarly, most robots are surprisingly unskilled at physically interacting with real objects and with people. The Haptic Intelligence (HI) Department works to expand our understanding of touch and movement while simultaneously inventing new technical capabilities for interactive systems involving humans, computers, and robots.
When you touch objects in your surroundings, you can discern each item's physical properties from the rich array of haptic cues you experience, including both the tactile sensations in your skin and the kinesthetic cues in your muscles and joints. For example, picking up a glass of sparkling water refines your visual estimates of the glass's location, size, shape, and weight while also making you rapidly aware of the temperature, stiffness, smoothness, and friction of its surfaces. Feeling how these haptic sensations develop in response to your motions enables you to not only perceive the glass's properties but also manipulate it fluidly, whether your goal is to hand it to a friend, bring it to your own lips to drink, or rotate it under a flow of hot water as your other hand scrubs it clean.
Over the course of life, humans leverage their rich sense of touch to master a wide variety of physical tasks, from everyday necessities like zipping a jacket to difficult feats such as inserting a needle into a patient's vein. Many tasks are challenging when first tried, but with practice they usually become almost automatic. You can gain some appreciation for the complexity of tasks that normally feel effortless, such as slicing bread or brushing your teeth, by trying to complete them with your non-dominant hand. Similarly, even the simplest manual skills become almost impossible if you lose your tactile sensitivity due to a local anesthetic or a lack of blood flow. The crucial role of the sense of touch is also deeply appreciated by researchers working to create autonomous robots that can competently manipulate everyday objects and safely interact with humans in unstructured environments. Such systems rarely take advantage of haptic cues and thus often struggle to match the perception, manipulation, and interaction capabilities of humans.
Although humans experience this sense coherently, touch stems from a wide range of distributed mechanical, thermal, and pain receptors. Each type of mechanoreceptor responds most strongly to a particular category of stimulation, with overlapping characteristic frequency ranges that span from steady state all the way up to about 1000 Hz. While vision has high spatial acuity and only moderate temporal acuity, hearing shows the opposite pattern; different aspects of haptic perception lie along this spatiotemporal continuum. Given this broad sensory bandwidth, as well as the technical challenge of creating expressive, robust, low-profile haptic sensors and actuators, few computer and machine interfaces provide the human operator with high-fidelity touch feedback or carefully analyze the physical signals generated during interactions, which limits their utility.
Exploring the world through touch requires action, and what one feels greatly depends on how one moves. Thus, our research often centers on a tight closed loop between perception and action. Motivated by great potential benefits to society, the interdisciplinary researchers of the HI Department work together to advance our understanding of touch and movement from the perspectives of both humans and robots. Our diverse team explores myriad aspects of haptic intelligent systems, frequently collaborating with scientists from outside HI.
We are presently pursuing projects in six main research fields: natural embodied touch, artificial touch sensing and processing, haptic actuation, immersive teleoperation, human-robot interaction, and human movement. We relish working on this broad range of topics because these fields build on one another in sometimes unexpected ways. For example, insights from our research on natural embodied touch can suggest new approaches to artificial touch sensing and can also inspire new haptic actuation methods. Similarly, the kinds of haptic data that need to be sensed, processed, and actuated for the human operator of a telerobotic system are exactly the kinds of haptic data from which autonomous robots stand to benefit most, whether deployed for physical human-robot interaction or autonomous manipulation tasks. We are fascinated by the movements humans make and see their sensing and analysis as another common thread running through our research, especially for immersive teleoperation and human-robot interaction. Each field we investigate uncovers a different facet of the deeper topic of haptic intelligence, enabling greater involvement of touch and movement in future intelligent systems.