Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact During Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-Limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: A Haptic Sensor Powered by Vision and Machine Learning
Human Movement

The HI Department's study of natural embodied touch, artificial touch sensing and processing, haptic actuation, immersive teleoperation, and human-robot interaction repeatedly brings our researchers into contact with the fundamental topic of human movement. Over the years, our collaborations with researchers in neurology, biomechanics, computer vision, and human factors have stoked our fascination with this research field in its own right, even without the inclusion of haptic sensing or haptic feedback. Consequently, a number of our projects now center on human movement itself.
Sometimes our work aims to improve sensing and reconstruction of human movement. For example, the sign-language-focused doctoral research of Paola Forte is co-advised by Michael Black and involves collaborations with several other members of his Perceiving Systems Department, bridging our knowledge of biomedical engineering with their world-class expertise in computer vision and human pose reconstruction. The results achieved in this project reinforce our commitment to collaborating across disciplinary and departmental boundaries.
Another thrust has centered on combining human motion tracking with wearable haptic actuators to shape human movement in real time. Katherine's 2009 grant from the US National Science Foundation laid out a framework for our approach, which we call tactile motion guidance: what the user feels on their skin is a direct function of the movement they make, so that digital guidance becomes tangible. This topic has particular resonance with the newly founded Center for Bionic Intelligence Tübingen Stuttgart (BITS), so we hope to develop it further in the coming years.
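To make this error-to-cue mapping concrete, below is a minimal sketch assuming a single tracked joint angle and one vibrotactile actuator; the function name, gain, and deadband values are illustrative choices for this example, not the parameters of any deployed system.

```python
import numpy as np

def guidance_intensity(measured_angle, target_angle, gain=2.0, deadband=0.05):
    """Map instantaneous movement error (rad) to a vibrotactile drive level in [0, 1].

    The actuator stays silent inside a small deadband around the target, so
    the user feels nothing while their movement is acceptable; outside it,
    the cue grows with the error until it saturates at full intensity.
    """
    error = abs(measured_angle - target_angle)
    if error < deadband:
        return 0.0  # movement is on target: no cue
    return float(np.clip(gain * (error - deadband), 0.0, 1.0))

# Example: the user's joint is 0.3 rad away from the target posture.
print(guidance_intensity(measured_angle=1.3, target_angle=1.0))  # 0.5
```

In a real wearable system, a mapping like this would run at the actuator's update rate and could drive several tactors at once to convey the direction of the needed correction as well as its magnitude.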
Finally, our attention has been drawn to the dynamics of teams of people collaborating on complex tasks, so we have created a comprehensive instrumentation system for such scenarios. Analyzing human positions, orientations, speech utterances, breathing, and heart rate over time promises to enable new quantitative insights about human and team behavior. In the long term, we want to recognize collaboration issues as they unfold and provide digital guidance to help get the team back on track, just as we use haptic guidance to help individuals adjust their movements toward better patterns over time.
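As one small illustration of what such analysis involves, the streams listed above are recorded at very different rates and must be placed on a common timebase before any joint statistics can be computed. The sketch below shows this alignment step using linear interpolation; the stream names, rates, and values are hypothetical stand-ins, not data from our actual recording pipeline.

```python
import numpy as np

def resample_stream(timestamps, values, common_time):
    """Linearly interpolate one sensor stream onto a shared timebase."""
    return np.interp(common_time, timestamps, values)

# Hypothetical streams captured at different rates (times in seconds).
pos_t, pos_x = np.array([0.0, 0.1, 0.2, 0.3]), np.array([0.0, 0.4, 0.9, 1.1])
hr_t, hr_bpm = np.array([0.0, 0.25]), np.array([72.0, 74.0])

common_time = np.arange(0.0, 0.3, 0.05)  # shared 20 Hz clock
aligned = {
    "position": resample_stream(pos_t, pos_x, common_time),
    "heart_rate": resample_stream(hr_t, hr_bpm, common_time),
}
```

Once every stream shares a clock, cross-signal measures such as correlations between movement and physiological state become straightforward to compute.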