AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly

Teleoperation enables humans to perform delicate or dangerous tasks by controlling precise and powerful machines from a safe distance. However, because teleoperation relies primarily on visual feedback, operators may struggle to perceive stiff contacts, and poor visibility further compromises situational awareness, complicating the task. To bridge this gap, we propose that intuitive, reliable, economical, and easy-to-implement naturalistic vibrotactile feedback could improve telerobotic interfaces in domains such as surgery and construction.
In a collaborative project funded by IntCDC, we explored how to deliver naturalistic vibrotactile feedback from a robot's end-effectors to the hand of an operator performing telerobotic assembly tasks, and we investigated the effects of such haptic cues. We first engineered AiroTouch, a naturalistic vibrotactile feedback system that measures the vibrations experienced by each robot tool with a high-bandwidth three-axis accelerometer and enables the user to feel those vibrations in real time through voice-coil actuators [].
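To illustrate the signal path described above, the sketch below shows how a block of three-axis tool acceleration could be reduced to a single vibration signal and streamed to a voice-coil actuator in real time. The block size, gain, axis-selection method, and hardware I/O functions (read_accelerometer_block, write_actuator_block) are assumptions for illustration only, not the actual AiroTouch implementation.

```python
import numpy as np

BLOCK = 64     # assumed samples per processing block
GAIN = 1.0     # assumed scaling from acceleration to actuator command

def reduce_3d_to_1d(block_xyz):
    """Collapse three acceleration axes (shape (BLOCK, 3)) into one signal.
    Here we keep the axis with the most energy in this block; other
    reduction methods (e.g., spectral combination) could be substituted."""
    energies = np.sum(block_xyz ** 2, axis=0)
    return block_xyz[:, np.argmax(energies)]

def feedback_loop(read_accelerometer_block, write_actuator_block):
    """Stream tool vibrations to the operator's hand in real time."""
    while True:
        xyz = read_accelerometer_block(BLOCK)   # (BLOCK, 3) accelerations in m/s^2
        vib = reduce_3d_to_1d(xyz)              # single-axis vibration signal
        vib = vib - np.mean(vib)                # remove DC offset (gravity, drift)
        drive = np.clip(GAIN * vib, -1.0, 1.0)  # normalized voice-coil command
        write_actuator_block(drive)             # render through the actuator
```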
Then we evaluated AiroTouch and explored the effects of the naturalistic vibrotactile feedback it delivers in three user studies. The first study involved a small-scale telerobotic assembly task performed with a da Vinci Si surgical robot. Results from this study show that naturalistic vibrotactile feedback increases the realism of the interaction and reduces the perceived task duration, task difficulty, and fatigue []. These results demonstrated the potential for AiroTouch to be integrated into other telerobotic applications, such as those in unstructured construction environments. Consequently, we validated the wireless version of AiroTouch during the on-site large-scale assembly of the timber building LivMatS Biomimetic Shell in Freiburg with a mini-crane construction robot. Qualitative analysis of this study indicates that this type of feedback enhances participants' awareness of both robot motion and contact between the robot and other objects, particularly in scenarios with limited visibility []. Finally, we evaluated its effects during live teleoperation of the mini-crane in large-scale assembly-related tasks []. Our results indicate that naturalistic vibrotactile feedback increases participants' confidence when controlling the robot. Moreover, there is a noticeable trend toward reduced vibration magnitude in the conditions where haptic feedback is provided.
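For context on the vibration-magnitude comparison, one simple way to quantify the vibration experienced in a trial is the root-mean-square (RMS) of the mean-removed acceleration magnitude. The sketch below illustrates this metric on hypothetical logged data; it is not necessarily the measure used in the study.

```python
import numpy as np

def trial_vibration_rms(accel_xyz):
    """RMS vibration magnitude for one trial.
    accel_xyz: array of shape (num_samples, 3) of tool accelerations in m/s^2."""
    centered = accel_xyz - accel_xyz.mean(axis=0)    # crude high-pass: remove gravity/offset
    magnitude = np.linalg.norm(centered, axis=1)     # per-sample acceleration magnitude
    return float(np.sqrt(np.mean(magnitude ** 2)))   # RMS over the whole trial

# Hypothetical usage: compare conditions by averaging per-trial RMS values.
# rms_feedback    = [trial_vibration_rms(a) for a in trials_with_feedback]
# rms_no_feedback = [trial_vibration_rms(a) for a in trials_without_feedback]
```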
The primary contribution of this work is to identify the characteristics that are essential for the effective implementation of naturalistic vibrotactile feedback. These findings lay the foundation for further exploration of the potential benefits of incorporating haptic cues to enhance user experience during teleoperation.
This research project involves collaborations with Anja Patricia Regina Lauer (University of Stuttgart) and Oliver Sawodny (University of Stuttgart).
Members
Publications