Shape-Changing Haptic Interfaces

Haptic devices are a promising alternative to visual feedback, which demands the user's gaze and is inaccessible to vision-impaired people, and to audio feedback, which can mask or be masked by environmental sounds and is inaccessible to hearing-impaired people. Shape-changing devices are a subset of haptic interfaces that provide tangible feedback by physically transforming their shape; this transformation is readily perceived by the body part contacting the device.
We are exploring how such devices can be used to guide human motion, such as rotation and translation. Shape-changing feedback can be particularly intuitive for this task because the device itself can rotate and translate to different poses in the user's hand. We created three shape-changing devices that assist people with real-world tasks: the S-BAN (Shape-Based-Assistance for Navigation) provides navigation guidance, and Drangle and Brangle guide users in construction tasks.
The S-BAN is a handheld haptic device that can pivot left/right and extend/retract its body [], opening up possibilities and questions around representing spatial data through touch. To date, we have tested the S-BAN's feedback via perceptual studies and embodied navigation tasks in virtual reality, comparing user performance with the S-BAN to other navigation modalities. Users were most sensitive to guidance cues in the cardinal directions; compared to vision-based guidance from a smartphone proxy, the S-BAN yielded equivalent navigation efficiency, slower navigation times, and a more elevated gaze.
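One simple way to picture how a pivoting, extending body could encode a navigation target is to map the target's bearing to the pivot angle and its distance to the body extension. The sketch below illustrates this idea only; the limits, gains, and function names are our own assumptions, not the S-BAN's actual control law or specifications.

```python
import math

# Illustrative limits; not the real device's specifications.
PIVOT_LIMIT_DEG = 30.0     # assumed maximum left/right pivot
EXTENSION_LIMIT_MM = 20.0  # assumed maximum body extension
NEAR_DISTANCE_M = 1.0      # assumed distance where extension bottoms out
FAR_DISTANCE_M = 10.0      # assumed distance where extension saturates

def sban_command(target_x_m: float, target_y_m: float) -> tuple[float, float]:
    """Map a target position (x right, y forward, in the user's frame)
    to a hypothetical (pivot_deg, extension_mm) command pair."""
    # Bearing to the target, measured from straight ahead; clamp to the
    # pivot range so the device never commands an unreachable pose.
    bearing_deg = math.degrees(math.atan2(target_x_m, target_y_m))
    pivot_deg = max(-PIVOT_LIMIT_DEG, min(PIVOT_LIMIT_DEG, bearing_deg))
    # Extend proportionally to distance between the near and far bounds.
    distance_m = math.hypot(target_x_m, target_y_m)
    frac = (distance_m - NEAR_DISTANCE_M) / (FAR_DISTANCE_M - NEAR_DISTANCE_M)
    extension_mm = EXTENSION_LIMIT_MM * max(0.0, min(1.0, frac))
    return pivot_deg, extension_mm
```

Under this mapping, a distant waypoint straight ahead produces a fully extended, centered body, while a nearby waypoint to the side produces a short, pivoted pose.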
Drangle helps a user orient a drill when drilling angled holes, and Brangle guides a user to place bricks in a curvilinear arrangement []. Both devices use graded bidirectional edge-changing cues to guide the user to the target orientation. In a user study, participants understood the shape-changing feedback and enjoyed using both devices. Users strongly preferred Drangle over a mechanical drill guide, and they found Drangle's fingertip feedback more intuitive than Brangle's palmar cues. Future work includes integrating such devices into systems that can improve construction workflows relevant to IntCDC.
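A graded bidirectional cue of the kind Drangle and Brangle use can be thought of as a signed, saturating function of the orientation error: the sign tells the user which way to rotate, and the magnitude grows with the error. The sketch below is a minimal illustration of that idea; the thresholds and names are assumptions, not the devices' actual cue laws.

```python
def graded_edge_cue(angle_error_deg: float,
                    max_error_deg: float = 15.0,
                    max_edge_mm: float = 5.0) -> float:
    """Map a signed orientation error to a hypothetical edge
    displacement: sign encodes the correction direction, and the
    magnitude grows linearly with error until it saturates.
    max_error_deg and max_edge_mm are illustrative values."""
    frac = max(-1.0, min(1.0, angle_error_deg / max_error_deg))
    return max_edge_mm * frac
```

With these assumed parameters, a perfectly aligned tool produces a flat edge, small errors produce proportionally small edge displacements, and errors beyond 15 degrees saturate the cue.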
This research project involves collaborations with Tiffany Cheng (University of Stuttgart), Achim Menges (University of Stuttgart), Yasaman Tahouni (University of Stuttgart), and Dylan Wood (University of Stuttgart).