Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
Multimodal Sensing Reveals the Dynamics of Time-Constrained Teamwork

Effective teamwork is crucial in high-stakes environments such as surgical operating rooms and emergency response scenarios, where rapid team coordination and decision-making directly impact outcomes. This project integrates and analyzes multimodal data to uncover how team dynamics, leadership emergence, and diversity-driven interactions contribute to overall performance.
Our approach employs multimodal sensing technologies to capture a broad range of individual and team behaviors during time-sensitive tasks, spanning psychological (stress, workload, emotional contagion), communicational (voice activity, turn-taking), and spatial (physical activity, coordination) dimensions. A key element is the integration of diverse data streams: physiological metrics such as heart rate variability, audio recordings for speech analysis, and tracking of individuals' positions within the environment through an Error-State Extended Kalman Filter (ES-EKF). This filter fuses ultra-wideband (UWB) positioning with inertial measurement unit (IMU) data to address common challenges such as electromagnetic occlusion and reduced accuracy in multi-participant setups. By combining these sources, the ES-EKF significantly improves both sampling rate and precision while preserving the cost-effectiveness and portability of UWB systems.
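To make the fusion step concrete, here is a minimal planar sketch of an error-state EKF in Python: high-rate IMU accelerations propagate a nominal state (position, velocity, accelerometer bias), while lower-rate UWB position fixes correct the accumulated error. The class name, the 2-D simplification, the position-fix measurement model, and all noise parameters and rates are illustrative assumptions, not the project's actual implementation.

```python
# Illustrative error-state EKF fusing IMU and UWB data (not the project's code).
# Nominal state: position p, velocity v, accelerometer bias b (all 2-D).
# Error state:   dx = [dp, dv, db], a 6-vector with covariance P.
import numpy as np

class ErrorStateEKF2D:
    def __init__(self, accel_noise=0.1, bias_walk=0.01, uwb_noise=0.15):
        self.p = np.zeros(2)                  # nominal position [m]
        self.v = np.zeros(2)                  # nominal velocity [m/s]
        self.b = np.zeros(2)                  # nominal accel bias [m/s^2]
        self.P = np.eye(6) * 0.1              # error-state covariance
        self.sig_a = accel_noise              # accel white noise [m/s^2]
        self.sig_b = bias_walk                # bias random walk [m/s^2/sqrt(s)]
        self.R = np.eye(2) * uwb_noise**2     # UWB position noise [m^2]

    def predict(self, a_meas, dt):
        """Propagate the nominal state with one bias-corrected IMU sample (high rate)."""
        a = a_meas - self.b
        self.p = self.p + self.v * dt + 0.5 * a * dt**2
        self.v = self.v + a * dt
        # Error dynamics: dp' = dv, dv' = -db + noise, db' = random walk
        I2 = np.eye(2)
        F = np.eye(6)
        F[0:2, 2:4] = I2 * dt
        F[2:4, 4:6] = -I2 * dt
        Q = np.zeros((6, 6))
        Q[2:4, 2:4] = I2 * (self.sig_a * dt) ** 2
        Q[4:6, 4:6] = I2 * self.sig_b**2 * dt
        self.P = F @ self.P @ F.T + Q

    def update_uwb(self, z):
        """Correct with one UWB position fix z (lower rate), inject, and reset."""
        H = np.zeros((2, 6))
        H[0:2, 0:2] = np.eye(2)               # measurement observes position error only
        y = z - self.p                        # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        dx = K @ y
        self.p += dx[0:2]                     # inject estimated errors into nominal state
        self.v += dx[2:4]
        self.b += dx[4:6]
        self.P = (np.eye(6) - K @ H) @ self.P # covariance reset (error mean back to zero)

if __name__ == "__main__":
    # Toy run: a walker with constant velocity, 100 Hz IMU, 10 Hz UWB fixes.
    rng = np.random.default_rng(0)
    ekf, dt = ErrorStateEKF2D(), 0.01
    true_p, true_v = np.zeros(2), np.array([1.0, 0.5])
    for k in range(500):
        true_p = true_p + true_v * dt
        a_meas = np.array([0.2, -0.1]) + rng.normal(0.0, 0.1, 2)  # bias + noise
        ekf.predict(a_meas, dt)
        if k % 10 == 0:
            ekf.update_uwb(true_p + rng.normal(0.0, 0.15, 2))
    print("final position error [m]:", np.linalg.norm(ekf.p - true_p))
```

The error-state formulation keeps the nominal-state propagation simple and numerically well-behaved: the filter linearizes only the small error dynamics, injects the estimated error after each UWB update, and then resets it to zero.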
Funded by a 2024 Grassroots grant, this interdisciplinary effort combines expertise from the Haptic Intelligence Department, which contributes advanced sensing and computational analysis, and the Organizational Leadership & Diversity Group, which studies leadership emergence and the effects of diversity on team collaboration. By engaging diverse participants in a portable escape-game scenario at our institute in Stuttgart, we investigate how variations in factors such as gender, age, and educational experience influence communication patterns and group coordination. Beyond this controlled setting, we extend our research to real surgical environments, analyzing team interactions under authentic high-stakes conditions. Comparing simulated and real-world scenarios enables us to develop insights that enhance team performance across domains.
By identifying the key drivers of effective collaboration, this project aims to inform the design of enhanced training protocols, real-time feedback systems, and collaborative technologies for high-stakes environments.
This project is a collaboration with Mathieu Chollet (University of Glasgow).
Members
Publications