Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
Perceptual Integration of Contact Force Components During Fingertip Sliding
Humans need to accurately process the contact forces that arise as they perform everyday haptic interactions such as sliding the fingers along a surface to feel for bumps, sticky regions, or other features. Several different mechanisms are possible for how the forces on the skin could be represented and integrated in such interactions. Forces on the finger can also be of several types: normal, frictional, and kinesthetic. Each force type targets specific afferents within the fingerpad, eliciting different neural codes. These applied forces can result either from a conscious decision (often the case for the normal force) or from the bilateral interaction with the touched object or surface (typically the case for friction, which depends on both the material properties and the touch characteristics).
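Two quantities recur in this line of work: the amplitude of the full force vector on the fingerpad and the coefficient of dynamic friction during sliding. A minimal sketch of how they are computed from the normal and tangential force components (the numerical values here are hypothetical, not measurements from the study):

```python
import numpy as np

# Hypothetical contact-force sample on the fingerpad (units: N).
f_normal = 0.8                     # normal force, pressing into the surface
f_lateral = np.array([0.4, 0.1])   # tangential (frictional) components in the surface plane

# Amplitude of the full three-dimensional force vector on the fingerpad.
force_amplitude = np.sqrt(f_normal**2 + np.sum(f_lateral**2))

# Coefficient of dynamic friction during sliding: tangential magnitude over normal force.
mu_dynamic = np.linalg.norm(f_lateral) / f_normal

print(force_amplitude, mu_dynamic)
```

The two quantities respond differently to the same stimulus change: scaling all force components together changes the amplitude but leaves the friction coefficient untouched, which is what makes them separable candidate cues.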
To investigate the respective contributions of the different force types to the human perception of tactile pressure on a surface, we built an apparatus that can independently modulate each of these forces. This cutting-edge tool combines the force-controlled displacement capabilities of an industrial robot with a mounted display capable of modulating finger-surface friction through ultrasonic lubrication.
This experimental setup enabled us to show that humans are up to three times less sensitive to brief variations in normal force than to similar variations in friction or lateral force, regardless of the stickiness of the touched surface []. We then showed that, despite this difference in sensitivity, human perception of tactile pressure relied equally on the normal and lateral components of the contact force. Support vector machine analysis of participants' psychophysical responses suggested that the sense of touch most probably relies on the amplitude of the three-dimensional force vector applied to the fingerpad rather than on the coefficient of dynamic friction [].
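The flavor of that model-comparison analysis can be sketched as follows. This is a hedged illustration only, using entirely synthetic trials and scikit-learn's `SVC`, not the study's actual data or pipeline: if simulated responses are driven by force amplitude, a support vector machine given amplitude as its feature should predict them better than one given the friction coefficient.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic trials: normal and lateral force on the fingerpad (N).
n = 400
f_norm = rng.uniform(0.3, 1.5, n)
f_lat = rng.uniform(0.1, 1.2, n)

amplitude = np.sqrt(f_norm**2 + f_lat**2)  # force-vector magnitude (lateral collapsed to one axis)
mu = f_lat / f_norm                        # coefficient of dynamic friction

# Hypothetical responses: "feels stronger" whenever the amplitude exceeds a
# threshold, plus perceptual noise. This is the ground truth we then try to recover.
responses = (amplitude + rng.normal(0.0, 0.1, n) > 1.0).astype(int)

# Ask how well an SVM predicts the responses from each candidate cue alone.
accs = {}
for name, feature in {"force amplitude": amplitude, "friction coefficient": mu}.items():
    accs[name] = cross_val_score(SVC(), feature.reshape(-1, 1), responses, cv=5).mean()
print(accs)
```

Because the synthetic responses were generated from the amplitude, the amplitude-based classifier wins; applied to real psychophysical responses, the same comparison indicates which cue the participants' percepts actually track.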
This research project involves collaborations with Julien Lambert and Jean-Louis Thonnard, both of the Institute of Neuroscience at UCLouvain.
Members
Publications