Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-Limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering

Unconstrained tool-mediated interaction with a surface generates 3D vibrations that contain high-frequency accelerations in all Cartesian directions. These vibrations convey rich task information, so they need to be captured and portrayed for the user to feel in both virtual and remote interactions. Because humans cannot easily perceive the direction of such vibrations, haptics researchers often reduce 3D vibrations to 1D signals and render them with a single-axis actuator to limit system cost and complexity. This three-to-one (321) reduction can be performed using many different algorithms that have rarely been compared.
This project investigates the quality of 321 conversion methods by analyzing the properties of their input and output vibrations. We established a real-time conversion system that simultaneously measures 3D accelerations and plays the corresponding 1D vibrations. A user interacts with various objects via a stylus that contains a three-axis accelerometer; the captured signals are reduced to 1D by different algorithms and rendered by a standard voice-coil actuator. Objective analysis and subjective user ratings confirmed that more sophisticated conversion methods, such as DFT321, outperform the common approach of selecting a single-axis signal.
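To illustrate the kind of algorithm being compared, DFT321 combines the three axis spectra so that the 1D output preserves the spectral energy of the 3D input at every frequency. The sketch below is our own minimal NumPy rendition of that published idea; the function name and the choice of taking the phase from the sum of the three spectra are one common formulation, not necessarily the exact implementation used in this project.

```python
import numpy as np

def dft321(ax, ay, az):
    """Reduce three acceleration axes to one signal (DFT321-style sketch).

    The combined magnitude spectrum preserves the total spectral energy
    of the 3D input in each frequency bin; the phase is borrowed from
    the sum of the three spectra (one common choice).
    """
    n = len(ax)
    Ax, Ay, Az = np.fft.rfft(ax), np.fft.rfft(ay), np.fft.rfft(az)
    # Energy-preserving combined magnitude per frequency bin
    mag = np.sqrt(np.abs(Ax) ** 2 + np.abs(Ay) ** 2 + np.abs(Az) ** 2)
    # Phase of the summed complex spectra
    phase = np.angle(Ax + Ay + Az)
    # Back to the time domain as a single real-valued vibration signal
    return np.fft.irfft(mag * np.exp(1j * phase), n=n)
```

Note that if two axes are zero, the output reduces exactly to the remaining axis, which is one easy sanity check for an implementation like this.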
We also developed a novel multidimensional vibration rendering system that can accurately generate 3D vibrations using a commercial magnetic levitation haptic device. We quantitatively and qualitatively verified its performance at rendering recorded 3D vibrations. This system was then used in experiments comparing human perception of the original 3D vibrations and different 1D versions of the same signal. Ongoing work assesses the characteristics of all common 321 algorithms by establishing perceptual spaces.
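One way to score a 321 algorithm objectively is to check how closely the magnitude spectrum of its 1D output matches the combined spectrum of the 3D original. The metric below is an illustrative sketch of that idea (the name and normalization are ours), not the exact measure used in this project's analysis.

```python
import numpy as np

def spectral_match_error(a3, a1):
    """Relative spectral mismatch between a 3D recording and a 1D reduction.

    a3: tuple of three equal-length acceleration arrays (x, y, z).
    a1: the reduced 1D signal. Returns 0 for a perfect spectral match.
    """
    # Target: energy-combined magnitude spectrum of the 3D input
    target = np.sqrt(sum(np.abs(np.fft.rfft(axis)) ** 2 for axis in a3))
    # Actual: magnitude spectrum of the reduced signal
    actual = np.abs(np.fft.rfft(a1))
    return np.linalg.norm(actual - target) / np.linalg.norm(target)
```

A reduction that preserves spectral energy in every bin (as DFT321 aims to) drives this error toward zero, while simply picking one axis leaves whatever energy the other two axes carried unaccounted for.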
This project yields practical results for both real-time haptic teleoperation and offline haptic signal processing. For instance, our findings can inform medical teleoperation systems by offering a simple yet realistic 321 method that lets surgeons feel vibrotactile feedback from a remote surgical robot.
Members
Publications