Quantifying the Quality of Haptic Interfaces

A grounded force-feedback (GFF) device is a mechatronic system mounted to a stationary surface that measures the user's motion and/or force and outputs forces and/or motions in response so that the user can feel a virtual or remote environment. GFF devices are a well-established and diverse class of haptic technology. However, the number of designs and the heterogeneity of reported features complicate the selection of suitable devices. We created Haptify [] to facilitate this process.
Haptify is our benchmarking system that can thoroughly, fairly, and noninvasively evaluate GFF devices. Haptify has three main sensing components: a motion-capture system, a custom-built force plate, and a sensing element mounted at the end-effector []. Our approach to examining GFF devices is inspired by real use cases, in which the haptic device is placed on a table and the human user moves the end-effector while the device is either off (unpowered mode) or rendering virtual content (active mode). We use Haptify's measurements to define new performance metrics that enable one to compare how devices feel to the user during similar tasks. We validated Haptify by benchmarking two commercial haptic devices, the 3D Systems Touch and Touch X. Our results show that despite having a slightly smaller workspace than the 3D Systems Touch, the more expensive Touch X outputs smaller free-space forces and vibrations, smaller and more predictable dynamic forces and torques, and higher-quality renderings of a frictionless surface and high stiffness [].
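To illustrate the kind of metric computation described above, the sketch below summarizes forces measured while the user moves the end-effector through empty virtual space. The function names and the simple mean/RMS metrics are illustrative assumptions for this page, not Haptify's actual implementation.

```python
import math

def rms(samples):
    """Root-mean-square of a sequence of scalar samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def free_space_force_metric(force_magnitudes_n):
    # Hypothetical metric: the mean and RMS of the force magnitude (in
    # newtons) that the user feels while moving through free virtual space.
    # Lower values indicate a more transparent device.
    mean_f = sum(force_magnitudes_n) / len(force_magnitudes_n)
    return {"mean_N": mean_f, "rms_N": rms(force_magnitudes_n)}

# Example: force magnitudes sampled at the end-effector in active mode
samples = [0.12, 0.15, 0.11, 0.14, 0.13]
print(free_space_force_metric(samples))
```

The same pattern extends to the other quantities Haptify records, such as vibrations and dynamic torques.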
One limitation of this method is its reliance on quantitative data alone, without incorporating user opinions. Experts, however, typically base their assessments on hands-on experience with devices. To close this gap, we conducted a user study in which expert hapticians evaluated four representative devices rendering different virtual benchmark environments while Haptify recorded the interaction data []. By correlating the experts' qualitative assessments with Haptify's quantitative external measurements, we aim to establish a systematic way to characterize the capabilities of haptic devices.
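The correlation step could be sketched as follows; the device names, ratings, and measured values are hypothetical placeholders, and the Pearson coefficient is just one reasonable choice of association measure.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: expert quality ratings (1-7 scale) for four devices,
# paired with one measured metric (e.g., RMS free-space force in newtons).
expert_ratings = [6.2, 5.8, 4.1, 3.5]
rms_force_n = [0.08, 0.10, 0.22, 0.30]

# A strong negative r would suggest that larger parasitic free-space
# forces are associated with lower perceived device quality.
print(f"r = {pearson_r(expert_ratings, rms_force_n):.3f}")
```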
This research project involves a collaboration with Karon E. MacLean (University of British Columbia).
Members
Publications