Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-Limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
Immersive VR for Phantom Limb Pain

Up to 90% of individuals who undergo amputation experience a persistent sensation of the missing limb, which is called a phantom limb. A substantial subset of these people feel intense pain in the missing extremity; this phantom limb pain (PLP) often responds poorly to medications and other interventions and significantly interferes with quality of life. There is an urgent need for better treatments for PLP, particularly for people with lower-limb amputation.
A small number of studies by other researchers have shown that virtual reality (VR) can alleviate phantom limb pain for some people, but the strength of the effect varies greatly, and some people show no response at all. The research objective of this project is to test the hypothesis that the limited verisimilitude of the sensory feedback provided by current PLP therapies restricts their efficacy.
Specifically, we aim to determine whether PLP after leg amputation is reduced by high-quality, multi-modal feedback provided through immersive VR technology that tracks the participant's real leg motions and shows him or her a pair of intact legs moving in the same way. Immersive games that require a large range of leg motions add to the engagement and long-term appeal of such a treatment option.
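To make the motion-mapping step concrete, the following minimal sketch shows one way such a per-frame loop could work; it is written in Python with purely hypothetical tracker and avatar interfaces (MotionTracker, LegAvatar, and their methods are illustrative assumptions, not the project's actual code).

# Minimal sketch of the per-frame motion mapping described above.
# All class and method names are hypothetical stand-ins.

from dataclasses import dataclass

LEG_JOINTS = ("hip", "knee", "ankle")

@dataclass
class JointAngles:
    flexion_deg: float  # sagittal-plane joint angle, in degrees

class MotionTracker:
    """Hypothetical wrapper around a motion-capture source."""
    def get_angles(self, side: str, joint: str) -> JointAngles:
        return JointAngles(flexion_deg=0.0)  # placeholder reading

class LegAvatar:
    """Hypothetical interface to the rendered pair of intact virtual legs."""
    def set_angles(self, side: str, joint: str, angles: JointAngles) -> None:
        print(f"{side} {joint}: {angles.flexion_deg:.1f} deg")

def update_frame(tracker: MotionTracker, avatar: LegAvatar) -> None:
    # Copy each tracked joint's motion onto the matching avatar joint, so
    # the participant sees two intact legs moving as they themselves move.
    # Joints absent on the residual limb could instead be animated from the
    # intact side; that substitution policy is an assumption, not documented.
    for side in ("left", "right"):
        for joint in LEG_JOINTS:
            avatar.set_angles(side, joint, tracker.get_angles(side, joint))

if __name__ == "__main__":
    update_frame(MotionTracker(), LegAvatar())

In practice, a loop like this would run at the headset's refresh rate so that the virtual legs track the participant's movements without perceptible lag.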
Pilot data collected from two individuals with PLP using an early version of this system strongly support the hypothesis in question []. After improving the system's hardware and software, we are now conducting a proof-of-concept study to serve as the next step in the translational pipeline.
The study is being conducted in Philadelphia, USA, and is funded by an NIH R21 grant to H. Branch Coslett.