Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction

Within human-robot interaction (HRI), prior studies often underutilize robots' physical motion capabilities, focusing instead on verbal or affective interactions. Teleoperation is a key tool for leveraging expert knowledge to develop and evaluate social robot behaviors, and motion-capture-based teleoperation is considered one of the most intuitive ways for non-expert users to teach behaviors to robots. However, HRI researchers come from varied backgrounds and work with diverse robotic systems, so a teleoperation interface must be easy to adapt across platforms. To address this need, we introduce OCRA, an optimization-based, customizable retargeting algorithm for real-time teleoperation of any serial robotic arm [].
OCRA adapts to various robots and allows customization via a novel arm-skeleton error term. By considering the entire arm shape and the hand orientation, the algorithm enables expressive, natural motion for social-physical interactions such as high-fives or fist bumps []. To evaluate OCRA's effectiveness, we conducted a user study in which participants compared OCRA-generated motions with human demonstrations; the results confirmed that OCRA produces human-like motion. The algorithm uses full-body motion capture to map human movements onto robots such as Baxter and NAO [].
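To make the retargeting idea concrete, below is a minimal sketch of optimization-based motion retargeting for a single captured frame. It is not the published OCRA implementation: the planar three-link arm, link lengths, keypoint choice, cost weights, and the specific arm-skeleton and hand-orientation error terms are all illustrative assumptions.

```python
# Minimal sketch of optimization-based motion retargeting (illustrative only).
import numpy as np
from scipy.optimize import minimize

# Hypothetical 3-link planar arm: shoulder->elbow, elbow->wrist, wrist->hand (meters).
LINK_LENGTHS = np.array([0.30, 0.25, 0.10])

def forward_kinematics(q):
    """Return the 2D elbow, wrist, and hand positions plus the hand orientation angle."""
    angles = np.cumsum(q)                    # absolute link angles
    points = [np.zeros(2)]                   # shoulder fixed at the origin
    for length, angle in zip(LINK_LENGTHS, angles):
        points.append(points[-1] + length * np.array([np.cos(angle), np.sin(angle)]))
    return np.array(points[1:]), angles[-1]

def retarget_cost(q, human_points, human_hand_angle, w_skeleton=1.0, w_hand=0.5):
    """Weighted sum of an arm-skeleton error and a hand-orientation error (both assumed)."""
    robot_points, robot_hand_angle = forward_kinematics(q)
    skeleton_error = np.sum((robot_points - human_points) ** 2)
    angle_diff = np.arctan2(np.sin(robot_hand_angle - human_hand_angle),
                            np.cos(robot_hand_angle - human_hand_angle))
    return w_skeleton * skeleton_error + w_hand * angle_diff ** 2

def retarget_frame(human_points, human_hand_angle, q_init):
    """Find joint angles whose arm shape and hand orientation best match the human's."""
    result = minimize(retarget_cost, q_init,
                      args=(human_points, human_hand_angle),
                      method="L-BFGS-B",
                      bounds=[(-np.pi, np.pi)] * len(q_init))
    return result.x

if __name__ == "__main__":
    # Hypothetical captured human keypoints (elbow, wrist, hand), scaled to robot proportions.
    human_points = np.array([[0.20, 0.22], [0.40, 0.35], [0.45, 0.42]])
    q = retarget_frame(human_points, human_hand_angle=0.8, q_init=np.zeros(3))
    print("Retargeted joint angles (rad):", np.round(q, 3))
```

In a real-time setting, each frame's solve would be warm-started from the previous solution, and the relative weights would let a user customize how closely the robot matches the operator's arm shape versus hand orientation.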
OCRA is operator-agnostic: once configured for a robot, it requires no additional setup between users. This feature enables non-roboticists, such as clinicians, to personalize robot exercises without advanced robotics expertise, facilitating seamless integration into rehabilitation settings. To assess clinician interest in using such a system with their clients, we gathered expert perspectives on teleoperated social exercise robots through semi-structured interviews []. The experts broadly agreed that teleoperation improves perceptions of social robots and alleviates concerns about their role in therapy.
Future work will evaluate OCRA's usability across platforms and explore autonomous learning from teleoperated behaviors. Our vision is to enable social robots to act as exercise partners or coaches in rehabilitation settings []. Teleoperated behaviors can be recorded, autonomously learned, and repeated, paving the way for intuitive, personalized robot-assisted therapy.