Quantifying the Quality of Haptic Interfaces
Shape-Changing Haptic Interfaces
Generating Clear Vibrotactile Cues with Magnets Embedded in a Soft Finger Sheath
Salient Full-Fingertip Haptic Feedback Enabled by Wearable Electrohydraulic Actuation
Cutaneous Electrohydraulic (CUTE) Wearable Devices for Pleasant Broad-Bandwidth Haptic Cues
Modeling Finger-Touchscreen Contact during Electrovibration
Perception of Ultrasonic Friction Pulses
Vibrotactile Playback for Teaching Sensorimotor Skills in Medical Procedures
CAPT Motor: A Two-Phase Ironless Motor Structure
4D Intraoperative Surgical Perception: Anatomical Shape Reconstruction from Multiple Viewpoints
Visual-Inertial Force Estimation in Robotic Surgery
Enhancing Robotic Surgical Training
AiroTouch: Naturalistic Vibrotactile Feedback for Large-Scale Telerobotic Assembly
Optimization-Based Whole-Arm Teleoperation for Natural Human-Robot Interaction
Finger-Surface Contact Mechanics in Diverse Moisture Conditions
Computational Modeling of Finger-Surface Contact
Perceptual Integration of Contact Force Components During Tactile Stimulation
Dynamic Models and Wearable Tactile Devices for the Fingertips
Novel Designs and Rendering Algorithms for Fingertip Haptic Devices
Dimensional Reduction from 3D to 1D for Realistic Vibration Rendering
Prendo: Analyzing Human Grasping Strategies for Visually Occluded Objects
Learning Upper-Limb Exercises from Demonstrations
Minimally Invasive Surgical Training with Multimodal Feedback and Automatic Skill Evaluation
Efficient Large-Area Tactile Sensing for Robot Skin
Haptic Feedback and Autonomous Reflexes for Upper-limb Prostheses
Gait Retraining
Modeling Hand Deformations During Contact
Intraoperative AR Assistance for Robot-Assisted Minimally Invasive Surgery
Immersive VR for Phantom Limb Pain
Visual and Haptic Perception of Real Surfaces
Haptipedia
Gait Propulsion Trainer
TouchTable: A Musical Interface with Haptic Feedback for DJs
Exercise Games with Baxter
Intuitive Social-Physical Robots for Exercise
How Should Robots Hug?
Hierarchical Structure for Learning from Demonstration
Fabrication of HuggieBot 2.0: A More Huggable Robot
Learning Haptic Adjectives from Tactile Data
Feeling With Your Eyes: Visual-Haptic Surface Interaction
S-BAN
General Tactile Sensor Model
Insight: a Haptic Sensor Powered by Vision and Machine Learning
Exercise Games with Baxter

Improvements in healthcare have led to an increase in human life expectancy. Members of this aging population want to stay healthy and active, but many forms of exercise and physical therapy are expensive, boring, or inefficient. Past research has shown that well-designed robots can play a vital role in motivating users to perform regular exercise and physical therapy.
To discover how people respond to physical exercise interactions with a robot, we developed eight human-robot exercise games for Max, our Baxter Research Robot (made by Rethink Robotics): six of these games involve some form of physical contact with the robot, and two involve performing movements as directed by the robot, which has been the standard approach in prior work. These games were developed with the input and guidance of experts in game design, therapy, and rehabilitation, as well as through extensive pilot testing. The viability of the games was then formally evaluated in a user study conducted at the Rehabilitation Robotics Laboratory at the University of Pennsylvania.
Our subject group included 20 younger and 20 older adults. Participants of both age groups were willing to enter Baxter's workspace and physically interact with the robot through all of these games. Additionally, participating in the experiment caused a significant increase in user trust and confidence in Baxter. Careful analysis of the human-robot interactions that occurred throughout the study provided detailed feedback on the usability of all of the games. These results support the potential of bimanual humanoid robots for social-physical interaction in exercise and will help guide our ongoing efforts in this research domain.
Members
Publications