Haptic Intelligence Article 2022

Learning to Feel Textures: Predicting Perceptual Similarities from Unconstrained Finger-Surface Interactions

Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity. Our findings provide new perspectives on human texture perception during active touch, and our approach could benefit haptic surface assessment, robotic tactile perception, and haptic rendering.

Author(s): Benjamin A. Richardson, Yasemin Vardar, Christian Wallraven, and Katherine J. Kuchenbecker
Journal: IEEE Transactions on Haptics
Volume: 15
Number (issue): 4
Pages: 705--717
Year: 2022
Month: October
Bibtex Type: Article (article)
DOI: 10.1109/TOH.2022.3212701
State: Published
Electronic Archiving: grant_archive
Note: Benjamin A. Richardson and Yasemin Vardar contributed equally to this publication

BibTeX

@article{Richardson22-TH-Similarities,
  title = {Learning to Feel Textures: Predicting Perceptual Similarities from Unconstrained Finger-Surface Interactions},
  journal = {IEEE Transactions on Haptics},
  abstract = {Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity. Our findings provide new perspectives on human texture perception during active touch, and our approach could benefit haptic surface assessment, robotic tactile perception, and haptic rendering.},
  volume = {15},
  number = {4},
  pages = {705--717},
  month = oct,
  year = {2022},
  note = {Benjamin A. Richardson and Yasemin Vardar contributed equally to this publication},
  slug = {richardson22-th-similarities},
  author = {Richardson, Benjamin A. and Vardar, Yasemin and Wallraven, Christian and Kuchenbecker, Katherine J.},
  month_numeric = {10}
}