
Detailed, accurate, human shape estimation from clothed 3D scan sequences

We address the problem of estimating human body shape from 3D scans over time. Reliable estimation of 3D body shape is necessary for many applications including virtual try-on, health monitoring, and avatar creation for virtual reality. Scanning bodies in minimal clothing, however, presents a practical barrier to these applications. We address this problem by estimating body shape under clothing from a sequence of 3D scans. Previous methods that have exploited statistical models of body shape produce overly smooth shapes lacking personalized details. In this paper we contribute a new approach to recover not only an approximate shape of the person, but also their detailed shape. Our approach allows the estimated shape to deviate from a parametric model to fit the 3D scans. We demonstrate the method using high quality 4D data as well as sequences of visual hulls extracted from multi-view images. We also make available a new high quality 4D dataset that enables quantitative evaluation. Our method outperforms the previous state of the art, both qualitatively and quantitatively.
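
The core technical idea stated in the abstract is to first fit a parametric body model and then allow the estimate to deviate from it through regularized per-vertex offsets, so that personal detail is recovered while staying close to the parametric body. The Python sketch below is a minimal, hypothetical illustration of that two-stage idea, not the authors' implementation: the toy PCA shape model, the random "scan", the dimensions, and the weight LAMBDA are made-up placeholders.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

N_VERTS, N_BETAS = 100, 10                             # toy sizes (real body models are far larger)
mean_shape = rng.normal(size=(N_VERTS, 3))             # hypothetical template mesh vertices
shape_basis = rng.normal(size=(N_BETAS, N_VERTS, 3))   # hypothetical PCA shape basis

def model_verts(betas):
    # Vertices of the parametric (minimally clothed) body for shape coefficients betas.
    return mean_shape + np.tensordot(betas, shape_basis, axes=1)

# Fake "scan" points, assumed already posed and corresponded to model vertices.
scan_points = mean_shape + 0.01 * rng.normal(size=(N_VERTS, 3))

# Stage 1: estimate the parametric shape coefficients only.
def param_energy(betas):
    return np.sum((model_verts(betas) - scan_points) ** 2)

betas_hat = minimize(param_energy, np.zeros(N_BETAS), method="L-BFGS-B").x

# Stage 2: let the shape deviate from the model via per-vertex offsets D.
# An L2 penalty keeps the deviation small, so the detailed estimate stays
# close to the parametric body. This regularized least-squares problem has
# the closed-form solution below.
LAMBDA = 10.0                                          # hypothetical regularization weight
residual = scan_points - model_verts(betas_hat)
D = residual / (1.0 + LAMBDA)                          # argmin ||model + D - scan||^2 + LAMBDA*||D||^2
detailed_shape = model_verts(betas_hat) + D

print("mean |offset| per vertex:", np.linalg.norm(D, axis=1).mean())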

Author(s): Chao Zhang and Sergi Pujades and Michael Black and Gerard Pons-Moll
Book Title: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Pages: 5484-5493
Year: 2017
Month: July
Day: 21-26
Publisher: IEEE Computer Society
Bibtex Type: Conference Paper (inproceedings)
Address: Washington, DC, USA
DOI: 10.1109/CVPR.2017.582
Event Name: IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2017
Event Place: Honolulu, HI, USA
ISBN: 978-1-5386-0457-1
ISSN: 1063-6919
Note: Spotlight

BibTeX

@inproceedings{shape_under_cloth:CVPR17,
  title = {Detailed, accurate, human shape estimation from clothed {3D} scan sequences},
  booktitle = {2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  abstract = {We address the problem of estimating human body shape from 3D scans over time. Reliable estimation of 3D body shape is necessary for many applications including virtual try-on, health monitoring, and avatar creation for virtual reality. Scanning bodies in minimal clothing, however, presents a practical barrier to these applications. We address this problem by estimating body shape under clothing from a sequence of 3D scans. Previous methods that have exploited statistical models of body shape produce overly smooth shapes lacking personalized details. In this paper we contribute a new approach to recover not only an approximate shape of the person, but also their detailed shape. Our approach allows the estimated shape to deviate from a parametric model to fit the 3D scans. We demonstrate the method using high quality 4D data as well as sequences of visual hulls extracted from multi-view images.  We also make available a new high quality 4D dataset that enables quantitative evaluation. Our method outperforms the previous state of the art, both qualitatively and quantitatively.},
  pages = {5484-5493},
  publisher = {IEEE Computer Society},
  address = {Washington, DC, USA},
  month = jul,
  year = {2017},
  note = {Spotlight},
  slug = {shape_under_cloth-cvpr17},
  author = {Zhang, Chao and Pujades, Sergi and Black, Michael and Pons-Moll, Gerard},
  month_numeric = {7}
}