TightCap: 3D Human Shape Capture
with Clothing Tightness Field

Xin Chen   Anqi Pang   Wei Yang   Peihao Wang   Lan Xu   Jingyi Yu

Download Video: HD (MP4, 59 MB)

Abstract

In this paper, we present TightCap, a data-driven scheme that accurately captures both the human body shape and the dressed garments from only a single 3D human scan, enabling numerous applications such as virtual try-on, biometrics, and body evaluation. To handle the severe variations of human poses and garments, we propose to model the clothing tightness field, i.e., the displacements from the garments to the underlying human shape, implicitly in the global UV texturing domain. To this end, we utilize an enhanced statistical human template and an effective multi-stage alignment scheme to map the 3D scan into a hybrid 2D geometry image. Based on this 2D representation, we propose a novel framework that predicts the clothing tightness field via a dedicated tightness formulation, together with an effective optimization scheme that further reconstructs the multi-layer human shape and garments under various clothing categories and human postures. We further propose a new Clothing Tightness Dataset (CTD) of human scans with a large variety of clothing styles, poses, and corresponding ground-truth human shapes to stimulate further research. Extensive experiments demonstrate the effectiveness of TightCap in achieving high-quality reconstruction of human shape and dressed garments, as well as its further applications to clothing segmentation, retargeting, and animation.
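To make the clothing tightness idea concrete, below is a minimal sketch (not the authors' implementation) of how a scalar tightness field stored in the UV texturing domain could be applied to peel a clothed scan down to an inner body-shape estimate: each garment vertex is moved inward along its outward normal by the predicted garment-to-body displacement. All names here (`sample_uv_map`, `peel_garment`, the map resolution, and the toy data) are hypothetical assumptions for illustration.

```python
import numpy as np

def sample_uv_map(tightness_map, uvs):
    """Nearest-neighbor sample of a scalar UV-space map at per-vertex UVs.

    tightness_map: (H, W) tightness values in the UV texturing domain.
    uvs:           (N, 2) per-vertex UV coordinates in [0, 1].
    """
    h, w = tightness_map.shape
    cols = np.clip((uvs[:, 0] * (w - 1)).round().astype(int), 0, w - 1)
    rows = np.clip((uvs[:, 1] * (h - 1)).round().astype(int), 0, h - 1)
    return tightness_map[rows, cols]                      # (N,)

def peel_garment(verts, normals, tightness):
    """Displace each garment vertex inward along its outward unit normal
    by the garment-to-body distance given by the tightness field."""
    return verts - tightness[:, None] * normals           # (N, 3)

# Toy usage on random data standing in for a scanned, clothed mesh.
rng = np.random.default_rng(0)
verts = rng.normal(size=(1000, 3))
normals = rng.normal(size=(1000, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
uvs = rng.uniform(size=(1000, 2))
tightness_map = rng.uniform(0.0, 0.05, size=(256, 256))  # displacements in meters

body_estimate = peel_garment(verts, normals, sample_uv_map(tightness_map, uvs))
```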

Downloads


Dataset

Results of clothing retargeting and animation

Results on 3D human clothing and body shape capture

Citation

BibTeX, 1 KB

@article{chen2021tightcap,
  title={TightCap: 3D Human Shape Capture with Clothing Tightness Field},
  author={Chen, Xin and Pang, Anqi and Yang, Wei and Wang, Peihao and Xu, Lan and Yu, Jingyi},
  journal={ACM Transactions on Graphics (TOG)},
  volume={41},
  number={1},
  pages={1--17},
  year={2021},
  publisher={ACM New York, NY}
}

Contact

If you have any questions, please feel free to contact Xin Chen at chenxin2@shanghaitech.edu.cn.