Download Video: HD (MP4, 62 MB)

Abstract

Video-based human motion transfer creates video animations of humans following a source motion. Current methods show remarkable results for tightly-clad subjects. However, the lack of temporally consistent handling of plausible clothing dynamics, including fine and high-frequency details, significantly limits the attainable visual quality. We address these limitations for the first time in the literature and present a new framework which performs high-fidelity and temporally-consistent human motion transfer with natural pose-dependent non-rigid deformations, for several types of loose garments. In contrast to the previous techniques, we perform image generation in three subsequent stages, synthesizing human shape, structure, and appearance. Given a monocular RGB video of an actor, we train a stack of recurrent deep neural networks that generate these intermediate representations from 2D poses and their temporal derivatives. Splitting the difficult motion transfer problem into subtasks that are aware of the temporal motion context helps us to synthesize results with plausible dynamics and pose-dependent detail. It also allows artistic control of results by manipulation of individual framework stages. In the experimental results, we significantly outperform the state-of-the-art in terms of video realism. Our code and data will be made publicly available.
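The three-stage idea described above — synthesizing shape, then structure, then appearance with recurrent networks driven by 2D poses and their temporal derivatives — can be sketched as follows. This is a minimal illustrative mock-up, not the authors' architecture: the cell type, dimensions, and names (`SimpleRNNCell`, `run_pipeline`) are assumptions chosen to show the staged, temporally-aware data flow.

```python
import numpy as np

class SimpleRNNCell:
    """Minimal Elman-style recurrent cell; an illustrative stand-in for the
    recurrent deep networks described in the paper (hypothetical design)."""
    def __init__(self, in_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, (hid_dim, in_dim))
        self.W_h = rng.normal(0.0, 0.1, (hid_dim, hid_dim))
        self.b = np.zeros(hid_dim)

    def step(self, x, h):
        # one recurrent update: new hidden state from input and previous state
        return np.tanh(self.W_in @ x + self.W_h @ h + self.b)

def run_pipeline(poses, hid=16):
    """poses: (T, D) array of flattened 2D joint positions per frame.
    Stage 1 predicts shape features, stage 2 structure, stage 3 appearance;
    each stage sees the motion input plus the previous stage's output."""
    T, d = poses.shape
    # temporal derivative of the pose sequence supplies the motion context
    vel = np.vstack([np.zeros((1, d)), np.diff(poses, axis=0)])
    motion = np.hstack([poses, vel])  # (T, 2*d)

    shape_net = SimpleRNNCell(2 * d, hid, seed=1)
    structure_net = SimpleRNNCell(2 * d + hid, hid, seed=2)
    appearance_net = SimpleRNNCell(2 * d + hid, hid, seed=3)

    h1 = h2 = h3 = np.zeros(hid)
    outputs = []
    for t in range(T):
        h1 = shape_net.step(motion[t], h1)                             # shape
        h2 = structure_net.step(np.concatenate([motion[t], h1]), h2)   # structure
        h3 = appearance_net.step(np.concatenate([motion[t], h2]), h3)  # appearance
        outputs.append(h3)
    return np.array(outputs)  # (T, hid) per-frame appearance features
```

Because each stage carries its own hidden state across frames, every intermediate representation is conditioned on the temporal motion context rather than on a single pose — the property the paper credits for plausible clothing dynamics. Each stage could also be inspected or edited independently, mirroring the artistic control the framework offers.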


Citation

BibTeX

@misc{kappel2020high-fidelity,
      title={High-Fidelity Neural Human Motion Transfer from Monocular Video},
      author={Kappel, Moritz and Golyanik, Vladislav and Elgharib, Mohamed and Henningson, Jann-Ole and Seidel, Hans-Peter and Castillo, Susana and Theobalt, Christian and Magnor, Marcus},
      year={2020},
      eprint={2012.10974},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Acknowledgments

The authors gratefully acknowledge funding by the German Research Foundation (DFG MA2555/15-1 “Immersive Digital Reality”). This work was partially funded by the ERC Consolidator Grant 4DRepLy (770784). We thank Jalees Nehvi and Navami Kairanda for helping with the comparisons.

Contact

For questions or clarifications, please get in touch with:
Moritz Kappel
kappel@cg.cs.tu-bs.de
Vladislav Golyanik
golyanik@mpi-inf.mpg.de
Mohamed Elgharib
elgharib@mpi-inf.mpg.de
