Learning a shared deformation space for efficient design-preserving garment transfer
Full-text links: NETL, NSTL, Elsevier
Garment transfer from a source mannequin to a shape-varying individual is a vital technique in computer graphics. Existing garment transfer methods are either time-consuming or fail to preserve designed details, especially for clothing with complex styles. In this paper, we propose a data-driven approach that efficiently transfers garments between two distinctive bodies while preserving the source design. Given two sets of simulated garments on a source body and a target body, we use deformation gradients as the representation. Since the garments in our dataset have various topologies, we embed the cloth deformation onto the body. For garment transfer, the deformation is decomposed into two aspects, namely style and shape. An encoder-decoder network learns a shared space that is invariant to garment style but related to the deformation of human bodies. For a new garment in a different style worn by the source human, our method can efficiently transfer it to the target body via the shared shape deformation while preserving the designed details. We qualitatively and quantitatively evaluate our method on a diverse set of 3D garments that showcase rich wrinkling patterns. Experiments show that the transferred garments preserve the source design even when the target body differs substantially from the source.
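The abstract represents cloth deformation by per-triangle deformation gradients. As a minimal sketch of what that representation looks like (this is the standard deformation-transfer construction, not code from the paper; the function name and the fourth-vertex normal trick are illustrative assumptions):

```python
import numpy as np

def deformation_gradient(rest_tri, deformed_tri):
    """Per-triangle deformation gradient F mapping rest-pose edges
    to deformed edges.

    rest_tri, deformed_tri: (3, 3) arrays, one vertex per row.
    A fourth vertex along the (scaled) triangle normal makes the
    3x3 edge matrix invertible, as in standard deformation transfer.
    """
    def edge_matrix(tri):
        v0, v1, v2 = tri
        e1, e2 = v1 - v0, v2 - v0
        n = np.cross(e1, e2)
        e3 = n / np.sqrt(np.linalg.norm(n))  # scale-aware normal edge
        return np.column_stack([e1, e2, e3])

    V_rest = edge_matrix(np.asarray(rest_tri, dtype=float))
    V_def = edge_matrix(np.asarray(deformed_tri, dtype=float))
    # F satisfies F @ V_rest = V_def
    return V_def @ np.linalg.inv(V_rest)
```

An undeformed triangle yields the identity gradient, and a uniform scale of the triangle yields a scaled identity, which is what makes this representation convenient for learning: it factors out rigid translation and captures local stretch and rotation per triangle.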
Garment transfer; Cloth deformation; Shape analysis
Min Shi, Yukun Wei, Lan Chen, Dengming Zhu, Tianlu Mao, Zhaoqi Wang
School of Control and Computer Engineering, North China Electric Power University, Beijing, China
Institute of Automation, Chinese Academy of Sciences, Beijing, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China