LOGAN: Unpaired shape transform in latent overcomplete space

Kangxue Yin, Zhiqin Chen, Hui Huang, Daniel Cohen-Or, Hao Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

We introduce LOGAN, a deep neural network aimed at learning general-purpose shape transforms from unpaired domains. The network is trained on two sets of shapes, e.g., tables and chairs, with neither a pairing between shapes across the two domains to serve as supervision nor any point-wise correspondence between shapes. Once trained, LOGAN takes a shape from one domain and transforms it into the other. Our network consists of an autoencoder to encode shapes from the two input domains into a common latent space, where the latent codes concatenate multi-scale shape features, resulting in an overcomplete representation. The translator is based on a generative adversarial network (GAN), operating in the latent space, where an adversarial loss enforces cross-domain translation while a feature preservation loss ensures that the right shape features are preserved for a natural shape transform. We conduct ablation studies to validate each of our key network designs and demonstrate superior capabilities in unpaired shape transforms on a variety of examples over baselines and state-of-the-art approaches. We show that LOGAN is able to learn what shape features to preserve during shape translation, either local or non-local, whether content or style, depending solely on the input domains for training.
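The two-stage design described above (a multi-scale point-cloud autoencoder producing an overcomplete latent code, followed by a latent-space GAN translator trained with an adversarial loss plus a feature preservation loss) can be sketched roughly as follows. This is an illustrative sketch only, assuming PyTorch; the module names (MultiScaleEncoder, LatentTranslator, Discriminator), layer sizes, and the L1 stand-in for the feature preservation term are hypothetical and not the authors' implementation.

# Illustrative PyTorch sketch (assumed framework); all module names and
# layer sizes are hypothetical, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleEncoder(nn.Module):
    # Encodes a point cloud at several scales and concatenates the
    # per-scale codes into one overcomplete latent vector.
    def __init__(self, scales=(64, 128, 256), code_per_scale=64):
        super().__init__()
        self.heads = nn.ModuleList([
            nn.Sequential(nn.Linear(3, s), nn.ReLU(), nn.Linear(s, code_per_scale))
            for s in scales])

    def forward(self, pts):                       # pts: (B, N, 3)
        codes = [h(pts).max(dim=1).values for h in self.heads]  # pool over points
        return torch.cat(codes, dim=-1)           # (B, code_per_scale * num_scales)

class LatentTranslator(nn.Module):
    # Maps a latent code from the source domain toward the target domain.
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    # Scores whether a latent code looks like it belongs to the target domain.
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, z):
        return self.net(z)

def translator_loss(translator, disc, z_src, fp_weight=1.0):
    # Adversarial term: translated codes should fool the target-domain
    # discriminator. Feature preservation term: here a simple L1 penalty
    # keeping the translated code close to the source code, used as a
    # stand-in for the paper's feature preservation loss.
    z_out = translator(z_src)
    logits = disc(z_out)
    adv = F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))
    feat_preserve = F.l1_loss(z_out, z_src)
    return adv + fp_weight * feat_preserve

# Usage: encode source-domain shapes, then compute the translator loss.
enc = MultiScaleEncoder()
z = enc(torch.randn(4, 1024, 3))          # 4 shapes, 1024 points each
trans, disc = LatentTranslator(z.shape[-1]), Discriminator(z.shape[-1])
loss = translator_loss(trans, disc, z)
loss.backward()

In the paper's actual training, the discriminator is trained against codes from the target domain and the decoder reconstructs point clouds from the translated codes; the sketch above only illustrates how the overcomplete latent code and the two translator loss terms fit together.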

Original language: English
Article number: 3356494
Journal: ACM Transactions on Graphics
Volume: 38
Issue number: 6
DOIs
State: Published - Nov 2019

Keywords

  • Generative adversarial network
  • Multi-scale point cloud encoding
  • Shape transform
  • Unpaired domain translation
  • Unsupervised learning

All Science Journal Classification (ASJC) codes

  • Computer Graphics and Computer-Aided Design
