SENS: Part-Aware Sketch-based Implicit Neural Shape Modeling

Alexandre Binninger, Amir Hertz, Olga Sorkine-Hornung, Daniel Cohen-Or, Raja Giryes

Research output: Contribution to journal › Article › peer-review

Abstract

We present SENS, a novel method for generating and editing 3D models from hand-drawn sketches, including those of abstract nature. Our method allows users to quickly and easily sketch a shape, and then maps the sketch into the latent space of a part-aware neural implicit shape architecture. SENS analyzes the sketch and encodes its parts into a ViT patch encoding, subsequently feeding them into a transformer decoder that converts them to shape embeddings suitable for editing 3D neural implicit shapes. SENS provides intuitive sketch-based generation and editing, and also succeeds in capturing the intent of the user's sketch to generate a variety of novel and expressive 3D shapes, even from abstract and imprecise sketches. Additionally, SENS supports refinement via part reconstruction, allowing for nuanced adjustments and artifact removal. It also offers part-based modeling capabilities, enabling the combination of features from multiple sketches to create more complex and customized 3D shapes. We demonstrate the effectiveness of our model compared to the state-of-the-art using objective metric evaluation criteria and a user study, both indicating strong performance on sketches with a medium level of abstraction. Furthermore, we showcase our method's intuitive sketch-based shape editing capabilities, and validate it through a usability study.
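
The pipeline the abstract describes (ViT patch encoder → transformer decoder → per-part shape latents) can be summarized in code. Below is a minimal, hypothetical PyTorch sketch of that mapping; all module names, dimensions, layer counts, and the number of parts are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of the SENS-style encoder-decoder mapping, assuming
# a single-channel sketch image and a pretrained part-aware implicit shape
# decoder with one latent code per part. Illustrative only.
import torch
import torch.nn as nn

class SketchToPartEmbeddings(nn.Module):
    def __init__(self, image_size=224, patch_size=16, dim=256,
                 num_parts=16, shape_dim=512):
        super().__init__()
        num_patches = (image_size // patch_size) ** 2
        # ViT-style patch embedding: split the sketch into patches and
        # project each patch to a token of size `dim`.
        self.patch_embed = nn.Conv2d(1, dim, kernel_size=patch_size,
                                     stride=patch_size)
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches, dim))
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=8,
                                       batch_first=True),
            num_layers=6)
        # One learned query per shape part; the transformer decoder
        # cross-attends to the patch tokens to produce part embeddings.
        self.part_queries = nn.Parameter(torch.zeros(1, num_parts, dim))
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model=dim, nhead=8,
                                       batch_first=True),
            num_layers=6)
        # Project each part token into the latent space of the implicit
        # shape decoder (one shape embedding per part).
        self.to_shape = nn.Linear(dim, shape_dim)

    def forward(self, sketch):               # sketch: (B, 1, H, W)
        tokens = self.patch_embed(sketch)    # (B, dim, H/ps, W/ps)
        tokens = tokens.flatten(2).transpose(1, 2) + self.pos_embed
        memory = self.encoder(tokens)        # encoded patch tokens
        queries = self.part_queries.expand(sketch.size(0), -1, -1)
        parts = self.decoder(queries, memory)    # (B, num_parts, dim)
        return self.to_shape(parts)              # per-part shape latents

# Usage: map a dummy sketch to 16 part latents of dimension 512.
model = SketchToPartEmbeddings()
latents = model(torch.randn(1, 1, 224, 224))  # -> shape (1, 16, 512)

The per-part queries mirror the part-aware design described above: each query attends to the sketch patches and yields one part latent, which is what would make part-level refinement and the recombination of parts from multiple sketches possible.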

Original language: English
Article number: e15015
Journal: Computer Graphics Forum
Volume: 43
Issue number: 2
DOI: 10.1111/cgf.15015
State: Published - May 2024

Keywords

  • CCS Concepts: Computing methodologies → Volumetric models; Neural networks

All Science Journal Classification (ASJC) codes

  • Computer Graphics and Computer-Aided Design
