OneSketch: learning high-level shape features from simple sketches

Eyal Reisfeld, Andrei Sharf

Research output: Contribution to journal › Article › Peer-reviewed


Humans use simple sketches to convey complex concepts and abstract ideas in a concise way. Just a few abstract pencil strokes can carry a large amount of semantic information that can serve as a meaningful representation for many applications. In this work, we explore the power of simple human strokes intended to capture high-level 2D shape semantics. For this purpose, we introduce OneSketch, a crowd-sourced dataset of abstract one-line sketches depicting high-level 2D object features. To construct the dataset, we formulate a human sketching task whose goal is to differentiate between objects with a single minimal stroke. While humans are rather successful at depicting high-level shape semantics and abstraction, we investigate the ability of deep neural networks to convey such traits. We introduce a neural network which learns meaningful shape features from our OneSketch dataset. Essentially, the model learns sketch-to-shape relations and encodes them in an embedding space which reveals distinctive shape features. We show that our network can differentiate and retrieve 2D objects from very simple one-stroke sketches with good accuracy.
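The abstract describes retrieval in a joint embedding space: a sketch is encoded and matched against shape encodings. The paper's own network is not reproduced here; the following is a minimal, hypothetical NumPy sketch of the retrieval step only, assuming both sketches and shapes have already been mapped to fixed-length embedding vectors and that similarity is measured by cosine distance (function and variable names are illustrative, not from the paper).

```python
import numpy as np

def retrieve_nearest(sketch_emb, shape_embs):
    """Rank shape embeddings by cosine similarity to a sketch embedding.

    sketch_emb : (d,) array for one sketch
    shape_embs : (n, d) array, one row per candidate shape
    Returns indices of shapes, most similar first.
    """
    sketch = sketch_emb / np.linalg.norm(sketch_emb)
    shapes = shape_embs / np.linalg.norm(shape_embs, axis=1, keepdims=True)
    sims = shapes @ sketch        # cosine similarity per shape
    return np.argsort(-sims)      # descending similarity

# Toy example: three 4-D shape embeddings and one query sketch embedding.
shapes = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.9, 0.1, 0.0, 0.0]])
sketch = np.array([1.0, 0.05, 0.0, 0.0])
ranking = retrieve_nearest(sketch, shapes)  # → [0, 2, 1]
```

In this toy setup the sketch vector points almost along the first shape's embedding, so that shape is retrieved first; how discriminative the ranking is depends entirely on the learned embedding, which is what the OneSketch network is trained to provide.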

Original language: American English
Pages (from-to): 2811-2822
Number of pages: 12
Journal: Visual Computer
Issue number: 7
Digital Object Identifiers (DOIs)
Publication status: Published - 1 Jul 2023

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design
