OneSketch: learning high-level shape features from simple sketches

Eyal Reisfeld, Andrei Sharf

Research output: Contribution to journal › Article › peer-review


Humans use simple sketches to convey complex concepts and abstract ideas concisely. Just a few abstract pencil strokes can carry a large amount of semantic information, which serves as a meaningful representation for many applications. In this work, we explore the power of simple human strokes to capture high-level 2D shape semantics. For this purpose, we introduce OneSketch, a crowd-sourced dataset of abstract one-line sketches depicting high-level 2D object features. To construct the dataset, we formulate a human sketching task whose goal is to differentiate between objects with a single minimal stroke. While humans are rather successful at depicting high-level shape semantics and abstraction, we investigate whether deep neural networks can capture such traits. We introduce a neural network that learns meaningful shape features from our OneSketch dataset. Essentially, the model learns sketch-to-shape relations and encodes them in an embedding space that reveals distinctive shape features. We show that our network can differentiate and retrieve 2D objects from very simple one-stroke sketches with good accuracy.
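The sketch-based retrieval setting described in the abstract can be illustrated with a minimal, hypothetical example: once a sketch and the candidate 2D shapes are embedded in a shared space, the matching shape is retrieved by nearest-neighbor ranking under cosine similarity. The embedding vectors below are made-up toy values, not outputs of the paper's actual network.

```python
import numpy as np

def cosine_retrieve(sketch_emb, shape_embs):
    """Rank candidate shapes by cosine similarity to a sketch embedding.

    Returns shape indices, most similar first.
    """
    s = sketch_emb / np.linalg.norm(sketch_emb)
    S = shape_embs / np.linalg.norm(shape_embs, axis=1, keepdims=True)
    sims = S @ s                     # cosine similarity per shape
    return np.argsort(-sims)         # descending order of similarity

# Toy joint embedding space (4-D) with three candidate shapes;
# values are illustrative assumptions only.
shapes = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.7, 0.7, 0.0, 0.0]])
sketch = np.array([0.9, 0.1, 0.0, 0.0])  # a one-stroke sketch embedding

print(cosine_retrieve(sketch, shapes))   # → [0 2 1]: shape 0 is the best match
```

In the actual system, a learned encoder would map raw strokes and shapes into this joint space; the ranking step itself stays this simple.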

Original language: American English
Pages (from-to): 2811-2822
Number of pages: 12
Journal: Visual Computer
Volume: 39
Issue number: 7
DOIs
Publication status: Published - 1 Jul 2023

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design

