AnySURF: Flexible local features computation

Eran Sadeh-Or, Gal A. Kaminka

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

Many vision-based tasks in autonomous robotics rely on feature matching algorithms, which find point correspondences between two images. Unfortunately, existing algorithms for such tasks require significant computational resources and are designed under the assumption that they will run to completion and only then return a complete result. Since partial results (a subset of all features in the image) are often sufficient, we propose in this paper a computationally flexible algorithm whose results monotonically increase in quality given additional computation time. The proposed algorithm, coined AnySURF (Anytime SURF), is based on the SURF scale- and rotation-invariant interest point detector and descriptor. We achieve flexibility by re-designing several major steps, mainly the feature search process, allowing results of increasing quality to be accumulated. We contrast different design choices for AnySURF and evaluate its use in a series of experiments. The results are promising and show the potential for dynamic anytime performance, robust to the available computation time.
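For illustration only, the sketch below shows the general anytime pattern the abstract describes: detection work is interleaved with deadline checks so that a partial but monotonically growing set of features can be returned whenever the time budget expires. It is not the authors' AnySURF algorithm (which re-designs SURF's own feature search); the tile-by-tile scheduling, the ORB stand-in detector (SURF itself requires an opencv-contrib build), and the anytime_features name and parameters are assumptions made for this example.

# A minimal sketch of the anytime idea only -- NOT the AnySURF implementation.
# Detection is interleaved over image tiles and stops at a deadline, so the
# returned feature set grows monotonically with the available time budget.
import time
import cv2

def anytime_features(image, budget_s, tile=128):
    detector = cv2.ORB_create()                 # stand-in for SURF
    deadline = time.monotonic() + budget_s
    keypoints, descriptors = [], []
    h, w = image.shape[:2]
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            if time.monotonic() >= deadline:
                return keypoints, descriptors   # partial result so far
            patch = image[y:y + tile, x:x + tile]
            kps, descs = detector.detectAndCompute(patch, None)
            for kp in kps:
                # shift keypoints back into full-image coordinates
                kp.pt = (kp.pt[0] + x, kp.pt[1] + y)
            keypoints.extend(kps)
            if descs is not None:
                descriptors.extend(descs)
    return keypoints, descriptors               # complete result

A caller could, for example, invoke anytime_features(gray, budget_s=0.02) and match whatever descriptors were accumulated within the 20 ms budget, which is the flexible-computation behaviour the abstract refers to.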

Original language: English
Title of host publication: RoboCup 2011
Subtitle of host publication: Robot Soccer World Cup XV
Editors: Thomas Rofer, Norbert Michael Mayer, Jesus Savage, Uluc Saranli
Pages: 174-185
Number of pages: 12
DOIs
State: Published - 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 7416 LNCS

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science
