Distributed vision-aided cooperative localization and navigation based on three-view geometry

Vadim Indelman, Pini Gurfil, Ehud Rivlin, Hector Rotstein

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a new method for distributed vision-aided cooperative localization and navigation of multiple inter-communicating autonomous vehicles, based on three-view geometry constraints. Each vehicle is equipped only with a standard inertial navigation system and an on-board camera. In contrast to the traditional approach to cooperative localization, which relies on relative pose measurements, the proposed method formulates a measurement whenever the same scene is observed by different vehicles. Each such measurement comprises three images, which are not necessarily captured at the same time. The captured images, with their associated navigation parameters attached, are stored in repositories by some of the vehicles in the group. A graph-based approach is applied to calculate the correlation terms between the navigation parameters associated with images participating in the same measurement. The proposed method is examined using a statistical simulation and is further validated in an experiment involving two vehicles in a holding pattern scenario. The experiments show that cooperative three-view-based vision-aided navigation can considerably improve the performance of an inferior INS.
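To make the notion of a three-view measurement concrete, the following is a minimal sketch, not the paper's exact formulation, of how three-view geometry constraints can be evaluated as scalar residuals. It assumes q1, q2, q3 are unit line-of-sight vectors from three camera positions to a common landmark, already rotated into one common frame, and T12, T23 are the translations between the first and second, and second and third, camera positions; all names here are illustrative.

```python
import numpy as np

def three_view_residuals(q1, q2, q3, T12, T23):
    """Return the two epipolar residuals and a scale-consistency residual
    often used in three-view geometry formulations (illustrative only)."""
    r1 = q1 @ np.cross(T12, q2)                      # coplanarity of q1, T12, q2 (views 1-2)
    r2 = q2 @ np.cross(T23, q3)                      # coplanarity of q2, T23, q3 (views 2-3)
    r3 = (np.cross(q2, q1) @ np.cross(q3, T23)
          - np.cross(q1, T12) @ np.cross(q3, q2))    # ties the translation scales together
    return np.array([r1, r2, r3])

# Illustrative usage with made-up geometry: one landmark observed from
# three hypothetical camera positions along a short trajectory.
landmark = np.array([10.0, 2.0, 5.0])
p1, p2, p3 = np.zeros(3), np.array([1.0, 0.0, 0.0]), np.array([2.0, 0.5, 0.0])
q1, q2, q3 = [(landmark - p) / np.linalg.norm(landmark - p) for p in (p1, p2, p3)]
print(three_view_residuals(q1, q2, q3, p2 - p1, p3 - p2))  # ~zero for consistent geometry
```

In a cooperative setting, the three views may come from different vehicles and different times, so the residuals couple the navigation errors of all participating vehicles; the paper's graph-based approach accounts for the resulting cross-correlations.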

Original language: English
Pages (from-to): 822-840
Number of pages: 19
Journal: Robotics and Autonomous Systems
Volume: 60
Issue number: 6
DOIs
State: Published - Jun 2012

Keywords

  • Computer vision
  • Distributed navigation
  • Information fusion
  • Navigation aiding

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Computer Science Applications
  • General Mathematics
