Spacecraft relative navigation with an omnidirectional vision sensor

Omri Kaufman, Pini Gurfil

Research output: Contribution to journal › Article › peer-review


With the advent of autonomous spacecraft formation flying missions, the ability of satellites to navigate autonomously relative to other space objects has become essential. Implementing spacecraft relative navigation requires taking relative measurements and processing them using relative state estimation. An efficient way to generate such information is through vision-based measurements. Cameras are passive, low-energy, information-rich sensors that do not actively interact with other space objects. However, pointing a camera with a conventional field of view at other space objects requires considerable a priori initialization data; in particular, dedicated attitude maneuvers are needed, which may interfere with the satellite's main mission. One way to overcome these difficulties is to use an omnidirectional vision sensor, which has a 360-degree horizontal field of view. In this work, we present the development of an omnidirectional vision sensor for satellites, which can be used for spacecraft relative navigation, formation flying, and space situational awareness. The study includes the development of the measurement equations, dynamical models, and state estimation algorithms, as well as a numerical study, an experimental investigation, and a space scalability analysis.
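The unified projection model named in the keywords maps a 3D point onto a unit sphere and then reprojects it through a point offset by a mirror parameter onto the image plane, which is how a single model can cover omnidirectional (catadioptric) optics. A minimal sketch, assuming illustrative values for the mirror parameter `xi` and the camera intrinsics (none of these numbers are taken from the paper):

```python
import numpy as np

def unified_projection(P, xi=0.8, fx=300.0, fy=300.0, cx=320.0, cy=240.0):
    """Project a 3D point P = (X, Y, Z) onto the image plane of an
    omnidirectional camera using the unified projection model.

    xi is the mirror parameter; fx, fy, cx, cy are example intrinsics.
    """
    X, Y, Z = P
    rho = np.linalg.norm(P)      # step 1: project onto the unit sphere
    denom = Z + xi * rho         # step 2: reproject from the point (0, 0, xi)
    m = np.array([X / denom, Y / denom])
    u = fx * m[0] + cx           # step 3: apply pinhole intrinsics
    v = fy * m[1] + cy
    return np.array([u, v])
```

A point on the optical axis, e.g. `unified_projection([0.0, 0.0, 1.0])`, maps to the principal point `(cx, cy)`, a quick sanity check on the model.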

Original language: English
Pages (from-to): 334-351
Number of pages: 18
Journal: Acta Astronautica
State: Published - Nov 2021


Keywords

  • Computer vision
  • Extended Kalman Filter
  • Omnidirectional vision sensor
  • Space navigation
  • Spacecraft relative dynamics
  • Unified projection model

All Science Journal Classification (ASJC) codes

  • Aerospace Engineering


