Qualitative Belief Space Planning via Compositions

Itai Zilberman, Vadim Indelman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Planning under uncertainty is a fundamental problem in robotics. Classical approaches rely on a metric representation of the world and the robot's states to infer the next course of action. While these approaches are considered accurate, they are susceptible to metric errors and tend to be costly in memory and computation time. In some cases, however, qualitative geometric information alone is sufficient, and the issues above become an unnecessary burden. This work presents a novel qualitative Belief Space Planning (BSP) approach, highly suitable for platforms with low-cost sensors and particularly appealing in sparse-environment scenarios. Our algorithm generalizes its predecessors by avoiding any deterministic assumptions. Moreover, it smoothly incorporates spatial-information propagation techniques known as compositions. We demonstrate our algorithm in simulations, and in particular the advantage of using compositions.
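To give a rough flavor of the composition idea mentioned in the abstract, the sketch below chains two uncertain 2D relative positions (A→B and B→C) by Monte Carlo sampling and then summarizes the composed result A→C qualitatively, as a probability over coarse regions (quadrants). This is purely an illustration under simplifying assumptions (translation-only composition, Gaussian noise, quadrant partition); it is not the paper's algorithm, and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_relative(mean, cov, n=2000):
    """Draw samples of an uncertain 2D relative position (illustrative)."""
    return rng.multivariate_normal(mean, cov, size=n)

def compose(ab_samples, bc_samples):
    """Compose two uncertain relative positions: A->B plus B->C gives A->C.
    Translation-only for simplicity; a full treatment would also handle rotation."""
    return ab_samples + bc_samples

def qualitative_distribution(samples):
    """Classify each sample into a coarse qualitative region (a quadrant
    of the reference frame) and return the empirical probability of each."""
    counts = {"NE": 0, "NW": 0, "SW": 0, "SE": 0}
    for x, y in samples:
        if x >= 0 and y >= 0:
            counts["NE"] += 1
        elif x < 0 and y >= 0:
            counts["NW"] += 1
        elif x < 0:
            counts["SW"] += 1
        else:
            counts["SE"] += 1
    n = len(samples)
    return {k: v / n for k, v in counts.items()}

# A->B and B->C, each with isotropic Gaussian uncertainty.
ab = sample_relative([1.0, 0.5], 0.05 * np.eye(2))
bc = sample_relative([0.5, 1.0], 0.05 * np.eye(2))
ac = compose(ab, bc)
print(qualitative_distribution(ac))
```

Note that the composed output stays probabilistic rather than committing to a single qualitative relation, in the spirit of avoiding deterministic assumptions.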

Original language: English
Title of host publication: IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2022
Pages: 5099-5106
Number of pages: 8
ISBN (Electronic): 9781665479271
DOIs
State: Published - 2022
Event: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2022 - Kyoto, Japan
Duration: 23 Oct 2022 – 27 Oct 2022

Publication series

Name: IEEE International Conference on Intelligent Robots and Systems
Volume: 2022-October

Conference

Conference: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2022
Country/Territory: Japan
City: Kyoto
Period: 23/10/22 – 27/10/22

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
