UAV/UGV Search and Capture of Goal-Oriented Uncertain Targets

Mor Sinay, Noa Agmon, Oleg Maksimov, Guy Levy, Moshe Bitan, Sarit Kraus

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper considers a new, complex problem of UAV/UGV collaborative efforts to search for and capture attackers under uncertainty. The goal of the defenders (the UAV/UGV team) is to stop all attackers as quickly as possible, before they reach their selected goals. The uncertainty considered is twofold: the defenders know neither the attackers' locations nor their destinations, and the defenders' sensing is itself uncertain. We suggest a real-time algorithmic framework for the defenders, combining entropy and a stochastic-temporal belief, that aims to maximize the probability of a quick and successful capture of all of the attackers. We empirically evaluated the algorithmic framework and showed its efficiency and its significant performance improvement over other solutions.
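The abstract's combination of entropy with a stochastic-temporal belief can be illustrated with a minimal sketch: the defenders maintain a probability distribution (belief) over candidate attacker locations, propagate it through a Markov motion model as time advances, and measure its entropy to quantify remaining uncertainty. All function names and the toy goal-oriented transition matrix below are assumptions for illustration, not the paper's actual implementation:

```python
import numpy as np

def shannon_entropy(belief):
    # Uncertainty (in bits) of the defenders' belief over attacker locations.
    p = belief[belief > 0]
    return float(-np.sum(p * np.log2(p)))

def propagate_belief(belief, transition):
    # Stochastic-temporal update: the attacker moves according to a
    # column-stochastic Markov transition model (columns sum to 1).
    return transition @ belief

# Toy example: 4 candidate locations, uniform initial belief (maximal entropy).
belief = np.full(4, 0.25)

# Hypothetical goal-oriented motion model: attackers drift toward location 3,
# modeling their tendency to advance on a chosen goal.
T = np.array([
    [0.6, 0.0, 0.0, 0.0],
    [0.3, 0.6, 0.0, 0.0],
    [0.1, 0.3, 0.6, 0.0],
    [0.0, 0.1, 0.4, 1.0],
])

belief = propagate_belief(belief, T)
print(round(shannon_entropy(belief), 3))
```

As the belief concentrates near the goal over successive propagation steps, its entropy drops; a planner in this spirit could steer the UAV/UGV team toward sensing actions that reduce entropy fastest.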

Original language: English
Title of host publication: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 8505-8512
Number of pages: 8
ISBN (Electronic): 9781538680940
DOIs
State: Published - 27 Dec 2018
Event: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018 - Madrid, Spain
Duration: 1 Oct 2018 - 5 Oct 2018

Publication series

Name: IEEE International Conference on Intelligent Robots and Systems

Conference

Conference: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018
Country/Territory: Spain
City: Madrid
Period: 1/10/18 - 5/10/18

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
