Three-dimensional kinematics-based real-time localization method using two robots

Guy Elmakis, Matan Coronel, David Zarrouk

Research output: Contribution to journal · Article · peer-review

Abstract

This paper presents a precise two-robot collaboration method for three-dimensional (3D) self-localization relying on a single rotating camera and onboard accelerometers that measure the tilt of the robots. The method allows localization in global positioning system-denied environments, in the presence of magnetic interference, and in dark or totally unlit, unstructured, and unmarked locations. At each step, one robot moves forward while the other remains stationary. The tilt angles of the robots obtained from the accelerometers, combined with the rotational angle of the turret derived from video analysis, make it possible to continuously calculate the location of each robot. We describe the hardware setup used for the experiments and provide a detailed description of the algorithm, which fuses the data obtained by the accelerometers and cameras and runs in real time on onboard microcomputers. Finally, we present 2D and 3D experimental results, which show that the system achieves 2% accuracy over the total traveled distance (see Supporting Information S1: video).
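The leapfrog geometry described in the abstract can be illustrated with a minimal sketch. The following is NOT the authors' algorithm; it only shows the underlying kinematic idea under several assumptions of my own: the stationary robot's position is known, the turret supplies azimuth and elevation angles to the moving robot, the range between the robots is available (e.g., from the robot's commanded step length), and the accelerometer-derived roll and pitch are applied as small-angle tilt corrections in a roll-then-pitch order (the paper's actual rotation convention is not given here).

```python
import math

def rotate_x(v, a):
    """Rotate vector v = (x, y, z) by angle a (rad) about the x-axis (roll)."""
    x, y, z = v
    c, s = math.cos(a), math.sin(a)
    return (x, c * y - s * z, s * y + c * z)

def rotate_y(v, a):
    """Rotate vector v = (x, y, z) by angle a (rad) about the y-axis (pitch)."""
    x, y, z = v
    c, s = math.cos(a), math.sin(a)
    return (c * x + s * z, y, -s * x + c * z)

def localize(base, roll, pitch, azimuth, elevation, dist):
    """Estimate the moving robot's 3D position from the stationary robot.

    base       -- known (x, y, z) of the stationary (observing) robot
    roll/pitch -- observer tilt from its accelerometer (rad), assumed order
    azimuth    -- turret bearing to the moving robot in the observer frame (rad)
    elevation  -- turret elevation angle (rad)
    dist       -- range between robots (assumed known, e.g. from step length)
    """
    # Unit line-of-sight vector in the (untilted) turret frame.
    d = (math.cos(elevation) * math.cos(azimuth),
         math.cos(elevation) * math.sin(azimuth),
         math.sin(elevation))
    # Correct for the observer's tilt: roll about x, then pitch about y.
    d = rotate_y(rotate_x(d, roll), pitch)
    # Scale by range and translate from the observer's known position.
    return tuple(b + dist * c for b, c in zip(base, d))
```

With zero tilt and a level line of sight, `localize((0, 0, 0), 0, 0, 0, 0, 2.0)` places the moving robot two units straight ahead at `(2, 0, 0)`; the roles then swap at the next step, so each robot's pose chains off the other's last fix.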

Original language: American English
Pages (from-to): 2676-2688
Number of pages: 13
Journal: Journal of Field Robotics
Volume: 41
Issue number: 8
DOIs
State: Published - 1 Dec 2024

Keywords

  • field robotics
  • kinematics
  • localization
  • tracked robot

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Computer Science Applications
