Self-Supervised Monocular Depth Underwater

Shlomi Amitai, Itzik Klein, Tali Treibitz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Depth estimation is critical for any robotic system. In recent years, the estimation of depth from monocular images has shown great improvement. However, in the underwater environment, results still lag behind due to appearance changes caused by the medium. So far, little effort has been invested in overcoming this. Moreover, underwater there are more limitations on using high-resolution depth sensors, which is a serious obstacle to generating ground truth. Unsupervised methods that have tried to solve this have achieved limited success, as they relied on domain transfer from a dataset captured in air. We suggest network training using subsequent frames, self-supervised by a reprojection loss, as was demonstrated successfully above water. We propose several additions to the self-supervised framework to cope with the underwater environment and achieve state-of-the-art results on a challenging forward-looking underwater dataset.
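The reprojection loss mentioned in the abstract supervises depth by warping one frame into the view of a subsequent frame. A minimal NumPy sketch of the idea follows; the function names, the plain L1 photometric term, and the pinhole-camera setup are illustrative assumptions, not the paper's exact formulation (which typically also uses SSIM and masking):

```python
import numpy as np

def reproject(depth, K, R, t):
    """Map each target pixel to source-image coordinates using the
    predicted depth map and the relative camera pose (R, t).
    Illustrative pinhole-camera sketch, not the paper's exact pipeline."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs, ys, np.ones_like(xs)]).reshape(3, -1).astype(np.float64)
    cam = np.linalg.inv(K) @ pix * depth.reshape(1, -1)  # back-project to 3D
    cam = R @ cam + t.reshape(3, 1)                      # rigid transform
    proj = K @ cam                                       # project to source
    uv = proj[:2] / np.clip(proj[2:], 1e-6, None)        # perspective divide
    return uv.reshape(2, h, w)

def bilinear_sample(img, uv):
    """Sample a grayscale image at continuous (u, v) coordinates."""
    h, w = img.shape
    u, v = uv
    u0 = np.clip(np.floor(u).astype(int), 0, w - 2)
    v0 = np.clip(np.floor(v).astype(int), 0, h - 2)
    du = np.clip(u - u0, 0.0, 1.0)
    dv = np.clip(v - v0, 0.0, 1.0)
    tl, tr = img[v0, u0], img[v0, u0 + 1]
    bl, br = img[v0 + 1, u0], img[v0 + 1, u0 + 1]
    return (tl * (1 - du) * (1 - dv) + tr * du * (1 - dv)
            + bl * (1 - du) * dv + br * du * dv)

def reprojection_loss(target, source, depth, K, R, t):
    """L1 photometric error between the target frame and the source
    frame warped into the target view via depth and pose."""
    uv = reproject(depth, K, R, t)
    warped = bilinear_sample(source, uv)
    return np.mean(np.abs(warped - target))
```

If the predicted depth and pose are correct, the warped source matches the target and the loss vanishes; gradients of this error are what drive the self-supervised depth network.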

Original language: American English
Title of host publication: Proceedings - ICRA 2023
Subtitle of host publication: IEEE International Conference on Robotics and Automation
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 7
ISBN (Electronic): 9798350323658
State: Published - 2023
Event: 2023 IEEE International Conference on Robotics and Automation, ICRA 2023 - London, United Kingdom
Duration: 29 May 2023 - 2 Jun 2023

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation


Conference: 2023 IEEE International Conference on Robotics and Automation, ICRA 2023
Country/Territory: United Kingdom

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Artificial Intelligence

