A Joint Training and Confidence Calibration Procedure That is Robust to Label Noise

Coby Penso, Jacob Goldberger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Manually annotated medical imaging data tend to have unreliable labels due to the complexity of the medical data and the considerable variability across experts. Noisy data pose a significant challenge both for learning the model's parameters and for calibrating its predictive confidence. This study presents a joint training and confidence calibration procedure that is robust to label noise. The method estimates the noise level as part of a noise-robust training procedure. The estimated noise level is then used to correct the network accuracy computed on the noisy validation set, which the calibration procedure requires. We demonstrate that, despite the unreliable labels, we can still achieve calibration results similar to those obtained by a procedure using data with noise-free labels.
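The core idea of correcting validation accuracy before calibration can be illustrated with a short sketch. This is not the authors' implementation; it assumes a symmetric label-noise model (a label is flipped with probability `noise_rate`, uniformly to one of the other classes) and a simple confidence-matching calibration in which a temperature is chosen so that the average top-class confidence equals the corrected accuracy. The function names and the bisection-based fitting are illustrative choices, not taken from the paper.

```python
import numpy as np

def corrected_accuracy(noisy_acc, noise_rate, num_classes):
    """Recover accuracy w.r.t. clean labels from accuracy measured
    against symmetric-noise labels. Under this noise model:
      acc_noisy = acc_clean*(1 - eps) + (1 - acc_clean)*eps/(K - 1)
    which we invert for acc_clean."""
    eps, k = noise_rate, num_classes
    return (noisy_acc - eps / (k - 1)) / (1.0 - eps - eps / (k - 1))

def fit_temperature(logits, target_acc, t_lo=0.05, t_hi=20.0, iters=60):
    """Bisect for a temperature T whose average top-class softmax
    confidence matches target_acc. The average confidence decreases
    monotonically in T, so bisection converges."""
    def avg_conf(t):
        z = logits / t
        z = z - z.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(z)
        p /= p.sum(axis=1, keepdims=True)
        return p.max(axis=1).mean()
    for _ in range(iters):
        t_mid = 0.5 * (t_lo + t_hi)
        if avg_conf(t_mid) > target_acc:
            t_lo = t_mid   # still over-confident: raise temperature
        else:
            t_hi = t_mid
    return 0.5 * (t_lo + t_hi)

# Example: with 20% symmetric noise over 5 classes, an observed
# validation accuracy of 0.65 corresponds to a clean accuracy of 0.8.
acc = corrected_accuracy(0.65, 0.2, 5)
```

In this sketch the noisy validation accuracy would be misleadingly low as a calibration target; correcting it first lets the temperature be fit against an estimate of the clean accuracy instead.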

Original language: English
Title of host publication: IEEE International Symposium on Biomedical Imaging, ISBI 2024 - Conference Proceedings
Publisher: IEEE Computer Society
ISBN (Electronic): 9798350313338
State: Published - 2024
Event: 21st IEEE International Symposium on Biomedical Imaging, ISBI 2024 - Athens, Greece
Duration: 27 May 2024 – 30 May 2024

Publication series

Name: Proceedings - International Symposium on Biomedical Imaging

Conference

Conference: 21st IEEE International Symposium on Biomedical Imaging, ISBI 2024
Country/Territory: Greece
City: Athens
Period: 27/05/24 – 30/05/24

Keywords

  • calibration
  • network confidence
  • neural networks
  • noisy labels

All Science Journal Classification (ASJC) codes

  • Biomedical Engineering
  • Radiology Nuclear Medicine and imaging
