Evaluating and Calibrating Uncertainty Prediction in Regression Tasks

Dan Levi, Liran Gispan, Niv Giladi, Ethan Fetaya

Research output: Contribution to journal › Article › peer-review


Predicting not only the target but also an accurate measure of uncertainty is important for many machine learning applications, and in particular safety-critical ones. In this work, we study the calibration of uncertainty prediction for regression tasks, which often arise in real-world systems. We show that the existing definition of calibration for regression uncertainty has severe limitations in distinguishing informative from non-informative uncertainty predictions. We propose a new definition that avoids this pitfall, together with an evaluation method based on a simple histogram approach. Our method clusters examples with similar uncertainty predictions and compares the predicted uncertainty with the empirical uncertainty on those examples. We also propose a simple, scaling-based calibration method that performs as well as much more complex ones. We show results on both a synthetic, controlled problem and on the object detection bounding-box regression task using the COCO and KITTI datasets.
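The histogram-based evaluation and the scaling-based calibration described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' code: it assumes Gaussian uncertainty predictions (a predicted standard deviation per example), equal-count binning by predicted uncertainty, and a single scale factor fitted in closed form from the Gaussian negative log-likelihood. The function names and the exact error measure are illustrative assumptions.

```python
import numpy as np

def histogram_calibration_error(y_true, y_pred, sigma_pred, n_bins=10):
    """Histogram-based evaluation (illustrative): bin examples by predicted
    uncertainty, then compare each bin's mean predicted standard deviation
    with the empirical RMSE of that bin's residuals."""
    errors = y_true - y_pred
    order = np.argsort(sigma_pred)
    bins = np.array_split(order, n_bins)  # equal-count bins of similar predicted sigma
    total = 0.0
    for idx in bins:
        pred_std = np.sqrt(np.mean(sigma_pred[idx] ** 2))  # predicted uncertainty
        emp_std = np.sqrt(np.mean(errors[idx] ** 2))       # empirical uncertainty
        total += abs(pred_std - emp_std) / pred_std        # normalized gap
    return total / n_bins

def fit_scale(y_true, y_pred, sigma_pred):
    """Scaling-based recalibration (illustrative): a single factor s such
    that s * sigma_pred best explains the residuals under a Gaussian
    likelihood; the minimizer has the closed form sqrt(mean(z^2)) for
    normalized residuals z."""
    z = (y_true - y_pred) / sigma_pred
    return np.sqrt(np.mean(z ** 2))
```

On data whose predicted uncertainties are systematically off by a constant factor, `fit_scale` recovers that factor and rescaling reduces the histogram-based error; informative but miscalibrated predictions are thereby fixed without retraining the model.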

Original language: English
Article number: 5540
Issue number: 15
State: Published - 25 Jul 2022


Keywords

  • prediction uncertainty
  • regression

All Science Journal Classification (ASJC) codes

  • Analytical Chemistry
  • Information Systems
  • Instrumentation
  • Atomic and Molecular Physics, and Optics
  • Electrical and Electronic Engineering
  • Biochemistry


