Network Calibration by Temperature Scaling based on the Predicted Confidence

Lior Frenkel, Jacob Goldberger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Calibrating neural networks is crucial in applications where decision making depends on the predicted probabilities. Modern neural networks can be poorly calibrated: they tend to overestimate probabilities relative to the actual accuracy, yielding misleading confidence scores that corrupt the decision policy. We show that the magnitude of the calibration error depends on the predicted confidence of each sample. We apply this prediction-confidence calibration paradigm to temperature scaling, and describe an optimization method that finds a suitable temperature for each bin of the discretized prediction confidence. We report extensive experiments on a variety of image datasets and network architectures. Our approach achieves state-of-the-art calibration with a guarantee that the classification accuracy is not altered.
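A minimal sketch of the confidence-binned temperature scaling the abstract describes, assuming a simple grid search over candidate temperatures per bin as the optimizer (the paper's actual optimization method may differ); all function names and parameter choices here are illustrative, not the authors' code:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax (numerically stable)."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    """Negative log-likelihood of the labels under temperature T."""
    p = softmax(logits, T)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def fit_per_bin_temperatures(logits, labels, n_bins=10,
                             grid=np.linspace(0.5, 5.0, 91)):
    """Assign each validation sample to a confidence bin and pick,
    per bin, the temperature on the grid minimizing that bin's NLL."""
    conf = softmax(logits).max(axis=1)          # predicted confidence
    edges = np.linspace(0.0, 1.0, n_bins + 1)   # equal-width bins
    bin_idx = np.clip(np.digitize(conf, edges) - 1, 0, n_bins - 1)
    temps = np.ones(n_bins)
    for b in range(n_bins):
        mask = bin_idx == b
        if mask.sum() == 0:
            continue  # empty bin: keep T = 1 (identity)
        temps[b] = min(grid, key=lambda T: nll(logits[mask], labels[mask], T))
    return edges, temps

def calibrate(logits, edges, temps):
    """Scale each sample's logits by the temperature of its confidence bin.
    Dividing logits by a positive scalar never changes the argmax, which is
    why classification accuracy is provably unaltered."""
    conf = softmax(logits).max(axis=1)
    bin_idx = np.clip(np.digitize(conf, edges) - 1, 0, len(temps) - 1)
    return softmax(logits / temps[bin_idx, None])
```

The accuracy guarantee follows directly from the last step: each sample's logits are divided by a single positive temperature, a monotone transformation that preserves the predicted class.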

Original language: English
Title of host publication: 30th European Signal Processing Conference, EUSIPCO 2022 - Proceedings
Number of pages: 5
ISBN (Electronic): 9789082797091
State: Published - 2022
Event: 30th European Signal Processing Conference, EUSIPCO 2022 - Belgrade, Serbia
Duration: 29 Aug 2022 - 2 Sep 2022

Publication series

Name: European Signal Processing Conference


Conference: 30th European Signal Processing Conference, EUSIPCO 2022


Keywords

  • Expected Calibration Error (ECE)
  • network calibration
  • neural networks
  • temperature scaling

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering

