Network Calibration by Class-based Temperature Scaling

Lior Frenkel, Jacob Goldberger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

It is well known that modern neural networks are poorly calibrated: they tend to over- or underestimate probabilities relative to their actual accuracy. This results in misleading reliability estimates and corrupts any decision policy built on them. We show that the amount of calibration error differs across classes. As a result, we propose to calibrate each class separately. We apply this class-level calibration paradigm to the concept of temperature scaling and describe an optimization method that finds a suitable temperature for each class. We report extensive experiments on a variety of image datasets and a wide range of network architectures, and show that our approach achieves state-of-the-art calibration without compromising accuracy in almost all cases.
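To illustrate the idea behind the abstract, the sketch below applies a separate temperature to each class: every sample's logits are divided by the temperature assigned to its predicted class before the softmax. The per-class fitting shown here is a simple grid search minimizing negative log-likelihood on a validation set; the paper describes its own optimization method, so this is only an assumed, minimal stand-in for illustration.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def class_based_temperature_scale(logits, temperatures):
    # Scale each sample's logits by the temperature of its predicted class,
    # then renormalize with a softmax (classic temperature scaling uses a
    # single shared temperature; here there is one per class).
    pred = logits.argmax(axis=1)
    t = temperatures[pred][:, None]
    return softmax(logits / t)

def fit_class_temperatures(val_logits, val_labels,
                           grid=np.linspace(0.5, 5.0, 46)):
    # Illustrative grid search (not the paper's optimizer): for each class,
    # pick the temperature minimizing the negative log-likelihood over
    # validation samples whose top prediction is that class.
    n_classes = val_logits.shape[1]
    pred = val_logits.argmax(axis=1)
    temps = np.ones(n_classes)
    for c in range(n_classes):
        mask = pred == c
        if not mask.any():
            continue  # no validation samples predicted as class c
        lc, yc = val_logits[mask], val_labels[mask]
        nlls = []
        for t in grid:
            p = softmax(lc / t)
            nlls.append(-np.log(p[np.arange(len(yc)), yc] + 1e-12).mean())
        temps[c] = grid[int(np.argmin(nlls))]
    return temps
```

Because scaling logits by a positive temperature never changes the argmax, calibrating this way leaves the network's predicted labels, and hence its accuracy, unchanged.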

Original language: English
Title of host publication: 29th European Signal Processing Conference, EUSIPCO 2021 - Proceedings
Pages: 1486-1490
Number of pages: 5
ISBN (Electronic): 9789082797060
DOIs
State: Published - 2021
Event: 29th European Signal Processing Conference, EUSIPCO 2021 - Dublin, Ireland
Duration: 23 Aug 2021 - 27 Aug 2021

Publication series

Name: European Signal Processing Conference
Volume: 2021-August

Conference

Conference: 29th European Signal Processing Conference, EUSIPCO 2021
Country/Territory: Ireland
City: Dublin
Period: 23/08/21 - 27/08/21

Keywords

  • Expected calibration error (ECE)
  • Network calibration
  • Neural networks
  • Temperature scaling

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering
