Conformal Prediction is Robust to Label Noise

Bat-Sheva Einbinder, Stephen Bates, Anastasios N. Angelopoulos, Asaf Gendler, Yaniv Romano

Research output: Working paper › Preprint

Abstract

We study the robustness of conformal prediction, a powerful tool for uncertainty quantification, to label noise. Our analysis tackles both regression and classification problems, characterizing when and how it is possible to construct uncertainty sets that correctly cover the unobserved noiseless ground truth labels. Through stylized theoretical examples and practical experiments, we argue that naive conformal prediction covers the noiseless ground truth label unless the noise distribution is adversarially designed. This leads us to believe that correcting for label noise is unnecessary except for pathological data distributions or noise sources. In such cases, noise of bounded size can also be corrected for within the conformal prediction algorithm, ensuring correct coverage of the ground truth labels without assuming regularity of the score function or the data distribution.
Original language: English
State: Published - 28 Sep 2022

Keywords

  • cs.AI
  • cs.LG
  • math.ST
  • stat.ME
  • stat.ML
  • stat.TH
