The Resistance to Label Noise in K-NN and DNN Depends on its Concentration

Amnon Drory, Oria Ratzon, Shai Avidan, Raja Giryes

Research output: Contribution to conference › Paper › peer-review


We investigate the classification performance of K-nearest neighbors (K-NN) and deep neural networks (DNNs) in the presence of label noise. We first show empirically that a DNN's prediction for a given test example depends on the labels of the training examples in its local neighborhood. This motivates us to derive a realizable analytic expression that approximates the multi-class K-NN classification error in the presence of label noise, which is of independent importance. We then suggest that the expression for K-NN may serve as a first-order approximation of the DNN error. Finally, we demonstrate empirically how closely the developed expression matches the observed performance of K-NN and DNN classifiers. Our results may explain the previously observed, surprising resistance of DNNs to some types of label noise. They also characterize an important factor of this resistance, showing that the more concentrated the noise, the greater the degradation in performance.
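The core phenomenon the abstract describes can be illustrated with a small synthetic experiment. The sketch below is not code from the paper; it is a minimal NumPy illustration under assumed settings (2-D uniform data, a linear true boundary, a brute-force K-NN with k = 15, and a 10% noise budget, all chosen for demonstration). It compares the same fraction of label noise applied in two ways: spread uniformly at random over the training set, versus concentrated in one region of feature space. The concentrated noise locally overwhelms the K-NN majority vote and degrades test accuracy far more.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_predict(X_train, y_train, X_test, k=15):
    # Brute-force K-NN: squared distances from every test point to
    # every training point, then a majority vote over the k nearest.
    d = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return (y_train[idx].mean(axis=1) > 0.5).astype(int)

# Illustrative setup (not from the paper): uniform 2-D data,
# true label decided by a vertical boundary at x = 0.5.
n, noise_frac, k = 2000, 0.10, 15
X = rng.random((n, 2))
y_true = (X[:, 0] > 0.5).astype(int)
X_test = rng.random((1000, 2))
y_test = (X_test[:, 0] > 0.5).astype(int)

n_flip = int(noise_frac * n)

# Spread noise: flip a random 10% of the training labels.
flip = rng.choice(n, size=n_flip, replace=False)
y_spread = y_true.copy()
y_spread[flip] ^= 1

# Concentrated noise: flip the 10% of training points nearest an
# arbitrary center deep inside one class region.
center = np.array([0.25, 0.5])
near = np.argsort(((X - center) ** 2).sum(axis=1))[:n_flip]
y_conc = y_true.copy()
y_conc[near] ^= 1

err_spread = (knn_predict(X, y_spread, X_test, k) != y_test).mean()
err_conc = (knn_predict(X, y_conc, X_test, k) != y_test).mean()
print(f"spread noise error:       {err_spread:.3f}")
print(f"concentrated noise error: {err_conc:.3f}")
```

With spread noise, a test point far from the boundary still sees a mostly-correct neighborhood, so the k = 15 majority vote recovers the true label; with concentrated noise, every test point inside the flipped region sees an almost entirely mislabeled neighborhood, so the error grows roughly with the area the noise occupies.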

Original language: English
State: Published - 2020
Event: 31st British Machine Vision Conference, BMVC 2020 - Virtual, Online
Duration: 7 Sep 2020 - 10 Sep 2020


Conference: 31st British Machine Vision Conference, BMVC 2020
City: Virtual, Online

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition


