ROBUST REGRESSION ANALYSIS BASED ON THE K-DIVERGENCE

Yair Sorek, Koby Todros

Research output: Contribution to journal › Conference article › peer-review

Abstract

This paper presents a new framework for robust regression analysis. Under this framework, the input-output relation of a system is inferred by minimizing a new robust loss that relates the outputs to a presumed parametric function of the inputs. The considered loss arises from a modified version of the recently developed K-divergence, tailored here for regression analysis, whose empirical estimate employs Parzen's non-parametric kernel density estimator to mitigate the effect of low-density contaminations attributed to outliers. Parzen's density estimator thereby provides a model-free weighting mechanism that downweights outlying measurements in both the input and output data sets. The performance advantage of the considered approach over other robust regression methods is illustrated in a simulation study focusing on robust training of a shallow GELU neural network.
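The abstract describes a model-free weighting mechanism in which measurements lying in low-density regions of the joint input-output space are treated as likely outliers and downweighted via a Parzen kernel density estimate. The sketch below illustrates that general idea only; it is not the paper's K-divergence loss. The Gaussian kernel, the fixed bandwidth, the linear regression model, and the choice to use the estimated joint density directly as a least-squares weight are all illustrative assumptions.

```python
import numpy as np

def parzen_density(z, bandwidth=0.5):
    """Parzen (Gaussian-kernel) density estimate evaluated at each sample."""
    # Pairwise squared distances between all samples, shape (n, n)
    d2 = ((z[:, None, :] - z[None, :, :]) ** 2).sum(axis=-1)
    kernels = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return kernels.mean(axis=1)

def density_weighted_lsq(x, y, bandwidth=0.5):
    """Weighted least squares where each sample's weight is its estimated
    joint (x, y) density, so isolated (low-density) outliers are downweighted."""
    z = np.column_stack([x, y])
    w = parzen_density(z, bandwidth)
    X = np.column_stack([np.ones_like(x), x])  # intercept + slope design
    XtW = X.T * w                              # apply per-sample weights
    return np.linalg.solve(XtW @ X, XtW @ y)   # (intercept, slope)

# Toy data: y = 1 + 2x with small noise, plus a few gross outliers
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, 100)
y = 1.0 + 2.0 * x + 0.1 * rng.normal(size=100)
y[:5] += 10.0  # inject outlying outputs
b0, b1 = density_weighted_lsq(x, y)
```

Because the five corrupted points sit far from the bulk of the joint sample, their Parzen density (and hence their weight) is small, and the fitted line stays close to the uncontaminated relation, whereas ordinary least squares would be pulled upward.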

Original language: American English
Pages (from-to): 9511-9515
Number of pages: 5
Journal: Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing
DOIs
State: Published - 1 Jan 2024
Event: 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Seoul, Korea, Republic of
Duration: 14 Apr 2024 - 19 Apr 2024

Keywords

  • Divergences
  • estimation theory
  • regression analysis
  • robust statistics

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
