Abstract
This paper presents a new framework for robust regression analysis. Under this framework, the input-output relation of a system is inferred by minimizing a new robust loss that relates the outputs to a presumed parametric function of the inputs. The loss arises from a modified version of the recently developed K-divergence, tailored here for regression analysis, whose empirical estimate employs Parzen's non-parametric kernel density estimator to mitigate the effect of low-density contaminations attributed to outliers. This density estimator provides a model-free weighting mechanism that suppresses outlying measurements in both the input and output data sets. The performance advantage of the proposed approach over other robust regression methods is illustrated in a simulation study on robust training of a shallow GELU neural network.
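The paper's exact K-divergence loss is not reproduced in this record, but the core idea it describes (using a Parzen kernel density estimate over the data to down-weight low-density, outlier-attributed samples) can be sketched generically. The snippet below is a minimal illustrative assumption, not the authors' method: it computes Gaussian-kernel Parzen densities over joint input-output samples and uses them as weights in a weighted least-squares fit, so that low-density (outlying) points contribute little.

```python
import numpy as np

def parzen_weights(z, bandwidth=0.5):
    """Gaussian-kernel Parzen density estimate at each sample,
    normalized to [0, 1] so low-density points receive small weights.
    (Illustrative stand-in for the paper's density-based weighting.)"""
    z = np.atleast_2d(z)  # shape (n, d)
    n = z.shape[0]
    # Pairwise squared distances between all samples
    d2 = ((z[:, None, :] - z[None, :, :]) ** 2).sum(-1)
    dens = np.exp(-d2 / (2.0 * bandwidth**2)).sum(axis=1) / n
    return dens / dens.max()

def robust_linear_fit(x, y, bandwidth=0.5):
    """Weighted least squares with Parzen-density weights on (x, y) pairs."""
    z = np.column_stack([x, y])
    w = parzen_weights(z, bandwidth)
    X = np.column_stack([np.ones_like(x), x])  # [intercept, slope] design
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta  # beta[0] = intercept, beta[1] = slope

# Toy data: a linear relation with a few gross outliers injected.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 60)
y = 2.0 * x + 0.05 * rng.normal(size=60)
y[:5] += 6.0  # outliers: shifted far from the true line
beta = robust_linear_fit(x, y)
```

Because the injected outliers sit in a low-density region of the joint (x, y) sample, their Parzen weights are small and the fitted slope stays close to the true value of 2; an unweighted fit would be noticeably biased by them.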
Original language | American English
---|---
Pages (from-to) | 9511-9515
Number of pages | 5
Journal | Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing
DOIs |
State | Published - 1 Jan 2024
Event | 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024, Seoul, Korea, Republic of. Duration: 14 Apr 2024 → 19 Apr 2024
Keywords
- Divergences
- estimation theory
- regression analysis
- robust statistics
All Science Journal Classification (ASJC) codes
- Software
- Signal Processing
- Electrical and Electronic Engineering