Abstract
We present a new computational and statistical approach for fitting isotonic models under convex differentiable loss functions through recursive partitioning. Models along the partitioning path are also isotonic and can be viewed as regularized solutions to the problem. Our approach generalizes and subsumes the well-known work of Barlow and Brunk on fitting isotonic regressions subject to specially structured loss functions, and expands the range of loss functions that can be used (e.g., adding Huber loss for robust regression). This is accomplished through an algorithmic adjustment to a recursive partitioning approach recently developed for solving large-scale l2-loss isotonic regression problems. We prove that the new algorithm solves the generalized problem while maintaining the favorable computational and statistical properties of the l2 algorithm. The results are demonstrated on both real and synthetic data in two settings: fitting count data using negative Poisson log-likelihood loss, and fitting robust isotonic regressions using Huber loss. Proofs of theorems and a MATLAB-based software package implementing our algorithm are available in the online supplementary materials.
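To make the problem class concrete, the sketch below fits an isotonic (nondecreasing) vector under Huber loss on a totally ordered predictor using an off-the-shelf convex solver. It is a minimal illustration of the optimization problem described in the abstract, not the paper's recursive partitioning algorithm or its MATLAB package; the function names `isotonic_fit`, `huber`, and the `delta` parameter are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

def huber(r, delta=1.0):
    """Elementwise Huber loss of residuals r (quadratic near 0, linear in the tails)."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def isotonic_fit(y, loss=huber):
    """Fit a nondecreasing vector f minimizing sum(loss(y - f)).

    Generic convex-solver sketch for a chain (totally ordered) predictor;
    NOT the recursive-partitioning algorithm of the paper.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Monotonicity constraints: f[i+1] - f[i] >= 0 for all i
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]
    cons = LinearConstraint(D, lb=0.0, ub=np.inf)
    obj = lambda f: np.sum(loss(y - f))
    # Start from the sorted responses, which are already feasible
    res = minimize(obj, x0=np.sort(y), constraints=[cons], method="SLSQP")
    return res.x

# Example: heavy-tailed noise, where Huber loss gives a robust isotonic fit
rng = np.random.default_rng(0)
y = np.linspace(0.0, 3.0, 50) + rng.standard_t(2, size=50)
f_hat = isotonic_fit(y)
```

A dedicated algorithm such as the one in the paper scales far better than this dense-constraint formulation, which is quadratic in memory and intended only to clarify the objective and constraints.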
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 192-210 |
| Number of pages | 19 |
| Journal | Journal of Computational and Graphical Statistics |
| Volume | 23 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2014 |
Keywords
- Convex optimization
- Nonparametric regression
- Regularization path
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Discrete Mathematics and Combinatorics
- Statistics, Probability and Uncertainty