Generalized isotonic regression

Ronny Luss, Saharon Rosset

Research output: Contribution to journal › Article › peer-review

Abstract

We present a new computational and statistical approach for fitting isotonic models under convex differentiable loss functions through recursive partitioning. Models along the partitioning path are also isotonic and can be viewed as regularized solutions to the problem. Our approach generalizes and subsumes the well-known work of Barlow and Brunk on fitting isotonic regressions subject to specially structured loss functions, and expands the range of loss functions that can be used (e.g., adding Huber loss for robust regression). This is accomplished through an algorithmic adjustment to a recursive partitioning approach recently developed for solving large-scale l2-loss isotonic regression problems. We prove that the new algorithm solves the generalized problem while maintaining the favorable computational and statistical properties of the l2 algorithm. The results are demonstrated on both real and synthetic data in two settings: fitting count data using negative Poisson log-likelihood loss, and fitting robust isotonic regressions using Huber loss. Proofs of theorems and a MATLAB-based software package implementing our algorithm are available in the online supplementary materials.
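For readers unfamiliar with the baseline problem the paper generalizes, the sketch below shows classical one-dimensional l2-loss isotonic regression solved with the Pool Adjacent Violators Algorithm (PAVA). This is only an illustrative baseline: it is not the authors' recursive-partitioning algorithm, does not handle general convex losses, and the function name is our own.

```python
# Pool Adjacent Violators Algorithm (PAVA) for 1-D l2-loss isotonic
# regression: find the nondecreasing fit minimizing sum_i (y_i - f_i)^2.
# Illustrative baseline only -- NOT the paper's recursive-partitioning
# method, which handles general convex differentiable losses.

def isotonic_regression(y):
    """Return the nondecreasing vector f minimizing sum (y_i - f_i)^2."""
    # Each block is a (mean, size) pair over a contiguous run of points.
    blocks = []
    for value in y:
        blocks.append((float(value), 1))
        # Merge adjacent blocks while they violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            (m1, n1), (m2, n2) = blocks[-2], blocks[-1]
            blocks[-2:] = [((m1 * n1 + m2 * n2) / (n1 + n2), n1 + n2)]
    # Expand each block's mean back to its constituent points.
    fit = []
    for mean, size in blocks:
        fit.extend([mean] * size)
    return fit
```

Under l2 loss, merging violating blocks into their pooled mean is optimal; under other convex losses (e.g., Huber or negative Poisson log-likelihood, the two settings studied in the paper) the pooled value becomes the loss-specific minimizer over the block, which is part of what the paper's generalization addresses.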

Original language: English
Pages (from-to): 192-210
Number of pages: 19
Journal: Journal of Computational and Graphical Statistics
Volume: 23
Issue number: 1
DOIs
State: Published - 2014

Keywords

  • Convex optimization
  • Nonparametric regression
  • Regularization path

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Discrete Mathematics and Combinatorics
  • Statistics, Probability and Uncertainty

