Efficient Kirszbraun extension with applications to regression

Hananel Zaichyk, Armin Biess, Aryeh Kontorovich, Yury Makarychev

Research output: Contribution to journal › Article › peer-review


We introduce a framework for performing vector-valued regression in finite-dimensional Hilbert spaces. Using Lipschitz smoothness as our regularizer, we leverage Kirszbraun’s extension theorem for off-data prediction. We analyze the statistical and computational aspects of this method—to our knowledge, its first application to supervised learning. We decompose this task into two stages: training (which corresponds operationally to smoothing/regularization) and prediction (which is achieved via Kirszbraun extension). Both are solved algorithmically via a novel multiplicative weight updates (MWU) scheme, which, for our problem formulation, achieves significant runtime speedups over generic interior point methods. Our empirical results indicate a dramatic advantage over standard off-the-shelf solvers in our regression setting.
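To make the prediction stage concrete, here is a minimal scalar-valued sketch using the classical McShane–Whitney construction, the one-dimensional analogue of Kirszbraun's theorem: given labeled points and a Lipschitz constant L, it predicts at a new point while preserving L-Lipschitzness. This is an illustrative simplification only; the paper's method handles vector-valued targets in Hilbert spaces and uses a multiplicative weight updates scheme rather than this closed-form envelope.

```python
import numpy as np

def lipschitz_extension(X_train, y_train, L, x_new):
    """Predict at x_new via the McShane-Whitney extension: the midpoint
    of the tightest upper and lower L-Lipschitz envelopes of the data.
    Scalar-labels-only sketch; not the paper's vector-valued algorithm."""
    dists = np.linalg.norm(X_train - x_new, axis=1)
    upper = np.min(y_train + L * dists)  # smallest admissible value from above
    lower = np.max(y_train - L * dists)  # largest admissible value from below
    return 0.5 * (upper + lower)         # midpoint is again L-Lipschitz

# Example: data sampled from the 1-Lipschitz function f(x) = x
X = np.array([[0.0], [1.0]])
y = np.array([0.0, 1.0])
pred = lipschitz_extension(X, y, L=1.0, x_new=np.array([0.5]))
```

When the training labels are themselves consistent with the Lipschitz constant L, this extension interpolates them exactly; the training stage described above can be viewed as enforcing that consistency before extension is applied.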

Original language: American English
Journal: Mathematical Programming
State: Published - 7 Dec 2023


Keywords
  • Convex optimization
  • Kirszbraun extension
  • Quadratically constrained quadratic program
  • Regression

All Science Journal Classification (ASJC) codes

  • Software
  • General Mathematics

