Regression via Kirszbraun Extension with Applications to Imitation Learning.

Armin Biess, Aryeh Kontorovich, Yury Makarychev, Hanan Zaichyk

Research output: Working paper › Preprint


We present a framework for performing regression between two Hilbert spaces. We accomplish this via Kirszbraun's extension theorem -- apparently the first application of this technique to supervised learning -- and analyze its statistical and computational aspects. We begin by formulating the correspondence problem as a quadratically constrained quadratic program (QCQP). Then we describe a procedure for smoothing the training data, which amounts to regularizing hypothesis complexity via its Lipschitz constant. The Lipschitz constant is tuned via a Structural Risk Minimization (SRM) procedure, based on the covering-number risk bounds we derive. We apply our technique to learn a transformation between two robotic manipulators with different embodiments, and report promising results.
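To give a flavor of Lipschitz-extension regression in the simplest setting: for real-valued targets, the Kirszbraun extension coincides with the midpoint of the McShane (lower) and Whitney (upper) L-Lipschitz extensions, yielding a closed-form predictor that interpolates the training data with the smallest possible Lipschitz constant. The sketch below is an illustrative one-dimensional analogue, not the paper's QCQP formulation (which handles Hilbert-space-valued outputs); the function name and interface are ours.

```python
import numpy as np

def lipschitz_extension(X_train, y_train, L, X_test):
    """Predict via the midpoint of the McShane and Whitney L-Lipschitz
    extensions. For scalar targets this agrees with the Kirszbraun
    extension: it interpolates the training points and has Lipschitz
    constant at most L (assuming the data are L-Lipschitz-consistent).
    Illustrative sketch only; the paper's method solves a QCQP for
    vector-valued outputs."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    preds = []
    for x in np.atleast_1d(np.asarray(X_test, dtype=float)):
        d = np.abs(X_train - x)          # distances in the input space
        upper = np.min(y_train + L * d)  # Whitney (upper) extension
        lower = np.max(y_train - L * d)  # McShane (lower) extension
        preds.append(0.5 * (upper + lower))
    return np.array(preds)

# Example: a 1-Lipschitz "tent" dataset.
X = [0.0, 1.0, 2.0]
y = [0.0, 1.0, 0.0]
print(lipschitz_extension(X, y, L=1.0, X_test=[0.0, 0.5, 1.0]))
```

At the training points the predictor returns the training labels exactly, and between them it never exceeds the specified Lipschitz constant, which is the smoothing/regularization role the Lipschitz constant plays in the abstract above.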
Original language: American English
State: Published - 2019
