Random Fourier features for kernel ridge regression: Approximation bounds and statistical guarantees

Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Random Fourier features is one of the most popular techniques for scaling up kernel methods, such as kernel ridge regression. However, despite impressive empirical results, the statistical properties of random Fourier features are still not well understood. In this paper we take steps toward filling this gap. Specifically, we approach random Fourier features from a spectral matrix approximation point of view, give tight bounds on the number of Fourier features required to achieve a spectral approximation, and show how spectral matrix approximation bounds imply statistical guarantees for kernel ridge regression.
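The paper's contribution is the spectral approximation analysis rather than the construction itself; for orientation, the following is a minimal NumPy sketch of the classical random Fourier features map (Rahimi and Recht) combined with ridge regression in the induced feature space, for the Gaussian kernel exp(-gamma * ||x - y||^2). All function names, parameter values, and the toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def make_rff_map(d, num_features, gamma, rng):
    """Sample a random Fourier feature map z such that
    z(x) . z(y) approximates exp(-gamma * ||x - y||^2)."""
    # Frequencies drawn from the Fourier transform of the Gaussian kernel.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    def z(X):
        return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)
    return z

def ridge_fit(Z, y, lam):
    """Solve (Z^T Z + lam I) w = Z^T y for the ridge weights w."""
    m = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(m), Z.T @ y)

# Toy usage on synthetic data (hypothetical settings).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 5))
y_train = np.sin(X_train.sum(axis=1)) + 0.1 * rng.normal(size=200)
X_test = rng.normal(size=(50, 5))

z = make_rff_map(d=5, num_features=500, gamma=0.5, rng=rng)
w = ridge_fit(z(X_train), y_train, lam=1e-2)
y_pred = z(X_test) @ w
```

Note that the same sampled map z is applied to both training and test points; the paper's results concern how large num_features must be for the induced kernel matrix to spectrally approximate the exact one, which in turn controls the statistical risk of the resulting ridge estimator.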

Original language: English
Title of host publication: 34th International Conference on Machine Learning, ICML 2017
Pages: 370-404
Number of pages: 35
ISBN (Electronic): 9781510855144
State: Published - 2017
Event: 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia
Duration: 6 Aug 2017 - 11 Aug 2017

Publication series

Name: 34th International Conference on Machine Learning, ICML 2017
Volume: 1

Conference

Conference: 34th International Conference on Machine Learning, ICML 2017
Country/Territory: Australia
City: Sydney
Period: 6/08/17 - 11/08/17

All Science Journal Classification (ASJC) codes

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software
