Size-independent sample complexity of neural networks

Noah Golowich, Alexander Rakhlin, Ohad Shamir

Research output: Contribution to journal › Article › peer-review


We study the sample complexity of learning neural networks by providing new bounds on their Rademacher complexity, assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth and, under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
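To illustrate the central quantity the abstract refers to, here is a minimal Monte Carlo sketch of the *empirical* Rademacher complexity for the simplest norm-constrained class, linear predictors with a bounded Euclidean norm. This is an illustration only, not the paper's method: the function name, the restriction to the linear case, and the use of the closed form sup over the norm ball are all assumptions made for this sketch.

```python
import numpy as np

def empirical_rademacher_linear(X, B, n_trials=1000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of the
    linear class {x -> <w, x> : ||w||_2 <= B} on the sample X (n x d).

    For each draw of random signs sigma in {-1, +1}^n, the supremum
    sup_{||w||_2 <= B} (1/n) sum_i sigma_i <w, x_i> has the closed form
    (B/n) * ||sum_i sigma_i x_i||_2, so no inner optimization is needed.
    (Illustrative sketch; the linear case only, not the paper's bounds.)
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    total = 0.0
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=n)  # i.i.d. Rademacher signs
        total += B * np.linalg.norm(sigma @ X) / n
    return total / n_trials
```

By Jensen's inequality the estimate is (in expectation) at most B * sqrt(sum_i ||x_i||^2) / n, which decays as O(1/sqrt(n)) for bounded inputs; the paper's contribution concerns analogous bounds for multi-layer networks whose dependence on depth and width is controlled or removed.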

Original language: English
Pages (from-to): 473-504
Number of pages: 32
Journal: Information and Inference: A Journal of the IMA
Issue number: 2
Early online date: 4 May 2020
State: Published - Jun 2020


