Exponentially vanishing sub-optimal local minima in multilayer neural networks

Daniel Soudry, Elad Hoffer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Background: Statistical mechanics results (Dauphin et al., 2014; Choromanska et al., 2015) suggest that local minima with high error are exponentially rare in high dimensions. However, to prove low error guarantees for Multilayer Neural Networks (MNNs), previous works so far required either a heavily modified MNN model or training method, strong assumptions on the labels (e.g., "near" linear separability), or an unrealistically wide hidden layer with Ω(N) units.

Results: We examine an MNN with one hidden layer of piecewise linear units, a single output, and a quadratic loss. We prove that, with high probability in the limit of N → ∞ datapoints, the volume of differentiable regions of the empirical loss containing sub-optimal differentiable local minima is exponentially vanishing in comparison with the same volume of global minima, given standard normal input of dimension d0 = Ω̃(√N) and a more realistic number of d1 = Ω̃(N/d0) hidden units. We demonstrate our results numerically: for example, 0% binary classification training error on CIFAR with only N/d0 ≈ 16 hidden neurons.
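
The setting described above (one hidden layer of piecewise linear units, a single output, a quadratic loss, standard normal inputs, and roughly N/d0 hidden units) can be sketched as follows. This is a hedged illustration only, not the authors' experiment code; the specific dimensions, leaky-ReLU slope, optimizer, and training loop are assumptions.

```python
# Illustrative sketch only (not the authors' released code): a one-hidden-layer
# network with piecewise linear (leaky-ReLU) units, a single output, and a
# quadratic loss, trained on standard normal inputs with random binary labels.
# All dimensions, the leaky slope, and the optimizer settings are assumptions.
import torch
import torch.nn as nn

N = 1024       # number of datapoints (illustrative)
d0 = 64        # input dimension; the theory assumes d0 on the order of sqrt(N)
d1 = N // d0   # hidden units, roughly N / d0 (here 16, as in the CIFAR example)

X = torch.randn(N, d0)                               # standard normal input
y = 2.0 * torch.randint(0, 2, (N, 1)).float() - 1.0  # random +/-1 labels

model = nn.Sequential(
    nn.Linear(d0, d1),
    nn.LeakyReLU(0.1),   # piecewise linear hidden units
    nn.Linear(d1, 1),    # single output
)

opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()   # quadratic loss

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Binary classification training error: fraction of sign mismatches.
train_err = (torch.sign(model(X)) != y).float().mean().item()
print(f"quadratic loss: {loss.item():.4f}, training error: {100 * train_err:.1f}%")
```

With N/d0 ≈ 16 hidden units, this mirrors the scale of the CIFAR example in the abstract, where roughly 16 hidden neurons sufficed for 0% binary classification training error.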

Original language: English
Title of host publication: 6th International Conference on Learning Representations, ICLR 2018
State: Published - 2018
Event: 6th International Conference on Learning Representations, ICLR 2018 - Vancouver, Canada
Duration: 30 Apr 2018 - 3 May 2018

Conference

Conference: 6th International Conference on Learning Representations, ICLR 2018
Country/Territory: Canada
City: Vancouver
Period: 30/04/18 - 3/05/18

All Science Journal Classification (ASJC) codes

  • Education
  • Language and Linguistics
  • Computer Science Applications
  • Linguistics and Language
