TY - GEN
T1 - Analytic Study of Families of Spurious Minima in Two-Layer ReLU Neural Networks
T2 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
AU - Arjevani, Yossi
AU - Field, Michael
N1 - Publisher Copyright: © 2021 Neural information processing systems foundation. All rights reserved.
PY - 2021
Y1 - 2021
N2 - We study the optimization problem associated with fitting two-layer ReLU neural networks with respect to the squared loss, where labels are generated by a target network. We make use of the rich symmetry structure to develop a novel set of tools for studying families of spurious minima. In contrast to existing approaches which operate in limiting regimes, our technique directly addresses the nonconvex loss landscape for a finite number of inputs d and neurons k, and provides analytic, rather than heuristic, information. In particular, we derive analytic estimates for the loss at different minima, and prove that modulo O(d^{-1/2})-terms the Hessian spectrum concentrates near small positive constants, with the exception of Θ(d) eigenvalues which grow linearly with d. We further show that the Hessian spectrum at global and spurious minima coincide to O(d^{-1/2})-order, thus challenging our ability to argue about statistical generalization through local curvature. Lastly, our technique provides the exact fractional dimensionality at which families of critical points turn from saddles into spurious minima. This makes possible the study of the creation and the annihilation of spurious minima using powerful tools from equivariant bifurcation theory.
UR - http://www.scopus.com/inward/record.url?scp=85132036332&partnerID=8YFLogxK
M3 - Conference contribution
T3 - Advances in Neural Information Processing Systems
SP - 15162
EP - 15174
BT - Advances in Neural Information Processing Systems 34 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
A2 - Ranzato, Marc'Aurelio
A2 - Beygelzimer, Alina
A2 - Dauphin, Yann
A2 - Liang, Percy S.
A2 - Wortman Vaughan, Jenn
Y2 - 6 December 2021 through 14 December 2021
ER -