TY - GEN
T1 - Proving the Lottery Ticket Hypothesis
T2 - 37th International Conference on Machine Learning, ICML 2020
AU - Malach, Eran
AU - Yehudai, Gilad
AU - Shalev-Shwartz, Shai
AU - Shamir, Ohad
N1 - Publisher Copyright: © 2020 37th International Conference on Machine Learning, ICML 2020. All rights reserved.
PY - 2020/3/1
Y1 - 2020/3/1
N2 - The lottery ticket hypothesis (Frankle and Carbin, 2018) states that a randomly-initialized network contains a small subnetwork that, when trained in isolation, can compete with the performance of the original network. We prove an even stronger hypothesis (as was also conjectured in Ramanujan et al., 2019), showing that for every bounded distribution and every target network with bounded weights, a sufficiently over-parameterized neural network with random weights contains a subnetwork with roughly the same accuracy as the target network, without any further training.
AB - The lottery ticket hypothesis (Frankle and Carbin, 2018) states that a randomly-initialized network contains a small subnetwork that, when trained in isolation, can compete with the performance of the original network. We prove an even stronger hypothesis (as was also conjectured in Ramanujan et al., 2019), showing that for every bounded distribution and every target network with bounded weights, a sufficiently over-parameterized neural network with random weights contains a subnetwork with roughly the same accuracy as the target network, without any further training.
UR - http://www.scopus.com/inward/record.url?scp=85105242003&partnerID=8YFLogxK
M3 - Conference contribution
T3 - 37th International Conference on Machine Learning, ICML 2020
SP - 6638
EP - 6647
BT - 37th International Conference on Machine Learning, ICML 2020
A2 - Daumé III, Hal
A2 - Singh, Aarti
Y2 - 13 July 2020 through 18 July 2020
ER -