TY - GEN
T1 - ReSQueing Parallel and Private Stochastic Convex Optimization
AU - Carmon, Yair
AU - Jambulapati, Arun
AU - Jin, Yujia
AU - Lee, Yin Tat
AU - Liu, Daogao
AU - Sidford, Aaron
AU - Tian, Kevin
N1 - Publisher Copyright: © 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - We introduce a new tool for stochastic convex optimization (SCO): a Reweighted Stochastic Query (ReSQue) estimator for the gradient of a function convolved with a (Gaussian) probability density. Combining ReSQue with recent advances in ball oracle acceleration [CJJ+20], [ACJ+21], we develop algorithms achieving state-of-the-art complexities for SCO in parallel and private settings. For a SCO objective constrained to the unit ball in R^d, we obtain the following results (up to polylogarithmic factors). 1) We give a parallel algorithm obtaining optimization error ϵ_opt with d^{1/3} ϵ_opt^{-2/3} gradient oracle query depth and d^{1/3} ϵ_opt^{-2/3} + ϵ_opt^{-2} gradient queries in total, assuming access to a bounded-variance stochastic gradient estimator. For ϵ_opt ∈ [d^{-1}, d^{-1/4}], our algorithm matches the state-of-the-art oracle depth of [BJL+19] while maintaining the optimal total work of stochastic gradient descent. 2) Given n samples of Lipschitz loss functions, prior works [BFTT19], [BFGT20], [AFKT21], [KLL21] established that if n ≳ d ϵ_dp^{-2}, (ϵ_dp, δ)-differential privacy is attained at no asymptotic cost to the SCO utility. However, these prior works all required a superlinear number of gradient queries. We close this gap for sufficiently large n ≳ d^2 ϵ_dp^{-3}, by using ReSQue to design an algorithm with near-linear gradient query complexity in this regime.
AB - We introduce a new tool for stochastic convex optimization (SCO): a Reweighted Stochastic Query (ReSQue) estimator for the gradient of a function convolved with a (Gaussian) probability density. Combining ReSQue with recent advances in ball oracle acceleration [CJJ+20], [ACJ+21], we develop algorithms achieving state-of-the-art complexities for SCO in parallel and private settings. For a SCO objective constrained to the unit ball in R^d, we obtain the following results (up to polylogarithmic factors). 1) We give a parallel algorithm obtaining optimization error ϵ_opt with d^{1/3} ϵ_opt^{-2/3} gradient oracle query depth and d^{1/3} ϵ_opt^{-2/3} + ϵ_opt^{-2} gradient queries in total, assuming access to a bounded-variance stochastic gradient estimator. For ϵ_opt ∈ [d^{-1}, d^{-1/4}], our algorithm matches the state-of-the-art oracle depth of [BJL+19] while maintaining the optimal total work of stochastic gradient descent. 2) Given n samples of Lipschitz loss functions, prior works [BFTT19], [BFGT20], [AFKT21], [KLL21] established that if n ≳ d ϵ_dp^{-2}, (ϵ_dp, δ)-differential privacy is attained at no asymptotic cost to the SCO utility. However, these prior works all required a superlinear number of gradient queries. We close this gap for sufficiently large n ≳ d^2 ϵ_dp^{-3}, by using ReSQue to design an algorithm with near-linear gradient query complexity in this regime.
KW - differential privacy
KW - parallel computation
KW - stochastic optimization
UR - http://www.scopus.com/inward/record.url?scp=85182400349&partnerID=8YFLogxK
U2 - 10.1109/FOCS57990.2023.00124
DO - 10.1109/FOCS57990.2023.00124
M3 - Conference contribution
T3 - Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS
SP - 2031
EP - 2058
BT - Proceedings - 2023 IEEE 64th Annual Symposium on Foundations of Computer Science, FOCS 2023
PB - IEEE Computer Society
T2 - 64th IEEE Annual Symposium on Foundations of Computer Science, FOCS 2023
Y2 - 6 November 2023 through 9 November 2023
ER -