Abstract
We present an accelerated gradient method for nonconvex optimization problems with Lipschitz continuous first and second derivatives. In time O(ε^{-7/4} log(1/ε)), the method finds an ε-stationary point, meaning a point x such that ‖∇f(x)‖ ≤ ε. The method improves upon the O(ε^{-2}) complexity of gradient descent and provides the additional second-order guarantee that λmin(∇²f(x)) ≥ −O(ε^{1/2}) for the computed x. Furthermore, our method is Hessian free, i.e., it requires only gradient computations, and is therefore suitable for large-scale applications.
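The Hessian-free idea the abstract refers to — extracting second-order (curvature) information from gradients alone — can be illustrated with a standard trick not specific to this paper: a finite-difference Hessian-vector product ∇²f(x)v ≈ (∇f(x + hv) − ∇f(x))/h, combined with a shifted power iteration to estimate λmin(∇²f(x)). This is a minimal sketch under assumed names (`hvp`, `min_eig_estimate` are illustrative, not from the paper, which uses a Lanczos method for this step):

```python
import numpy as np

def hvp(grad, x, v, h=1e-5):
    """Hessian-vector product from gradients only (finite differences).

    ∇²f(x) v ≈ (∇f(x + h v) − ∇f(x)) / h — no Hessian is ever formed.
    """
    return (grad(x + h * v) - grad(x)) / h

def min_eig_estimate(grad, x, dim, iters=200, seed=0):
    """Estimate λmin(∇²f(x)) via power iteration on the shifted operator c·I − H.

    The top eigenvector of c·I − H (for c large enough) is the eigenvector of
    H's smallest eigenvalue, so power iteration finds negative-curvature
    directions using only gradient evaluations.
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    # Crude shift: ‖H v‖ + 1 upper-bounds the midpoint of H's spectrum here,
    # which is all the power iteration needs to converge to λmin's eigenvector.
    c = np.linalg.norm(hvp(grad, x, v)) + 1.0
    for _ in range(iters):
        w = c * v - hvp(grad, x, v)
        v = w / np.linalg.norm(w)
    # Rayleigh quotient of the converged direction approximates λmin(H).
    return v @ hvp(grad, x, v)
```

For a quadratic f(x) = ½ xᵀAx with A = diag(2, −1, 3), the gradient is Ax and the estimate recovers λmin(A) = −1; in the paper's setting the same gradient-only primitive underlies the second-order guarantee λmin(∇²f(x)) ≥ −O(ε^{1/2}) without large-scale Hessian storage.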
| Original language | English |
|---|---|
| Pages (from-to) | 1751-1772 |
| Number of pages | 22 |
| Journal | SIAM Journal on Optimization |
| Volume | 28 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2018 |
| Externally published | Yes |
Keywords
- Accelerated gradient descent
- Convergence rate
- Lanczos method
- Negative curvature
- Nonlinear optimization
- Second-order stationarity
- Semiconvexity
All Science Journal Classification (ASJC) codes
- Software
- Theoretical Computer Science
- Applied Mathematics