
Global non-convex optimization with discretized diffusions

Murat A. Erdogdu, Lester Mackey, Ohad Shamir

Research output: Contribution to journal › Conference article › peer-review

Abstract

An Euler discretization of the Langevin diffusion is known to converge to the global minimizers of certain convex and non-convex optimization problems. We show that this property holds for any suitably smooth diffusion and that different diffusions are suitable for optimizing different classes of convex and non-convex functions. This allows us to design diffusions suitable for globally optimizing convex and non-convex functions not covered by the existing Langevin theory. Our non-asymptotic analysis delivers computable optimization and integration error bounds based on easily accessed properties of the objective and chosen diffusion. Central to our approach are new explicit Stein factor bounds on the solutions of Poisson equations. We complement these results with improved optimization guarantees for targets other than the standard Gibbs measure.
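The abstract's starting point, the Euler (Euler–Maruyama) discretization of the Langevin diffusion, can be illustrated with a minimal sketch. This is not the paper's analysis or its proposed family of diffusions; the double-well objective, step size, and inverse temperature `beta` below are illustrative choices. Each iterate takes a gradient step plus Gaussian noise scaled by `sqrt(2 * step / beta)`:

```python
import numpy as np

def langevin_step(x, grad_f, step, beta, rng):
    """One Euler-Maruyama step of the (overdamped) Langevin diffusion:
    x' = x - step * grad_f(x) + sqrt(2 * step / beta) * N(0, I)."""
    noise = rng.standard_normal(x.shape)
    return x - step * grad_f(x) + np.sqrt(2.0 * step / beta) * noise

# Illustrative non-convex objective: a double well with global minima at x = +/-1.
def f(x):
    return (x**2 - 1.0)**2

def grad_f(x):
    return 4.0 * x * (x**2 - 1.0)

rng = np.random.default_rng(0)
x = np.array([2.5])
for _ in range(5000):
    x = langevin_step(x, grad_f, step=1e-3, beta=20.0, rng=rng)
# With a fixed seed, the chain settles near one of the global minimizers +/-1.
```

Larger `beta` concentrates the stationary (Gibbs) measure near the minimizers; smaller `beta` injects more noise and eases escape from local minima, which is the trade-off the paper's non-asymptotic bounds quantify.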

Original language: English
Pages (from-to): 9671–9680
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2018
State: Published - 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: 2 Dec 2018 – 8 Dec 2018

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
