Localization Schemes: A Framework for Proving Mixing Bounds for Markov Chains (extended abstract)

Yuansi Chen, Ronen Eldan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Two recent and seemingly unrelated techniques for proving mixing bounds for Markov chains are: (i) the framework of Spectral Independence, introduced by Anari, Liu and Oveis Gharan, and its numerous extensions, which have given rise to several breakthroughs in the analysis of mixing times of discrete Markov chains, and (ii) the Stochastic Localization technique, which has proven useful in establishing mixing and expansion bounds both for log-concave measures and for measures on the discrete hypercube. In this paper, we introduce a framework that connects ideas from both techniques. Our framework unifies, simplifies and extends those two techniques. At its center is the concept of a 'localization scheme' which, to every probability measure on some space Ω, assigns a martingale of probability measures which 'localize' in space as time evolves. As it turns out, to every such scheme corresponds a Markov chain, and many chains of interest appear naturally in this framework. This viewpoint provides tools for deriving mixing bounds for the dynamics through the analysis of the corresponding localization process. Generalizations of the concepts of Spectral Independence and Entropic Independence arise naturally from our definitions, and in particular we recover the main theorems in the spectral and entropic independence frameworks via simple martingale arguments (completely bypassing the need for the theory of high-dimensional expanders). We demonstrate the strength of the proposed machinery by giving short and (arguably) simpler proofs of many mixing bounds in the recent literature.
In particular, we: (i) give the first O(n log n) bound on the mixing time of the hardcore model (of arbitrary degree) in the tree-uniqueness regime under Glauber dynamics, (ii) give the first optimal mixing bounds for Ising models in the uniqueness regime under any external field, (iii) prove a KL-divergence decay bound for log-concave sampling via the Restricted Gaussian Oracle, which achieves optimal mixing under any exp(n)-warm start, and (iv) prove a logarithmic Sobolev inequality for near-critical ferromagnetic Ising models, recovering in a simple way a variant of a recent result by Bauerschmidt and Dagallier.
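For readers unfamiliar with the dynamics analyzed in results (i) and (ii), the following is a minimal sketch (not taken from the paper) of single-site Glauber dynamics for the Ising model on a finite graph: at each step a uniformly random site is resampled from its conditional distribution given its neighbors. The function names and parameters (`glauber_step`, `beta`, `h`) are illustrative choices, not the paper's notation.

```python
import math
import random

def glauber_step(spins, neighbors, beta, h, rng):
    """One Glauber update: pick a uniform site and resample its spin
    from the Ising conditional distribution given its neighbors."""
    i = rng.randrange(len(spins))
    # Local field at site i: interaction term plus external field h.
    field = beta * sum(spins[j] for j in neighbors[i]) + h
    # Conditional probability that spin i equals +1:
    # P(+1) = e^{field} / (e^{field} + e^{-field}) = 1 / (1 + e^{-2*field}).
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))
    spins[i] = 1 if rng.random() < p_plus else -1

def run_glauber(spins, neighbors, beta, h, steps, seed=0):
    """Run `steps` Glauber updates in place and return the spin vector."""
    rng = random.Random(seed)
    for _ in range(steps):
        glauber_step(spins, neighbors, beta, h, rng)
    return spins
```

The mixing-time bounds in the paper quantify how many such single-site updates are needed before the chain's law is close to the Ising distribution; result (ii) concerns exactly this chain in the uniqueness regime with an arbitrary external field h.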

Original language: English
Title of host publication: Proceedings - 2022 IEEE 63rd Annual Symposium on Foundations of Computer Science, FOCS 2022
Publisher: IEEE Computer Society
Pages: 110-122
Number of pages: 13
ISBN (Electronic): 9781665455190
DOIs
State: Published - 2022
Externally published: Yes
Event: 63rd IEEE Annual Symposium on Foundations of Computer Science, FOCS 2022 - Denver, United States
Duration: 31 Oct 2022 - 3 Nov 2022

Publication series

Name: Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS
Volume: 2022-October
ISSN (Print): 0272-5428

Conference

Conference: 63rd IEEE Annual Symposium on Foundations of Computer Science, FOCS 2022
Country/Territory: United States
City: Denver
Period: 31/10/22 - 3/11/22

All Science Journal Classification (ASJC) codes

  • General Computer Science
